Hello Jeremy,

The branch HPSJAVA-494 is now updated for sequential reading. I have tested it with several EVIO files; both EvioToLcio and the monitoring app work fine with these changes. EvioToLcio can be forced back to memory-mapped mode with the -M switch.

The two modes are equally fast, give or take a second, but the memory footprint of the memory-mapped version is significantly larger, requiring more than 3 GB to run: 3.44 GB versus 1.51 GB for sequential reading. On the farm machines at JLab, the smaller memory footprint will be a big advantage.

Benchmark: EvioToLcio running "DataQualityRecon" (which has a memory leak :-)

time java -cp distribution/target/hps-distribution-3.3.1-SNAPSHOT-bin.jar org.hps.evio.EvioToLcio -d HPS-EngRun2015-2mm-v1 -L SEVERE -x DataQualityRecon.lcsim -DoutputFile=tmp /data/HPS/data/hps_005184.evio.0

Sequential:    time = 2m20.002s user, 0m4.167s system -- memory: 1.51 GB
Memory-mapped: time = 2m15.889s user, 0m8.047s system -- memory: 3.44 GB

I tested this with a *locally* compiled copy of jevio. When I remove this copy from the .m2 directory and try to download jevio-4.4.6-SNAPSHOT, Maven seems unable to retrieve the jar. Perhaps no surprise if this jar isn't there?

See: https://jira.slac.stanford.edu/browse/HPSJAVA-494

Best,
Maurik

########################################################################
Use REPLY-ALL to reply to list.
To unsubscribe from the HPS-SOFTWARE list, click the following link:
https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=HPS-SOFTWARE&A=1
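P.S. For anyone on the list wondering why the footprint differs so much: this is not hps-java or jevio code, just a minimal JDK sketch of the two read strategies. Sequential streaming keeps only a small buffer resident, while FileChannel.map maps the entire file into the process address space, so faulted-in pages count against the footprint. Class and method names here are illustrative only.

```java
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class ReadModes {

    // Sequential mode: stream through the file with a bounded buffer,
    // so resident memory stays near the buffer size regardless of file size.
    static long sumSequential(Path path) throws IOException {
        long sum = 0;
        try (InputStream in = new BufferedInputStream(Files.newInputStream(path))) {
            int b;
            while ((b = in.read()) != -1) {
                sum += b;
            }
        }
        return sum;
    }

    // Memory-mapped mode: map the whole file; every page touched while
    // reading is faulted in and counts toward the process footprint.
    static long sumMapped(Path path) throws IOException {
        long sum = 0;
        try (FileChannel ch = FileChannel.open(path, StandardOpenOption.READ)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            while (buf.hasRemaining()) {
                sum += buf.get() & 0xff; // mask to treat bytes as unsigned
            }
        }
        return sum;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("readmodes", ".bin");
        Files.write(tmp, new byte[] {1, 2, 3, 4, 5});
        // Both modes see identical bytes; only the memory behavior differs.
        System.out.println(sumSequential(tmp) == sumMapped(tmp));
        Files.delete(tmp);
    }
}
```

The difference only matters for large files: for a multi-GB EVIO file the mapped version pulls most of the file into memory over the run, which matches the 3.44 GB vs 1.51 GB numbers above.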