LISTSERV mailing list manager LISTSERV 16.5

HPS-SVN Archives

HPS-SVN@LISTSERV.SLAC.STANFORD.EDU

HPS-SVN December 2015
Subject:

r3998 - in /java/branches/jeremy-dev: ./ analysis/src/main/java/org/hps/analysis/trigger/ conditions/ conditions/src/main/java/org/hps/conditions/ conditions/src/main/java/org/hps/conditions/api/ conditions/src/main/java/org/hps/conditions/cli/ conditions/src/main/java/org/hps/conditions/database/ conditions/src/main/java/org/hps/conditions/dummy/ conditions/src/main/java/org/hps/conditions/ecal/ conditions/src/main/java/org/hps/conditions/run/ conditions/src/main/java/org/hps/conditions/svt/ crawler/src/main/java/org/hps/crawler/ datacat-client/src/main/java/org/hps/datacat/client/ detector-data/detectors/HPS-EngRun2015-1_5mm-v3-4-fieldmap/ detector-data/detectors/HPS-EngRun2015-Nominal-v3-4-fieldmap/ detector-model/src/main/java/org/lcsim/geometry/compact/converter/ evio/src/main/java/org/hps/evio/ job/ job/src/main/java/org/hps/job/ monitoring-drivers/src/main/java/org/hps/monitoring/drivers/svt/ monitoring-drivers/src/main/java/org/hps/monitoring/ecal/plots/ recon/src/main/java/org/hps/recon/filtering/ record-util/src/main/java/org/hps/record/ record-util/src/main/java/org/hps/record/daqconfig/ record-util/src/main/java/org/hps/record/epics/ record-util/src/main/java/org/hps/record/evio/ record-util/src/main/java/org/hps/record/svt/ record-util/src/main/java/org/hps/record/triggerbank/ run-database/src/main/java/org/hps/run/database/ run-database/src/test/java/org/hps/run/database/ steering-files/src/main/resources/org/hps/steering/monitoring/ steering-files/src/main/resources/org/hps/steering/readout/ steering-files/src/main/resources/org/hps/steering/recon/ steering-files/src/main/resources/org/hps/steering/users/baltzell/ steering-files/src/main/resources/org/hps/steering/users/phansson/ tracking/src/main/java/org/hps/recon/tracking/ tracking/src/main/java/org/hps/recon/tracking/gbl/ tracking/src/main/java/org/hps/svt/alignment/ users/src/main/java/org/hps/users/baltzell/ users/src/main/java/org/hps/users/meeg/ users/src/main/java/org/hps/users/phansson/ 
users/src/main/java/org/hps/users/spaul/ users/src/main/java/org/hps/users/spaul/feecc/

From:

[log in to unmask]

Reply-To:

Notification of commits to the hps svn repository <[log in to unmask]>

Date:

Tue, 1 Dec 2015 23:56:10 -0000

Content-Type:

text/plain

Parts/Attachments:

text/plain (9307 lines)

Author: [log in to unmask]
Date: Tue Dec  1 15:55:47 2015
New Revision: 3998

Log:
Dev work on run db and datacat; also includes trunk merges.

Added:
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/AbstractConditionsObjectConverter.java
      - copied, changed from r3997, java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/AbstractConditionsObjectConverter.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerFileVisitor.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/LcioReconMetadataReader.java
      - copied, changed from r3960, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/LcioMetadataReader.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/PathFilter.java
    java/branches/jeremy-dev/detector-data/detectors/HPS-EngRun2015-1_5mm-v3-4-fieldmap/
      - copied from r3995, java/trunk/detector-data/detectors/HPS-EngRun2015-1_5mm-v3-4-fieldmap/
    java/branches/jeremy-dev/detector-data/detectors/HPS-EngRun2015-Nominal-v3-4-fieldmap/millepede-dump-HPS-EngRun2015-Nominal-v3-4-fieldmap.dat
      - copied unchanged from r3995, java/trunk/detector-data/detectors/HPS-EngRun2015-Nominal-v3-4-fieldmap/millepede-dump-HPS-EngRun2015-Nominal-v3-4-fieldmap.dat
    java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/RfFitFunction.java
      - copied unchanged from r3995, java/trunk/evio/src/main/java/org/hps/evio/RfFitFunction.java
    java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/RfFitterDriver.java
      - copied unchanged from r3995, java/trunk/evio/src/main/java/org/hps/evio/RfFitterDriver.java
    java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/RfHit.java
      - copied unchanged from r3995, java/trunk/evio/src/main/java/org/hps/evio/RfHit.java
    java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/BeamspotTransformFilter.java
      - copied unchanged from r3964, java/trunk/recon/src/main/java/org/hps/recon/filtering/BeamspotTransformFilter.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractLoopAdapter.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractRecordLoop.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigData.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigEvioProcessor.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseBuilder.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDao.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDaoImpl.java
    java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/baltzell/EngineeringRun2015TrigPairs1_Pass2.lcsim
      - copied unchanged from r3964, java/trunk/steering-files/src/main/resources/org/hps/steering/users/baltzell/EngineeringRun2015TrigPairs1_Pass2.lcsim
    java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/HitRecon.lcsim
      - copied unchanged from r3995, java/trunk/steering-files/src/main/resources/org/hps/steering/users/phansson/HitRecon.lcsim
    java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/Occupancy.lcsim
      - copied, changed from r3968, java/trunk/steering-files/src/main/resources/org/hps/steering/users/phansson/Occupancy.lcsim
    java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLKinkData.java
      - copied unchanged from r3995, java/trunk/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLKinkData.java
    java/branches/jeremy-dev/tracking/src/main/java/org/hps/svt/alignment/MillepedeCompactDump.java
      - copied unchanged from r3995, java/trunk/tracking/src/main/java/org/hps/svt/alignment/MillepedeCompactDump.java
    java/branches/jeremy-dev/users/src/main/java/org/hps/users/baltzell/RfFitFunction.java
      - copied, changed from r3964, java/trunk/users/src/main/java/org/hps/users/baltzell/RfFitFunction.java
    java/branches/jeremy-dev/users/src/main/java/org/hps/users/baltzell/RfFitterDriver.java
      - copied, changed from r3964, java/trunk/users/src/main/java/org/hps/users/baltzell/RfFitterDriver.java
    java/branches/jeremy-dev/users/src/main/java/org/hps/users/baltzell/RfHit.java
      - copied unchanged from r3964, java/trunk/users/src/main/java/org/hps/users/baltzell/RfHit.java
    java/branches/jeremy-dev/users/src/main/java/org/hps/users/phansson/STUtils.java
      - copied unchanged from r3995, java/trunk/users/src/main/java/org/hps/users/phansson/STUtils.java
    java/branches/jeremy-dev/users/src/main/java/org/hps/users/phansson/StraightThroughAnalysisDriver.java
      - copied unchanged from r3995, java/trunk/users/src/main/java/org/hps/users/phansson/StraightThroughAnalysisDriver.java
    java/branches/jeremy-dev/users/src/main/java/org/hps/users/spaul/StyleUtil.java
      - copied unchanged from r3965, java/trunk/users/src/main/java/org/hps/users/spaul/StyleUtil.java
    java/branches/jeremy-dev/users/src/main/java/org/hps/users/spaul/SumEverything.java
      - copied unchanged from r3965, java/trunk/users/src/main/java/org/hps/users/spaul/SumEverything.java
    java/branches/jeremy-dev/users/src/main/java/org/hps/users/spaul/feecc/
      - copied from r3965, java/trunk/users/src/main/java/org/hps/users/spaul/feecc/
Removed:
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/AbstractConditionsObjectConverter.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/LcioMetadataReader.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunProcessor.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/TriggerConfigDao.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/TriggerConfigDaoImpl.java
    java/branches/jeremy-dev/run-database/src/test/java/org/hps/run/database/TiTriggerOffsetTest.java
Modified:
    java/branches/jeremy-dev/   (props changed)
    java/branches/jeremy-dev/analysis/src/main/java/org/hps/analysis/trigger/TriggerTurnOnDriver.java
    java/branches/jeremy-dev/conditions/   (props changed)
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/ConditionsDriver.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObject.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObjectCollection.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/AddCommand.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/CommandLineTool.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/LoadCommand.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/PrintCommand.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/RunSummaryCommand.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConditionsRecordConverter.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConditionsTagConverter.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/Converter.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConverterRegistry.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/DatabaseConditionsManager.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/dummy/DummyConditionsObjectConverter.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/ecal/EcalChannel.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/run/RunSpreadsheet.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/svt/SvtBiasConditionsLoader.java
    java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/svt/SvtBiasMyaDataReader.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerFileUtilities.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatUtilities.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java
    java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/RunFilter.java
    java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatClient.java
    java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatClientImpl.java
    java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatConstants.java
    java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/JSONUtilities.java
    java/branches/jeremy-dev/detector-model/src/main/java/org/lcsim/geometry/compact/converter/HPSTrackerBuilder.java
    java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/AbstractSvtEvioReader.java
    java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/EvioToLcio.java
    java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java
    java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/SvtEventFlagger.java
    java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/TestRunTriggeredReconToLcio.java
    java/branches/jeremy-dev/job/pom.xml
    java/branches/jeremy-dev/job/src/main/java/org/hps/job/JobManager.java
    java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/svt/SensorOccupancyPlotsDriver.java
    java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/svt/SvtPlotUtils.java
    java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/ecal/plots/EcalLedSequenceMonitor.java
    java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/EventFlagFilter.java
    java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/PulserTriggerFilterDriver.java
    java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/V0CandidateFilter.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractRecordProcessor.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/RecordProcessor.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/daqconfig/DAQConfigDriver.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/daqconfig/DAQConfigEvioProcessor.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/epics/EpicsRunProcessor.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioDetectorConditionsProcessor.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioEventUtilities.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioFileSource.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioFileUtilities.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioLoop.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioLoopAdapter.java
    java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetEvioProcessor.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsType.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsVariable.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseCommandLine.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseDaoFactory.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunManager.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummary.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java
    java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java
    java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/monitoring/EcalLedSequenceStandalone.lcsim
    java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/readout/HPSReconNoReadout.lcsim
    java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/recon/EngineeringRun2015FullReconMC_Pass2.lcsim
    java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/EngineeringRun2015FullReconMC_Pass2_Gbl.lcsim
    java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/EngineeringRun2015FullRecon_Pass2_Gbl.lcsim
    java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/TrackUtils.java
    java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLOutput.java
    java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLOutputDriver.java
    java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLRefitterDriver.java
    java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GblUtils.java
    java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/HpsGblRefitter.java
    java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/MakeGblTracks.java
    java/branches/jeremy-dev/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java

Modified: java/branches/jeremy-dev/analysis/src/main/java/org/hps/analysis/trigger/TriggerTurnOnDriver.java
 =============================================================================
--- java/branches/jeremy-dev/analysis/src/main/java/org/hps/analysis/trigger/TriggerTurnOnDriver.java	(original)
+++ java/branches/jeremy-dev/analysis/src/main/java/org/hps/analysis/trigger/TriggerTurnOnDriver.java	Tue Dec  1 15:55:47 2015
@@ -50,7 +50,7 @@
     IHistogram1D clusterEOne_Random_thetaY[][] = new IHistogram1D[2][5];
     IHistogram1D clusterEOne_RandomSingles1_thetaY[][] = new IHistogram1D[2][5];
     
-    private boolean showPlots = true;
+    private boolean showPlots = false;
     private int nEventsProcessed = 0;
     private int nSimSingles1 = 0;
     private int nResultSingles1 = 0;
@@ -70,6 +70,10 @@
     public TriggerTurnOnDriver() {
     }
     
+	public void setShowPlots(boolean showPlots) {
+		this.showPlots = showPlots;
+	}
+
     @Override
     protected void detectorChanged(Detector detector) {
         

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/ConditionsDriver.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/ConditionsDriver.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/ConditionsDriver.java	Tue Dec  1 15:55:47 2015
@@ -33,7 +33,10 @@
  * time to achieve the proper behavior.
  *
  * @author Jeremy McCormick, SLAC
+ * 
+ * @deprecated Use built-in options of job manager.
  */
+@Deprecated
 public class ConditionsDriver extends Driver {
 
     /** The name of the detector model. */

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObject.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObject.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObject.java	Tue Dec  1 15:55:47 2015
@@ -15,7 +15,7 @@
  *
  * @author Jeremy McCormick, SLAC
  */
-public class BaseConditionsObject implements ConditionsObject {
+public abstract class BaseConditionsObject implements ConditionsObject {
 
     /**
      * Field name for collection ID.
@@ -440,4 +440,22 @@
         }
         return rowsUpdated != 0;
     }
+    
+    public boolean equals(Object object) {
+        // Is it the same object?
+        if (object == this) {
+            return true;
+        }
+        // Are these objects the same class?
+        if (object.getClass().equals(this.getClass())) {
+            BaseConditionsObject otherObject = BaseConditionsObject.class.cast(object);
+            // Do the row IDs and database table name match?
+            if (otherObject.getTableMetaData().getTableName().equals(this.getTableMetaData().getTableName()) &&
+                    this.getRowId() == otherObject.getRowId()) {
+                // These are considered the same object (same database table and row ID).
+                return true;
+            }
+        }
+        return false;
+    }
 }
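The `equals()` added above treats two conditions objects as equal when their table name and row ID match, but the hunk does not show a matching `hashCode()`, and `object.getClass()` would throw if `object` is null. As a standalone illustration only (the class name and fields below are hypothetical, not the project's actual `BaseConditionsObject`), an equals/hashCode pair consistent with that table-name-plus-row-ID identity could be sketched as:

```java
import java.util.Objects;

// Hypothetical sketch: same identity rule as the diff (table name + row ID),
// with a null guard and a hashCode() consistent with equals().
class ConditionsKeySketch {
    private final String tableName;
    private final int rowId;

    ConditionsKeySketch(String tableName, int rowId) {
        this.tableName = tableName;
        this.rowId = rowId;
    }

    @Override
    public boolean equals(Object object) {
        if (object == this) {
            return true;
        }
        // Guard against null before calling getClass() on the argument.
        if (object == null || !object.getClass().equals(this.getClass())) {
            return false;
        }
        ConditionsKeySketch other = (ConditionsKeySketch) object;
        return this.tableName.equals(other.tableName) && this.rowId == other.rowId;
    }

    @Override
    public int hashCode() {
        // Equal objects must produce equal hash codes (Object contract).
        return Objects.hash(tableName, rowId);
    }

    public static void main(String[] args) {
        ConditionsKeySketch a = new ConditionsKeySketch("ecal_gains", 42);
        ConditionsKeySketch b = new ConditionsKeySketch("ecal_gains", 42);
        System.out.println(a.equals(b) && a.hashCode() == b.hashCode());
    }
}
```

Hash-based collections (`HashSet`, `HashMap` keys) rely on this consistency, which is why an `equals()` override without `hashCode()` is usually flagged by static analysis.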

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObjectCollection.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObjectCollection.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObjectCollection.java	Tue Dec  1 15:55:47 2015
@@ -102,6 +102,15 @@
         if (object == null) {
             throw new IllegalArgumentException("The object argument is null.");
         }
+        //checkCollectionId(object);
+        final boolean added = this.objects.add(object);
+        if (!added) {
+            throw new RuntimeException("Failed to add object.");
+        }
+        return added;
+    }
+
+    private void checkCollectionId(final ObjectType object) {
         // Does this collection have a valid ID yet?
         if (this.getCollectionId() != BaseConditionsObject.UNSET_COLLECTION_ID) {
             // Does the object that is being added have a collection ID?
@@ -122,11 +131,6 @@
                 }
             }
         }
-        final boolean added = this.objects.add(object);
-        if (!added) {
-            throw new RuntimeException("Failed to add object.");
-        }
-        return added;
     }
 
     /**
@@ -344,7 +348,7 @@
         } else {
             // If the collection already exists in the database with this ID then it cannot be inserted.
             if (this.exists()) {
-                throw new DatabaseObjectException("The collection " + this.collectionId
+                throw new DatabaseObjectException("The collection ID " + this.collectionId
                         + " cannot be inserted because it already exists in the " + this.tableMetaData.getTableName()
                         + " table.", this);
             }
@@ -703,7 +707,6 @@
     public void writeCsv(final File file) throws IOException {
         FileWriter fileWriter = null;
         CSVPrinter csvFilePrinter = null;
-
         try {
             fileWriter = new FileWriter(file);
             csvFilePrinter = new CSVPrinter(fileWriter, CSVFormat.DEFAULT);

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/AddCommand.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/AddCommand.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/AddCommand.java	Tue Dec  1 15:55:47 2015
@@ -32,16 +32,13 @@
      */
     private static final Options OPTIONS = new Options();
     static {
-        OPTIONS.addOption(new Option("h", false, "print help for add command"));
-        OPTIONS.addOption("r", true, "starting run number (required)");
-        OPTIONS.getOption("r").setRequired(true);
-        OPTIONS.addOption("e", true, "ending run number (default is starting run number)");
-        OPTIONS.addOption("t", true, "table name (required)");
-        OPTIONS.getOption("t").setRequired(true);
-        OPTIONS.addOption("c", true, "collection ID (required)");
-        OPTIONS.getOption("c").setRequired(true);
-        OPTIONS.addOption("u", true, "user name (optional)");
-        OPTIONS.addOption("m", true, "notes about this conditions set (optional)");
+        OPTIONS.addOption(new Option("h", "help", false, "print help for add command"));
+        OPTIONS.addOption("r", "run-start", true, "starting run number (required)");
+        OPTIONS.addOption("e", "run-end", true, "ending run number (default is starting run number)");
+        OPTIONS.addOption("t", "table", true, "table name (required)");
+        OPTIONS.addOption("c", "collection", true, "collection ID (required)");
+        OPTIONS.addOption("u", "user", true, "user name (optional)");
+        OPTIONS.addOption("m", "notes", true, "notes about this conditions set (optional)");
     }
 
     /**

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/CommandLineTool.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/CommandLineTool.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/CommandLineTool.java	Tue Dec  1 15:55:47 2015
@@ -34,13 +34,12 @@
     private static Options OPTIONS = new Options();
 
     static {
-        OPTIONS.addOption(new Option("h", false, "print help"));
-        OPTIONS.addOption(new Option("d", true, "detector name"));
-        OPTIONS.addOption(new Option("r", true, "run number"));
-        OPTIONS.addOption(new Option("p", true, "database connection properties file"));
-        OPTIONS.addOption(new Option("x", true, "conditions XML configuration file"));
-        OPTIONS.addOption(new Option("t", true, "conditions tag to use for filtering records"));
-        OPTIONS.addOption(new Option("l", true, "log level of the conditions manager (INFO, FINE, etc.)"));
+        OPTIONS.addOption(new Option("h", "help", false, "print help"));
+        OPTIONS.addOption(new Option("d", "detector", true, "detector name"));
+        OPTIONS.addOption(new Option("r", "run", true, "run number"));
+        OPTIONS.addOption(new Option("p", "connection", true, "database connection properties file"));
+        OPTIONS.addOption(new Option("x", "xml", true, "conditions XML configuration file"));
+        OPTIONS.addOption(new Option("t", "tag", true, "conditions tag to use for filtering records"));
     }
 
     /**
@@ -177,13 +176,6 @@
         // Create new manager.
         this.conditionsManager = DatabaseConditionsManager.getInstance();
 
-        // Set the conditions manager log level (does not affect logger of this class or sub-commands).
-        if (commandLine.hasOption("l")) {
-            final Level newLevel = Level.parse(commandLine.getOptionValue("l"));
-            Logger.getLogger(DatabaseConditionsManager.class.getPackage().getName()).setLevel(newLevel);
-            LOGGER.config("conditions manager log level will be set to " + newLevel.toString());
-        }
-
         // Connection properties.
         if (commandLine.hasOption("p")) {
             final File connectionPropertiesFile = new File(commandLine.getOptionValue("p"));

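The removed `-l` handling above relied on `java.util.logging`'s hierarchical loggers: parsing a level name and setting it on a package-named logger, which child loggers in that package then inherit unless they set their own. The pattern in isolation (package name reused from the diff for illustration) looks like:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class LogLevelSketch {
    public static void main(String[] args) {
        // Level.parse accepts a level name such as "FINE" or "INFO"
        // (it throws IllegalArgumentException for unknown names).
        Level level = Level.parse("FINE");
        // Loggers form a dot-separated hierarchy, so setting a level on a
        // package-named logger affects loggers created under that package.
        Logger pkgLogger = Logger.getLogger("org.hps.conditions.database");
        pkgLogger.setLevel(level);
        System.out.println(pkgLogger.getLevel());
    }
}
```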
Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/LoadCommand.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/LoadCommand.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/LoadCommand.java	Tue Dec  1 15:55:47 2015
@@ -33,12 +33,10 @@
      */
     private static final Options OPTIONS = new Options();
     static {
-        OPTIONS.addOption(new Option("h", false, "print help for load command"));
-        OPTIONS.addOption(new Option("t", true, "name of the target table (required)"));
-        OPTIONS.getOption("t").setRequired(true);
-        OPTIONS.addOption(new Option("f", true, "input data file path (required)"));
-        OPTIONS.getOption("f").setRequired(true);
-        OPTIONS.addOption(new Option("d", true, "description for the collection log"));
+        OPTIONS.addOption(new Option("h", "help", false, "print help for load command"));
+        OPTIONS.addOption(new Option("t", "table", true, "name of the target table (required)"));
+        OPTIONS.addOption(new Option("f", "file", true, "input data file path (required)"));
+        OPTIONS.addOption(new Option("d", "description", true, "description for the collection log"));
     }
 
     /**

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/PrintCommand.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/PrintCommand.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/PrintCommand.java	Tue Dec  1 15:55:47 2015
@@ -38,12 +38,12 @@
     static Options options = new Options();
 
     static {
-        options.addOption(new Option("h", false, "print help for print command"));
-        options.addOption(new Option("t", true, "table name"));
-        options.addOption(new Option("i", false, "print the ID for the records (off by default)"));
-        options.addOption(new Option("f", true, "write print output to a file (must be used with -t option)"));
-        options.addOption(new Option("H", false, "suppress printing of conditions record and table info"));
-        options.addOption(new Option("d", false, "use tabs for field delimiter instead of spaces"));
+        options.addOption(new Option("h", "help", false, "print help for print command"));
+        options.addOption(new Option("t", "table", true, "table name"));
+        options.addOption(new Option("i", "print-id", false, "print the ID for the records (off by default)"));
+        options.addOption(new Option("f", "file", true, "write print output to a file (must be used with -t option)"));
+        options.addOption(new Option("H", "no-header", false, "suppress printing of conditions record and table info"));
+        options.addOption(new Option("d", "tabs", false, "use tabs for field delimiter instead of spaces"));
     }
 
     /**

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/RunSummaryCommand.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/RunSummaryCommand.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/cli/RunSummaryCommand.java	Tue Dec  1 15:55:47 2015
@@ -35,8 +35,8 @@
      */
     static Options options = new Options();
     static {
-        options.addOption(new Option("h", false, "Show help for run-summary command"));
-        options.addOption(new Option("a", false, "Print all collections found for the run"));
+        options.addOption(new Option("h", "print", false, "Show help for run-summary command"));
+        options.addOption(new Option("a", "all", false, "Print all collections found for the run"));
     }
 
     /**

Copied: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/AbstractConditionsObjectConverter.java (from r3997, java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/AbstractConditionsObjectConverter.java)
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/api/AbstractConditionsObjectConverter.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/AbstractConditionsObjectConverter.java	Tue Dec  1 15:55:47 2015
@@ -1,11 +1,17 @@
-package org.hps.conditions.api;
+package org.hps.conditions.database;
 
 import java.sql.SQLException;
 import java.util.logging.Logger;
 
+import org.hps.conditions.api.BaseConditionsObjectCollection;
+import org.hps.conditions.api.ConditionsObject;
+import org.hps.conditions.api.ConditionsObjectCollection;
+import org.hps.conditions.api.ConditionsObjectException;
+import org.hps.conditions.api.ConditionsRecord;
+import org.hps.conditions.api.DatabaseObjectException;
+import org.hps.conditions.api.TableMetaData;
+import org.hps.conditions.api.TableRegistry;
 import org.hps.conditions.api.ConditionsRecord.ConditionsRecordCollection;
-import org.hps.conditions.database.DatabaseConditionsManager;
-import org.hps.conditions.database.MultipleCollectionsAction;
 import org.lcsim.conditions.ConditionsConverter;
 import org.lcsim.conditions.ConditionsManager;
 
@@ -17,7 +23,6 @@
  * @author Jeremy McCormick, SLAC
  * @param <T> The type of the returned data which should be a class extending {@link BaseConditionsObjectCollection}.
  */
-// TODO: Move to conditions.database package (not an API class).
 public abstract class AbstractConditionsObjectConverter<T> implements ConditionsConverter<T> {
 
     /**

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConditionsRecordConverter.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConditionsRecordConverter.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConditionsRecordConverter.java	Tue Dec  1 15:55:47 2015
@@ -3,7 +3,6 @@
 import java.sql.ResultSet;
 import java.sql.SQLException;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 import org.hps.conditions.api.ConditionsObject;
 import org.hps.conditions.api.ConditionsObjectCollection;
 import org.hps.conditions.api.ConditionsObjectException;

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConditionsTagConverter.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConditionsTagConverter.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConditionsTagConverter.java	Tue Dec  1 15:55:47 2015
@@ -5,7 +5,6 @@
 import java.sql.ResultSet;
 import java.sql.SQLException;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 import org.hps.conditions.api.ConditionsObjectException;
 import org.hps.conditions.api.ConditionsTag;
 import org.hps.conditions.api.ConditionsTag.ConditionsTagCollection;

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/Converter.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/Converter.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/Converter.java	Tue Dec  1 15:55:47 2015
@@ -4,8 +4,6 @@
 import java.lang.annotation.Retention;
 import java.lang.annotation.RetentionPolicy;
 import java.lang.annotation.Target;
-
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 
 /**
  * This is an annotation for providing converter configuration for {@link org.hps.conditions.api.ConditionsObject}

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConverterRegistry.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConverterRegistry.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/ConverterRegistry.java	Tue Dec  1 15:55:47 2015
@@ -6,7 +6,6 @@
 
 import javassist.Modifier;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 import org.hps.conditions.api.BaseConditionsObjectCollection;
 import org.hps.conditions.api.ConditionsObject;
 import org.hps.conditions.api.TableRegistry;

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/DatabaseConditionsManager.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/DatabaseConditionsManager.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/DatabaseConditionsManager.java	Tue Dec  1 15:55:47 2015
@@ -20,7 +20,6 @@
 import java.util.logging.Level;
 import java.util.logging.Logger;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 import org.hps.conditions.api.ConditionsObject;
 import org.hps.conditions.api.ConditionsObjectCollection;
 import org.hps.conditions.api.ConditionsRecord.ConditionsRecordCollection;
@@ -349,7 +348,7 @@
      * Close the database connection.
      */
     public synchronized void closeConnection() {
-        LOGGER.fine("closing connection");
+        //LOGGER.finer("closing connection");
         if (this.connection != null) {
             try {
                 if (!this.connection.isClosed()) {
@@ -361,7 +360,7 @@
         }
         this.connection = null;
         this.isConnected = false;
-        LOGGER.fine("connection closed");
+        //LOGGER.finer("connection closed");
     }
 
     /**

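The `closeConnection` hunk above keeps the structure of the method while silencing its logging. The underlying pattern — an idempotent close that guards against a null or already-closed connection and always resets state — can be sketched as follows; `ConnectionHolder` is an illustrative stand-in, not the real `DatabaseConditionsManager`:

```java
import java.sql.Connection;
import java.sql.SQLException;

// Sketch of the idempotent close pattern (names are illustrative, not the
// actual DatabaseConditionsManager API).
public class ConnectionHolder {
    private Connection connection;
    private boolean isConnected;

    public synchronized void closeConnection() {
        if (this.connection != null) {
            try {
                if (!this.connection.isClosed()) {
                    this.connection.close();
                }
            } catch (SQLException e) {
                throw new RuntimeException(e);
            }
        }
        // Always reset state so a second call is a harmless no-op.
        this.connection = null;
        this.isConnected = false;
    }

    public synchronized boolean isConnected() {
        return this.isConnected;
    }
}
```

Because the fields are reset unconditionally, calling `closeConnection()` twice (or before any connection was opened) is safe.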
Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/dummy/DummyConditionsObjectConverter.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/dummy/DummyConditionsObjectConverter.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/dummy/DummyConditionsObjectConverter.java	Tue Dec  1 15:55:47 2015
@@ -1,6 +1,6 @@
 package org.hps.conditions.dummy;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
+import org.hps.conditions.database.AbstractConditionsObjectConverter;
 import org.hps.conditions.dummy.DummyConditionsObject.DummyConditionsObjectCollection;
 
 /**

Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/ecal/EcalChannel.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/ecal/EcalChannel.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/ecal/EcalChannel.java	Tue Dec  1 15:55:47 2015
@@ -4,12 +4,12 @@
 import java.util.HashMap;
 import java.util.Map;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 import org.hps.conditions.api.AbstractIdentifier;
 import org.hps.conditions.api.BaseConditionsObject;
 import org.hps.conditions.api.BaseConditionsObjectCollection;
 import org.hps.conditions.api.ConditionsObjectCollection;
 import org.hps.conditions.api.ConditionsObjectException;
+import org.hps.conditions.database.AbstractConditionsObjectConverter;
 import org.hps.conditions.database.Converter;
 import org.hps.conditions.database.DatabaseConditionsManager;
 import org.hps.conditions.database.Field;
@@ -290,11 +290,16 @@
         public EcalChannelCollection getData(final ConditionsManager conditionsManager, final String name) {
             final EcalChannelCollection collection = super.getData(conditionsManager, name);
             final Subdetector ecal = DatabaseConditionsManager.getInstance().getEcalSubdetector();
-            if (ecal.getDetectorElement() != null) {
-                collection.buildGeometryMap(ecal.getDetectorElement().getIdentifierHelper(), ecal.getSystemID());
+            if (ecal != null) {
+                if (ecal.getDetectorElement() != null) {
+                    collection.buildGeometryMap(ecal.getDetectorElement().getIdentifierHelper(), ecal.getSystemID());
+                } else {
+                    // This can happen when not running with the detector-framework jar in the classpath.
+                    throw new IllegalStateException("The ECal subdetector's detector element is not setup.");
+                }
             } else {
-                // This can happen when not running with the detector-framework jar in the classpath.
-                throw new IllegalStateException("The ECal subdetector's detector element is not setup.");
+                // Bad detector or conditions system not initialized properly.
+                throw new IllegalStateException("The ECal subdetector object is null.");
             }
             return collection;
         }

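The `EcalChannel` hunk restructures the null handling: previously only the detector element was checked, so a null subdetector produced a bare `NullPointerException`; now the subdetector itself is guarded first, with a distinct message for each failure mode. The same logic reads more directly as guard clauses — a minimal sketch with stand-in types, not the actual HPS classes:

```java
// Minimal sketch (stand-in types) of the null-check restructuring above: the
// original only guarded the detector element; the new code first checks that
// the subdetector itself was resolved, giving a distinct error for each case.
public class GuardDemo {
    static class DetectorElement { }

    static class Subdetector {
        private final DetectorElement element;
        Subdetector(DetectorElement element) { this.element = element; }
        DetectorElement getDetectorElement() { return element; }
    }

    static DetectorElement requireElement(Subdetector ecal) {
        if (ecal == null) {
            // Bad detector or conditions system not initialized properly.
            throw new IllegalStateException("The ECal subdetector object is null.");
        }
        if (ecal.getDetectorElement() == null) {
            // Can happen without the detector-framework jar on the classpath.
            throw new IllegalStateException("The ECal subdetector's detector element is not setup.");
        }
        return ecal.getDetectorElement();
    }
}
```

The committed code keeps the nested `if`/`else` form instead, but the failure messages and their ordering match this sketch.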
Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/run/RunSpreadsheet.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/run/RunSpreadsheet.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/run/RunSpreadsheet.java	Tue Dec  1 15:55:47 2015
@@ -25,7 +25,7 @@
  * The rows are accessible as raw CSV data through the Apache Commons CSV library, and this data must be manually cleaned up and converted 
  * to the correct data type before being inserted into the conditions database.
  *
- * @author Jeremy McCormick
+ * @author Jeremy McCormick, SLAC
  */
 public final class RunSpreadsheet {
 
@@ -38,23 +38,23 @@
         "start_time", 
         "end_time", 
         "to_tape", 
-        "n_events", 
+        "events",
         "files",
         "trigger_rate", 
         "target", 
         "beam_current",
         "beam_x", 
-        "beam_y", 
-        "trigger_config",
-        /*
-        "ecal_fadc_mode", 
+        "beam_y",
+        "trigger_config", 
+        /* Next 7 are actually hidden in the spreadsheet! */
+        "ecal_fadc_mode",
         "ecal_fadc_thresh", 
         "ecal_fadc_window", 
         "ecal_cluster_thresh_seed", 
         "ecal_cluster_thresh_cluster",
         "ecal_cluster_window_hits", 
-        "ecal_cluster_window_pairs",
-        */ 
+        "ecal_cluster_window_pairs", 
+        /* End hidden fields. */
         "ecal_scalers_fadc", 
         "ecal_scalers_dsc", 
         "svt_y_position", 
@@ -62,8 +62,7 @@
         "svt_offset_time",
         "ecal_temp", 
         "ecal_lv_current", 
-        "notes"
-    };
+        "notes"};
 
     /**
      * Read the CSV file from the command line and print the data to the terminal (just a basic test).
@@ -161,14 +160,15 @@
         return records;
     }
     
-    public static final AnotherSimpleDateFormat DATE_FORMAT = new AnotherSimpleDateFormat("MM/dd/yyyy H:mm"); 
+    public static final RunSpreadsheetDateFormat DATE_FORMAT = new RunSpreadsheetDateFormat("MM/dd/yyyy H:mm"); 
     private static final TimeZone TIME_ZONE =  TimeZone.getTimeZone("EST");
     
     
     @SuppressWarnings("serial")
     public
-    static class AnotherSimpleDateFormat extends SimpleDateFormat {
-        public AnotherSimpleDateFormat(String formatstring) {
+    static class RunSpreadsheetDateFormat extends SimpleDateFormat {
+        
+        public RunSpreadsheetDateFormat(String formatstring) {
             super(formatstring);
             //Calendar c = Calendar.getInstance(TIME_ZONE,Locale.US);
             //setTimeZone(TIME_ZONE);

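The last hunk above renames `AnotherSimpleDateFormat` to the more descriptive `RunSpreadsheetDateFormat`, a `SimpleDateFormat` fixed to the spreadsheet's `MM/dd/yyyy H:mm` pattern (with the time-zone handling still commented out). A self-contained sketch of the renamed class and its round-trip behavior — the outer `SpreadsheetDates` wrapper exists only for this example:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

// Sketch of the renamed RunSpreadsheetDateFormat: a SimpleDateFormat bound to
// the run spreadsheet's "MM/dd/yyyy H:mm" pattern. Time-zone setup remains
// commented out in the actual class, so the JVM default zone applies.
public class SpreadsheetDates {

    @SuppressWarnings("serial")
    static class RunSpreadsheetDateFormat extends SimpleDateFormat {
        RunSpreadsheetDateFormat(String pattern) {
            super(pattern);
        }
    }

    static final RunSpreadsheetDateFormat DATE_FORMAT =
            new RunSpreadsheetDateFormat("MM/dd/yyyy H:mm");

    public static void main(String[] args) throws ParseException {
        Date d = DATE_FORMAT.parse("12/01/2015 15:55");
        // Round-trips through the same pattern regardless of default time zone.
        System.out.println(DATE_FORMAT.format(d));
    }
}
```

Note that `H` is the 24-hour field (0-23), matching spreadsheet entries like `15:55` without an AM/PM marker.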
Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/svt/SvtBiasConditionsLoader.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/svt/SvtBiasConditionsLoader.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/svt/SvtBiasConditionsLoader.java	Tue Dec  1 15:55:47 2015
@@ -6,22 +6,15 @@
 import hep.aida.IPlotter;
 import hep.aida.IPlotterStyle;
 
-import java.io.BufferedReader;
 import java.io.File;
-import java.io.FileReader;
-import java.io.IOException;
 import java.sql.SQLException;
-import java.text.SimpleDateFormat;
 import java.util.ArrayList;
 import java.util.Collections;
 import java.util.Date;
 import java.util.GregorianCalendar;
-import java.util.HashMap;
 import java.util.HashSet;
 import java.util.List;
-import java.util.Map;
 import java.util.Set;
-import java.util.TimeZone;
 import java.util.logging.Logger;
 
 import org.apache.commons.cli.CommandLine;
@@ -180,7 +173,7 @@
         options.addOption(new Option("t", false, "use run table format (from crawler) for bias"));
         options.addOption(new Option("d", false, "discard first line of MYA data (for myaData output)"));
         options.addOption(new Option("g", false, "Actually load stuff into DB"));
-        options.addOption(new Option("b", true, "beam current file"));
+//        options.addOption(new Option("b", true, "beam current file"));
         options.addOption(new Option("s", false, "Show plots"));
 
         final CommandLineParser parser = new DefaultParser();
@@ -261,9 +254,9 @@
             }
         }
 
-        if (cl.hasOption("b") && cl.hasOption("m") && cl.hasOption("p")) {
-            readBeamData(new File(cl.getOptionValue("b")), runList, positionRunRanges, biasRunRanges);
-        }
+//        if (cl.hasOption("b") && cl.hasOption("m") && cl.hasOption("p")) {
+//            readBeamData(new File(cl.getOptionValue("b")), runList, positionRunRanges, biasRunRanges);
+//        }
 
         // load to DB
         if (cl.hasOption("g")) {
@@ -427,111 +420,111 @@
         }
     }
 
-    private static void readBeamData(File file, List<RunData> runList, List<SvtPositionRunRange> positionRanges, List<SvtBiasRunRange> biasRanges) {
-        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
-        dateFormat.setTimeZone(TimeZone.getTimeZone("America/New_York"));
-
-        Map<Integer, SvtPositionRunRange> positionRangeMap = new HashMap<Integer, SvtPositionRunRange>();
-        for (SvtPositionRunRange range : positionRanges) {
-            positionRangeMap.put(range.getRun().getRun(), range);
-        }
-        Map<Integer, SvtBiasRunRange> biasRangeMap = new HashMap<Integer, SvtBiasRunRange>();
-        for (SvtBiasRunRange range : biasRanges) {
-            biasRangeMap.put(range.getRun().getRun(), range);
-        }
-
-        try {
-            BufferedReader br = new BufferedReader(new FileReader(file));
-            String line;
-            System.out.println("myaData header: " + br.readLine()); //discard the first line
-            System.out.println("run\ttotalQ\ttotalQBias\tfracBias\ttotalQNom\tfracNom\ttotalQ1pt5\tfrac1pt5\ttotalGatedQ\ttotalGatedQBias\tfracGatedBias\ttotalGatedQNom\tfracGatedNom\ttotalGatedQ1pt5\tfracGated1pt5");
-
-            for (RunData run : runList) {
-                double totalCharge = 0;
-                double totalChargeWithBias = 0;
-                double totalChargeWithBiasAtNominal = 0;
-                double totalChargeWithBiasAt1pt5 = 0;
-                double totalGatedCharge = 0;
-                double totalGatedChargeWithBias = 0;
-                double totalGatedChargeWithBiasAtNominal = 0;
-                double totalGatedChargeWithBiasAt1pt5 = 0;
-                Date lastDate = null;
-
-                while ((line = br.readLine()) != null) {
-                    String arr[] = line.split(" +");
-
-                    if (arr.length != 4) {
-                        throw new java.text.ParseException("this line is not correct.", 0);
-                    }
-                    Date date = dateFormat.parse(arr[0] + " " + arr[1]);
-                    if (date.after(run.getEndDate())) {
-                        break;
-                    }
-                    if (date.before(run.getStartDate())) {
-                        continue;
-                    }
-
-                    double current, livetime;
-                    if (arr[2].equals("<undefined>")) {
-                        current = 0;
-                    } else {
-                        current = Double.parseDouble(arr[2]);
-                    }
-                    if (arr[3].equals("<undefined>")) {
-                        livetime = 0;
-                    } else {
-                        livetime = Math.min(100.0, Math.max(0.0, Double.parseDouble(arr[3]))) / 100.0;
-                    }
-
-                    if (date.after(run.getStartDate())) {
-                        if (lastDate != null) {
-                            double dt = (date.getTime() - lastDate.getTime()) / 1000.0;
-                            double dq = dt * current; // nC
-                            double dqGated = dt * current * livetime; // nC
-
-                            totalCharge += dq;
-                            totalGatedCharge += dqGated;
-                            SvtBiasRunRange biasRunRange = biasRangeMap.get(run.getRun());
-                            if (biasRunRange != null) {
-                                for (SvtBiasMyaRange biasRange : biasRunRange.getRanges()) {
-                                    if (biasRange.includes(date)) {
-                                        totalChargeWithBias += dq;
-                                        totalGatedChargeWithBias += dqGated;
-
-                                        SvtPositionRunRange positionRunRange = positionRangeMap.get(run.getRun());
-                                        if (positionRunRange != null) {
-                                            for (SvtPositionMyaRange positionRange : positionRunRange.getRanges()) {
-                                                if (positionRange.includes(date)) {
-                                                    if (Math.abs(positionRange.getBottom()) < 0.0001 && Math.abs(positionRange.getTop()) < 0.0001) {
-                                                        totalChargeWithBiasAtNominal += dq;
-                                                        totalGatedChargeWithBiasAtNominal += dqGated;
-                                                    } else if (Math.abs(positionRange.getBottom() - 0.0033) < 0.0001 && Math.abs(positionRange.getTop() - 0.0031) < 0.0001) {
-                                                        totalChargeWithBiasAt1pt5 += dq;
-                                                        totalGatedChargeWithBiasAt1pt5 += dqGated;
-                                                    }
-                                                    break;
-                                                }
-                                            }
-                                        }
-
-                                        break;
-                                    }
-                                }
-                            }
-
-                        }
-                    }
-                    lastDate = date;
-                }
-//                System.out.format("run\t%d\ttotalQ\t%.0f\ttotalQBias\t%.0f\tfracBias\t%f\ttotalQNom\t%.0f\tfracNom\t%f\ttotalQ1pt5\t%.0f\tfrac1pt5\t%f\ttotalGatedQ\t%.0f\ttotalGatedQBias\t%.0f\tfracGatedBias\t%f\ttotalGatedQNom\t%.0f\tfracGatedNom\t%f\ttotalGatedQ1pt5\t%.0f\tfracGated1pt5\t%f\n", run.getRun(), totalCharge, totalChargeWithBias, totalChargeWithBias / totalCharge, totalChargeWithBiasAtNominal, totalChargeWithBiasAtNominal / totalCharge, totalChargeWithBiasAt1pt5, totalChargeWithBiasAt1pt5 / totalCharge, totalGatedCharge, totalGatedChargeWithBias, totalGatedChargeWithBias / totalGatedCharge, totalGatedChargeWithBiasAtNominal, totalGatedChargeWithBiasAtNominal / totalGatedCharge, totalGatedChargeWithBiasAt1pt5, totalGatedChargeWithBiasAt1pt5 / totalGatedCharge);
-                System.out.format("%d\t%.0f\t%.0f\t%f\t%.0f\t%f\t%.0f\t%f\t%.0f\t%.0f\t%f\t%.0f\t%f\t%.0f\t%f\n", run.getRun(), totalCharge, totalChargeWithBias, totalChargeWithBias / totalCharge, totalChargeWithBiasAtNominal, totalChargeWithBiasAtNominal / totalCharge, totalChargeWithBiasAt1pt5, totalChargeWithBiasAt1pt5 / totalCharge, totalGatedCharge, totalGatedChargeWithBias, totalGatedChargeWithBias / totalGatedCharge, totalGatedChargeWithBiasAtNominal, totalGatedChargeWithBiasAtNominal / totalGatedCharge, totalGatedChargeWithBiasAt1pt5, totalGatedChargeWithBiasAt1pt5 / totalGatedCharge);
-            }
-            br.close();
-
-        } catch (IOException e) {
-            throw new RuntimeException(e);
-        } catch (java.text.ParseException e) {
-            throw new RuntimeException(e);
-        }
-    }
+//    private static void readBeamData(File file, List<RunData> runList, List<SvtPositionRunRange> positionRanges, List<SvtBiasRunRange> biasRanges) {
+//        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
+//        dateFormat.setTimeZone(TimeZone.getTimeZone("America/New_York"));
+//
+//        Map<Integer, SvtPositionRunRange> positionRangeMap = new HashMap<Integer, SvtPositionRunRange>();
+//        for (SvtPositionRunRange range : positionRanges) {
+//            positionRangeMap.put(range.getRun().getRun(), range);
+//        }
+//        Map<Integer, SvtBiasRunRange> biasRangeMap = new HashMap<Integer, SvtBiasRunRange>();
+//        for (SvtBiasRunRange range : biasRanges) {
+//            biasRangeMap.put(range.getRun().getRun(), range);
+//        }
+//
+//        try {
+//            BufferedReader br = new BufferedReader(new FileReader(file));
+//            String line;
+//            System.out.println("myaData header: " + br.readLine()); //discard the first line
+//            System.out.println("run\ttotalQ\ttotalQBias\tfracBias\ttotalQNom\tfracNom\ttotalQ1pt5\tfrac1pt5\ttotalGatedQ\ttotalGatedQBias\tfracGatedBias\ttotalGatedQNom\tfracGatedNom\ttotalGatedQ1pt5\tfracGated1pt5");
+//
+//            for (RunData run : runList) {
+//                double totalCharge = 0;
+//                double totalChargeWithBias = 0;
+//                double totalChargeWithBiasAtNominal = 0;
+//                double totalChargeWithBiasAt1pt5 = 0;
+//                double totalGatedCharge = 0;
+//                double totalGatedChargeWithBias = 0;
+//                double totalGatedChargeWithBiasAtNominal = 0;
+//                double totalGatedChargeWithBiasAt1pt5 = 0;
+//                Date lastDate = null;
+//
+//                while ((line = br.readLine()) != null) {
+//                    String arr[] = line.split(" +");
+//
+//                    if (arr.length != 4) {
+//                        throw new java.text.ParseException("this line is not correct.", 0);
+//                    }
+//                    Date date = dateFormat.parse(arr[0] + " " + arr[1]);
+//                    if (date.after(run.getEndDate())) {
+//                        break;
+//                    }
+//                    if (date.before(run.getStartDate())) {
+//                        continue;
+//                    }
+//
+//                    double current, livetime;
+//                    if (arr[2].equals("<undefined>")) {
+//                        current = 0;
+//                    } else {
+//                        current = Double.parseDouble(arr[2]);
+//                    }
+//                    if (arr[3].equals("<undefined>")) {
+//                        livetime = 0;
+//                    } else {
+//                        livetime = Math.min(100.0, Math.max(0.0, Double.parseDouble(arr[3]))) / 100.0;
+//                    }
+//
+//                    if (date.after(run.getStartDate())) {
+//                        if (lastDate != null) {
+//                            double dt = (date.getTime() - lastDate.getTime()) / 1000.0;
+//                            double dq = dt * current; // nC
+//                            double dqGated = dt * current * livetime; // nC
+//
+//                            totalCharge += dq;
+//                            totalGatedCharge += dqGated;
+//                            SvtBiasRunRange biasRunRange = biasRangeMap.get(run.getRun());
+//                            if (biasRunRange != null) {
+//                                for (SvtBiasMyaRange biasRange : biasRunRange.getRanges()) {
+//                                    if (biasRange.includes(date)) {
+//                                        totalChargeWithBias += dq;
+//                                        totalGatedChargeWithBias += dqGated;
+//
+//                                        SvtPositionRunRange positionRunRange = positionRangeMap.get(run.getRun());
+//                                        if (positionRunRange != null) {
+//                                            for (SvtPositionMyaRange positionRange : positionRunRange.getRanges()) {
+//                                                if (positionRange.includes(date)) {
+//                                                    if (Math.abs(positionRange.getBottom()) < 0.0001 && Math.abs(positionRange.getTop()) < 0.0001) {
+//                                                        totalChargeWithBiasAtNominal += dq;
+//                                                        totalGatedChargeWithBiasAtNominal += dqGated;
+//                                                    } else if (Math.abs(positionRange.getBottom() - 0.0033) < 0.0001 && Math.abs(positionRange.getTop() - 0.0031) < 0.0001) {
+//                                                        totalChargeWithBiasAt1pt5 += dq;
+//                                                        totalGatedChargeWithBiasAt1pt5 += dqGated;
+//                                                    }
+//                                                    break;
+//                                                }
+//                                            }
+//                                        }
+//
+//                                        break;
+//                                    }
+//                                }
+//                            }
+//
+//                        }
+//                    }
+//                    lastDate = date;
+//                }
+////                System.out.format("run\t%d\ttotalQ\t%.0f\ttotalQBias\t%.0f\tfracBias\t%f\ttotalQNom\t%.0f\tfracNom\t%f\ttotalQ1pt5\t%.0f\tfrac1pt5\t%f\ttotalGatedQ\t%.0f\ttotalGatedQBias\t%.0f\tfracGatedBias\t%f\ttotalGatedQNom\t%.0f\tfracGatedNom\t%f\ttotalGatedQ1pt5\t%.0f\tfracGated1pt5\t%f\n", run.getRun(), totalCharge, totalChargeWithBias, totalChargeWithBias / totalCharge, totalChargeWithBiasAtNominal, totalChargeWithBiasAtNominal / totalCharge, totalChargeWithBiasAt1pt5, totalChargeWithBiasAt1pt5 / totalCharge, totalGatedCharge, totalGatedChargeWithBias, totalGatedChargeWithBias / totalGatedCharge, totalGatedChargeWithBiasAtNominal, totalGatedChargeWithBiasAtNominal / totalGatedCharge, totalGatedChargeWithBiasAt1pt5, totalGatedChargeWithBiasAt1pt5 / totalGatedCharge);
+//                System.out.format("%d\t%.0f\t%.0f\t%f\t%.0f\t%f\t%.0f\t%f\t%.0f\t%.0f\t%f\t%.0f\t%f\t%.0f\t%f\n", run.getRun(), totalCharge, totalChargeWithBias, totalChargeWithBias / totalCharge, totalChargeWithBiasAtNominal, totalChargeWithBiasAtNominal / totalCharge, totalChargeWithBiasAt1pt5, totalChargeWithBiasAt1pt5 / totalCharge, totalGatedCharge, totalGatedChargeWithBias, totalGatedChargeWithBias / totalGatedCharge, totalGatedChargeWithBiasAtNominal, totalGatedChargeWithBiasAtNominal / totalGatedCharge, totalGatedChargeWithBiasAt1pt5, totalGatedChargeWithBiasAt1pt5 / totalGatedCharge);
+//            }
+//            br.close();
+//
+//        } catch (IOException e) {
+//            throw new RuntimeException(e);
+//        } catch (java.text.ParseException e) {
+//            throw new RuntimeException(e);
+//        }
+//    }
 }

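The commented-out `readBeamData` above integrates beam charge between consecutive MYA samples: the delivered charge over an interval is elapsed time times beam current, and the "gated" charge additionally weights by the DAQ livetime fraction (clamped to 0-100%). The core bookkeeping, extracted into a small stand-alone sketch (`ChargeIntegrator` is invented for illustration):

```java
// Sketch of the charge bookkeeping from the (now commented-out) readBeamData:
// dq = dt * current between samples, and the gated charge weights dq by the
// DAQ livetime fraction, clamped to [0, 1] as in the original.
public class ChargeIntegrator {

    private double totalCharge;      // nC
    private double totalGatedCharge; // nC

    /**
     * Accumulate one sampling interval.
     *
     * @param dtSeconds elapsed time since the previous sample, in seconds
     * @param currentNanoAmps beam current, in nA
     * @param livetimePercent DAQ livetime in percent, 0-100 (clamped)
     */
    public void addSample(double dtSeconds, double currentNanoAmps, double livetimePercent) {
        double livetime = Math.min(100.0, Math.max(0.0, livetimePercent)) / 100.0;
        double dq = dtSeconds * currentNanoAmps; // nA * s = nC
        totalCharge += dq;
        totalGatedCharge += dq * livetime;       // nC
    }

    public double totalCharge() { return totalCharge; }

    public double totalGatedCharge() { return totalGatedCharge; }
}
```

For example, a 10 s interval at 50 nA with 80% livetime contributes 500 nC of total charge and 400 nC of gated charge.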
Modified: java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/svt/SvtBiasMyaDataReader.java
 =============================================================================
--- java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/svt/SvtBiasMyaDataReader.java	(original)
+++ java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/svt/SvtBiasMyaDataReader.java	Tue Dec  1 15:55:47 2015
@@ -141,10 +141,6 @@
 
             records = parser.getRecords();
 
-            // Remove first two rows of headers.
-            records.remove(0);
-            records.remove(0);
-
             parser.close();
         } catch (FileNotFoundException ex) {
             Logger.getLogger(SvtBiasMyaDataReader.class.getName()).log(Level.SEVERE, null, ex);

Modified: java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java
 =============================================================================
--- java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java	(original)
+++ java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java	Tue Dec  1 15:55:47 2015
@@ -11,6 +11,7 @@
 import java.util.Set;
 
 import org.hps.conditions.database.ConnectionParameters;
+import org.hps.datacat.client.DatacatConstants;
 import org.hps.datacat.client.DatasetFileFormat;
 import org.hps.datacat.client.DatasetSite;
 
@@ -41,20 +42,18 @@
 
     /**
      * The name of the folder in the data catalog for inserting data (under "/HPS" root folder).
-     * <p>
-     * Default provided for Eng Run 2015 data.
      */
     private String datacatFolder = null;
 
     /**
-     * Set whether extraction of metadata from files is enabled.
+     * Set whether extraction of metadata is enabled.
      */
     private boolean enableMetadata;
 
     /**
-     * Set of file formats for filtering files.
-     */
-    Set<DatasetFileFormat> formats = new HashSet<DatasetFileFormat>();
+     * Set of accepted file formats.
+     */
+    private Set<DatasetFileFormat> formats = new HashSet<DatasetFileFormat>();
 
     /**
      * The maximum depth to crawl.
@@ -69,7 +68,7 @@
     /**
      * The dataset site for the datacat.
      */
-    private DatasetSite site;
+    private DatasetSite site = DatasetSite.JLAB;
 
     /**
      * A timestamp to use for filtering input files on their creation date.
@@ -85,6 +84,21 @@
      * Dry run for not actually executing updates.
      */
     private boolean dryRun = false;
+    
+    /**
+     * Base URL of datacat client.
+     */
+    private String baseUrl = DatacatConstants.BASE_URL;
+    
+    /**
+     * Root folder in the datacat (e.g. 'HPS').
+     */
+    private String rootFolder = DatacatConstants.ROOT_FOLDER;
+    
+    /**
+     * Set of paths used for filtering files (file's path must match one of these).
+     */
+    private Set<String> paths = new HashSet<String>();
 
     /**
      * Get the set of runs that will be accepted for the job.
@@ -329,4 +343,28 @@
     boolean dryRun() {
         return this.dryRun;
     }
+    
+    void setBaseUrl(String baseUrl) {
+        this.baseUrl = baseUrl;        
+    }
+    
+    String baseUrl() {
+        return this.baseUrl;
+    }
+    
+    void setRootFolder(String rootFolder) {
+        this.rootFolder = rootFolder;
+    }
+    
+    String rootFolder() {
+        return this.rootFolder;
+    }    
+    
+    void addPath(String path) {
+        this.paths.add(path);
+    }
+    
+    Set<String> paths() {
+        return this.paths;
+    }
 }

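The `CrawlerConfig` hunks above add defaulted fields (`baseUrl`, `rootFolder`, `paths`) with package-private, `get`-less accessors — the same terse style used by the class's existing `dryRun()` method. A condensed sketch of that accessor pattern; `CrawlerSettings` and its default values are placeholders, not the real `CrawlerConfig` or `DatacatConstants`:

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of the accessor style added to CrawlerConfig: package-private,
// record-like getters without a "get" prefix, and defaults baked into the
// fields. Default values here are placeholders, not DatacatConstants.
public class CrawlerSettings {

    private String baseUrl = "http://example.org/datacat"; // placeholder default
    private String rootFolder = "HPS";
    private final Set<String> paths = new HashSet<String>();

    void setBaseUrl(String baseUrl) { this.baseUrl = baseUrl; }
    String baseUrl() { return this.baseUrl; }

    void setRootFolder(String rootFolder) { this.rootFolder = rootFolder; }
    String rootFolder() { return this.rootFolder; }

    void addPath(String path) { this.paths.add(path); }
    Set<String> paths() { return this.paths; }
}
```

Keeping the setters package-private restricts configuration to the crawler package itself, while callers elsewhere can only read the resolved values.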
Modified: java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerFileUtilities.java
 =============================================================================
--- java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerFileUtilities.java	(original)
+++ java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerFileUtilities.java	Tue Dec  1 15:55:47 2015
@@ -3,7 +3,7 @@
 import java.io.File;
 
 /**
- * File utilities for crawler.
+ * File utilities for the datacat crawler.
  *
  * @author Jeremy McCormick, SLAC
  */
@@ -19,4 +19,45 @@
         final String name = file.getName();
         return Integer.parseInt(name.substring(4, 10));
     }
+    
+    /**
+     * Get a cached file path, assuming that the input file path is on the JLAB MSS e.g. it starts with "/mss".
+     * If the file is not on the JLAB MSS an error will be thrown.
+     * <p>
+     * If the file is already on the cache disk just return the same file.
+     *
+     * @param mssFile the MSS file path
+     * @return the cached file path (prepends "/cache" to the path)
+     * @throws IllegalArgumentException if the file is not on the MSS (e.g. path does not start with "/mss")
+     */
+    public static File getCachedFile(final File mssFile) {
+        if (!isMssFile(mssFile)) {
+            throw new IllegalArgumentException("File " + mssFile.getPath() + " is not on the JLab MSS.");
+        }
+        File cacheFile = mssFile;
+        if (!isCachedFile(mssFile)) {
+            cacheFile = new File("/cache" + mssFile.getAbsolutePath());
+        }
+        return cacheFile;
+    }
+    
+    /**
+     * Return <code>true</code> if this is a file on the cache disk e.g. the path starts with "/cache".
+     *
+     * @param file the file
+     * @return <code>true</code> if the file is a cached file
+     */
+    public static boolean isCachedFile(final File file) {
+        return file.getPath().startsWith("/cache");
+    }
+    
+    /**
+     * Return <code>true</code> if this file is on the JLab MSS, i.e. its path starts with "/mss".
+     *
+     * @param file the file
+     * @return <code>true</code> if the file is on the MSS
+     */
+    public static boolean isMssFile(final File file) {
+        return file.getPath().startsWith("/mss");
+    }
 }
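
The MSS-to-cache path mapping added above can be sketched standalone. The class below mirrors the three new CrawlerFileUtilities methods, assuming JLab's convention of mirroring "/mss" paths under "/cache"; the class name and example path are hypothetical.

```java
import java.io.File;

public class CachePathSketch {

    // Mirrors CrawlerFileUtilities.isMssFile(): true if the path is on the JLab MSS.
    static boolean isMssFile(File file) {
        return file.getPath().startsWith("/mss");
    }

    // Mirrors CrawlerFileUtilities.isCachedFile(): true if the path is on the cache disk.
    static boolean isCachedFile(File file) {
        return file.getPath().startsWith("/cache");
    }

    // Mirrors CrawlerFileUtilities.getCachedFile(): prepend "/cache" to an MSS path.
    static File getCachedFile(File mssFile) {
        if (!isMssFile(mssFile)) {
            throw new IllegalArgumentException("File " + mssFile.getPath() + " is not on the JLab MSS.");
        }
        return isCachedFile(mssFile) ? mssFile : new File("/cache" + mssFile.getAbsolutePath());
    }

    public static void main(String[] args) {
        // Hypothetical raw data path; the real MSS layout may differ.
        File mss = new File("/mss/hallb/hps/data/hps_005772.evio.0");
        System.out.println(getCachedFile(mss).getPath());
        // -> /cache/mss/hallb/hps/data/hps_005772.evio.0
    }
}
```

This keeps the crawler reading from the cache disk instead of forcing tape staging for every registered dataset.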

Added: java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerFileVisitor.java
 =============================================================================
--- java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerFileVisitor.java	(added)
+++ java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerFileVisitor.java	Tue Dec  1 15:55:47 2015
@@ -0,0 +1,84 @@
+package org.hps.crawler;
+
+import java.io.File;
+import java.io.FileFilter;
+import java.nio.file.FileVisitResult;
+import java.nio.file.Path;
+import java.nio.file.SimpleFileVisitor;
+import java.nio.file.attribute.BasicFileAttributes;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.hps.datacat.client.DatasetFileFormat;
+
+/**
+ * Visitor which creates a {@link FileSet} from walking a directory tree.
+ * <p>
+ * Any number of {@link java.io.FileFilter} objects can be registered with this visitor to restrict which files are
+ * accepted.
+ *
+ * @author Jeremy McCormick, SLAC
+ */
+final class CrawlerFileVisitor extends SimpleFileVisitor<Path> {
+
+    /**
+     * The set of files accepted while walking the directory tree.
+     */
+    private final FileSet fileSet = new FileSet();
+
+    /**
+     * A list of file filters to apply.
+     */
+    private final List<FileFilter> filters = new ArrayList<FileFilter>();
+
+    /**
+     * Run the filters on the file to determine whether it should be accepted.
+     *
+     * @param file the file to check
+     * @return <code>true</code> if the file should be accepted
+     */
+    private boolean accept(final File file) {
+        boolean accept = true;
+        for (final FileFilter filter : this.filters) {
+            accept = filter.accept(file);
+            if (!accept) {
+                break;
+            }
+        }
+        return accept;
+    }
+
+    /**
+     * Add a file filter.
+     *
+     * @param filter the file filter
+     */
+    void addFilter(final FileFilter filter) {
+        this.filters.add(filter);
+    }
+
+    /**
+     * Get the file set created by visiting the directory tree.
+     *
+     * @return the file set from visiting the directory tree
+     */
+    FileSet getFileSet() {
+        return this.fileSet;
+    }
+
+    /**
+     * Visit a single file.
+     *
+     * @param path the file to visit
+     * @param attrs the file attributes
+     */
+    @Override
+    public FileVisitResult visitFile(final Path path, final BasicFileAttributes attrs) {
+        final File file = path.toFile();
+        if (this.accept(file)) {
+            final DatasetFileFormat format = DatacatUtilities.getFileFormat(file);
+            fileSet.addFile(format, file);
+        }
+        return FileVisitResult.CONTINUE;
+    }
+}
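
The visitor-plus-filter pattern in the new class can be exercised in isolation. The sketch below mirrors CrawlerFileVisitor's filter chain (a file is kept only if every registered filter accepts it) but collects plain file names instead of a FileSet; the temp-directory setup and the name-based filter are illustrative stand-ins for the real format filters.

```java
import java.io.File;
import java.io.FileFilter;
import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.ArrayList;
import java.util.List;

public class FilterVisitorSketch extends SimpleFileVisitor<Path> {

    private final List<FileFilter> filters = new ArrayList<FileFilter>();
    private final List<String> accepted = new ArrayList<String>();

    // Mirrors CrawlerFileVisitor.addFilter(): register another filter in the chain.
    void addFilter(FileFilter filter) {
        filters.add(filter);
    }

    List<String> accepted() {
        return accepted;
    }

    // Mirrors CrawlerFileVisitor.accept(): a file is kept only if no filter rejects it.
    private boolean accept(File file) {
        for (FileFilter filter : filters) {
            if (!filter.accept(file)) {
                return false;
            }
        }
        return true;
    }

    @Override
    public FileVisitResult visitFile(Path path, BasicFileAttributes attrs) {
        if (accept(path.toFile())) {
            accepted.add(path.toFile().getName());
        }
        return FileVisitResult.CONTINUE;
    }

    public static void main(String[] args) throws IOException {
        // Illustrative setup: a throwaway directory with one EVIO-named file.
        Path root = Files.createTempDirectory("crawl");
        Files.createFile(root.resolve("hps_005772.evio.0"));
        Files.createFile(root.resolve("notes.txt"));

        FilterVisitorSketch visitor = new FilterVisitorSketch();
        // Stand-in for FileFormatFilter: accept EVIO files by name.
        visitor.addFilter(f -> f.getName().contains(".evio"));
        Files.walkFileTree(root, visitor);

        System.out.println(visitor.accepted());
    }
}
```

Pulling the visitor out of DatacatCrawler into its own top-level class, as this commit does, makes exactly this kind of isolated exercise possible.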

Modified: java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java
 =============================================================================
--- java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java	(original)
+++ java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java	Tue Dec  1 15:55:47 2015
@@ -1,15 +1,10 @@
 package org.hps.crawler;
 
 import java.io.File;
-import java.io.FileFilter;
 import java.io.IOException;
 import java.nio.file.FileVisitOption;
-import java.nio.file.FileVisitResult;
 import java.nio.file.Files;
-import java.nio.file.Path;
-import java.nio.file.SimpleFileVisitor;
 import java.nio.file.attribute.BasicFileAttributes;
-import java.util.ArrayList;
 import java.util.Date;
 import java.util.EnumSet;
 import java.util.HashMap;
@@ -21,10 +16,10 @@
 import java.util.logging.Logger;
 
 import org.apache.commons.cli.CommandLine;
+import org.apache.commons.cli.DefaultParser;
 import org.apache.commons.cli.HelpFormatter;
 import org.apache.commons.cli.Options;
 import org.apache.commons.cli.ParseException;
-import org.apache.commons.cli.DefaultParser;
 import org.hps.datacat.client.DatacatClient;
 import org.hps.datacat.client.DatacatClientFactory;
 import org.hps.datacat.client.DatasetFileFormat;
@@ -35,81 +30,10 @@
  *
  * @author Jeremy McCormick, SLAC
  */
+// TODO: add support for patching metadata if resource exists
 public final class DatacatCrawler {
 
     /**
-     * Visitor which creates a {@link FileSet} from walking a directory tree.
-     * <p>
-     * Any number of {@link java.io.FileFilter} objects can be registered with this visitor to restrict which files are
-     * accepted.
-     *
-     * @author Jeremy McCormick, SLAC
-     */
-    final class DatacatFileVisitor extends SimpleFileVisitor<Path> {
-
-        /**
-         * The run log containing information about files from each run.
-         */
-        private final FileSet fileSet = new FileSet();
-
-        /**
-         * A list of file filters to apply.
-         */
-        private final List<FileFilter> filters = new ArrayList<FileFilter>();
-
-        /**
-         * Run the filters on the file to tell whether it should be accepted or not.
-         *
-         * @param file the EVIO file
-         * @return <code>true</code> if file should be accepted
-         */
-        private boolean accept(final File file) {
-            boolean accept = true;
-            for (final FileFilter filter : this.filters) {
-                accept = filter.accept(file);
-                if (!accept) {
-                    break;
-                }
-            }
-            return accept;
-        }
-
-        /**
-         * Add a file filter.
-         *
-         * @param filter the file filter
-         */
-        void addFilter(final FileFilter filter) {
-            this.filters.add(filter);
-        }
-
-        /**
-         * Get the file set created by visiting the directory tree.
-         *
-         * @return the file set from visiting the directory tree
-         */
-        FileSet getFileSet() {
-            return this.fileSet;
-        }
-
-        /**
-         * Visit a single file.
-         *
-         * @param path the file to visit
-         * @param attrs the file attributes
-         */
-        @Override
-        public FileVisitResult visitFile(final Path path, final BasicFileAttributes attrs) {
-            final File file = path.toFile();
-            if (this.accept(file)) {
-                final DatasetFileFormat format = DatacatUtilities.getFileFormat(file);
-                fileSet.addFile(format, file);
-            }
-            return FileVisitResult.CONTINUE;
-        }
-    }
-
-    /**
      * Make a list of available file formats for printing help.
      */
     private static String AVAILABLE_FORMATS;
@@ -118,7 +42,7 @@
      * Setup the logger.
      */
     private static final Logger LOGGER = Logger.getLogger(DatacatCrawler.class.getPackage().getName());
-
+    
     /**
      * Command line options for the crawler.
      */
@@ -148,6 +72,7 @@
         OPTIONS.addOption("t", "timestamp-file", true, "existing or new timestamp file name");
         OPTIONS.addOption("x", "max-depth", true, "max depth to crawl");
         OPTIONS.addOption("D", "dry-run", false, "dry run which will not update the datacat");
+        OPTIONS.addOption("u", "base-url", true, "provide a base URL of the datacat server");
     }
 
     /**
@@ -168,6 +93,11 @@
      * The options parser.
      */
     private final DefaultParser parser = new DefaultParser();
+    
+    /**
+     * The data catalog client interface.
+     */
+    private DatacatClient datacatClient;
 
     /**
      * Throw an exception if the path doesn't exist in the data catalog or it is not a folder.
@@ -176,7 +106,6 @@
      * @throws RuntimeException if the given path does not exist or it is not a folder
      */
     private void checkFolder(final String folder) {
-        final DatacatClient datacatClient = new DatacatClientFactory().createClient();
         if (!datacatClient.exists(folder)) {
             throw new RuntimeException("The folder " + folder + " does not exist in the data catalog.");
         }
@@ -328,13 +257,21 @@
             if (cl.hasOption("D")) {
                 config.setDryRun(true);
             }
+            
+            if (cl.hasOption("u")) {
+                config.setBaseUrl(cl.getOptionValue("u"));
+            }
+            
+            for (final String arg : cl.getArgList()) {
+                config.addPath(arg);
+            }
 
         } catch (final ParseException e) {
             throw new RuntimeException("Error parsing options.", e);
         }
-
-        // Check the datacat folder which must already exist.
-        this.checkFolder(config.datacatFolder());
 
         // Check that there is at least one file format enabled for filtering.
         if (this.config.getFileFormats().isEmpty()) {
@@ -351,7 +288,7 @@
      */
     private void printUsage() {
         final HelpFormatter help = new HelpFormatter();
-        help.printHelp(70, "DatacatCrawler [options]", "", OPTIONS, "");
+        help.printHelp(70, "DatacatCrawler [options] path ...", "", OPTIONS, "");
         System.exit(0);
     }
 
@@ -359,19 +296,33 @@
      * Run the crawler job.
      */
     private void run() {
+        
+        LOGGER.config("creating datacat client with url = " + config.baseUrl() + "; site = " + config.datasetSite() + "; rootFolder = " + config.rootFolder());
+        datacatClient = new DatacatClientFactory().createClient(config.baseUrl(), config.datasetSite(), config.rootFolder()); 
+        
+        // Check the datacat folder which must already exist.
+        this.checkFolder(config.datacatFolder());
 
         // Create the file visitor for crawling the root directory with the given date filter.
-        final DatacatFileVisitor visitor = new DatacatFileVisitor();
+        final CrawlerFileVisitor visitor = new CrawlerFileVisitor();
 
         // Add date filter if timestamp is supplied.
         if (config.timestamp() != null) {
             visitor.addFilter(new DateFileFilter(config.timestamp()));
+            LOGGER.config("added timestamp filter " + config.timestamp());
+        }
+        
+        if (!config.paths().isEmpty()) {
+            visitor.addFilter(new PathFilter(config.paths()));
+            StringBuffer sb = new StringBuffer();
+            for (String path : config.paths()) {
+                sb.append(path + ":");
+            }
+            sb.setLength(sb.length() - 1);
+            LOGGER.config("added paths " + sb.toString());
         }
 
         // Add file format filter.
-        for (final DatasetFileFormat fileFormat : config.getFileFormats()) {
-            LOGGER.info("adding file format filter for " + fileFormat.name());
-        }
         visitor.addFilter(new FileFormatFilter(config.getFileFormats()));
 
         // Run number filter.
@@ -379,11 +330,17 @@
             visitor.addFilter(new RunFilter(config.acceptRuns()));
         }
 
-        // Walk the file tree using the visitor.
+        // Walk the file tree using the visitor with the enabled filters.
         this.walk(visitor);
+        
+        LOGGER.info(visitor.getFileSet().toString());
 
         // Update the data catalog.
-        this.updateDatacat(visitor.getFileSet());
+        if (!visitor.getFileSet().isEmpty()) {
+            this.updateDatacat(visitor.getFileSet());
+        } else {
+            LOGGER.warning("no files found");
+        }
     }
 
     /**
@@ -392,7 +349,6 @@
      * @param runMap the map of run information including the EVIO file list
      */
     private void updateDatacat(final FileSet fileSet) {
-        final DatacatClient datacatClient = new DatacatClientFactory().createClient();
         for (final DatasetFileFormat fileFormat : config.getFileFormats()) {
             List<File> formatFiles = fileSet.get(fileFormat);
             LOGGER.info("adding " + formatFiles.size() + " files with format " + fileFormat.name());
@@ -400,28 +356,50 @@
 
                 LOGGER.info("adding file " + file.getAbsolutePath());
 
-                // Create metadata if this is enabled (will take awhile).
                 Map<String, Object> metadata = new HashMap<String, Object>();
+
+                // Use file on JLAB cache disk if necessary.
+                File actualFile = file;
+                if (CrawlerFileUtilities.isMssFile(file)) {
+                    actualFile = CrawlerFileUtilities.getCachedFile(file);
+                    LOGGER.info("using cached file " + actualFile.getPath());
+                }
+                
                 if (config.enableMetaData()) {
-                    LOGGER.info("creating metadata for " + file.getPath());
-                    metadata = DatacatUtilities.createMetadata(file);
+                    // Create metadata map for file.
+                    LOGGER.info("creating metadata for " + actualFile.getPath());
+                    metadata = DatacatUtilities.createMetadata(actualFile);
+                    metadata.put("scanStatus", "OK");
+                } else {
+                    // Assign run number even if metadata is not enabled.
+                    metadata = new HashMap<String, Object>();
+                    int run = CrawlerFileUtilities.getRunFromFileName(file);
+                    metadata.put("runMin", run);
+                    metadata.put("runMax", run);
+                    metadata.put("scanStatus", "UNSCANNED");
                 }
 
                 // Register file in the catalog.
                 if (!config.dryRun()) {
-                    int response = DatacatUtilities.addFile(datacatClient, config.datacatFolder(), file, config.datasetSite(), metadata);
+                    int response = DatacatUtilities.addFile(
+                            datacatClient, 
+                            config.datacatFolder(),
+                            file,  
+                            actualFile.length(),
+                            config.datasetSite(), 
+                            metadata);
                     LOGGER.info("HTTP response " + response);
                     if (response >= 400) {
                         // Throw exception if response from server indicates an error occurred.
-                        throw new RuntimeException("HTTP error code " + response + " received from server.");
+                        throw new RuntimeException("HTTP error code " + response + " was received from server.");
                     }
                 } else {
-                    LOGGER.info("skipped updated on " + file.getPath() + " from dry run");
-                }
-            }
-            LOGGER.info("successfully added " + formatFiles.size() + " " + fileFormat + " files");
-        }
-        LOGGER.info("done updating datacat");
+                    LOGGER.info("Skipped update on " + file.getPath() + " because dry run is enabled.");
+                }
+            }
+            LOGGER.info("Successfully added " + formatFiles.size() + " " + fileFormat + " files to data catalog.");
+        }
+        LOGGER.info("Done updating data catalog.");
     }
        
     /**
@@ -429,7 +407,7 @@
      *
      * @param visitor the file visitor
      */
-    private void walk(final DatacatFileVisitor visitor) {
+    private void walk(final CrawlerFileVisitor visitor) {
         try {
             // Walk the file tree from the root directory.
             final EnumSet<FileVisitOption> options = EnumSet.noneOf(FileVisitOption.class);

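
The new metadata-disabled branch in updateDatacat() can be sketched on its own: when full metadata extraction is off, only the run range (parsed from the file name) and a scan status are recorded. The class below mirrors that branch together with CrawlerFileUtilities.getRunFromFileName(); the class name and example file name are hypothetical, and the six-digit run number at character offsets 4-9 is the naming convention the diff assumes.

```java
import java.util.HashMap;
import java.util.Map;

public class MinimalMetadataSketch {

    // Mirrors CrawlerFileUtilities.getRunFromFileName(): the run number is
    // assumed to sit at character offsets 4-9, e.g. "hps_005772.evio.0".
    static int getRunFromFileName(String fileName) {
        return Integer.parseInt(fileName.substring(4, 10));
    }

    // Mirrors the metadata-disabled branch of updateDatacat(): only the run
    // range and a scan status are attached to the dataset, leaving it marked
    // for a later full scan.
    static Map<String, Object> minimalMetadata(String fileName) {
        Map<String, Object> metadata = new HashMap<String, Object>();
        int run = getRunFromFileName(fileName);
        metadata.put("runMin", run);
        metadata.put("runMax", run);
        metadata.put("scanStatus", "UNSCANNED");
        return metadata;
    }

    public static void main(String[] args) {
        System.out.println(minimalMetadata("hps_005772.evio.0").get("runMin")); // -> 5772
    }
}
```

The "scanStatus" key lets a follow-up job find UNSCANNED datasets and patch in the full metadata later, which matches the TODO added at the top of DatacatCrawler.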
Modified: java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatUtilities.java
 =============================================================================
--- java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatUtilities.java	(original)
+++ java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatUtilities.java	Tue Dec  1 15:55:47 2015
@@ -31,21 +31,6 @@
     /**
      * Add a file to the data catalog.
      *
-     * @param datacatClient the data catalog client
-     * @param folder the target folder in the data catalog
-     * @param file the file with the full path
-     * @param metadata the file's meta data
-     */
-    static int addFile(final DatacatClient datacatClient, final String folder, final File file,
-            DatasetSite site, final Map<String, Object> metadata) {
-        final DatasetFileFormat fileFormat = DatacatUtilities.getFileFormat(file);
-        final DatasetDataType dataType = DatacatUtilities.getDataType(file);
-        return DatacatUtilities.addFile(datacatClient, folder, file, metadata, fileFormat, dataType, site);
-    }
-
-    /**
-     * Add a file to the data catalog.
-     *
      * @param client the data catalog client
      * @param folder the folder name e.g. "data/raw"
      * @param fileMetadata the file's meta data including the path
@@ -53,21 +38,15 @@
      * @param dataType the file's data type (RAW, RECON, etc.)
      * @return the HTTP response code
      */
-    static int addFile(final DatacatClient client, final String folder, final File file,
-            final Map<String, Object> metadata, final DatasetFileFormat fileFormat, final DatasetDataType dataType,
-            final DatasetSite site) {
+    static int addFile(final DatacatClient client, final String folder, final File file, long fileLength,
+            final DatasetSite site, final Map<String, Object> metadata) {
         
-        // Get the cache file if this file is on JLAB MSS.
-        File actualFile = file;
-        if (EvioFileUtilities.isMssFile(file)) {
-            actualFile = EvioFileUtilities.getCachedFile(file);
-        }
-
+        // Get the dataset format and type.
+        final DatasetFileFormat fileFormat = DatacatUtilities.getFileFormat(file);
+        final DatasetDataType dataType = DatacatUtilities.getDataType(file);
+        
         // Add the dataset to the data catalog using the REST API.
-        final int response = client.addDataset(folder, dataType, file.getAbsolutePath(), actualFile.length(), site, fileFormat, 
-                file.getName(), metadata);
-
-        return response;
+        return client.addDataset(folder, dataType, file.getAbsolutePath(), fileLength, site, fileFormat, file.getName(), metadata);
     }
 
     /**
@@ -149,7 +128,7 @@
     static FileMetadataReader getFileMetaDataReader(final DatasetFileFormat fileFormat, final DatasetDataType dataType) {
         FileMetadataReader reader = null;
         if (fileFormat.equals(DatasetFileFormat.LCIO)) {
-            reader = new LcioMetadataReader();
+            reader = new LcioReconMetadataReader();
         } else if (fileFormat.equals(DatasetFileFormat.EVIO)) {
             reader = new EvioMetadataReader();
         } else if (fileFormat.equals(DatasetFileFormat.ROOT) && dataType.equals(DatasetDataType.DST)) {

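
The reader dispatch in getFileMetaDataReader() picks a metadata reader from the file format, with ROOT files split further on data type. A simplified stand-in for that dispatch is below; the enums and the ROOT/DST reader name are hypothetical simplifications (the real classes live in org.hps.datacat.client and org.hps.crawler, and the ROOT branch is truncated in this diff).

```java
public class ReaderDispatchSketch {

    // Simplified stand-ins for the datacat DatasetFileFormat and
    // DatasetDataType enums.
    enum Format { LCIO, EVIO, ROOT }
    enum DataType { RAW, RECON, DST }

    // Mirrors the shape of DatacatUtilities.getFileMetaDataReader(): dispatch
    // on format first, then on data type for ROOT; unknown combinations yield
    // null. Reader class names here are illustrative only.
    static String readerFor(Format format, DataType dataType) {
        if (format == Format.LCIO) {
            return "LcioReconMetadataReader";
        } else if (format == Format.EVIO) {
            return "EvioMetadataReader";
        } else if (format == Format.ROOT && dataType == DataType.DST) {
            return "DstRootMetadataReader"; // hypothetical name; truncated in the diff
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(readerFor(Format.EVIO, DataType.RAW));
    }
}
```

Swapping LcioMetadataReader for LcioReconMetadataReader in this commit only changes one branch of the dispatch; the structure stays the same.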
Modified: java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java
 =============================================================================
--- java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java	(original)
+++ java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java	Tue Dec  1 15:55:47 2015
@@ -4,32 +4,28 @@
 import java.io.IOException;
 import java.math.RoundingMode;
 import java.text.DecimalFormat;
-import java.util.ArrayList;
-import java.util.HashMap;
 import java.util.LinkedHashMap;
-import java.util.List;
 import java.util.Map;
 import java.util.Map.Entry;
 import java.util.Set;
 import java.util.logging.Level;
 import java.util.logging.Logger;
 
-import org.hps.record.epics.EpicsData;
-import org.hps.record.epics.EpicsEvioProcessor;
-import org.hps.record.evio.EventTagConstant;
-import org.hps.record.evio.EventTagMask;
 import org.hps.record.evio.EvioEventUtilities;
 import org.hps.record.evio.EvioFileUtilities;
-import org.hps.record.scalers.ScalerData;
-import org.hps.record.scalers.ScalersEvioProcessor;
+import org.hps.record.triggerbank.AbstractIntData.IntBankDefinition;
+import org.hps.record.triggerbank.HeadBankData;
+import org.hps.record.triggerbank.TIData;
+import org.hps.record.triggerbank.TiTimeOffsetEvioProcessor;
 import org.hps.record.triggerbank.TriggerType;
+import org.jlab.coda.jevio.BaseStructure;
 import org.jlab.coda.jevio.EvioEvent;
 import org.jlab.coda.jevio.EvioException;
 import org.jlab.coda.jevio.EvioReader;
 
 /**
- * Reads metadata from EVIO files, including the event count, run min and run max expected by the datacat,
- * as well as many custom field values applicable to HPS EVIO raw data.
+ * Reads metadata from EVIO files, including the event count, run min and run max expected by the datacat, as well as
+ * many custom field values applicable to HPS EVIO raw data.
  * <p>
  * The size of the data file is set externally to this reader using the datacat client.
  * 
@@ -38,245 +34,172 @@
 final class EvioMetadataReader implements FileMetadataReader {
 
     /**
-     * Initialize the logger.
+     * Initialize the package logger.
      */
     private static Logger LOGGER = Logger.getLogger(EvioMetadataReader.class.getPackage().getName());
-    
+
+    /**
+     * Head bank definition.
+     */
+    private static IntBankDefinition HEAD_BANK = new IntBankDefinition(HeadBankData.class, new int[] {0x2e, 0xe10f});
+
+    /**
+     * TI data bank definition.
+     */
+    private static IntBankDefinition TI_BANK = new IntBankDefinition(TIData.class, new int[] {0x2e, 0xe10a});
+
     /**
      * Get the EVIO file metadata.
-     *      
+     * 
      * @param file the EVIO file
      * @return the metadata map of key and value pairs
      */
     @Override
     public Map<String, Object> getMetadata(final File file) throws IOException {
 
-        Integer firstTimestamp = null;
-        Integer lastTimestamp = null;
-        Integer firstPhysicsTimestamp = null;
-        Integer lastPhysicsTimestamp = null;
-        int totalEvents = 0;
+        int events = 0;
         int badEvents = 0;
-        int epicsEvents = 0;
-        int scalerEvents = 0;
-        int physicsEvents = 0;
-        int syncEvents = 0;
-        int pauseEvents = 0;
-        int prestartEvents = 0;
-        int endEvents = 0;
-        int goEvents = 0;        
         boolean blinded = true;
         Integer run = null;
+        Integer firstHeadTimestamp = null;
+        Integer lastHeadTimestamp = null;
         Integer lastPhysicsEvent = null;
         Integer firstPhysicsEvent = null;
         double triggerRate = 0;
-        List<ScalerData> scalerData = new ArrayList<ScalerData>();
-        List<EpicsData> epicsData = new ArrayList<EpicsData>();
-                               
-        // Create map for counting event masks.        
-        Map<TriggerType, Integer> eventCounts = new HashMap<TriggerType, Integer>();
-        for (TriggerType mask : TriggerType.values()) {
-            eventCounts.put(mask, 0);
-        }
-                
-        // Scaler processor to check for scaler bank.
-        ScalersEvioProcessor scalersProcessor = new ScalersEvioProcessor();
-        
-        // EPICS data processor.
-        EpicsEvioProcessor epicsProcessor = new EpicsEvioProcessor(); 
+        long lastTI = 0;
+        long minTIDelta = 0;
+        long maxTIDelta = 0;
+        long firstTI = 0;
+        
+        TiTimeOffsetEvioProcessor tiProcessor = new TiTimeOffsetEvioProcessor();
+
+        // Create map for counting trigger types.
+        Map<TriggerType, Integer> triggerCounts = new LinkedHashMap<TriggerType, Integer>();
+        for (TriggerType triggerType : TriggerType.values()) {
+            triggerCounts.put(triggerType, 0);
+        }
 
         // Get the file number from the name.
         final int fileNumber = EvioFileUtilities.getSequenceFromName(file);
-        
-        // Only files divisible by 10 are unblinded (Eng Run 2015 scheme).
+
+        // Files with sequence number divisible by 10 are unblinded (Eng Run 2015 scheme).
         if (fileNumber % 10 == 0) {
             blinded = false;
         }
-        
+
         EvioReader evioReader = null;
-        try {                        
+        try {
             // Open file in sequential mode.
-            evioReader = EvioFileUtilities.open(file, true);            
+            evioReader = EvioFileUtilities.open(file, true);
             EvioEvent evioEvent = null;
 
             // Event read loop.
-            while (true) {
-                
-                // Read in an EVIO event, trapping exceptions in case a parse error occurs.
-                boolean badEvent = false;
+            fileLoop: while (true) {
                 try {
                     // Parse next event.
-                    evioEvent = evioReader.parseNextEvent();                    
-                } catch (IOException | EvioException e) {
-                    // Trap event parsing errors from bad EVIO data.
-                    badEvent = true;
-                    badEvents++;
-                    LOGGER.warning("bad EVIO event " + evioEvent.getEventNumber() + " could not be parsed");
-                } finally {
+                    evioEvent = evioReader.parseNextEvent();
+
                     // End of file.
-                    if (!badEvent && evioEvent == null) {
-                        LOGGER.info("EOF after " + totalEvents + " events");
-                        break;
+                    if (evioEvent == null) {
+                        LOGGER.info("EOF after " + events + " events");
+                        break fileLoop;
                     }
                     
-                    // Increment event count.
-                    totalEvents++;
-                }
-                                
-                // Continue to next event if a parse error occurred.
-                if (badEvent) {
-                    continue;
-                }                
-                                
-                // Debug print event number and tag.
-                LOGGER.finest("parsed event " + evioEvent.getEventNumber() + " with tag 0x" + String.format("%08x", evioEvent.getHeader().getTag()));
-                                
-                // Process different event types.
-                if (EventTagConstant.PRESTART.matches(evioEvent)) {
-                                                            
-                    // File has PRESTART event.
-                    LOGGER.fine("found PRESTART event " + evioEvent.getEventNumber());
-                    ++prestartEvents;
-                    
-                    // Set the run number from the PRESTART event.
-                    final int[] controlEventData = EvioEventUtilities.getControlEventData(evioEvent);
-                    if (run == null) {
-                        run = controlEventData[1];
-                        LOGGER.fine("set run to " + run + " from PRESTART");
-                    }
-                    
-                    // Set the first timestamp from the PRESTART event.
-                    if (firstTimestamp == null) {
-                        firstTimestamp = controlEventData[0];
-                        LOGGER.fine("set first timestamp to " + firstTimestamp + " from PRESTART event " + evioEvent.getEventNumber());
-                    }
-                    
-                } else if (EventTagConstant.GO.matches(evioEvent)) {
-                    
-                    // File has GO event.
-                    goEvents++;
-                    
-                    // Set first timestamp from the GO event (will not override PRESTART time).
-                    if (firstTimestamp == null) {
-                        final int[] controlEventData = EvioEventUtilities.getControlEventData(evioEvent);
-                        firstTimestamp = controlEventData[0];
-                        LOGGER.fine("set first timestamp to " + firstTimestamp + " from GO event " + evioEvent.getEventNumber());
-                    }
-                                                           
-                } else if (EventTagConstant.END.matches(evioEvent)) {
-                    
-                    // File has END event.
-                    LOGGER.fine("got END event");
-                    endEvents++;
-                    
-                    // Set the last timestamp from the END event.
-                    final int[] controlEventData = EvioEventUtilities.getControlEventData(evioEvent);
-                    lastTimestamp = controlEventData[0];
-                    LOGGER.fine("set last timestamp " + lastTimestamp + " from END event " + evioEvent.getEventNumber());
-                    if (run == null) {
-                        run = controlEventData[1];
-                        LOGGER.fine("set run to " + run);
-                    }
-                    
-                } else if (EventTagConstant.PAUSE.matches(evioEvent)) {
-                    
-                    // Count pause events.
-                    pauseEvents++;
-                    
-                } else if (EvioEventUtilities.isPhysicsEvent(evioEvent)) {
-                    
-                    // Count physics events.
-                    physicsEvents++;
-                    
+                    ++events;
+
+                    // Debug print event number and tag.
+                    LOGGER.finest("parsed event " + evioEvent.getEventNumber() + " with tag 0x"
+                            + String.format("%08x", evioEvent.getHeader().getTag()));
+
                     // Get head bank.
-                    final int[] headBankData = EvioEventUtilities.getHeadBankData(evioEvent);
-                    
-                    // Is head bank present?
-                    if (headBankData != null) {
+                    BaseStructure headBank = HEAD_BANK.findBank(evioEvent);
+
+                    // Current timestamp.
+                    int thisTimestamp = 0;
+
+                    // Process head bank if not null.
+                    if (headBank != null) {
+                        final int[] headBankData = headBank.getIntData();
+                        thisTimestamp = headBankData[3];
+                        if (thisTimestamp != 0) {
+                            // First head bank timestamp.
+                            if (firstHeadTimestamp == null) {
+                                firstHeadTimestamp = thisTimestamp;
+                                LOGGER.info("first head timestamp " + firstHeadTimestamp + " from event "
+                                        + evioEvent.getEventNumber());
+                            }
+
+                            // Last head bank timestamp.
+                            lastHeadTimestamp = thisTimestamp;
+                        }
+
+                        // Run number.
+                        if (run == null && headBankData[1] != 0) {
+                            run = headBankData[1];
+                            LOGGER.info("run " + run + " from event " + evioEvent.getEventNumber());
+                        }
+                    }
+
+                    // Process trigger bank data for TI times (copied from Sho's BasicEvioFileReader class).
+                    BaseStructure tiBank = TI_BANK.findBank(evioEvent);
+                    if (tiBank != null) {
+                        TIData tiData = new TIData(tiBank.getIntData());
+                        if (lastTI == 0) {
+                            firstTI = tiData.getTime();
+                        }
+                        lastTI = tiData.getTime();
+                        if (thisTimestamp != 0) {
+                            long delta = thisTimestamp * 1000000000L - tiData.getTime();
+                            if (minTIDelta == 0 || minTIDelta > delta) {
+                                minTIDelta = delta;
+                            }
+                            if (maxTIDelta == 0 || maxTIDelta < delta) {
+                                maxTIDelta = delta;
+                            }
+                        }
+                    }
+
+                    if (EvioEventUtilities.isPhysicsEvent(evioEvent)) {
+
+                        final int[] eventIdData = EvioEventUtilities.getEventIdData(evioEvent);
                         
-                        // Is timestamp set?
-                        if (headBankData[3] != 0) {
-                            
-                            // Set first timestamp.
-                            if (firstTimestamp == null) {
-                                firstTimestamp = headBankData[3];
-                                LOGGER.fine("set first timestamp to " + firstTimestamp + " from physics event " + evioEvent.getEventNumber());                                
-                            }
-                            
-                            // Set first physics timestamp.
-                            if (firstPhysicsTimestamp == null) {                     
-                                firstPhysicsTimestamp = headBankData[3];
-                                LOGGER.fine("set first physics timestamp to " + firstTimestamp + " from event " + evioEvent.getEventNumber());                                
-                            }
-                            
-                            // Set last physics timestamp.
-                            lastPhysicsTimestamp = headBankData[3];
-                            LOGGER.finest("set last physics timestamp to " + firstTimestamp + " from event " + evioEvent.getEventNumber());
-                            
-                            // Set last timestamp.
-                            lastTimestamp = headBankData[3];
-                        }
+                        if (eventIdData != null) {
                         
-                        // Set run number.
-                        if (run == null) {
-                            run = headBankData[1];
-                            LOGGER.info("set run to " + run + " from physics event " + evioEvent.getEventNumber());
-                        }
-                    }                                                                
-                                                                                
-                    // Get the event ID data.
-                    final int[] eventIdData = EvioEventUtilities.getEventIdData(evioEvent);
-                    if (eventIdData != null) {
-                        
-                        // Set the last physics event.
-                        lastPhysicsEvent = eventIdData[0];
-
-                        // Set the first physics event.
-                        if (firstPhysicsEvent == null) {
-                            firstPhysicsEvent = eventIdData[0];
-                            LOGGER.fine("set start event " + firstPhysicsEvent + " from physics event " + evioEvent.getEventNumber());
-                        }                        
-                    }
-                                                                         
-                    // Count scaler events.
-                    scalersProcessor.process(evioEvent);
-                    if (scalersProcessor.getCurrentScalerData() != null) {
-                        scalerData.add(scalersProcessor.getCurrentScalerData());
-                        scalerEvents++;
-                    }
-                    
+                            // Set the last physics event.
+                            lastPhysicsEvent = eventIdData[0];
+
+                            // Set the first physics event.
+                            if (firstPhysicsEvent == null) {
+                                firstPhysicsEvent = eventIdData[0];
+                                LOGGER.info("set first physics event " + firstPhysicsEvent);
+                            }
+                        }
+                    }
+
                     // Count trigger types for this event.
                     Set<TriggerType> triggerTypes = TriggerType.getTriggerTypes(evioEvent);
                     for (TriggerType mask : triggerTypes) {
-                        int count = eventCounts.get(mask) + 1;
-                        eventCounts.put(mask, count);
-                        LOGGER.finer("incremented " + mask.name() + " to " + count);
+                        int count = triggerCounts.get(mask) + 1;
+                        triggerCounts.put(mask, count);
+                        LOGGER.finest("incremented " + mask.name() + " to " + count);
                     }
                     
-                    // Count sync events.
-                    if (EventTagMask.SYNC.matches(evioEvent.getHeader().getTag())) {
-                        // Count sync events.
-                        ++syncEvents;
-                        LOGGER.finer("got sync event from tag " + String.format("%08x", evioEvent.getHeader().getTag()));
-                    }
-                                          
-                } else if (EventTagConstant.EPICS.matches(evioEvent)) {
-                                        
-                    // Count EPICS events.
-                    ++epicsEvents;
+                    // Activate TI time offset processor.
+                    tiProcessor.process(evioEvent);
                     
-                    // Get EPICS data for charge calculation.
-                    epicsProcessor.process(evioEvent);
-                    EpicsData epicsEvent = epicsProcessor.getEpicsData();
-                    if (epicsEvent.hasKey("scaler_calc1")) {
-                        epicsData.add(epicsEvent);    
-                    }                    
-                } 
+                } catch (IOException | EvioException e) {
+                    // Trap event processing errors (not counted in event total).
+                    badEvents++;
+                    LOGGER.warning("error processing EVIO event " + evioEvent.getEventNumber() + ": " + e.getMessage());
+                }
             }
-            
         } catch (final EvioException e) {
             // Error reading the EVIO file.
-            throw new IOException(e);
+            throw new IOException("Error reading EVIO file.", e);
         } finally {
             // Close the reader.
             if (evioReader != null) {
@@ -287,68 +210,55 @@
                 }
             }
         }
-        
-        LOGGER.info("done reading "  + totalEvents + " events");
-        
+
+        LOGGER.info("done reading " + events + " events");
+
         // Rough trigger rate calculation.
-        triggerRate = calculateTriggerRate(firstPhysicsTimestamp, lastPhysicsTimestamp, physicsEvents);
-
-        // Calculate ungated charge.
-        //double ungatedCharge = calculateCharge(epicsData, firstPhysicsTimestamp, lastPhysicsTimestamp);
-        
-        // Calculated gated charge.
-        //double gatedCharge = calculateGatedCharge(ungatedCharge, scalerData, ScalerDataIndex.FCUP_TRG_GATED, ScalerDataIndex.FCUP_TRG_UNGATED);
-        
+        triggerRate = calculateTriggerRate(firstHeadTimestamp, lastHeadTimestamp, events);
+
         // Create and fill the metadata map.
         final Map<String, Object> metadataMap = new LinkedHashMap<String, Object>();
-        
-        // Built-in fields of datacat.
+
+        // Set built-in system metadata.
         metadataMap.put("runMin", run);
         metadataMap.put("runMax", run);
-        metadataMap.put("eventCount", totalEvents);
-        
-        // Run number.
-        metadataMap.put("RUN", run);
+        metadataMap.put("eventCount", events);
         
         // File sequence number.
-        metadataMap.put("FILE_NUMBER", fileNumber);
-        
+        metadataMap.put("FILE", fileNumber);
+
         // Blinded flag.
         metadataMap.put("BLINDED", blinded);
-        
+
         // First and last timestamps which may come from control or physics events.
-        metadataMap.put("FIRST_TIMESTAMP", firstTimestamp);
-        metadataMap.put("LAST_TIMESTAMP", lastTimestamp);
-        
-        // First and last physics events.
+        metadataMap.put("FIRST_HEAD_TIMESTAMP", firstHeadTimestamp);
+        metadataMap.put("LAST_HEAD_TIMESTAMP", lastHeadTimestamp);
+
+        // First and last physics event numbers.
         metadataMap.put("FIRST_PHYSICS_EVENT", firstPhysicsEvent);
         metadataMap.put("LAST_PHYSICS_EVENT", lastPhysicsEvent);
-        
-        // First and last physics event timestamps.
-        metadataMap.put("FIRST_PHYSICS_TIMESTAMP", firstPhysicsTimestamp);
-        metadataMap.put("LAST_PHYSICS_TIMESTAMP", lastPhysicsTimestamp);
-        
+
+        // TI times and offset.
+        metadataMap.put("FIRST_TI_TIME", firstTI);
+        metadataMap.put("LAST_TI_TIME", lastTI);
+        metadataMap.put("TI_TIME_DELTA", maxTIDelta - minTIDelta);
+        
+        // TI time offset (stored as string because of bug in MySQL datacat backend).
+        metadataMap.put("TI_TIME_OFFSET", Long.toString(tiProcessor.getTiTimeOffset()));
+
         // Event counts.
-        metadataMap.put("PHYSICS_EVENTS", physicsEvents);
         metadataMap.put("BAD_EVENTS", badEvents);
-        metadataMap.put("EPICS_EVENTS", epicsEvents);        
-        metadataMap.put("SCALER_EVENTS", scalerEvents);
-        metadataMap.put("END_EVENTS", endEvents);
-        metadataMap.put("PRESTART_EVENTS", prestartEvents);
-        metadataMap.put("GO_EVENTS", goEvents);
-        metadataMap.put("PAUSE_EVENTS", pauseEvents);
-        metadataMap.put("SYNC_EVENTS", syncEvents);
-        
-        // Trigger rate.
+        
+        // Trigger rate in kHz.
         DecimalFormat df = new DecimalFormat("#.####");
         df.setRoundingMode(RoundingMode.CEILING);
-        metadataMap.put("TRIGGER_RATE_KHZ", Double.parseDouble(df.format(triggerRate)));
-                                            
-        // Add the trigger counts.
-        for (Entry<TriggerType, Integer> entry : eventCounts.entrySet()) {
+        metadataMap.put("TRIGGER_RATE", Double.parseDouble(df.format(triggerRate)));
+
+        // Trigger type counts.
+        for (Entry<TriggerType, Integer> entry : triggerCounts.entrySet()) {
             metadataMap.put(entry.getKey().name(), entry.getValue());
         }
-        
+
         // Print the file metadata to log.
         StringBuffer sb = new StringBuffer();
         sb.append('\n');
@@ -356,20 +266,20 @@
             sb.append("  " + entry.getKey() + " = " + entry.getValue() + '\n');
         }
         LOGGER.info("file metadata ..." + '\n' + sb.toString());
-        
+
         // Return the completed metadata map.
         return metadataMap;
     }
-
+         
     /**
      * Calculate the trigger rate in KHz.
      * 
      * @param firstTimestamp the first physics timestamp
      * @param lastTimestamp the last physics timestamp
-     * @param physicsEvents the number of physics events
+     * @param events the number of physics events
      * @return the trigger rate calculation in KHz
      */
-    private double calculateTriggerRate(Integer firstTimestamp, Integer lastTimestamp, int physicsEvents) {
-        return ((double) physicsEvents / ((double) lastTimestamp - (double) firstTimestamp)) / 1000.;
-    }    
+    private double calculateTriggerRate(Integer firstTimestamp, Integer lastTimestamp, int events) {
+        return ((double) events / ((double) lastTimestamp - (double) firstTimestamp)) / 1000.;
+    }
 }
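The calculateTriggerRate method above divides the event count by the head-bank timestamp span (Unix seconds) and scales to kHz. A standalone sketch of the same formula, with a guard for a degenerate span that the committed method does not have (the class name is hypothetical):

```java
public final class TriggerRateSketch {

    // Same formula as calculateTriggerRate above: events over the
    // timestamp span in seconds, scaled from Hz to kHz.
    static double calculateTriggerRate(int firstTimestamp, int lastTimestamp, int events) {
        double seconds = lastTimestamp - firstTimestamp;
        if (seconds <= 0) {
            // The committed version would divide by zero here and return Infinity/NaN.
            return 0.0;
        }
        return (events / seconds) / 1000.0;
    }

    public static void main(String[] args) {
        // 2,000,000 events over a 100 second span is a 20 kHz trigger rate.
        System.out.println(calculateTriggerRate(1_448_000_000, 1_448_000_100, 2_000_000));
    }
}
```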

Copied: java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/LcioReconMetadataReader.java (from r3960, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/LcioMetadataReader.java)
 =============================================================================
--- java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/LcioMetadataReader.java	(original)
+++ java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/LcioReconMetadataReader.java	Tue Dec  1 15:55:47 2015
@@ -4,12 +4,16 @@
 import java.io.File;
 import java.io.IOException;
 import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
 import java.util.Map;
+import java.util.Set;
 
 import org.lcsim.conditions.ConditionsManager;
 import org.lcsim.conditions.ConditionsManagerImplementation;
 import org.lcsim.conditions.ConditionsReader;
 import org.lcsim.event.EventHeader;
+import org.lcsim.event.EventHeader.LCMetaData;
 import org.lcsim.lcio.LCIOReader;
 import org.lcsim.util.loop.DummyConditionsConverter;
 import org.lcsim.util.loop.DummyDetector;
@@ -19,7 +23,7 @@
  * 
  * @author Jeremy McCormick, SLAC
  */
-public class LcioMetadataReader implements FileMetadataReader {
+public class LcioReconMetadataReader implements FileMetadataReader {
 
     /**
      * Setup the conditions system in dummy mode.
@@ -40,31 +44,53 @@
      */
     @Override
     public Map<String, Object> getMetadata(File file) throws IOException {
-        Map<String, Object> metaData = new HashMap<String, Object>();
-        LCIOReader reader = null;
+        
+        Set<String> collectionNames = new HashSet<String>();
+        String detectorName = null;
+        int eventCount = 0;
+        Integer run = null;
+        LCIOReader reader = null;                
         try {        
             reader = new LCIOReader(file);               
             EventHeader eventHeader = null;
-            int eventCount = 0;
-            Integer run = null;
             try {
                 while((eventHeader = reader.read()) != null) {
                     if (run == null) {
                         run = eventHeader.getRunNumber();
-                    }            
+                    }
+                    if (detectorName == null) {
+                        detectorName = eventHeader.getDetectorName();
+                    }
+                    for (List<?> list : eventHeader.getLists()) {
+                        LCMetaData metadata = eventHeader.getMetaData(list);
+                        collectionNames.add(metadata.getName());
+                    }
                     eventCount++;
                 }
             } catch (EOFException e) {
                 e.printStackTrace();
             }
-            metaData.put("eventCount", eventCount);
-            metaData.put("runMin", run);
-            metaData.put("runMax", run);
         } finally {
             if (reader != null) {
                 reader.close();
             }
-        }        
-        return metaData;
+        }    
+        
+        // Build collection names string.
+        StringBuffer sb = new StringBuffer();
+        for (String collectionName : collectionNames) {
+            sb.append(collectionName + ",");
+        }
+        if (sb.length() > 0) {
+            sb.setLength(sb.length() - 1);
+        }
+        
+        Map<String, Object> metadata = new HashMap<String, Object>();
+        metadata.put("eventCount", eventCount);
+        metadata.put("runMin", run);
+        metadata.put("runMax", run);
+        metadata.put("DETECTOR", detectorName);
+        metadata.put("COLLECTIONS", sb.toString());
+
+        return metadata;
     }
 }
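The collection-name join above trims a trailing comma with setLength, which throws StringIndexOutOfBoundsException when no collections were seen. A sketch of an equivalent join that is safe for the empty case, using String.join (the class and collection names are illustrative, not from the diff):

```java
import java.util.LinkedHashSet;
import java.util.Set;

public final class CollectionJoinSketch {

    // Equivalent to the StringBuffer loop above, but safe when no
    // collections were seen: setLength(sb.length() - 1) throws on an
    // empty buffer, while String.join simply returns "".
    static String joinCollections(Set<String> collectionNames) {
        return String.join(",", collectionNames);
    }

    public static void main(String[] args) {
        Set<String> names = new LinkedHashSet<String>();
        names.add("EcalHits");            // sample names, not from the diff
        names.add("SVTRawTrackerHits");
        System.out.println(joinCollections(names)); // EcalHits,SVTRawTrackerHits
    }
}
```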

Added: java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/PathFilter.java
 =============================================================================
--- java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/PathFilter.java	(added)
+++ java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/PathFilter.java	Tue Dec  1 15:55:47 2015
@@ -0,0 +1,43 @@
+package org.hps.crawler;
+
+import java.io.File;
+import java.io.FileFilter;
+import java.util.Set;
+import java.util.logging.Logger;
+
+/**
+ * Implementation of {@link java.io.FileFilter} which accepts a file if its path is 
+ * equal to any of the paths in a set of strings.
+ * 
+ * @author Jeremy McCormick, SLAC
+ */
+final class PathFilter implements FileFilter {
+
+    private static Logger LOGGER = Logger.getLogger(PathFilter.class.getPackage().getName());
+    
+    /**
+     * Set of paths for filtering.
+     */
+    private Set<String> paths = null;
+    
+    PathFilter(Set<String> paths) {
+        this.paths = paths;
+    }
+
+    /**
+     * Return <code>true</code> if the <code>pathname</code> has a path which is in the set of <code>paths</code>.
+     * 
+     * @return <code>true</code> if <code>pathname</code> passes the filter
+     */
+    @Override
+    public boolean accept(File pathname) {
+        for (String acceptPath : paths) {
+            if (pathname.getPath().equals(acceptPath)) {
+                LOGGER.info("accepted path " + pathname);                
+                return true;
+            }
+        }
+        LOGGER.info("rejected path " + pathname);
+        return false;
+    }
+}
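A minimal usage sketch for a filter like the PathFilter added above; the nested class here is a hypothetical stand-in, not the committed one. Since accept() only tests exact path equality, Set.contains gives the same answer as the loop, in O(1) instead of O(n):

```java
import java.io.File;
import java.io.FileFilter;
import java.util.HashSet;
import java.util.Set;

public final class PathFilterUsage {

    // Hypothetical stand-in for PathFilter: accept a file only if its
    // exact path string is in the accept set.
    static final class SimplePathFilter implements FileFilter {

        private final Set<String> paths;

        SimplePathFilter(Set<String> paths) {
            this.paths = paths;
        }

        @Override
        public boolean accept(File pathname) {
            return paths.contains(pathname.getPath());
        }
    }

    public static void main(String[] args) {
        Set<String> accept = new HashSet<String>();
        accept.add("/data/hps_005772.evio.0"); // sample path, not from the diff
        FileFilter filter = new SimplePathFilter(accept);
        System.out.println(filter.accept(new File("/data/hps_005772.evio.0"))); // true
        System.out.println(filter.accept(new File("/data/other.evio")));        // false
    }
}
```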

Modified: java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/RunFilter.java
 =============================================================================
--- java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/RunFilter.java	(original)
+++ java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/RunFilter.java	Tue Dec  1 15:55:47 2015
@@ -38,6 +38,11 @@
      */
     @Override
     public boolean accept(final File file) {
-        return this.acceptRuns.contains(EvioFileUtilities.getRunFromName(file));
+        try {
+            int run = EvioFileUtilities.getRunFromName(file);
+            return this.acceptRuns.contains(run);
+        } catch (NumberFormatException e) {
+            return false;
+        }
     }
 }
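RunFilter now traps NumberFormatException from the filename parse. The helper below is hypothetical, since the real parsing rules of EvioFileUtilities.getRunFromName are not shown in this diff, but it illustrates how a malformed name raises the exception that accept() now converts into a rejection:

```java
public final class RunFromNameSketch {

    // Hypothetical parser in the spirit of EvioFileUtilities.getRunFromName:
    // pull the run number out of a name such as "hps_005772.evio.0".
    static int getRunFromName(String name) {
        String[] parts = name.split("[_.]");
        return Integer.parseInt(parts[1]); // throws NumberFormatException on bad names
    }

    public static void main(String[] args) {
        System.out.println(getRunFromName("hps_005772.evio.0")); // 5772
        try {
            getRunFromName("notes_final.evio.0");
        } catch (NumberFormatException e) {
            System.out.println("rejected"); // what RunFilter now does with such names
        }
    }
}
```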

Modified: java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatClient.java
 =============================================================================
--- java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatClient.java	(original)
+++ java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatClient.java	Tue Dec  1 15:55:47 2015
@@ -9,6 +9,7 @@
  *
  * @author Jeremy McCormick, SLAC
  */
+// TODO: add method for adding a location to an existing dataset
 public interface DatacatClient {
 
     /**

Modified: java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatClientImpl.java
 =============================================================================
--- java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatClientImpl.java	(original)
+++ java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatClientImpl.java	Tue Dec  1 15:55:47 2015
@@ -46,7 +46,7 @@
      * Create client with default parameters.
      */
     DatacatClientImpl() {
-        this(DatacatConstants.BASE_URL, DatasetSite.JLAB, DatacatConstants.ROOT_DIR);
+        this(DatacatConstants.BASE_URL, DatasetSite.JLAB, DatacatConstants.ROOT_FOLDER);
     }
 
     /**
@@ -60,7 +60,7 @@
         try {
             this.url = new URL(url);
         } catch (final MalformedURLException e) {
-            throw new IllegalArgumentException("The URL is bad.", e);
+            throw new IllegalArgumentException("The URL is not valid.", e);
         }
         if (site == null) {
             throw new IllegalArgumentException("The site argument is null.");
@@ -204,7 +204,7 @@
             }
         }
         
-        LOGGER.info("findDatasets: " + urlLocation);
+        LOGGER.info(urlLocation);
         final StringBuffer outputBuffer = new StringBuffer();
         final int response = HttpUtilities.doGet(urlLocation, outputBuffer);
         if (response >= 400) {
@@ -213,7 +213,7 @@
 
         // Build and return dataset list
         final JSONObject searchResults = new JSONObject(outputBuffer.toString());
-        LOGGER.info("returning search results: " + searchResults.toString());
+        LOGGER.info(searchResults.toString());
         return createDatasetsFromSearch(searchResults);
     }
 
@@ -276,7 +276,7 @@
     @Override
     public int makeFolder(final String path) {
         final Map<String, Object> parameters = new HashMap<String, Object>();
-        parameters.put("path", "/" + DatacatConstants.ROOT_DIR + "/" + path);
+        parameters.put("path", "/" + DatacatConstants.ROOT_FOLDER + "/" + path);
         final String name = new File(path).getName();
         parameters.put("name", name);
         parameters.put("_type", "folder");

Modified: java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatConstants.java
 =============================================================================
--- java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatConstants.java	(original)
+++ java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/DatacatConstants.java	Tue Dec  1 15:55:47 2015
@@ -5,12 +5,12 @@
  * 
  * @author Jeremy McCormick, SLAC
  */
-final class DatacatConstants {
+public final class DatacatConstants {
 
     /**
      * The root directory in the catalog for HPS folders.
      */
-    public static final String ROOT_DIR = "HPS";
+    public static final String ROOT_FOLDER = "HPS";
         
     /**
      * The base URL of the datacat server.

Modified: java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/JSONUtilities.java
 =============================================================================
--- java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/JSONUtilities.java	(original)
+++ java/branches/jeremy-dev/datacat-client/src/main/java/org/hps/datacat/client/JSONUtilities.java	Tue Dec  1 15:55:47 2015
@@ -48,6 +48,10 @@
         if (metadataCopy.containsKey("eventCount")) {
             dataset.put("eventCount", metadataCopy.get("eventCount"));
             metadataCopy.remove("eventCount");
+        }
+        if (metadataCopy.containsKey("scanStatus")) {
+            dataset.put("scanStatus", metadataCopy.get("scanStatus"));
+            metadataCopy.remove("scanStatus");
         }
         
         if (metadata != null && metadata.size() != 0) {
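The JSONUtilities change above extends a pattern worth noting: reserved keys ("eventCount", "scanStatus") are promoted from the user metadata into top-level dataset fields. A self-contained sketch of that pattern (class and method names are hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

public final class ReservedKeySketch {

    // Reserved keys are moved from the user metadata map into top-level
    // dataset fields, and removed from the copy so they are not sent twice.
    static Map<String, Object> promoteReserved(Map<String, Object> metadata, Map<String, Object> dataset) {
        Map<String, Object> copy = new HashMap<String, Object>(metadata);
        for (String key : new String[] {"eventCount", "scanStatus"}) {
            if (copy.containsKey(key)) {
                dataset.put(key, copy.remove(key));
            }
        }
        return copy; // remaining entries stay as plain metadata
    }
}
```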

Modified: java/branches/jeremy-dev/detector-model/src/main/java/org/lcsim/geometry/compact/converter/HPSTrackerBuilder.java
 =============================================================================
--- java/branches/jeremy-dev/detector-model/src/main/java/org/lcsim/geometry/compact/converter/HPSTrackerBuilder.java	(original)
+++ java/branches/jeremy-dev/detector-model/src/main/java/org/lcsim/geometry/compact/converter/HPSTrackerBuilder.java	Tue Dec  1 15:55:47 2015
@@ -74,6 +74,10 @@
             LOGGER.info("mille parameters will be read from compact.xml file");
             initAlignmentParameters();
         }
+        if(debug) {
+            for (MilleParameter p : milleparameters)
+                System.out.printf("%d,%f%n", p.getId(), p.getValue());
+        }
     }
 
     /**
@@ -125,9 +129,8 @@
         if (debug) {
             System.out.printf("%s: Initialized %d alignment parameters:\n", this.getClass().getSimpleName(),
                     milleparameters.size());
-            for (MilleParameter p : milleparameters) {
+            for (MilleParameter p : milleparameters)
                 System.out.printf("%s: %s \n", this.getClass().getSimpleName(), p.toString());
-            }
         }
 
     }

Modified: java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/AbstractSvtEvioReader.java
 =============================================================================
--- java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/AbstractSvtEvioReader.java	(original)
+++ java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/AbstractSvtEvioReader.java	Tue Dec  1 15:55:47 2015
@@ -259,7 +259,7 @@
                         LOGGER.finest("this is a data multisample for apv " + SvtEvioUtils.getApvFromMultiSample(samples) + " ch " + SvtEvioUtils.getChannelNumber(samples));
                     
                     
-                    // Extract data words from multisample header 
+                    // Extract data words from multisample header and update index
                     multisampleHeaderIndex += this.extractMultisampleHeaderData(samples, multisampleHeaderIndex, multisampleHeaderData);
                     
                     // If a set of samples is associated with an APV header or tail, skip it
@@ -294,6 +294,12 @@
 
     
     
+    /**
+     * Process the headers that were extracted from the SVT data. 
+     * @param headers - list of all headers
+     * @param lcsimEvent - the current LCSIM event being processed
+     * @throws SvtEvioHeaderException if the SVT header data is invalid
+     */
     protected abstract void processSvtHeaders(List<SvtHeaderDataInfo> headers, EventHeader lcsimEvent) throws SvtEvioHeaderException;
     
     /**
@@ -311,6 +317,13 @@
 
     }
     
+    /**
+     * Copy the multisample header data for the samples into the header data array.
+     * @param samples - the multisample data words
+     * @param index - the current write index into the header data array
+     * @param multisampleHeaderData - the destination array for the extracted header words
+     * @return the number of header words copied, used to advance the index
+     */
     protected int extractMultisampleHeaderData(int[] samples, int index, int[] multisampleHeaderData) {
         LOGGER.finest("extractMultisampleHeaderData: index " + index);
         if( SvtEvioUtils.isMultisampleHeader(samples) && !SvtEvioUtils.isMultisampleTail(samples) ) {
@@ -323,11 +336,23 @@
         }
     }
     
+    /**
+     * Checks that the SVT header data count is consistent with the bank size.
+     * @param sampleCount - the sample count derived from the bank size.
+     * @param headerData - the header data extracted from the bank.
+     * @throws SvtEvioHeaderException if the multisample count is inconsistent with the bank size.
+     */
     protected void checkSvtSampleCount(int sampleCount, SvtHeaderDataInfo headerData) throws SvtEvioHeaderException {
         if( sampleCount != SvtEvioUtils.getSvtTailMultisampleCount(headerData.getTail())*4)
             throw new SvtEvioHeaderException("multisample count is not consistent with bank size.");
     }
     
+    /**
+     * Add the multisample headers to the {@link SvtHeaderDataInfo} object.
+     * @param headerData - object to add multisample headers to.
+     * @param n - number of multisample headers
+     * @param multisampleHeaders - multisample headers to copy
+     */
     protected void setMultiSampleHeaders(SvtHeaderDataInfo headerData, int n, int[] multisampleHeaders) {
         //copy out the headers that are non-zero
         int[] vals = new int[n];

Modified: java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/EvioToLcio.java
 =============================================================================
--- java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/EvioToLcio.java	(original)
+++ java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/EvioToLcio.java	Tue Dec  1 15:55:47 2015
@@ -367,6 +367,10 @@
         // Process the LCSim job variable definitions, if any.
         jobManager = new JobManager();
         
+        // Initialize run manager and add as listener on conditions system.
+        RunManager runManager = RunManager.getRunManager();
+        DatabaseConditionsManager.getInstance().addConditionsListener(runManager);
+        
         // Enable dry run because events will be processed individually.
         jobManager.setDryRun(true);
         

Modified: java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java
 =============================================================================
--- java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java	(original)
+++ java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java	Tue Dec  1 15:55:47 2015
@@ -67,6 +67,8 @@
      * Modulus of TI timestamp offset (units of nanoseconds).
      */
     private final long timestampCycle = 24 * 6 * 35;
+    
+    /**
+     * The TI time offset from the run database, or <code>null</code> if none is available.
+     */
+    private Long currentTiTimeOffset = null;
 
     /**
      * Class constructor.
@@ -92,22 +94,27 @@
     public void conditionsChanged(final ConditionsEvent conditionsEvent) {
         super.conditionsChanged(conditionsEvent);
         svtEventFlagger.initialize();
-    }
-
-    /**
-     * Get the time from the TI data.
+        
+        // Get TI time offset from run db.
+        if (RunManager.getRunManager().runExists() && RunManager.getRunManager().getRunSummary().getTiTimeOffset() != null) {
+            currentTiTimeOffset = RunManager.getRunManager().getRunSummary().getTiTimeOffset();
+            LOGGER.info("TI time offset set to " + currentTiTimeOffset + " for run " + conditionsEvent.getConditionsManager().getRun());
+        } else {
+            currentTiTimeOffset = null;
+            LOGGER.info("no TI time offset in database for run " + conditionsEvent.getConditionsManager().getRun());
+        }
+    }
+
+    /**
+     * Get the time from the TI data with time offset applied from run database.
      *
      * @param triggerList the TI data list
      */
     @Override
     protected long getTime(final List<AbstractIntData> triggerList) {
         long tiTimeOffset = 0;
-        try {
-            if (RunManager.getRunManager().runExists() && RunManager.getRunManager().getTriggerConfig().getTiTimeOffset() != null) {
-                tiTimeOffset = (RunManager.getRunManager().getTriggerConfig().getTiTimeOffset() / timestampCycle) * timestampCycle;
-            }
-        } catch (IllegalStateException e) {
-            // May happen if RunManager is not initialized; just ignore.
+        if (currentTiTimeOffset != null) {
+            tiTimeOffset = (currentTiTimeOffset / timestampCycle) * timestampCycle;
         }
         for (final AbstractIntData data : triggerList) {
             if (data instanceof TIData) {
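getTime() above rounds the run-database offset down to a whole number of timestampCycle periods (24 * 6 * 35 = 5040 ns) before applying it. The quantization step, in isolation (the class name is hypothetical):

```java
public final class TiOffsetSketch {

    // Modulus of the TI timestamp offset in nanoseconds: 24 * 6 * 35 = 5040,
    // matching the timestampCycle field above.
    static final long TIMESTAMP_CYCLE = 24L * 6L * 35L;

    // Integer division rounds the raw offset down to a whole number of
    // cycles, exactly as getTime() does before applying it.
    static long quantize(long rawOffsetNanos) {
        return (rawOffsetNanos / TIMESTAMP_CYCLE) * TIMESTAMP_CYCLE;
    }

    public static void main(String[] args) {
        System.out.println(quantize(12_345L)); // 10080: two full 5040 ns cycles
    }
}
```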

Modified: java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/SvtEventFlagger.java
 =============================================================================
--- java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/SvtEventFlagger.java	(original)
+++ java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/SvtEventFlagger.java	Tue Dec  1 15:55:47 2015
@@ -16,6 +16,7 @@
 import org.hps.conditions.svt.SvtAlignmentConstant;
 import org.hps.conditions.svt.SvtBiasConstant;
 import org.hps.conditions.svt.SvtMotorPosition;
+import org.hps.conditions.svt.SvtTimingConstants;
 import org.hps.record.svt.SvtHeaderDataInfo;
 import org.hps.record.triggerbank.AbstractIntData;
 import org.hps.record.triggerbank.HeadBankData;
@@ -35,9 +36,11 @@
     private static final double angleTolerance = 0.0001;
     SvtBiasConstant.SvtBiasConstantCollection svtBiasConstants = null;
     SvtMotorPosition.SvtMotorPositionCollection svtPositionConstants = null;
+    private SvtTimingConstants svtTimingConstants = null;
     private boolean biasGood = false;
     private boolean positionGood = false;
     private boolean burstmodeNoiseGood = false;
+    private boolean latencyGood = false;
     private double nominalAngleTop = 0;
     private double nominalAngleBottom = 0;
 
@@ -64,11 +67,24 @@
             }
         }
 
+        latencyGood = false;
+        if (svtTimingConstants != null) {
+            if (svtTimingConstants.getOffsetTime() <= 27) {
+                latencyGood = true;
+            } else {
+                if (((event.getTimeStamp() - 4 * svtTimingConstants.getOffsetPhase()) % 24) < 16) {
+                    latencyGood = true;
+                }
+            }
+//        System.out.format("%f %b\n", svtTimingConstants.getOffsetTime() + (((event.getTimeStamp() - 4 * svtTimingConstants.getOffsetPhase()) % 24) - 12), latencyGood);
+        }
+
         burstmodeNoiseGood = isBurstmodeNoiseGood(event);
 
         event.getIntegerParameters().put("svt_bias_good", new int[]{biasGood ? 1 : 0});
         event.getIntegerParameters().put("svt_position_good", new int[]{positionGood ? 1 : 0});
         event.getIntegerParameters().put("svt_burstmode_noise_good", new int[]{burstmodeNoiseGood ? 1 : 0});
+        event.getIntegerParameters().put("svt_latency_good", new int[]{latencyGood ? 1 : 0});
     }
 
     private Date getEventTimeStamp(EventHeader event) {
@@ -109,6 +125,13 @@
         } catch (Exception e) {
             svtPositionConstants = null;
         }
+
+        try {
+            svtTimingConstants = DatabaseConditionsManager.getInstance().getCachedConditions(SvtTimingConstants.SvtTimingConstantsCollection.class, "svt_timing_constants").getCachedData().get(0);
+        } catch (Exception e) {
+            svtTimingConstants = null;
+        }
+
     }
 
     private static boolean isBurstmodeNoiseGood(EventHeader event) {
@@ -162,92 +185,90 @@
 
     public static void voidAddHeaderCheckResultToMetaData(boolean ok, EventHeader lcsimEvent) {
         //System.out.println("adding svt header check ");
-        lcsimEvent.getIntegerParameters().put("svt_event_header_good", new int[]{ ok ? 1 : 0});
+        lcsimEvent.getIntegerParameters().put("svt_event_header_good", new int[]{ok ? 1 : 0});
         //if(lcsimEvent.hasItem("svt_event_header_good"))
         //        System.out.println("event header has the svt header check ");
         //else
         //    System.out.println("event header doesn't have the svt header check ");
     }
-    
+
     public static void AddHeaderInfoToMetaData(List<SvtHeaderDataInfo> headers, EventHeader lcsimEvent) {
         int[] svtHeaders = new int[headers.size()];
         int[] svtTails = new int[headers.size()];
-        for(int iSvtHeader=0; iSvtHeader < headers.size();++iSvtHeader) {
+        for (int iSvtHeader = 0; iSvtHeader < headers.size(); ++iSvtHeader) {
             svtHeaders[iSvtHeader] = headers.get(iSvtHeader).getHeader();
             svtTails[iSvtHeader] = headers.get(iSvtHeader).getTail();
-            
+
             lcsimEvent.getIntegerParameters().put("svt_event_header_roc" + headers.get(iSvtHeader).getNum(), new int[]{headers.get(iSvtHeader).getHeader()});
             lcsimEvent.getIntegerParameters().put("svt_event_tail_roc" + headers.get(iSvtHeader).getNum(), new int[]{headers.get(iSvtHeader).getTail()});
-            
-            
+
             int nMS = headers.get(iSvtHeader).getNumberOfMultisampleHeaders();
-            int[] multisampleHeadersArray = new int[4*nMS];
-            for(int iMS = 0; iMS < nMS; ++iMS ) {
+            int[] multisampleHeadersArray = new int[4 * nMS];
+            for (int iMS = 0; iMS < nMS; ++iMS) {
                 int[] multisampleHeader = headers.get(iSvtHeader).getMultisampleHeader(iMS);
-                System.arraycopy(multisampleHeader, 0, multisampleHeadersArray, iMS*4, multisampleHeader.length);
+                System.arraycopy(multisampleHeader, 0, multisampleHeadersArray, iMS * 4, multisampleHeader.length);
             }
             lcsimEvent.getIntegerParameters().put("svt_multisample_headers_roc" + headers.get(iSvtHeader).getNum(), multisampleHeadersArray);
         }
-        
-    }
-    
-    private static final Pattern rocIdPattern  = Pattern.compile("svt_.*_roc(\\d+)");
-    
+
+    }
+
+    private static final Pattern rocIdPattern = Pattern.compile("svt_.*_roc(\\d+)");
+
     public static int getRocFromSvtHeaderName(String seq) {
         Matcher m = rocIdPattern.matcher(seq);
-        if(m == null) 
+        if (m == null) {
             throw new RuntimeException("null matcher, don't think this should happen");
-        if( !m.matches() ) 
+        }
+        if (!m.matches()) {
             return -1;
-        else
-            return Integer.parseInt( m.group(1) );
-    }
-    
-    
-    
-    public static List<SvtHeaderDataInfo>  getHeaderInfoToMetaData(EventHeader lcsimEvent) {
-        Map<Integer, Integer> headers = new HashMap<Integer,Integer>();
-        Map<Integer, Integer> tails = new HashMap<Integer,Integer>();
-        Map<Integer, Integer[]> multisampleHeaders = new HashMap<Integer,Integer[]>();
-        
-        
-        for(Map.Entry<String, int[]> entry : lcsimEvent.getIntegerParameters().entrySet()) {
-            
+        } else {
+            return Integer.parseInt(m.group(1));
+        }
+    }
+
+    public static List<SvtHeaderDataInfo> getHeaderInfoToMetaData(EventHeader lcsimEvent) {
+        Map<Integer, Integer> headers = new HashMap<Integer, Integer>();
+        Map<Integer, Integer> tails = new HashMap<Integer, Integer>();
+        Map<Integer, Integer[]> multisampleHeaders = new HashMap<Integer, Integer[]>();
+
+        for (Map.Entry<String, int[]> entry : lcsimEvent.getIntegerParameters().entrySet()) {
+
             int roc = getRocFromSvtHeaderName(entry.getKey());
-            
-            if( roc == -1) {
+
+            if (roc == -1) {
                 continue;
             }
             //LOGGER.logger.fine("processing entry \"" + entry.getKey()+ "\"" + " for roc "  + roc);
             int[] value = entry.getValue();
-           
-            if(entry.getKey().contains("svt_event_header_roc"))
+
+            if (entry.getKey().contains("svt_event_header_roc")) {
                 headers.put(roc, value[0]);
-            
-            if(entry.getKey().contains("svt_event_tail_roc")) 
+            }
+
+            if (entry.getKey().contains("svt_event_tail_roc")) {
                 tails.put(roc, value[0]);
-                
+            }
+
             // really need to copy?
-            if(entry.getKey().contains("svt_multisample_headers_roc")) {
+            if (entry.getKey().contains("svt_multisample_headers_roc")) {
                 Integer[] tmp = ArrayUtils.toObject(value); //new Integer[value.length];
                 multisampleHeaders.put(roc, tmp);
             }
-                    
-        }
-        
+
+        }
+
         // create the new objects
         List<SvtHeaderDataInfo> headerDataInfo = new ArrayList<SvtHeaderDataInfo>();
-        for(Integer roc : headers.keySet()) {
+        for (Integer roc : headers.keySet()) {
             int header = headers.get(roc);
             int tail = tails.get(roc);
             Integer[] ms = multisampleHeaders.get(roc);
             headerDataInfo.add(new SvtHeaderDataInfo(roc, header, tail, ms));
         }
-        
-       return headerDataInfo;
-        
-    }
-    
-    
-    
+
+        return headerDataInfo;
+
+    }
+
 }

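The new svt_latency_good flag above passes an event either when the SVT offset time is small enough, or when the phase-corrected event timestamp falls in the first 16 ns of the 24 ns clock cycle. A standalone sketch of that decision, assuming non-negative timestamps (class and method names are illustrative, not part of the HPS API):

```java
// Sketch of the latency check added to SvtEventFlagger above: small offset
// times always pass; otherwise the event timestamp, corrected by four times
// the offset phase, must land in the first 16 ns of the 24 ns cycle.
public class SvtLatencyCheck {

    public static boolean isLatencyGood(double offsetTime, long timeStamp, int offsetPhase) {
        if (offsetTime <= 27) {
            return true;
        }
        // phase of the corrected timestamp within the 24 ns clock cycle
        return ((timeStamp - 4L * offsetPhase) % 24) < 16;
    }
}
```

Note that, as in the committed code, Java's `%` can return a negative value if the corrected timestamp goes negative; the sketch keeps that behavior rather than normalizing the phase.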
Modified: java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/TestRunTriggeredReconToLcio.java
 =============================================================================
--- java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/TestRunTriggeredReconToLcio.java	(original)
+++ java/branches/jeremy-dev/evio/src/main/java/org/hps/evio/TestRunTriggeredReconToLcio.java	Tue Dec  1 15:55:47 2015
@@ -310,7 +310,7 @@
                 }
             }
             if (ecalScoringPlaneHits != null) {
-                lcsimEvent.put(ecalScoringPlaneHitsCollectionName, ecalScoringPlaneHits, SimTrackerHit.class, 0);
+                lcsimEvent.put(ecalScoringPlaneHitsCollectionName, ecalScoringPlaneHits, SimTrackerHit.class, 0xc0000000);
                 if (verbosity >= 1) {
                     System.out.println("Adding " + ecalScoringPlaneHits.size() + " ECalTrackerHits");
                 }
@@ -333,7 +333,7 @@
                 }
             }
             if (triggerECalScoringPlaneHits != null) {
-                lcsimEvent.put(ecalScoringPlaneHitsCollectionName, triggerECalScoringPlaneHits, SimTrackerHit.class, 0);
+                lcsimEvent.put(ecalScoringPlaneHitsCollectionName, triggerECalScoringPlaneHits, SimTrackerHit.class, 0xc0000000);
                 if (verbosity >= 1) {
                     System.out.println("Adding " + triggerECalScoringPlaneHits.size() + " ECalTrackerHits");
                 }

Modified: java/branches/jeremy-dev/job/pom.xml
 =============================================================================
--- java/branches/jeremy-dev/job/pom.xml	(original)
+++ java/branches/jeremy-dev/job/pom.xml	Tue Dec  1 15:55:47 2015
@@ -19,5 +19,9 @@

             <groupId>org.hps</groupId>
             <artifactId>hps-detector-model</artifactId>
         </dependency>
+        <dependency>
+            <groupId>org.hps</groupId>
+            <artifactId>hps-run-database</artifactId>
+        </dependency>        
     </dependencies>
 </project>

Modified: java/branches/jeremy-dev/job/src/main/java/org/hps/job/JobManager.java
 =============================================================================
--- java/branches/jeremy-dev/job/src/main/java/org/hps/job/JobManager.java	(original)
+++ java/branches/jeremy-dev/job/src/main/java/org/hps/job/JobManager.java	Tue Dec  1 15:55:47 2015
@@ -1,10 +1,10 @@
 package org.hps.job;
-
-import java.io.InputStream;
 
 import org.hps.conditions.ConditionsDriver;
 import org.hps.conditions.database.DatabaseConditionsManager;
 import org.hps.detector.svt.SvtDetectorSetup;
+import org.hps.run.database.RunManager;
+import org.lcsim.conditions.ConditionsManager.ConditionsNotFoundException;
 import org.lcsim.job.JobControlManager;
 import org.lcsim.util.Driver;
 
@@ -32,21 +32,43 @@
      */
     public JobManager() {
     }
-
+   
     /**
-     * Override setup so the conditions system can be reset.
+     * Initialize the conditions system for the job.
+     * <p>
+ * If the detector and run are provided from the command line or the conditions driver,
+ * then the conditions system will be initialized and frozen.
      * 
-     * @param is the input stream containing config information
+     * @throws ConditionsNotFoundException if a condition is not found during initialization
      */
-    public void setup(InputStream is) {
+    protected void initializeConditions() throws ConditionsNotFoundException {
         
-        // Add class that will setup SVT detector with conditions data (this is awkward but has to be done someplace).
-        DatabaseConditionsManager.getInstance().addConditionsListener(new SvtDetectorSetup());
+        // Initialize the db conditions manager.
+        DatabaseConditionsManager dbManager = DatabaseConditionsManager.getInstance();
         
-        super.setup(is);
+        // Initialize run manager and add as listener on conditions system.
+        RunManager runManager = RunManager.getRunManager();
+        dbManager.addConditionsListener(runManager);
+        
+        // Add class that will setup SVT detector with conditions data.
+        dbManager.addConditionsListener(new SvtDetectorSetup());
                 
-        // Setup the conditions system if there is a ConditionsDriver present.
-        this.setupConditionsDriver();
+        // Call super method which will initialize conditions system if the detector and run were provided.
+        super.initializeConditions();
+        
+        // Setup from conditions driver (to be deleted soon).
+        if (!dbManager.isInitialized()) {
+            setupConditionsDriver();
+        } else {
+            // Command line options overrode the conditions driver.
+            LOGGER.config("conditions driver was overridden by command line options");
+        }
+        
+        if (dbManager.isInitialized()) {
+            // Assume conditions system should be frozen since detector and run were provided explicitly.
+            LOGGER.config("job manager freezing conditions system");
+            dbManager.freeze();
+        }
     }
     
     /**
@@ -70,7 +92,9 @@
      * This method will find the {@link org.hps.conditions.ConditionsDriver} in the list of Drivers registered with the
      * manager and then execute its initialization method, which may override the default behavior of the conditions
      * system.
+     * @deprecated Use command line options of {@link org.lcsim.job.JobControlManager} instead.
      */
+    @Deprecated
     private void setupConditionsDriver() {
         ConditionsDriver conditionsDriver = null;
         for (final Driver driver : this.getDriverAdapter().getDriver().drivers()) {
@@ -80,7 +104,7 @@
             }
         }
         if (conditionsDriver != null) {
-            LOGGER.config("initializing conditions Driver");            
+            LOGGER.config("initializing conditions Driver");
             conditionsDriver.initialize();
             LOGGER.warning("Conditions driver will be removed soon!");
         }

Modified: java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/svt/SensorOccupancyPlotsDriver.java
 =============================================================================
--- java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/svt/SensorOccupancyPlotsDriver.java	(original)
+++ java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/svt/SensorOccupancyPlotsDriver.java	Tue Dec  1 15:55:47 2015
@@ -13,8 +13,8 @@
 import hep.aida.IPlotterRegion;
 import hep.aida.IPlotterStyle;
 import hep.aida.ITree;
-import hep.aida.jfree.plotter.Plotter;
-import hep.aida.jfree.plotter.PlotterRegion;
+import hep.aida.ref.plotter.Plotter;
+import hep.aida.ref.plotter.PlotterRegion;
 import hep.aida.ref.rootwriter.RootFileStore;
 import hep.physics.vec.Hep3Vector;
 
@@ -33,6 +33,8 @@
 import org.lcsim.event.GenericObject;
 import org.lcsim.event.RawTrackerHit;
 import org.lcsim.geometry.Detector;
+import org.lcsim.recon.tracking.digitization.sisim.SiTrackerHitStrip1D;
+import org.lcsim.recon.tracking.digitization.sisim.TrackerHitType;
 import org.lcsim.util.Driver;
 import org.hps.record.triggerbank.AbstractIntData;
 import org.hps.record.triggerbank.TIData;
@@ -46,9 +48,9 @@
 public class SensorOccupancyPlotsDriver extends Driver {
 
     // TODO: Add documentation
-    static {
-        hep.aida.jfree.AnalysisFactory.register();
-    }
+    //static {
+    //    hep.aida.jfree.AnalysisFactory.register();
+    //}
 
     // Plotting
     private static ITree tree = null;
@@ -60,6 +62,8 @@
     private static Map<String, IPlotter> plotters = new HashMap<String, IPlotter>();
     private static Map<String, IHistogram1D> occupancyPlots = new HashMap<String, IHistogram1D>();
     private static Map<String, IHistogram1D> positionPlots = new HashMap<String, IHistogram1D>();
+    private static Map<String, IHistogram1D> clusterPositionPlots = new HashMap<String, IHistogram1D>();
+    private static Map<String, IHistogram1D> clusterPositionPlotCounts = new HashMap<String, IHistogram1D>();
     private static Map<String, int[]> occupancyMap = new HashMap<String, int[]>();
     private static Map<String, IHistogram1D> maxSamplePositionPlots = new HashMap<String, IHistogram1D>();
 
@@ -69,6 +73,7 @@
     private static final String SUBDETECTOR_NAME = "Tracker";
     private String rawTrackerHitCollectionName = "SVTRawTrackerHits";
     private String triggerBankCollectionName = "TriggerBank";
+    private String stripClusterCollectionName = "StripClusterer_SiTrackerHitStrip1D";
 
     String rootFile = null;
 
@@ -100,6 +105,10 @@
 
     private boolean dropSmallHitEvents = true;
 
+    private boolean enableClusterTimeCuts = true;
+    private double clusterTimeCutMax = 4.0;
+    private double clusterTimeCutMin = -4.0;
+    
     public SensorOccupancyPlotsDriver() {
         maxSampleStatus = new SystemStatusImpl(Subsystem.SVT, "Checks that SVT is timed in (max sample plot)", true);
         maxSampleStatus.setStatus(StatusCode.UNKNOWN, "Status is unknown.");
@@ -305,6 +314,8 @@
 
             if (enablePositionPlots) {
                 positionPlots.get(sensor.getName()).reset();
+                clusterPositionPlots.get(sensor.getName()).reset();
+                clusterPositionPlotCounts.get(sensor.getName()).reset();
             }
 
             if (enableMaxSamplePlots) {
@@ -341,7 +352,9 @@
 //            this.resetPlots();
 //            return; 
 //        }
-        tree = analysisFactory.createTreeFactory().create();
+        //tree = analysisFactory.createTreeFactory().create();
+        tree = AIDA.defaultInstance().tree();
+        tree.cd("/");//        aida.tree().cd("/");
         histogramFactory = analysisFactory.createHistogramFactory(tree);
 
         // Create the plotter and regions.  A region is created for each
@@ -354,6 +367,8 @@
         if (enablePositionPlots) {
             plotters.put("Occupancy vs Position", plotterFactory.create("Occupancy vs Position"));
             plotters.get("Occupancy vs Position").createRegions(6, 6);
+            plotters.put("Cluster occupancy vs Position", plotterFactory.create("Cluster occupancy vs Position"));
+            plotters.get("Cluster occupancy vs Position").createRegions(6, 6);
         }
 
         if (enableMaxSamplePlots) {
@@ -373,13 +388,23 @@
                 if (sensor.isTopLayer()) {
                     positionPlots.put(sensor.getName(),
                             histogramFactory.createHistogram1D(sensor.getName() + " - Occupancy vs Position", 1000, 0, 60));
+                    clusterPositionPlots.put(sensor.getName(),
+                            histogramFactory.createHistogram1D(sensor.getName() + " - Cluster occupancy vs Position", 1000, 0, 60));
+                    clusterPositionPlotCounts.put(sensor.getName(),
+                            histogramFactory.createHistogram1D(sensor.getName() + " - Cluster count vs Position", 1000, 0, 60));
                 } else {
                     positionPlots.put(sensor.getName(),
                             histogramFactory.createHistogram1D(sensor.getName() + " - Occupancy vs Position", 1000, -60, 0));
+                    clusterPositionPlots.put(sensor.getName(),
+                            histogramFactory.createHistogram1D(sensor.getName() + " - Cluster occupancy vs Position", 1000, -60, 0));
+                    clusterPositionPlotCounts.put(sensor.getName(),
+                            histogramFactory.createHistogram1D(sensor.getName() + " - Cluster count vs Position", 1000, -60, 0));
                 }
 
                 plotters.get("Occupancy vs Position").region(SvtPlotUtils.computePlotterRegion(sensor))
                         .plot(positionPlots.get(sensor.getName()), this.createOccupancyPlotStyle("Distance from Beam [mm]", sensor, false));
+                plotters.get("Cluster occupancy vs Position").region(SvtPlotUtils.computePlotterRegion(sensor))
+                .plot(clusterPositionPlots.get(sensor.getName()), this.createOccupancyPlotStyle("Distance from Beam [mm]", sensor, false));
             }
             occupancyMap.put(sensor.getName(), new int[640]);
 
@@ -393,11 +418,12 @@
 
         for (IPlotter plotter : plotters.values()) {
             for (int regionN = 0; regionN < plotter.numberOfRegions(); regionN++) {
-                PlotterRegion region = ((PlotterRegion) ((Plotter) plotter).region(regionN));
-                if (region.getPlottedObjects().isEmpty()) {
-                    continue;
-                }
-                region.getPanel().addMouseListener(new PopupPlotterListener(region));
+                //Plotter l;
+                //PlotterRegion region = ((PlotterRegion) ((Plotter) plotter).region(regionN));
+                //if (region..getPlottedObjects().isEmpty()) {
+                //    continue;
+                //}
+                //region.getPanel().addMouseListener(new PopupPlotterListener(region));
             }
             plotter.show();
         }
@@ -494,6 +520,22 @@
                 maxSamplePositionPlots.get(((HpsSiSensor) rawHit.getDetectorElement()).getName()).fill(maxSamplePositionFound);
             }
         }
+        
+        // Fill the strip cluster counts if available
+        if(event.hasCollection(SiTrackerHitStrip1D.class, stripClusterCollectionName)) {
+            List<SiTrackerHitStrip1D> stripHits1D = event.get(SiTrackerHitStrip1D.class, stripClusterCollectionName);
+            for(SiTrackerHitStrip1D h : stripHits1D) {
+                SiTrackerHitStrip1D global = h.getTransformedHit(TrackerHitType.CoordinateSystem.GLOBAL);
+                Hep3Vector pos_global = global.getPositionAsVector();
+                if(enableClusterTimeCuts) {
+                    if( h.getTime() < clusterTimeCutMax && h.getTime() > clusterTimeCutMin) 
+                        clusterPositionPlotCounts.get(((HpsSiSensor) h.getRawHits().get(0).getDetectorElement()).getName()).fill(pos_global.y());
+                } else
+                    clusterPositionPlotCounts.get(((HpsSiSensor) h.getRawHits().get(0).getDetectorElement()).getName()).fill(pos_global.y());
+            }
+        }
+        
+        
 
         if (enableMaxSamplePlots && eventCount > maxSampleMonitorStart && eventCount % maxSampleMonitorPeriod == 0) {
             checkMaxSample();
@@ -521,6 +563,17 @@
                         positionPlots.get(sensor.getName()).fill(stripPosition, stripOccupancy);
                     }
                 }
+                if(enablePositionPlots) {
+                    clusterPositionPlots.get(sensor.getName()).reset();
+                    IHistogram1D h = clusterPositionPlotCounts.get(sensor.getName());
+                    for(int bin=0; bin<h.axis().bins(); ++bin) {
+                        int y = h.binEntries(bin);
+                        double stripClusterOccupancy = (double) y / (double) eventCount;
+                        double x = h.axis().binCenter(bin);
+                        clusterPositionPlots.get(sensor.getName()).fill(x,stripClusterOccupancy);
+                    }
+                }
+                
             }
         }
 

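The endOfData logic added above converts the cluster count histogram into an occupancy histogram by dividing each bin's entries by the number of processed events. A minimal sketch of that normalization, with plain arrays standing in for the AIDA IHistogram1D used in the driver (names are illustrative):

```java
// Sketch of the per-bin occupancy normalization in the driver above:
// occupancy = bin counts / total events processed.
public class OccupancyNormalizer {

    /** Per-bin occupancy, given raw counts and the event count. */
    public static double[] normalize(int[] binCounts, int eventCount) {
        double[] occupancy = new double[binCounts.length];
        for (int bin = 0; bin < binCounts.length; ++bin) {
            occupancy[bin] = (double) binCounts[bin] / (double) eventCount;
        }
        return occupancy;
    }
}
```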
Modified: java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/svt/SvtPlotUtils.java
 =============================================================================
--- java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/svt/SvtPlotUtils.java	(original)
+++ java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/svt/SvtPlotUtils.java	Tue Dec  1 15:55:47 2015
@@ -1,14 +1,30 @@
 package org.hps.monitoring.drivers.svt;
 
+import hep.aida.IBaseHistogram;
+import hep.aida.IFitFactory;
+import hep.aida.IFitResult;
+import hep.aida.IFitter;
+import hep.aida.IFunction;
+import hep.aida.IFunctionFactory;
+import hep.aida.IHistogram1D;
+import hep.aida.IPlotter;
 import hep.aida.IPlotterFactory;
 import hep.aida.IPlotterStyle;
+import hep.aida.ref.plotter.style.registry.IStyleStore;
+import hep.aida.ref.plotter.style.registry.StyleRegistry;
+
 import java.util.HashMap;
 import java.util.HashSet;
 import java.util.List;
 import java.util.Map;
 import java.util.Set;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
 import org.lcsim.detector.tracker.silicon.HpsSiSensor;
 import org.lcsim.event.RawTrackerHit;
+import org.lcsim.geometry.compact.converter.HPSTrackerBuilder;
+import org.lcsim.util.aida.AIDA;
 
 /**
  *
@@ -16,6 +32,9 @@
  */
 public class SvtPlotUtils {
 
+    private static final Logger logger = Logger.getLogger(SvtPlotUtils.class.getSimpleName());
+    static private AIDA aida = AIDA.defaultInstance();
+    
     public static int computePlotterRegion(HpsSiSensor sensor) {
 
         if (sensor.getLayerNumber() < 7) {
@@ -44,6 +63,35 @@
         return -1;
     }
 
+    public static int computePlotterRegionAxialOnly(HpsSiSensor sensor) {
+        int l =  HPSTrackerBuilder.getLayerFromVolumeName(sensor.getName());
+        if(!sensor.isAxial()) throw new RuntimeException("not axial.");
+        if( l < 4 ) {
+            if (sensor.isTopLayer()) {
+                return 6 * (l - 1);
+            } else {
+                return 6 * (l - 1) + 1;
+            }
+        } else {
+            if (sensor.isTopLayer()) {
+                if (sensor.getSide() == HpsSiSensor.POSITRON_SIDE) {
+                    return 6 * (l - 4) + 2;
+                } else {
+                    return 6 * (l - 4) + 3;
+                }
+            } else if (sensor.isBottomLayer()) {
+                if (sensor.getSide() == HpsSiSensor.POSITRON_SIDE) {
+                    return 6 * (l - 4) + 4;
+                } else {
+                    return 6 * (l - 4) + 5;
+                }
+            }
+        }
+
+        return -1;
+    }
+
+    
     /**
      * Create a plotter style.
      *
@@ -140,4 +188,65 @@
         }
         return true;
     }
+    
+    public static IFitResult performGaussianFit(IHistogram1D histogram) {
+        IFunctionFactory functionFactory = aida.analysisFactory().createFunctionFactory(null);
+        IFitFactory fitFactory = aida.analysisFactory().createFitFactory();
+        IFunction function = functionFactory.createFunctionByName("Example Fit", "G");
+        IFitter fitter = fitFactory.createFitter("chi2", "jminuit");
+        double[] parameters = new double[3];
+        parameters[0] = histogram.maxBinHeight();
+        parameters[1] = histogram.mean();
+        parameters[2] = histogram.rms();
+        function.setParameters(parameters);
+        IFitResult fitResult = null;
+         Logger minuitLogger = Logger.getLogger("org.freehep.math.minuit");
+        minuitLogger.setLevel(Level.OFF);
+        minuitLogger.info("minuit logger test");
+        
+        try {
+            fitResult = fitter.fit(histogram, function);
+        } catch (RuntimeException e) {
+           logger.warning("fit failed");
+        }
+        return fitResult;
+    }
+    
+    /*
+     *  Puts a function on a plotter region with a style.
+     *  Copied from org.hps.monitoring.drivers.ecal.EcalMonitoringUtilities.java.
+     */
+
+    public static void plot(IPlotter plotter, IFunction function, IPlotterStyle style, int region) {
+        if (style == null)
+            style = getPlotterStyle(function);
+        logger.info("Putting function in region " + region);
+        if(style != null)
+            plotter.region(region).plot(function, style);
+        else
+            plotter.region(region).plot(function);
+    }
+
+    
+    /*
+     *  Gets the default plotter style for a function type.
+     *  Copied from org.hps.monitoring.drivers.ecal.EcalMonitoringUtilities.java.
+     */
+    public static IPlotterStyle getPlotterStyle(IFunction func) {
+        StyleRegistry styleRegistry = StyleRegistry.getStyleRegistry();
+        IStyleStore store = styleRegistry.getStore("DefaultStyleStore");
+        if(store == null) {
+            int n = styleRegistry.getAvailableStoreNames().length;
+            if(n==0) return null;
+            else store = styleRegistry.getStore(styleRegistry.getAvailableStoreNames()[0]);
+        }
+        IPlotterStyle style = null;
+        style = store.getStyle("DefaultFunctionStyle");
+        if (style == null) {
+            int n = store.getAllStyleNames().length;
+            if(n==0) return null;
+            else style = store.getStyle(store.getAllStyleNames()[0]);
+        }
+        return style;
+    }
 }

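The performGaussianFit method added above seeds the Gaussian fit from the histogram itself: amplitude from the tallest bin, mean and sigma from the histogram's mean and RMS. The sketch below shows the same seeding strategy computed from raw samples rather than through the AIDA IHistogram1D interface (class and method names are illustrative):

```java
// Sketch of the fit-seeding strategy in performGaussianFit above: start the
// minimizer at {amplitude, mean, rms} taken from the data, so the chi2 fit
// begins near the answer instead of at arbitrary parameters.
public class GaussianSeed {

    /** Returns {amplitude, mean, rms} seeds, analogous to the AIDA-based code. */
    public static double[] seedParameters(double maxBinHeight, double[] samples) {
        double sum = 0;
        for (double x : samples) {
            sum += x;
        }
        double mean = sum / samples.length;
        double var = 0;
        for (double x : samples) {
            var += (x - mean) * (x - mean);
        }
        double rms = Math.sqrt(var / samples.length);
        return new double[] {maxBinHeight, mean, rms};
    }
}
```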
Modified: java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/ecal/plots/EcalLedSequenceMonitor.java
 =============================================================================
--- java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/ecal/plots/EcalLedSequenceMonitor.java	(original)
+++ java/branches/jeremy-dev/monitoring-drivers/src/main/java/org/hps/monitoring/ecal/plots/EcalLedSequenceMonitor.java	Tue Dec  1 15:55:47 2015
@@ -63,7 +63,7 @@
     private static final String dbTableName = "ecal_led_calibrations";
     private static final int runNumberMax = 9999;
     private static final int nDrivers = 8;
-    private static final int nSteps = 56;
+    private static final int nSteps = 100; // should be 56; enlarged to avoid an out-of-bounds crash
     
    
 
@@ -746,6 +746,8 @@
                 led_calibrations.getCollectionId(), runNumber, runNumberMax, dbTableName, dbTableName, 
                 "Generated by LedAnalysis from Run #"+runNumber, dbTag);
         conditionsRecord.setConnection(conditionsManager.getConnection());
+        tableMetaData = conditionsManager.findTableMetaData("conditions");
+        conditionsRecord.setTableMetaData(tableMetaData);
         conditionsRecord.insert();
 
         System.out.println("Upload to DB done");

Modified: java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/EventFlagFilter.java
 =============================================================================
--- java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/EventFlagFilter.java	(original)
+++ java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/EventFlagFilter.java	Tue Dec  1 15:55:47 2015
@@ -11,7 +11,7 @@
  */
 public class EventFlagFilter extends EventReconFilter {
 
-    String[] flagNames = {"svt_bias_good", "svt_position_good", "svt_burstmode_noise_good", "svt_event_header_good"};
+    String[] flagNames = {"svt_bias_good", "svt_position_good", "svt_burstmode_noise_good", "svt_event_header_good", "svt_latency_good"};
 
     public void setFlagNames(String[] flagNames) {
         this.flagNames = flagNames;

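The change above adds svt_latency_good to the default flag list, so the filter now requires that flag as well. A sketch of the filtering logic this implies: an event passes only if every named integer flag is present and set to 1. A plain Map stands in for the lcsim EventHeader parameter map (class and method names are illustrative):

```java
import java.util.Map;

// Sketch of the EventFlagFilter decision implied by the flag list above:
// every flag named in flagNames must exist in the event's integer
// parameters and have the value 1.
public class FlagCheck {

    public static boolean passes(Map<String, int[]> params, String[] flagNames) {
        for (String name : flagNames) {
            int[] flag = params.get(name);
            if (flag == null || flag.length == 0 || flag[0] != 1) {
                return false;
            }
        }
        return true;
    }
}
```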
Modified: java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/PulserTriggerFilterDriver.java
 =============================================================================
--- java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/PulserTriggerFilterDriver.java	(original)
+++ java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/PulserTriggerFilterDriver.java	Tue Dec  1 15:55:47 2015
@@ -7,7 +7,14 @@
 import org.hps.record.scalers.ScalerData;
 import org.hps.record.triggerbank.AbstractIntData;
 import org.hps.record.triggerbank.TIData;
-
+/**
+ * Keep pulser-triggered events, as well as EPICS and scaler events.
+ * Drop all other events.
+ *
+ * @author baltzell
+ */
 public class PulserTriggerFilterDriver extends Driver
 {
   public void process(EventHeader event) {

Modified: java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/V0CandidateFilter.java
 =============================================================================
--- java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/V0CandidateFilter.java	(original)
+++ java/branches/jeremy-dev/recon/src/main/java/org/hps/recon/filtering/V0CandidateFilter.java	Tue Dec  1 15:55:47 2015
@@ -2,15 +2,17 @@
 
 import static java.lang.Math.abs;
 import java.util.List;
+import org.hps.recon.ecal.cluster.ClusterUtilities;
+import org.hps.recon.particle.ReconParticleDriver;
 import org.hps.record.epics.EpicsData;
 import org.lcsim.event.EventHeader;
 import org.lcsim.event.ReconstructedParticle;
 
 /**
- * Class to strip off trident candidates. Currently defined as: e+ e- events with
- * tracks matched to clusters. Neither electron can be a full-energy candidate
- * (momentum less than _fullEnergyCut [0.85GeV]). The Ecal cluster times must be 
- * within _timingCut [2.5ns] of each other.
+ * Class to strip off trident candidates. Currently defined as: e+ e- events
+ * with tracks. If the tight constraint is enabled, tracks must be matched to
+ * clusters and the Ecal cluster times must be within _timingCut [2.5ns] of each
+ * other.
  *
  * @author Norman A Graf
  *
@@ -23,7 +25,6 @@
 
     private boolean _tight = false;
     private boolean _keepEpicsDataEvents = false;
-
 
     @Override
     protected void process(EventHeader event) {
@@ -40,7 +41,7 @@
             skipEvent();
         }
         List<ReconstructedParticle> V0Candidates = event.get(ReconstructedParticle.class, _V0CandidateCollectionName);
-        if (V0Candidates.size() == 0) {
+        if (V0Candidates.isEmpty()) {
             skipEvent();
         }
 
@@ -49,34 +50,33 @@
             if (V0Candidates.size() != 2) {
                 skipEvent();
             }
-        }
+            for (ReconstructedParticle rp : V0Candidates) {
 
-        for (ReconstructedParticle rp : V0Candidates) {
+                ReconstructedParticle electron;
+                ReconstructedParticle positron;
 
-            ReconstructedParticle e1 = null;
-            ReconstructedParticle e2 = null;
+                List<ReconstructedParticle> fsParticles = rp.getParticles();
+                if (fsParticles.size() != 2) {
+                    skipEvent();
+                }
+                // require both electrons to be associated with an ECal cluster
+                electron = fsParticles.get(ReconParticleDriver.ELECTRON);
+                if (electron.getClusters().isEmpty()) {
+                    skipEvent();
+                }
+                positron = fsParticles.get(ReconParticleDriver.POSITRON);
+                if (positron.getClusters().isEmpty()) {
+                    skipEvent();
+                }
 
-            List<ReconstructedParticle> electrons = rp.getParticles();
-            if (electrons.size() != 2) {
-                skipEvent();
-            }
-            // require both electrons to be associated with an ECal cluster
-            e1 = electrons.get(0);
-            if (e1.getClusters().size() == 0) {
-                skipEvent();
-            }
-            e2 = electrons.get(1);
-            if (e2.getClusters().size() == 0) {
-                skipEvent();
-            }
+                // calorimeter cluster timing cut
+                // first CalorimeterHit in the list is the seed crystal
+                double t1 = ClusterUtilities.getSeedHitTime(electron.getClusters().get(0));
+                double t2 = ClusterUtilities.getSeedHitTime(positron.getClusters().get(0));
 
-            // calorimeter cluster timing cut
-            // first CalorimeterHit in the list is the seed crystal
-            double t1 = e1.getClusters().get(0).getCalorimeterHits().get(0).getTime();
-            double t2 = e2.getClusters().get(0).getCalorimeterHits().get(0).getTime();
-
-            if (abs(t1 - t2) > _clusterTimingCut) {
-                skipEvent();
+                if (abs(t1 - t2) > _clusterTimingCut) {
+                    skipEvent();
+                }
             }
         }
         incrementEventPassed();
@@ -109,14 +109,13 @@
     public void setTightConstraint(boolean b) {
         _tight = b;
     }
-    
+
     /**
      * Setting this true keeps ALL events containing EPICS data
      *
      * @param b
      */
-    public void setKeepEpicsDataEvents(boolean b)
-    {
+    public void setKeepEpicsDataEvents(boolean b) {
         _keepEpicsDataEvents = b;
     }
 }
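The tightened selection above ends with a cluster timing cut on the two seed-hit times. Stripped of the event model, the accept condition reduces to the following sketch, using the 2.5 ns default from the class Javadoc:

```java
// Sketch of the cluster timing cut: a V0 candidate survives only if the two
// cluster seed-hit times agree within the cut, 2.5 ns by default.
public class TimingCutSketch {

    public static final double DEFAULT_CUT_NS = 2.5;

    public static boolean passes(double t1, double t2, double cutNs) {
        return Math.abs(t1 - t2) <= cutNs;
    }
}
```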

Added: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractLoopAdapter.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractLoopAdapter.java	(added)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractLoopAdapter.java	Tue Dec  1 15:55:47 2015
@@ -0,0 +1,52 @@
+package org.hps.record;
+
+import java.util.ArrayList;
+import java.util.List;
+import java.util.logging.Logger;
+
+import org.freehep.record.loop.AbstractLoopListener;
+import org.freehep.record.loop.LoopEvent;
+import org.freehep.record.loop.LoopListener;
+import org.freehep.record.loop.RecordEvent;
+import org.freehep.record.loop.RecordListener;
+
+public abstract class AbstractLoopAdapter<RecordType> extends AbstractLoopListener implements RecordListener, LoopListener {
+
+    private Logger LOGGER = Logger.getLogger(AbstractLoopAdapter.class.getPackage().getName());
+    
+    private List<AbstractRecordProcessor<RecordType>> processors = new ArrayList<AbstractRecordProcessor<RecordType>>();
+           
+    @Override
+    public void recordSupplied(RecordEvent recordEvent) {
+        //LOGGER.info("recordSupplied " + recordEvent.toString());
+        final RecordType record = (RecordType) recordEvent.getRecord();
+        for (final AbstractRecordProcessor<RecordType> processor : processors) {
+            try {
+                if (processor.isActive()) {
+                    //LOGGER.info("activating processor " + processor.getClass().getName());
+                    processor.process(record);
+                }
+            } catch (final Exception e) {
+                throw new RuntimeException(e);
+            }
+        }
+    }
+    
+    @Override
+    protected void finish(final LoopEvent event) {
+        for (final AbstractRecordProcessor<RecordType> processor : processors) {
+            processor.endJob();
+        }
+    }
+    
+    @Override
+    protected void start(final LoopEvent event) {
+        for (final AbstractRecordProcessor<RecordType> processor : processors) {
+            processor.startJob();
+        }
+    }
+    
+    void addProcessor(AbstractRecordProcessor<RecordType> processor) {
+        this.processors.add(processor);
+    }    
+}
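The adapter's core behavior is fanning each supplied record out to every processor that reports itself active. A self-contained sketch of that dispatch, with simplified stand-ins for the freehep and HPS types:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of AbstractLoopAdapter's dispatch: every supplied record is
// passed to each registered processor whose isActive() returns true.
public class LoopAdapterSketch<R> {

    public interface Processor<R> {
        boolean isActive();
        void process(R record);
    }

    private final List<Processor<R>> processors = new ArrayList<>();

    public void addProcessor(Processor<R> processor) {
        processors.add(processor);
    }

    public void recordSupplied(R record) {
        for (Processor<R> processor : processors) {
            if (processor.isActive()) {
                processor.process(record);
            }
        }
    }
}
```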

Added: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractRecordLoop.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractRecordLoop.java	(added)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractRecordLoop.java	Tue Dec  1 15:55:47 2015
@@ -0,0 +1,36 @@
+package org.hps.record;
+
+import java.util.Collection;
+
+import org.freehep.record.loop.DefaultRecordLoop;
+
+public abstract class AbstractRecordLoop<RecordType> extends DefaultRecordLoop {
+    
+    protected AbstractLoopAdapter<RecordType> adapter;
+    
+    public void addProcessors(Collection<AbstractRecordProcessor<RecordType>> processors) {
+        for (AbstractRecordProcessor<RecordType> processor : processors) {
+            adapter.addProcessor(processor);
+        }
+    }
+    
+    public void addProcessor(AbstractRecordProcessor<RecordType> processor) {
+        adapter.addProcessor(processor);
+    }
+    
+    /**
+     * Loop over events from the source.
+     *
+     * @param number the number of events to process or -1L for all events from the source
+     * @return the number of records that were processed
+     */
+    public long loop(final long number) {
+        if (number < 0L) {
+            this.execute(Command.GO, true);
+        } else {
+            this.execute(Command.GO_N, number, true);
+            this.execute(Command.STOP);
+        }
+        return this.getSupplied();
+    }
+}
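The loop(number) contract above, with -1 meaning all records from the source, amounts to a bounded drain of the record source. A standalone sketch, using a plain Iterator in place of the freehep record source:

```java
import java.util.Iterator;

// Sketch of the loop(number) contract: a negative count means "process all
// records from the source"; otherwise stop after `number` records. The
// return value is the number of records actually supplied.
public class RecordLoopSketch {

    public static long loop(Iterator<?> source, long number) {
        long supplied = 0L;
        while (source.hasNext() && (number < 0L || supplied < number)) {
            source.next();
            supplied++;
        }
        return supplied;
    }
}
```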

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractRecordProcessor.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractRecordProcessor.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractRecordProcessor.java	Tue Dec  1 15:55:47 2015
@@ -10,6 +10,8 @@
  */
 public abstract class AbstractRecordProcessor<RecordType> implements RecordProcessor<RecordType> {
 
+    private boolean active = true;
+    
     /**
      * End of job action.
      */
@@ -59,4 +61,12 @@
     @Override
     public void suspend() {
     }
+    
+    protected void setActive(boolean active) {
+        this.active = active;
+    }
+    
+    public boolean isActive() {
+        return this.active;
+    }
 }

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/RecordProcessor.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/RecordProcessor.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/RecordProcessor.java	Tue Dec  1 15:55:47 2015
@@ -45,4 +45,11 @@
      * Suspend processing action.
      */
     void suspend();
+    
+    /**
+     * Return <code>true</code> if processor is active.
+     * 
+     * @return <code>true</code> if processor is active
+     */
+    boolean isActive();
 }

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/daqconfig/DAQConfigDriver.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/daqconfig/DAQConfigDriver.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/daqconfig/DAQConfigDriver.java	Tue Dec  1 15:55:47 2015
@@ -1,16 +1,36 @@
 package org.hps.record.daqconfig;
 
+import java.io.BufferedReader;
+import java.io.File;
+import java.io.FileReader;
+import java.io.IOException;
+import java.util.ArrayList;
 import java.util.List;
 
 import org.lcsim.event.EventHeader;
 import org.lcsim.util.Driver;
 
 /**
- * Class <code>DAQConfigDriver</code> is responsible for checking events
- * for DAQ configuration settings, and then passing them to the associated
+ * Class <code>DAQConfigDriver</code> is responsible for accessing the
+ * DAQ configuration settings, and then passing them to the associated
  * class <code>ConfigurationManager</code> so that they can be accessed
  * by other classes.<br/>
  * <br/>
+ * The driver may accomplish this by one of two means. By default, it will
+ * check each event for an <code>EvioDAQParser</code> object which
+ * contains all of the DAQ configuration information and pass this to
+ * the <code>ConfigurationManager</code>. It will continue to update
+ * the <code>ConfigurationManager</code> during the run if new parser
+ * objects appear, and can thus account for changing DAQ conditions.
+ * <br/><br/>
+ * The driver may also be set to read a DAQ configuration from text
+ * files containing the DAQ configuration bank information. To enable
+ * this mode, the parameter <code>readDataFiles</code> must be set to
+ * <code>true</code> and the parameters <code>runNumber</code> and also
+ * <code>filepath</code> must be defined. <code>runNumber</code> defines
+ * the run number of the configuration to be loaded and the parameter
+ * <code>filepath</code> defines the location of the data file repository.
+ * <br/><br/>
  * This driver must be included in the driver chain if any other drivers
 * in the chain rely on <code>ConfigurationManager</code>, as it
 * cannot be initialized otherwise.
@@ -19,15 +39,88 @@
  * @see ConfigurationManager
  */
 public class DAQConfigDriver extends Driver {
+	private int runNumber = -1;
+	private String filepath = null;
+	private boolean firstEvent = true;
+	private boolean readDataFiles = false;
+	private File[] dataFiles = new File[3];
+	private int[] crateNumber = { 46, 37, 39 };
+	
+	/**
+	 * Verifies the parameter <code>filepath</code> for the data file
+	 * repository and checks that appropriate data files exist for the
+	 * requested run number if the driver is set to read from data files.
+	 * Otherwise, this does nothing.
+	 */
+	@Override
+	public void startOfData() {
+		// Check whether to use stored data files or the EvIO data stream
+		// as the source of the DAQ settings. Nothing needs to be done
+		// in the latter case.
+		if(readDataFiles) {
+			// The user must define a data file prefix and repository
+			// location for this option to be used.
+			if(filepath == null) {
+				throw new NullPointerException("DAQ settings repository filepath must be defined.");
+			} if(runNumber == -1) {
+				throw new NullPointerException("Run number must be defined.");
+			}
+			
+			// Verify that the repository actually exists.
+			File repository = new File(filepath);
+			if(!repository.exists() || !repository.isDirectory()) {
+				throw new IllegalArgumentException("Repository location \"" + filepath + "\" must be an existing directory.");
+			}
+			
+			// Define the data file objects.
+			for(int i = 0; i < dataFiles.length; i++) {
+				try {
+					dataFiles[i] = new File(repository.getCanonicalPath() + "/" + runNumber + "_" + crateNumber[i] + ".txt");
+				} catch(IOException e) {
+					throw new RuntimeException("Error resolving absolute repository filepath.");
+				}
+			}
+			
+			// Verify that the data files actually exist.
+			for(File dataFile : dataFiles) {
+				if(!dataFile.exists() || !dataFile.canRead()) {
+					throw new IllegalArgumentException("Data file \"" + dataFile.getName() + "\" does not exist or can not be read.");
+				}
+			}
+		}
+	}
+	
     /**
      * Checks an event for the DAQ configuration banks and passes them
-     * to the <code>ConfigurationManager</code>.
-     * @param - The event to check.
+     * to the <code>ConfigurationManager</code> if the driver is set to
+     * read from the EvIO data stream. Otherwise, this will parse the
+     * data files on the first event and then do nothing.
+     * @param event - The current LCIO event.
      */
     @Override
     public void process(EventHeader event) {
+    	// If this is the first event and data files are to be read,
+    	// import the data files and generate the DAQ information.
+    	if(firstEvent && readDataFiles) {
+    		// Get the data files in the form of a data array.
+    		String[][] data;
+    		try { data = getDataFileArrays(dataFiles); }
+    		catch(IOException e) {
+				throw new RuntimeException("An error occurred when processing the data files.");
+			}
+    		
+    		// Instantiate an EvIO DAQ parser and feed it the data.
+    		EvioDAQParser daqConfig = new EvioDAQParser();
+    		for(int i = 0; i < dataFiles.length; i++) {
+        		daqConfig.parse(crateNumber[i], runNumber, data[i]);
+    		}
+    		
+    		// Update the configuration manager.
+    		ConfigurationManager.updateConfiguration(daqConfig);
+    	}
+    	
         // Check if a trigger configuration bank exists.
-        if(event.hasCollection(EvioDAQParser.class, "TriggerConfig")) {
+    	if(!readDataFiles && event.hasCollection(EvioDAQParser.class, "TriggerConfig")) {
             // Get the trigger configuration bank. There should only be
             // one in the list.
             List<EvioDAQParser> configList = event.get(EvioDAQParser.class, "TriggerConfig");
@@ -37,5 +130,87 @@
             // configuration object.
             ConfigurationManager.updateConfiguration(daqConfig);
         }
+        
+        // Note that it is no longer the first event.
+        firstEvent = false;
+    }
+	
+    /**
+     * Converts DAQ configuration data files into an array of strings
+     * where each array entry represents a line in the configuration
+     * file. The first array index of the returned object corresponds
+     * to the file, and the second array index corresponds to the line.
+     * @param dataFiles - An array of <code>File</code> objects pointing
+     * to the data files that are to be converted. These are expected
+     * to be plain text files.
+     * @return Returns a two-dimensional array of <code>String</code>
+     * objects where the first array index corresponds to the object
+     * of the same index in the <code>File</code> array and the second
+     * array index corresponds to the lines in the file referenced by
+     * the <code>File</code> object.
+     * @throws IOException Occurs if there is an issue with accessing
+     * or reading the files referred to by the <code>File</code> objects
+     * in the <code>dataFiles</code> array.
+     */
+	private static final String[][] getDataFileArrays(File[] dataFiles) throws IOException {
+		// Create file readers to process the data files.
+		FileReader[] fr = new FileReader[dataFiles.length];
+		BufferedReader[] reader = new BufferedReader[dataFiles.length];
+		for(int i = 0; i < dataFiles.length; i++) {
+			fr[i] = new FileReader(dataFiles[i]);
+			reader[i] = new BufferedReader(fr[i]);
+		}
+		
+		// Generate String arrays where each entry in the array is
+		// a line from the data file.
+		String[][] data = new String[dataFiles.length][0];
+		for(int i = 0; i < dataFiles.length; i++) {
+			// Create a list to hold the raw strings.
+			List<String> rawData = new ArrayList<String>();
+			
+			// Add each line from the current data file to the list
+			// as a single entry.
+			String curLine = null;
+			while((curLine = reader[i].readLine()) != null) {
+				rawData.add(curLine);
+			}
+			
+			// Convert the list into a String array.
+			data[i] = rawData.toArray(new String[rawData.size()]);
+		}
+		
+		// Return the data array.
+		return data;
+	}
+    
+	/**
+	 * Sets the run number of the DAQ configuration being processed.
+	 * This is only used when reading from data files.
+	 * @param run - The run number of the data files to be used.
+	 */
+    public void setRunNumber(int run) {
+    	runNumber = run;
+    }
+    
+    /**
+     * Sets the location of the DAQ configuration data files. This is
+     * only used when reading from the data files.
+     * @param filepath - The file path of the data file repository.
+     */
+    public void setDataFileRepository(String filepath) {
+    	this.filepath = filepath;
+    }
+    
+    /**
+     * Sets whether or not to read the DAQ configuration directly from
+     * the EvIO data stream or whether to read the configuration from
+     * data files. Parameters <code>runNumber</code> and <code>filepath</code>
+     * must also be defined if this is set to <code>true</code>.
+     * @param state - <code>true</code> indicates that the configuration
+     * should be read from data files, and <code>false</code> that it
+     * should be read from the EvIO stream.
+     */
+    public void setReadDataFiles(boolean state) {
+    	readDataFiles = state;
     }
 }
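The data-file mode above hinges on getDataFileArrays, which turns each configuration file into a String array of its lines. A standalone version of that per-file read, written with try-with-resources rather than the explicit reader arrays used in the driver:

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Standalone version of the per-file read done by getDataFileArrays: each
// entry of the returned array is one line of the DAQ configuration file.
public class DaqFileReadSketch {

    public static String[] readLines(File file) throws IOException {
        List<String> lines = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line);
            }
        }
        return lines.toArray(new String[lines.size()]);
    }
}
```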

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/daqconfig/DAQConfigEvioProcessor.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/daqconfig/DAQConfigEvioProcessor.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/daqconfig/DAQConfigEvioProcessor.java	Tue Dec  1 15:55:47 2015
@@ -1,65 +1,129 @@
 package org.hps.record.daqconfig;
 
-import java.util.ArrayList;
-import java.util.List;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.logging.Level;
+import java.util.logging.Logger;
 
-import org.hps.conditions.database.DatabaseConditionsManager;
 import org.hps.record.evio.EvioBankTag;
 import org.hps.record.evio.EvioEventProcessor;
 import org.hps.record.evio.EvioEventUtilities;
 import org.jlab.coda.jevio.BaseStructure;
 import org.jlab.coda.jevio.EvioEvent;
-import org.lcsim.conditions.ConditionsManager.ConditionsNotFoundException;
 
 /**
- * Modified from code in {@link org.hps.evio.TriggerConfigEvioReader} to extract trigger
- * config without an output LCSim event.
+ * Copied and modified from code in {@link org.hps.evio.TriggerConfigEvioReader} to extract DAQ config without
+ * needing an output LCSim event.
+ * <p>
+ * Only the last valid DAQ config object will be saved.
  * 
  * @author Jeremy McCormick, SLAC
  */
 public class DAQConfigEvioProcessor extends EvioEventProcessor {
-           
-    private List<EvioDAQParser> triggerConfig = new ArrayList<EvioDAQParser>();
 
+    private Logger LOGGER = Logger.getLogger(DAQConfigEvioProcessor.class.getPackage().getName());
+        
+    private DAQConfig daqConfig = null;
+    
+    private Map<Integer, String> stringData = new HashMap<Integer, String>();
+    
+    private Integer run = null;
+
+    /**
+     * Process EVIO events to extract DAQ config data.
+     */
     @Override
-    public void process(EvioEvent evioEvent) {        
+    public void process(EvioEvent evioEvent) {       
+        try {            
+            // Initialize the run number if necessary.
+            if (run == null) {
+                try {
+                    run = EvioEventUtilities.getRunNumber(evioEvent);
+                    LOGGER.info("run " + run);
+                } catch (NullPointerException e) {
+                }
+            }
+
+            // Can only start parsing DAQ banks once the run is set.
+            if (run != null) {
+                
+                // Parse config data from the EVIO banks.
+                EvioDAQParser evioParser = parseEvioData(evioEvent);
+            
+                // Was there a valid config created from the EVIO event?
+                if (evioParser != null) {            
+                    // Set the current DAQ config object.
+                    ConfigurationManager.updateConfiguration(evioParser);
+                    daqConfig = ConfigurationManager.getInstance();
+                }
+            }
+        } catch (Exception e) {
+            LOGGER.log(Level.WARNING, "Error parsing DAQ config from EVIO.", e);
+        }
+    }
+    
+    /**
+     * Parse DAQ config from an EVIO event.
+     * 
+     * @param evioEvent the EVIO event
+     * @return a parser object if the event has valid config data; otherwise <code>null</code>
+     */
+    private EvioDAQParser parseEvioData(EvioEvent evioEvent) {
+        EvioDAQParser parser = null;
+        int configBanks = 0;
         for (BaseStructure bank : evioEvent.getChildrenList()) {
-            if (bank.getChildCount() <= 0)
+            if (bank.getChildCount() <= 0) {
                 continue;
+            }
             int crate = bank.getHeader().getTag();
             for (BaseStructure subBank : bank.getChildrenList()) {
                 if (EvioBankTag.TRIGGER_CONFIG.equals(subBank)) {
-                    if (subBank.getStringData() == null) {                        
-                        throw new RuntimeException("Trigger config bank is missing string data.");
+                    if (subBank.getStringData() == null) {
+                        LOGGER.warning("Trigger config bank is missing string data.");
+                    } else {
+                        try { 
+                            if (parser == null) {
+                                parser = new EvioDAQParser();
+                                stringData.clear();
+                            }
+                            LOGGER.fine("raw string data: " + subBank.getStringData()[0]);
+                            stringData.put(crate, subBank.getStringData()[0]);
+                            LOGGER.info("Parsing DAQ config from crate " + crate + ".");
+                            parser.parse(crate, run, subBank.getStringData());
+                            ++configBanks;
+                        } catch (Exception e) {
+                            LOGGER.log(Level.WARNING, "Failed to parse DAQ config.", e);
+                        }
                     }
-                    createTriggerConfig(evioEvent, crate, subBank);
                 }
             }
         }
+        if (configBanks >= 4 || parser == null) {
+            if (parser != null) {
+                LOGGER.info("DAQ config was created from event " + evioEvent.getEventNumber() + " with " + configBanks + " banks.");
+            }
+            return parser;
+        } else {
+            LOGGER.warning("Not enough banks were found to build DAQ config.");
+            return null;
+        }
+    }
+
+    /**
+     * Get the DAQ config.
+     * 
+     * @return the DAQ config
+     */
+    public DAQConfig getDAQConfig() {
+        return this.daqConfig;
     }
     
-    private void createTriggerConfig(EvioEvent evioEvent, int crate, BaseStructure subBank) {
-        
-        // Get run number from EVIO event.
-        int runNumber = EvioEventUtilities.getRunNumber(evioEvent);
-        
-        // Initialize the conditions system if necessary as the DAQ config parsing classes use it.
-        DatabaseConditionsManager conditionsManager = DatabaseConditionsManager.getInstance();
-        if (!conditionsManager.isInitialized() || conditionsManager.getRun() != runNumber) {
-            try {
-                conditionsManager.setXmlConfig("/org/hps/conditions/config/conditions_database_no_svt.xml");
-                DatabaseConditionsManager.getInstance().setDetector("HPS-dummy-detector", runNumber);
-            } catch (ConditionsNotFoundException e) {
-                throw new RuntimeException(e);
-            }
-        }
-        
-        // Create the trigger config from the EVIO data.
-        triggerConfig = new ArrayList<EvioDAQParser>();
-        triggerConfig.add(new EvioDAQParser());
-        triggerConfig.get(0).parse(
-                crate, 
-                runNumber, 
-                subBank.getStringData());
+    /**
+     * Get a map of bank number to its string data for the current config.
+     * 
+     * @return a map of bank to trigger config data
+     */
+    public Map<Integer, String> getTriggerConfigData() {
+        return this.stringData;
     }
 }
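The acceptance rule in parseEvioData — collect the raw bank text per crate, then accept the parsed config only once at least four config banks have been seen — can be sketched on its own:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the bank bookkeeping in parseEvioData: raw bank text is stored
// per crate, and the parsed config only counts as complete once at least
// four config banks have been seen in the event.
public class DaqConfigAcceptSketch {

    private static final int MIN_CONFIG_BANKS = 4;

    private final Map<Integer, String> stringData = new HashMap<>();
    private int configBanks = 0;

    public void addBank(int crate, String rawData) {
        stringData.put(crate, rawData);
        configBanks++;
    }

    public boolean isComplete() {
        return configBanks >= MIN_CONFIG_BANKS;
    }

    public Map<Integer, String> getTriggerConfigData() {
        return stringData;
    }
}
```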

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/epics/EpicsRunProcessor.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/epics/EpicsRunProcessor.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/epics/EpicsRunProcessor.java	Tue Dec  1 15:55:47 2015
@@ -39,7 +39,7 @@
     private final EpicsEvioProcessor processor = new EpicsEvioProcessor();
 
     /**
-     * Create an EPICs log.
+     * Create a processor that will make a list of EPICS data.
      */
     public EpicsRunProcessor() {
     }
@@ -66,9 +66,9 @@
 
         // Add EPICS data to the collection.
         if (this.currentEpicsData != null) {
-            LOGGER.info("adding EPICS data for run " + this.currentEpicsData.getEpicsHeader().getRun() + " and timestamp " 
-                    + this.currentEpicsData.getEpicsHeader().getTimestamp() + " with seq " 
-                    + this.currentEpicsData.getEpicsHeader().getSequence());
+            LOGGER.info("Adding EPICS data with run " + this.currentEpicsData.getEpicsHeader().getRun() + "; timestamp " 
+                    + this.currentEpicsData.getEpicsHeader().getTimestamp() + "; seq "
+                    + this.currentEpicsData.getEpicsHeader().getSequence() + ".");
             this.epicsDataSet.add(this.currentEpicsData);
         }
     }

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioDetectorConditionsProcessor.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioDetectorConditionsProcessor.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioDetectorConditionsProcessor.java	Tue Dec  1 15:55:47 2015
@@ -39,21 +39,28 @@
      */
     @Override
     public void process(final EvioEvent evioEvent) throws Exception {
+        
         // Get the head bank from the event.
         final BaseStructure headBank = EvioEventUtilities.getHeadBank(evioEvent);
 
-        // Is the head bank present?
-        if (headBank != null) {
+        // Initialize from head bank.
+        if (headBank != null) {            
+            initializeConditions(headBank.getIntData()[1]);
+        }
+        
+        // Initialize from PRESTART.
+        if (EventTagConstant.PRESTART.matches(evioEvent)) {
+            int runNumber = EvioEventUtilities.getControlEventData(evioEvent)[1];
+            initializeConditions(runNumber);
+        }
+    }
 
-            // Get the run number from the head bank.
-            final int runNumber = headBank.getIntData()[1];
-
-            // Initialize the conditions system from the detector name and run number.
-            try {
-                ConditionsManager.defaultInstance().setDetector(this.detectorName, runNumber);
-            } catch (final ConditionsNotFoundException e) {
-                throw new RuntimeException("Error setting up conditions from EVIO head bank.", e);
-            }
+    private void initializeConditions(final int runNumber) {
+        // Initialize the conditions system from the detector name and run number.
+        try {
+            ConditionsManager.defaultInstance().setDetector(this.detectorName, runNumber);
+        } catch (final ConditionsNotFoundException e) {
+            throw new RuntimeException("Error setting up conditions from EVIO head bank.", e);
         }
     }
 
@@ -65,6 +72,7 @@
      * @param evioEvent the <code>EvioEvent</code> to process
      */
     @Override
+    // FIXME: not activated by EvioLoop
     public void startRun(final EvioEvent evioEvent) {
         // System.out.println("EvioDetectorConditionsProcessor.startRun");
         if (EvioEventUtilities.isPreStartEvent(evioEvent)) {

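The refactoring above funnels two run-number sources — the physics head bank and the PRESTART control data — into a single initializeConditions call. The selection logic, with int arrays standing in for the EVIO bank contents (run number at index 1, as in the diff):

```java
// Sketch of the two initialization paths: the run number is taken from a
// physics event's head bank when present, otherwise from PRESTART control
// data. The arrays are stand-ins for the EVIO bank int data.
public class RunNumberSourceSketch {

    public static Integer runNumberFrom(int[] headBankData, int[] prestartData) {
        if (headBankData != null && headBankData.length > 1) {
            return headBankData[1];
        }
        if (prestartData != null && prestartData.length > 1) {
            return prestartData[1];
        }
        return null; // this event carries no run number
    }
}
```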
Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioEventUtilities.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioEventUtilities.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioEventUtilities.java	Tue Dec  1 15:55:47 2015
@@ -110,23 +110,18 @@
     /**
      * Get the run number from an EVIO event.
      *
-     * @return the run number
-     * @throws IllegalArgumentException if event does not have a head bank
-     */
-    public static int getRunNumber(final EvioEvent event) {
+     * @return the run number or <code>null</code> if not present in event
+     */
+    public static Integer getRunNumber(final EvioEvent event) {
         if (isControlEvent(event)) {
             return getControlEventData(event)[1];
         } else if (isPhysicsEvent(event)) {
             final BaseStructure headBank = EvioEventUtilities.getHeadBank(event);
             if (headBank != null) {
                 return headBank.getIntData()[1];
-            } else {
-                throw new IllegalArgumentException("Head bank is missing from physics event.");
-            }
-        } else {
-            // Not sure if this would ever happen.
-            throw new IllegalArgumentException("Wrong event type: " + event.getHeader().getTag());
-        }
+            } 
+        } 
+        return null;
     }
 
     /**

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioFileSource.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioFileSource.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioFileSource.java	Tue Dec  1 15:55:47 2015
@@ -12,10 +12,8 @@
 import org.jlab.coda.jevio.EvioReader;
 
 /**
- * A basic implementation of an <tt>AbstractRecordSource</tt> for supplying <tt>EvioEvent</tt> objects to a loop from a
- * list of EVIO files.
- * <p>
- * Unlike the LCIO record source, it has no rewind or indexing capabilities.
+ * A basic implementation of an <code>AbstractRecordSource</code> for supplying <code>EvioEvent</code> objects to a 
+ * loop from a list of EVIO files.
  *
  * @author Jeremy McCormick, SLAC
  */
@@ -159,10 +157,8 @@
      */
     private void openReader() {
         try {
-            System.out.println("Opening reader for file " + this.files.get(this.fileIndex) + " ...");
-            // FIXME: this should use the reader directly and cached paths should be managed externally
+            // FIXME: This should use the reader directly and MSS paths should be transformed externally.
             this.reader = EvioFileUtilities.open(this.files.get(this.fileIndex), true);
-            System.out.println("Done opening file.");
         } catch (EvioException | IOException e) {
             throw new RuntimeException(e);
         }

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioFileUtilities.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioFileUtilities.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioFileUtilities.java	Tue Dec  1 15:55:47 2015
@@ -25,28 +25,7 @@
      * Milliseconds constant for conversion to/from second.
      */
     private static final long MILLISECONDS = 1000L;
-
-    /**
-     * Get a cached file path, assuming that the input file path is on the JLAB MSS e.g. it starts with "/mss".
-     * If the file is not on the JLAB MSS an error will be thrown.
-     * <p>
-     * If the file is already on the cache disk just return the same file.
-     *
-     * @param mssFile the MSS file path
-     * @return the cached file path (prepends "/cache" to the path)
-     * @throws IllegalArgumentException if the file is not on the MSS (e.g. path does not start with "/mss")
-     */
-    public static File getCachedFile(final File mssFile) {
-        if (!isMssFile(mssFile)) {
-            throw new IllegalArgumentException("File " + mssFile.getPath() + " is not on the JLab MSS.");
-        }
-        File cacheFile = mssFile;
-        if (!isCachedFile(mssFile)) {
-            cacheFile = new File("/cache" + mssFile.getAbsolutePath());
-        }        
-        return cacheFile;
-    }
-
+   
     /**
      * Get the run number from the file name.
      *
@@ -74,26 +53,6 @@
     }
 
     /**
-     * Return <code>true</code> if this is a file on the cache disk e.g. the path starts with "/cache".
-     *
-     * @param file the file
-     * @return <code>true</code> if the file is a cached file
-     */
-    public static boolean isCachedFile(final File file) {
-        return file.getPath().startsWith("/cache");
-    }
-
-    /**
-     * Return <code>true</code> if this file is on the JLAB MSS e.g. the path starts with "/mss".
-     *
-     * @param file the file
-     * @return <code>true</code> if the file is on the MSS
-     */
-    public static boolean isMssFile(final File file) {
-        return file.getPath().startsWith("/mss");
-    }
-
-    /**
      * Open an EVIO file using an <code>EvioReader</code> in memory mapping mode.
      *
      * @param file the EVIO file
@@ -116,14 +75,10 @@
      */
     public static EvioReader open(final File file, final boolean sequential) throws IOException, EvioException {
        LOGGER.info("opening " + file.getPath() + " in " + (sequential ? "sequential" : "mmap") + " mode");
-        File openFile = file;
-        if (isMssFile(file)) {
-            openFile = getCachedFile(file);
-        }
         final long start = System.currentTimeMillis();
-        final EvioReader reader = new EvioReader(openFile, false, sequential);
+        final EvioReader reader = new EvioReader(file, false, sequential);
         final long end = System.currentTimeMillis() - start;
-        LOGGER.info("opened " + openFile.getPath() + " in " + (double) end / (double) MILLISECONDS + " seconds in "
+        LOGGER.info("opened " + file.getPath() + " in " + (double) end / (double) MILLISECONDS + " seconds in "
                + (sequential ? "sequential" : "mmap") + " mode");
         return reader;
     }
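The removed getCachedFile/isMssFile helpers performed a simple path-prefix transform that the FIXME says should now happen externally. For reference, a self-contained sketch of that transform (the class name and example path are illustrative, not from the HPS codebase):

```java
public final class CachePathDemo {

    // Re-creation of the removed helper logic: a file on the JLab MSS
    // (path starting with "/mss") is read through the cache disk by
    // prepending "/cache" to its absolute path.
    static String toCachePath(String mssPath) {
        if (!mssPath.startsWith("/mss")) {
            throw new IllegalArgumentException("File " + mssPath + " is not on the JLab MSS.");
        }
        return "/cache" + mssPath;
    }

    public static void main(String[] args) {
        // prints "/cache/mss/hallb/hps/data/hps_005772.evio.0"
        System.out.println(toCachePath("/mss/hallb/hps/data/hps_005772.evio.0"));
    }
}
```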

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioLoop.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioLoop.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioLoop.java	Tue Dec  1 15:55:47 2015
@@ -1,52 +1,24 @@
 package org.hps.record.evio;
 
-import org.freehep.record.loop.DefaultRecordLoop;
+import org.hps.record.AbstractRecordLoop;
+import org.jlab.coda.jevio.EvioEvent;
 
 /**
  * Implementation of a Freehep <code>RecordLoop</code> for EVIO data.
  *
  * @author Jeremy McCormick, SLAC
  */
-public class EvioLoop extends DefaultRecordLoop {
-
-    /**
-     * The record adapter.
-     */
-    private final EvioLoopAdapter adapter = new EvioLoopAdapter();
+public class EvioLoop extends AbstractRecordLoop<EvioEvent> {
 
     /**
      * Create a new record loop.
      */
     public EvioLoop() {
+        this.adapter = new EvioLoopAdapter();
         this.addLoopListener(adapter);
         this.addRecordListener(adapter);
     }
-
-    /**
-     * Add an EVIO event processor to the adapter which will be activated for every EVIO event that is processed.
-     *
-     * @param evioEventProcessor the EVIO processor to add
-     */
-    public void addEvioEventProcessor(final EvioEventProcessor evioEventProcessor) {
-        adapter.addEvioEventProcessor(evioEventProcessor);
-    }
-
-    /**
-     * Loop over events from the source.
-     *
-     * @param number the number of events to process or -1L for all events from the source
-     * @return the number of records that were processed
-     */
-    public long loop(final long number) {
-        if (number < 0L) {
-            this.execute(Command.GO, true);
-        } else {
-            this.execute(Command.GO_N, number, true);
-            this.execute(Command.STOP);
-        }
-        return this.getSupplied();
-    }
-
+  
     /**
      * Set the EVIO data source.
      *

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioLoopAdapter.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioLoopAdapter.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EvioLoopAdapter.java	Tue Dec  1 15:55:47 2015
@@ -1,14 +1,6 @@
 package org.hps.record.evio;
 
-import java.util.ArrayList;
-import java.util.List;
-import java.util.logging.Logger;
-
-import org.freehep.record.loop.AbstractLoopListener;
-import org.freehep.record.loop.LoopEvent;
-import org.freehep.record.loop.LoopListener;
-import org.freehep.record.loop.RecordEvent;
-import org.freehep.record.loop.RecordListener;
+import org.hps.record.AbstractLoopAdapter;
 import org.jlab.coda.jevio.EvioEvent;
 
 /**
@@ -16,79 +8,5 @@
  *
  * @author Jeremy McCormick, SLAC
  */
-public final class EvioLoopAdapter extends AbstractLoopListener implements RecordListener, LoopListener {
-
-    /**
-     * Initialize the logger.
-     */
-    private static final Logger LOGGER = Logger.getLogger(EvioLoopAdapter.class.getPackage().getName());
-
-    /**
-     * List of event processors to activate.
-     */
-    private final List<EvioEventProcessor> processors = new ArrayList<EvioEventProcessor>();
-
-    /**
-     * Create a new loop adapter.
-     */
-    EvioLoopAdapter() {
-    }
-
-    /**
-     * Add an EVIO processor to the adapter.
-     *
-     * @param processor the EVIO processor to add to the adapter
-     */
-    void addEvioEventProcessor(final EvioEventProcessor processor) {
-        LOGGER.info("adding " + processor.getClass().getName() + " to EVIO processors");
-        this.processors.add(processor);
-    }
-
-    /**
-     * Implementation of the finish hook which activates the {@link EvioEventProcessor#endJob()} method of all
-     * registered processors.
-     */
-    @Override
-    protected void finish(final LoopEvent event) {
-        LOGGER.info("finish");
-        for (final EvioEventProcessor processor : processors) {
-            processor.endJob();
-        }
-    }
-
-    /**
-     * Primary event processing method that activates the {@link EvioEventProcessor#process(EvioEvent)} method of all
-     * registered processors.
-     *
-     * @param recordEvent the record event to process which should have an EVIO event
-     * @throws IllegalArgumentException if the record is the wrong type
-     */
-    @Override
-    public void recordSupplied(final RecordEvent recordEvent) {
-        final Object record = recordEvent.getRecord();
-        if (record instanceof EvioEvent) {
-            final EvioEvent evioEvent = EvioEvent.class.cast(record);
-            for (final EvioEventProcessor processor : processors) {
-                try {
-                    processor.process(evioEvent);
-                } catch (final Exception e) {
-                    throw new RuntimeException(e);
-                }
-            }
-        } else {
-            throw new IllegalArgumentException("The supplied record has the wrong type: " + record.getClass());
-        }
-    }
-
-    /**
-     * Implementation of the start hook which activates the {@link EvioEventProcessor#startJob()} method of all
-     * registered processors.
-     */
-    @Override
-    protected void start(final LoopEvent event) {
-        LOGGER.info("start");
-        for (final EvioEventProcessor processor : processors) {
-            processor.startJob();
-        }
-    }
+public final class EvioLoopAdapter extends AbstractLoopAdapter<EvioEvent> {
 }

Added: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigData.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigData.java	(added)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigData.java	Tue Dec  1 15:55:47 2015
@@ -0,0 +1,175 @@
+package org.hps.record.svt;
+
+import java.io.StringReader;
+import java.io.StringWriter;
+
+import javax.xml.parsers.DocumentBuilder;
+import javax.xml.parsers.DocumentBuilderFactory;
+import javax.xml.transform.OutputKeys;
+import javax.xml.transform.Transformer;
+import javax.xml.transform.TransformerFactory;
+import javax.xml.transform.dom.DOMSource;
+import javax.xml.transform.stream.StreamResult;
+
+import org.w3c.dom.Document;
+import org.xml.sax.InputSource;
+
+/**
+ * Represents the four SVT status banks from EVIO sync events containing XML config data.
+ * 
+ * @author Jeremy McCormick, SLAC
+ */
+public class SvtConfigData {
+        
+    /**
+     * Helper for ROC bank tag information and data array indices.
+     */
+    public enum RocTag {
+        
+        /** Data bank. */
+        DATA(51, 0, 1),
+        /** Control bank. */
+        CONTROL(66, 2, 3);
+        
+        private int tag;
+        private int configIndex;
+        private int statusIndex;
+                
+        RocTag(int tag, int configIndex, int statusIndex) {
+            this.tag = tag;
+            this.configIndex = configIndex;
+            this.statusIndex = statusIndex;
+        }
+        
+        int configIndex() {
+            return configIndex;
+        }
+        
+        int statusIndex() {
+            return statusIndex;
+        }
+        
+        /**
+         * Get the ROC tag from an int value.
+         * 
+         * @param tag the tag's int value
+         * @return the matching <code>RocTag</code>
+         * @throws IllegalArgumentException if <code>tag</code> is not valid
+         */
+        static RocTag fromTag(int tag) {
+            if (tag == DATA.tag) {
+                return DATA;
+            } else if (tag == CONTROL.tag) {
+                return CONTROL;
+            } else {
+                throw new IllegalArgumentException("Unknown tag " + tag + " for ROC.");
+            }
+        }
+    }
+    
+    // Unix timestamp from the closest head bank.
+    private int timestamp;
+    
+    // The config data strings.
+    private String[] data = new String[4];
+        
+    public SvtConfigData(int timestamp) {
+        this.timestamp = timestamp;
+    }
+    
+    public void setData(RocTag rocTag, String data) {
+        if (data.contains("<config>")) {
+            setConfigData(rocTag, data);
+        }
+        if (data.contains("<status>")) {
+            setStatusData(rocTag, data);
+        }
+    }
+    
+    public void setConfigData(RocTag rocTag, String configData) {
+        if (rocTag.equals(RocTag.DATA)) {
+            data[RocTag.DATA.configIndex()] = configData;
+        } else {
+            data[RocTag.CONTROL.configIndex()] = configData;
+        }
+    }
+    
+    public void setStatusData(RocTag rocTag, String statusData) {
+        if (rocTag.equals(RocTag.DATA)) {
+            data[RocTag.DATA.statusIndex()] = statusData;
+        } else {
+            data[RocTag.CONTROL.statusIndex()] = statusData;
+        }
+    }
+    
+    public String getConfigData(RocTag roc) {
+        if (roc.equals(RocTag.DATA)) {
+            return data[RocTag.DATA.configIndex()];
+        } else {
+            return data[RocTag.CONTROL.configIndex()];
+        }
+    }
+    
+    public String getStatusData(RocTag roc) {
+        if (roc.equals(RocTag.DATA)) {
+            return data[RocTag.DATA.statusIndex()];
+        } else {
+            return data[RocTag.CONTROL.statusIndex()];
+        }
+    }
+    
+    public Document toXmlDocument() {
+        StringBuffer sb = new StringBuffer();
+        sb.append("<svt>" + '\n');
+        if (getConfigData(RocTag.DATA) != null) {
+            sb.append(getConfigData(RocTag.DATA));
+        } else {
+            sb.append("<config/>" + '\n');
+        }
+        if (getStatusData(RocTag.DATA) != null) {
+            sb.append(getStatusData(RocTag.DATA));
+        } else {
+            sb.append("<status/>" + '\n');
+        }
+        if (getConfigData(RocTag.CONTROL) != null) {
+            sb.append(getConfigData(RocTag.CONTROL));                
+        } else {
+            sb.append("<config/>" + '\n');
+        }
+        if (getStatusData(RocTag.CONTROL) != null) {
+            sb.append(getStatusData(RocTag.CONTROL));   
+        } else {
+            sb.append("<status/>" + '\n');
+        }
+        sb.append("</svt>" + '\n');
+        Document document = null;
+        try {
+            DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
+            document = builder.parse(new InputSource(new StringReader(sb.toString())));
+        } catch (Exception e) {
+            throw new RuntimeException(e);
+        }
+        return document;
+    }
+    
+    public String toXmlString() {
+        Document document = toXmlDocument();
+        String output = null;
+        try {
+            TransformerFactory tf = TransformerFactory.newInstance();
+            Transformer transformer = tf.newTransformer();
+            transformer.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
+            transformer.setOutputProperty(OutputKeys.INDENT, "yes");
+            StringWriter writer = new StringWriter();
+            transformer.transform(new DOMSource(document), new StreamResult(writer));
+            output = writer.getBuffer().toString().replaceAll("\n|\r", "");
+        } catch (Exception e) {
+            throw new RuntimeException(e);
+        }
+        return output;
+    }
+    
+    public int getTimestamp() {
+        return timestamp;
+    }
+}
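The technique in toXmlDocument() above is to concatenate the four bank payloads into an <svt> envelope and then parse the combined string with the JDK DOM builder. A minimal, self-contained sketch of that assemble-then-parse approach using only JDK classes (the fragment strings here are made-up placeholders, not real SVT bank content):

```java
import java.io.StringReader;

import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public final class SvtXmlDemo {

    // Assemble XML fragments into an <svt> envelope, substituting empty
    // elements for missing banks, then parse the result into a DOM tree.
    static Document buildDocument(String configXml, String statusXml) {
        StringBuilder sb = new StringBuilder();
        sb.append("<svt>").append('\n');
        sb.append(configXml != null ? configXml : "<config/>").append('\n');
        sb.append(statusXml != null ? statusXml : "<status/>").append('\n');
        sb.append("</svt>").append('\n');
        try {
            return DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new InputSource(new StringReader(sb.toString())));
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Document doc = buildDocument(null, "<status>ok</status>");
        System.out.println(doc.getDocumentElement().getTagName()); // prints "svt"
    }
}
```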

Added: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigEvioProcessor.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigEvioProcessor.java	(added)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigEvioProcessor.java	Tue Dec  1 15:55:47 2015
@@ -0,0 +1,87 @@
+package org.hps.record.svt;
+
+import java.nio.charset.StandardCharsets;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.logging.Logger;
+
+import org.hps.record.evio.EvioEventProcessor;
+import org.hps.record.evio.EvioEventUtilities;
+import org.hps.record.svt.SvtConfigData.RocTag;
+import org.jlab.coda.jevio.BaseStructure;
+import org.jlab.coda.jevio.EvioEvent;
+
+/**
+ * Get a list of SVT config data from an EVIO data stream.
+ * 
+ * @author Jeremy McCormick, SLAC
+ * @see SvtConfigData
+ */
+public class SvtConfigEvioProcessor extends EvioEventProcessor {
+
+    private static Logger LOGGER = Logger.getLogger(SvtConfigEvioProcessor.class.getPackage().getName());
+
+    private static final int DATA_TAG = 51;
+    private static final int CONTROL_TAG = 66;
+    private static final int CONFIG_TAG = 57614;
+
+    private List<SvtConfigData> configs = new ArrayList<SvtConfigData>();
+
+    private int timestamp = 0;
+    
+    public void process(EvioEvent evioEvent) {
+        SvtConfigData config = null;
+        BaseStructure headBank = EvioEventUtilities.getHeadBank(evioEvent);
+        int configBanks = 0;
+        if (headBank != null) {
+            if (headBank.getIntData()[0] != 0) {
+                timestamp = headBank.getIntData()[0];
+                LOGGER.info("set timestamp " + timestamp);
+            }
+        }
+        for (BaseStructure bank : evioEvent.getChildrenList()) {
+            if (bank.getHeader().getTag() == DATA_TAG || bank.getHeader().getTag() == CONTROL_TAG) {
+                if (bank.getChildrenList() != null) {
+                    for (BaseStructure subBank : bank.getChildrenList()) {
+                        if (subBank.getHeader().getTag() == CONFIG_TAG) {
+                            String[] stringData = subBank.getStringData();
+                            if (stringData == null) {
+                                LOGGER.warning("string data is null");
+                                if (subBank.getRawBytes() != null) {
+                                    LOGGER.info("raw byte array len " + subBank.getRawBytes().length);
+                                    LOGGER.info("converted raw bytes to string data" + '\n' + new String(subBank.getRawBytes(), StandardCharsets.UTF_8));
+                                } else {
+                                    LOGGER.warning("Raw byte array is null.");
+                                }
+                            } else {
+                                if (config == null) {
+                                    config = new SvtConfigData(timestamp);
+                                }
+                                if (stringData.length > 0) {
+                                    if (!stringData[0].trim().isEmpty()) {
+                                        LOGGER.info("Adding SVT config data with len " + stringData[0].length() + " ..." + '\n' + stringData[0]);
+                                        config.setData(RocTag.fromTag(bank.getHeader().getTag()), stringData[0]);
+                                        ++configBanks;
+                                    } else {
+                                        LOGGER.warning("String data has no XML content.");
+                                    }
+                                } else {
+                                    LOGGER.warning("String data has zero len.");
+                                }
+                            }
+                        }
+                    }
+                }
+            }
+        } 
+        if (config != null) {
+            LOGGER.info("Adding SVT config with " + configBanks
+                    + " bank(s) from event " + evioEvent.getEventNumber());
+            this.configs.add(config);
+        }
+    }
+
+    public List<SvtConfigData> getSvtConfigs() {
+        return configs;
+    }
+}
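The core of SvtConfigEvioProcessor.process() is a two-level scan: top-level banks are filtered on the data/control ROC tags (51 and 66), and their children on the config tag (57614). A self-contained sketch of that traversal, using a mock Bank record in place of jevio's BaseStructure (the record and its fields are stand-ins, not the jevio API):

```java
import java.util.ArrayList;
import java.util.List;

public final class BankScanDemo {

    // Mock stand-in for jevio's BaseStructure: a tagged bank with
    // child banks and an optional string payload.
    record Bank(int tag, List<Bank> children, String stringData) {}

    static final int DATA_TAG = 51;
    static final int CONTROL_TAG = 66;
    static final int CONFIG_TAG = 57614;

    // Same two-level scan as the processor: visit SVT ROC banks at the
    // top level, then collect config payloads from their sub-banks.
    static List<String> findConfigPayloads(List<Bank> topBanks) {
        List<String> payloads = new ArrayList<>();
        for (Bank bank : topBanks) {
            if (bank.tag() == DATA_TAG || bank.tag() == CONTROL_TAG) {
                for (Bank sub : bank.children()) {
                    if (sub.tag() == CONFIG_TAG && sub.stringData() != null) {
                        payloads.add(sub.stringData());
                    }
                }
            }
        }
        return payloads;
    }

    public static void main(String[] args) {
        List<Bank> event = List.of(
                new Bank(DATA_TAG, List.of(new Bank(CONFIG_TAG, List.of(), "<config/>")), null),
                new Bank(99, List.of(), null)); // unrelated bank is skipped
        System.out.println(findConfigPayloads(event)); // prints "[<config/>]"
    }
}
```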

Modified: java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetEvioProcessor.java
 =============================================================================
--- java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetEvioProcessor.java	(original)
+++ java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetEvioProcessor.java	Tue Dec  1 15:55:47 2015
@@ -55,13 +55,18 @@
             }
         }
     }
+    
+    public long getTiTimeOffset() {
+        final long offsetRange = maxOffset - minOffset;
+        if (offsetRange > minRange && nOutliers < maxOutliers) {
+            return minOffset;
+        } else {
+            return 0L;
+        }
+    }
 
     public void updateTriggerConfig(final TriggerConfig triggerConfig) {
-        final long offsetRange = maxOffset - minOffset;
-        if (offsetRange > minRange && nOutliers < maxOutliers) {
-            triggerConfig.put(TriggerConfigVariable.TI_TIME_OFFSET, minOffset);
-        } else {
-            triggerConfig.put(TriggerConfigVariable.TI_TIME_OFFSET, 0L);
-        }
+        long tiTimeOffset = getTiTimeOffset();
+        triggerConfig.put(TriggerConfigVariable.TI_TIME_OFFSET, tiTimeOffset);
     }
 }
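The refactoring above extracts the offset decision into getTiTimeOffset() so callers other than updateTriggerConfig() can reuse it. A standalone sketch of that decision as a pure function (the threshold values in main are illustrative, not the processor's real defaults):

```java
public final class TiOffsetDemo {

    // Extracted decision: return the minimum observed offset only when
    // the offset spread is wide enough and outliers are few; otherwise
    // fall back to 0 to indicate no usable TI time offset.
    static long tiTimeOffset(long minOffset, long maxOffset, int nOutliers,
            long minRange, int maxOutliers) {
        final long offsetRange = maxOffset - minOffset;
        return (offsetRange > minRange && nOutliers < maxOutliers) ? minOffset : 0L;
    }

    public static void main(String[] args) {
        System.out.println(tiTimeOffset(100L, 5000L, 2, 1000L, 10)); // prints 100
        System.out.println(tiTimeOffset(100L, 200L, 2, 1000L, 10));  // prints 0
    }
}
```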

Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java	Tue Dec  1 15:55:47 2015
@@ -97,12 +97,12 @@
                 deleteEpicsData.setInt(1, headerId);
                 int rowsAffected = deleteEpicsData.executeUpdate();
                 if (rowsAffected == 0) {
-                    throw new SQLException("Deletion of EPICS data failed; no rows affect.");
+                    throw new SQLException("Deletion of EPICS data failed; no rows affected.");
                 }
                 deleteHeader.setInt(1, headerId);
                 rowsAffected = deleteHeader.executeUpdate();
                 if (rowsAffected == 0) {
-                    throw new SQLException("Deletion of EPICS header failed; no rows affect.");
+                    throw new SQLException("Deletion of EPICS header failed; no rows affected.");
                 }
             }
 
@@ -137,7 +137,7 @@
      * Get EPICS data by run.
      *
      * @param run the run number
-     * @param epicsType the type of EPICS data (1s or 10s)
+     * @param epicsType the type of EPICS data (2s or 20s)
      * @return the EPICS data
      */
     @Override
@@ -238,11 +238,11 @@
                     insertStatement.setDouble(parameterIndex, value);
                     ++parameterIndex;
                 }
-                final int dataRowsCreated = insertStatement.executeUpdate();                
+                final int dataRowsCreated = insertStatement.executeUpdate();
                 if (dataRowsCreated == 0) {
                     throw new SQLException("Creation of EPICS data failed; no rows affected.");
                 }
-                LOGGER.info("inserted EPICS data with run " + epicsHeader.getRun() + ", seq " + epicsHeader.getSequence() + "timestamp " 
+                LOGGER.fine("inserted EPICS data with run " + epicsHeader.getRun() + "; seq " + epicsHeader.getSequence() + "; timestamp " 
                         + epicsHeader.getTimestamp());
                 insertStatement.close();
             }

Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsType.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsType.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsType.java	Tue Dec  1 15:55:47 2015
@@ -3,34 +3,34 @@
 import org.hps.record.epics.EpicsData;
 
 /**
- * Enum for representing different types of EPICS data in the run database, of which there are currently two (1s and
- * 10s).
+ * Enum for representing different types of EPICS data in the run database, of which there are currently two (2s and
+ * 20s).
  *
  * @author Jeremy McCormick, SLAC
  */
 public enum EpicsType {
 
     /**
-     * 10S EPICS data.
+     * 20S EPICS data.
      */
-    EPICS_10S(10),
+    EPICS_20s(10),
     /**
-     * 1S EPICS data.
+     * 2S EPICS data.
      */
-    EPICS_1S(1);
+    EPICS_2s(1);
 
     /**
      * Get the type from an int.
      *
      * @param type the type from an int
      * @return the type from an int
-     * @throws IllegalArgumentException if <code>type</code> is invalid (not 1 or 10)
+     * @throws IllegalArgumentException if <code>type</code> is invalid (not 1 or 10)
      */
     public static EpicsType fromInt(final int type) {
-        if (type == EPICS_1S.type) {
-            return EPICS_1S;
-        } else if (type == EPICS_10S.type) {
-            return EPICS_10S;
+        if (type == EPICS_2s.type) {
+            return EPICS_2s;
+        } else if (type == EPICS_20s.type) {
+            return EPICS_20s;
         } else {
             throw new IllegalArgumentException("The type code is invalid (must be 1 or 10): " + type);
         }
@@ -44,9 +44,9 @@
     public static EpicsType getEpicsType(final EpicsData epicsData) {
         // FIXME: The type argument should be set on creation which would make this key check unnecessary.
         if (epicsData.getKeys().contains("MBSY2C_energy")) {
-            return EpicsType.EPICS_1S;
+            return EpicsType.EPICS_2s;
         } else {
-            return EpicsType.EPICS_10S;
+            return EpicsType.EPICS_20s;
         }
     }
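Note that the rename above changes only the enum labels (1s/10s become 2s/20s); the integer codes stored in the run database stay 1 and 10, as the error message in fromInt() confirms. A self-contained sketch of that code-to-type mapping (class name is illustrative):

```java
public final class EpicsTypeDemo {

    // Mirror of the renamed enum: labels say 2s/20s, but the database
    // type codes remain 1 and 10.
    enum EpicsType {
        EPICS_2s(1), EPICS_20s(10);

        final int code;

        EpicsType(int code) {
            this.code = code;
        }

        static EpicsType fromInt(int code) {
            for (EpicsType type : values()) {
                if (type.code == code) {
                    return type;
                }
            }
            throw new IllegalArgumentException("The type code is invalid (must be 1 or 10): " + code);
        }
    }

    public static void main(String[] args) {
        System.out.println(EpicsType.fromInt(10)); // prints "EPICS_20s"
        System.out.println(EpicsType.fromInt(1));  // prints "EPICS_2s"
    }
}
```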
 

Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsVariable.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsVariable.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EpicsVariable.java	Tue Dec  1 15:55:47 2015
@@ -2,7 +2,7 @@
 
 /**
  * Information about an EPICS variable including its name in the EPICS database, column name for the run database,
- * description of the variable, and type (either 1s or 10s).
+ * description of the variable, and type (either 2s or 20s).
  * <p>
  * This class is used to represent data from the <i>epics_variables</i> table in the run database.
  *

Added: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseBuilder.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseBuilder.java	(added)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseBuilder.java	Tue Dec  1 15:55:47 2015
@@ -0,0 +1,740 @@
+package org.hps.run.database;
+
+import java.io.File;
+import java.io.IOException;
+import java.sql.Connection;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.LinkedHashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Map.Entry;
+import java.util.Set;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+import org.hps.conditions.database.ConnectionParameters;
+import org.hps.conditions.database.DatabaseConditionsManager;
+import org.hps.conditions.run.RunSpreadsheet;
+import org.hps.conditions.run.RunSpreadsheet.RunData;
+import org.hps.datacat.client.DatacatClient;
+import org.hps.datacat.client.DatacatClientFactory;
+import org.hps.datacat.client.Dataset;
+import org.hps.datacat.client.DatasetSite;
+import org.hps.record.AbstractRecordProcessor;
+import org.hps.record.daqconfig.DAQConfigEvioProcessor;
+import org.hps.record.epics.EpicsData;
+import org.hps.record.epics.EpicsRunProcessor;
+import org.hps.record.evio.EventTagConstant;
+import org.hps.record.evio.EvioEventUtilities;
+import org.hps.record.evio.EvioFileSource;
+import org.hps.record.evio.EvioFileUtilities;
+import org.hps.record.evio.EvioLoop;
+import org.hps.record.scalers.ScalerData;
+import org.hps.record.scalers.ScalerUtilities;
+import org.hps.record.scalers.ScalerUtilities.LiveTimeIndex;
+import org.hps.record.scalers.ScalersEvioProcessor;
+import org.hps.record.svt.SvtConfigData;
+import org.hps.record.svt.SvtConfigEvioProcessor;
+import org.hps.record.triggerbank.AbstractIntData.IntBankDefinition;
+import org.hps.record.triggerbank.HeadBankData;
+import org.hps.record.triggerbank.TiTimeOffsetEvioProcessor;
+import org.jlab.coda.jevio.BaseStructure;
+import org.jlab.coda.jevio.EvioEvent;
+import org.jlab.coda.jevio.EvioException;
+import org.jlab.coda.jevio.EvioReader;
+import org.lcsim.conditions.ConditionsManager.ConditionsNotFoundException;
+
+/**
+ * Builds a complete {@link RunSummary} object from various data sources, including the data catalog and the run
+ * spreadsheet, so that it is ready to be inserted into the run database using the DAO interfaces.  This class also 
+ * extracts EPICS and scaler records from the EVIO data for insertion into run database tables.
+ * <p>
+ * The setters and some other methods follow the builder pattern and so can be chained by the caller.
+ * 
+ * @author Jeremy McCormick, SLAC
+ * @see RunSummary
+ * @see RunSummaryImpl
+ */
+final class RunDatabaseBuilder {
+
+    /**
+     * Package logger.
+     */
+    private static final Logger LOGGER = Logger.getLogger(RunDatabaseBuilder.class.getPackage().getName());
+
+    /**
+     * Database connection.
+     */
+    private ConnectionParameters connectionParameters;
+
+    /**
+     * Data catalog client API.
+     */
+    private DatacatClient datacatClient;
+
+    /**
+     * Detector name for initializing conditions system.
+     */
+    private String detectorName;
+
+    /**
+     * Dry run to not perform database updates (off by default).
+     */
+    private boolean dryRun = false;
+
+    /**
+     * List of EPICS data from the run.
+     */
+    private List<EpicsData> epicsData;
+
+    /**
+     * Map of EVIO files to their dataset objects.
+     */
+    private Map<File, Dataset> evioDatasets;
+
+    /**
+     * List of EVIO files.
+     */
+    private List<File> evioFiles;
+
+    /**
+     * Allow replacement of information in the database (off by default).
+     */
+    private boolean replace = false;
+
+    /**
+     * Run summary to be updated.
+     */
+    private RunSummaryImpl runSummary;
+
+    /**
+     * List of scaler data from the run.
+     */
+    private List<ScalerData> scalerData;
+
+    /**
+     * Skip full EVIO file processing (off by default).
+     */
+    private boolean skipEvioProcessing = false;
+
+    /**
+     * Path to run spreadsheet CSV file (not used by default).
+     */
+    private File spreadsheetFile;
+
+    /**
+     * List of SVT configuration bank data.
+     */
+    private List<SvtConfigData> svtConfigs;
+        
+    /**
+     * Create an empty run summary.
+     * 
+     * @param run the run number
+     * @return the empty run summary
+     */
+    RunDatabaseBuilder createRunSummary(int run) {
+        runSummary = new RunSummaryImpl(run);
+        return this;
+    }
+
+    /**
+     * Find EVIO files in the data catalog.
+     */
+    private void findEvioDatasets() {
+        LOGGER.info("finding EVIO datasets for run " + getRun());
+        
+        // Metadata to return from search.
+        final Set<String> metadata = new LinkedHashSet<String>();
+        metadata.add("runMin");
+        metadata.add("eventCount");
+        
+        // Initialize map of files to datasets.
+        evioDatasets = new HashMap<File, Dataset>();
+        
+        // Find datasets in the datacat using a search.
+        final List<Dataset> datasets = datacatClient.findDatasets(
+                "data/raw",
+                "fileFormat eq 'EVIO' AND dataType eq 'RAW' AND runMin eq " + getRun(), 
+                metadata);
+        if (datasets.isEmpty()) {
+            // No files for the run in datacat is a fatal error.
+            throw new IllegalStateException("No EVIO datasets for run " + getRun() + " were found in the data catalog.");
+        }
+        
+        // Map file to dataset.
+        for (final Dataset dataset : datasets) {
+            evioDatasets.put(new File(dataset.getLocations().get(0).getResource()), dataset);
+        }
+        
+        // Create the list of EVIO files.
+        evioFiles = new ArrayList<File>();
+        evioFiles.addAll(evioDatasets.keySet());
+        EvioFileUtilities.sortBySequence(evioFiles);
+        
+        LOGGER.info("found " + evioFiles.size() + " EVIO file(s) for run " + runSummary.getRun());
+    }
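The sequence-based ordering applied by `EvioFileUtilities.sortBySequence` can be sketched in isolation. This is a minimal model assuming the common `<name>.evio.<N>` naming convention for EVIO file segments; the actual utility class may parse names differently.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

/**
 * Minimal sketch of sequence-based EVIO file ordering, assuming the
 * "<name>.evio.<N>" convention (an assumption, not the verified
 * behavior of EvioFileUtilities).
 */
public class SequenceSort {

    /** Extract the trailing sequence number, defaulting to 0 if absent. */
    static int sequenceOf(String name) {
        int dot = name.lastIndexOf('.');
        String tail = name.substring(dot + 1);
        try {
            return Integer.parseInt(tail);
        } catch (NumberFormatException e) {
            return 0; // no numeric suffix; treat as the first file
        }
    }

    /** Sort files in place by their sequence suffix. */
    static void sortBySequence(List<File> files) {
        files.sort(Comparator.comparingInt(f -> sequenceOf(f.getName())));
    }

    public static void main(String[] args) {
        List<File> files = new ArrayList<File>();
        files.add(new File("hps_005772.evio.2"));
        files.add(new File("hps_005772.evio.0"));
        files.add(new File("hps_005772.evio.1"));
        sortBySequence(files);
        System.out.println(files); // now ordered by sequence 0, 1, 2
    }
}
```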
+   
+    /**
+     * Get the current run number from the run summary.
+     * 
+     * @return the run number from the run summary
+     */
+    int getRun() {
+        return runSummary.getRun();
+    }
+
+    /**
+     * Initialize the datacat client.
+     */
+    private void initializeDatacat() {
+
+        LOGGER.info("initializing data catalog client");
+
+        // DEBUG: use dev datacat server; prod should use default JLAB connection
+        datacatClient = new DatacatClientFactory().createClient("http://localhost:8080/datacat-v0.4-SNAPSHOT/r",
+                DatasetSite.SLAC, "HPS");
+    }
+
+    /**
+     * Insert the run data into the database using the given connection.
+     *
+     * @param connection the database connection to the run database
+     */
+    private void insertRun(Connection connection) {
+
+        LOGGER.info("inserting run " + runSummary.getRun() + " into db");
+
+        // Create DAO factory.
+        final RunDatabaseDaoFactory runFactory = new RunDatabaseDaoFactory(connection);
+
+        // Insert the run summary record.
+        LOGGER.info("inserting run summary");
+        runFactory.createRunSummaryDao().insertRunSummary(runSummary);
+
+        // Insert the EPICS data.
+        if (epicsData != null) {
+            LOGGER.info("inserting EPICS data");
+            runFactory.createEpicsDataDao().insertEpicsData(epicsData);
+        } else {
+            LOGGER.warning("no EPICS data to insert");
+        }
+
+        // Insert the scaler data.
+        if (scalerData != null) {
+            LOGGER.info("inserting scaler data");
+            runFactory.createScalerDataDao().insertScalerData(scalerData, getRun());
+        } else {
+            LOGGER.warning("no scaler data to insert");
+        }
+
+        // Insert SVT config data.
+        if (this.svtConfigs != null) {
+            LOGGER.info("inserting SVT config");
+            runFactory.createSvtConfigDao().insertSvtConfigs(svtConfigs, getRun());
+        } else {
+            LOGGER.warning("no SVT config to insert");
+        }
+        
+        try {
+            connection.close();
+        } catch (Exception e) {
+            LOGGER.log(Level.WARNING, e.getMessage(), e);
+        }
+               
+        LOGGER.info("done inserting run " + getRun());
+    }
+    
+    /**
+     * Reload state for the current run number into this object (used for testing after a database insert).
+     * 
+     * @param load <code>true</code> if this method should be executed (skipped if <code>false</code>)
+     * @return this object
+     */
+    RunDatabaseBuilder load(boolean load) {
+        if (load) {
+            RunManager runManager = new RunManager(connectionParameters.createConnection());
+            runManager.setRun(getRun());
+
+            this.runSummary = RunSummaryImpl.class.cast(runManager.getRunSummary());
+
+            LOGGER.info("loaded run summary ..." + '\n' + runSummary);
+
+            epicsData = new ArrayList<EpicsData>();
+            epicsData.addAll(runManager.getEpicsData(EpicsType.EPICS_2s));
+            epicsData.addAll(runManager.getEpicsData(EpicsType.EPICS_20s));
+            LOGGER.info("loaded " + epicsData.size() + " EPICS records");
+
+            scalerData = runManager.getScalerData();
+            LOGGER.info("loaded " + scalerData.size() + " scaler records");
+
+            svtConfigs = runManager.getSvtConfigData();
+            LOGGER.info("loaded " + svtConfigs.size() + " SVT configurations");
+
+            runManager.closeConnection();
+        } else {
+            LOGGER.info("load is skipped");
+        }
+        
+        return this;
+    }
+    
+    /**
+     * Print summary information to the log.
+     */
+    private void printSummary() {
+        LOGGER.info("built run summary ..." + '\n' + runSummary.toString());
+        if (epicsData != null) {
+            LOGGER.info("found " + epicsData.size() + " EPICS data records");
+        } else {
+            LOGGER.info("no EPICS data");
+        }
+        if (scalerData != null) {
+            LOGGER.info("found " + scalerData.size() + " scalers");
+        } else {
+            LOGGER.info("no scaler data");
+        }
+        if (svtConfigs != null) {
+            for (SvtConfigData config : svtConfigs) {
+                try {
+                    LOGGER.info("SVT XML config with timestamp " + config.getTimestamp() + " ..." + config.toXmlString());
+                } catch (Exception e) {
+                    LOGGER.warning("Could not print config; string data is probably malformed.");
+                }
+            }
+        } else {
+            LOGGER.info("no SVT config");
+        }
+        if (runSummary.getTriggerConfigData() != null) {
+            for (Entry<Integer, String> entry : runSummary.getTriggerConfigData().entrySet()) {
+                LOGGER.info("trigger config data " + entry.getKey() + " ..." + entry.getValue());
+            }
+        } else {
+            LOGGER.info("no trigger config");
+        }
+    }    
+
+    /**
+     * Process all the EVIO files in the run and set information on the current run summary.
+     */
+    private void processEvioFiles() {
+
+        LOGGER.fine("processing EVIO files");
+
+        if (evioFiles == null || evioFiles.isEmpty()) {
+            throw new IllegalStateException("No EVIO files were found.");
+        }
+
+        if (detectorName == null) {
+            throw new IllegalStateException("The detector name was not set.");
+        }
+
+        // Initialize the conditions system.
+        try {
+            DatabaseConditionsManager dbManager = DatabaseConditionsManager.getInstance();
+            dbManager.setDetector(detectorName, runSummary.getRun());
+            dbManager.freeze();
+        } catch (ConditionsNotFoundException e) {
+            throw new RuntimeException(e);
+        }
+
+        // List of processors to execute in the job.
+        ArrayList<AbstractRecordProcessor<EvioEvent>> processors = new ArrayList<AbstractRecordProcessor<EvioEvent>>();
+
+        // Processor to get scaler data.
+        ScalersEvioProcessor scalersProcessor = new ScalersEvioProcessor();
+        scalersProcessor.setResetEveryEvent(false);
+        processors.add(scalersProcessor);
+
+        // Processor for calculating TI time offset.
+        TiTimeOffsetEvioProcessor tiProcessor = new TiTimeOffsetEvioProcessor();
+        processors.add(tiProcessor);
+
+        // Processor for getting DAQ config.
+        DAQConfigEvioProcessor daqProcessor = new DAQConfigEvioProcessor();
+        processors.add(daqProcessor);
+
+        // Processor for getting the SVT XML config.
+        SvtConfigEvioProcessor svtProcessor = new SvtConfigEvioProcessor();
+        processors.add(svtProcessor);
+
+        // Processor for getting EPICS data.
+        EpicsRunProcessor epicsProcessor = new EpicsRunProcessor();
+        processors.add(epicsProcessor);
+
+        // Run the job using the EVIO loop.
+        EvioLoop loop = new EvioLoop();
+        loop.addProcessors(processors);
+        EvioFileSource source = new EvioFileSource(evioFiles);
+        loop.setEvioFileSource(source);
+        loop.loop(-1);
+
+        // Set livetime field values.
+        updateLivetimes(scalersProcessor);
+
+        // Set TI time offset.
+        runSummary.setTiTimeOffset(tiProcessor.getTiTimeOffset());
+
+        // Set DAQ config object.
+        runSummary.setDAQConfig(daqProcessor.getDAQConfig());
+
+        // Set map of crate number to string trigger config data.
+        runSummary.setTriggerConfigData(daqProcessor.getTriggerConfigData());
+        LOGGER.info("found " + daqProcessor.getTriggerConfigData().size() + " valid trigger config banks");
+
+        // Set EPICS data list.
+        epicsData = epicsProcessor.getEpicsData();
+
+        // Set scalers list.
+        scalerData = scalersProcessor.getScalerData();
+
+        // Set SVT config data strings.
+        svtConfigs = svtProcessor.getSvtConfigs();
+
+        LOGGER.info("done processing EVIO files");
+    }
+
+    /**
+     * Run the job to build the information for the database and perform an update (if not dry run).
+     * 
+     * @return this object
+     */
+    RunDatabaseBuilder run() {
+        
+        LOGGER.info("building run " + getRun());
+        
+        if (this.runSummary == null) {
+            throw new IllegalStateException("The run summary was never created.");
+        }        
+        
+        // Setup datacat client.
+        initializeDatacat();
+        
+        // Find EVIO datasets in the datacat.
+        findEvioDatasets();
+
+        // Set total number of files.
+        updateTotalFiles();
+
+        // Set GO and PRESTART timestamps.
+        updateStartTimestamps();
+
+        // Set END timestamp.
+        updateEndTimestamp();
+
+        // Set total number of events.
+        updateTotalEvents();
+
+        // Calculate trigger rate.
+        updateTriggerRate();
+                
+        // Run EVIO job if enabled.
+        if (!this.skipEvioProcessing) {
+            processEvioFiles();
+        } else {
+            LOGGER.info("EVIO file processing is skipped.");
+        }
+
+        // Get extra info from spreadsheet if enabled.
+        if (this.spreadsheetFile != null) {
+            updateFromSpreadsheet();
+        } else {
+            LOGGER.info("Run spreadsheet not used.");
+        }
+
+        // Print out summary info to the log before updating database.
+        printSummary();
+
+        if (!dryRun) {
+            // Update the database.
+            updateDatabase();
+        } else {
+            // Dry run so database is not updated.
+            LOGGER.info("Dry run enabled so no updates were performed.");
+        }
+        
+        return this;
+    }
+
+    /**
+     * Set the connection parameters for the run database.
+     * 
+     * @param connectionParameters the connection parameters for the run database
+     * @return this object
+     */
+    RunDatabaseBuilder setConnectionParameters(ConnectionParameters connectionParameters) {
+        this.connectionParameters = connectionParameters;
+        return this;
+    }
+
+    /**
+     * Set the detector name for initializing the conditions system.
+     * 
+     * @param detectorName the detector name for initializing the conditions system
+     * @return this object
+     */
+    RunDatabaseBuilder setDetectorName(String detectorName) {
+        this.detectorName = detectorName;
+        LOGGER.config("detector = " + this.detectorName);
+        return this;
+    }
+
+    /**
+     * Enable dry run mode, in which the database is not updated.
+     * 
+     * @param dryRun <code>true</code> to perform a dry run
+     * @return this object
+     */
+    RunDatabaseBuilder setDryRun(boolean dryRun) {
+        this.dryRun = dryRun;
+        LOGGER.config("dryRun = " + this.dryRun);
+        return this;
+    }
+
+    /**
+     * Enable replacement of existing records in the database.
+     * 
+     * @param replace <code>true</code> to allow replacement of records
+     * @return this object
+     */
+    RunDatabaseBuilder setReplace(boolean replace) {
+        this.replace = replace;
+        LOGGER.config("replace = " + this.replace);
+        return this;
+    }
+
+    /**
+     * Set the path to the run spreadsheet CSV file from Google Docs.
+     * 
+     * @param spreadsheetFile spreadsheet CSV file (can be <code>null</code>)
+     * @return this object
+     */
+    RunDatabaseBuilder setSpreadsheetFile(File spreadsheetFile) {
+        this.spreadsheetFile = spreadsheetFile;
+        if (this.spreadsheetFile != null) {
+            LOGGER.config("spreadsheetFile = " + this.spreadsheetFile.getPath());
+        }
+        return this;
+    }
+    
+    /**
+     * Set whether full EVIO file processing should occur to extract EPICS data, etc. 
+     * <p>
+     * Even if this is disabled, the first and last EVIO files will still be processed
+     * for timestamps.
+     * 
+     * @param skipEvioProcessing <code>true</code> to disable full EVIO file processing
+     * @return this object
+     */
+    RunDatabaseBuilder skipEvioProcessing(boolean skipEvioProcessing) {
+        this.skipEvioProcessing = skipEvioProcessing;
+        LOGGER.config("skipEvioProcessing = " + this.skipEvioProcessing);
+        return this;
+    }
+
+    /**
+     * Update the database after the run information has been created.
+     */
+    private void updateDatabase() {
+
+        LOGGER.fine("updating the run database");
+        
+        // Initialize the run manager.
+        RunManager runManager = new RunManager(connectionParameters.createConnection());
+        runManager.setRun(runSummary.getRun());
+
+        // Does run exist?
+        if (runManager.runExists()) {
+            
+            LOGGER.info("run already exists");
+            
+            // If replacement is not enabled and run exists, then this is a fatal exception.
+            if (!replace) {
+                throw new RuntimeException("Run already exists (use -x option to enable replacement).");
+            }
+
+            // Delete the run so insert statements can be used to rebuild it.
+            LOGGER.info("deleting existing run");
+            runManager.deleteRun();
+        }
+
+        // Insert the run data into the database.
+        LOGGER.info("inserting the run data");
+        insertRun(runManager.getConnection());
+
+        // Close the database connection.
+        runManager.closeConnection();
+    }
+
+    /**
+     * Update the run summary's end timestamp.
+     */
+    private void updateEndTimestamp() {
+        LOGGER.info("updating end timestamp");
+        IntBankDefinition headBankDefinition = new IntBankDefinition(HeadBankData.class, new int[] {0x2e, 0xe10f});
+        File lastEvioFile = evioFiles.get(evioFiles.size() - 1);
+        EvioReader reader = null;
+        Integer endTimestamp = null;
+        try {
+            reader = EvioFileUtilities.open(lastEvioFile, true);
+            EvioEvent evioEvent = reader.parseNextEvent();
+            while (evioEvent != null) {
+                if (EventTagConstant.END.matches(evioEvent)) {
+                    endTimestamp = EvioEventUtilities.getControlEventData(evioEvent)[0];
+                    LOGGER.fine("found END timestamp " + endTimestamp);
+                    break;
+                }
+                BaseStructure headBank = headBankDefinition.findBank(evioEvent);
+                if (headBank != null && headBank.getIntData()[0] != 0) {
+                    endTimestamp = headBank.getIntData()[0];
+                }
+                evioEvent = reader.parseNextEvent();
+            }
+        } catch (IOException | EvioException e) {
+            throw new RuntimeException("Error reading last EVIO file.", e);
+        } finally {
+            if (reader != null) {
+                try {
+                    reader.close();
+                } catch (Exception e) {
+                    LOGGER.log(Level.SEVERE, e.getMessage(), e);
+                }
+            }
+        }
+        runSummary.setEndTimestamp(endTimestamp);
+        LOGGER.fine("end timestamp set to " + endTimestamp);
+    }
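The scan above has a simple shape worth isolating: remember the last nonzero head-bank timestamp as a fallback, but stop immediately if an END control event supplies the authoritative value. A sketch with events modeled as plain records (the real code reads EVIO banks):

```java
/**
 * Sketch of the end-timestamp scan: walk events in order, track the
 * last nonzero head-bank timestamp, and return early at an END event.
 * The Event record here is a stand-in for parsed EVIO structures.
 */
public class EndTimestampScan {

    public static final class Event {
        public final boolean isEnd;
        public final int timestamp; // 0 means "no timestamp in this event"

        public Event(boolean isEnd, int timestamp) {
            this.isEnd = isEnd;
            this.timestamp = timestamp;
        }
    }

    public static Integer findEndTimestamp(Iterable<Event> events) {
        Integer endTimestamp = null;
        for (Event event : events) {
            if (event.isEnd) {
                return event.timestamp; // END control event is authoritative
            }
            if (event.timestamp != 0) {
                endTimestamp = event.timestamp; // fallback: last head-bank value
            }
        }
        return endTimestamp; // may be null if no timestamp was ever seen
    }
}
```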
+
+    /**
+     * Update the current run summary from information in the run spreadsheet.
+     */
+    private void updateFromSpreadsheet() {       
+        LOGGER.fine("updating from spreadsheet file " + spreadsheetFile.getPath());
+        RunSpreadsheet runSpreadsheet = new RunSpreadsheet(spreadsheetFile);
+        RunData data = runSpreadsheet.getRunMap().get(runSummary.getRun());        
+        if (data != null) {
+            LOGGER.info("found run data ..." + '\n' + data.getRecord());
+            String triggerConfigName = data.getRecord().get("trigger_config");
+            if (triggerConfigName != null) {
+                runSummary.setTriggerConfigName(triggerConfigName);
+                LOGGER.info("set trigger config name <" + runSummary.getTriggerConfigName() + "> from spreadsheet");
+            }
+            String notes = data.getRecord().get("notes");
+            if (notes != null) {
+                runSummary.setNotes(notes);
+                LOGGER.info("set notes <" + runSummary.getNotes() + "> from spreadsheet");
+            }
+            String target = data.getRecord().get("target");
+            if (target != null) {
+                runSummary.setTarget(target);
+                LOGGER.info("set target <" + runSummary.getTarget() + "> from spreadsheet");
+            }
+        } else {
+            LOGGER.warning("No record for this run was found in spreadsheet.");
+        }
+    }
+
+    /**
+     * Calculate the DAQ livetime measurements from the last scaler data bank.
+     * 
+     * @param scalersProcessor the EVIO scaler data processor
+     */
+    private void updateLivetimes(ScalersEvioProcessor scalersProcessor) {
+        LOGGER.fine("updating livetime calculations");
+        ScalerData scalers = scalersProcessor.getCurrentScalerData();
+        if (scalers == null) {
+            throw new IllegalStateException("No scaler data was found by the EVIO processor.");
+        }
+        double[] livetimes = ScalerUtilities.getLiveTimes(scalers);
+        runSummary.setLivetimeClock(livetimes[LiveTimeIndex.CLOCK.ordinal()]);
+        runSummary.setLivetimeFcupTdc(livetimes[LiveTimeIndex.FCUP_TDC.ordinal()]);
+        runSummary.setLivetimeFcupTrg(livetimes[LiveTimeIndex.FCUP_TRG.ordinal()]);
+        LOGGER.info("clock livetime set to " + runSummary.getLivetimeClock());
+        LOGGER.info("fcup tdc livetime set to " + runSummary.getLivetimeFcupTdc());
+        LOGGER.info("fcup trg livetime set to " + runSummary.getLivetimeFcupTrg());
+    }
+
+    /**
+     * Update the starting timestamps from the first EVIO file.
+     */
+    private void updateStartTimestamps() {
+        LOGGER.fine("updating start timestamps");
+        File firstEvioFile = evioFiles.get(0);
+        int sequence = EvioFileUtilities.getSequenceFromName(firstEvioFile);
+        if (sequence != 0) {
+            LOGGER.warning("first file does not have sequence 0");
+        }
+        EvioReader reader = null;
+        try {
+            reader = EvioFileUtilities.open(firstEvioFile, true);
+            EvioEvent evioEvent = reader.parseNextEvent();
+            Integer prestartTimestamp = null;
+            Integer goTimestamp = null;
+            while (evioEvent != null) {
+                if (EventTagConstant.PRESTART.matches(evioEvent)) {
+                    prestartTimestamp = EvioEventUtilities.getControlEventData(evioEvent)[0];
+                } else if (EventTagConstant.GO.matches(evioEvent)) {
+                    goTimestamp = EvioEventUtilities.getControlEventData(evioEvent)[0];
+                }
+                if (prestartTimestamp != null && goTimestamp != null) {
+                    break;
+                }
+                evioEvent = reader.parseNextEvent();
+            }
+            runSummary.setPrestartTimestamp(prestartTimestamp);
+            runSummary.setGoTimestamp(goTimestamp);
+        } catch (IOException | EvioException e) {
+            throw new RuntimeException("Error reading first EVIO file.", e);
+        } finally {
+            if (reader != null) {
+                try {
+                    reader.close();
+                } catch (Exception e) {
+                    LOGGER.log(Level.SEVERE, e.getMessage(), e);
+                }
+            }
+        }
+        LOGGER.info("PRESTART timestamp set to " + runSummary.getPrestartTimestamp());
+        LOGGER.info("GO timestamp set to " + runSummary.getGoTimestamp());
+    }
+
+    /**
+     * Update the total number of events.
+     */
+    private void updateTotalEvents() {
+        LOGGER.fine("updating total events");
+        int totalEvents = 0;
+        for (Entry<File, Dataset> entry : evioDatasets.entrySet()) {
+            totalEvents += entry.getValue().getLocations().get(0).getEventCount();
+        }
+        runSummary.setTotalEvents(totalEvents);
+        LOGGER.info("total events set to " + runSummary.getTotalEvents());
+    }
+
+    /**
+     * Update the total number of EVIO files in the run.
+     */
+    private void updateTotalFiles() {
+        LOGGER.fine("updating total files");
+        // Set number of files from datacat query.
+        runSummary.setTotalFiles(evioFiles.size());
+        LOGGER.info("total files set to " + runSummary.getTotalFiles());
+    }
+
+    /**
+     * Update the trigger rate.
+     */
+    private void updateTriggerRate() {
+        LOGGER.fine("updating trigger rate");
+        if (runSummary.getEndTimestamp() != null && runSummary.getGoTimestamp() != null) {
+            double triggerRate = ((double) runSummary.getTotalEvents() / ((double) runSummary.getEndTimestamp() - (double) runSummary
+                    .getGoTimestamp())) / 1000.;
+            runSummary.setTriggerRate(triggerRate);
+            LOGGER.info("trigger rate set to " + runSummary.getTriggerRate());
+        } else {
+            LOGGER.warning("Skipped trigger rate calculation because END or GO timestamp is missing.");
+        }
+    }
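The rate arithmetic above reduces to total events over the GO-to-END interval, scaled to kHz. A self-contained sketch, assuming the timestamps are Unix seconds (as the control-event handling above suggests):

```java
/**
 * Sketch of the trigger rate calculation: events divided by the
 * GO-to-END interval in seconds, converted to kHz. Timestamps in
 * Unix seconds are an assumption based on the surrounding code.
 */
public class TriggerRate {

    /** Trigger rate in kHz for a run. */
    static double triggerRateKhz(long totalEvents, int goTimestamp, int endTimestamp) {
        double seconds = (double) endTimestamp - (double) goTimestamp;
        return ((double) totalEvents / seconds) / 1000.;
    }

    public static void main(String[] args) {
        // 2,000,000 events over a 100 second run -> 20 kHz
        System.out.println(triggerRateKhz(2_000_000L, 1_448_000_000, 1_448_000_100));
    }
}
```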
+}

Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseCommandLine.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseCommandLine.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseCommandLine.java	Tue Dec  1 15:55:47 2015
@@ -1,76 +1,39 @@
 package org.hps.run.database;
 
 import java.io.File;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.Date;
-import java.util.HashMap;
-import java.util.HashSet;
-import java.util.List;
-import java.util.Map;
-import java.util.Set;
-import java.util.logging.Logger;
 
 import org.apache.commons.cli.CommandLine;
+import org.apache.commons.cli.DefaultParser;
 import org.apache.commons.cli.HelpFormatter;
 import org.apache.commons.cli.Options;
 import org.apache.commons.cli.ParseException;
-import org.apache.commons.cli.DefaultParser;
 import org.hps.conditions.database.ConnectionParameters;
-import org.hps.datacat.client.DatacatClient;
-import org.hps.datacat.client.DatacatClientFactory;
-import org.hps.datacat.client.Dataset;
-import org.hps.datacat.client.DatasetMetadata;
-import org.hps.record.evio.EvioFileUtilities;
 
 /**
- * Command line tool for updating the run database from EVIO files registered in the data catalog.
+ * Command line tool for inserting records into the run database.
  *
  * @author Jeremy McCormick, SLAC
  */
-public class RunDatabaseCommandLine {
-
-    /**
-     * Set of features supported by the tool.
-     */
-    static enum Feature {
-        /**
-         * Insert EPICS data.
-         */
-        EPICS,
-        /**
-         * Insert scaler data.
-         */
-        SCALERS,
-        /**
-         * Insert run summary.
-         */
-        SUMMARY,
-        /**
-         * Insert trigger config.
-         */
-        TRIGGER_CONFIG
-    }
-
-    /**
-     * Initialize the logger.
-     */
-    private static final Logger LOGGER = Logger.getLogger(RunDatabaseCommandLine.class.getPackage().getName());
-
+public final class RunDatabaseCommandLine {
+        
     /**
      * Command line options for the crawler.
      */
-    private static final Options OPTIONS = new Options();
+    private static final Options OPTIONS = new Options();    
 
     /**
      * Statically define the command options.
      */
     static {
-        OPTIONS.addOption("f", "feature", true, "enable a feature");
-        OPTIONS.addOption("p", "connection-properties", true, "database connection properties file (required)");
         OPTIONS.addOption("h", "help", false, "print help and exit (overrides all other arguments)");
         OPTIONS.addOption("r", "run", true, "run to update");
-        OPTIONS.addOption("u", "update", false, "allow updating existing run in the database");
+        OPTIONS.addOption("p", "connection-properties", true, "database connection properties file (required)");       
+        OPTIONS.addOption("D", "dry-run", false, "dry run which will not update the database");
+        OPTIONS.addOption("x", "replace", false, "allow deleting and replacing an existing run");
+        OPTIONS.addOption("s", "spreadsheet", true, "path to run database spreadsheet (CSV format)");
+        OPTIONS.addOption("d", "detector", true, "conditions system detector name");        
+        OPTIONS.addOption("N", "no-evio-processing", false, "skip processing of all EVIO files");
+        OPTIONS.addOption("L", "load", false, "load back run information after inserting (for debugging)");
     }
 
     /**
@@ -79,113 +42,57 @@
      * @param args the command line arguments
      */
     public static void main(final String args[]) {
+        // Parse command line options and run the job.
         new RunDatabaseCommandLine().parse(args).run();
     }
-
+    
     /**
-     * Allow updating of the database for existing runs.
+     * Enable dry run which will not update the run database.
      */
-    private boolean allowUpdates = false;
-
+    private boolean dryRun = false;
+    
     /**
-     * The set of enabled features.
+     * Run number.
      */
-    private final Set<Feature> features = new HashSet<Feature>();
-
+    private int run;
+    
     /**
-     * The run manager for interacting with the run db.
+     * Path to spreadsheet CSV file.
      */
-    private RunManager runManager;
-
+    private File spreadsheetFile = null;
+    
     /**
-     * Create a run processor from the current configuration.
-     *
-     * @return the run processor
+     * Name of detector for conditions system (default for Eng Run 2015 provided here).
      */
-    private RunProcessor createEvioRunProcessor(final RunSummaryImpl runSummary, final List<File> files) {
-
-        final RunProcessor runProcessor = new RunProcessor(runSummary, files);
-
-        if (features.contains(Feature.EPICS)) {
-            runProcessor.addEpicsProcessor();
-        }
-        if (features.contains(Feature.SCALERS)) {
-            runProcessor.addScalerProcessor();
-        }
-        if (features.contains(Feature.TRIGGER_CONFIG)) {
-            runProcessor.addTriggerTimeProcessor();
-        }
-
-        return runProcessor;
-    }
-
+    private String detectorName = "HPS-EngRun2015-Nominal-v3";
+    
     /**
-     * Get the list of EVIO files for the run.
-     *
-     * @param run the run number
-     * @return the list of EVIO files from the run
+     * Allow replacement of existing records.
      */
-    private Map<File, Dataset> getEvioFiles(final int run) {
-        final DatacatClient datacatClient = new DatacatClientFactory().createClient();
-        final Set<String> metadata = new HashSet<String>();
-        final Map<File, Dataset> files = new HashMap<File, Dataset>();
-        metadata.add("runMin");
-        metadata.add("eventCount");
-        metadata.add("fileNumber");
-        metadata.add("endTimestamp");
-        metadata.add("startTimestamp");
-        metadata.add("hasEnd");
-        metadata.add("hasPrestart");
-        final List<Dataset> datasets = datacatClient.findDatasets("data/raw",
-                "fileFormat eq 'EVIO' AND dataType eq 'RAW' AND runMin eq " + run, metadata);
-        if (datasets.isEmpty()) {
-            throw new IllegalStateException("No EVIO datasets for run " + run + " were found in the data catalog.");
-        }
-        for (final Dataset dataset : datasets) {
-            files.put(new File(dataset.getLocations().get(0).getResource()), dataset);
-        }
-        return files;
-    }
-
+    private boolean replace = false;
+    
     /**
-     * Insert information for a run into the database.
-     *
-     * @param runManager the run manager for interacting with the run db
-     * @param runSummary the run summary with information about the run
+     * Skip full EVIO file processing.
      */
-    private void insertRun(final RunManager runManager, final RunSummary runSummary) {
-
-        final RunDatabaseDaoFactory runFactory = new RunDatabaseDaoFactory(runManager.getConnection());
-
-        // Add the run summary record.
-        if (this.features.contains(Feature.SUMMARY)) {
-            LOGGER.info("inserting run summary");
-            runFactory.createRunSummaryDao().insertRunSummary(runSummary);
-        }
-
-        if (this.features.contains(Feature.EPICS)) {
-            LOGGER.info("inserting EPICS data");
-            runFactory.createEpicsDataDao().insertEpicsData(runSummary.getEpicsData());
-        }
-
-        if (this.features.contains(Feature.SCALERS)) {
-            LOGGER.info("inserting scaler data");
-            runFactory.createScalerDataDao().insertScalerData(runSummary.getScalerData(), runManager.getRun());
-        }
-
-        if (this.features.contains(Feature.TRIGGER_CONFIG)) {
-            LOGGER.info("inserting trigger config");
-            runFactory.createTriggerConfigDao().insertTriggerConfig(runSummary.getTriggerConfig(), runManager.getRun());
-        }
-    }
-
+    private boolean skipEvioProcessing = false;
+    
     /**
-     * Parse command line options and return reference to <code>this</code>.
+     * Load back run information after insert (for debugging).
+     */
+    private boolean load = false;
+    
+    /**
+     * Database connection parameters.
+     */
+    private ConnectionParameters connectionParameters = null;
+    
+    /**
+     * Parse command line options and return reference to <code>this</code> object.
      *
      * @param args the command line arguments
      * @return reference to this object
      */
-    RunDatabaseCommandLine parse(final String args[]) {
+    private RunDatabaseCommandLine parse(final String args[]) {
         try {
             final CommandLine cl = new DefaultParser().parse(OPTIONS, args);
 
@@ -204,43 +111,52 @@
                     throw new IllegalArgumentException("Connection properties file " + dbPropFile.getPath()
                             + " does not exist.");
                 }
-                final ConnectionParameters connectionParameters = ConnectionParameters.fromProperties(dbPropFile);
-                LOGGER.config("using " + dbPropPath + " for db connection properties");
-
-                runManager = new RunManager(connectionParameters.createConnection());
-
+                connectionParameters = ConnectionParameters.fromProperties(dbPropFile);
             } else {
                 // Database connection properties file is required.
-                throw new RuntimeException("Connection properties are required.");
+                throw new RuntimeException("Connection properties are a required argument.");
             }
 
-            Integer run = null;
+            // Run number.
             if (cl.hasOption("r")) {
                 run = Integer.parseInt(cl.getOptionValue("r"));
             } else {
                 throw new RuntimeException("The run number is required.");
             }
-            runManager.setRun(run);
-
-            if (cl.hasOption("f")) {
-                // Enable individual features.
-                for (final String arg : cl.getOptionValues("f")) {
-                    features.add(Feature.valueOf(arg));
+            
+            // Dry run.
+            if (cl.hasOption("D")) {
+                this.dryRun = true;
+            }
+            
+            // Run spreadsheet.
+            if (cl.hasOption("s")) {
+                this.spreadsheetFile = new File(cl.getOptionValue("s"));
+                if (!this.spreadsheetFile.exists()) {
+                    throw new RuntimeException("The run spreadsheet " + this.spreadsheetFile.getPath() + " is inaccessible or does not exist.");
                 }
-            } else {
-                // By default all features are enabled.
-                features.addAll(Arrays.asList(Feature.values()));
             }
-            for (final Feature feature : features) {
-                LOGGER.config("feature " + feature.name() + " is enabled.");
+            
+            // Detector name.
+            if (cl.hasOption("d")) {
+                this.detectorName = cl.getOptionValue("d");
             }
-
-            // Allow updates to existing runs in the db.
-            if (cl.hasOption("u")) {
-                this.allowUpdates = true;
-                LOGGER.config("updating or replacing existing run data is enabled");
+            
+            // Replace existing run.
+            if (cl.hasOption("x")) {
+                this.replace = true;
             }
-
+            
+            // Skip full EVIO processing.
+            if (cl.hasOption("N")) {
+                this.skipEvioProcessing = true;
+            }
+            
+            // Load back run info at end of job.
+            if (cl.hasOption("L")) {
+                this.load = true;
+            }
+            
         } catch (final ParseException e) {
             throw new RuntimeException(e);
         }
@@ -249,90 +165,19 @@
     }
 
     /**
-     * Run the job to update the information in the run database.
+     * Configure the builder from command line options and run the job to update the database.
      */
     private void run() {
-
-        LOGGER.info("starting");
-
-        final boolean runExists = runManager.runExists();
-
-        // Fail if run exists and updates are not allowed.
-        if (runExists && !allowUpdates) {
-            throw new IllegalStateException("The run " + runManager.getRun()
-                    + " already exists and updates are not allowed.");
-        }
-
-        // Get the run number configured from command line.
-        final int run = runManager.getRun();
-
-        // Get the list of EVIO files for the run using a data catalog query.
-        final Map<File, Dataset> fileDatasets = this.getEvioFiles(run);
-        final List<File> files = new ArrayList<File>(fileDatasets.keySet());
-        EvioFileUtilities.sortBySequence(files);
-
-        // Process the run's files to get information.
-        final RunSummaryImpl runSummary = new RunSummaryImpl(run);
-        final RunProcessor runProcessor = this.createEvioRunProcessor(runSummary, files);
-        try {
-            runProcessor.processRun();
-        } catch (final Exception e) {
-            throw new RuntimeException(e);
-        }
-
-        // Set number of files from datacat query.
-        runSummary.setTotalFiles(files.size());
-
-        // Set run start date.
-        this.setStartDate(fileDatasets, files, runSummary);
-
-        // Set run end date.
-        this.setEndDate(fileDatasets, files, runSummary);
-
-        // Delete existing run.
-        if (runExists) {
-            runManager.deleteRun();
-        }
-
-        // Insert run into database.
-        this.insertRun(runManager, runSummary);
-
-        // Close the database connection.
-        runManager.closeConnection();
-
-        LOGGER.info("done");
+        new RunDatabaseBuilder()
+            .createRunSummary(run)
+            .setDetectorName(detectorName)
+            .setConnectionParameters(connectionParameters)
+            .setDryRun(dryRun)
+            .setReplace(replace)
+            .skipEvioProcessing(skipEvioProcessing)
+            .setSpreadsheetFile(spreadsheetFile)
+            .run()
+            .load(load);
     }
-
-    /**
-     * Set the run end date.
-     *
-     * @param fileDatasets the run's datasets
-     * @param files the run's EVIO files
-     * @param runSummary the run summary
-     */
-    private void setEndDate(final Map<File, Dataset> fileDatasets, final List<File> files,
-            final RunSummaryImpl runSummary) {
-        final Dataset lastDataset = fileDatasets.get(files.get(files.size() - 1));
-        final DatasetMetadata metadata = lastDataset.getMetadata();
-        // System.out.println("endTimestamp: " + metadata.getLong("endTimestamp"));
-        final Date endDate = new Date(metadata.getLong("endTimestamp"));
-        // System.out.println("endDate: " + startDate);
-        runSummary.setEndDate(endDate);
-        runSummary.setEndOkay(metadata.getLong("hasEnd") == 0 ? false : true);
-    }
-
-    /**
-     * Set the run start date.
-     *
-     * @param fileDatasets the run's datasets
-     * @param files the run's EVIO files
-     * @param runSummary the run summary
-     */
-    private void setStartDate(final Map<File, Dataset> fileDatasets, final List<File> files,
-            final RunSummaryImpl runSummary) {
-        final Dataset firstDataset = fileDatasets.get(files.get(0));
-        final DatasetMetadata metadata = firstDataset.getMetadata();
-        final Date startDate = new Date(metadata.getLong("startTimestamp"));
-        runSummary.setStartDate(startDate);
-    }
+        
 }
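The rewritten `run()` above replaces the old imperative body with a fluent chain on `RunDatabaseBuilder`. That builder class is not shown in this diff, so the following is only an illustrative sketch of the chaining idiom it relies on (each setter returns `this`); the field names are assumptions, not the actual hps class.

```java
// Minimal sketch of the fluent-builder style used by the rewritten run()
// method above. RunDatabaseBuilder itself is not part of this diff, so
// these fields and setters are illustrative assumptions only.
public class FluentBuilderSketch {

    static class Builder {
        int run;
        boolean dryRun;
        String detectorName;

        // Each setter returns `this` so the calls can be chained.
        Builder createRunSummary(int run) { this.run = run; return this; }
        Builder setDetectorName(String name) { this.detectorName = name; return this; }
        Builder setDryRun(boolean dryRun) { this.dryRun = dryRun; return this; }

        // run() also returns `this` so further calls (like load()) can follow.
        Builder run() {
            if (dryRun) {
                System.out.println("dry run: would update run " + run);
            }
            return this;
        }
    }

    public static void main(String[] args) {
        Builder b = new Builder()
            .createRunSummary(5772)
            .setDetectorName("HPS-EngRun2015-Nominal-v3-4-fieldmap")
            .setDryRun(true)
            .run();
        System.out.println(b.run); // prints 5772
    }
}
```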

Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseDaoFactory.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseDaoFactory.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseDaoFactory.java	Tue Dec  1 15:55:47 2015
@@ -5,12 +5,8 @@
 
 /**
  * Factory for creating database API objects for interacting with the run database.
- * <p>
- * This allows the implementation classes to be package protected as only public interfaces are returned by this class.
  *
  * @author Jeremy McCormick, SLAC
- * @see EpicsDataDao
- * @see EpicsVariableDao
  */
 final class RunDatabaseDaoFactory {
 
@@ -73,13 +69,13 @@
     ScalerDataDao createScalerDataDao() {
         return new ScalerDataDaoImpl(connection);
     }
-
+    
     /**
-     * Get the trigger config DAO.
-     *
-     * @return the trigger config DAO
+     * Get the SVT config DAO.
+     * 
+     * @return the SVT config DAO
      */
-    TriggerConfigDao createTriggerConfigDao() {
-        return new TriggerConfigDaoImpl(connection);
+    SvtConfigDao createSvtConfigDao() {
+        return new SvtConfigDaoImpl(connection);
     }
 }
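The factory above (per the javadoc removed in this revision) exists so that only public interfaces are returned while the implementation classes stay package-private. A minimal standalone sketch of that idiom, with hypothetical names rather than the real hps DAO classes:

```java
// Sketch of the factory idiom used by RunDatabaseDaoFactory: callers only
// ever see the public interface; the implementation class has default
// (package-private) visibility. Names here are illustrative, not the
// actual org.hps.run.database classes.
public class DaoFactorySketch {

    public interface ScalerDao {
        String describe();
    }

    // Package-private implementation; invisible outside this package.
    static class ScalerDaoImpl implements ScalerDao {
        public String describe() { return "scaler DAO"; }
    }

    // Factory method exposes only the interface type.
    static ScalerDao createScalerDao() {
        return new ScalerDaoImpl();
    }

    public static void main(String[] args) {
        ScalerDao dao = createScalerDao();
        System.out.println(dao.describe()); // prints "scaler DAO"
    }
}
```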

Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunManager.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunManager.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunManager.java	Tue Dec  1 15:55:47 2015
@@ -8,7 +8,7 @@
 import org.hps.conditions.database.ConnectionParameters;
 import org.hps.record.epics.EpicsData;
 import org.hps.record.scalers.ScalerData;
-import org.hps.record.triggerbank.TriggerConfig;
+import org.hps.record.svt.SvtConfigData;
 import org.lcsim.conditions.ConditionsEvent;
 import org.lcsim.conditions.ConditionsListener;
 
@@ -23,13 +23,11 @@
      * Simple class for caching data.
      */
     private class DataCache {
-
-        List<EpicsData> epicsData;
-        RunSummary fullRunSummary;
-        Boolean runExists;
-        RunSummary runSummary;
-        List<ScalerData> scalerData;
-        TriggerConfig triggerConfig;
+        Boolean runExists = null;
+        RunSummary runSummary = null;
+        List<EpicsData> epicsData = null;
+        List<ScalerData> scalerData = null;
+        List<SvtConfigData> svtConfigData = null;
     }
 
     /**
@@ -142,12 +140,16 @@
      *
      * @param run the run number
      */
-    public void deleteRun() {
-        // Create object for updating run info in the database.
-        final RunSummaryDao runSummaryDao = factory.createRunSummaryDao();
-
-        // Delete run from the database.
-        runSummaryDao.deleteFullRun(run);
+    void deleteRun() {
+        
+        factory.createEpicsDataDao().deleteEpicsData(EpicsType.EPICS_2s, run);
+        factory.createEpicsDataDao().deleteEpicsData(EpicsType.EPICS_20s, run);
+        
+        factory.createScalerDataDao().deleteScalerData(run);
+        
+        factory.createSvtConfigDao().deleteSvtConfigs(run);
+        
+        factory.createRunSummaryDao().deleteRunSummary(run);
     }
 
     /**
@@ -185,24 +187,11 @@
     }
 
     /**
-     * Get the full run summary for the current run including scaler data, etc.
-     *
-     * @return the full run summary for the current run
-     */
-    public RunSummary getFullRunSummary() {
-        this.checkRunNumber();
-        if (this.dataCache.fullRunSummary == null) {
-            this.dataCache.fullRunSummary = factory.createRunSummaryDao().readFullRunSummary(this.run);
-        }
-        return this.dataCache.fullRunSummary;
-    }
-
-    /**
      * Get the current run number.
      *
      * @return the run number
      */
-    public int getRun() {
+    public int getCurrentRun() {
         return run;
     }
 
@@ -214,16 +203,7 @@
     public List<Integer> getRuns() {
         return new RunSummaryDaoImpl(this.connection).getRuns();
     }
-
-    /**
-     * Get the full list of summaries for all runs in the database without complex data like EPICS records.
-     *
-     * @return the full list of run summaries
-     */
-    public List<RunSummary> getRunSummaries() {
-        return this.factory.createRunSummaryDao().getRunSummaries();
-    }
-
+  
     /**
      * Get the run summary for the current run not including its sub-objects like scaler data.
      *
@@ -250,39 +230,21 @@
         }
         return this.dataCache.scalerData;
     }
-
-    /**
-     * Get the trigger config for the current run.
-     *
-     * @return the trigger config for the current run
-     */
-    public TriggerConfig getTriggerConfig() {
-        this.checkRunNumber();
-        if (this.dataCache.triggerConfig == null) {
-            LOGGER.info("loading trigger config for run " + this.run);
-            this.dataCache.triggerConfig = factory.createTriggerConfigDao().getTriggerConfig(run);
-        }
-        return this.dataCache.triggerConfig;
-    }
-
-    /**
-     * Update the database with information found from crawling the files.
-     *
-     * @param runs the list of runs to update
-     * @throws SQLException if there is a database query error
-     */
-    public void insertRun(final RunSummary runSummary) throws SQLException {
-        LOGGER.info("updating run database for run " + runSummary.getRun());
-
-        // Create object for updating run info in the database.
-        final RunSummaryDao runSummaryDao = factory.createRunSummaryDao();
-
-        // Insert run summary into database.
-        runSummaryDao.insertFullRunSummary(runSummary);
-
-        LOGGER.info("done updating run database");
-    }
-
+    
+    /**
+     * Get SVT configuration data.
+     * 
+     * @return the SVT configuration data
+     */
+    public List<SvtConfigData> getSvtConfigData() {
+        this.checkRunNumber();
+        if (this.dataCache.svtConfigData == null) {
+            LOGGER.info("loading SVT configuration data for run " + this.run);
+            this.dataCache.svtConfigData = factory.createSvtConfigDao().getSvtConfigs(run);
+        }
+        return this.dataCache.svtConfigData;
+    }
+     
     /**
      * Open a new database connection from the connection parameters if the current one is closed or <code>null</code>.
      * <p>
@@ -307,7 +269,7 @@
     public boolean runExists() {
         this.checkRunNumber();
         if (this.dataCache.runExists == null) {
-            this.dataCache.runExists = factory.createRunSummaryDao().runSummaryExists(this.run);
+            this.dataCache.runExists = factory.createRunSummaryDao().runExists(this.run);
         }
         return this.dataCache.runExists;
     }
@@ -319,10 +281,7 @@
      * @return <code>true</code> if the run exists in the database
      */
     boolean runExists(final int run) {
-        if (this.dataCache.runExists == null) {
-            this.dataCache.runExists = factory.createRunSummaryDao().runSummaryExists(run);
-        }
-        return this.dataCache.runExists;
+        return factory.createRunSummaryDao().runExists(run);
     }
 
     /**

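The `DataCache` fields in `RunManager` above follow a lazy load-and-cache pattern: each getter reads through to its DAO on the first call and serves the cached value afterwards. A self-contained sketch with the DAO lookup faked, since no database is available here:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the lazy-load-and-cache pattern RunManager uses for per-run
// data: the first getter call reads through to the DAO, later calls hit
// the cache. The DAO query is faked with a fixed list.
public class DataCacheSketch {

    List<Integer> scalerData;   // null until first requested
    int daoReads = 0;           // counts simulated database reads

    List<Integer> getScalerData() {
        if (scalerData == null) {
            daoReads++;
            scalerData = Arrays.asList(1, 2, 3); // stand-in for the DAO query
        }
        return scalerData;
    }

    public static void main(String[] args) {
        DataCacheSketch cache = new DataCacheSketch();
        cache.getScalerData();
        cache.getScalerData();
        System.out.println(cache.daoReads); // prints 1: the DAO is hit once
    }
}
```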
Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummary.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummary.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummary.java	Tue Dec  1 15:55:47 2015
@@ -1,137 +1,154 @@
 package org.hps.run.database;
 
-import java.io.File;
 import java.util.Date;
-import java.util.List;
+import java.util.Map;
 
-import org.hps.datacat.client.DatasetFileFormat;
-import org.hps.record.epics.EpicsData;
-import org.hps.record.scalers.ScalerData;
-import org.hps.record.triggerbank.TriggerConfig;
+import org.hps.record.daqconfig.DAQConfig;
 
 /**
- * This is an API for accessing run summary information which is persisted as a row in the <i>runs</i> table of the run
- * database.
+ * This is an API for accessing run summary information which is persisted as a row in the <i>run_summaries</i> table.
  * <p>
- * This information includes:
- * <ul>
- * <li>run number</li>
- * <li>start date</li>
- * <li>end date</li>
- * <li>number of events</li>
- * <li>number of EVIO files</li>
- * <li>whether the END event was found indicating that the DAQ did not crash</li>
- * <li>whether the run is considered good (all <code>true</code> for now)</li>
- * </ul>
- * <p>
- * It also references several complex objects including lists of {@link org.hps.record.epics.EpicsData} and
- * {@link org.hps.record.scalers.ScalerData} for the run, as well as a list of EVIO files.
+ * All timestamp fields use the Unix convention (seconds since the epoch).
  *
+ * @author Jeremy McCormick, SLAC
  * @see RunSummaryImpl
  * @see RunSummaryDao
  * @see RunSummaryDaoImpl
  * @see RunManager
- * 
- * @author Jeremy McCormick, SLAC
  */
 public interface RunSummary {
-  
+
+    /*
+     * Mapping of trigger config fields to crate numbers.
+     */
+    public static final int TRIGGER_CONFIG1 = 37;
+    public static final int TRIGGER_CONFIG2 = 39;
+    public static final int TRIGGER_CONFIG3 = 46;
+    public static final int TRIGGER_CONFIG4 = 58;
+
     /**
-     * Get the creation date of this run record.
+     * Get the creation date of this record.
      *
-     * @return the creation date of this run record
+     * @return the creation date of this record
      */
     Date getCreated();
 
     /**
-     * Get the end date.
-     *
-     * @return the end date
+     * Get the trigger config.
+     * 
+     * @return the trigger config
      */
-    Date getEndDate();
+    DAQConfig getDAQConfig();
 
     /**
-     * Return <code>true</code> if END event was found in the data.
-     *
-     * @return <code>true</code> if END event was in the data
+     * Get the END event timestamp or the timestamp from the last head bank if END is not present.
+     * 
+     * @return the last event timestamp
      */
-    boolean getEndOkay();
+    Integer getEndTimestamp();
 
     /**
-     * Get the EPICS data from the run.
-     *
-     * @return the EPICS data from the run
+     * Get the GO event timestamp.
+     * 
+     * @return the GO event timestamp
      */
-    List<EpicsData> getEpicsData();
+    Integer getGoTimestamp();
 
     /**
-     * Get the event rate (effectively the trigger rate) which is the total events divided by the number of seconds in
-     * the run.
-     *
-     * @return the event rate
+     * Get the livetime computed from the clock scaler.
+     * 
+     * @return the livetime computed from the clock scaler
      */
-    double getEventRate();
+    Double getLivetimeClock();
+
+    /**
+     * Get the livetime computed from the FCUP_TDC scaler.
+     * 
+     * @return the livetime computed from the FCUP_TDC scaler
+     */
+    Double getLivetimeFcupTdc();
+
+    /**
+     * Get the livetime computed from the FCUP_TRG scaler.
+     * 
+     * @return the livetime computed from the FCUP_TRG scaler
+     */
+    Double getLivetimeFcupTrg();
+
+    /**
+     * Get the notes for the run (from the run spreadsheet).
+     * 
+     * @return the notes for the run
+     */
+    String getNotes();
+
+    /**
+     * Get the PRESTART event timestamp.
+     * 
+     * @return the PRESTART event timestamp
+     */
+    Integer getPrestartTimestamp();
 
     /**
      * Get the run number.
      *
      * @return the run number
      */
-    int getRun();
+    Integer getRun();
+   
+    /**
+     * Get the target setting for the run (string from run spreadsheet).
+     * 
+     * @return the target setting for the run
+     */
+    String getTarget();
 
     /**
-     * Return <code>true</code> if the run was okay (no major errors or data corruption occurred).
-     *
-     * @return <code>true</code> if the run was okay
+     * Get the TI time offset in ns.
+     * 
+     * @return the TI time offset in ns
      */
-    boolean getRunOkay();
+    Long getTiTimeOffset();
 
     /**
-     * Get the scaler data of this run.
+     * Get the total number of events in the run.
      *
-     * @return the scaler data of this run
+     * @return the total number of events in the run
      */
-    List<ScalerData> getScalerData();
+    Integer getTotalEvents();
 
     /**
-     * Get the trigger config int values.
+     * Get the total number of EVIO files in this run.
      *
-     * @return the trigger config int values
+     * @return the total number of files in this run
      */
-    TriggerConfig getTriggerConfig();
+    Integer getTotalFiles();
 
     /**
-     * Get the start date.
-     *
-     * @return the start date
+     * Get a map of crate number to trigger config data.
+     * 
+     * @return the map of crate number to trigger config data
      */
-    Date getStartDate();
+    Map<Integer, String> getTriggerConfigData();
 
     /**
-     * Get the total events in the run.
-     *
-     * @return the total events in the run
+     * Get the trigger config name (from the run spreadsheet).
+     * 
+     * @return the trigger config name
      */
-    int getTotalEvents();
+    String getTriggerConfigName();
 
     /**
-     * Get the total number of EVIO files for this run.
-     *
-     * @return the total number of files for this run
+     * Get the trigger rate in kHz.
+     * 
+     * @return the trigger rate in kHz
      */
-    int getTotalFiles();
+    Double getTriggerRate();
 
     /**
-     * Get the number of seconds in the run which is the difference between the start and end times.
+     * Get the date when this record was last updated.
      *
-     * @return the total seconds in the run
-     */
-    long getTotalSeconds();
-
-    /**
-     * Get the date when this run record was last updated.
-     *
-     * @return the date when this run record was last updated
+     * @return the date when this record was last updated
      */
     Date getUpdated();
 }
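The revised `RunSummary` javadoc notes that all timestamp fields use the Unix convention (seconds since the epoch). Converting such a value to a `java.util.Date`, which expects milliseconds, therefore needs a factor of 1000, and the multiply must be done in `long` arithmetic to avoid `int` overflow. An illustrative conversion (the timestamp value is arbitrary):

```java
import java.util.Date;

// RunSummary timestamps are Unix seconds; java.util.Date wants epoch
// milliseconds, so multiply by 1000L (long arithmetic avoids int overflow).
public class TimestampSketch {
    public static void main(String[] args) {
        Integer endTimestamp = 1_448_995_200; // e.g. an END event timestamp
        Date endDate = new Date(endTimestamp * 1000L);
        System.out.println(endDate.getTime()); // prints 1448995200000
    }
}
```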

Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java	Tue Dec  1 15:55:47 2015
@@ -8,14 +8,7 @@
  * @author Jeremy McCormick, SLAC
  */
 interface RunSummaryDao {
-
-    /**
-     * Delete a run summary from the database including its referenced objects such as EPICS data.
-     *
-     * @param runSummary the run summary to delete
-     */
-    void deleteFullRun(int run);
-
+  
     /**
      * Delete a run summary by run number.
      *
@@ -36,14 +29,7 @@
      * @return the list of run numbers
      */
     List<Integer> getRuns();
-
-    /**
-     * Get a list of run summaries without loading their objects such as EPICS data.
-     *
-     * @return the list of run summaries
-     */
-    List<RunSummary> getRunSummaries();
-
+  
     /**
      * Get a run summary by run number without loading object state.
      *
@@ -51,36 +37,13 @@
      * @return the run summary object
      */
     RunSummary getRunSummary(int run);
-
+  
     /**
-     * Insert a list of run summaries along with its referenced objects such as scaler and EPICS data.
-     *
-     * @param runSummaryList the list of run summaries
-     * @param deleteExisting <code>true</code> to allow deletion and replacement of existing run summaries
-     */
-    void insertFullRunSummaries(List<RunSummary> runSummaryList, boolean deleteExisting);
-
-    /**
-     * Insert a run summary including all its objects.
-     *
-     * @param runSummary the run summary object
-     */
-    void insertFullRunSummary(RunSummary runSummary);
-
-    /**
-     * Insert a run summary but not its objects.
+     * Insert a run summary.
      *
      * @param runSummary the run summary object
      */
     void insertRunSummary(RunSummary runSummary);
-
-    /**
-     * Read a run summary and its objects such as scaler data.
-     *
-     * @param run the run number
-     * @return the full run summary
-     */
-    RunSummary readFullRunSummary(int run);
 
     /**
      * Return <code>true</code> if a run summary exists in the database.
@@ -88,12 +51,12 @@
      * @param run the run number
      * @return <code>true</code> if <code>run</code> exists in the database
      */
-    boolean runSummaryExists(int run);
+    boolean runExists(int run);
 
     /**
-     * Update a run summary but not its objects.
+     * Update a run summary.
      *
      * @param runSummary the run summary to update
      */
-    void updateRunSummary(RunSummary runSummary);
+    void updateRunSummary(RunSummary runSummary);    
 }
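The `RunSummaryDaoImpl` changes that follow still close each `PreparedStatement` manually in a `finally` block. Since Java 7 the same guarantee is available via try-with-resources; sketched here with a stand-in `AutoCloseable`, since no real JDBC connection is available in an example:

```java
// Try-with-resources guarantees close() runs even if the body throws,
// replacing the manual finally-block cleanup seen in the DAO code.
// FakeStatement stands in for a JDBC PreparedStatement here.
public class TryWithResourcesSketch {

    static class FakeStatement implements AutoCloseable {
        static boolean closed = false;
        int executeUpdate() { return 1; }
        @Override public void close() { closed = true; }
    }

    public static void main(String[] args) {
        // close() is called automatically when the try block exits.
        try (FakeStatement stmt = new FakeStatement()) {
            stmt.executeUpdate();
        }
        System.out.println(FakeStatement.closed); // prints true
    }
}
```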

Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java	Tue Dec  1 15:55:47 2015
@@ -1,17 +1,15 @@
 package org.hps.run.database;
 
+import java.sql.Clob;
 import java.sql.Connection;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.util.ArrayList;
-import java.util.Calendar;
-import java.util.GregorianCalendar;
+import java.util.LinkedHashMap;
 import java.util.List;
-import java.util.TimeZone;
+import java.util.Map;
 import java.util.logging.Logger;
-
-import org.hps.record.epics.EpicsData;
 
 /**
  * Implementation of database operations for {@link RunSummary} objects in the run database.
@@ -21,36 +19,35 @@
 final class RunSummaryDaoImpl implements RunSummaryDao {
 
     /**
-     * SQL query strings.
-     */
-    private static final class RunSummaryQuery {
-
-        /**
-         * Delete by run number.
-         */
-        private static final String DELETE_RUN = "DELETE FROM runs WHERE run = ?";
-        /**
-         * Insert a record for a run.
-         */
-        private static final String INSERT = "INSERT INTO runs (run, start_date, end_date, nevents, nfiles, end_ok, created) VALUES(?, ?, ?, ?, ?, ?, NOW())";
-        /**
-         * Select all records.
-         */
-        private static final String SELECT_ALL = "SELECT * from runs";
-        /**
-         * Select record by run number.
-         */
-        private static final String SELECT_RUN = "SELECT run, start_date, end_date, nevents, nfiles, end_ok, run_ok, updated, created FROM runs WHERE run = ?";
-        /**
-         * Update information for a run.
-         */
-        private static final String UPDATE_RUN = "UPDATE runs SET start_date, end_date, nevents, nfiles, end_ok, run_ok WHERE run = ?";
-    }
-
-    /**
-     * Eastern time zone.
-     */
-    private static Calendar CALENDAR = new GregorianCalendar(TimeZone.getTimeZone("America/New_York"));
+     * Expected number of string banks in trigger config.
+     */
+    private static final int TRIGGER_CONFIG_LEN = 4;
+
+    /**
+     * Delete by run number.
+     */
+    private static final String DELETE = "DELETE FROM run_summaries WHERE run = ?";
+        
+    /**
+     * Insert a record for a run.
+     */
+    private static final String INSERT = "INSERT INTO run_summaries (run, nevents, nfiles, prestart_timestamp,"
+            + " go_timestamp, end_timestamp, trigger_rate, trigger_config_name, trigger_config1, trigger_config2," 
+            + " trigger_config3, trigger_config4, ti_time_offset, livetime_clock, livetime_fcup_tdc, livetime_fcup_trg,"
+            + " target, notes, created) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, NOW())";
+                     
+    /**
+     * Select record by run number.
+     */
+    private static final String SELECT = "SELECT * FROM run_summaries WHERE run = ?";
+        
+    /**
+     * Update information for a run.
+     */
+    private static final String UPDATE = "UPDATE run_summaries SET nevents = ?, nfiles = ?, prestart_timestamp = ?,"
+            + " go_timestamp = ?, end_timestamp = ?, trigger_rate = ?, trigger_config_name = ?, trigger_config1 = ?,"
+            + " trigger_config2 = ?, trigger_config3 = ?, trigger_config4 = ?, ti_time_offset = ?, livetime_clock = ?,"
+            + " livetime_fcup_tdc = ?, livetime_fcup_trg = ?, target = ?, notes = ? WHERE run = ?";
 
     /**
      * Initialize the logger.
@@ -61,60 +58,24 @@
      * The database connection.
      */
     private final Connection connection;
-
-    /**
-     * The database API for EPICS data.
-     */
-    private EpicsDataDao epicsDataDao = null;
-
-    /**
-     * The database API for scaler data.
-     */
-    private ScalerDataDao scalerDataDao = null;
-
-    /**
-     * The database API for integer trigger config.
-     */
-    private TriggerConfigDao triggerConfigIntDao = null;
-
+  
     /**
      * Create a new DAO object for run summary information.
      *
      * @param connection the database connection
      */
     RunSummaryDaoImpl(final Connection connection) {
-        // Set the connection.
         if (connection == null) {
             throw new IllegalArgumentException("The connection is null.");
         }
+        try {
+            if (connection.isClosed()) {
+                throw new IllegalArgumentException("The connection is closed.");
+            }
+        } catch (SQLException e) {
+            throw new RuntimeException(e);
+        }
         this.connection = connection;
-
-        // Setup DAO API objects for managing complex object state.
-        epicsDataDao = new EpicsDataDaoImpl(this.connection);
-        scalerDataDao = new ScalerDataDaoImpl(this.connection);
-        triggerConfigIntDao = new TriggerConfigDaoImpl(this.connection);
-    }
-
-    /**
-     * Delete a run from the database including its referenced objects such as EPICS data.
-     *
-     * @param runSummary the run summary to delete
-     */
-    @Override
-    public void deleteFullRun(int run) {
-
-        // Delete EPICS log.
-        this.epicsDataDao.deleteEpicsData(EpicsType.EPICS_1S, run);
-        this.epicsDataDao.deleteEpicsData(EpicsType.EPICS_10S, run);
-
-        // Delete scaler data.
-        this.scalerDataDao.deleteScalerData(run);
-
-        // Delete trigger config.
-        this.triggerConfigIntDao.deleteTriggerConfigInt(run);
-
-        // Finally delete the run summary information.
-        this.deleteRunSummary(run);
     }
 
     /**
@@ -126,7 +87,7 @@
     public void deleteRunSummary(final int run) {
         PreparedStatement preparedStatement = null;
         try {
-            preparedStatement = connection.prepareStatement(RunSummaryQuery.DELETE_RUN);
+            preparedStatement = connection.prepareStatement(DELETE);
             preparedStatement.setInt(1, run);
             preparedStatement.executeUpdate();
         } catch (final SQLException e) {
@@ -151,7 +112,7 @@
     public void deleteRunSummary(final RunSummary runSummary) {
         PreparedStatement preparedStatement = null;
         try {
-            preparedStatement = connection.prepareStatement(RunSummaryQuery.DELETE_RUN);
+            preparedStatement = connection.prepareStatement(DELETE);
             preparedStatement.setInt(1, runSummary.getRun());
             preparedStatement.executeUpdate();
         } catch (final SQLException e) {
@@ -177,7 +138,7 @@
         final List<Integer> runs = new ArrayList<Integer>();
         PreparedStatement preparedStatement = null;
         try {
-            preparedStatement = this.connection.prepareStatement("SELECT distinct(run) FROM runs ORDER BY run");
+            preparedStatement = this.connection.prepareStatement("SELECT distinct(run) FROM run_summaries ORDER BY run");
             final ResultSet resultSet = preparedStatement.executeQuery();
             while (resultSet.next()) {
                 final Integer run = resultSet.getInt(1);
@@ -196,47 +157,9 @@
         }
         return runs;
     }
-
-    /**
-     * Get a list of run summaries without loading their objects such as EPICS data.
-     *
-     * @return the list of run summaries
-     */
-    @Override
-    public List<RunSummary> getRunSummaries() {
-        PreparedStatement statement = null;
-        final List<RunSummary> runSummaries = new ArrayList<RunSummary>();
-        try {
-            statement = this.connection.prepareStatement(RunSummaryQuery.SELECT_ALL);
-            final ResultSet resultSet = statement.executeQuery();
-            while (resultSet.next()) {
-                final RunSummaryImpl runSummary = new RunSummaryImpl(resultSet.getInt("run"));
-                runSummary.setStartDate(resultSet.getTimestamp("start_date"));
-                runSummary.setEndDate(resultSet.getTimestamp("end_date"));
-                runSummary.setTotalEvents(resultSet.getInt("nevents"));
-                runSummary.setTotalFiles(resultSet.getInt("nfiles"));
-                runSummary.setEndOkay(resultSet.getBoolean("end_ok"));
-                runSummary.setRunOkay(resultSet.getBoolean("run_ok"));
-                runSummary.setUpdated(resultSet.getTimestamp("updated"));
-                runSummary.setCreated(resultSet.getTimestamp("created"));
-                runSummaries.add(runSummary);
-            }
-        } catch (final SQLException e) {
-            throw new RuntimeException(e);
-        } finally {
-            if (statement != null) {
-                try {
-                    statement.close();
-                } catch (final SQLException e) {
-                    e.printStackTrace();
-                }
-            }
-        }
-        return runSummaries;
-    }
-
-    /**
-     * Get a run summary by run number without loading object state.
+   
+    /**
+     * Get a run summary.
      *
      * @param run the run number
      * @return the run summary object
@@ -246,22 +169,32 @@
         PreparedStatement statement = null;
         RunSummaryImpl runSummary = null;
         try {
-            statement = this.connection.prepareStatement(RunSummaryQuery.SELECT_RUN);
+            statement = this.connection.prepareStatement(SELECT);
             statement.setInt(1, run);
             final ResultSet resultSet = statement.executeQuery();
             if (!resultSet.next()) {
-                throw new IllegalArgumentException("No record exists for run " + run + " in database.");
-            }
-
+                throw new IllegalArgumentException("Run " + run + " does not exist in database.");
+            }
             runSummary = new RunSummaryImpl(run);
-            runSummary.setStartDate(resultSet.getTimestamp("start_date"));
-            runSummary.setEndDate(resultSet.getTimestamp("end_date"));
             runSummary.setTotalEvents(resultSet.getInt("nevents"));
             runSummary.setTotalFiles(resultSet.getInt("nfiles"));
-            runSummary.setEndOkay(resultSet.getBoolean("end_ok"));
-            runSummary.setRunOkay(resultSet.getBoolean("run_ok"));
+            runSummary.setPrestartTimestamp(resultSet.getInt("prestart_timestamp"));
+            runSummary.setGoTimestamp(resultSet.getInt("go_timestamp"));
+            runSummary.setEndTimestamp(resultSet.getInt("end_timestamp"));
+            runSummary.setTriggerRate(resultSet.getDouble("trigger_rate"));
+            runSummary.setTriggerConfigName(resultSet.getString("trigger_config_name"));
+            Map<Integer, String> triggerConfigData = createTriggerConfigData(resultSet);
+            if (!triggerConfigData.isEmpty()) {
+                runSummary.setTriggerConfigData(triggerConfigData);
+            } 
+            runSummary.setTiTimeOffset(resultSet.getLong("ti_time_offset"));
+            runSummary.setLivetimeClock(resultSet.getDouble("livetime_clock"));
+            runSummary.setLivetimeFcupTdc(resultSet.getDouble("livetime_fcup_tdc"));
+            runSummary.setLivetimeFcupTrg(resultSet.getDouble("livetime_fcup_trg"));
+            runSummary.setTarget(resultSet.getString("target"));
+            runSummary.setNotes(resultSet.getString("notes"));
+            runSummary.setCreated(resultSet.getTimestamp("created"));
             runSummary.setUpdated(resultSet.getTimestamp("updated"));
-            runSummary.setCreated(resultSet.getTimestamp("created"));
         } catch (final SQLException e) {
             throw new RuntimeException(e);
         } finally {
@@ -277,182 +210,98 @@
     }
 
     /**
-     * Insert a list of run summaries along with their complex state such as referenced scaler and EPICS data.
-     *
-     * @param runSummaryList the list of run summaries
-     * @param deleteExisting <code>true</code> to allow deletion and replacement of existing run summaries
-     */
-    @Override
-    public void insertFullRunSummaries(final List<RunSummary> runSummaryList, final boolean deleteExisting) {
-
-        if (runSummaryList == null) {
-            throw new IllegalArgumentException("The run summary list is null.");
-        }
-        if (runSummaryList.isEmpty()) {
-            throw new IllegalArgumentException("The run summary list is empty.");
-        }
-
-        LOGGER.info("inserting " + runSummaryList.size() + " run summaries into database");
-
-        // Turn off auto commit.
-        try {
-            LOGGER.info("turning off auto commit");
-            this.connection.setAutoCommit(false);
-        } catch (final SQLException e) {
-            throw new RuntimeException(e);
-        }
-
-        // Loop over all runs found while crawling.
-        for (final RunSummary runSummary : runSummaryList) {
-
-            final int run = runSummary.getRun();
-
-            LOGGER.info("inserting run summary for run " + run + " into database");
-
-            // Does the run exist in the database already?
-            if (this.runSummaryExists(run)) {
-                // Is deleting existing rows allowed?
-                if (deleteExisting) {
-                    LOGGER.info("deleting existing run summary");
-                    // Delete the existing rows.
-                    this.deleteFullRun(runSummary.getRun());
-                } else {
-                    // Rows exist but updating is disallowed which is a fatal error.
-                    throw new IllegalStateException("Run " + runSummary.getRun()
-                            + " already exists and updates are disallowed.");
-                }
-            }
-
-            // Insert full run summary information including sub-objects.
-            LOGGER.info("inserting run summary");
-            this.insertFullRunSummary(runSummary);
-            LOGGER.info("run summary for " + run + " inserted successfully");
-
-            try {
-                // Commit the transaction for the run.
-                LOGGER.info("committing transaction");
-                this.connection.commit();
-            } catch (final SQLException e1) {
-                try {
-                    LOGGER.severe("rolling back transaction");
-                    // Rollback the transaction if there was an error.
-                    this.connection.rollback();
-                } catch (final SQLException e2) {
-                    throw new RuntimeException(e2);
-                }
-            }
-
-            LOGGER.info("done inserting run summary " + run);
-        }
-
-        try {
-            LOGGER.info("turning auto commit on");
-            // Turn auto commit back on.
-            this.connection.setAutoCommit(true);
-        } catch (final SQLException e) {
-            e.printStackTrace();
-        }
-
-        LOGGER.info("done inserting run summaries");
-    }
-
-    /**
-     * Insert a run summary including all its objects.
-     *
-     * @param runSummary the run summary object to insert
-     */
-    @Override
-    public void insertFullRunSummary(final RunSummary runSummary) {
-
-        if (runSummary == null) {
-            throw new IllegalArgumentException("The run summary is null.");
-        }
-        
-        // Insert basic run log info.
-        this.insertRunSummary(runSummary);
-
-        // Insert EPICS data.
-        if (runSummary.getEpicsData() != null && !runSummary.getEpicsData().isEmpty()) {
-            LOGGER.info("inserting " + runSummary.getEpicsData().size() + " EPICS records");
-            epicsDataDao.insertEpicsData(runSummary.getEpicsData());
+     * Create trigger config data from result set.
+     * 
+     * @param resultSet the result set with the run summary record
+     * @return the trigger config data as a map of bank number to string data
+     * @throws SQLException if there is an error reading from the result set
+     */
+    private Map<Integer, String> createTriggerConfigData(final ResultSet resultSet) throws SQLException {
+        Map<Integer, String> triggerConfigData = new LinkedHashMap<Integer, String>();
+        Clob clob = resultSet.getClob("trigger_config1");            
+        if (clob != null) {
+            triggerConfigData.put(RunSummary.TRIGGER_CONFIG1, clob.getSubString(1, (int) clob.length()));
+        }
+        clob = resultSet.getClob("trigger_config2");
+        if (clob != null) {
+            triggerConfigData.put(RunSummary.TRIGGER_CONFIG2, clob.getSubString(1, (int) clob.length()));
+        }
+        clob = resultSet.getClob("trigger_config3");
+        if (clob != null) {
+            triggerConfigData.put(RunSummary.TRIGGER_CONFIG3, clob.getSubString(1, (int) clob.length()));
+        }
+        clob = resultSet.getClob("trigger_config4");
+        if (clob != null) {
+            triggerConfigData.put(RunSummary.TRIGGER_CONFIG4, clob.getSubString(1, (int) clob.length()));
+        }
+        return triggerConfigData;
+    }
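As a standalone illustration of the CLOB extraction idiom above (JDBC CLOB positions are 1-based, so the full text is `getSubString(1, (int) clob.length())`), the JDK's `SerialClob` lets the pattern run without a database. The helper class and sample text below are illustrative, not part of this commit:

```java
import java.sql.Clob;
import java.sql.SQLException;
import javax.sql.rowset.serial.SerialClob;

public class ClobReadSketch {

    /**
     * Read the full contents of a CLOB column value, mirroring the pattern
     * in createTriggerConfigData: JDBC CLOB positions start at 1, so the
     * whole string is getSubString(1, length).
     */
    static String readClob(final Clob clob) throws SQLException {
        if (clob == null) {
            return null; // column was NULL, i.e. no config for this bank
        }
        return clob.getSubString(1, (int) clob.length());
    }

    public static void main(String[] args) throws SQLException {
        // SerialClob is an in-memory Clob implementation from the JDK.
        final Clob clob = new SerialClob("sample trigger config text".toCharArray());
        System.out.println(readClob(clob));
    }
}
```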
+      
+    /**
+     * Insert a run summary.
+     *
+     * @param runSummary the run summary object
+     */
+    @Override
+    public void insertRunSummary(final RunSummary runSummary) {
+        PreparedStatement preparedStatement = null;        
+        try {
+            preparedStatement = connection.prepareStatement(INSERT);                       
+            preparedStatement.setInt(1, runSummary.getRun());
+            preparedStatement.setInt(2, runSummary.getTotalEvents());
+            preparedStatement.setInt(3, runSummary.getTotalFiles());
+            preparedStatement.setInt(4, runSummary.getPrestartTimestamp());
+            preparedStatement.setInt(5, runSummary.getGoTimestamp());
+            preparedStatement.setInt(6, runSummary.getEndTimestamp());
+            preparedStatement.setDouble(7, runSummary.getTriggerRate());
+            preparedStatement.setString(8, runSummary.getTriggerConfigName());
+            Map<Integer, String> triggerData = runSummary.getTriggerConfigData();
+            prepareTriggerData(preparedStatement, triggerData);
+            preparedStatement.setLong(13, runSummary.getTiTimeOffset());
+            preparedStatement.setDouble(14, runSummary.getLivetimeClock());
+            preparedStatement.setDouble(15, runSummary.getLivetimeFcupTdc());
+            preparedStatement.setDouble(16, runSummary.getLivetimeFcupTrg());
+            preparedStatement.setString(17, runSummary.getTarget());
+            preparedStatement.setString(18, runSummary.getNotes());
+            LOGGER.fine(preparedStatement.toString());
+            preparedStatement.executeUpdate();
+        } catch (final SQLException e) {
+            throw new RuntimeException(e);
+        } finally {
+            if (preparedStatement != null) {
+                try {
+                    preparedStatement.close();
+                } catch (final SQLException e) {
+                    e.printStackTrace();
+                }
+            }
+        }
+    }
+
+    /**
+     * Set trigger config data on the prepared statement.
+     *
+     * @param preparedStatement the prepared statement
+     * @param triggerData the trigger config data
+     * @throws SQLException if there is an error setting parameters on the statement
+     */
+    private void prepareTriggerData(PreparedStatement preparedStatement, Map<Integer, String> triggerData)
+            throws SQLException {
+        if (triggerData != null && !triggerData.isEmpty()) {
+            if (triggerData.size() != TRIGGER_CONFIG_LEN) {
+                throw new IllegalArgumentException("The trigger config data has the wrong length.");
+            }
+            preparedStatement.setBytes(9, triggerData.get(RunSummary.TRIGGER_CONFIG1).getBytes());
+            preparedStatement.setBytes(10, triggerData.get(RunSummary.TRIGGER_CONFIG2).getBytes());
+            preparedStatement.setBytes(11, triggerData.get(RunSummary.TRIGGER_CONFIG3).getBytes());
+            preparedStatement.setBytes(12, triggerData.get(RunSummary.TRIGGER_CONFIG4).getBytes());
         } else {
-            LOGGER.warning("no EPICS data to insert");
-        }
-
-        // Insert scaler data.
-        if (runSummary.getScalerData() != null && !runSummary.getScalerData().isEmpty()) {
-            LOGGER.info("inserting " + runSummary.getScalerData().size() + " scaler data records");
-            scalerDataDao.insertScalerData(runSummary.getScalerData(), runSummary.getRun());
-        } else {
-            LOGGER.warning("no scaler data to insert");
-        }
-
-        // Insert trigger config.
-        if (runSummary.getTriggerConfig() != null && !runSummary.getTriggerConfig().isEmpty()) {
-            LOGGER.info("inserting " + runSummary.getTriggerConfig().size() + " trigger config variables");
-            triggerConfigIntDao.insertTriggerConfig(runSummary.getTriggerConfig(), runSummary.getRun());
-        } else {
-            LOGGER.warning("no trigger config to insert");
-        }
-    }
-
-    /**
-     * Insert a run summary but not its objects.
-     *
-     * @param runSummary the run summary object
-     */
-    @Override
-    public void insertRunSummary(final RunSummary runSummary) {
-        PreparedStatement preparedStatement = null;
-        try {
-            preparedStatement = connection.prepareStatement(RunSummaryQuery.INSERT);
-            preparedStatement.setInt(1, runSummary.getRun());
-            preparedStatement.setTimestamp(2, new java.sql.Timestamp(runSummary.getStartDate().getTime()), CALENDAR);
-            preparedStatement.setTimestamp(3, new java.sql.Timestamp(runSummary.getEndDate().getTime()), CALENDAR);
-            preparedStatement.setInt(4, runSummary.getTotalEvents());
-            preparedStatement.setInt(5, runSummary.getTotalFiles());
-            preparedStatement.setBoolean(6, runSummary.getEndOkay());
-            preparedStatement.executeUpdate();
-        } catch (final SQLException e) {
-            throw new RuntimeException(e);
-        } finally {
-            if (preparedStatement != null) {
-                try {
-                    preparedStatement.close();
-                } catch (final SQLException e) {
-                    e.printStackTrace();
-                }
-            }
-        }
-    }
-
-    /**
-     * Read a run summary and its objects such as scaler data.
-     *
-     * @param run the run number
-     * @return the full run summary
-     */
-    @Override
-    public RunSummary readFullRunSummary(final int run) {
-
-        // Read main run summary but not referenced objects.
-        final RunSummaryImpl runSummary = (RunSummaryImpl) this.getRunSummary(run);
-
-        // Read EPICS data and set on RunSummary.
-        final List<EpicsData> epicsDataList = new ArrayList<EpicsData>();
-        epicsDataList.addAll(epicsDataDao.getEpicsData(EpicsType.EPICS_1S, run));
-        epicsDataList.addAll(epicsDataDao.getEpicsData(EpicsType.EPICS_10S, run));
-        runSummary.setEpicsData(epicsDataList);
-
-        // Read scaler data and set on RunSummary.
-        runSummary.setScalerData(scalerDataDao.getScalerData(run));
-
-        // Read trigger config.
-        runSummary.setTriggerConfig(triggerConfigIntDao.getTriggerConfig(run));
-
-        return runSummary;
-    }
-
+            preparedStatement.setBytes(9, null);
+            preparedStatement.setBytes(10, null);
+            preparedStatement.setBytes(11, null);
+            preparedStatement.setBytes(12, null);
+        }
+    }
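One caveat about the no-argument `String.getBytes()` calls above: they encode with the JVM's default charset, so the stored bytes can vary between machines. A standalone sketch of the reproducible alternative, using made-up sample text:

```java
import java.nio.charset.StandardCharsets;

public class CharsetSketch {

    public static void main(String[] args) {
        final String config = "sample config";
        // Explicit charset: the encoded bytes are identical on every JVM,
        // unlike the no-argument getBytes(), which uses the platform default.
        final byte[] utf8 = config.getBytes(StandardCharsets.UTF_8);
        // All-ASCII text encodes to one byte per character in UTF-8.
        System.out.println(utf8.length);
    }
}
```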
+   
     /**
      * Return <code>true</code> if a run summary exists in the database for the run number.
      *
@@ -460,10 +309,10 @@
      * @return <code>true</code> if run exists in the database
      */
     @Override
-    public boolean runSummaryExists(final int run) {
-        PreparedStatement preparedStatement = null;
-        try {
-            preparedStatement = connection.prepareStatement("SELECT run FROM runs where run = ?");
+    public boolean runExists(final int run) {
+        PreparedStatement preparedStatement = null;
+        try {
+            preparedStatement = connection.prepareStatement("SELECT run FROM run_summaries where run = ?");
             preparedStatement.setInt(1, run);
             final ResultSet rs = preparedStatement.executeQuery();
             return rs.first();
@@ -481,7 +330,7 @@
     }
 
     /**
-     * Update a run summary but not its complex state.
+     * Update a run summary.
      *
      * @param runSummary the run summary to update
      */
@@ -489,14 +338,37 @@
     public void updateRunSummary(final RunSummary runSummary) {
         PreparedStatement preparedStatement = null;
         try {
-            preparedStatement = connection.prepareStatement(RunSummaryQuery.UPDATE_RUN);
-            preparedStatement.setTimestamp(1, new java.sql.Timestamp(runSummary.getStartDate().getTime()), CALENDAR);
-            preparedStatement.setTimestamp(2, new java.sql.Timestamp(runSummary.getEndDate().getTime()), CALENDAR);
-            preparedStatement.setInt(3, runSummary.getTotalEvents());
-            preparedStatement.setInt(4, runSummary.getTotalFiles());
-            preparedStatement.setBoolean(5, runSummary.getEndOkay());
-            preparedStatement.setBoolean(6, runSummary.getRunOkay());
-            preparedStatement.setInt(7, runSummary.getRun());
+            preparedStatement = connection.prepareStatement(UPDATE);                       
+            preparedStatement.setInt(1, runSummary.getTotalEvents());
+            preparedStatement.setInt(2, runSummary.getTotalFiles());
+            preparedStatement.setInt(3, runSummary.getPrestartTimestamp());
+            preparedStatement.setInt(4, runSummary.getGoTimestamp());
+            preparedStatement.setInt(5, runSummary.getEndTimestamp());
+            preparedStatement.setDouble(6, runSummary.getTriggerRate());
+            preparedStatement.setString(7, runSummary.getTriggerConfigName());
+            Map<Integer, String> triggerData = runSummary.getTriggerConfigData();
+            if (triggerData != null && !triggerData.isEmpty()) {
+                if (triggerData.size() != TRIGGER_CONFIG_LEN) {
+                    throw new IllegalArgumentException("The trigger config data has the wrong length.");
+                }
+                preparedStatement.setBytes(8, triggerData.get(RunSummary.TRIGGER_CONFIG1).getBytes());
+                preparedStatement.setBytes(9, triggerData.get(RunSummary.TRIGGER_CONFIG2).getBytes());
+                preparedStatement.setBytes(10, triggerData.get(RunSummary.TRIGGER_CONFIG3).getBytes());
+                preparedStatement.setBytes(11, triggerData.get(RunSummary.TRIGGER_CONFIG4).getBytes());
+            } else {
+                preparedStatement.setBytes(8, null);
+                preparedStatement.setBytes(9, null);
+                preparedStatement.setBytes(10, null);
+                preparedStatement.setBytes(11, null);
+            }
+            preparedStatement.setLong(12, runSummary.getTiTimeOffset());
+            preparedStatement.setDouble(13, runSummary.getLivetimeClock());
+            preparedStatement.setDouble(14, runSummary.getLivetimeFcupTdc());
+            preparedStatement.setDouble(15, runSummary.getLivetimeFcupTrg());
+            preparedStatement.setString(16, runSummary.getTarget());
+            preparedStatement.setString(17, runSummary.getNotes());
+            preparedStatement.setInt(18, runSummary.getRun());
+            LOGGER.fine(preparedStatement.toString());
             preparedStatement.executeUpdate();
         } catch (final SQLException e) {
             throw new RuntimeException(e);
@@ -509,5 +381,5 @@
                 }
             }
         }
-    }
+    }      
 }

Modified: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java	(original)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java	Tue Dec  1 15:55:47 2015
@@ -1,40 +1,19 @@
 package org.hps.run.database;
 
-import java.io.File;
-import java.text.DateFormat;
-import java.text.SimpleDateFormat;
-import java.util.ArrayList;
 import java.util.Date;
-import java.util.GregorianCalendar;
-import java.util.HashMap;
-import java.util.List;
 import java.util.Map;
-import java.util.TimeZone;
-
-import org.hps.datacat.client.DatasetFileFormat;
-import org.hps.record.epics.EpicsData;
-import org.hps.record.scalers.ScalerData;
-import org.hps.record.triggerbank.TriggerConfig;
+import java.util.Map.Entry;
+
+import org.hps.record.daqconfig.ConfigurationManager;
+import org.hps.record.daqconfig.DAQConfig;
+import org.hps.record.daqconfig.EvioDAQParser;
 
 /**
  * Implementation of {@link RunSummary} for retrieving information from the run database.
  *
  * @author Jeremy McCormick, SLAC
  */
-public final class RunSummaryImpl implements RunSummary {
-
-    /**
-     * Default date display format.
-     */
-    private static final DateFormat DATE_DISPLAY = new SimpleDateFormat();
-
-    static {
-        /**
-         * Set default time zone for display to East Coast (JLAB) where data was
-         * taken.
-         */
-        DATE_DISPLAY.setCalendar(new GregorianCalendar(TimeZone.getTimeZone("America/New_York")));
-    }
+final class RunSummaryImpl implements RunSummary {
 
     /**
      * Date this record was created.
@@ -42,60 +21,90 @@
     private Date created;
 
     /**
-     * End date of run.
-     */
-    private Date endDate;
-
-    /**
-     * This is <code>true</code> if the END event is found in the data.
-     */
-    private boolean endOkay;
-
-    /**
-     * The EPICS data from the run.
-     */
-    private List<EpicsData> epicsDataList;
+     * DAQ config object built from string data.
+     */
+    private DAQConfig daqConfig;
+
+    /**
+     * Timestamp of END event.
+     */
+    private Integer endTimestamp;
+
+    /**
+     * Timestamp of GO event.
+     */
+    private Integer goTimestamp;
+
+    /**
+     * Clock livetime calculation.
+     */
+    private Double livetimeClock;
+
+    /**
+     * FCup TDC livetime calculation.
+     */
+    private Double livetimeTdc;
+
+    /**
+     * FCup TRG livetime calculation.
+     */
+    private Double livetimeTrg;
+
+    /**
+     * Notes about the run (from spreadsheet).
+     */
+    private String notes;
+
+    /**
+     * Timestamp of PRESTART event.
+     */
+    private Integer prestartTimestamp;
 
     /**
      * The run number.
      */
-    private final int run;
-
-    /**
-     * Flag to indicate run was okay.
-     */
-    private boolean runOkay = true;
-
-    /**
-     * The scaler data for the run.
-     */
-    private List<ScalerData> scalerDataList;
-
-    /**
-     * The trigger data for the run.
-     */
-    private TriggerConfig triggerConfig;
-
-    /**
-     * Start date of run.
-     */
-    private Date startDate;
+    private final Integer run;
+
+    /**
+     * Target setup (string from run spreadsheet).
+     */
+    private String target;
+
+    /**
+     * TI time offset in ns.
+     */
+    private Long tiTimeOffset;
 
     /**
      * The total events found in the run across all files.
      */
-    private int totalEvents = -1;
+    private Integer totalEvents;
 
     /**
      * The total number of files in the run.
      */
-    private int totalFiles = 0;
+    private Integer totalFiles;
+
+    /**
+     * Map of crate number to trigger config string data.
+     */
+    private Map<Integer, String> triggerConfigData;
+
+    /**
+     * Name of the trigger config file.
+     */
+    private String triggerConfigName;
+
+    /**
+     * Trigger rate in kHz.
+     */
+    private Double triggerRate;
 
     /**
      * Date when the run record was last updated.
      */
     private Date updated;
-    
+
     /**
      * Create a run summary.
      *
@@ -105,209 +114,157 @@
         this.run = run;
     }
 
-    /**
-     * Get the creation date of this run record.
-     *
-     * @return the creation date of this run record
-     */
+    @Override
     public Date getCreated() {
         return this.created;
     }
 
-    /**
-     * Get the end date.
-     *
-     * @return the end date
-     */
-    public Date getEndDate() {
-        return endDate;
-    }
-
-    /**
-     * Return <code>true</code> if END event was found in the data.
-     *
-     * @return <code>true</code> if END event was in the data
-     */
-    public boolean getEndOkay() {
-        return this.endOkay;
-    }
-
-    /**
-     * Get the EPICS data from the run.
-     *
-     * @return the EPICS data from the run
-     */
-    public List<EpicsData> getEpicsData() {
-        return this.epicsDataList;
-    }
-
-    /**
-     * Get the event rate (effectively the trigger rate) which is the total
-     * events divided by the number of seconds in the run.
-     *
-     * @return the event rate
-     */
-    public double getEventRate() {
-        if (this.getTotalEvents() <= 0) {
-            throw new RuntimeException("Total events is zero or invalid.");
-        }
-        return (double) this.getTotalEvents() / (double) this.getTotalSeconds();
-    }
-
-    /**
-     * Get the run number.
-     *
-     * @return the run number
-     */
-    public int getRun() {
+    @Override
+    public DAQConfig getDAQConfig() {
+        return this.daqConfig;
+    }
+
+    @Override
+    public Integer getEndTimestamp() {
+        return endTimestamp;
+    }
+
+    @Override
+    public Integer getGoTimestamp() {
+        return goTimestamp;
+    }
+
+    @Override
+    public Double getLivetimeClock() {
+        return this.livetimeClock;
+    }
+
+    @Override
+    public Double getLivetimeFcupTdc() {
+        return this.livetimeTdc;
+    }
+
+    @Override
+    public Double getLivetimeFcupTrg() {
+        return this.livetimeTrg;
+    }
+
+    @Override
+    public String getNotes() {
+        return this.notes;
+    }
+
+    @Override
+    public Integer getPrestartTimestamp() {
+        return prestartTimestamp;
+    }
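The PRESTART, GO, and END timestamps above are stored as plain integers, which reads as second-resolution Unix epoch values (an assumption; the diff does not show how they are filled). Formatting one for display in the JLab time zone, the same `America/New_York` zone the removed `DATE_DISPLAY` constant used, might look like this standalone sketch:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class RunTimestampSketch {

    private static final DateTimeFormatter FORMAT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss", Locale.US);

    /**
     * Hypothetical helper: format a second-resolution epoch timestamp,
     * such as a PRESTART/GO/END value, in the JLab local time zone.
     */
    static String formatJlab(final int epochSeconds) {
        return Instant.ofEpochSecond(epochSeconds)
                .atZone(ZoneId.of("America/New_York"))
                .format(FORMAT);
    }

    public static void main(String[] args) {
        // 1448928000 is 2015-12-01T00:00:00Z; US Eastern is UTC-5 in December.
        System.out.println(formatJlab(1448928000));
    }
}
```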
+
+    @Override
+    public Integer getRun() {
         return this.run;
     }
 
-    /**
-     * Return <code>true</code> if the run was okay (no major errors or data
-     * corruption occurred).
-     *
-     * @return <code>true</code> if the run was okay
-     */
-    public boolean getRunOkay() {
-        return this.runOkay;
-    }
-
-    /**
-     * Get the scaler data of this run.
-     *
-     * @return the scaler data of this run
-     */
-    public List<ScalerData> getScalerData() {
-        return this.scalerDataList;
-    }
-
-    /**
-     * Get the trigger config of this run.
-     *
-     * @return the trigger config of this run
-     */
-    public TriggerConfig getTriggerConfig() {
-        return triggerConfig;
-    }
-
-    /**
-     * Get the start date.
-     *
-     * @return the start date
-     */
-    public Date getStartDate() {
-        return startDate;
-    }
-
-    /**
-     * Get the total events in the run.
-     *
-     * @return the total events in the run
-     */
-    public int getTotalEvents() {
+    @Override
+    public String getTarget() {
+        return this.target;
+    }
+
+    @Override
+    public Long getTiTimeOffset() {
+        return this.tiTimeOffset;
+    }
+
+    @Override
+    public Integer getTotalEvents() {
         return this.totalEvents;
     }
 
-    /**
-     * Get the total number of files for this run.
-     *
-     * @return the total number of files for this run
-     */
-    public int getTotalFiles() {
+    @Override
+    public Integer getTotalFiles() {
         return this.totalFiles;
     }
 
-    /**
-     * Get the number of seconds in the run which is the difference between the
-     * start and end times.
-     *
-     * @return the total seconds in the run
-     */
-    public long getTotalSeconds() {
-        return (endDate.getTime() - startDate.getTime()) / 1000;
-    }
-
-    /**
-     * Get the date when this run record was last updated.
-     *
-     * @return the date when this run record was last updated
-     */
+    @Override
+    public Map<Integer, String> getTriggerConfigData() {
+        return this.triggerConfigData;
+    }
+
+    @Override
+    public String getTriggerConfigName() {
+        return this.triggerConfigName;
+    }
+
+    @Override
+    public Double getTriggerRate() {
+        return this.triggerRate;
+    }
+
+    @Override
     public Date getUpdated() {
         return updated;
     }
-    
-    /**
-     * Set the creation date of the run record.
-     *
-     * @param created the creation date of the run record
-     */
-    void setCreated(final Date created) {
+
+    /**
+     * Load DAQ config object from trigger config string data.
+     */
+    private void loadDAQConfig() {
+        if (this.triggerConfigData != null && !this.triggerConfigData.isEmpty()) {
+            EvioDAQParser parser = new EvioDAQParser();
+            for (Entry<Integer, String> entry : this.triggerConfigData.entrySet()) {
+                parser.parse(entry.getKey(), this.getRun(), new String[] {entry.getValue()});
+            }
+            ConfigurationManager.updateConfiguration(parser);
+            daqConfig = ConfigurationManager.getInstance();
+        }
+    }
+
+    void setCreated(Date created) {
         this.created = created;
     }
 
-    /**
-     * Set the start date.
-     *
-     * @param startDate the start date
-     */
-    void setEndDate(final Date endDate) {
-        this.endDate = endDate;
-    }
-
-    /**
-     * Set if end is okay.
-     *
-     * @param endOkay <code>true</code> if end is okay
-     */
-    void setEndOkay(final boolean endOkay) {
-        this.endOkay = endOkay;
-    }
-   
-    /**
-     * Set the EPICS data for the run.
-     *
-     * @param epics the EPICS data for the run
-     */
-    void setEpicsData(final List<EpicsData> epicsDataList) {
-        this.epicsDataList = epicsDataList;
-    }
-    
-    /**
-     * Set whether the run was "okay" meaning the data is usable for physics
-     * analysis.
-     *
-     * @param runOkay <code>true</code> if the run is okay
-     */
-    void setRunOkay(final boolean runOkay) {
-        this.runOkay = runOkay;
-    }
-
-    /**
-     * Set the scaler data of the run.
-     *
-     * @param scalerData the scaler data
-     */
-    void setScalerData(final List<ScalerData> scalerDataList) {
-        this.scalerDataList = scalerDataList;
-    }
-
-    /**
-     * Set the trigger config of the run.
-     *
-     * @param triggerConfig the trigger config
-     */
-    void setTriggerConfig(final TriggerConfig triggerConfig) {
-        this.triggerConfig = triggerConfig;
-    }
-
-    /**
-     * Set the start date.
-     *
-     * @param startDate the start date
-     */
-    void setStartDate(final Date startDate) {
-        this.startDate = startDate;
+    void setDAQConfig(DAQConfig daqConfig) {
+        this.daqConfig = daqConfig;
+    }
+
+    void setEndTimestamp(Integer endTimestamp) {
+        this.endTimestamp = endTimestamp;
+    }
+
+    void setGoTimestamp(Integer goTimestamp) {
+        this.goTimestamp = goTimestamp;
+    }
+
+    void setLivetimeClock(Double livetimeClock) {
+        this.livetimeClock = livetimeClock;
+    }
+
+    void setLivetimeFcupTdc(Double livetimeTdc) {
+        this.livetimeTdc = livetimeTdc;
+    }
+
+    void setLivetimeFcupTrg(Double livetimeTrg) {
+        this.livetimeTrg = livetimeTrg;
+    }
+
+    void setNotes(String notes) {
+        this.notes = notes;
+    }
+
+    void setPrestartTimestamp(Integer prestartTimestamp) {
+        this.prestartTimestamp = prestartTimestamp;
+    }
+
+    void setTarget(String target) {
+        this.target = target;
+    }
+
+    /**
+     * Set the TI time offset in ns.
+     * 
+     * @param tiTimeOffset the TI time offset in ns
+     */
+    void setTiTimeOffset(Long tiTimeOffset) {
+        this.tiTimeOffset = tiTimeOffset;
     }
 
     /**
@@ -315,7 +272,7 @@
      *
      * @param totalEvents the total number of physics events in the run
      */
-    void setTotalEvents(final int totalEvents) {
+    void setTotalEvents(final Integer totalEvents) {
         this.totalEvents = totalEvents;
     }
 
@@ -324,37 +281,68 @@
      *
      * @param totalFiles the total number of EVIO files in the run
      */
-    void setTotalFiles(final int totalFiles) {
+    void setTotalFiles(final Integer totalFiles) {
         this.totalFiles = totalFiles;
     }
 
     /**
-     * Set the date when this run record was last updated.
-     *
-     * @param updated the date when the run record was last updated
-     */
-    void setUpdated(final Date updated) {
+     * Set the trigger config string data, building the DAQ config from it if not already loaded.
+     * 
+     * @param triggerConfigData a map of crate number to the trigger config string data from the bank
+     */
+    void setTriggerConfigData(Map<Integer, String> triggerConfigData) {
+        this.triggerConfigData = triggerConfigData;
+        // Load DAQ config if not already set.
+        if (daqConfig == null) {
+            loadDAQConfig();
+        }
+    }
+
+    /**
+     * Set the trigger config file name.
+     * 
+     * @param triggerConfigName the trigger config file name
+     */
+    void setTriggerConfigName(String triggerConfigName) {
+        this.triggerConfigName = triggerConfigName;
+    }
+
+    /**
+     * Set the trigger rate in kHz.
+     * 
+     * @param triggerRate the trigger rate in kHz
+     */
+    void setTriggerRate(Double triggerRate) {
+        this.triggerRate = triggerRate;
+    }
+
+    void setUpdated(Date updated) {
         this.updated = updated;
     }
-    
-    /**
-     * Convert this object to a string.
-     *
+
+    /**
+     * Convert the object to a string.
+     * 
      * @return this object converted to a string
      */
     @Override
     public String toString() {
         return "RunSummary { " 
                 + "run: " + this.getRun() 
-                + ", startDate: " + (this.getStartDate() != null ? DATE_DISPLAY.format(this.getStartDate()) : null)
-                + ", endDate: " + (this.getEndDate() != null ? DATE_DISPLAY.format(this.getEndDate()) : null) 
-                + ", totalEvents: " + this.getTotalEvents()
-                + ", totalFiles: " + this.getTotalFiles() 
-                + ", endOkay: " + this.getEndOkay() 
-                + ", runOkay: "
-                + this.getRunOkay() 
-                + ", updated: " + this.getUpdated() 
+                + ", events: " + this.getTotalEvents() 
+                + ", files: " + this.getTotalFiles() 
                 + ", created: " + this.getCreated() 
+                + ", updated: " + this.getUpdated()
+                + ", prestartTimestamp: " + this.getPrestartTimestamp()
+                + ", goTimestamp: " + this.getGoTimestamp()
+                + ", endTimestamp: " + this.getEndTimestamp()
+                + ", triggerConfigFile: " + this.getTriggerConfigName()
+                + ", DAQConfig: " + (this.getDAQConfig() != null ? true : false)
+                + ", triggerRate: " + this.getTriggerRate()
+                + ", livetimeClock: " + this.getLivetimeClock()
+                + ", livetimeTdc: " + this.getLivetimeFcupTdc()
+                + ", livetimeTrg: " + this.getLivetimeFcupTrg()
+                + ", tiTimeOffset: " + this.getTiTimeOffset() 
                 + " }";
     }
 }

Added: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDao.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDao.java	(added)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDao.java	Tue Dec  1 15:55:47 2015
@@ -0,0 +1,36 @@
+package org.hps.run.database;
+
+import java.util.List;
+
+import org.hps.record.svt.SvtConfigData;
+
+/**
+ * Database API for accessing SVT configuration in run database.
+ * 
+ * @author Jeremy McCormick, SLAC
+ */
+public interface SvtConfigDao {
+   
+    /**
+     * Insert SVT configurations.
+     * 
+     * @param svtConfigs the list of SVT configurations
+     * @param run the run number
+     */
+    void insertSvtConfigs(List<SvtConfigData> svtConfigs, int run);
+    
+    /**
+     * Get the list of SVT configurations for the run.
+     * 
+     * @param run the run number
+     * @return the list of SVT configurations
+     */
+    List<SvtConfigData> getSvtConfigs(int run);
+    
+    /**
+     * Delete SVT configurations for the run.
+     * 
+     * @param run the run number
+     */
+    void deleteSvtConfigs(int run);
+}
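The interface above is a classic DAO contract: insert, select, and delete keyed by run number. As an illustration of the expected behavior of that contract, here is a toy in-memory analogue — `ConfigRecord` and `InMemorySvtConfigDao` are hypothetical stand-ins for this sketch, not HPS classes:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy stand-in for org.hps.record.svt.SvtConfigData; illustration only.
class ConfigRecord {
    final int timestamp;
    ConfigRecord(int timestamp) { this.timestamp = timestamp; }
}

// Minimal in-memory analogue of the SvtConfigDao contract.
class InMemorySvtConfigDao {
    private final Map<Integer, List<ConfigRecord>> byRun = new HashMap<>();

    // Insert all configurations under the given run number.
    void insertSvtConfigs(List<ConfigRecord> configs, int run) {
        byRun.computeIfAbsent(run, r -> new ArrayList<>()).addAll(configs);
    }

    // Return the configurations for the run (empty list if none).
    List<ConfigRecord> getSvtConfigs(int run) {
        return byRun.getOrDefault(run, new ArrayList<>());
    }

    // Remove all configurations for the run.
    void deleteSvtConfigs(int run) {
        byRun.remove(run);
    }
}
```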

Added: java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDaoImpl.java
 =============================================================================
--- java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDaoImpl.java	(added)
+++ java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDaoImpl.java	Tue Dec  1 15:55:47 2015
@@ -0,0 +1,146 @@
+package org.hps.run.database;
+
+import java.sql.Clob;
+import java.sql.Connection;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.hps.record.svt.SvtConfigData;
+import org.hps.record.svt.SvtConfigData.RocTag;
+
+/**
+ * Implementation of SVT configuration database operations.
+ * 
+ * @author Jeremy McCormick, SLAC
+ */
+public class SvtConfigDaoImpl implements SvtConfigDao {
+
+    private final Connection connection;
+    
+    private static final String INSERT = 
+            "INSERT INTO svt_configs (run, timestamp, config1, status1, config2, status2) VALUES (?, ?, ?, ?, ?, ?)"; 
+    
+    private static final String SELECT = 
+            "SELECT * FROM svt_configs WHERE run = ?";
+    
+    SvtConfigDaoImpl(Connection connection) {
+        if (connection == null) {
+            throw new IllegalArgumentException("The connection is null.");
+        }
+        this.connection = connection;
+    }
+    
+    @Override
+    public void insertSvtConfigs(List<SvtConfigData> svtConfigs, int run) {
+        PreparedStatement preparedStatement = null;
+        try {
+            preparedStatement = connection.prepareStatement(INSERT);
+            for (SvtConfigData config : svtConfigs) {
+                preparedStatement.setInt(1, run);
+                preparedStatement.setInt(2, config.getTimestamp());
+                if (config.getConfigData(RocTag.DATA) != null) {
+                    preparedStatement.setBytes(3, config.getConfigData(RocTag.DATA).getBytes());
+                } else {
+                    preparedStatement.setBytes(3, null);
+                }
+                if (config.getStatusData(RocTag.DATA) != null) {
+                    preparedStatement.setBytes(4, config.getStatusData(RocTag.DATA).getBytes());
+                } else {
+                    preparedStatement.setBytes(4, null);
+                }
+                if (config.getConfigData(RocTag.CONTROL) != null) {
+                    preparedStatement.setBytes(5, config.getConfigData(RocTag.CONTROL).getBytes());
+                } else {
+                    preparedStatement.setBytes(5, null);
+                }
+                if (config.getStatusData(RocTag.CONTROL) != null) {
+                    preparedStatement.setBytes(6, config.getStatusData(RocTag.CONTROL).getBytes());
+                } else {
+                    preparedStatement.setBytes(6, null);
+                }
+                preparedStatement.executeUpdate();
+            }
+        } catch (SQLException e) {
+            throw new RuntimeException(e);
+        } finally {
+            if (preparedStatement != null) {
+                try {
+                    preparedStatement.close();
+                } catch (final SQLException e) {
+                    e.printStackTrace();
+                }
+            }
+        }
+    }
+    
+    @Override
+    public List<SvtConfigData> getSvtConfigs(int run) {
+        List<SvtConfigData> svtConfigList = new ArrayList<SvtConfigData>();
+        PreparedStatement preparedStatement = null;
+        try {
+            preparedStatement = connection.prepareStatement(SELECT);
+            preparedStatement.setInt(1, run);
+            ResultSet resultSet = preparedStatement.executeQuery();
+            while (resultSet.next()) {
+                
+                SvtConfigData config = new SvtConfigData(resultSet.getInt("timestamp"));
+                                
+                Clob clob = resultSet.getClob("config1");
+                if (clob != null) {
+                    config.setConfigData(RocTag.DATA, clob.getSubString(1, (int) clob.length()));
+                }
+                
+                clob = resultSet.getClob("status1");
+                if (clob != null) {
+                    config.setStatusData(RocTag.DATA, clob.getSubString(1, (int) clob.length()));
+                }
+                
+                clob = resultSet.getClob("config2");
+                if (clob != null) { 
+                    config.setConfigData(RocTag.CONTROL, clob.getSubString(1, (int) clob.length()));
+                }
+                
+                clob = resultSet.getClob("status2");
+                if (clob != null) {
+                    config.setStatusData(RocTag.CONTROL, clob.getSubString(1, (int) clob.length()));
+                }                
+                
+                svtConfigList.add(config);
+            }
+        } catch (SQLException e) {
+            throw new RuntimeException(e);
+        } finally {
+            if (preparedStatement != null) {
+                try {
+                    preparedStatement.close();
+                } catch (final SQLException e) {
+                    e.printStackTrace();
+                }
+            }
+        }
+        return svtConfigList;
+    }
+    
+    @Override
+    public void deleteSvtConfigs(int run) {
+        PreparedStatement preparedStatement = null;
+        try {
+            preparedStatement = connection.prepareStatement("DELETE FROM svt_configs WHERE run = ?");
+            preparedStatement.setInt(1, run);
+            preparedStatement.executeUpdate();
+        } catch (final SQLException e) {
+            throw new RuntimeException(e);
+        } finally {
+            if (preparedStatement != null) {
+                try {
+                    preparedStatement.close();
+                } catch (final SQLException e) {
+                    e.printStackTrace();
+                }
+            }
+        }
+    }
+}
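A pitfall in code like `insertSvtConfigs` is keeping the JDBC parameter indices aligned with the `?` placeholders: the INSERT statement declares six placeholders, so every setter call — including the null branches — must use an index from 1 to 6, and an out-of-range index fails with a `SQLException` at execution time. A minimal self-contained sanity check of that invariant (the `SqlCheck` helper is hypothetical, not part of the HPS code):

```java
// Hypothetical helper: verify the JDBC placeholder count for a SQL statement.
final class SqlCheck {

    // Same column list as the svt_configs INSERT in SvtConfigDaoImpl.
    static final String INSERT =
            "INSERT INTO svt_configs (run, timestamp, config1, status1, config2, status2) "
            + "VALUES (?, ?, ?, ?, ?, ?)";

    // Naive count of '?' placeholders; ignores quoted literals, which is
    // sufficient for the simple statements used here.
    static int countPlaceholders(String sql) {
        int n = 0;
        for (int i = 0; i < sql.length(); i++) {
            if (sql.charAt(i) == '?') {
                n++;
            }
        }
        return n;
    }
}
```

Valid parameter indices then run from 1 up to `countPlaceholders(sql)`, one setter per column per row.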

Modified: java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/monitoring/EcalLedSequenceStandalone.lcsim
 =============================================================================
--- java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/monitoring/EcalLedSequenceStandalone.lcsim	(original)
+++ java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/monitoring/EcalLedSequenceStandalone.lcsim	Tue Dec  1 15:55:47 2015
@@ -38,7 +38,7 @@
 	           <doFullAnalysis>false</doFullAnalysis>
 	           <skipMin>0.25</skipMin>
 	           <skipInitial>0.05</skipInitial>    
-	           <useRawEnergy>false</useRawEnergy>
+	           <useRawEnergy>true</useRawEnergy>
 	           <energyCut>1</energyCut>
 	           <nEventsMin>300</nEventsMin>
 	           <evnMinDraw>0.</evnMinDraw>

Modified: java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/readout/HPSReconNoReadout.lcsim
 =============================================================================
--- java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/readout/HPSReconNoReadout.lcsim	(original)
+++ java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/readout/HPSReconNoReadout.lcsim	Tue Dec  1 15:55:47 2015
@@ -1,6 +1,11 @@
 
 <lcsim xmlns:xs="http://www.w3.org/2001/XMLSchema-instance" 
        xs:noNamespaceSchemaLocation="http://www.lcsim.org/schemas/lcsim/1.0/lcsim.xsd">
+<!--
+    <control>
+        <numberOfEvents>20000</numberOfEvents>
+    </control>
+-->
     <execute>
         <driver name="EventMarkerDriver"/>   
         <driver name="ReconClusterer" />
@@ -46,7 +51,17 @@
             <ecalClusterCollectionName>EcalClustersCorr</ecalClusterCollectionName>        
             <trackCollectionNames>MatchedTracks GBLTracks</trackCollectionNames>
         </driver>  
-        <driver name="GBLOutputDriver" type="org.hps.recon.tracking.gbl.GBLOutputDriver"/>      
+        <driver name="GBLOutputDriver" type="org.hps.recon.tracking.gbl.GBLOutputDriver">
+            <debug>0</debug>
+            <isMC>false</isMC>
+            <gblFileName>${outputFile}.gbl</gblFileName>
+            <addBeamspot>false</addBeamspot>
+            <beamspotScatAngle>0.005</beamspotScatAngle>
+            <beamspotWidthZ>0.05</beamspotWidthZ>
+            <beamspotWidthY>0.2</beamspotWidthY>
+            <beamspotTiltZOverY>0.26</beamspotTiltZOverY>
+            <beamspotPosition>0.0 -0.11 -0.05</beamspotPosition>
+        </driver> 
         <driver name="GBLRefitterDriver" type="org.hps.recon.tracking.gbl.HpsGblRefitter"/>
         <driver name="LCIOWriter" type="org.lcsim.util.loop.LCIODriver">
             <outputFilePath>${outputFile}</outputFilePath>

Modified: java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/recon/EngineeringRun2015FullReconMC_Pass2.lcsim
 =============================================================================
--- java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/recon/EngineeringRun2015FullReconMC_Pass2.lcsim	(original)
+++ java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/recon/EngineeringRun2015FullReconMC_Pass2.lcsim	Tue Dec  1 15:55:47 2015
@@ -119,7 +119,7 @@
             <rmsTimeCut>8.0</rmsTimeCut>
         </driver>       
         <driver name="MergeTrackCollections" type="org.hps.recon.tracking.MergeTrackCollections" />
-        <driver name="GBLOutputDriver" type="org.hps.recon.tracking.gbl.GBLOutputDriver"/>             
+        <driver name="GBLOutputDriver" type="org.hps.recon.tracking.gbl.GBLOutputDriver"/>
         <driver name="GBLRefitterDriver" type="org.hps.recon.tracking.gbl.HpsGblRefitter"/>
         <driver name="EcalRawConverter" type="org.hps.recon.ecal.EcalRawConverterDriver">
             <ecalCollectionName>EcalCalHits</ecalCollectionName>
@@ -153,5 +153,6 @@
         <driver name="AidaSaveDriver" type="org.lcsim.job.AidaSaveDriver">
             <outputFileName>${outputFile}.root</outputFileName>
         </driver>
+        
     </drivers>
 </lcsim>

Modified: java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/EngineeringRun2015FullReconMC_Pass2_Gbl.lcsim
 =============================================================================
--- java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/EngineeringRun2015FullReconMC_Pass2_Gbl.lcsim	(original)
+++ java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/EngineeringRun2015FullReconMC_Pass2_Gbl.lcsim	Tue Dec  1 15:55:47 2015
@@ -166,10 +166,11 @@
         <driver name="AidaSaveDriver" type="org.lcsim.job.AidaSaveDriver">
             <outputFileName>${outputFile}.root</outputFileName>
         </driver>
+        <!--
         <driver name="TriggerTurnOnDriver" type="org.hps.analysis.trigger.TriggerTurnOnDriver">
             <isMC>True</isMC>
         </driver>
         <driver name="TriggerTurnOnSSPDriver" type="org.hps.analysis.trigger.TriggerTurnOnSSPDriver"/>
-        
+        -->
     </drivers>
 </lcsim>

Modified: java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/EngineeringRun2015FullRecon_Pass2_Gbl.lcsim
 =============================================================================
--- java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/EngineeringRun2015FullRecon_Pass2_Gbl.lcsim	(original)
+++ java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/EngineeringRun2015FullRecon_Pass2_Gbl.lcsim	Tue Dec  1 15:55:47 2015
@@ -144,6 +144,12 @@
             <debug>0</debug>
             <isMC>false</isMC>
             <gblFileName>${outputFile}.gbl</gblFileName>
+            <addBeamspot>false</addBeamspot>
+            <beamspotScatAngle>0.005</beamspotScatAngle>
+            <beamspotWidthZ>0.05</beamspotWidthZ>
+            <beamspotWidthY>0.2</beamspotWidthY>
+            <beamspotTiltZOverY>0.26</beamspotTiltZOverY>
+            <beamspotPosition>0.0 -0.11 -0.05</beamspotPosition>
         </driver> 
         <driver name="GBLRefitterDriver" type="org.hps.recon.tracking.gbl.HpsGblRefitter">
             <debug>false</debug>
@@ -159,7 +165,7 @@
         <driver name="GblResidualEcalDriver" type="org.hps.users.phansson.ECalExtrapolationDriver"/>   
         <driver name="TrackExtrapolationTestDriver" type="org.hps.users.phansson.TrackExtrapolationTestDriver"/>   
         <driver name="TrackingReconstructionPlots" type="org.hps.users.phansson.TrackingReconstructionPlots">
-            <showPlots>True</showPlots>
+            <showPlots>False</showPlots>
         </driver>
         <driver name="TimerDriver1" type="org.hps.util.TimerDriver"/>
         <driver name="GeomChecker" type="org.hps.users.phansson.TrackingGeometryChecker"/>

Copied: java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/Occupancy.lcsim (from r3968, java/trunk/steering-files/src/main/resources/org/hps/steering/users/phansson/Occupancy.lcsim)
 =============================================================================
--- java/trunk/steering-files/src/main/resources/org/hps/steering/users/phansson/Occupancy.lcsim	(original)
+++ java/branches/jeremy-dev/steering-files/src/main/resources/org/hps/steering/users/phansson/Occupancy.lcsim	Tue Dec  1 15:55:47 2015
@@ -59,8 +59,11 @@
         <driver name="SensorOccupancyDriver" type="org.hps.monitoring.drivers.svt.SensorOccupancyPlotsDriver">
             <enablePositionPlots>True</enablePositionPlots>
             <eventRefreshRate>100</eventRefreshRate>
-            <enableTriggerFilter>True</enableTriggerFilter>
-            <filterPair1Triggers>True</filterPair1Triggers>
+            <enableTriggerFilter>False</enableTriggerFilter>
+            <filterPair1Triggers>False</filterPair1Triggers>
+            <filterPulserTriggers>True</filterPulserTriggers>
+            <timeWindowWeight>3.0</timeWindowWeight>
+            <maxSamplePosition>1</maxSamplePosition>
         </driver>
 
     </drivers>

Modified: java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/TrackUtils.java
 =============================================================================
--- java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/TrackUtils.java	(original)
+++ java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/TrackUtils.java	Tue Dec  1 15:55:47 2015
@@ -785,24 +785,37 @@
         return !isTopTrack(htf);
     }
 
+    
     /**
      * Transform MCParticle into a Helix object. Note that it produces the helix
      * parameters at nominal x=0 and assumes that there is no field at x<0
      *
      * @param mcp MC particle to be transformed
-     * @return helix object based on the MC particle
+     * @return {@link HelicalTrackFit} object based on the MC particle
      */
     public static HelicalTrackFit getHTF(MCParticle mcp, double Bz) {
-        boolean debug = true;
-        if (debug) {
-            System.out.printf("getHTF\n");
-            System.out.printf("mcp org %s mc p %s\n", mcp.getOrigin().toString(), mcp.getMomentum().toString());
-        }
-        Hep3Vector org = CoordinateTransformations.transformVectorToTracking(mcp.getOrigin());
+        return getHTF(mcp, mcp.getOrigin(), Bz);
+    }
+        
+    
+    
+    /**
+     * Transform MCParticle into a Helix object. Note that it produces the helix
+     * parameters at nominal x=0 and assumes that there is no field at x<0
+     *
+     * @param mcp MC particle to be transformed
+     * @param origin origin to be used for the track
+     * @return {@link HelicalTrackFit} object based on the MC particle
+     */
+    public static HelicalTrackFit getHTF(MCParticle mcp, Hep3Vector origin, double Bz) {
+        boolean debug = false;
+        
+        if (debug) System.out.printf("getHTF\nmcp org %s origin used %s mc p %s\n", mcp.getOrigin().toString(),origin.toString(), mcp.getMomentum().toString());
+        
+        Hep3Vector org = CoordinateTransformations.transformVectorToTracking(origin);
         Hep3Vector p = CoordinateTransformations.transformVectorToTracking(mcp.getMomentum());
 
-        if (debug)
-            System.out.printf("mcp org %s mc p %s (trans)\n", org.toString(), p.toString());
+        if (debug) System.out.printf("mcp org %s mc p %s (trans)\n", org.toString(), p.toString());
 
         // Move to x=0 if needed
         double targetX = BeamlineConstants.DIPOLE_EDGELOW_TESTRUN;
@@ -820,8 +833,7 @@
             // old.toString(),p.toString(),org.toString());
         }
 
-        if (debug)
-            System.out.printf("mcp org %s mc p %s (trans2)\n", org.toString(), p.toString());
+        if (debug) System.out.printf("mcp org %s mc p %s (trans2)\n", org.toString(), p.toString());
 
         HelixParamCalculator helixParamCalculator = new HelixParamCalculator(p, org, -1 * ((int) mcp.getCharge()), Bz);
         double par[] = new double[5];
@@ -831,8 +843,7 @@
         par[HelicalTrackFit.curvatureIndex] = 1.0 / helixParamCalculator.getRadius();
         par[HelicalTrackFit.z0Index] = helixParamCalculator.getZ0();
         HelicalTrackFit htf = getHTF(par);
-        System.out.printf("d0 %f z0 %f R %f phi %f lambda %s\n",
-                htf.dca(), htf.z0(), htf.R(), htf.phi0(), htf.slope());
+        if(debug) System.out.printf("d0 %f z0 %f R %f phi %f lambda %s\n", htf.dca(), htf.z0(), htf.R(), htf.phi0(), htf.slope());
         return htf;
     }
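`HelixParamCalculator` (used above) encapsulates the conversion from momentum, origin, charge, and Bz to helix parameters. For orientation only, the underlying textbook relation for the bend radius can be sketched as follows — this is the standard pT/(0.3·q·B) formula, not necessarily the exact sign and unit conventions of the HPS code:

```java
// Illustrative only: textbook bend radius of a charged particle in a uniform
// field, R [m] = pT [GeV] / (0.2998 * |q| [e] * B [T]). HelixParamCalculator
// may use different sign/unit conventions; this is not the HPS implementation.
final class HelixMath {
    static double bendRadiusMeters(double pTGeV, int chargeE, double bTesla) {
        return pTGeV / (0.2998 * Math.abs(chargeE) * bTesla);
    }
}
```

The curvature parameter stored in `HelicalTrackFit.curvatureIndex` is then 1/R in the calculator's own units.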
 

Modified: java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLOutput.java
 =============================================================================
--- java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLOutput.java	(original)
+++ java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLOutput.java	Tue Dec  1 15:55:47 2015
@@ -58,15 +58,16 @@
     private final MaterialSupervisor materialManager;
     private final MultipleScattering _scattering;
     private final double _beamEnergy = 1.1; //GeV
-    private boolean AprimeEvent = false; // do extra checks
     private boolean hasXPlanes = false;
     private boolean addBeamspot = false;
     private double beamspotTiltZOverY = 0; //Math.PI/180* 15;
     private double beamspotScatAngle = 0.000001;
-    // beam spot with in tracking frame
+    // beam spot, in tracking frame
     private double beamspotWidthZ = 0.05;
     private double beamspotWidthY = 0.150;
-    double beamspotPosition[] = {0,0,0};
+    private double[] beamspotPosition = {0, 0, 0};
+    // human readable ID for beam spot     
+    private final int iBeamspotHit = -1; 
     
 
     /**
@@ -138,10 +139,6 @@
         }
     }
 
-    void setAPrimeEventFlag(boolean flag) {
-        this.AprimeEvent = flag;
-    }
-
     void setXPlaneFlag(boolean flag) {
         this.hasXPlanes = flag;
     }
@@ -168,25 +165,61 @@
 
         // Find the truth particle of the track
         MCParticle mcp = null;
-
+        MCParticle ap = null;
+        
+        // MC processing
         if (isMC) {
+            
+            // find the truth particle for this track
             mcp = getMatchedTruthParticle(trk);
 
+            // check if this is an A' event
+            for(MCParticle part : mcParticles) {
+                if(Math.abs(part.getPDGID()) == 622) {
+                    ap = part;
+                    break;
+                }
+            }
+            
             if (mcp == null) {
                 System.out.printf("%s: WARNING!! no truth particle found in event!\n", this.getClass().getSimpleName());
                 this.printMCParticles(mcParticles);
                 //System.exit(1);
-            } else if (_debug > 0) {
-                System.out.printf("%s: truth particle (pdgif %d ) found in event!\n", this.getClass().getSimpleName(), mcp.getPDGID());
-            }
-
-            if (AprimeEvent) {
-                checkAprimeTruth(mcp, mcParticles);
-            }
-        }
+            } else {
+                if (_debug > 0) System.out.printf("%s: truth particle (pdgif %d ) found in event!\n", this.getClass().getSimpleName(), mcp.getPDGID());
+
+                // If this is an A' event, do some more checks
+                if ( ap != null) {
+                    // A few MC files have broken links between parents and daughters.
+                    // This causes the MC particle to appear to come from the origin even if the decay happened somewhere else.
+                    if(this.getAprimeDecayProducts(mcParticles).size()>0) {
+                        //do a full check
+                        checkAprimeTruth(mcp, mcParticles);
+                    }
+                }
+            }
+        }        
 
         // Get track parameters from MC particle 
-        HelicalTrackFit htfTruth = (isMC && mcp != null) ? TrackUtils.getHTF(mcp, -1.0 * this.bFieldVector.z()) : null;
+        HelicalTrackFit htfTruth = null;
+        
+        if( isMC && mcp != null) {
+            // check if we should be using a different origin than the particle tells us
+            Hep3Vector mcp_origin;
+            if( ap != null) {
+                // There is an A' here. Use its origin if different
+                if (_debug > 0) System.out.printf("%s: A' found with origin  %s compared to particle %s (diff: %s)\n", this.getClass().getSimpleName(), ap.getOrigin().toString(), mcp.getOrigin().toString(), VecOp.sub(ap.getOrigin(), mcp.getOrigin()).toString());
+                if(VecOp.sub(ap.getOrigin(), mcp.getOrigin()).magnitude() > 0.00001)
+                    mcp_origin = ap.getOrigin();
+                else
+                    mcp_origin = mcp.getOrigin();
+            } else {
+                // No A', use particle origin
+                mcp_origin = mcp.getOrigin();
+            }
+
+            htfTruth = TrackUtils.getHTF(mcp, mcp_origin, -1.0 * this.bFieldVector.z());
+        }
 
         // Use the truth helix as the initial track for GBL?
         //htf = htfTruth;
@@ -274,7 +307,6 @@
         
         
         int istrip = 0;
-        final int iBeamspotHit = -1; // human readable ID for beam spot 
         int beamSpotMillepedeId = 98; // just a random int number that I came up with
         
         for (int ihit = -1; ihit != hits.size(); ++ihit) {
@@ -282,7 +314,7 @@
             HelicalTrackHit hit = null;
             HelicalTrackCross htc = null;
             List<HelicalTrackStrip> strips;
-            List<MCParticle> hitMCParticles = null;
+            List<MCParticle> hitMCParticles = new ArrayList<MCParticle>();
             Hep3Vector correctedHitPosition = null;
 
             // Add beamspot first
@@ -390,7 +422,7 @@
                 SimTrackerHit simHit = simHitsLayerMap.get(strip.layer());
 
                 if (isMC) {
-                    if (simHit == null) {
+                    if (simHit == null && ihit != iBeamspotHit) {
                         System.out.printf("%s: no sim hit for strip hit at layer %d\n", this.getClass().getSimpleName(), strip.layer());
                         System.out.printf("%s: it as %d mc particles associated with it:\n", this.getClass().getSimpleName(), hitMCParticles.size());
                         for (MCParticle particle : hitMCParticles) {
@@ -676,14 +708,57 @@
     }
 
     
+    
+    private List<MCParticle> getAprimeDecayProducts(List<MCParticle> mcParticles) {
+        List<MCParticle> pair = new ArrayList<MCParticle>();
+        for (MCParticle mcp : mcParticles) {
+            if (mcp.getGeneratorStatus() != MCParticle.FINAL_STATE) {
+                continue;
+            }
+            boolean hasAprimeParent = false;
+            for (MCParticle parent : mcp.getParents()) {
+                if (Math.abs(parent.getPDGID()) == 622) {
+                    hasAprimeParent = true;
+                }
+            }
+            if (hasAprimeParent) {
+                pair.add(mcp);
+            }
+        }
+        
+        return pair;
+
+    }
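After this refactoring, `getAprimeDecayProducts` only filters — keeping final-state particles with an A' (PDG 622) parent — while the pair-count and charge-sign validation moves to `checkAprimeTruth`. The filter logic can be sketched generically with a toy particle type (`ToyParticle` and `daughtersOf` are hypothetical names for this sketch, not the lcsim `MCParticle` API):

```java
import java.util.ArrayList;
import java.util.List;

// Toy particle with just the fields the filter needs.
class ToyParticle {
    final int pdgId;
    final boolean finalState;
    final List<ToyParticle> parents = new ArrayList<>();
    ToyParticle(int pdgId, boolean finalState) {
        this.pdgId = pdgId;
        this.finalState = finalState;
    }
}

final class DecayFilter {
    // Keep final-state particles that have at least one parent with |PDG| == parentPdg.
    static List<ToyParticle> daughtersOf(List<ToyParticle> particles, int parentPdg) {
        List<ToyParticle> out = new ArrayList<>();
        for (ToyParticle p : particles) {
            if (!p.finalState) {
                continue;
            }
            for (ToyParticle parent : p.parents) {
                if (Math.abs(parent.pdgId) == parentPdg) {
                    out.add(p);
                    break;
                }
            }
        }
        return out;
    }
}
```

Keeping the filter side-effect-free and doing the `size() == 2` / e+e- / opposite-sign checks separately (as the new `checkAprimeTruth` does) lets the filter be reused on events with zero decay products.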
+
    
 
     private void checkAprimeTruth(MCParticle mcp, List<MCParticle> mcParticles) {
+        
         List<MCParticle> mcp_pair = getAprimeDecayProducts(mcParticles);
 
+        
+        if (mcp_pair.size() != 2) {
+            System.out.printf("%s: ERROR this event has %d mcp with 622 as parent!!??  \n", this.getClass().getSimpleName(), mcp_pair.size());
+            this.printMCParticles(mcParticles);
+            System.exit(1);
+        }
+        if (Math.abs(mcp_pair.get(0).getPDGID()) != 11 || Math.abs(mcp_pair.get(1).getPDGID()) != 11) {
+            System.out.printf("%s: ERROR decay products are not e+e-? \n", this.getClass().getSimpleName());
+            this.printMCParticles(mcParticles);
+            System.exit(1);
+        }
+        if (mcp_pair.get(0).getPDGID() * mcp_pair.get(1).getPDGID() > 0) {
+            System.out.printf("%s: ERROR decay products have the same sign? \n", this.getClass().getSimpleName());
+            this.printMCParticles(mcParticles);
+            System.exit(1);
+        }
+        
+        
+        
         if (_debug > 0) {
             double invMassTruth = Math.sqrt(Math.pow(mcp_pair.get(0).getEnergy() + mcp_pair.get(1).getEnergy(), 2) - VecOp.add(mcp_pair.get(0).getMomentum(), mcp_pair.get(1).getMomentum()).magnitudeSquared());
             double invMassTruthTrks = getInvMassTracks(TrackUtils.getHTF(mcp_pair.get(0), -1.0 * this.bFieldVector.z()), TrackUtils.getHTF(mcp_pair.get(1), -1.0 * this.bFieldVector.z()));
+            
             System.out.printf("%s: invM = %f\n", this.getClass().getSimpleName(), invMassTruth);
             System.out.printf("%s: invMTracks = %f\n", this.getClass().getSimpleName(), invMassTruthTrks);
         }
@@ -894,40 +969,6 @@
         return chi2.e(0, 0);
     }
 
-    private List<MCParticle> getAprimeDecayProducts(List<MCParticle> mcParticles) {
-        List<MCParticle> pair = new ArrayList<MCParticle>();
-        for (MCParticle mcp : mcParticles) {
-            if (mcp.getGeneratorStatus() != MCParticle.FINAL_STATE) {
-                continue;
-            }
-            boolean hasAprimeParent = false;
-            for (MCParticle parent : mcp.getParents()) {
-                if (Math.abs(parent.getPDGID()) == 622) {
-                    hasAprimeParent = true;
-                }
-            }
-            if (hasAprimeParent) {
-                pair.add(mcp);
-            }
-        }
-        if (pair.size() != 2) {
-            System.out.printf("%s: ERROR this event has %d mcp with 622 as parent!!??  \n", this.getClass().getSimpleName(), pair.size());
-            this.printMCParticles(mcParticles);
-            System.exit(1);
-        }
-        if (Math.abs(pair.get(0).getPDGID()) != 11 || Math.abs(pair.get(1).getPDGID()) != 11) {
-            System.out.printf("%s: ERROR decay products are not e+e-? \n", this.getClass().getSimpleName());
-            this.printMCParticles(mcParticles);
-            System.exit(1);
-        }
-        if (pair.get(0).getPDGID() * pair.get(1).getPDGID() > 0) {
-            System.out.printf("%s: ERROR decay products have the same sign? \n", this.getClass().getSimpleName());
-            this.printMCParticles(mcParticles);
-            System.exit(1);
-        }
-        return pair;
-
-    }
 
     private void printMCParticles(List<MCParticle> mcParticles) {
         System.out.printf("%s: printMCParticles \n", this.getClass().getSimpleName());

Modified: java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLOutputDriver.java
 =============================================================================
--- java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLOutputDriver.java	(original)
+++ java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLOutputDriver.java	Tue Dec  1 15:55:47 2015
@@ -79,7 +79,6 @@
         gbl = new GBLOutput(gblFileName, bfield); // if filename is empty no text file is written
         gbl.setDebug(_debug);
         gbl.buildModel(detector);
-        gbl.setAPrimeEventFlag(false);
         gbl.setXPlaneFlag(false);
         gbl.setAddBeamspot(addBeamspot);
         gbl.setBeamspotScatAngle(beamspotScatAngle);

Modified: java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLRefitterDriver.java
 =============================================================================
--- java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLRefitterDriver.java	(original)
+++ java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GBLRefitterDriver.java	Tue Dec  1 15:55:47 2015
@@ -6,6 +6,7 @@
 import java.util.List;
 import java.util.Map;
 import java.util.Set;
+import org.apache.commons.math3.util.Pair;
 import org.hps.recon.tracking.MaterialSupervisor;
 import org.hps.recon.tracking.MultipleScattering;
 import org.hps.recon.tracking.TrackUtils;
@@ -68,9 +69,9 @@
 
         Map<Track, Track> inputToRefitted = new HashMap<Track, Track>();
         for (Track track : tracks) {
-            Track newTrack = GblUtils.refitTrack(TrackUtils.getHTF(track), TrackUtils.getStripHits(track, hitToStrips, hitToRotated), track.getTrackerHits(), 5, _scattering, bfield);
-            refittedTracks.add(newTrack);
-            inputToRefitted.put(track, newTrack);
+            Pair<Track, GBLKinkData> newTrack = MakeGblTracks.refitTrack(TrackUtils.getHTF(track), TrackUtils.getStripHits(track, hitToStrips, hitToRotated), track.getTrackerHits(), 5, _scattering, bfield);
+            refittedTracks.add(newTrack.getFirst());
+            inputToRefitted.put(track, newTrack.getFirst());
         }
 
         if (mergeTracks) {
@@ -105,8 +106,8 @@
                         }
                     }
 
-                    Track mergedTrack = GblUtils.refitTrack(TrackUtils.getHTF(track), TrackUtils.getStripHits(track, hitToStrips, hitToRotated), allHth, 5, _scattering, bfield);
-                    mergedTracks.add(mergedTrack);
+                    Pair<Track, GBLKinkData> mergedTrack = MakeGblTracks.refitTrack(TrackUtils.getHTF(track), TrackUtils.getStripHits(track, hitToStrips, hitToRotated), allHth, 5, _scattering, bfield);
+                    mergedTracks.add(mergedTrack.getFirst());
 //                    System.out.format("%f %f %f\n", fit.get_chi2(), inputToRefitted.get(track).getChi2(), inputToRefitted.get(otherTrack).getChi2());
 //                mergedTrackToTrackList.put(mergedTrack, new ArrayList<Track>());
                 }
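The hunks above switch GBLRefitterDriver from `GblUtils.refitTrack` (returning a bare `Track`) to `MakeGblTracks.refitTrack`, which now returns the track together with its kink data. A minimal self-contained sketch of the new calling convention, using hypothetical stand-in types (the real code uses `org.apache.commons.math3.util.Pair` and the HPS `Track`/`GBLKinkData` classes):

```java
public class PairConventionDemo {

    // Minimal stand-in for org.apache.commons.math3.util.Pair
    static final class Pair<A, B> {
        private final A first;
        private final B second;
        Pair(A first, B second) { this.first = first; this.second = second; }
        A getFirst() { return first; }
        B getSecond() { return second; }
    }

    // Hypothetical stand-ins for the HPS types involved in the refactor
    static final class Track { final double chi2; Track(double chi2) { this.chi2 = chi2; } }
    static final class GBLKinkData { final int nKinks; GBLKinkData(int n) { this.nKinks = n; } }

    // Shape of the refactored API: one call yields track + kink data together
    static Pair<Track, GBLKinkData> refitTrack(double chi2, int nKinks) {
        return new Pair<>(new Track(chi2), new GBLKinkData(nKinks));
    }

    public static void main(String[] args) {
        Pair<Track, GBLKinkData> result = refitTrack(12.5, 11);
        Track refitted = result.getFirst();      // goes into the refitted track collection
        GBLKinkData kinks = result.getSecond();  // goes into the kink-data collection
        System.out.println(refitted.chi2 + " " + kinks.nKinks);
    }
}
```

Callers that only need the track, as in the loops above, simply take `getFirst()` and discard the kink data.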

Modified: java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GblUtils.java
 =============================================================================
--- java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GblUtils.java	(original)
+++ java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/GblUtils.java	Tue Dec  1 15:55:47 2015
@@ -1,34 +1,11 @@
 package org.hps.recon.tracking.gbl;
 
 import hep.physics.matrix.BasicMatrix;
-import hep.physics.vec.BasicHep3Vector;
-import hep.physics.vec.Hep3Matrix;
-import hep.physics.vec.Hep3Vector;
-import hep.physics.vec.VecOp;
-import java.util.ArrayList;
-import java.util.Collection;
-import java.util.Collections;
-import java.util.Comparator;
-import java.util.List;
-import org.hps.recon.tracking.CoordinateTransformations;
 import org.hps.recon.tracking.MaterialSupervisor;
 import org.hps.recon.tracking.MultipleScattering;
-import org.hps.recon.tracking.TrackUtils;
-import org.lcsim.constants.Constants;
 import org.lcsim.detector.IDetectorElement;
-import org.lcsim.detector.ITransform3D;
-import org.lcsim.detector.tracker.silicon.ChargeCarrier;
-import org.lcsim.detector.tracker.silicon.HpsSiSensor;
-import org.lcsim.detector.tracker.silicon.SiSensor;
-import org.lcsim.detector.tracker.silicon.SiSensorElectrodes;
-import org.lcsim.event.RawTrackerHit;
-import org.lcsim.event.Track;
-import org.lcsim.event.TrackerHit;
 import org.lcsim.fit.helicaltrack.HelicalTrackFit;
-import org.lcsim.fit.helicaltrack.HelicalTrackStrip;
 import org.lcsim.fit.helicaltrack.HelixUtils;
-import org.lcsim.recon.tracking.digitization.sisim.SiTrackerHitStrip1D;
-import org.lcsim.recon.tracking.digitization.sisim.TrackerHitType;
 import org.lcsim.recon.tracking.seedtracker.ScatterAngle;
 
 /**
@@ -128,216 +105,4 @@
             throw new UnsupportedOperationException("Should not happen. This problem is only solved with the MaterialSupervisor.");
         }
     }
-
-    /**
-     * Do a GBL fit to an arbitrary set of strip hits, with a starting value of
-     * the helix parameters.
-     *
-     * @param helix Initial helix parameters. Only track parameters are used
-     * (not covariance)
-     * @param stripHits Strip hits to be used for the GBL fit. Does not need to
-     * be in sorted order.
-     * @param hth Stereo hits for the track's hit list (these are not used in
-     * the GBL fit). Does not need to be in sorted order.
-     * @param nIterations Number of times to iterate the GBL fit.
-     * @param scattering Multiple scattering manager.
-     * @param bfield B-field
-     * @return The refitted track.
-     */
-    public static Track refitTrack(HelicalTrackFit helix, Collection<TrackerHit> stripHits, Collection<TrackerHit> hth, int nIterations, MultipleScattering scattering, double bfield) {
-        List<TrackerHit> allHthList = sortHits(hth);
-        List<TrackerHit> sortedStripHits = sortHits(stripHits);
-        FittedGblTrajectory fit = GblUtils.doGBLFit(helix, sortedStripHits, scattering, bfield, 0);
-        for (int i = 0; i < nIterations; i++) {
-            Track newTrack = MakeGblTracks.makeCorrectedTrack(fit, helix, allHthList, 0, bfield);
-            helix = TrackUtils.getHTF(newTrack);
-            fit = GblUtils.doGBLFit(helix, sortedStripHits, scattering, bfield, 0);
-        }
-        Track mergedTrack = MakeGblTracks.makeCorrectedTrack(fit, helix, allHthList, 0, bfield);
-        return mergedTrack;
-    }
-
-    public static FittedGblTrajectory doGBLFit(HelicalTrackFit htf, List<TrackerHit> stripHits, MultipleScattering _scattering, double bfield, int debug) {
-        List<GBLStripClusterData> stripData = makeStripData(htf, stripHits, _scattering, bfield, debug);
-        double bfac = Constants.fieldConversion * bfield;
-
-        FittedGblTrajectory fit = HpsGblRefitter.fit(stripData, bfac, debug > 0);
-        return fit;
-    }
-
-    public static List<GBLStripClusterData> makeStripData(HelicalTrackFit htf, List<TrackerHit> stripHits, MultipleScattering _scattering, double _B, int _debug) {
-        List<GBLStripClusterData> stripClusterDataList = new ArrayList<GBLStripClusterData>();
-
-        // Find scatter points along the path
-        MultipleScattering.ScatterPoints scatters = _scattering.FindHPSScatterPoints(htf);
-
-        if (_debug > 0) {
-            System.out.printf("perPar covariance matrix\n%s\n", htf.covariance().toString());
-        }
-
-        for (TrackerHit stripHit : stripHits) {
-            HelicalTrackStripGbl strip;
-            if (stripHit instanceof SiTrackerHitStrip1D) {
-                strip = new HelicalTrackStripGbl(makeDigiStrip((SiTrackerHitStrip1D) stripHit), true);
-            } else {
-                SiTrackerHitStrip1D newHit = new SiTrackerHitStrip1D(stripHit);
-                strip = new HelicalTrackStripGbl(makeDigiStrip(newHit), true);
-            }
-
-            // find Millepede layer definition from DetectorElement
-            HpsSiSensor sensor = (HpsSiSensor) ((RawTrackerHit) stripHit.getRawHits().get(0)).getDetectorElement();
-
-            int millepedeId = sensor.getMillepedeId();
-
-            if (_debug > 0) {
-                System.out.printf("layer %d millepede %d (DE=\"%s\", origin %s) \n", strip.layer(), millepedeId, sensor.getName(), strip.origin().toString());
-            }
-
-            //Center of the sensor
-            Hep3Vector origin = strip.origin();
-
-            //Find intercept point with sensor in tracking frame
-            Hep3Vector trkpos = TrackUtils.getHelixPlaneIntercept(htf, strip, Math.abs(_B));
-            if (trkpos == null) {
-                if (_debug > 0) {
-                    System.out.println("Can't find track intercept; use sensor origin");
-                }
-                trkpos = strip.origin();
-            }
-            if (_debug > 0) {
-                System.out.printf("trkpos at intercept [%.10f %.10f %.10f]\n", trkpos.x(), trkpos.y(), trkpos.z());
-            }
-
-            //GBLDATA
-            GBLStripClusterData stripData = new GBLStripClusterData(millepedeId);
-            //Add to output list
-            stripClusterDataList.add(stripData);
-
-            //path length to intercept
-            double s = HelixUtils.PathToXPlane(htf, trkpos.x(), 0, 0).get(0);
-            double s3D = s / Math.cos(Math.atan(htf.slope()));
-
-            //GBLDATA
-            stripData.setPath(s);
-            stripData.setPath3D(s3D);
-
-            //GBLDATA
-            stripData.setU(strip.u());
-            stripData.setV(strip.v());
-            stripData.setW(strip.w());
-
-            //Print track direction at intercept
-            Hep3Vector tDir = HelixUtils.Direction(htf, s);
-            double phi = htf.phi0() - s / htf.R();
-            double lambda = Math.atan(htf.slope());
-
-            //GBLDATA
-            stripData.setTrackDir(tDir);
-            stripData.setTrackPhi(phi);
-            stripData.setTrackLambda(lambda);
-
-            //Print residual in measurement system
-            // start by find the distance vector between the center and the track position
-            Hep3Vector vdiffTrk = VecOp.sub(trkpos, origin);
-
-            // then find the rotation from tracking to measurement frame
-            Hep3Matrix trkToStripRot = getTrackToStripRotation(sensor);
-
-            // then rotate that vector into the measurement frame to get the predicted measurement position
-            Hep3Vector trkpos_meas = VecOp.mult(trkToStripRot, vdiffTrk);
-
-            //GBLDATA
-            stripData.setMeas(strip.umeas());
-            stripData.setTrackPos(trkpos_meas);
-            stripData.setMeasErr(strip.du());
-
-            if (_debug > 1) {
-                System.out.printf("rotation matrix to meas frame\n%s\n", VecOp.toString(trkToStripRot));
-                System.out.printf("tPosGlobal %s origin %s\n", trkpos.toString(), origin.toString());
-                System.out.printf("tDiff %s\n", vdiffTrk.toString());
-                System.out.printf("tPosMeas %s\n", trkpos_meas.toString());
-            }
-
-            if (_debug > 0) {
-                System.out.printf("layer %d millePedeId %d uRes %.10f\n", strip.layer(), millepedeId, stripData.getMeas() - stripData.getTrackPos().x());
-            }
-
-            // find scattering angle
-            MultipleScattering.ScatterPoint scatter = scatters.getScatterPoint(((RawTrackerHit) strip.getStrip().rawhits().get(0)).getDetectorElement());
-            double scatAngle;
-
-            if (scatter != null) {
-                scatAngle = scatter.getScatterAngle().Angle();
-            } else {
-                if (_debug > 0) {
-                    System.out.printf("WARNING cannot find scatter for detector %s with strip cluster at %s\n", ((RawTrackerHit) strip.getStrip().rawhits().get(0)).getDetectorElement().getName(), strip.origin().toString());
-                }
-                scatAngle = GblUtils.estimateScatter(sensor, htf, _scattering, _B);
-            }
-
-            //GBLDATA
-            stripData.setScatterAngle(scatAngle);
-        }
-        return stripClusterDataList;
-    }
-
-    private static Hep3Matrix getTrackToStripRotation(SiSensor sensor) {
-        // This function transforms the hit to the sensor coordinates
-
-        // Transform from JLab frame to sensor frame (done through the RawTrackerHit)
-        SiSensorElectrodes electrodes = sensor.getReadoutElectrodes(ChargeCarrier.HOLE);
-        ITransform3D detToStrip = electrodes.getGlobalToLocal();
-        // Get rotation matrix
-        Hep3Matrix detToStripMatrix = detToStrip.getRotation().getRotationMatrix();
-        // Transformation between the JLAB and tracking coordinate systems
-        Hep3Matrix detToTrackMatrix = CoordinateTransformations.getMatrix();
-
-        return VecOp.mult(detToStripMatrix, VecOp.inverse(detToTrackMatrix));
-    }
-
-    private static HelicalTrackStrip makeDigiStrip(SiTrackerHitStrip1D h) {
-        SiTrackerHitStrip1D local = h.getTransformedHit(TrackerHitType.CoordinateSystem.SENSOR);
-        SiTrackerHitStrip1D global = h.getTransformedHit(TrackerHitType.CoordinateSystem.GLOBAL);
-
-        ITransform3D trans = local.getLocalToGlobal();
-        Hep3Vector org = trans.transformed(new BasicHep3Vector(0., 0., 0.));
-        Hep3Vector u = global.getMeasuredCoordinate();
-        Hep3Vector v = global.getUnmeasuredCoordinate();
-
-        //rotate to tracking frame
-        Hep3Vector neworigin = CoordinateTransformations.transformVectorToTracking(org);
-        Hep3Vector newu = CoordinateTransformations.transformVectorToTracking(u);
-        Hep3Vector newv = CoordinateTransformations.transformVectorToTracking(v);
-
-        double umeas = local.getPosition()[0];
-        double vmin = VecOp.dot(local.getUnmeasuredCoordinate(), local.getHitSegment().getStartPoint());
-        double vmax = VecOp.dot(local.getUnmeasuredCoordinate(), local.getHitSegment().getEndPoint());
-        double du = Math.sqrt(local.getCovarianceAsMatrix().diagonal(0));
-
-        //don't fill fields we don't use
-//        IDetectorElement de = h.getSensor();
-//        String det = getName(de);
-//        int lyr = getLayer(de);
-//        BarrelEndcapFlag be = getBarrelEndcapFlag(de);
-        double dEdx = h.getdEdx();
-        double time = h.getTime();
-        List<RawTrackerHit> rawhits = h.getRawHits();
-        HelicalTrackStrip strip = new HelicalTrackStrip(neworigin, newu, newv, umeas, du, vmin, vmax, dEdx, time, rawhits, null, -1, null);
-
-        return strip;
-    }
-
-    private static List<TrackerHit> sortHits(Collection<TrackerHit> hits) {
-        List<TrackerHit> hitList = new ArrayList<TrackerHit>(hits);
-        Collections.sort(hitList, new LayerComparator());
-        return hitList;
-    }
-
-    private static class LayerComparator implements Comparator<TrackerHit> {
-
-        @Override
-        public int compare(TrackerHit o1, TrackerHit o2) {
-            return Integer.compare(TrackUtils.getLayer(o1), TrackUtils.getLayer(o2));
-        }
-    }
 }
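Among the helpers removed from GblUtils above (and re-added to MakeGblTracks later in this commit) are `sortHits` and `LayerComparator`, which order hits by layer before the iterative GBL fit. A self-contained sketch of that pattern with a hypothetical stand-in hit type (the real comparator calls `TrackUtils.getLayer(TrackerHit)`):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class SortByLayer {

    // Stand-in for TrackerHit; only the layer index matters here
    static final class Hit {
        final int layer;
        Hit(int layer) { this.layer = layer; }
    }

    // Copy the input and sort ascending by layer, leaving the caller's list intact
    static List<Hit> sortHits(List<Hit> hits) {
        List<Hit> copy = new ArrayList<>(hits);
        copy.sort(Comparator.comparingInt(h -> h.layer));
        return copy;
    }

    public static void main(String[] args) {
        List<Hit> hits = sortHits(List.of(new Hit(3), new Hit(1), new Hit(2)));
        System.out.println(hits.get(0).layer); // smallest layer first
    }
}
```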

Modified: java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/HpsGblRefitter.java
 =============================================================================
--- java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/HpsGblRefitter.java	(original)
+++ java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/HpsGblRefitter.java	Tue Dec  1 15:55:47 2015
@@ -14,8 +14,10 @@
 import java.util.logging.Formatter;
 import java.util.logging.Level;
 import java.util.logging.Logger;
+import org.apache.commons.math3.util.Pair;
 
 import org.hps.recon.tracking.TrackUtils;
+import static org.hps.recon.tracking.gbl.MakeGblTracks.makeCorrectedTrack;
 import org.hps.recon.tracking.gbl.matrix.Matrix;
 import org.hps.recon.tracking.gbl.matrix.SymMatrix;
 import org.hps.recon.tracking.gbl.matrix.Vector;
@@ -25,8 +27,12 @@
 import org.lcsim.event.GenericObject;
 import org.lcsim.event.LCRelation;
 import org.lcsim.event.Track;
+import org.lcsim.event.base.BaseLCRelation;
 import org.lcsim.geometry.Detector;
 import org.lcsim.geometry.compact.converter.MilleParameter;
+import org.lcsim.lcio.LCIOConstants;
+import org.lcsim.recon.tracking.seedtracker.SeedCandidate;
+import org.lcsim.recon.tracking.seedtracker.SeedTrack;
 import org.lcsim.util.Driver;
 
 /**
@@ -40,21 +46,20 @@
 public class HpsGblRefitter extends Driver {
 
     static Formatter f = new BasicLogFormatter();
-    private static Logger LOGGER = Logger.getLogger(HpsGblRefitter.class.getPackage().getName());
+    private final static Logger LOGGER = Logger.getLogger(HpsGblRefitter.class.getPackage().getName());
     private boolean _debug = false;
     private final String trackCollectionName = "MatchedTracks";
     private final String track2GblTrackRelationName = "TrackToGBLTrack";
     private final String gblTrack2StripRelationName = "GBLTrackToStripData";
+    private final String outputTrackCollectionName = "GBLTracks";
 
     private MilleBinary mille;
     private String milleBinaryFileName = MilleBinary.DEFAULT_OUTPUT_FILE_NAME;
     private boolean writeMilleBinary = false;
 
-    private final MakeGblTracks _makeTracks;
-
     public void setDebug(boolean debug) {
         _debug = debug;
-        _makeTracks.setDebug(debug);
+        MakeGblTracks.setDebug(debug);
     }
 
     public void setMilleBinaryFileName(String filename) {
@@ -66,8 +71,7 @@
     }
 
     public HpsGblRefitter() {
-        _makeTracks = new MakeGblTracks();
-        _makeTracks.setDebug(_debug);
+        MakeGblTracks.setDebug(_debug);
         LOGGER.setLevel(Level.WARNING);
         //System.out.println("level " + LOGGER.getLevel().toString());
     }
@@ -179,7 +183,35 @@
         LOGGER.info(stripsGblMap.size() + " tracks in stripsGblMap");
         LOGGER.info(trackFits.size() + " fitted GBL tracks before adding to event");
 
-        _makeTracks.Process(event, trackFits, bfield);
+        List<Track> newTracks = new ArrayList<Track>();
+
+        List<GBLKinkData> kinkDataCollection = new ArrayList<GBLKinkData>();
+
+        List<LCRelation> kinkDataRelations = new ArrayList<LCRelation>();
+
+        LOGGER.info("adding " + trackFits.size() + " of fitted GBL tracks to the event");
+
+        for (FittedGblTrajectory fittedTraj : trackFits) {
+
+            SeedTrack seedTrack = (SeedTrack) fittedTraj.get_seed();
+            SeedCandidate trackseed = seedTrack.getSeedCandidate();
+
+            //  Create a new Track
+            Pair<Track, GBLKinkData> trk = makeCorrectedTrack(fittedTraj, trackseed.getHelix(), seedTrack.getTrackerHits(), seedTrack.getType(), bfield);
+
+            //  Add the track to the list of tracks
+            newTracks.add(trk.getFirst());
+            kinkDataCollection.add(trk.getSecond());
+            kinkDataRelations.add(new BaseLCRelation(trk.getSecond(), trk.getFirst()));
+        }
+
+        LOGGER.info("adding " + Integer.toString(newTracks.size()) + " Gbl tracks to event with " + event.get(Track.class, "MatchedTracks").size() + " matched tracks");
+
+        // Put the tracks back into the event and exit
+        int flag = 1 << LCIOConstants.TRBIT_HITS;
+        event.put(outputTrackCollectionName, newTracks, Track.class, flag);
+        event.put(GBLKinkData.DATA_COLLECTION, kinkDataCollection, GBLKinkData.class, 0);
+        event.put(GBLKinkData.DATA_RELATION_COLLECTION, kinkDataRelations, LCRelation.class, 0);
 
         if (_debug) {
             System.out.printf("%s: Done.\n", getClass().getSimpleName());

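The `getKinks` method added to MakeGblTracks below derives per-layer kink angles as successive differences of the local slope corrections (phi and lambda) along the fitted trajectory. This stand-alone sketch reproduces just that arithmetic on a plain array; the real code reads the `XTPRIME`/`YTPRIME` entries from `GblTrajectory.getResults()`:

```java
public class KinkSketch {

    /** Successive differences: out[i] = angles[i+1] - angles[i]. */
    static double[] kinks(double[] angles) {
        double[] out = new double[angles.length - 1];
        for (int i = 1; i < angles.length; i++) {
            out[i - 1] = angles[i] - angles[i - 1];
        }
        return out;
    }

    public static void main(String[] args) {
        // Illustrative phi corrections at four GBL points (made-up values)
        double[] phi = {0.0, 0.001, 0.0005, 0.002};
        for (double k : kinks(phi)) {
            System.out.println(k);
        }
    }
}
```

With N trajectory points there are N-1 kinks, matching the `traj.getNumPoints() - 1` array sizes in the diff.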
Modified: java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/MakeGblTracks.java
 =============================================================================
--- java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/MakeGblTracks.java	(original)
+++ java/branches/jeremy-dev/tracking/src/main/java/org/hps/recon/tracking/gbl/MakeGblTracks.java	Tue Dec  1 15:55:47 2015
@@ -1,53 +1,58 @@
 package org.hps.recon.tracking.gbl;
 
-import static org.hps.recon.tracking.gbl.GBLOutput.getPerToClPrj;
 import hep.physics.matrix.SymmetricMatrix;
 import hep.physics.vec.BasicHep3Vector;
 import hep.physics.vec.Hep3Matrix;
 import hep.physics.vec.Hep3Vector;
 import hep.physics.vec.VecOp;
-
 import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.Comparator;
 import java.util.List;
 import java.util.logging.Level;
 import java.util.logging.Logger;
-
 import org.apache.commons.math3.util.Pair;
+import org.hps.recon.tracking.CoordinateTransformations;
+import org.hps.recon.tracking.MultipleScattering;
 import org.hps.recon.tracking.TrackType;
+import org.hps.recon.tracking.TrackUtils;
+import static org.hps.recon.tracking.gbl.GBLOutput.getPerToClPrj;
 import org.hps.recon.tracking.gbl.matrix.Matrix;
 import org.hps.recon.tracking.gbl.matrix.SymMatrix;
 import org.hps.recon.tracking.gbl.matrix.Vector;
 import org.lcsim.constants.Constants;
-import org.lcsim.event.EventHeader;
+import org.lcsim.detector.ITransform3D;
+import org.lcsim.detector.tracker.silicon.ChargeCarrier;
+import org.lcsim.detector.tracker.silicon.HpsSiSensor;
+import org.lcsim.detector.tracker.silicon.SiSensor;
+import org.lcsim.detector.tracker.silicon.SiSensorElectrodes;
+import org.lcsim.event.RawTrackerHit;
 import org.lcsim.event.Track;
 import org.lcsim.event.TrackState;
 import org.lcsim.event.TrackerHit;
 import org.lcsim.event.base.BaseTrack;
 import org.lcsim.event.base.BaseTrackState;
 import org.lcsim.fit.helicaltrack.HelicalTrackFit;
-import org.lcsim.lcio.LCIOConstants;
-import org.lcsim.recon.tracking.seedtracker.SeedCandidate;
-import org.lcsim.recon.tracking.seedtracker.SeedTrack;
+import org.lcsim.fit.helicaltrack.HelicalTrackStrip;
+import org.lcsim.fit.helicaltrack.HelixUtils;
+import org.lcsim.recon.tracking.digitization.sisim.SiTrackerHitStrip1D;
+import org.lcsim.recon.tracking.digitization.sisim.TrackerHitType;
 
 /**
- * A class that creates track objects from fitted GBL trajectories and adds them
- * into the event.
+ * Utilities that create track objects from fitted GBL trajectories.
  *
  * @author Per Hansson Adrian <[log in to unmask]>
  *
  */
 public class MakeGblTracks {
 
-    private String _TrkCollectionName = "GBLTracks";
-    private static Logger LOGGER = Logger.getLogger(MakeGblTracks.class.getPackage().getName());
-
-    /**
-     * Creates a new instance of MakeTracks.
-     */
-    public MakeGblTracks() {
-    }
-
-    public void setDebug(boolean debug) {
+    private final static Logger LOGGER = Logger.getLogger(MakeGblTracks.class.getPackage().getName());
+
+    private MakeGblTracks() {
+    }
+
+    public static void setDebug(boolean debug) {
         if (debug) {
             LOGGER.setLevel(Level.INFO);
         } else {
@@ -55,40 +60,7 @@
         }
     }
 
-    /**
-     * Process a Gbl track and store it into the event
-     *
-     * @param event event header
-     * @param track Gbl trajectory
-     * @param seed SeedTrack
-     * @param bfield magnetic field (used to turn curvature into momentum)
-     */
-    public void Process(EventHeader event, List<FittedGblTrajectory> gblTrajectories, double bfield) {
-
-        List<Track> tracks = new ArrayList<Track>();
-
-        LOGGER.info("adding " + gblTrajectories.size() + " of fitted GBL tracks to the event");
-
-        for (FittedGblTrajectory fittedTraj : gblTrajectories) {
-
-            SeedTrack seedTrack = (SeedTrack) fittedTraj.get_seed();
-            SeedCandidate trackseed = seedTrack.getSeedCandidate();
-
-            //  Create a new Track
-            Track trk = makeCorrectedTrack(fittedTraj, trackseed.getHelix(), seedTrack.getTrackerHits(), seedTrack.getType(), bfield);
-
-            //  Add the track to the list of tracks
-            tracks.add(trk);
-        }
-
-        LOGGER.info("adding " + Integer.toString(tracks.size()) + " Gbl tracks to event with " + event.get(Track.class, "MatchedTracks").size() + " matched tracks");
-
-        // Put the tracks back into the event and exit
-        int flag = 1 << LCIOConstants.TRBIT_HITS;
-        event.put(_TrkCollectionName, tracks, Track.class, flag);
-    }
-
-    public static Track makeCorrectedTrack(FittedGblTrajectory fittedTraj, HelicalTrackFit helix, List<TrackerHit> trackHits, int trackType, double bfield) {
+    public static Pair<Track, GBLKinkData> makeCorrectedTrack(FittedGblTrajectory fittedTraj, HelicalTrackFit helix, List<TrackerHit> trackHits, int trackType, double bfield) {
         //  Initialize the reference point to the origin
         double[] ref = new double[]{0., 0., 0.};
 
@@ -112,6 +84,8 @@
         Pair<double[], SymmetricMatrix> correctedHelixParamsLast = getGblCorrectedHelixParameters(helix, fittedTraj.get_traj(), bfield, FittedGblTrajectory.GBLPOINT.LAST);
         TrackState stateLast = new BaseTrackState(correctedHelixParamsLast.getFirst(), ref, correctedHelixParamsLast.getSecond().asPackedArray(true), TrackState.AtLastHit, bfield);
         trk.getTrackStates().add(stateLast);
+
+        GBLKinkData kinkData = getKinks(fittedTraj.get_traj());
 
         // Set other info needed
         trk.setChisq(fittedTraj.get_chi2());
@@ -128,7 +102,7 @@
                 LOGGER.info(String.format("param %d: %.10f -> %.10f    helix-gbl= %f", i, helix.parameters()[i], trk.getTrackParameter(i), helix.parameters()[i] - trk.getTrackParameter(i)));
             }
         }
-        return trk;
+        return new Pair<Track, GBLKinkData>(trk, kinkData);
     }
 
     /**
@@ -285,8 +259,239 @@
         return new Pair<double[], SymmetricMatrix>(parameters_gbl, cov);
     }
 
-    public void setTrkCollectionName(String name) {
-        _TrkCollectionName = name;
-    }
-
+    public static GBLKinkData getKinks(GblTrajectory traj) {
+
+        // get corrections from GBL fit
+        Vector locPar = new Vector(5);
+        SymMatrix locCov = new SymMatrix(5);
+        float[] lambdaKinks = new float[traj.getNumPoints() - 1];
+        double[] phiKinks = new double[traj.getNumPoints() - 1];
+
+        double oldPhi = 0, oldLambda = 0;
+        for (int i = 0; i < traj.getNumPoints(); i++) {
+            traj.getResults(i + 1, locPar, locCov); // GBL point labels are 1-based
+            double newPhi = locPar.get(FittedGblTrajectory.GBLPARIDX.XTPRIME.getValue());
+            double newLambda = locPar.get(FittedGblTrajectory.GBLPARIDX.YTPRIME.getValue());
+            if (i > 0) {
+                lambdaKinks[i - 1] = (float) (newLambda - oldLambda);
+                phiKinks[i - 1] = newPhi - oldPhi;
+            }
+            oldPhi = newPhi;
+            oldLambda = newLambda;
+        }
+
+        return new GBLKinkData(lambdaKinks, phiKinks);
+    }
+
+    /**
+     * Do a GBL fit to an arbitrary set of strip hits, with a starting value of
+     * the helix parameters.
+     *
+     * @param helix Initial helix parameters. Only the track parameters are
+     * used (not the covariance).
+     * @param stripHits Strip hits to be used for the GBL fit; need not be in
+     * sorted order.
+     * @param hth Stereo hits for the track's hit list (not used in the GBL
+     * fit); need not be in sorted order.
+     * @param nIterations Number of times to iterate the GBL fit.
+     * @param scattering Multiple scattering manager.
+     * @param bfield B-field magnitude.
+     * @return The refitted track paired with its {@link GBLKinkData}.
+     */
+    public static Pair<Track, GBLKinkData> refitTrack(HelicalTrackFit helix, Collection<TrackerHit> stripHits, Collection<TrackerHit> hth, int nIterations, MultipleScattering scattering, double bfield) {
+        List<TrackerHit> allHthList = sortHits(hth);
+        List<TrackerHit> sortedStripHits = sortHits(stripHits);
+        FittedGblTrajectory fit = MakeGblTracks.doGBLFit(helix, sortedStripHits, scattering, bfield, 0);
+        for (int i = 0; i < nIterations; i++) {
+            Pair<Track, GBLKinkData> newTrack = MakeGblTracks.makeCorrectedTrack(fit, helix, allHthList, 0, bfield);
+            helix = TrackUtils.getHTF(newTrack.getFirst());
+            fit = MakeGblTracks.doGBLFit(helix, sortedStripHits, scattering, bfield, 0);
+        }
+        Pair<Track, GBLKinkData> refitted = MakeGblTracks.makeCorrectedTrack(fit, helix, allHthList, 0, bfield);
+        return refitted;
+    }
+
+    public static FittedGblTrajectory doGBLFit(HelicalTrackFit htf, List<TrackerHit> stripHits, MultipleScattering _scattering, double bfield, int debug) {
+        List<GBLStripClusterData> stripData = makeStripData(htf, stripHits, _scattering, bfield, debug);
+        double bfac = Constants.fieldConversion * bfield;
+
+        FittedGblTrajectory fit = HpsGblRefitter.fit(stripData, bfac, debug > 0);
+        return fit;
+    }
+
+    public static List<GBLStripClusterData> makeStripData(HelicalTrackFit htf, List<TrackerHit> stripHits, MultipleScattering _scattering, double _B, int _debug) {
+        List<GBLStripClusterData> stripClusterDataList = new ArrayList<GBLStripClusterData>();
+
+        // Find scatter points along the path
+        MultipleScattering.ScatterPoints scatters = _scattering.FindHPSScatterPoints(htf);
+
+        if (_debug > 0) {
+            System.out.printf("perPar covariance matrix\n%s\n", htf.covariance().toString());
+        }
+
+        for (TrackerHit stripHit : stripHits) {
+            HelicalTrackStripGbl strip;
+            if (stripHit instanceof SiTrackerHitStrip1D) {
+                strip = new HelicalTrackStripGbl(makeDigiStrip((SiTrackerHitStrip1D) stripHit), true);
+            } else {
+                SiTrackerHitStrip1D newHit = new SiTrackerHitStrip1D(stripHit);
+                strip = new HelicalTrackStripGbl(makeDigiStrip(newHit), true);
+            }
+
+            // find Millepede layer definition from DetectorElement
+            HpsSiSensor sensor = (HpsSiSensor) ((RawTrackerHit) stripHit.getRawHits().get(0)).getDetectorElement();
+
+            int millepedeId = sensor.getMillepedeId();
+
+            if (_debug > 0) {
+                System.out.printf("layer %d millepede %d (DE=\"%s\", origin %s) \n", strip.layer(), millepedeId, sensor.getName(), strip.origin().toString());
+            }
+
+            //Center of the sensor
+            Hep3Vector origin = strip.origin();
+
+            //Find intercept point with sensor in tracking frame
+            Hep3Vector trkpos = TrackUtils.getHelixPlaneIntercept(htf, strip, Math.abs(_B));
+            if (trkpos == null) {
+                if (_debug > 0) {
+                    System.out.println("Can't find track intercept; use sensor origin");
+                }
+                trkpos = strip.origin();
+            }
+            if (_debug > 0) {
+                System.out.printf("trkpos at intercept [%.10f %.10f %.10f]\n", trkpos.x(), trkpos.y(), trkpos.z());
+            }
+
+            //GBLDATA
+            GBLStripClusterData stripData = new GBLStripClusterData(millepedeId);
+            //Add to output list
+            stripClusterDataList.add(stripData);
+
+            //path length to intercept
+            double s = HelixUtils.PathToXPlane(htf, trkpos.x(), 0, 0).get(0);
+            double s3D = s / Math.cos(Math.atan(htf.slope()));
+
+            //GBLDATA
+            stripData.setPath(s);
+            stripData.setPath3D(s3D);
+
+            //GBLDATA
+            stripData.setU(strip.u());
+            stripData.setV(strip.v());
+            stripData.setW(strip.w());
+
+            //Print track direction at intercept
+            Hep3Vector tDir = HelixUtils.Direction(htf, s);
+            double phi = htf.phi0() - s / htf.R();
+            double lambda = Math.atan(htf.slope());
+
+            //GBLDATA
+            stripData.setTrackDir(tDir);
+            stripData.setTrackPhi(phi);
+            stripData.setTrackLambda(lambda);
+
+            //Print residual in measurement system
+            // start by finding the distance vector between the center and the track position
+            Hep3Vector vdiffTrk = VecOp.sub(trkpos, origin);
+
+            // then find the rotation from tracking to measurement frame
+            Hep3Matrix trkToStripRot = getTrackToStripRotation(sensor);
+
+            // then rotate that vector into the measurement frame to get the predicted measurement position
+            Hep3Vector trkpos_meas = VecOp.mult(trkToStripRot, vdiffTrk);
+
+            //GBLDATA
+            stripData.setMeas(strip.umeas());
+            stripData.setTrackPos(trkpos_meas);
+            stripData.setMeasErr(strip.du());
+
+            if (_debug > 1) {
+                System.out.printf("rotation matrix to meas frame\n%s\n", VecOp.toString(trkToStripRot));
+                System.out.printf("tPosGlobal %s origin %s\n", trkpos.toString(), origin.toString());
+                System.out.printf("tDiff %s\n", vdiffTrk.toString());
+                System.out.printf("tPosMeas %s\n", trkpos_meas.toString());
+            }
+
+            if (_debug > 0) {
+                System.out.printf("layer %d millePedeId %d uRes %.10f\n", strip.layer(), millepedeId, stripData.getMeas() - stripData.getTrackPos().x());
+            }
+
+            // find scattering angle
+            MultipleScattering.ScatterPoint scatter = scatters.getScatterPoint(((RawTrackerHit) strip.getStrip().rawhits().get(0)).getDetectorElement());
+            double scatAngle;
+
+            if (scatter != null) {
+                scatAngle = scatter.getScatterAngle().Angle();
+            } else {
+                if (_debug > 0) {
+                    System.out.printf("WARNING cannot find scatter for detector %s with strip cluster at %s\n", ((RawTrackerHit) strip.getStrip().rawhits().get(0)).getDetectorElement().getName(), strip.origin().toString());
+                }
+                scatAngle = GblUtils.estimateScatter(sensor, htf, _scattering, _B);
+            }
+
+            //GBLDATA
+            stripData.setScatterAngle(scatAngle);
+        }
+        return stripClusterDataList;
+    }
+
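The 2D path length `s` returned by `HelixUtils.PathToXPlane` above is measured in the bend plane; dividing by cos(lambda), with lambda = atan(slope) the dip angle, gives the 3D path length stored via `setPath3D`. A minimal sketch of just that conversion (the class and method names here are illustrative, not part of the HPS codebase):

```java
// Sketch of the 2D -> 3D helix path-length conversion used above.
// lambda = atan(slope) is the track dip angle; a path length measured
// in the bend (x-y) plane is stretched by 1/cos(lambda) in 3D.
public class PathLengthSketch {

    public static double dipAngle(double slope) {
        return Math.atan(slope);
    }

    public static double path3D(double s2D, double slope) {
        return s2D / Math.cos(dipAngle(slope));
    }

    public static void main(String[] args) {
        // A track with slope 0 lies in the bend plane: s3D == s2D.
        System.out.println(path3D(10.0, 0.0)); // -> 10.0
        // A slope of 1 (45-degree dip) stretches the path by sqrt(2).
        System.out.println(path3D(10.0, 1.0)); // ~ 14.14
    }
}
```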
+    private static Hep3Matrix getTrackToStripRotation(SiSensor sensor) {
        // Builds the rotation from the tracking frame into the sensor (strip) measurement frame
+
+        // Transform from JLab frame to sensor frame (done through the RawTrackerHit)
+        SiSensorElectrodes electrodes = sensor.getReadoutElectrodes(ChargeCarrier.HOLE);
+        ITransform3D detToStrip = electrodes.getGlobalToLocal();
+        // Get rotation matrix
+        Hep3Matrix detToStripMatrix = detToStrip.getRotation().getRotationMatrix();
+        // Transformation between the JLAB and tracking coordinate systems
+        Hep3Matrix detToTrackMatrix = CoordinateTransformations.getMatrix();
+
+        return VecOp.mult(detToStripMatrix, VecOp.inverse(detToTrackMatrix));
+    }
+
+    private static HelicalTrackStrip makeDigiStrip(SiTrackerHitStrip1D h) {
+        SiTrackerHitStrip1D local = h.getTransformedHit(TrackerHitType.CoordinateSystem.SENSOR);
+        SiTrackerHitStrip1D global = h.getTransformedHit(TrackerHitType.CoordinateSystem.GLOBAL);
+
+        ITransform3D trans = local.getLocalToGlobal();
+        Hep3Vector org = trans.transformed(new BasicHep3Vector(0., 0., 0.));
+        Hep3Vector u = global.getMeasuredCoordinate();
+        Hep3Vector v = global.getUnmeasuredCoordinate();
+
+        //rotate to tracking frame
+        Hep3Vector neworigin = CoordinateTransformations.transformVectorToTracking(org);
+        Hep3Vector newu = CoordinateTransformations.transformVectorToTracking(u);
+        Hep3Vector newv = CoordinateTransformations.transformVectorToTracking(v);
+
+        double umeas = local.getPosition()[0];
+        double vmin = VecOp.dot(local.getUnmeasuredCoordinate(), local.getHitSegment().getStartPoint());
+        double vmax = VecOp.dot(local.getUnmeasuredCoordinate(), local.getHitSegment().getEndPoint());
+        double du = Math.sqrt(local.getCovarianceAsMatrix().diagonal(0));
+
+        //don't fill fields we don't use
+//        IDetectorElement de = h.getSensor();
+//        String det = getName(de);
+//        int lyr = getLayer(de);
+//        BarrelEndcapFlag be = getBarrelEndcapFlag(de);
+        double dEdx = h.getdEdx();
+        double time = h.getTime();
+        List<RawTrackerHit> rawhits = h.getRawHits();
+        HelicalTrackStrip strip = new HelicalTrackStrip(neworigin, newu, newv, umeas, du, vmin, vmax, dEdx, time, rawhits, null, -1, null);
+
+        return strip;
+    }
+
+    private static List<TrackerHit> sortHits(Collection<TrackerHit> hits) {
+        List<TrackerHit> hitList = new ArrayList<TrackerHit>(hits);
+        Collections.sort(hitList, new LayerComparator());
+        return hitList;
+    }
+
+    private static class LayerComparator implements Comparator<TrackerHit> {
+
+        @Override
+        public int compare(TrackerHit o1, TrackerHit o2) {
+            return Integer.compare(TrackUtils.getLayer(o1), TrackUtils.getLayer(o2));
+        }
+    }
 }
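The residual printed in the debug output above is `umeas - trkpos_meas.x()`, where `trkpos_meas` is the track-to-origin separation rotated into the sensor's (u, v, w) frame by `getTrackToStripRotation`. A self-contained sketch of that rotate-and-subtract step, with plain `double[]` arrays standing in for `Hep3Vector`/`Hep3Matrix` (all names here are invented for illustration):

```java
// Sketch of the measurement-frame residual computed in makeStripData:
// rotate (track position - sensor origin) into the sensor's (u, v, w)
// frame and compare the u component to the measured u coordinate.
// The rotation rows are the sensor's u, v, w axes in the tracking frame.
public class ResidualSketch {

    public static double[] toMeasFrame(double[][] rot, double[] trkpos, double[] origin) {
        double[] diff = new double[3];
        for (int i = 0; i < 3; i++) {
            diff[i] = trkpos[i] - origin[i];
        }
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) {
                out[i] += rot[i][j] * diff[j];
            }
        }
        return out;
    }

    public static double uResidual(double umeas, double[][] rot, double[] trkpos, double[] origin) {
        return umeas - toMeasFrame(rot, trkpos, origin)[0];
    }

    public static void main(String[] args) {
        // Identity rotation: the residual is just umeas - (x - x0), ~0.3 here.
        double[][] id = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}};
        System.out.println(uResidual(0.5, id, new double[]{1.2, 0.0, 0.0}, new double[]{1.0, 0.0, 0.0}));
    }
}
```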

Copied: java/branches/jeremy-dev/users/src/main/java/org/hps/users/baltzell/RfFitFunction.java (from r3964, java/trunk/users/src/main/java/org/hps/users/baltzell/RfFitFunction.java)
 =============================================================================
--- java/trunk/users/src/main/java/org/hps/users/baltzell/RfFitFunction.java	(original)
+++ java/branches/jeremy-dev/users/src/main/java/org/hps/users/baltzell/RfFitFunction.java	Tue Dec  1 15:55:47 2015
@@ -4,10 +4,10 @@
 
 /*
  * Function for fitting the leading edge of the RF waveform.
+ * Straight line fit
  */
 public class RfFitFunction extends AbstractIFunction {
-	protected double pedestal=0;
-	protected double time=0;
+	protected double intercept=0;
 	protected double slope=0;
 	public RfFitFunction() {
 		this("");
@@ -15,22 +15,21 @@
 	public RfFitFunction(String title) {
 		super();
 		this.variableNames=new String[]{"time"};
-		this.parameterNames=new String[]{"pedestal","time","slope"};
+		this.parameterNames=new String[]{"intercept","slope"};
+
 		init(title);
 	}
 	public double value(double [] v) {
-		return  pedestal + (v[0]-time)*slope;
+		return  intercept + (v[0])*slope;
 	}
 	public void setParameters(double[] pars) throws IllegalArgumentException {
 		super.setParameters(pars);
-		pedestal=pars[0];
-		time=pars[1];
-		slope=pars[2];
+		intercept=pars[0];
+		slope=pars[1];
 	}
 	public void setParameter(String key,double value) throws IllegalArgumentException{
 		super.setParameter(key,value);
-		if      (key.equals("pedestal")) pedestal=value;
-		else if (key.equals("time"))     time=value;
+		if      (key.equals("intercept")) intercept=value;
 		else if (key.equals("slope"))    slope=value;
 	}
 }
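
After this change the RF edge model is a plain straight line, value(t) = intercept + slope*t, and `RfFitterDriver` (below) extracts the edge time by inverting that line at a threshold. A minimal sketch of the model and the inversion (`NSPERSAMPLE` follows the driver; the class and method names are illustrative):

```java
// Sketch of the straight-line RF leading-edge model: the fit parameters
// are just (intercept, slope), and the crossing time of any threshold
// follows by inverting value(t) = intercept + slope * t.
public class RfLineSketch {

    static final double NSPERSAMPLE = 4.0; // ns per FADC sample, as in RfFitterDriver

    public static double value(double t, double intercept, double slope) {
        return intercept + slope * t;
    }

    // Sample index at which the line crosses `threshold`, converted to ns.
    public static double crossingTimeNs(double threshold, double intercept, double slope) {
        return NSPERSAMPLE * (threshold - intercept) / slope;
    }

    public static void main(String[] args) {
        // A line with intercept -400 and slope 50 reaches 200 at sample 12,
        // i.e. 48 ns with 4 ns samples.
        System.out.println(crossingTimeNs(200.0, -400.0, 50.0)); // -> 48.0
    }
}
```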

Copied: java/branches/jeremy-dev/users/src/main/java/org/hps/users/baltzell/RfFitterDriver.java (from r3964, java/trunk/users/src/main/java/org/hps/users/baltzell/RfFitterDriver.java)
 =============================================================================
--- java/trunk/users/src/main/java/org/hps/users/baltzell/RfFitterDriver.java	(original)
+++ java/branches/jeremy-dev/users/src/main/java/org/hps/users/baltzell/RfFitterDriver.java	Tue Dec  1 15:55:47 2015
@@ -29,6 +29,7 @@
 	static final int SLOT=13;
 	static final int CHANNELS[]={0,1};
 	static final double NSPERSAMPLE=4;
+		
 
 	// boilerplate:
     AIDA aida = AIDA.defaultInstance();
@@ -61,8 +62,12 @@
 					
 					// we found an RF readout, fit it:
 					foundRf=true;
-					IFitResult fit=fitPulse(hit);
-  				    times[ii]=NSPERSAMPLE*fit.fittedParameter("time");
+					times[ii] = fitPulse(hit);
+					if (ii==1){
+						
+						System.out.println(times[1]-times[0]);
+					}
+					  				    
 					break;
 				}
 			}
@@ -79,30 +84,85 @@
 	/*
 	 * Perform the fit to the RF pulse:
 	 */
-	public IFitResult fitPulse(FADCGenericHit hit) {
-
+	public double fitPulse(FADCGenericHit hit) {
 		fitData.clear();
 		final int adcSamples[]=hit.getData();
+		// counts the number of peaks found so far
+		int iz=0;
+		int peakBin[]={-999,-999};
+		final int threshold = 300;	
+		double fitThresh[]={-999,-999};
+		double pedVal[]={-999,-999};
 		
-		// TODO: only add those ADC values which are to be fitted:
-		for (int ii=0; ii<adcSamples.length; ii++) {
-			final int jj=fitData.size();
-			fitData.addPoint();
-			fitData.point(jj).coordinate(0).setValue(ii);
-			fitData.point(jj).coordinate(1).setValue(adcSamples[ii]);
-			fitData.point(jj).coordinate(1).setErrorMinus(NOISE);
-			fitData.point(jj).coordinate(1).setErrorPlus(NOISE);
+		// Look for the bins containing the peaks (2-3 present; the first 2 are kept)
+		for (int ii=4; ii<adcSamples.length-1; ii++) {
+			// After 2 peaks, stop looking for more
+			if (iz==2){break;}
+			if ((adcSamples[ii+1]>0) && (adcSamples[ii-1]>0) && (adcSamples[ii]>threshold) && ii>8){
+				if ((adcSamples[ii]>adcSamples[ii+1]) && (adcSamples[ii]>=adcSamples[ii-1]) ){
+					
+					peakBin[iz]=ii;
+					iz++;
+				}
+			}
 		}
 		
-		// TODO: properly initialize fit parameters:
-		fitFunction.setParameter("time",0.0);
-		fitFunction.setParameter("pedestal",0.0);
-		fitFunction.setParameter("slope",100.0);
+		
+		int jj=0;  // number of fit points collected
+		// Choose the peak closest to the center of the window (second peak, ik=1)
+		final int ik=1;
+		// Bail out if a second peak was not found; indexing with -999 would crash
+		if (peakBin[ik] < 0) return -999;
+		pedVal[ik] = (adcSamples[peakBin[ik]-6]+adcSamples[peakBin[ik]-7]+adcSamples[peakBin[ik]-8]+adcSamples[peakBin[ik]-9])/4.0;
+		fitThresh[ik]= (adcSamples[peakBin[ik]]+pedVal[ik])/3.0;
 	
+		// Initial values: we find/fit 3 points:
+		double itime[] = {-999,-999,-999};
+		double ifadc[] = {-999,-999,-999};
+		
+		// Scan the five samples on the leading edge (peak bin-5 through peak bin-1)
+		for (int ll=0; ll<5; ll++){	
+			if ((adcSamples[peakBin[ik]-5+ll]) > fitThresh[ik]){
+				// Also include the preceding sample (just below the fit threshold) as the first fit point
+				if(jj==0 && (adcSamples[peakBin[ik]-6+ll] > pedVal[ik])){
+					final int zz=fitData.size();	
+					fitData.addPoint();
+					itime[zz] = peakBin[ik]-6+ll;
+					ifadc[zz] = adcSamples[peakBin[ik]-6+ll];
+					fitData.point(zz).coordinate(0).setValue(peakBin[ik]-6+ll);
+					fitData.point(zz).coordinate(1).setValue(adcSamples[peakBin[ik]-6+ll]);
+					fitData.point(zz).coordinate(1).setErrorMinus(NOISE);
+					fitData.point(zz).coordinate(1).setErrorPlus(NOISE);		
+					jj++;	
+				}
+				final int zz=fitData.size();	
+				fitData.addPoint();
+				itime[zz] = peakBin[ik]-5+ll;
+				ifadc[zz] = adcSamples[peakBin[ik]-5+ll];
+				fitData.point(zz).coordinate(0).setValue(peakBin[ik]-5+ll);
+				fitData.point(zz).coordinate(1).setValue(adcSamples[peakBin[ik]-5+ll]);
+				fitData.point(zz).coordinate(1).setErrorMinus(NOISE);
+				fitData.point(zz).coordinate(1).setErrorPlus(NOISE);
+					
+				jj++;
+				if (jj==3) {break;}					
+			}
+		}
+		
+		double islope = ((double)(ifadc[2]-ifadc[0]))/(itime[2]-itime[0]);
+		double icept = ifadc[1] - islope*itime[1];
+		// Initialize fit parameters:
+		fitFunction.setParameter("intercept",icept);
+		fitFunction.setParameter("slope",islope);
+
 		// this used to be turned on somewhere else on every event, dunno if it still is:
 		//Logger.getLogger("org.freehep.math.minuit").setLevel(Level.OFF);
+	
+		IFitResult fitResults = fitter.fit(fitData,fitFunction);
 		
-		return fitter.fit(fitData,fitFunction);
+		// Invert the fitted line at the pulse half-height to get the edge time:
+		double halfVal = (adcSamples[peakBin[1]]+pedVal[1])/2.0;	
+	
+		return NSPERSAMPLE*(halfVal-fitResults.fittedParameter("intercept"))/fitResults.fittedParameter("slope");
+			
 	}
-	
+		
 }
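Putting the pieces of `fitPulse` together: find the peak nearest the window center, estimate the pedestal from four pre-pulse samples, collect leading-edge samples above (peak + ped)/3, fit a straight line, and invert it at the half-height (peak + ped)/2. The sketch below follows those steps but swaps the Minuit fit for a closed-form least-squares line, takes only samples above the fit threshold, and uses an invented synthetic waveform; everything here is illustrative, not the HPS implementation:

```java
import java.util.Arrays;

// Standalone sketch of the leading-edge timing algorithm in fitPulse.
// Constants (peak threshold 300, 4-sample pedestal window, fit threshold
// at (peak + ped)/3, edge time at (peak + ped)/2) follow the diff above.
public class RfEdgeSketch {

    static final double NSPERSAMPLE = 4.0;
    static final int THRESHOLD = 300;

    // Find the second local maximum above THRESHOLD (the peak nearest the
    // window center, ik = 1 in the driver). Returns -1 if not found.
    public static int findSecondPeak(int[] adc) {
        int found = 0;
        for (int ii = 9; ii < adc.length - 1; ii++) {
            if (adc[ii] > THRESHOLD && adc[ii] >= adc[ii - 1] && adc[ii] > adc[ii + 1]) {
                if (++found == 2) {
                    return ii;
                }
            }
        }
        return -1;
    }

    public static double edgeTimeNs(int[] adc) {
        int peak = findSecondPeak(adc);
        if (peak < 9) {
            return -999; // no usable peak
        }
        double ped = (adc[peak - 6] + adc[peak - 7] + adc[peak - 8] + adc[peak - 9]) / 4.0;
        double fitThresh = (adc[peak] + ped) / 3.0;

        // Collect up to 3 leading-edge samples above the fit threshold
        // from the five bins preceding the peak.
        double[] t = new double[3], y = new double[3];
        int n = 0;
        for (int ll = 0; ll < 5 && n < 3; ll++) {
            int bin = peak - 5 + ll;
            if (adc[bin] > fitThresh) {
                t[n] = bin;
                y[n] = adc[bin];
                n++;
            }
        }
        if (n < 2) {
            return -999; // not enough points to define a line
        }

        // Closed-form least-squares line y = intercept + slope * t.
        double st = 0, sy = 0, stt = 0, sty = 0;
        for (int i = 0; i < n; i++) {
            st += t[i]; sy += y[i]; stt += t[i] * t[i]; sty += t[i] * y[i];
        }
        double slope = (n * sty - st * sy) / (n * stt - st * st);
        double intercept = (sy - slope * st) / n;

        // Edge time = half-height crossing of the fitted line, in ns.
        double halfVal = (adc[peak] + ped) / 2.0;
        return NSPERSAMPLE * (halfVal - intercept) / slope;
    }

    public static void main(String[] args) {
        // Synthetic waveform: pedestal 100, two pulses; the second has a
        // linear leading edge with slope 200 counts/sample peaking at bin 25.
        int[] adc = new int[40];
        Arrays.fill(adc, 100);
        adc[10] = 200; adc[11] = 350; adc[12] = 500; adc[13] = 350; adc[14] = 150;
        adc[22] = 300; adc[23] = 500; adc[24] = 700; adc[25] = 900; adc[26] = 600; adc[27] = 200;
        System.out.println(edgeTimeNs(adc)); // half-height crossing -> 92.0
    }
}
```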

Modified: java/branches/jeremy-dev/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java
 =============================================================================
--- java/branches/jeremy-dev/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java	(original)
+++ java/branches/jeremy-dev/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java	Tue Dec  1 15:55:47 2015
@@ -137,7 +137,7 @@
 
                 if (runNum != currentRun) {
                     RunManager.getRunManager().setRun(runNum);
-                    if (!RunManager.getRunManager().runExists() || RunManager.getRunManager().getTriggerConfig().getTiTimeOffset() == null) {
+                    if (!RunManager.getRunManager().runExists() || RunManager.getRunManager().getRunSummary().getTiTimeOffset() == null) {
                         continue;
                     }
                     try {
@@ -150,7 +150,7 @@
                         continue;
                     }
 
-                    tiTimeOffset = RunManager.getRunManager().getTriggerConfig().getTiTimeOffset();
+                    tiTimeOffset = RunManager.getRunManager().getRunSummary().getTiTimeOffset();
 
                     for (final SvtAlignmentConstant constant : alignmentConstants) {
                         switch (constant.getParameter()) {
