LISTSERV mailing list manager LISTSERV 16.5


HPS-SVN Archives


HPS-SVN@LISTSERV.SLAC.STANFORD.EDU



HPS-SVN February 2016

Subject:

r4246 - in /java/trunk: ./ analysis/src/main/java/org/hps/analysis/plots/ conditions/ conditions/src/main/java/org/hps/conditions/ conditions/src/main/java/org/hps/conditions/api/ conditions/src/main/java/org/hps/conditions/cli/ conditions/src/main/java/org/hps/conditions/database/ conditions/src/main/java/org/hps/conditions/dummy/ conditions/src/main/java/org/hps/conditions/ecal/ conditions/src/main/java/org/hps/conditions/run/ crawler/ crawler/src/main/java/org/hps/crawler/ crawler/src/main/python/ datacat-client/src/main/java/org/hps/datacat/client/ distribution/ ecal-recon/src/main/java/org/hps/recon/ecal/ evio/src/main/java/org/hps/evio/ integration-tests/ integration-tests/src/test/java/org/hps/test/it/ job/ job/src/main/java/org/hps/job/ logging/src/main/resources/org/hps/logging/config/ monitoring-app/ monitoring-app/src/main/java/org/hps/monitoring/application/ monitoring-app/src/main/java/org/hps/monitoring/application/util/ monitoring-drivers/src/main/java/org/hps/monitoring/drivers/trackrecon/ parent/ record-util/src/main/java/org/hps/job/ record-util/src/main/java/org/hps/record/ record-util/src/main/java/org/hps/record/daqconfig/ record-util/src/main/java/org/hps/record/epics/ record-util/src/main/java/org/hps/record/evio/ record-util/src/main/java/org/hps/record/scalers/ record-util/src/main/java/org/hps/record/svt/ record-util/src/main/java/org/hps/record/triggerbank/ record-util/src/main/java/org/hps/record/util/ run-database/ run-database/src/main/java/org/hps/run/database/ run-database/src/test/java/org/hps/run/database/ users/src/main/java/org/hps/users/holly/ users/src/main/java/org/hps/users/meeg/

From:

[log in to unmask]

Reply-To:

Notification of commits to the hps svn repository <[log in to unmask]>

Date:

Wed, 24 Feb 2016 21:07:03 -0000

Content-Type:

text/plain

Parts/Attachments:

text/plain (7153 lines)

Author: [log in to unmask]
Date: Wed Feb 24 13:06:58 2016
New Revision: 4246

Log:
Merge from jeremy-dev branch.

Added:
    java/trunk/analysis/src/main/java/org/hps/analysis/plots/
      - copied from r4241, java/branches/jeremy-dev/analysis/src/main/java/org/hps/analysis/plots/
    java/trunk/conditions/src/main/java/org/hps/conditions/database/AbstractConditionsObjectConverter.java
      - copied unchanged from r4241, java/branches/jeremy-dev/conditions/src/main/java/org/hps/conditions/database/AbstractConditionsObjectConverter.java
    java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerFileVisitor.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/CrawlerFileVisitor.java
    java/trunk/crawler/src/main/java/org/hps/crawler/DataType.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DataType.java
    java/trunk/crawler/src/main/java/org/hps/crawler/DatacatAddFile.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatAddFile.java
    java/trunk/crawler/src/main/java/org/hps/crawler/DatacatHelper.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/DatacatHelper.java
    java/trunk/crawler/src/main/java/org/hps/crawler/FileFormat.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/FileFormat.java
    java/trunk/crawler/src/main/java/org/hps/crawler/FileUtilities.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/FileUtilities.java
    java/trunk/crawler/src/main/java/org/hps/crawler/LcioReconMetadataReader.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/LcioReconMetadataReader.java
    java/trunk/crawler/src/main/java/org/hps/crawler/MetadataWriter.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/MetadataWriter.java
    java/trunk/crawler/src/main/java/org/hps/crawler/PathFilter.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/PathFilter.java
    java/trunk/crawler/src/main/java/org/hps/crawler/Site.java
      - copied unchanged from r4241, java/branches/jeremy-dev/crawler/src/main/java/org/hps/crawler/Site.java
    java/trunk/crawler/src/main/python/
      - copied from r4241, java/branches/jeremy-dev/crawler/src/main/python/
    java/trunk/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java
      - copied, changed from r4241, java/branches/jeremy-dev/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java
    java/trunk/record-util/src/main/java/org/hps/record/AbstractLoopAdapter.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractLoopAdapter.java
    java/trunk/record-util/src/main/java/org/hps/record/AbstractRecordLoop.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/AbstractRecordLoop.java
    java/trunk/record-util/src/main/java/org/hps/record/daqconfig/TriggerConfigEvioProcessor.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/daqconfig/TriggerConfigEvioProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsUtilities.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/epics/EpicsUtilities.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EventTagMask.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/evio/EventTagMask.java
    java/trunk/record-util/src/main/java/org/hps/record/svt/SvtConfigData.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigData.java
    java/trunk/record-util/src/main/java/org/hps/record/svt/SvtConfigEvioProcessor.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/svt/SvtConfigEvioProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetCalculator.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetCalculator.java
    java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TriggerConfigData.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/triggerbank/TriggerConfigData.java
    java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TriggerType.java
      - copied unchanged from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/triggerbank/TriggerType.java
    java/trunk/record-util/src/main/java/org/hps/record/util/
      - copied from r4241, java/branches/jeremy-dev/record-util/src/main/java/org/hps/record/util/
    java/trunk/run-database/src/main/java/org/hps/run/database/AbstractRunBuilder.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/AbstractRunBuilder.java
    java/trunk/run-database/src/main/java/org/hps/run/database/DaoProvider.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/DaoProvider.java
    java/trunk/run-database/src/main/java/org/hps/run/database/DatacatBuilder.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/DatacatBuilder.java
    java/trunk/run-database/src/main/java/org/hps/run/database/DatacatUtilities.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/DatacatUtilities.java
    java/trunk/run-database/src/main/java/org/hps/run/database/EvioDataBuilder.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/EvioDataBuilder.java
    java/trunk/run-database/src/main/java/org/hps/run/database/LivetimeBuilder.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/LivetimeBuilder.java
    java/trunk/run-database/src/main/java/org/hps/run/database/RunDatabaseBuilder.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/RunDatabaseBuilder.java
    java/trunk/run-database/src/main/java/org/hps/run/database/SpreadsheetBuilder.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SpreadsheetBuilder.java
    java/trunk/run-database/src/main/java/org/hps/run/database/SvtConfigDao.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDao.java
    java/trunk/run-database/src/main/java/org/hps/run/database/SvtConfigDaoImpl.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/SvtConfigDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/run/database/TriggerConfigBuilder.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/TriggerConfigBuilder.java
    java/trunk/run-database/src/main/java/org/hps/run/database/TriggerConfigDao.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/TriggerConfigDao.java
    java/trunk/run-database/src/main/java/org/hps/run/database/TriggerConfigDaoImpl.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/main/java/org/hps/run/database/TriggerConfigDaoImpl.java
    java/trunk/run-database/src/test/java/org/hps/run/database/RunBuilderTest.java
      - copied unchanged from r4241, java/branches/jeremy-dev/run-database/src/test/java/org/hps/run/database/RunBuilderTest.java
Removed:
    java/trunk/conditions/src/main/java/org/hps/conditions/api/AbstractConditionsObjectConverter.java
    java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerFileUtilities.java
    java/trunk/crawler/src/main/java/org/hps/crawler/DatacatUtilities.java
    java/trunk/crawler/src/main/java/org/hps/crawler/FileSet.java
    java/trunk/crawler/src/main/java/org/hps/crawler/LcioMetadataReader.java
    java/trunk/crawler/src/main/java/org/hps/crawler/RunSummaryMap.java
    java/trunk/record-util/src/main/java/org/hps/job/
    java/trunk/record-util/src/main/java/org/hps/record/daqconfig/DAQConfigEvioProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EventCountProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EventTagBitMask.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileMetadata.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileMetadataAdapter.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileMetadataProcessor.java
    java/trunk/run-database/src/main/java/org/hps/run/database/RunDatabaseDaoFactory.java
    java/trunk/run-database/src/main/java/org/hps/run/database/RunProcessor.java
    java/trunk/run-database/src/test/java/org/hps/run/database/TiTriggerOffsetTest.java
Modified:
    java/trunk/   (props changed)
    java/trunk/conditions/   (props changed)
    java/trunk/conditions/src/main/java/org/hps/conditions/ConditionsDriver.java
    java/trunk/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObject.java
    java/trunk/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObjectCollection.java
    java/trunk/conditions/src/main/java/org/hps/conditions/api/ConditionsRecord.java
    java/trunk/conditions/src/main/java/org/hps/conditions/cli/AddCommand.java
    java/trunk/conditions/src/main/java/org/hps/conditions/cli/CommandLineTool.java
    java/trunk/conditions/src/main/java/org/hps/conditions/cli/LoadCommand.java
    java/trunk/conditions/src/main/java/org/hps/conditions/cli/PrintCommand.java
    java/trunk/conditions/src/main/java/org/hps/conditions/cli/RunSummaryCommand.java
    java/trunk/conditions/src/main/java/org/hps/conditions/cli/TagCommand.java
    java/trunk/conditions/src/main/java/org/hps/conditions/database/ConditionsRecordConverter.java
    java/trunk/conditions/src/main/java/org/hps/conditions/database/ConditionsTagConverter.java
    java/trunk/conditions/src/main/java/org/hps/conditions/database/Converter.java
    java/trunk/conditions/src/main/java/org/hps/conditions/database/ConverterRegistry.java
    java/trunk/conditions/src/main/java/org/hps/conditions/dummy/DummyConditionsObjectConverter.java
    java/trunk/conditions/src/main/java/org/hps/conditions/ecal/EcalChannel.java
    java/trunk/conditions/src/main/java/org/hps/conditions/run/RunSpreadsheet.java
    java/trunk/crawler/pom.xml
    java/trunk/crawler/src/main/java/org/hps/crawler/AidaMetadataReader.java
    java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java
    java/trunk/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java
    java/trunk/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java
    java/trunk/crawler/src/main/java/org/hps/crawler/FileFormatFilter.java
    java/trunk/crawler/src/main/java/org/hps/crawler/FileMetadataReader.java
    java/trunk/crawler/src/main/java/org/hps/crawler/RootDqmMetadataReader.java
    java/trunk/crawler/src/main/java/org/hps/crawler/RootDstMetadataReader.java
    java/trunk/crawler/src/main/java/org/hps/crawler/RunFilter.java
    java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatClient.java
    java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatClientImpl.java
    java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatConstants.java
    java/trunk/datacat-client/src/main/java/org/hps/datacat/client/JSONUtilities.java
    java/trunk/distribution/   (props changed)
    java/trunk/distribution/pom.xml
    java/trunk/ecal-recon/src/main/java/org/hps/recon/ecal/EcalOnlineRawConverter.java
    java/trunk/ecal-recon/src/main/java/org/hps/recon/ecal/EcalRawConverter.java
    java/trunk/evio/src/main/java/org/hps/evio/EvioToLcio.java
    java/trunk/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java
    java/trunk/integration-tests/   (props changed)
    java/trunk/integration-tests/src/test/java/org/hps/test/it/ReconSteeringTest.java
    java/trunk/job/pom.xml
    java/trunk/job/src/main/java/org/hps/job/JobManager.java
    java/trunk/logging/src/main/resources/org/hps/logging/config/logging.properties
    java/trunk/logging/src/main/resources/org/hps/logging/config/test_logging.properties
    java/trunk/monitoring-app/   (props changed)
    java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/EventProcessing.java
    java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/MonitoringApplication.java
    java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/SystemStatusPanel.java
    java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/util/TableExporter.java
    java/trunk/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/trackrecon/PlotAndFitUtilities.java
    java/trunk/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/trackrecon/V0ReconPlots.java
    java/trunk/parent/pom.xml
    java/trunk/record-util/src/main/java/org/hps/record/AbstractRecordProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/AbstractRecordQueue.java
    java/trunk/record-util/src/main/java/org/hps/record/RecordProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/daqconfig/DAQConfig.java
    java/trunk/record-util/src/main/java/org/hps/record/daqconfig/FADCConfig.java
    java/trunk/record-util/src/main/java/org/hps/record/daqconfig/GTPConfig.java
    java/trunk/record-util/src/main/java/org/hps/record/daqconfig/IDAQConfig.java
    java/trunk/record-util/src/main/java/org/hps/record/daqconfig/SSPConfig.java
    java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsData.java
    java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsEvioProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsRunProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EventTagConstant.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioBankTag.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioDetectorConditionsProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioEventUtilities.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileSource.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileUtilities.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioLoop.java
    java/trunk/record-util/src/main/java/org/hps/record/evio/EvioLoopAdapter.java
    java/trunk/record-util/src/main/java/org/hps/record/scalers/ScalerUtilities.java
    java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetEvioProcessor.java
    java/trunk/run-database/pom.xml
    java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDao.java
    java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/run/database/EpicsType.java
    java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariable.java
    java/trunk/run-database/src/main/java/org/hps/run/database/RunDatabaseCommandLine.java
    java/trunk/run-database/src/main/java/org/hps/run/database/RunManager.java
    java/trunk/run-database/src/main/java/org/hps/run/database/RunSummary.java
    java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java
    java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java
    java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/run/database/package-info.java
    java/trunk/users/src/main/java/org/hps/users/holly/EcalRawConverter.java
    java/trunk/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/ConditionsDriver.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/ConditionsDriver.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/ConditionsDriver.java	Wed Feb 24 13:06:58 2016
@@ -33,7 +33,10 @@
  * time to achieve the proper behavior.
  *
  * @author Jeremy McCormick, SLAC
+ * 
+ * @deprecated Use built-in options of job manager.
  */
+@Deprecated
 public class ConditionsDriver extends Driver {
 
     /** The name of the detector model. */

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObject.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObject.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObject.java	Wed Feb 24 13:06:58 2016
@@ -15,7 +15,7 @@
  *
  * @author Jeremy McCormick, SLAC
  */
-public class BaseConditionsObject implements ConditionsObject {
+public abstract class BaseConditionsObject implements ConditionsObject {
 
     /**
      * Field name for collection ID.
@@ -440,4 +440,22 @@
         }
         return rowsUpdated != 0;
     }
+    
+    public boolean equals(Object object) {
+        // Is it the same object?
+        if (object == this) {
+            return true;
+        }
+        // Are these objects the same class?
+        if (object.getClass().equals(this.getClass())) {
+            BaseConditionsObject otherObject = BaseConditionsObject.class.cast(object);
+            // Do the row IDs and database table name match?
+            if (otherObject.getTableMetaData().getTableName().equals(this.getTableMetaData().getTableName()) &&
+                    this.getRowId() == otherObject.getRowId()) {
+                // These are considered the same object (same database table and row ID).
+                return true;
+            }
+        }
+        return false;
+    }
 }
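The equals() added above considers two conditions objects identical when their database table name and row ID match. A minimal standalone sketch of that equality rule (the class and field names here are hypothetical, not the actual HPS API); it also guards against a null argument, which the committed version would reject with a NullPointerException, and pairs equals() with the consistent hashCode() that hash-based collections require:

```java
// Toy class illustrating the equality rule from the hunk above:
// two objects are equal iff they share a table name and row ID.
class DemoConditionsObject {
    private final String tableName;
    private final int rowId;

    public DemoConditionsObject(String tableName, int rowId) {
        this.tableName = tableName;
        this.rowId = rowId;
    }

    @Override
    public boolean equals(Object object) {
        if (object == this) {
            return true;
        }
        // Check for null before calling getClass() on the argument.
        if (object == null || !object.getClass().equals(this.getClass())) {
            return false;
        }
        DemoConditionsObject other = (DemoConditionsObject) object;
        return this.tableName.equals(other.tableName) && this.rowId == other.rowId;
    }

    @Override
    public int hashCode() {
        // Keep hashCode() consistent with equals() (same fields).
        return 31 * tableName.hashCode() + rowId;
    }
}
```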

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObjectCollection.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObjectCollection.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/api/BaseConditionsObjectCollection.java	Wed Feb 24 13:06:58 2016
@@ -102,6 +102,15 @@
         if (object == null) {
             throw new IllegalArgumentException("The object argument is null.");
         }
+        //checkCollectionId(object);
+        final boolean added = this.objects.add(object);
+        if (!added) {
+            throw new RuntimeException("Failed to add object.");
+        }
+        return added;
+    }
+
+    private void checkCollectionId(final ObjectType object) {
         // Does this collection have a valid ID yet?
         if (this.getCollectionId() != BaseConditionsObject.UNSET_COLLECTION_ID) {
             // Does the object that is being added have a collection ID?
@@ -122,11 +131,6 @@
                 }
             }
         }
-        final boolean added = this.objects.add(object);
-        if (!added) {
-            throw new RuntimeException("Failed to add object.");
-        }
-        return added;
     }
 
     /**
@@ -344,7 +348,7 @@
         } else {
             // If the collection already exists in the database with this ID then it cannot be inserted.
             if (this.exists()) {
-                throw new DatabaseObjectException("The collection " + this.collectionId
+                throw new DatabaseObjectException("The collection ID " + this.collectionId
                         + " cannot be inserted because it already exists in the " + this.tableMetaData.getTableName()
                         + " table.", this);
             }
@@ -703,7 +707,6 @@
     public void writeCsv(final File file) throws IOException {
         FileWriter fileWriter = null;
         CSVPrinter csvFilePrinter = null;
-
         try {
             fileWriter = new FileWriter(file);
             csvFilePrinter = new CSVPrinter(fileWriter, CSVFormat.DEFAULT);

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/api/ConditionsRecord.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/api/ConditionsRecord.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/api/ConditionsRecord.java	Wed Feb 24 13:06:58 2016
@@ -8,6 +8,7 @@
 import org.hps.conditions.database.ConditionsRecordConverter;
 import org.hps.conditions.database.Converter;
 import org.hps.conditions.database.Field;
+import org.hps.conditions.database.MultipleCollectionsAction;
 import org.hps.conditions.database.Table;
 
 /**
@@ -233,6 +234,33 @@
         public final ConditionsRecordCollection sortedByUpdated() {
             return (ConditionsRecordCollection) this.sorted(new UpdatedComparator());
         }
+        
+        /**
+         * Find a unique record using the selected action for disambiguating conditions with the same key.
+         * @param key the name of the key
+         * @param action the disambiguation action
+         * @return the unique conditions record or <code>null</code> if does not exist
+         */
+        public ConditionsRecord findUniqueRecord(String key, MultipleCollectionsAction action) {
+            ConditionsRecord record = null;
+            ConditionsRecordCollection keyRecords = this.findByKey(key);
+            if (keyRecords.size() > 0) {
+                if (keyRecords.size() == 1) {
+                    record = keyRecords.get(0);
+                } else {
+                    if (action.equals(MultipleCollectionsAction.LAST_UPDATED)) {
+                        record = sortedByUpdated().get(this.size() - 1);
+                    } else if (action.equals(MultipleCollectionsAction.LAST_CREATED)) {
+                        record = sortedByCreated().get(this.size() - 1);
+                    } else if (action.equals(MultipleCollectionsAction.LATEST_RUN_START)) {
+                        record = sortedByRunStart().get(this.size() - 1);
+                    } else if (action.equals(MultipleCollectionsAction.ERROR)) {
+                        throw new RuntimeException("Multiple ConditionsRecord object found for conditions key " + key + ".");
+                    }
+                }
+            }
+            return record;
+        }
     }
 
     /**

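The findUniqueRecord() added above resolves multiple records for one key by ordering them on a chosen field and taking the last element. A standalone sketch of that selection rule over plain timestamps (types and field names are hypothetical, not the HPS API; unlike the hunk, the sketch selects from the key-matched subset rather than indexing the full sorted collection):

```java
import java.util.Comparator;
import java.util.List;

// Toy version of the disambiguation in findUniqueRecord(): given several
// records matching one key, pick one according to the requested action.
class DemoDisambiguator {
    public enum Action { LAST_UPDATED, LAST_CREATED, ERROR }

    public static class Rec {
        final long updated;
        final long created;
        Rec(long updated, long created) { this.updated = updated; this.created = created; }
    }

    public static Rec findUnique(List<Rec> matches, Action action) {
        if (matches.isEmpty()) {
            return null;                  // no record for this key
        }
        if (matches.size() == 1) {
            return matches.get(0);        // already unique
        }
        switch (action) {
            case LAST_UPDATED:
                // Latest update timestamp wins.
                return matches.stream().max(Comparator.comparingLong(r -> r.updated)).get();
            case LAST_CREATED:
                // Latest creation timestamp wins.
                return matches.stream().max(Comparator.comparingLong(r -> r.created)).get();
            default:
                // ERROR action: refuse to disambiguate.
                throw new RuntimeException("Multiple records found for key.");
        }
    }
}
```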
Modified: java/trunk/conditions/src/main/java/org/hps/conditions/cli/AddCommand.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/cli/AddCommand.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/cli/AddCommand.java	Wed Feb 24 13:06:58 2016
@@ -32,16 +32,13 @@
      */
     private static final Options OPTIONS = new Options();
     static {
-        OPTIONS.addOption(new Option("h", false, "print help for add command"));
-        OPTIONS.addOption("r", true, "starting run number (required)");
-        OPTIONS.getOption("r").setRequired(true);
-        OPTIONS.addOption("e", true, "ending run number (default is starting run number)");
-        OPTIONS.addOption("t", true, "table name (required)");
-        OPTIONS.getOption("t").setRequired(true);
-        OPTIONS.addOption("c", true, "collection ID (required)");
-        OPTIONS.getOption("c").setRequired(true);
-        OPTIONS.addOption("u", true, "user name (optional)");
-        OPTIONS.addOption("m", true, "notes about this conditions set (optional)");
+        OPTIONS.addOption(new Option("h", "help", false, "print help for add command"));
+        OPTIONS.addOption("r", "run-start", true, "starting run number (required)");
+        OPTIONS.addOption("e", "run-end", true, "ending run number (default is starting run number)");
+        OPTIONS.addOption("t", "table", true, "table name (required)");
+        OPTIONS.addOption("c", "collection", true, "collection ID (required)");
+        OPTIONS.addOption("u", "user", true, "user name (optional)");
+        OPTIONS.addOption("m", "notes", true, "notes about this conditions set (optional)");
     }
 
     /**

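The hunk above switches to the Apache Commons CLI constructor that registers a long name alongside each short option, so `-r` and `--run-start` refer to the same definition. As a toy illustration of that short/long alias resolution, without the commons-cli dependency (this is not commons-cli itself):

```java
import java.util.HashMap;
import java.util.Map;

// Toy option table: a short flag and its long alias map to one definition,
// mirroring the Option(short, long, hasArg, description) form in the hunk.
class DemoOptionTable {
    private final Map<String, String> byName = new HashMap<>();

    public void addOption(String shortName, String longName, String description) {
        // Both spellings resolve to the same description.
        byName.put("-" + shortName, description);
        byName.put("--" + longName, description);
    }

    public String describe(String token) {
        return byName.get(token); // null if the token is not a known option
    }
}
```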
Modified: java/trunk/conditions/src/main/java/org/hps/conditions/cli/CommandLineTool.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/cli/CommandLineTool.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/cli/CommandLineTool.java	Wed Feb 24 13:06:58 2016
@@ -34,13 +34,12 @@
     private static Options OPTIONS = new Options();
 
     static {
-        OPTIONS.addOption(new Option("h", false, "print help"));
-        OPTIONS.addOption(new Option("d", true, "detector name"));
-        OPTIONS.addOption(new Option("r", true, "run number"));
-        OPTIONS.addOption(new Option("p", true, "database connection properties file"));
-        OPTIONS.addOption(new Option("x", true, "conditions XML configuration file"));
-        OPTIONS.addOption(new Option("t", true, "conditions tag to use for filtering records"));
-        OPTIONS.addOption(new Option("l", true, "log level of the conditions manager (INFO, FINE, etc.)"));
+        OPTIONS.addOption(new Option("h", "help", false, "print help"));
+        OPTIONS.addOption(new Option("d", "detector", true, "detector name"));
+        OPTIONS.addOption(new Option("r", "run", true, "run number"));
+        OPTIONS.addOption(new Option("p", "connection", true, "database connection properties file"));
+        OPTIONS.addOption(new Option("x", "xml", true, "conditions XML configuration file"));
+        OPTIONS.addOption(new Option("t", "tag", true, "conditions tag to use for filtering records"));
     }
 
     /**
@@ -177,13 +176,6 @@
         // Create new manager.
         this.conditionsManager = DatabaseConditionsManager.getInstance();
 
-        // Set the conditions manager log level (does not affect logger of this class or sub-commands).
-        if (commandLine.hasOption("l")) {
-            final Level newLevel = Level.parse(commandLine.getOptionValue("l"));
-            Logger.getLogger(DatabaseConditionsManager.class.getPackage().getName()).setLevel(newLevel);
-            LOGGER.config("conditions manager log level will be set to " + newLevel.toString());
-        }
-
         // Connection properties.
         if (commandLine.hasOption("p")) {
             final File connectionPropertiesFile = new File(commandLine.getOptionValue("p"));

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/cli/LoadCommand.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/cli/LoadCommand.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/cli/LoadCommand.java	Wed Feb 24 13:06:58 2016
@@ -33,12 +33,10 @@
      */
     private static final Options OPTIONS = new Options();
     static {
-        OPTIONS.addOption(new Option("h", false, "print help for load command"));
-        OPTIONS.addOption(new Option("t", true, "name of the target table (required)"));
-        OPTIONS.getOption("t").setRequired(true);
-        OPTIONS.addOption(new Option("f", true, "input data file path (required)"));
-        OPTIONS.getOption("f").setRequired(true);
-        OPTIONS.addOption(new Option("d", true, "description for the collection log"));
+        OPTIONS.addOption(new Option("h", "help", false, "print help for load command"));
+        OPTIONS.addOption(new Option("t", "table", true, "name of the target table (required)"));
+        OPTIONS.addOption(new Option("f", "file", true, "input data file path (required)"));
+        OPTIONS.addOption(new Option("d", "description", true, "description for the collection log"));
     }
 
     /**

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/cli/PrintCommand.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/cli/PrintCommand.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/cli/PrintCommand.java	Wed Feb 24 13:06:58 2016
@@ -38,12 +38,12 @@
     static Options options = new Options();
 
     static {
-        options.addOption(new Option("h", false, "print help for print command"));
-        options.addOption(new Option("t", true, "table name"));
-        options.addOption(new Option("i", false, "print the ID for the records (off by default)"));
-        options.addOption(new Option("f", true, "write print output to a file (must be used with -t option)"));
-        options.addOption(new Option("H", false, "suppress printing of conditions record and table info"));
-        options.addOption(new Option("d", false, "use tabs for field delimiter instead of spaces"));
+        options.addOption(new Option("h", "help", false, "print help for print command"));
+        options.addOption(new Option("t", "table", true, "table name"));
+        options.addOption(new Option("i", "print-id", false, "print the ID for the records (off by default)"));
+        options.addOption(new Option("f", "file", true, "write print output to a file (must be used with -t option)"));
+        options.addOption(new Option("H", "no-header", false, "suppress printing of conditions record and table info"));
+        options.addOption(new Option("d", "tabs", false, "use tabs for field delimiter instead of spaces"));
     }
 
     /**

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/cli/RunSummaryCommand.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/cli/RunSummaryCommand.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/cli/RunSummaryCommand.java	Wed Feb 24 13:06:58 2016
@@ -35,8 +35,8 @@
      */
     static Options options = new Options();
     static {
-        options.addOption(new Option("h", false, "Show help for run-summary command"));
-        options.addOption(new Option("a", false, "Print all collections found for the run"));
+        options.addOption(new Option("h", "help", false, "Show help for run-summary command"));
+        options.addOption(new Option("a", "all", false, "Print all collections found for the run"));
     }
 
     /**

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/cli/TagCommand.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/cli/TagCommand.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/cli/TagCommand.java	Wed Feb 24 13:06:58 2016
@@ -1,10 +1,9 @@
 package org.hps.conditions.cli;
 
-import java.sql.Connection;
-import java.sql.PreparedStatement;
-import java.sql.ResultSet;
 import java.sql.SQLException;
-import java.util.logging.Level;
+import java.util.HashSet;
+import java.util.Set;
+import java.util.TreeSet;
 import java.util.logging.Logger;
 
 import org.apache.commons.cli.CommandLine;
@@ -16,17 +15,16 @@
 import org.hps.conditions.api.ConditionsTag;
 import org.hps.conditions.api.ConditionsTag.ConditionsTagCollection;
 import org.hps.conditions.api.DatabaseObjectException;
-import org.hps.conditions.api.TableMetaData;
 import org.hps.conditions.api.TableRegistry;
+import org.hps.conditions.database.DatabaseConditionsManager;
 import org.hps.conditions.database.MultipleCollectionsAction;
+import org.lcsim.conditions.ConditionsManager.ConditionsNotFoundException;
 
 /**
  * Create a conditions system tag.
  * <p>
  * The tag groups together conditions records from the <i>conditions</i> database table with a run validity range that 
  * is between a specified starting and ending run.
- * <p>
- * Tagging will not disambiguate overlapping conditions, which is done at run-time based on the current run number.
  *
  * @author Jeremy McCormick, SLAC
  */
@@ -41,20 +39,29 @@
      * Defines command options.
      */
     private static Options OPTIONS = new Options();
+    
+    private MultipleCollectionsAction multipleCollectionsAction = MultipleCollectionsAction.LAST_CREATED; 
+    
+    private static String getMultipleCollectionsActionString() {
+        StringBuffer sb = new StringBuffer();
+        for (MultipleCollectionsAction action : MultipleCollectionsAction.values()) {
+            sb.append(action.name() + " ");
+        }
+        sb.setLength(sb.length() - 1);
+        return sb.toString();
+    }
 
     /**
      * Define all command options.
      */
     static {
-        OPTIONS.addOption(new Option("h", false, "Show help for tag command"));
-        OPTIONS.addOption(new Option("t", true, "Conditions tag name"));
-        OPTIONS.addOption(new Option("s", true, "Starting run number (required)"));
-        OPTIONS.getOption("s").setRequired(true);
-        OPTIONS.addOption(new Option("e", true, "Ending run number (default is unlimited)"));
-        OPTIONS.getOption("t").setRequired(true);
-        OPTIONS.addOption(new Option("m", true,
-                "MultipleCollectionsAction to use for disambiguation (default is LAST_CREATED)"));
-        OPTIONS.addOption(new Option("d", false, "Don't prompt before making tag (be careful!)"));
+        OPTIONS.addOption(new Option("h", "help", false, "Show help for tag command"));
+        OPTIONS.addOption(new Option("t", "tag", true, "Conditions tag name"));
+        OPTIONS.addOption(new Option("s", "run-start", true, "Starting run number (required)"));
+        OPTIONS.addOption(new Option("e", "run-end", true, "Ending run number (required)"));
+        OPTIONS.addOption(new Option("m", "multiple", true, 
+                "set run overlap handling (" + getMultipleCollectionsActionString() + ")"));
+        OPTIONS.addOption(new Option("d", false, "Don't prompt before making tag (careful!)"));
     }
 
     /**
@@ -103,6 +110,11 @@
         } else {
             throw new RuntimeException("Missing required -t argument with the tag name.");
         }
+        
+        // Check if tag exists already.
+        if (getManager().getAvailableTags().contains(tagName)) {
+            throw new RuntimeException("The tag '" + tagName + "' already exists in the database.");
+        }
 
         // Starting run number (required).
         int runStart = -1;
@@ -110,19 +122,16 @@
             runStart = Integer.parseInt(commandLine.getOptionValue("s"));
             LOGGER.config("run start set to " + runStart);
         } else {
-            throw new RuntimeException("missing require -s argument with starting run number");
-        }
-
-        // Ending run number (max integer is default).
-        int runEnd = Integer.MAX_VALUE;
+            throw new RuntimeException("Missing required -s argument with starting run number.");
+        }
+
+        // Ending run number (required).
+        int runEnd = -1;
         if (commandLine.hasOption("e")) {
             runEnd = Integer.parseInt(commandLine.getOptionValue("e"));
             LOGGER.config("run end set to " + runEnd);
-        }
-
-        // Run end must be greater than or equal to run start.
-        if (runEnd < runStart) {
-            throw new IllegalArgumentException("runEnd < runStart");
+        } else {
+            throw new RuntimeException("Missing required -e argument with ending run number.");
         }
 
         // Action for disambiguating overlapping collections (default is to use the most recent creation date).
@@ -131,20 +140,24 @@
             multipleCollectionsAction = MultipleCollectionsAction
                     .valueOf(commandLine.getOptionValue("m").toUpperCase());
         }
-        LOGGER.config("multiple collections action set tco " + multipleCollectionsAction);
+        LOGGER.config("run overlaps will be disambiguated using " + multipleCollectionsAction);
 
         // Whether to prompt before tagging (default is yes).
         boolean promptBeforeTagging = true;
         if (commandLine.hasOption("d")) {
             promptBeforeTagging = false;
         }
-        LOGGER.config("prompt before tagging: " + promptBeforeTagging);
+        LOGGER.config("prompt before tagging = " + promptBeforeTagging);
 
         // Conditions system configuration.
         this.getManager().setXmlConfig("/org/hps/conditions/config/conditions_database_no_svt.xml");
 
         // Find all the applicable conditions records by their run number ranges.
         ConditionsRecordCollection tagConditionsRecordCollection = this.findConditionsRecords(runStart, runEnd);
+        
+        if (tagConditionsRecordCollection.size() == 0) {
+            throw new RuntimeException("No records found for tag.");
+        }
 
         LOGGER.info("found " + tagConditionsRecordCollection.size() + " conditions records for the tag");
 
@@ -152,8 +165,8 @@
         final ConditionsTagCollection conditionsTagCollection = this.createConditionsTagCollection(
                 tagConditionsRecordCollection, tagName);
 
-        LOGGER.info("created " + conditionsTagCollection.size() + " tag records ..." + '\n' + conditionsTagCollection);
-
+        printConditionsRecords(tagConditionsRecordCollection);
+        
         // Prompt user to verify tag creation.
         boolean createTag = true;
         if (promptBeforeTagging) {
@@ -178,68 +191,85 @@
 
         LOGGER.info("done!");
     }
-   
-    /**
-     * Find all the conditions records that are applicable for the given run range.
-     * <p>
-     * Overlapping run numbers in conditions with the same key are not disambiguated.
-     * This must be done in the user's job at runtime; usually the most recently created 
-     * conditions record will be used if multiple one's are applicable to the current run.
-     *
-     * @param runStart the start run
-     * @param runEnd the end run (must be greater than or equal to <code>runStart</code>)
-     * @return the conditions records that fall in the run range
+    
+    /**
+     * Print information about conditions records in the tag to the log.
+     * 
+     * @param records the collection of conditions records to print
+     */
+    private void printConditionsRecords(ConditionsRecordCollection records) {
+        StringBuffer sb = new StringBuffer();
+        Set<String> keys = new TreeSet<String>(records.getConditionsKeys());
+        for (String key : keys) {
+            ConditionsRecordCollection keyRecords = records.findByKey(key);
+            keyRecords.sortByKey();
+            for (ConditionsRecord record : keyRecords) {
+                sb.append("conditions_id: " + record.getRowId() + ", name: " + record.getName() + ", collection_id: "
+                        + record.getCollectionId() + ", run_start: " + record.getRunStart() 
+                        + ", run_end: " + record.getRunEnd() + ", notes: " + record.getNotes() + '\n');
+                
+            }
+        }        
+        LOGGER.info("including " + records.size() + " records in tag ..." + '\n' + sb.toString());
+    }
+     
+    /**
+     * Scan through a run range to find conditions records for the tag.
+     * 
+     * @param runStart the starting run number
+     * @param runEnd the ending run number
+     * @return the conditions records for the tag
      */
     private ConditionsRecordCollection findConditionsRecords(final int runStart, final int runEnd) {
-        if (runStart > runEnd) {
-            throw new IllegalArgumentException("runStart > runEnd");
-        }
-        if (runStart < 0) {
-            throw new IllegalArgumentException("invalid runStart: " + runStart);
-        }
-        if (runEnd < 0) {
-            throw new IllegalArgumentException("invalid runEnd: " + runEnd);
-        }
-        final Connection connection = this.getManager().getConnection();
-        final ConditionsRecordCollection conditionsRecordCollection = new ConditionsRecordCollection();
-        final TableMetaData tableMetaData = TableRegistry.getTableRegistry().findByTableName("conditions");
-        PreparedStatement statement = null;
-        try {
-            /*
-             * SQL statement handles 3 cases: 
-             * 1) condition's run_start in range 
-             * 2) condition's run_end in range 
-             * 3) condition's run_start and run_end enclose the range
-             */
-            statement = connection
-                    .prepareStatement("SELECT id FROM conditions WHERE (run_start >= ? and run_start <= ?) or (run_end >= ? and run_end <= ?)"
-                            + " or (run_start <= ? and run_end >= ?)");
-            statement.setInt(1, runStart);
-            statement.setInt(2, runEnd);
-            statement.setInt(3, runStart);
-            statement.setInt(4, runEnd);
-            statement.setInt(5, runStart);
-            statement.setInt(6, runEnd);
-
-            final ResultSet resultSet = statement.executeQuery();
-            while (resultSet.next()) {
-                final ConditionsRecord record = new ConditionsRecord();
-                record.setConnection(connection);
-                record.setTableMetaData(tableMetaData);
-                record.select(resultSet.getInt(1));
-                conditionsRecordCollection.add(record);
-            }
-        } catch (DatabaseObjectException | ConditionsObjectException | SQLException e) {
-            throw new RuntimeException(e);
-        } finally {
+        if (runStart < 0) {
+            throw new IllegalArgumentException("The run start " + runStart + " is invalid.");
+        }
+        if (runEnd < 0) {
+            throw new IllegalArgumentException("The run end " + runEnd + " is invalid.");
+        }
+        if (runStart > runEnd) {
+            throw new IllegalArgumentException("The run start is greater than the run end.");
+        }
+        DatabaseConditionsManager dbManager = this.getManager();
+        if (dbManager.isFrozen()) {
+            dbManager.unfreeze();
+        }
+        if (!dbManager.getActiveTags().isEmpty()) {
+            dbManager.clearTags();
+        }
+        final String detectorName = "HPS-dummy-detector";
+        ConditionsRecordCollection tagRecords = new ConditionsRecordCollection();
+        Set<Integer> ids = new HashSet<Integer>();
+        for (int run = runStart; run <= runEnd; run++) {
+            LOGGER.info("loading run " + run);
             try {
-                if (statement != null) {
-                    statement.close();
+                dbManager.setDetector(detectorName, run);
+            } catch (ConditionsNotFoundException e) {
+                throw new RuntimeException(e);
+            }
+            ConditionsRecordCollection runRecords = dbManager.getConditionsRecords();
+            Set<String> keys = runRecords.getConditionsKeys();
+            LOGGER.fine("run has " + runRecords.size() + " conditions records");
+            for (String key : keys) {
+                ConditionsRecord record = runRecords.findUniqueRecord(key, this.multipleCollectionsAction);
+                if (record == null) {
+                    throw new RuntimeException("Missing expected unique conditions record for " + key + ".");
                 }
-            } catch (final SQLException e) {
-                e.printStackTrace();
-            }
-        }
-        return conditionsRecordCollection;
+                if (!ids.contains(record.getRowId())) {
+                    try {
+                        LOGGER.fine("adding conditions to tag ..." + '\n' + record.toString());
+                        tagRecords.add(record);
+                        ids.add(record.getRowId());
+                    } catch (ConditionsObjectException e) {
+                        throw new RuntimeException(e);
+                    }
+                } else {
+                    LOGGER.fine("Conditions record with row id " + record.getRowId() + " is already in the tag.");
+                }
+            }
+            LOGGER.info("done processing run " + run);
+        }
+        LOGGER.info("Found " + tagRecords.size() + " conditions records for tag.");
+        return tagRecords;
     }
 }
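The rewritten findConditionsRecords() walks the run range and guards against inserting the same record twice with a `Set<Integer>` of row ids, since consecutive runs usually resolve to the same collections. A standalone sketch of that deduplication loop — the `TagScanSketch` and `Record` names are hypothetical simplifications of `ConditionsRecord`, not the committed classes:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class TagScanSketch {

    // Hypothetical stand-in for ConditionsRecord: just a row id and a key.
    static class Record {
        final int rowId;
        final String key;
        Record(int rowId, String key) { this.rowId = rowId; this.key = key; }
    }

    // Collect each applicable record once, even when many runs resolve to it,
    // mirroring the Set<Integer> ids guard in findConditionsRecords.
    static List<Record> collectUnique(List<List<Record>> recordsPerRun) {
        Set<Integer> seenIds = new HashSet<>();
        List<Record> tagRecords = new ArrayList<>();
        for (List<Record> runRecords : recordsPerRun) {
            for (Record r : runRecords) {
                if (seenIds.add(r.rowId)) { // add() returns false if already present
                    tagRecords.add(r);
                }
            }
        }
        return tagRecords;
    }

    public static void main(String[] args) {
        Record a = new Record(1, "ecal_gains");
        Record b = new Record(2, "svt_gains");
        // Record "a" appears in both runs but enters the tag only once.
        List<Record> out = collectUnique(List.of(List.of(a), List.of(a, b)));
        System.out.println(out.size()); // → 2
    }
}
```

The trade-off versus the old single SQL query is one detector setup per run, but the scan reuses the same disambiguation (`MultipleCollectionsAction`) that jobs apply at run time, so the tag matches what a job would actually load.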

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/database/ConditionsRecordConverter.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/database/ConditionsRecordConverter.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/database/ConditionsRecordConverter.java	Wed Feb 24 13:06:58 2016
@@ -3,7 +3,6 @@
 import java.sql.ResultSet;
 import java.sql.SQLException;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 import org.hps.conditions.api.ConditionsObject;
 import org.hps.conditions.api.ConditionsObjectCollection;
 import org.hps.conditions.api.ConditionsObjectException;

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/database/ConditionsTagConverter.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/database/ConditionsTagConverter.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/database/ConditionsTagConverter.java	Wed Feb 24 13:06:58 2016
@@ -5,7 +5,6 @@
 import java.sql.ResultSet;
 import java.sql.SQLException;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 import org.hps.conditions.api.ConditionsObjectException;
 import org.hps.conditions.api.ConditionsTag;
 import org.hps.conditions.api.ConditionsTag.ConditionsTagCollection;

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/database/Converter.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/database/Converter.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/database/Converter.java	Wed Feb 24 13:06:58 2016
@@ -4,8 +4,6 @@
 import java.lang.annotation.Retention;
 import java.lang.annotation.RetentionPolicy;
 import java.lang.annotation.Target;
-
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 
 /**
  * This is an annotation for providing converter configuration for {@link org.hps.conditions.api.ConditionsObject}

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/database/ConverterRegistry.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/database/ConverterRegistry.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/database/ConverterRegistry.java	Wed Feb 24 13:06:58 2016
@@ -6,7 +6,6 @@
 
 import javassist.Modifier;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 import org.hps.conditions.api.BaseConditionsObjectCollection;
 import org.hps.conditions.api.ConditionsObject;
 import org.hps.conditions.api.TableRegistry;

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/dummy/DummyConditionsObjectConverter.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/dummy/DummyConditionsObjectConverter.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/dummy/DummyConditionsObjectConverter.java	Wed Feb 24 13:06:58 2016
@@ -1,6 +1,6 @@
 package org.hps.conditions.dummy;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
+import org.hps.conditions.database.AbstractConditionsObjectConverter;
 import org.hps.conditions.dummy.DummyConditionsObject.DummyConditionsObjectCollection;
 
 /**

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/ecal/EcalChannel.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/ecal/EcalChannel.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/ecal/EcalChannel.java	Wed Feb 24 13:06:58 2016
@@ -4,12 +4,12 @@
 import java.util.HashMap;
 import java.util.Map;
 
-import org.hps.conditions.api.AbstractConditionsObjectConverter;
 import org.hps.conditions.api.AbstractIdentifier;
 import org.hps.conditions.api.BaseConditionsObject;
 import org.hps.conditions.api.BaseConditionsObjectCollection;
 import org.hps.conditions.api.ConditionsObjectCollection;
 import org.hps.conditions.api.ConditionsObjectException;
+import org.hps.conditions.database.AbstractConditionsObjectConverter;
 import org.hps.conditions.database.Converter;
 import org.hps.conditions.database.DatabaseConditionsManager;
 import org.hps.conditions.database.Field;
@@ -290,11 +290,16 @@
         public EcalChannelCollection getData(final ConditionsManager conditionsManager, final String name) {
             final EcalChannelCollection collection = super.getData(conditionsManager, name);
             final Subdetector ecal = DatabaseConditionsManager.getInstance().getEcalSubdetector();
-            if (ecal.getDetectorElement() != null) {
-                collection.buildGeometryMap(ecal.getDetectorElement().getIdentifierHelper(), ecal.getSystemID());
+            if (ecal != null) {
+                if (ecal.getDetectorElement() != null) {
+                    collection.buildGeometryMap(ecal.getDetectorElement().getIdentifierHelper(), ecal.getSystemID());
+                } else {
+                    // This can happen when not running with the detector-framework jar in the classpath.
+                    throw new IllegalStateException("The ECal subdetector's detector element is not setup.");
+                }
             } else {
-                // This can happen when not running with the detector-framework jar in the classpath.
-                throw new IllegalStateException("The ECal subdetector's detector element is not setup.");
+                // Bad detector or conditions system not initialized properly.
+                throw new IllegalStateException("The ECal subdetector object is null.");
             }
             return collection;
         }
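The reworked converter distinguishes two failure modes (null subdetector vs. missing detector element) with separate messages. The same ordering can also be written as flat fail-fast guards; a stdlib-only sketch, where the `NullGuards` class and `checkEcal` method are hypothetical, not the committed code:

```java
public class NullGuards {

    // Check the outer object before the inner one, so the message names the
    // exact missing piece, as in the nested version in EcalChannel above.
    static String checkEcal(Object ecal, Object detectorElement) {
        if (ecal == null) {
            throw new IllegalStateException("The ECal subdetector object is null.");
        }
        if (detectorElement == null) {
            throw new IllegalStateException("The ECal subdetector's detector element is not setup.");
        }
        return "ECal geometry available";
    }

    public static void main(String[] args) {
        System.out.println(checkEcal(new Object(), new Object()));
        try {
            checkEcal(null, null);
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```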

Modified: java/trunk/conditions/src/main/java/org/hps/conditions/run/RunSpreadsheet.java
 =============================================================================
--- java/trunk/conditions/src/main/java/org/hps/conditions/run/RunSpreadsheet.java	(original)
+++ java/trunk/conditions/src/main/java/org/hps/conditions/run/RunSpreadsheet.java	Wed Feb 24 13:06:58 2016
@@ -25,17 +25,44 @@
  * The rows are accessible as raw CSV data through the Apache Commons CSV library, and this data must be manually cleaned up and converted 
  * to the correct data type before being inserted into the conditions database.
  *
- * @author Jeremy McCormick
+ * @author Jeremy McCormick, SLAC
  */
 public final class RunSpreadsheet {
 
     /**
      * The column headers.
      */
-    private static String[] HEADERS = {"run", "date", "start_time", "end_time", "to_tape", "n_events", "trigger_rate", "target", "beam_current",
-        "beam_x", "beam_y", "trigger_config", "ecal_fadc_mode", "ecal_fadc_thresh", "ecal_fadc_window", "ecal_cluster_thresh_seed", "ecal_cluster_thresh_cluster",
-        "ecal_cluster_window_hits", "ecal_cluster_window_pairs", "ecal_scalers_fadc", "ecal_scalers_dsc", "svt_y_position", "svt_offset_phase", "svt_offset_time",
-        "ecal_temp", "ecal_lv_current", "notes"};
+    private static String[] HEADERS = {
+        "run", 
+        "date", 
+        "start_time", 
+        "end_time", 
+        "to_tape", 
+        "events",
+        "files",
+        "trigger_rate", 
+        "target", 
+        "beam_current",
+        "beam_x", 
+        "beam_y",
+        "trigger_config", 
+        /* Next 7 are actually hidden in the spreadsheet! */
+        "ecal_fadc_mode",
+        "ecal_fadc_thresh", 
+        "ecal_fadc_window", 
+        "ecal_cluster_thresh_seed", 
+        "ecal_cluster_thresh_cluster",
+        "ecal_cluster_window_hits", 
+        "ecal_cluster_window_pairs", 
+        /* End hidden fields. */
+        "ecal_scalers_fadc", 
+        "ecal_scalers_dsc", 
+        "svt_y_position", 
+        "svt_offset_phase", 
+        "svt_offset_time",
+        "ecal_temp", 
+        "ecal_lv_current", 
+        "notes"};
 
     /**
      * Read the CSV file from the command line and print the data to the terminal (just a basic test).
@@ -72,11 +99,14 @@
      * @param file the CSV file
      */
     public RunSpreadsheet(final File file) {
+        if (file == null) {
+            throw new IllegalArgumentException("The file argument is null.");
+        }
         this.file = file;
         try {
             this.fromCsv(this.file);
         } catch (final Exception e) {
-            throw new RuntimeException();
+            throw new RuntimeException("Failed to parse run spreadsheet.", e);
         }
     }
 
@@ -133,14 +163,15 @@
         return records;
     }
     
-    public static final AnotherSimpleDateFormat DATE_FORMAT = new AnotherSimpleDateFormat("MM/dd/yyyy H:mm"); 
+    public static final RunSpreadsheetDateFormat DATE_FORMAT = new RunSpreadsheetDateFormat("MM/dd/yyyy H:mm"); 
     private static final TimeZone TIME_ZONE =  TimeZone.getTimeZone("EST");
     
     
     @SuppressWarnings("serial")
     public
-    static class AnotherSimpleDateFormat extends SimpleDateFormat {
-        public AnotherSimpleDateFormat(String formatstring) {
+    static class RunSpreadsheetDateFormat extends SimpleDateFormat {
+        
+        public RunSpreadsheetDateFormat(String formatstring) {
             super(formatstring);
             //Calendar c = Calendar.getInstance(TIME_ZONE,Locale.US);
             //setTimeZone(TIME_ZONE);
@@ -236,7 +267,7 @@
                 try {
                     addRunData(new RunData(record));
                 } catch (NumberFormatException e) {
-                    e.printStackTrace();
+                    //e.printStackTrace();
                 }
             }
         }
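RunSpreadsheetDateFormat keeps its setTimeZone call commented out, so parsing falls back to the JVM's default zone even though an EST TimeZone constant is defined. A stdlib sketch of the same pattern with the zone pinned — pinning it is an assumption for reproducibility, not what the committed code does:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class SpreadsheetDates {
    public static void main(String[] args) throws ParseException {
        // Same pattern string as RunSpreadsheet.DATE_FORMAT.
        SimpleDateFormat fmt = new SimpleDateFormat("MM/dd/yyyy H:mm");
        // Pin the zone so the parsed epoch does not depend on the host JVM;
        // the committed class leaves this call commented out.
        fmt.setTimeZone(TimeZone.getTimeZone("EST"));
        Date d = fmt.parse("02/24/2016 13:06");
        // Formatting with the same instance round-trips the text.
        System.out.println(fmt.format(d));
    }
}
```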

Modified: java/trunk/crawler/pom.xml
 =============================================================================
--- java/trunk/crawler/pom.xml	(original)
+++ java/trunk/crawler/pom.xml	Wed Feb 24 13:06:58 2016
@@ -17,7 +17,11 @@
     <dependencies>
         <dependency>
             <groupId>org.hps</groupId>
-            <artifactId>hps-run-database</artifactId>
+            <artifactId>hps-record-util</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>srs</groupId>
+            <artifactId>org-srs-datacat-client</artifactId>
         </dependency>
     </dependencies>
 </project>

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/AidaMetadataReader.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/AidaMetadataReader.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/AidaMetadataReader.java	Wed Feb 24 13:06:58 2016
@@ -12,7 +12,7 @@
  *
  * @author Jeremy McCormick, SLAC
  */
-public class AidaMetadataReader implements FileMetadataReader {
+final class AidaMetadataReader implements FileMetadataReader {
 
     /**
      * Get the metadata for a ROOT DQM file.
@@ -22,7 +22,7 @@
     @Override
     public Map<String, Object> getMetadata(final File file) throws IOException {
         final Map<String, Object> metadata = new HashMap<String, Object>();
-        final int run = CrawlerFileUtilities.getRunFromFileName(file);
+        final Long run = FileUtilities.getRunFromFileName(file);
         metadata.put("runMin", run);
         metadata.put("runMax", run);
         return metadata;

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java	Wed Feb 24 13:06:58 2016
@@ -11,13 +11,9 @@
 import java.util.Set;
 
 import org.hps.conditions.database.ConnectionParameters;
-import org.hps.datacat.client.DatasetFileFormat;
-import org.hps.datacat.client.DatasetSite;
 
 /**
  * Full configuration information for the {@link Crawler} class.
- * <p>
- * Method chaining of setters is supported.
  *
  * @author Jeremy McCormick, SLAC
  */
@@ -41,20 +37,13 @@
 
     /**
      * The name of the folder in the data catalog for inserting data (under "/HPS" root folder).
-     * <p>
-     * Default provided for Eng Run 2015 data.
      */
     private String datacatFolder = null;
 
     /**
-     * Set whether extraction of metadata from files is enabled.
-     */
-    private boolean enableMetadata;
-
-    /**
-     * Set of file formats for filtering files.
-     */
-    Set<DatasetFileFormat> formats = new HashSet<DatasetFileFormat>();
+     * Set of accepted file formats.
+     */
+    private Set<FileFormat> formats = new HashSet<FileFormat>();
 
     /**
      * The maximum depth to crawl.
@@ -69,7 +58,7 @@
     /**
      * The dataset site for the datacat.
      */
-    private DatasetSite site;
+    private Site site = Site.JLAB;
 
     /**
      * A timestamp to use for filtering input files on their creation date.
@@ -80,6 +69,21 @@
      * A file to use for getting the timestamp date.
      */
     private File timestampFile = null;
+    
+    /**
+     * Dry run for not actually executing updates.
+     */
+    private boolean dryRun = false;
+    
+    /**
+     * Base URL of datacat client.
+     */
+    private String baseUrl = DatacatHelper.DATACAT_URL;
+        
+    /**
+     * Set of paths used for filtering files (file's path must match one of these).
+     */
+    private Set<String> paths = new HashSet<String>();
 
     /**
      * Get the set of runs that will be accepted for the job.
@@ -94,7 +98,7 @@
      * Add the default file formats.
      */
     CrawlerConfig addDefaultFileFormats() {
-        final List<DatasetFileFormat> defaultFormats = Arrays.asList(DatasetFileFormat.values());
+        final List<FileFormat> defaultFormats = Arrays.asList(FileFormat.values());
         this.formats.addAll(defaultFormats);
         return this;
     }
@@ -104,9 +108,8 @@
      *
      * @param format the file format
      */
-    CrawlerConfig addFileFormat(final DatasetFileFormat format) {
+    void addFileFormat(final FileFormat format) {
         this.formats.add(format);
-        return this;
     }
 
     /**
@@ -123,7 +126,7 @@
      *
      * @return the data catalog folder
      */
-    String datacatFolder() {
+    String folder() {
         return this.datacatFolder;
     }
 
@@ -132,25 +135,16 @@
      *
      * @return the dataset site
      */
-    DatasetSite datasetSite() {
+    Site site() {
         return this.site;
     }
 
     /**
-     * Return <code>true</code> if metadata extraction from files is enabled.
-     *
-     * @return <code>true</code> if metadata extraction is enabled
-     */
-    boolean enableMetaData() {
-        return this.enableMetadata;
-    }
-
-    /**
      * Get the file formats for filtering.
      *
      * @return the file formats for filtering
      */
-    Set<DatasetFileFormat> getFileFormats() {
+    Set<FileFormat> getFileFormats() {
         return this.formats;
     }
 
@@ -164,7 +158,7 @@
     }
 
     /**
-     * Get the root directory for the file search.
+     * Get the root directory in the file catalog.
      *
      * @return the root directory for the file search
      */
@@ -178,9 +172,8 @@
      * @param acceptRuns the list of acceptable run numbers
      * @return this object
      */
-    CrawlerConfig setAcceptRuns(final Set<Integer> acceptRuns) {
+    void setAcceptRuns(final Set<Integer> acceptRuns) {
         this.acceptRuns = acceptRuns;
-        return this;
     }
 
     /**
@@ -189,9 +182,8 @@
      * @param connectionParameters the database connection parameters
      * @return this object
      */
-    CrawlerConfig setConnection(final ConnectionParameters connectionParameters) {
+    void setConnection(final ConnectionParameters connectionParameters) {
         this.connectionParameters = connectionParameters;
-        return this;
     }
 
     /**
@@ -199,9 +191,8 @@
      *
      * @param datacatFolder the data catalog folder
      */
-    CrawlerConfig setDatacatFolder(final String datacatFolder) {
+    void setDatacatFolder(final String datacatFolder) {
         this.datacatFolder = datacatFolder;
-        return this;
     }
 
     /**
@@ -209,29 +200,28 @@
      *
      * @return this object
      */
-    void setDatasetSite(final DatasetSite site) {
+    void setSite(final Site site) {
         this.site = site;
     }
-
-    /**
-     * Set whether metadata extraction is enabled.
-     *
-     * @param enableMetadata <code>true</code> to enable metadata
-     * @return this object
-     */
-    CrawlerConfig setEnableMetadata(final boolean enableMetadata) {
-        this.enableMetadata = enableMetadata;
-        return this;
-    }
+    
+    /**
+     * Enable dry run.
+     * 
+     * @param dryRun set to <code>true</code> to enable dry run
+     */
+    void setDryRun(boolean dryRun) {
+        this.dryRun = dryRun;
+    }
+    
 
     /**
      * Set the max depth.
      *
      * @param maxDepth the max depth
      */
-    CrawlerConfig setMaxDepth(final Integer maxDepth) {
+    void setMaxDepth(final Integer maxDepth) {
         this.maxDepth = maxDepth;
-        return this;
     }
 
     /**
@@ -240,9 +230,8 @@
      * @param rootDir the root directory for the file search
      * @return this object
      */
-    CrawlerConfig setRootDir(final File rootDir) {
+    void setRootDir(final File rootDir) {
         this.rootDir = rootDir;
-        return this;
     }
 
     /**
@@ -253,9 +242,8 @@
      * @param timestamp the date for filtering files
      * @return this object
      */
-    CrawlerConfig setTimestamp(final Date timestamp) {
+    void setTimestamp(final Date timestamp) {
         this.timestamp = timestamp;
-        return this;
     }
 
     /**
@@ -267,9 +255,8 @@
      * @param timestamp the date string for filtering files
      * @return this object
      */
-    CrawlerConfig setTimestamp(final String timestampString) throws ParseException {
+    void setTimestamp(final String timestampString) throws ParseException {
         this.timestamp = TIMESTAMP_FORMAT.parse(timestampString);
-        return this;
     }
 
     /**
@@ -278,9 +265,8 @@
      * @param timestampFile the timestamp file for date filtering
      * @return this object
      */
-    CrawlerConfig setTimestampFile(final File timestampFile) {
+    void setTimestampFile(final File timestampFile) {
         this.timestampFile = timestampFile;
-        return this;
     }
 
     /**
@@ -302,4 +288,49 @@
     File timestampFile() {
         return timestampFile;
     }
+    
+    /**
+     * Returns <code>true</code> if this is a dry run, meaning no updates will occur.
+     * 
+     * @return <code>true</code> if dry run
+     */
+    boolean dryRun() {
+        return this.dryRun;
+    }
+    
+    /**
+     * Set the data catalog URL.
+     * 
+     * @param baseUrl the data catalog URL
+     */
+    void setDatacatUrl(String baseUrl) {
+        this.baseUrl = baseUrl;        
+    }
+    
+    /**
+     * Get the data catalog URL.
+     * 
+     * @return the data catalog URL
+     */
+    String datacatUrl() {
+        return this.baseUrl;
+    }
+        
+    /**
+     * Add a path for filtering files.
+     * 
+     * @param path the path for filtering
+     */
+    void addPath(String path) {
+        this.paths.add(path);
+    }
+    
+    /**
+     * Get the set of paths for filtering.
+     * 
+     * @return the set of paths for filtering
+     */
+    Set<String> paths() {
+        return this.paths;
+    }
 }
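The `setTimestamp(String)` override above runs the configured date format over the supplied string. A minimal, self-contained sketch of that parsing step, assuming the `"yyyy-MM-dd HH:mm:ss"` pattern implied by the `-b` option's example (`"2015-03-26 11:28:59"`); the class and field names here are hypothetical, not the actual `CrawlerConfig`:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

final class TimestampConfig {

    // Pattern matching the example shown in the -b option help text.
    private static final SimpleDateFormat TIMESTAMP_FORMAT =
            new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    private Date timestamp;

    // Parse the string and keep the result; discarding the return
    // value of parse() would leave the timestamp field unset.
    void setTimestamp(final String timestampString) throws ParseException {
        this.timestamp = TIMESTAMP_FORMAT.parse(timestampString);
    }

    Date timestamp() {
        return this.timestamp;
    }
}
```

The important detail is that `parse()` returns the `Date`; it must be assigned to the field for the later date filtering to see it.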

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java	Wed Feb 24 13:06:58 2016
@@ -1,112 +1,30 @@
 package org.hps.crawler;
 
 import java.io.File;
-import java.io.FileFilter;
 import java.io.IOException;
 import java.nio.file.FileVisitOption;
-import java.nio.file.FileVisitResult;
 import java.nio.file.Files;
-import java.nio.file.Path;
-import java.nio.file.SimpleFileVisitor;
 import java.nio.file.attribute.BasicFileAttributes;
-import java.util.ArrayList;
 import java.util.Date;
 import java.util.EnumSet;
-import java.util.HashMap;
 import java.util.HashSet;
 import java.util.List;
-import java.util.Map;
 import java.util.Set;
-import java.util.logging.Level;
 import java.util.logging.Logger;
 
 import org.apache.commons.cli.CommandLine;
+import org.apache.commons.cli.DefaultParser;
 import org.apache.commons.cli.HelpFormatter;
 import org.apache.commons.cli.Options;
 import org.apache.commons.cli.ParseException;
-import org.apache.commons.cli.DefaultParser;
-import org.hps.datacat.client.DatacatClient;
-import org.hps.datacat.client.DatacatClientFactory;
-import org.hps.datacat.client.DatasetFileFormat;
+import org.srs.datacat.model.DatasetModel;
 
 /**
  * Command line file crawler for populating the data catalog.
  *
  * @author Jeremy McCormick, SLAC
  */
-public class DatacatCrawler {
-
-    /**
-     * Visitor which creates a {@link FileSet} from walking a directory tree.
-     * <p>
-     * Any number of {@link java.io.FileFilter} objects can be registered with this visitor to restrict which files are
-     * accepted.
-     *
-     * @author Jeremy McCormick, SLAC
-     */
-    final class DatacatFileVisitor extends SimpleFileVisitor<Path> {
-
-        /**
-         * The run log containing information about files from each run.
-         */
-        private final FileSet fileSet = new FileSet();
-
-        /**
-         * A list of file filters to apply.
-         */
-        private final List<FileFilter> filters = new ArrayList<FileFilter>();
-
-        /**
-         * Run the filters on the file to tell whether it should be accepted or not.
-         *
-         * @param file the EVIO file
-         * @return <code>true</code> if file should be accepted
-         */
-        private boolean accept(final File file) {
-            boolean accept = true;
-            for (final FileFilter filter : this.filters) {
-                accept = filter.accept(file);
-                if (!accept) {
-                    break;
-                }
-            }
-            return accept;
-        }
-
-        /**
-         * Add a file filter.
-         *
-         * @param filter the file filter
-         */
-        void addFilter(final FileFilter filter) {
-            this.filters.add(filter);
-        }
-
-        /**
-         * Get the file set created by visiting the directory tree.
-         *
-         * @return the file set from visiting the directory tree
-         */
-        FileSet getFileSet() {
-            return this.fileSet;
-        }
-
-        /**
-         * Visit a single file.
-         *
-         * @param path the file to visit
-         * @param attrs the file attributes
-         */
-        @Override
-        public FileVisitResult visitFile(final Path path, final BasicFileAttributes attrs) {
-            final File file = path.toFile();
-            if (this.accept(file)) {
-                final DatasetFileFormat format = DatacatUtilities.getFileFormat(file);
-                fileSet.addFile(format, file);
-            }
-            return FileVisitResult.CONTINUE;
-        }
-    }
+public final class DatacatCrawler {
 
     /**
      * Make a list of available file formats for printing help.
@@ -117,14 +35,14 @@
      * Setup the logger.
      */
     private static final Logger LOGGER = Logger.getLogger(DatacatCrawler.class.getPackage().getName());
-
+    
     /**
      * Command line options for the crawler.
      */
     private static final Options OPTIONS = new Options();
     static {
         final StringBuffer buffer = new StringBuffer();
-        for (final DatasetFileFormat format : DatasetFileFormat.values()) {
+        for (final FileFormat format : FileFormat.values()) {
             buffer.append(format.name() + " ");
         }
         buffer.setLength(buffer.length() - 1);
@@ -135,17 +53,17 @@
      * Statically define the command options.
      */
     static {
-        OPTIONS.addOption("L", "log-level", true, "set the log level (INFO, FINE, etc.)");
         OPTIONS.addOption("b", "min-date", true, "min date for a file (example \"2015-03-26 11:28:59\")");
         OPTIONS.addOption("d", "directory", true, "root directory to crawl");
         OPTIONS.addOption("f", "folder", true, "datacat folder");
         OPTIONS.addOption("h", "help", false, "print help and exit (overrides all other arguments)");
         OPTIONS.addOption("o", "format", true, "add a file format for filtering: " + AVAILABLE_FORMATS);
-        OPTIONS.addOption("m", "metadata", false, "create metadata for datasets");
         OPTIONS.addOption("r", "run", true, "add a run number to accept");
         OPTIONS.addOption("s", "site", true, "datacat site");
         OPTIONS.addOption("t", "timestamp-file", true, "existing or new timestamp file name");
         OPTIONS.addOption("x", "max-depth", true, "max depth to crawl");
+        OPTIONS.addOption("D", "dry-run", false, "dry run which will not update the datacat");
+        OPTIONS.addOption("u", "base-url", true, "provide a base URL of the datacat server");
     }
 
     /**
@@ -166,32 +84,15 @@
      * The options parser.
      */
     private final DefaultParser parser = new DefaultParser();
-
-    /**
-     * Throw an exception if the path doesn't exist in the data catalog or it is not a folder.
-     *
-     * @param folder the folder in the datacat
-     * @throws RuntimeException if the given path does not exist or it is not a folder
-     */
-    void checkFolder(final String folder) {
-        final DatacatClient datacatClient = new DatacatClientFactory().createClient();
-        if (!datacatClient.exists(folder)) {
-            throw new RuntimeException("The folder " + folder + " does not exist in the data catalog.");
-        }
-        if (!datacatClient.isFolder(folder)) {
-            throw new RuntimeException("The path " + folder + " is not a folder.");
-        }
-    }
-
+    
     /**
      * Parse command line options.
      *
      * @param args the command line arguments
      * @return this object (for method chaining)
      */
-    public DatacatCrawler parse(final String[] args) {
-        config = new CrawlerConfig();
-
+    private DatacatCrawler parse(final String[] args) {
+        
         LOGGER.config("parsing command line options");
 
         this.config = new CrawlerConfig();
@@ -202,13 +103,6 @@
             // Print help.
             if (cl.hasOption("h") || args.length == 0) {
                 this.printUsage();
-            }
-
-            // Log level.
-            if (cl.hasOption("L")) {
-                final Level level = Level.parse(cl.getOptionValue("L"));
-                LOGGER.config("setting log level to " + level);
-                LOGGER.setLevel(level);
             }
 
             // Root directory for file crawling.
@@ -221,7 +115,7 @@
                     throw new IllegalArgumentException("The specified path is not a directory.");
                 }
                 config.setRootDir(rootDir);
-                LOGGER.config("root dir set to " + config.rootDir());
+                LOGGER.config("root dir " + config.rootDir());
             }
 
             // Timestamp file for date filtering.
@@ -278,9 +172,9 @@
             // Configure enabled file formats.
             if (cl.hasOption("o")) {
                 for (final String arg : cl.getOptionValues("o")) {
-                    DatasetFileFormat format = null;
+                    FileFormat format = null;
                     try {
-                        format = DatasetFileFormat.valueOf(arg);
+                        format = FileFormat.valueOf(arg);
                     } catch (IllegalArgumentException | NullPointerException e) {
                         throw new IllegalArgumentException("The format " + arg + " is not valid.", e);
                     }
@@ -288,19 +182,22 @@
                     this.config.addFileFormat(format);
                 }
             } else {
-                throw new RuntimeException("The -o argument with data format must be supplied at least once.");
-            }
-
-            // Enable metadata extraction from files.
-            if (cl.hasOption("m")) {
-                config.setEnableMetadata(true);
-                LOGGER.config("metadata extraction enabled");
+                for (FileFormat format : FileFormat.values()) {
+                    this.config.addFileFormat(format);
+                    LOGGER.config("adding default format " + format);
+                }
+            }
+            
+            // Enable the default set of file formats.
+            if (this.config.getFileFormats().isEmpty()) {
+                LOGGER.config("enabling default file formats");
+                this.config.addDefaultFileFormats();
             }
 
             // Datacat folder.
             if (cl.hasOption("f")) {
                 config.setDatacatFolder(cl.getOptionValue("f"));
-                LOGGER.config("set datacat folder to " + config.datacatFolder());
+                LOGGER.config("set datacat folder to " + config.folder());
             } else {
                 throw new RuntimeException("The -f argument with the datacat folder is required.");
             }
@@ -313,20 +210,43 @@
                 }
                 config.setAcceptRuns(acceptRuns);
             }
+                                    
+            // Dry run.
+            if (cl.hasOption("D")) {
+                config.setDryRun(true);
+            }
+                        
+            // List of paths.
+            if (!cl.getArgList().isEmpty()) {
+                for (String arg : cl.getArgList()) {
+                    config.addPath(arg);
+                }
+            }
+            
+            // Dataset site (defaults to JLAB).
+            Site site = Site.JLAB;
+            if (cl.hasOption("s")) {
+                site = Site.valueOf(cl.getOptionValue("s"));
+            }
+            LOGGER.config("dataset site " + site);
+            config.setSite(site);
+            
+            // Data catalog URL.
+            if (cl.hasOption("u")) {
+                config.setDatacatUrl(cl.getOptionValue("u"));
+                LOGGER.config("datacat URL " + config.datacatUrl());
+            }
 
         } catch (final ParseException e) {
             throw new RuntimeException("Error parsing options.", e);
         }
 
-        // Check the datacat folder which must already exist.
-        this.checkFolder(config.datacatFolder());
-
         // Check that there is at least one file format enabled for filtering.
         if (this.config.getFileFormats().isEmpty()) {
-            throw new IllegalStateException("At least one file format must be provided with the -f switch.");
-        }
-
-        LOGGER.info("done parsing command line options");
+            throw new IllegalStateException("At least one file format must be provided with the -o switch.");
+        }
+
+        LOGGER.info("Done parsing command line options.");
 
         return this;
     }
@@ -336,72 +256,65 @@
      */
     private void printUsage() {
         final HelpFormatter help = new HelpFormatter();
-        help.printHelp(70, "DatacatCrawler [options]", "", OPTIONS, "");
+        help.printHelp(70, "DatacatCrawler [options] path ...", "", OPTIONS, "");
         System.exit(0);
     }
 
     /**
      * Run the crawler job.
      */
-    void run() {
-
+    private void run() {
+                
         // Create the file visitor for crawling the root directory with the given date filter.
-        final DatacatFileVisitor visitor = new DatacatFileVisitor();
+        final CrawlerFileVisitor visitor = new CrawlerFileVisitor();
 
         // Add date filter if timestamp is supplied.
         if (config.timestamp() != null) {
             visitor.addFilter(new DateFileFilter(config.timestamp()));
+            LOGGER.config("added timestamp filter " + config.timestamp());
+        }
+        
+        // Add path filter.
+        if (!config.paths().isEmpty()) {
+            visitor.addFilter(new PathFilter(config.paths()));
+            StringBuffer sb = new StringBuffer();
+            for (String path : config.paths()) {
+                sb.append(path + ":");
+            }
+            sb.setLength(sb.length() - 1);
+            LOGGER.config("added paths " + sb.toString());
         }
 
         // Add file format filter.
-        for (final DatasetFileFormat fileFormat : config.getFileFormats()) {
-            LOGGER.info("adding file format filter for " + fileFormat.name());
-        }
         visitor.addFilter(new FileFormatFilter(config.getFileFormats()));
 
-        // Run number filter.
+        // Add run number filter.
         if (!config.acceptRuns().isEmpty()) {
             visitor.addFilter(new RunFilter(config.acceptRuns()));
         }
 
-        // Walk the file tree using the visitor.
+        // Walk the file tree and get list of files.
         this.walk(visitor);
-
-        // Update the data catalog.
-        this.updateDatacat(visitor.getFileSet());
-    }
-
-    /**
-     * Update the data catalog.
-     *
-     * @param runMap the map of run information including the EVIO file list
-     */
-    private void updateDatacat(final FileSet fileSet) {
-        final DatacatClient datacatClient = new DatacatClientFactory().createClient();
-        for (final DatasetFileFormat fileFormat : config.getFileFormats()) {
-            LOGGER.info("adding files to datacat with format " + fileFormat.name());
-            for (final File file : fileSet.get(fileFormat)) {
-
-                LOGGER.info("adding file " + file.getAbsolutePath() + " to datacat");
-
-                // Create metadata if this is enabled (takes awhile).
-                Map<String, Object> metadata = new HashMap<String, Object>();
-                if (config.enableMetaData()) {
-                    metadata = DatacatUtilities.createMetadata(file);
-                }
-
-                // Register file in the catalog.
-                DatacatUtilities.addFile(datacatClient, config.datacatFolder(), file, metadata);
-            }
-        }
-    }
-
-    /**
-     * Walk the directory tree to find EVIO files for the runs that are being processed in the job.
+                
+        // Insert datasets if files were found.
+        if (!visitor.getFiles().isEmpty()) {
+            List<DatasetModel> datasets = DatacatHelper.createDatasets(visitor.getFiles(), config.folder(), config.site().toString());
+            LOGGER.info("built " + datasets.size() + " datasets");
+            DatacatHelper.addDatasets(datasets, config.folder(), config.datacatUrl());
+            LOGGER.info("added datasets to datacat");
+        } else {
+            LOGGER.warning("No files were found by the crawler.");
+        }
+        
+        LOGGER.info("Done!");
+    }
+                         
+    /**
+     * Walk the directory tree to find files for the runs that are being processed in the job.
      *
      * @param visitor the file visitor
      */
-    private void walk(final DatacatFileVisitor visitor) {
+    private void walk(final CrawlerFileVisitor visitor) {
         try {
             // Walk the file tree from the root directory.
             final EnumSet<FileVisitOption> options = EnumSet.noneOf(FileVisitOption.class);

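The crawler's new `CrawlerFileVisitor` is referenced above but not shown in this commit. A plausible sketch of the visitor-plus-filters pattern it replaces (the removed `DatacatFileVisitor` used the same idea), with hypothetical names; a file is kept only when every registered `FileFilter` accepts it:

```java
import java.io.File;
import java.io.FileFilter;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.ArrayList;
import java.util.List;

final class SketchFileVisitor extends SimpleFileVisitor<Path> {

    // Filters are ANDed together, as in the removed DatacatFileVisitor.
    private final List<FileFilter> filters = new ArrayList<FileFilter>();

    // Accepted files collected while walking the tree.
    private final List<File> files = new ArrayList<File>();

    void addFilter(final FileFilter filter) {
        this.filters.add(filter);
    }

    List<File> getFiles() {
        return this.files;
    }

    // Reject the file as soon as any filter rejects it.
    private boolean accept(final File file) {
        for (final FileFilter filter : this.filters) {
            if (!filter.accept(file)) {
                return false;
            }
        }
        return true;
    }

    @Override
    public FileVisitResult visitFile(final Path path, final BasicFileAttributes attrs) {
        final File file = path.toFile();
        if (this.accept(file)) {
            this.files.add(file);
        }
        return FileVisitResult.CONTINUE;
    }
}
```

Usage mirrors the `walk` method above: register date, path, format, and run filters, then call `Files.walkFileTree(rootDir.toPath(), options, maxDepth, visitor)` and read back `visitor.getFiles()`.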
Modified: java/trunk/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java	Wed Feb 24 13:06:58 2016
@@ -2,147 +2,330 @@
 
 import java.io.File;
 import java.io.IOException;
-import java.util.Date;
-import java.util.HashMap;
+import java.math.RoundingMode;
+import java.text.DecimalFormat;
+import java.util.LinkedHashMap;
 import java.util.Map;
+import java.util.Map.Entry;
+import java.util.Set;
+import java.util.logging.Level;
 import java.util.logging.Logger;
 
 import org.hps.record.evio.EventTagConstant;
 import org.hps.record.evio.EvioEventUtilities;
 import org.hps.record.evio.EvioFileUtilities;
+import org.hps.record.triggerbank.AbstractIntData.IntBankDefinition;
+import org.hps.record.triggerbank.HeadBankData;
+import org.hps.record.triggerbank.TiTimeOffsetEvioProcessor;
+import org.hps.record.triggerbank.TriggerType;
+import org.jlab.coda.jevio.BaseStructure;
 import org.jlab.coda.jevio.EvioEvent;
 import org.jlab.coda.jevio.EvioException;
 import org.jlab.coda.jevio.EvioReader;
 
 /**
- * Reads metadata from EVIO files.
- *
+ * Creates detailed metadata for the datacat from an EVIO input file.
+ * 
  * @author Jeremy McCormick, SLAC
  */
-public class EvioMetadataReader implements FileMetadataReader {
+final class EvioMetadataReader implements FileMetadataReader {
 
     /**
-     * Initialize the logger.
+     * Initialize the package logger.
      */
     private static Logger LOGGER = Logger.getLogger(EvioMetadataReader.class.getPackage().getName());
 
     /**
+     * Head bank definition.
+     */
+    private static IntBankDefinition HEAD_BANK = new IntBankDefinition(HeadBankData.class, new int[] {0x2e, 0xe10f});
+
+    /**
      * Get the EVIO file metadata.
-     *
+     * 
      * @param file the EVIO file
      * @return the metadata map of key and value pairs
      */
     @Override
     public Map<String, Object> getMetadata(final File file) throws IOException {
-
-        Date startDate = null;
-        Date endDate = null;
-        int badEventCount = 0;
-        int eventCount = 0;
-        int byteCount = 0;
-        boolean hasPrestart = false;
-        boolean hasEnd = false;
-        int[] eventIdData = null;
-        Integer run = null;
-        Integer endEvent = null;
-        Integer startEvent = null;
-        Long lastTimestamp = null;
+        
+        long totalEvents = 0;
+        int physicsEvents = 0;
+        int badEvents = 0;
+        int blinded = 0;
+        Long run = null;
+        Integer firstHeadTimestamp = null;
+        Integer lastHeadTimestamp = null;
+        Integer lastPhysicsEvent = null;
+        Integer firstPhysicsEvent = null;
+        Integer prestartTimestamp = null;
+        Integer endTimestamp = null;
+        Integer goTimestamp = null;
+        Double triggerRate = null;
+        
+        // Processor for calculating TI time offsets.
+        TiTimeOffsetEvioProcessor tiProcessor = new TiTimeOffsetEvioProcessor();
+
+        // Create map for counting trigger types.
+        Map<TriggerType, Integer> triggerCounts = new LinkedHashMap<TriggerType, Integer>();
+        for (TriggerType triggerType : TriggerType.values()) {
+            triggerCounts.put(triggerType, 0);
+        }
+
+        // Get the file number from the name.
+        final int fileNumber = EvioFileUtilities.getSequenceFromName(file);
+
+        // File numbers indivisible by 10 are blinded (Eng Run 2015 scheme).
+        if (fileNumber % 10 != 0) {
+            blinded = 1;
+        }
+        
+        // Get file size.
+        long size = 0;
+        File cacheFile = file;
+        if (FileUtilities.isMssFile(file)) {
+            cacheFile = FileUtilities.getCachedFile(file);
+        }
+        size = cacheFile.length();
+        
+        // Compute MD5 checksum string.
+        String checksum = FileUtilities.createMD5Checksum(cacheFile);
 
         EvioReader evioReader = null;
         try {
-            evioReader = EvioFileUtilities.open(file, false);
+            // Open file in sequential mode.
+            evioReader = EvioFileUtilities.open(file, true);
+            EvioEvent evioEvent = null;
+
+            // Event read loop.
+            eventLoop: while (true) {
+                try {
+                    // Parse next event.
+                    evioEvent = evioReader.parseNextEvent();
+
+                    // End of file.
+                    if (evioEvent == null) {
+                        LOGGER.fine("EOF after " + totalEvents + " events.");
+                        break eventLoop;
+                    }
+                    
+                    // Increment event count (doesn't count events that can't be parsed).
+                    ++totalEvents;
+
+                    // Debug print event number and tag.
+                    LOGGER.finest("Parsed event " + evioEvent.getEventNumber() + " with tag 0x"
+                            + String.format("%08x", evioEvent.getHeader().getTag()));
+
+                    // Get head bank.
+                    BaseStructure headBank = HEAD_BANK.findBank(evioEvent);
+
+                    // Current timestamp.
+                    int thisTimestamp = 0;
+
+                    // Process head bank if not null.
+                    if (headBank != null) {
+                        final int[] headBankData = headBank.getIntData();
+                        thisTimestamp = headBankData[3];
+                        if (thisTimestamp != 0) {
+                            // First header timestamp.
+                            if (firstHeadTimestamp == null) {
+                                firstHeadTimestamp = thisTimestamp;
+                                LOGGER.finer("First head timestamp " + firstHeadTimestamp + " from event "
+                                        + evioEvent.getEventNumber());
+                            }
+
+                            // Last header timestamp.
+                            lastHeadTimestamp = thisTimestamp;
+                        }
+
+                        // Run number.
+                        if (run == null && headBankData[1] != 0) {
+                            run = (long) headBankData[1];
+                            LOGGER.finer("Run number " + run + " from event " + evioEvent.getEventNumber());
+                        }
+                    }
+                    
+                    if (EvioEventUtilities.isPhysicsEvent(evioEvent)) {
+                                                
+                        final int[] eventIdData = EvioEventUtilities.getEventIdData(evioEvent);
+                        
+                        if (eventIdData != null) {
+                        
+                            // Set the last physics event.
+                            lastPhysicsEvent = eventIdData[0];
+
+                            // Set the first physics event.
+                            if (firstPhysicsEvent == null) {
+                                firstPhysicsEvent = eventIdData[0];
+                                LOGGER.finer("Set first physics event " + firstPhysicsEvent);
+                            }
+                        }
+                        
+                        ++physicsEvents;
+                    } else if (EvioEventUtilities.isControlEvent(evioEvent)) {
+                        int[] controlData = EvioEventUtilities.getControlEventData(evioEvent);
+                        if (controlData[0] != 0) {
+                            if (EventTagConstant.PRESTART.isEventTag(evioEvent)) {
+                                prestartTimestamp = controlData[0];
+                            }                        
+                            if (EventTagConstant.GO.isEventTag(evioEvent)) {
+                                goTimestamp = controlData[0];
+                            }
+                            if (EventTagConstant.END.isEventTag(evioEvent)) {
+                                endTimestamp = controlData[0];
+                            }
+                        }
+                    }
+
+                    // Count trigger types for this event.
+                    Set<TriggerType> triggerTypes = TriggerType.getTriggerTypes(evioEvent);
+                    for (TriggerType mask : triggerTypes) {
+                        int count = triggerCounts.get(mask) + 1;
+                        triggerCounts.put(mask, count);
+                        LOGGER.finest("Incremented " + mask.name() + " to " + count);
+                    }
+                    
+                    // Activate TI time offset processor.
+                    tiProcessor.process(evioEvent);
+                    
+                } catch (Exception e) {
+                    // Trap all event processing errors; evioEvent may be null
+                    // if parsing itself failed.
+                    badEvents++;
+                    LOGGER.log(Level.WARNING, "Error processing EVIO event"
+                            + (evioEvent != null ? " " + evioEvent.getEventNumber() : ""), e);
+                }
+            }
         } catch (final EvioException e) {
-            throw new IOException(e);
-        }
-
-        final int fileNumber = EvioFileUtilities.getSequenceFromName(file);
-
-        EvioEvent evioEvent = null;
-
-        while (true) {
-            try {
-                evioEvent = evioReader.parseNextEvent();
-            } catch (IOException | EvioException e) {
-                ++badEventCount;
-                continue;
+            // Error reading the EVIO file.
+            throw new IOException("Error reading EVIO file.", e);
+        } finally {
+            // Close the reader.
+            if (evioReader != null) {
+                try {
+                    evioReader.close();
+                } catch (IOException e) {
+                    LOGGER.log(Level.WARNING, "Error closing EVIO reader", e);
+                }
             }
-            if (evioEvent == null) {
-                break;
+        }
+
+        LOGGER.info("Done reading " + totalEvents + " events from " + file.getPath());
+
+        // Rough trigger rate calculation.
+        try {
+            if (firstHeadTimestamp != null && lastHeadTimestamp != null && totalEvents > 0 
+                    && (firstHeadTimestamp - lastHeadTimestamp != 0)) {
+                triggerRate = calculateTriggerRate(firstHeadTimestamp, lastHeadTimestamp, totalEvents);
+            } else {
+                LOGGER.log(Level.WARNING, "Missing information for calculating trigger rate.");
             }
-            byteCount += evioEvent.getTotalBytes();
-            if (EventTagConstant.PRESTART.equals(evioEvent)) {
-                LOGGER.info("found PRESTART");
-                hasPrestart = true;
-                final int[] controlEventData = EvioEventUtilities.getControlEventData(evioEvent);
-                final long timestamp = controlEventData[0] * 1000L;
-                startDate = new Date(timestamp);
-                LOGGER.info("set start date to " + startDate + " from PRESTART");
-                if (run == null) {
-                    run = controlEventData[1];
-                    LOGGER.info("set run to " + run);
-                }
-            } else if (EventTagConstant.END.equals(evioEvent)) {
-                LOGGER.info("found END event");
-                hasEnd = true;
-                final int[] controlEventData = EvioEventUtilities.getControlEventData(evioEvent);
-                final long timestamp = controlEventData[0] * 1000L;
-                endDate = new Date(timestamp);
-                LOGGER.info("set end date to " + endDate);
-                if (run == null) {
-                    run = controlEventData[1];
-                    LOGGER.info("set run to " + run);
-                }
-            } else if (EvioEventUtilities.isPhysicsEvent(evioEvent)) {
-                final int[] headBankData = EvioEventUtilities.getHeadBankData(evioEvent);
-                if (startDate == null) {
-                    if (headBankData[3] != 0) {
-                        startDate = new Date(headBankData[3] * 1000L);
-                        LOGGER.info("set start date to " + startDate + " from physics event");
-                    }
-                }
-                if (run == null) {
-                    run = headBankData[1];
-                    LOGGER.info("set run to " + run + " from physics event");
-                }
-                eventIdData = EvioEventUtilities.getEventIdData(evioEvent);
-                if (startEvent == null) {
-                    startEvent = eventIdData[0];
-                    LOGGER.info("set start event " + startEvent);
-                }
-                if (headBankData[3] != 0) {
-                    lastTimestamp = headBankData[3] * 1000L;
-                }
-                ++eventCount;
+        } catch (Exception e) {
+            LOGGER.log(Level.WARNING, "Error calculating the trigger rate.", e);
+        }
+
+        // Create and fill the metadata map.
+        final Map<String, Object> metadataMap = new LinkedHashMap<String, Object>();
+
+        try {
+            if (run == null) {
+                run = new Long(EvioFileUtilities.getRunFromName(file));
             }
-        }
-
-        // Set end date from last valid timestamp.
-        if (endDate == null) {
-            endDate = new Date(lastTimestamp);
-            LOGGER.info("set end date to " + endDate + " from last timestamp " + lastTimestamp);
-        }
-
-        // Set end event number.
-        if (eventIdData != null) {
-            endEvent = eventIdData[0];
-            LOGGER.info("set end event " + endEvent);
-        }
-
-        final Map<String, Object> metaDataMap = new HashMap<String, Object>();
-
-        metaDataMap.put("runMin", run);
-        metaDataMap.put("runMax", run);
-        metaDataMap.put("eventCount", eventCount);
-        metaDataMap.put("size", byteCount);
-        metaDataMap.put("fileNumber", fileNumber);
-        metaDataMap.put("badEventCount", badEventCount);
-        metaDataMap.put("endTimestamp", endDate.getTime());
-        metaDataMap.put("startTimestamp", startDate.getTime());
-        metaDataMap.put("startEvent", startEvent);
-        metaDataMap.put("endEvent", endEvent);
-        metaDataMap.put("hasEnd", hasEnd ? 1 : 0);
-        metaDataMap.put("hasPrestart", hasPrestart ? 1 : 0);
-
-        return metaDataMap;
+        } catch (Exception e) {
+            throw new RuntimeException("Failed to get run number from event data or file name.", e);
+        }
+
+        // Set locationExtras metadata.
+        metadataMap.put("runMin", run);
+        metadataMap.put("runMax", run);
+        metadataMap.put("eventCount", totalEvents);
+        metadataMap.put("size", size);
+        metadataMap.put("checksum", checksum);     
+        
+        // File sequence number.
+        metadataMap.put("FILE", fileNumber);
+
+        // Blinded flag.
+        metadataMap.put("BLINDED", blinded);
+
+        // First and last timestamps which may come from control or physics events.
+        if (firstHeadTimestamp != null) {
+            metadataMap.put("FIRST_HEAD_TIMESTAMP", firstHeadTimestamp);
+        } 
+        
+        if (lastHeadTimestamp != null) {
+            metadataMap.put("LAST_HEAD_TIMESTAMP", lastHeadTimestamp);
+        } 
+
+        // First and last physics event numbers.
+        if (firstPhysicsEvent != null) {
+            metadataMap.put("FIRST_PHYSICS_EVENT", firstPhysicsEvent);
+        } 
+        
+        if (lastPhysicsEvent != null) {
+            metadataMap.put("LAST_PHYSICS_EVENT", lastPhysicsEvent);
+        }
+        
+        // Timestamps which are only set if the corresponding control events were found in the file.
+        if (prestartTimestamp != null) {
+            metadataMap.put("PRESTART_TIMESTAMP", prestartTimestamp);
+        }
+        if (endTimestamp != null) {
+            metadataMap.put("END_TIMESTAMP", endTimestamp);
+        }
+        if (goTimestamp != null) {
+            metadataMap.put("GO_TIMESTAMP", goTimestamp);
+        }
+
+        // TI times and offset.
+        metadataMap.put("TI_TIME_MIN_OFFSET", new Long(tiProcessor.getMinOffset()).toString());
+        metadataMap.put("TI_TIME_MAX_OFFSET", new Long(tiProcessor.getMaxOffset()).toString());
+        metadataMap.put("TI_TIME_N_OUTLIERS", tiProcessor.getNumOutliers());
+        
+        // Event counts.
+        metadataMap.put("BAD_EVENTS", badEvents);
+        
+        // Physics event count.
+        metadataMap.put("PHYSICS_EVENTS", physicsEvents);
+        
+        // Rough trigger rate.
+        if (triggerRate != null && !Double.isInfinite(triggerRate) && !Double.isNaN(triggerRate)) {
+            DecimalFormat df = new DecimalFormat("#.##");
+            df.setRoundingMode(RoundingMode.CEILING);
+            LOGGER.info("Setting trigger rate to " + triggerRate + " Hz.");
+            metadataMap.put("TRIGGER_RATE", Double.parseDouble(df.format(triggerRate)));
+        } else {
+            LOGGER.warning("Failed to calculate trigger rate.");
+        }        
+
+        // Trigger type counts.
+        for (Entry<TriggerType, Integer> entry : triggerCounts.entrySet()) {
+            metadataMap.put(entry.getKey().name(), entry.getValue());
+        }
+
+        // Print the file metadata to log.
+        StringBuffer sb = new StringBuffer();
+        sb.append('\n');
+        for (Entry<String, Object> entry : metadataMap.entrySet()) {
+            sb.append("  " + entry.getKey() + " = " + entry.getValue() + '\n');
+        }
+        LOGGER.info("File metadata ..." + '\n' + sb.toString());
+
+        // Return the completed metadata map.
+        return metadataMap;
+    }
+         
+    /**
+     * Calculate the trigger rate in Hz.
+     * 
+     * @param firstTimestamp the first physics timestamp
+     * @param lastTimestamp the last physics timestamp
+     * @param events the number of physics events
+     * @return the trigger rate in Hz
+     */
+    private double calculateTriggerRate(Integer firstTimestamp, Integer lastTimestamp, long events) {
+        return ((double) events / ((double) lastTimestamp - (double) firstTimestamp));
     }
 }

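The TRIGGER_RATE metadata above is produced by rounding the raw rate up to two decimal places with a `DecimalFormat` in CEILING mode. A minimal standalone sketch of that rounding step (the class and method names here are illustrative, not part of the commit; the commit uses the default locale, while this sketch pins US symbols so `parseDouble` works under any locale):

```java
import java.math.RoundingMode;
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class TriggerRateFormat {

    // Round a raw rate up to two decimal places, mirroring the step that
    // fills the TRIGGER_RATE metadata key. US symbols guarantee a '.'
    // decimal separator, so Double.parseDouble never sees a locale comma.
    static double roundRate(double rate) {
        DecimalFormat df = new DecimalFormat("#.##", DecimalFormatSymbols.getInstance(Locale.US));
        df.setRoundingMode(RoundingMode.CEILING);
        return Double.parseDouble(df.format(rate));
    }

    public static void main(String[] args) {
        System.out.println(roundRate(1234.5678)); // 1234.57
    }
}
```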
Modified: java/trunk/crawler/src/main/java/org/hps/crawler/FileFormatFilter.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/FileFormatFilter.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/FileFormatFilter.java	Wed Feb 24 13:06:58 2016
@@ -3,9 +3,6 @@
 import java.io.File;
 import java.io.FileFilter;
 import java.util.Set;
-import java.util.logging.Logger;
-
-import org.hps.datacat.client.DatasetFileFormat;
 
 /**
  * Filter files on their format.
@@ -17,21 +14,16 @@
 public class FileFormatFilter implements FileFilter {
 
     /**
-     * Initialize the logger.
-     */
-    private static final Logger LOGGER = Logger.getLogger(FileFormatFilter.class.getPackage().getName());
-
-    /**
      * The file format.
      */
-    private final Set<DatasetFileFormat> formats;
+    private final Set<FileFormat> formats;
 
     /**
      * Create a new filter with the given format.
      *
      * @param format the file format
      */
-    FileFormatFilter(final Set<DatasetFileFormat> formats) {
+    FileFormatFilter(final Set<FileFormat> formats) {
         if (formats == null) {
             throw new IllegalArgumentException("The formats collection is null.");
         }
@@ -48,13 +40,10 @@
      */
     @Override
     public boolean accept(final File pathname) {
-        LOGGER.info(pathname.getPath());
-        final DatasetFileFormat fileFormat = DatacatUtilities.getFileFormat(pathname);
+        final FileFormat fileFormat = DatacatHelper.getFileFormat(pathname);
         if (fileFormat != null) {
-            LOGGER.info("file " + pathname.getPath() + " has format " + fileFormat.name());
             return formats.contains(fileFormat);
         } else {
-            LOGGER.info("rejected file " + pathname.getPath() + " with unknown format");
             return false;
         }
     }

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/FileMetadataReader.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/FileMetadataReader.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/FileMetadataReader.java	Wed Feb 24 13:06:58 2016
@@ -4,8 +4,19 @@
 import java.io.IOException;
 import java.util.Map;
 
-
+/**
+ * Interface for extracting datacat metadata from files.
+ * 
+ * @author Jeremy McCormick, SLAC
+ */
 public interface FileMetadataReader {   
     
-    public Map<String, Object> getMetadata(File file) throws IOException;
+    /**
+     * Create a metadata map with keys and values from the contents of a file.
+     * 
+     * @param file the input file for extracting metadata
+     * @return the metadata map
+     * @throws IOException if there is an error reading the file
+     */
+    Map<String, Object> getMetadata(File file) throws IOException;
 }

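The interface above has a single method that maps a file to a key/value metadata map. A hypothetical implementation that records only the file's size and name illustrates the contract (the class name `FileSizeMetadataReader` is invented for this sketch; the interface is copied locally so the example compiles on its own):

```java
import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Local copy of the interface from the commit, for a self-contained sketch.
interface FileMetadataReader {
    Map<String, Object> getMetadata(File file) throws IOException;
}

// Hypothetical implementation: fills the map with the file's size and name only.
public class FileSizeMetadataReader implements FileMetadataReader {

    @Override
    public Map<String, Object> getMetadata(File file) throws IOException {
        if (!file.exists()) {
            throw new IOException("File does not exist: " + file.getPath());
        }
        Map<String, Object> metadata = new HashMap<String, Object>();
        metadata.put("size", file.length());
        metadata.put("name", file.getName());
        return metadata;
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("meta", ".dat");
        System.out.println(new FileSizeMetadataReader().getMetadata(tmp));
        tmp.delete();
    }
}
```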
Modified: java/trunk/crawler/src/main/java/org/hps/crawler/RootDqmMetadataReader.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/RootDqmMetadataReader.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/RootDqmMetadataReader.java	Wed Feb 24 13:06:58 2016
@@ -22,7 +22,7 @@
     @Override
     public Map<String, Object> getMetadata(final File file) throws IOException {
         final Map<String, Object> metadata = new HashMap<String, Object>();
-        final int run = CrawlerFileUtilities.getRunFromFileName(file);
+        final Long run = FileUtilities.getRunFromFileName(file);
         metadata.put("runMin", run);
         metadata.put("runMax", run);
         return metadata;

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/RootDstMetadataReader.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/RootDstMetadataReader.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/RootDstMetadataReader.java	Wed Feb 24 13:06:58 2016
@@ -28,6 +28,10 @@
     @Override
     public Map<String, Object> getMetadata(final File file) throws IOException {
         final Map<String, Object> metadata = new HashMap<String, Object>();
+        Long run = FileUtilities.getRunFromFileName(file);
+        metadata.put("runMin", run);
+        metadata.put("runMax", run);
+        /*
         RootFileReader rootReader = null;
         long eventCount = 0;
         int runMin = 0;
@@ -60,6 +64,7 @@
         metadata.put("runMin", runMin);
         metadata.put("runMax", runMax);
         metadata.put("size", size);
+        */
         return metadata;
     }
 }

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/RunFilter.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/RunFilter.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/RunFilter.java	Wed Feb 24 13:06:58 2016
@@ -3,8 +3,6 @@
 import java.io.File;
 import java.io.FileFilter;
 import java.util.Set;
-
-import org.hps.record.evio.EvioFileUtilities;
 
 /**
  * A filter which rejects files with run numbers not in a specified set.
@@ -25,7 +23,7 @@
      */
     RunFilter(final Set<Integer> acceptRuns) {
         if (acceptRuns.isEmpty()) {
-            throw new IllegalArgumentException("the acceptRuns collection is empty");
+            throw new IllegalArgumentException("The acceptRuns collection is empty.");
         }
         this.acceptRuns = acceptRuns;
     }
@@ -38,6 +36,11 @@
      */
     @Override
     public boolean accept(final File file) {
-        return this.acceptRuns.contains(EvioFileUtilities.getRunFromName(file));
+        try {
+            int run = Integer.parseInt(file.getName().substring(5, 10));
+            return this.acceptRuns.contains(run);
+        } catch (Exception e) {
+            return false;
+        }
     }
 }

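The new `accept()` above drops the `EvioFileUtilities` dependency and parses the run number directly from characters 5 through 9 of the file name, rejecting anything that fails to parse. A self-contained sketch of that logic (the sample name `hps_005772.evio.0` is an assumed illustration of the EVIO naming pattern, not taken from the commit):

```java
import java.io.File;
import java.io.FileFilter;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class RunFilterSketch implements FileFilter {

    private final Set<Integer> acceptRuns;

    RunFilterSketch(Set<Integer> acceptRuns) {
        if (acceptRuns.isEmpty()) {
            throw new IllegalArgumentException("The acceptRuns collection is empty.");
        }
        this.acceptRuns = acceptRuns;
    }

    // The run number is read from a fixed slice of the file name
    // (characters 5 through 9); any parse failure rejects the file.
    @Override
    public boolean accept(File file) {
        try {
            int run = Integer.parseInt(file.getName().substring(5, 10));
            return this.acceptRuns.contains(run);
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        RunFilterSketch filter = new RunFilterSketch(new HashSet<Integer>(Arrays.asList(5772)));
        System.out.println(filter.accept(new File("hps_005772.evio.0"))); // true
    }
}
```

For a name like `hps_005772.evio.0`, the slice yields `"05772"`, which parses to run 5772.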
Modified: java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatClient.java
 =============================================================================
--- java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatClient.java	(original)
+++ java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatClient.java	Wed Feb 24 13:06:58 2016
@@ -9,6 +9,7 @@
  *
  * @author Jeremy McCormick, SLAC
  */
+// TODO: add method for adding a location to an existing dataset
 public interface DatacatClient {
 
     /**

Modified: java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatClientImpl.java
 =============================================================================
--- java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatClientImpl.java	(original)
+++ java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatClientImpl.java	Wed Feb 24 13:06:58 2016
@@ -28,7 +28,7 @@
     private static Logger LOGGER = Logger.getLogger(DatacatClientImpl.class.getPackage().getName());
 
     /**
-     * The root directory (e.g. should be 'HPS').
+     * The root directory (should be 'HPS').
      */
     private final String rootDir;
 
@@ -46,7 +46,7 @@
      * Create client with default parameters.
      */
     DatacatClientImpl() {
-        this(DatacatConstants.BASE_URL, DatasetSite.SLAC, DatacatConstants.ROOT_DIR);
+        this(DatacatConstants.BASE_URL, DatasetSite.JLAB, DatacatConstants.ROOT_FOLDER);
     }
 
     /**
@@ -60,7 +60,7 @@
         try {
             this.url = new URL(url);
         } catch (final MalformedURLException e) {
-            throw new IllegalArgumentException("The URL is bad.", e);
+            throw new IllegalArgumentException("The URL is not valid.", e);
         }
         if (site == null) {
             throw new IllegalArgumentException("The site argument is null.");
@@ -99,14 +99,14 @@
         final Map<String, Object> parameters = new HashMap<String, Object>();
         parameters.put("dataType", dataType.toString());
         parameters.put("resource", resource);
-        parameters.put("site", DatasetSite.SLAC.name());
+        parameters.put("site", site);
         parameters.put("fileFormat", fileFormat.toString());
         parameters.put("name", name);
         parameters.put("size", size);
         final JSONObject jsonDataset = JSONUtilities.createJSONDataset(parameters, metadata);
         final String urlLocation = url + "/datasets.json/" + this.rootDir + "/" + folder;
-        LOGGER.info("addDataset: " + urlLocation);
-        LOGGER.info("dataset JSON: " + jsonDataset.toString());
+        LOGGER.info("add dataset " + urlLocation);
+        LOGGER.info("dataset JSON " + jsonDataset.toString());
         return HttpUtilities.doPost(urlLocation, jsonDataset.toString());
     }
 
@@ -204,7 +204,7 @@
             }
         }
         
-        LOGGER.info("findDatasets: " + urlLocation);
+        LOGGER.info(urlLocation);
         final StringBuffer outputBuffer = new StringBuffer();
         final int response = HttpUtilities.doGet(urlLocation, outputBuffer);
         if (response >= 400) {
@@ -213,7 +213,7 @@
 
         // Build and return dataset list
         final JSONObject searchResults = new JSONObject(outputBuffer.toString());
-        LOGGER.info("returning search results: " + searchResults.toString());
+        LOGGER.info(searchResults.toString());
         return createDatasetsFromSearch(searchResults);
     }
 
@@ -276,7 +276,7 @@
     @Override
     public int makeFolder(final String path) {
         final Map<String, Object> parameters = new HashMap<String, Object>();
-        parameters.put("path", "/" + DatacatConstants.ROOT_DIR + "/" + path);
+        parameters.put("path", "/" + DatacatConstants.ROOT_FOLDER + "/" + path);
         final String name = new File(path).getName();
         parameters.put("name", name);
         parameters.put("_type", "folder");

Modified: java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatConstants.java
 =============================================================================
--- java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatConstants.java	(original)
+++ java/trunk/datacat-client/src/main/java/org/hps/datacat/client/DatacatConstants.java	Wed Feb 24 13:06:58 2016
@@ -5,12 +5,12 @@
  * 
  * @author Jeremy McCormick, SLAC
  */
-final class DatacatConstants {
+public final class DatacatConstants {
 
     /**
      * The root directory in the catalog for HPS folders.
      */
-    public static final String ROOT_DIR = "HPS";
+    public static final String ROOT_FOLDER = "HPS";
         
     /**
      * The base URL of the datacat server.

Modified: java/trunk/datacat-client/src/main/java/org/hps/datacat/client/JSONUtilities.java
 =============================================================================
--- java/trunk/datacat-client/src/main/java/org/hps/datacat/client/JSONUtilities.java	(original)
+++ java/trunk/datacat-client/src/main/java/org/hps/datacat/client/JSONUtilities.java	Wed Feb 24 13:06:58 2016
@@ -49,6 +49,10 @@
             dataset.put("eventCount", metadataCopy.get("eventCount"));
             metadataCopy.remove("eventCount");
         }
+        if (metadataCopy.containsKey("scanStatus")) {
+            dataset.put("scanStatus", metadataCopy.get("scanStatus"));
+            metadataCopy.remove("scanStatus");
+        }
         
         if (metadata != null && metadata.size() != 0) {
             JSONArray jsonMetadata = createJSONMetadataArray(metadataCopy);
@@ -84,16 +88,22 @@
             JSONObject metadataObject = new JSONObject();
             metadataObject.put("key", entry.getKey());
             Object rawValue = entry.getValue();
+            if (rawValue == null) {
+                throw new IllegalArgumentException("The metadata key " + entry.getKey() + " has a null value.");
+            }
             if (rawValue instanceof String) {
                 metadataObject.put("type", "string");
             } else if (rawValue instanceof Integer | rawValue instanceof Long) {
                 metadataObject.put("type", "integer");
             } else if (rawValue instanceof Float | rawValue instanceof Double) {
                 metadataObject.put("type", "decimal");
+            } else if (rawValue instanceof Boolean) {
+                metadataObject.put("type", "integer");
+                rawValue = (Boolean)rawValue ? 1 : 0;
             } else {
-                throw new IllegalArgumentException("Do not know how to handle type: " + rawValue.getClass().getName());
-            }
-            metadataObject.put("value", entry.getValue());                      
+                throw new IllegalArgumentException("Metadata value " + rawValue + " with key " + entry.getKey() + " has unknown type.");
+            }            
+            metadataObject.put("value", rawValue);                      
             array.put(metadataObject);
         }                
         return array;        

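The JSONUtilities change above tightens the metadata value-type dispatch: null values now throw, booleans are stored as integers (1/0), and unknown types produce a descriptive error. The dispatch can be sketched standalone (class and method names are illustrative; short-circuit `||` is used here where the original uses bitwise `|`, with identical results for booleans):

```java
public class MetadataTypes {

    // Map a raw metadata value to its datacat type name, mirroring the
    // dispatch in JSONUtilities after this commit.
    static String metadataType(Object rawValue) {
        if (rawValue == null) {
            throw new IllegalArgumentException("The metadata value is null.");
        }
        if (rawValue instanceof String) {
            return "string";
        } else if (rawValue instanceof Integer || rawValue instanceof Long) {
            return "integer";
        } else if (rawValue instanceof Float || rawValue instanceof Double) {
            return "decimal";
        } else if (rawValue instanceof Boolean) {
            return "integer"; // the value itself is converted to 1 or 0
        }
        throw new IllegalArgumentException("Unknown type " + rawValue.getClass().getName());
    }

    public static void main(String[] args) {
        System.out.println(metadataType(Boolean.TRUE)); // integer
    }
}
```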
Modified: java/trunk/distribution/pom.xml
 =============================================================================
--- java/trunk/distribution/pom.xml	(original)
+++ java/trunk/distribution/pom.xml	Wed Feb 24 13:06:58 2016
@@ -88,35 +88,31 @@
                                 </program>
                                 <program>
                                     <mainClass>org.hps.job.JobManager</mainClass>
-                                    <id>job</id>
+                                    <id>job-manager</id>
                                 </program>
                                 <program>
                                     <mainClass>org.hps.conditions.cli.CommandLineTool</mainClass>
-                                    <id>conddb</id>
-                                </program>
-                                <program>
-                                    <mainClass>org.hps.crawler.DatacatCrawler</mainClass>
-                                    <id>crawler</id>
+                                    <id>conditions-cli</id>
                                 </program>
                                 <program>
                                     <mainClass>org.hps.run.database.RunDatabaseCommandLine</mainClass>
-                                    <id>rundb</id>
+                                    <id>run-database-cli</id>
                                 </program>
                                 <program>
                                     <mainClass>org.hps.monitoring.application.Main</mainClass>
-                                    <id>monapp</id>
+                                    <id>monitoring-app</id>
                                 </program>
                                 <program>
                                     <mainClass>org.lcsim.geometry.compact.converter.Main</mainClass>
-                                    <id>detcnv</id>
+                                    <id>detector-converter</id>
                                 </program>
                                 <program>
                                     <mainClass>org.hps.record.evio.EvioFileProducer</mainClass>
-                                    <id>evio_file_producer</id>
+                                    <id>evio-file-producer</id>
                                 </program>
                                 <program>
                                     <mainClass>org.jlab.coda.et.apps.StartEt</mainClass>
-                                    <id>et_server</id>
+                                    <id>et-server</id>
                                     <commandLineArguments>
                                         <commandLineArgument>-f</commandLineArgument>
                                         <commandLineArgument>ETBuffer</commandLineArgument>
@@ -124,6 +120,18 @@
                                         <commandLineArgument>20000</commandLineArgument>
                                         <commandLineArgument>-v</commandLineArgument>
                                     </commandLineArguments>
+                                </program>
+                                <program>
+                                    <mainClass>org.hps.crawler.MetadataWriter</mainClass>
+                                    <id>dc-create-metadata</id>
+                                </program>
+                                <program>
+                                    <mainClass>org.hps.crawler.DatacatAddFile</mainClass>
+                                    <id>dc-add-file</id>
+                                </program>
+                                <program>
+                                    <mainClass>org.hps.crawler.DatacatCrawler</mainClass>
+                                    <id>dc-crawler</id>
                                 </program>
                             </programs>
                         </configuration>

Modified: java/trunk/ecal-recon/src/main/java/org/hps/recon/ecal/EcalOnlineRawConverter.java
 =============================================================================
--- java/trunk/ecal-recon/src/main/java/org/hps/recon/ecal/EcalOnlineRawConverter.java	(original)
+++ java/trunk/ecal-recon/src/main/java/org/hps/recon/ecal/EcalOnlineRawConverter.java	Wed Feb 24 13:06:58 2016
@@ -46,7 +46,7 @@
 				System.out.println("======================================================================");
 				System.out.println("=== FADC Pulse-Processing Settings ===================================");
 				System.out.println("======================================================================");
-				config.printConfig();
+				config.printConfig(System.out);
 			}
     	});
     }

Modified: java/trunk/ecal-recon/src/main/java/org/hps/recon/ecal/EcalRawConverter.java
 =============================================================================
--- java/trunk/ecal-recon/src/main/java/org/hps/recon/ecal/EcalRawConverter.java	(original)
+++ java/trunk/ecal-recon/src/main/java/org/hps/recon/ecal/EcalRawConverter.java	Wed Feb 24 13:06:58 2016
@@ -181,7 +181,7 @@
 					System.out.println("======================================================================");
 					System.out.println("=== FADC Pulse-Processing Settings ===================================");
 					System.out.println("======================================================================");
-					config.printConfig();
+					config.printConfig(System.out);
 				}
 			}
     	});

Modified: java/trunk/evio/src/main/java/org/hps/evio/EvioToLcio.java
 =============================================================================
--- java/trunk/evio/src/main/java/org/hps/evio/EvioToLcio.java	(original)
+++ java/trunk/evio/src/main/java/org/hps/evio/EvioToLcio.java	Wed Feb 24 13:06:58 2016
@@ -21,6 +21,7 @@
 import org.apache.commons.cli.Option;
 import org.apache.commons.cli.Options;
 import org.apache.commons.cli.ParseException;
+import org.apache.commons.cli.DefaultParser;
 import org.apache.commons.cli.PosixParser;
 import org.freehep.record.source.NoSuchRecordException;
 import org.hps.conditions.database.DatabaseConditionsManager;
@@ -374,6 +375,10 @@
         // Process the LCSim job variable definitions, if any.
         jobManager = new JobManager();
         
+        // Initialize run manager and add as listener on conditions system.
+        RunManager runManager = RunManager.getRunManager();
+        DatabaseConditionsManager.getInstance().addConditionsListener(runManager);
+        
         // Enable dry run because events will be processed individually.
         jobManager.setDryRun(true);
         

Modified: java/trunk/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java
 =============================================================================
--- java/trunk/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java	(original)
+++ java/trunk/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java	Wed Feb 24 13:06:58 2016
@@ -67,6 +67,11 @@
      * Modulus of TI timestamp offset (units of nanoseconds).
      */
     private final long timestampCycle = 24 * 6 * 35;
+    
+    /**
+     * The current TI time offset in nanoseconds from the run manager.
+     */
+    private Long currentTiTimeOffset = null;
 
     /**
      * Class constructor.
@@ -83,36 +88,51 @@
         intBanks.add(new IntBankDefinition(TIData.class, new int[]{sspCrateBankTag, 0xe10a}));
         intBanks.add(new IntBankDefinition(HeadBankData.class, new int[]{sspCrateBankTag, 0xe10f}));
         intBanks.add(new IntBankDefinition(TDCData.class, new int[]{0x3a, 0xe107}));
-        // ecalReader = new ECalEvioReader(0x25, 0x27);
         triggerConfigReader = new TriggerConfigEvioReader();
         svtEventFlagger = new SvtEventFlagger();
     }
 
     @Override
     public void conditionsChanged(final ConditionsEvent conditionsEvent) {
+        
         super.conditionsChanged(conditionsEvent);
         svtEventFlagger.initialize();
-    }
-
-    /**
-     * Get the time from the TI data.
+        
+        // Set TI time offset from run database.
+        setTiTimeOffsetForRun(conditionsEvent.getConditionsManager().getRun());
+    }
+    
+    /**
+     * Get TI time offset from the run database, if available.
+     * @param run the run number
+     */
+    private void setTiTimeOffsetForRun(int run) {
+        currentTiTimeOffset = null; /* Reset TI offset to null indicating it is not available for the run. */
+        RunManager runManager = RunManager.getRunManager();
+        if (runManager.getRun() != null) {
+            if (runManager.runExists()) {
+                currentTiTimeOffset = runManager.getRunSummary().getTiTimeOffset();
+                LOGGER.info("TI time offset set to " + currentTiTimeOffset + " for run "
+                        + run + " from database");
+            } else {
+                LOGGER.warning("Run " + run 
+                        + " does not exist in the run database.");
+            }
+        } else {
+            LOGGER.info("Run manager is not initialized; TI time offset not available.");
+        }
+    }
+
+    /**
+     * Get the time from the TI data with time offset applied from run database.
      *
      * @param triggerList the TI data list
      */
     @Override
     protected long getTime(final List<AbstractIntData> triggerList) {
         long tiTimeOffset = 0;
-        try {
-            if (RunManager.getRunManager().runExists() && RunManager.getRunManager().getTriggerConfig().getTiTimeOffset() != null) {
-                tiTimeOffset = (RunManager.getRunManager().getTriggerConfig().getTiTimeOffset() / timestampCycle) * timestampCycle;
-            }
-            try {
-                RunManager.getRunManager().closeConnection();
-            } catch (Exception e) {
-                e.printStackTrace();
-            }
-        } catch (IllegalStateException e) {
-            // May happen if RunManager is not initialized; just ignore.
+        if (currentTiTimeOffset != null) {
+            tiTimeOffset = (currentTiTimeOffset / timestampCycle) * timestampCycle;
         }
         for (final AbstractIntData data : triggerList) {
             if (data instanceof TIData) {

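The builder above applies the TI time offset only after truncating it to a whole number of timestamp cycles via `(currentTiTimeOffset / timestampCycle) * timestampCycle`, where the cycle is 24 * 6 * 35 = 5040 ns. The integer division rounds the offset down to the nearest cycle boundary; a minimal sketch (names are illustrative):

```java
public class TiOffsetTruncation {

    // Modulus of the TI timestamp offset in nanoseconds (24 * 6 * 35,
    // as declared in LCSimEngRunEventBuilder).
    static final long TIMESTAMP_CYCLE = 24L * 6L * 35L; // 5040 ns

    // Integer division truncates the raw offset down to a whole number
    // of cycles, mirroring the expression used in getTime().
    static long truncateToCycle(long tiTimeOffset) {
        return (tiTimeOffset / TIMESTAMP_CYCLE) * TIMESTAMP_CYCLE;
    }

    public static void main(String[] args) {
        System.out.println(truncateToCycle(12345)); // 10080
    }
}
```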
Modified: java/trunk/integration-tests/src/test/java/org/hps/test/it/ReconSteeringTest.java
 =============================================================================
--- java/trunk/integration-tests/src/test/java/org/hps/test/it/ReconSteeringTest.java	(original)
+++ java/trunk/integration-tests/src/test/java/org/hps/test/it/ReconSteeringTest.java	Wed Feb 24 13:06:58 2016
@@ -28,6 +28,7 @@
         job.addVariableDefinition("outputFile", outputFile.getPath());
         job.addInputFile(inputFile);
         job.setup(STEERING_RESOURCE);
+        job.setNumberOfEvents(1000);
         job.run();
         System.out.println("Done processing " + job.getLCSimLoop().getTotalCountableConsumed() + " events.");
                             

Modified: java/trunk/job/pom.xml
 =============================================================================
--- java/trunk/job/pom.xml	(original)
+++ java/trunk/job/pom.xml	Wed Feb 24 13:06:58 2016
@@ -19,5 +19,9 @@
             <groupId>org.hps</groupId>
             <artifactId>hps-detector-model</artifactId>
         </dependency>
+        <dependency>
+            <groupId>org.hps</groupId>
+            <artifactId>hps-run-database</artifactId>
+        </dependency>        
     </dependencies>
 </project>

Copied: java/trunk/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java (from r4241, java/branches/jeremy-dev/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java)
 =============================================================================
--- java/branches/jeremy-dev/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java	(original)
+++ java/trunk/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java	Wed Feb 24 13:06:58 2016
@@ -59,10 +59,13 @@
      */
     @Override
     public void configure() {
-            
+
+        LOGGER.info("configuring conditions system");
+
         // Initialize the db conditions manager.
+        DatabaseConditionsManager.resetInstance();
         DatabaseConditionsManager conditionsManager = DatabaseConditionsManager.getInstance();
-        
+
         if (enableRunManager) {
             LOGGER.config("adding run manager conditions listener");
             conditionsManager.addConditionsListener(RunManager.getRunManager());
@@ -82,6 +85,8 @@
         for (ConditionsListener listener : listeners) {
             conditionsManager.addConditionsListener(listener);
         }
+
+        LOGGER.info("done configuring conditions system");
     }
      
     /**
@@ -91,10 +96,12 @@
      */
     @Override
     public void postInitialize() {
+        LOGGER.config("conditions setup post init");
         if (DatabaseConditionsManager.getInstance().isInitialized() || this.freeze) {
             LOGGER.config("Job manager is freezing the conditions system.");
             DatabaseConditionsManager.getInstance().freeze();
         }
+        LOGGER.config("done with post init");
     }
     
     /**
@@ -104,7 +111,9 @@
      */
     @Override
     public void cleanup() {
-        
+ 
+        LOGGER.config("conditions cleanup");
+
         // Close the conditions database connection.
         Connection connection = DatabaseConditionsManager.getInstance().getConnection();
         try {
@@ -119,5 +128,7 @@
         if (enableRunManager) {
             RunManager.getRunManager().closeConnection();
         }
+
+        LOGGER.config("done cleaning up");
     }
 }
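The `resetInstance()` call added to `configure()` above ensures each job starts from a fresh conditions manager rather than reusing state left over from a previous run. A minimal sketch of that reset-then-get singleton pattern (class and method names here are stand-ins, not the actual HPS classes):

```java
// Hypothetical sketch of the reset-then-get singleton lifecycle; this is not
// the real DatabaseConditionsManager, only an illustration of the pattern.
final class ConditionsManager {
    private static ConditionsManager instance;

    // Drop the cached instance so the next getInstance() builds a fresh one.
    static synchronized void resetInstance() {
        instance = null;
    }

    static synchronized ConditionsManager getInstance() {
        if (instance == null) {
            instance = new ConditionsManager();
        }
        return instance;
    }
}

public class SingletonResetDemo {
    public static void main(String[] args) {
        ConditionsManager a = ConditionsManager.getInstance();
        ConditionsManager.resetInstance();
        ConditionsManager b = ConditionsManager.getInstance();
        System.out.println(a == b); // prints false: stale state is not reused
    }
}
```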

Modified: java/trunk/job/src/main/java/org/hps/job/JobManager.java
 =============================================================================
--- java/trunk/job/src/main/java/org/hps/job/JobManager.java	(original)
+++ java/trunk/job/src/main/java/org/hps/job/JobManager.java	Wed Feb 24 13:06:58 2016
@@ -1,20 +1,21 @@
 package org.hps.job;
 
-import java.io.InputStream;
+import java.util.HashSet;
+import java.util.Set;
 
-import org.hps.conditions.ConditionsDriver;
-import org.hps.conditions.database.DatabaseConditionsManager;
-import org.hps.detector.svt.SvtDetectorSetup;
+import org.apache.commons.cli.CommandLine;
+import org.apache.commons.cli.Options;
 import org.lcsim.job.JobControlManager;
-import org.lcsim.util.Driver;
 
 /**
- * Extension of standard LCSim job manager which does some HPS-specific management of the conditions system.
+ * Extension of standard LCSim job manager.
+ * <p>
+ * Provides setup of database conditions system and adds option to provide conditions system tags.
  *
  * @author Jeremy McCormick, SLAC
  */
-public class JobManager extends JobControlManager {
-
+public final class JobManager extends JobControlManager {
+    
     /**
      * Run the job manager from the command line.
      *
@@ -31,58 +32,42 @@
      * Class constructor.
      */
     public JobManager() {
-    }
-
-    /**
-     * Override setup so the conditions system can be reset.
-     * 
-     * @param is the input stream containing config information
-     */
-    public void setup(InputStream is) {
-        
-        // Add class that will setup SVT detector with conditions data (this is awkward but has to be done someplace).
-        DatabaseConditionsManager.getInstance().addConditionsListener(new SvtDetectorSetup());
-        
-        super.setup(is);
-                
-        // Setup the conditions system if there is a ConditionsDriver present.
-        this.setupConditionsDriver();
+        conditionsSetup = new DatabaseConditionsManagerSetup();
     }
     
     /**
-     * Override the parent classes method that runs the job in order to perform conditions system initialization.
-     *
-     * @return <code>true</code> if job was successful
+     * Get the conditions setup.
+     * @return the conditions setup
+     */
+    public DatabaseConditionsManagerSetup getDatabaseConditionsManagerSetup() {
+        return (DatabaseConditionsManagerSetup) this.conditionsSetup;
+    }
+    
+    /**
+     * Override creation of command line options.
+     * @return the overridden command line options
      */
     @Override
-    public final boolean run() {
-        
-        // Run the job.
-        final boolean result = super.run();
-
-        // Close the conditions database connection if it is open.
-        DatabaseConditionsManager.getInstance().closeConnection();
-
-        return result;
+    protected Options createCommandLineOptions() {
+        Options options = super.createCommandLineOptions();
+        options.addOption("t", "tag", true, "conditions system tag (can be used multiple times)");
+        return options;
     }
-
+    
     /**
-     * This method will find the {@link org.hps.conditions.ConditionsDriver} in the list of Drivers registered with the
-     * manager and then execute its initialization method, which may override the default behavior of the conditions
-     * system.
+     * Override command line parsing.
+     * @return the overridden, parsed command line
      */
-    private void setupConditionsDriver() {
-        ConditionsDriver conditionsDriver = null;
-        for (final Driver driver : this.getDriverAdapter().getDriver().drivers()) {
-            if (driver instanceof ConditionsDriver) {
-                conditionsDriver = (ConditionsDriver) driver;
-                break;
+    @Override
+    public CommandLine parse(final String args[]) {
+        CommandLine commandLine = super.parse(args);
+        if (commandLine.hasOption("t")) {
+            Set<String> tags = new HashSet<String>();
+            for (String tag : commandLine.getOptionValues("t")) {
+                tags.add(tag);
             }
+            getDatabaseConditionsManagerSetup().setTags(tags);
         }
-        if (conditionsDriver != null) {
-            LOGGER.config("initializing conditions Driver");            
-            conditionsDriver.initialize();
-            LOGGER.warning("Conditions driver will be removed soon!");
-        }
+        return commandLine;
     }
 }
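The new `-t`/`--tag` option can be given multiple times, and `parse()` collapses all of its values into a `Set`, so repeated tags are deduplicated. A self-contained sketch of that collection step, with a hand-rolled argument loop standing in for commons-cli's `CommandLine.getOptionValues("t")`:

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of collecting a repeatable -t/--tag option into a Set, mirroring
// the behavior of JobManager.parse(); the parsing loop is a stand-in for
// Apache Commons CLI, not the real implementation.
public class TagArgsDemo {
    static Set<String> collectTags(String[] args) {
        Set<String> tags = new HashSet<>();
        for (int i = 0; i < args.length - 1; i++) {
            if (args[i].equals("-t") || args[i].equals("--tag")) {
                tags.add(args[++i]); // consume the option's value
            }
        }
        return tags;
    }

    public static void main(String[] args) {
        Set<String> tags = collectTags(
                new String[] {"-t", "pass1", "--tag", "pass2", "-t", "pass1"});
        System.out.println(tags.size()); // prints 2: duplicate "pass1" collapses
    }
}
```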

Modified: java/trunk/logging/src/main/resources/org/hps/logging/config/logging.properties
 =============================================================================
--- java/trunk/logging/src/main/resources/org/hps/logging/config/logging.properties	(original)
+++ java/trunk/logging/src/main/resources/org/hps/logging/config/logging.properties	Wed Feb 24 13:06:58 2016
@@ -52,7 +52,6 @@
 # ecal-recon
 org.hps.recon.ecal.level = CONFIG
 org.hps.recon.ecal.cluster.level = WARNING
-org.hps.recon.ecal.cluster.ClusterDriver.level = WARNING
 
 # recon
 org.hps.recon.filtering.level = WARNING
@@ -78,7 +77,7 @@
 # detector-model
 org.lcsim.detector.converter.compact.level = INFO
 org.lcsim.geometry.compact.converter.level = INFO
-org.hps.detector.svt.level = ALL
+org.hps.detector.svt.level = INFO
 
 # test data
 org.hps.data.test = INFO

Modified: java/trunk/logging/src/main/resources/org/hps/logging/config/test_logging.properties
 =============================================================================
--- java/trunk/logging/src/main/resources/org/hps/logging/config/test_logging.properties	(original)
+++ java/trunk/logging/src/main/resources/org/hps/logging/config/test_logging.properties	Wed Feb 24 13:06:58 2016
@@ -21,13 +21,13 @@
 org.freehep.math.minuit = OFF
 
 # lcsim job
-org.lcsim.job.level = WARNING
+org.lcsim.job.level = CONFIG
 org.lcsim.job.EventMarkerDriver.level = OFF
 org.lcsim.job.EventPrintLoopAdapter = ALL
 
 # conditions
 org.hps.conditions.api.level = WARNING
-org.hps.conditions.database.level = CONFIG
+org.hps.conditions.database.level = WARNING
 org.hps.conditions.cli.level = WARNING
 org.hps.conditions.ecal.level = WARNING
 org.hps.conditions.svt.level = WARNING
@@ -47,12 +47,11 @@
 org.hps.crawler.level = WARNING
 
 # datacat
-org.hps.datacat.client.level = ALL
+org.hps.datacat.client.level = OFF
 
 # ecal-recon
 org.hps.recon.ecal.level = WARNING
 org.hps.recon.ecal.cluster.level = WARNING
-org.hps.recon.ecal.cluster.ClusterDriver.level = WARNING
 
 # recon
 org.hps.recon.filtering.level = WARNING
@@ -82,3 +81,6 @@
 
 # test data
 org.hps.data.test = INFO
+
+# HPS job manager
+org.hps.job.JobManager = WARNING
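For reference, `java.util.logging` applies entries like these when `LogManager` reads the properties file; per-logger thresholds are keyed as `<logger name>.level`. A small standalone sketch of that mechanism (the logger name is reused from the file above, but the in-memory setup is illustrative only):

```java
import java.io.ByteArrayInputStream;
import java.util.logging.Level;
import java.util.logging.LogManager;
import java.util.logging.Logger;

// Sketch of how a "<logger>.level" property controls what a logger accepts
// after LogManager.readConfiguration() loads the properties.
public class LoggingConfigDemo {
    public static void main(String[] args) throws Exception {
        String props = "org.hps.job.JobManager.level = WARNING\n";
        LogManager.getLogManager().readConfiguration(
                new ByteArrayInputStream(props.getBytes("UTF-8")));
        Logger log = Logger.getLogger("org.hps.job.JobManager");
        System.out.println(log.isLoggable(Level.INFO));    // prints false
        System.out.println(log.isLoggable(Level.WARNING)); // prints true
    }
}
```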

Modified: java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/EventProcessing.java
 =============================================================================
--- java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/EventProcessing.java	(original)
+++ java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/EventProcessing.java	Wed Feb 24 13:06:58 2016
@@ -6,11 +6,14 @@
 import java.io.InputStream;
 import java.util.ArrayList;
 import java.util.Arrays;
+import java.util.HashSet;
 import java.util.List;
+import java.util.Set;
 import java.util.logging.Logger;
 
 import org.freehep.record.loop.RecordLoop.Command;
 import org.hps.conditions.database.DatabaseConditionsManager;
+import org.hps.job.DatabaseConditionsManagerSetup;
 import org.hps.job.JobManager;
 import org.hps.monitoring.application.model.ConfigurationModel;
 import org.hps.monitoring.application.model.ConnectionStatus;
@@ -34,7 +37,6 @@
 import org.jlab.coda.et.exception.EtClosedException;
 import org.jlab.coda.et.exception.EtException;
 import org.lcsim.conditions.ConditionsListener;
-import org.lcsim.conditions.ConditionsReader;
 import org.lcsim.util.Driver;
 
 /**
@@ -194,11 +196,6 @@
     private SessionState sessionState;
    
     /**
-     * The current conditions manager.
-     */ 
-    private DatabaseConditionsManager conditionsManager;
-
-    /**
      * Class constructor, which will initialize with reference to the current monitoring application and lists of extra
      * processors to add to the loop, as well as supplemental conditions listeners that activate when the conditions
      * change.
@@ -306,7 +303,7 @@
      *
      * @param configurationModel the current global {@link org.hps.monitoring.application.ConfigurationModel} object
      */
-    private void createEventBuilder(final ConfigurationModel configurationModel) {
+    private LCSimEventBuilder createEventBuilder(final ConfigurationModel configurationModel) {
 
         // Get the class for the event builder.
         final String eventBuilderClassName = configurationModel.getEventBuilderClassName();
@@ -318,9 +315,8 @@
         } catch (final Exception e) {
             throw new RuntimeException("Failed to create LCSimEventBuilder.", e);
         }
-
-        // Add the builder as a listener so it is notified when conditions change.
-        this.conditionsManager.addConditionsListener(this.sessionState.eventBuilder);
+        
+        return this.sessionState.eventBuilder; 
     }
 
     /**
@@ -475,27 +471,59 @@
         try {
             // Create the job manager. A new conditions manager is instantiated from this call but not configured.
             this.sessionState.jobManager = new JobManager();
-
-            // Set ref to current conditions manager.
-            this.conditionsManager = DatabaseConditionsManager.getInstance();
             
-            // Add conditions listeners after new database conditions manager is initialized from the job manager.
+            // Setup class for conditions system.
+            DatabaseConditionsManagerSetup conditions = new DatabaseConditionsManagerSetup();
+            
+            // Disable run manager.
+            conditions.setEnableRunManager(false);
+            
+            // Setup the event builder to translate from EVIO to LCIO.
+            LCSimEventBuilder eventBuilder = this.createEventBuilder(configurationModel);            
+            conditions.addConditionsListener(eventBuilder);
+                        
+            // Add extra conditions listeners.
             for (final ConditionsListener conditionsListener : this.sessionState.conditionsListeners) {
-                this.logger.config("adding conditions listener " + conditionsListener.getClass().getName());
-                this.conditionsManager.addConditionsListener(conditionsListener);
-            }
-
+                this.logger.config("Adding conditions listener " + conditionsListener.getClass().getName());
+                conditions.addConditionsListener(conditionsListener);
+            }
+
+            // Add detector alias.
             if (configurationModel.hasValidProperty(ConfigurationModel.DETECTOR_ALIAS_PROPERTY)) {
-                // Set a detector alias.
-                ConditionsReader.addAlias(configurationModel.getDetectorName(),
+                conditions.addAlias(configurationModel.getDetectorName(),
                         "file://" + configurationModel.getDetectorAlias());
-                this.logger.config("using detector alias " + configurationModel.getDetectorAlias());
-            }
-
-            // Setup the event builder to translate from EVIO to LCIO.
-            // This must happen before Driver setup so the builder's listeners are activated first!
-            this.createEventBuilder(configurationModel);
-
+                this.logger.config("Added detector alias " + configurationModel.getDetectorAlias() 
+                        + " for " + configurationModel.getDetectorName());
+            }
+
+            // Add conditions tag.
+            if (configurationModel.hasValidProperty(ConfigurationModel.CONDITIONS_TAG_PROPERTY)
+                    && !configurationModel.getConditionsTag().equals("")) {
+                Set<String> tags = new HashSet<String>();
+                tags.add(configurationModel.getConditionsTag());
+                this.logger.config("Added conditions tag " + configurationModel.getConditionsTag());
+                conditions.setTags(tags);
+            }
+            
+            // Set user specified job number.
+            if (configurationModel.hasValidProperty(ConfigurationModel.USER_RUN_NUMBER_PROPERTY)) {
+                final int userRun = configurationModel.getUserRunNumber();
+                this.logger.config("User run number set to " + userRun);
+                conditions.setRun(userRun);
+            }
+            
+            // Set detector name.
+            conditions.setDetectorName(configurationModel.getDetectorName());
+            
+            // Freeze the conditions system to ignore run numbers from event data.
+            if (configurationModel.hasPropertyKey(ConfigurationModel.FREEZE_CONDITIONS_PROPERTY)) {
+                this.logger.config("user configured to freeze conditions system");
+                conditions.setFreeze(configurationModel.getFreezeConditions());
+            }
+                        
+            // Register the configured conditions settings with the job manager.
+            this.sessionState.jobManager.setConditionsSetup(conditions);
+                        
             // Configure the job manager for the XML steering.
             this.sessionState.jobManager.setDryRun(true);
             if (steeringType == SteeringType.RESOURCE) {
@@ -503,32 +531,10 @@
             } else if (steeringType.equals(SteeringType.FILE)) {
                 this.setupSteeringFile(steering);
             }
-
-            // Set conditions tag if applicable.
-            if (configurationModel.hasValidProperty(ConfigurationModel.CONDITIONS_TAG_PROPERTY)
-                    && !configurationModel.getConditionsTag().equals("")) {
-                this.logger.config("conditions tag is set to " + configurationModel.getConditionsTag());
-            } else {
-                this.logger.config("conditions NOT using a tag");
-            }
-
-            // Is there a user specified run number from the JobPanel?
-            if (configurationModel.hasValidProperty(ConfigurationModel.USER_RUN_NUMBER_PROPERTY)) {
-                final int userRunNumber = configurationModel.getUserRunNumber();
-                final String detectorName = configurationModel.getDetectorName();
-                this.logger.config("setting user run number " + userRunNumber + " with detector " + detectorName);
-                conditionsManager.setDetector(detectorName, userRunNumber);
-                if (configurationModel.hasPropertyKey(ConfigurationModel.FREEZE_CONDITIONS_PROPERTY)) {
-                    // Freeze the conditions system to ignore run numbers from the events.
-                    this.logger.config("user configured to freeze conditions system");
-                    this.conditionsManager.freeze();
-                } else {
-                    // Allow run numbers to be picked up from the events.
-                    this.logger.config("user run number provided but conditions system is NOT frozen");
-                    this.conditionsManager.unfreeze();
-                }
-            }
-
+            
+            // Post-init conditions system which may freeze if run and name were provided.
+            this.sessionState.jobManager.getConditionsSetup().postInitialize();
+            
             this.logger.info("lcsim setup was successful");
 
         } catch (final Throwable t) {
@@ -595,15 +601,19 @@
             this.logger.config("added extra Driver " + driver.getName());
         }
 
-        // Enable conditions system activation from EVIO event data in case the PRESTART is missed.
-        loopConfig.add(new EvioDetectorConditionsProcessor(configurationModel.getDetectorName()));
-        this.logger.config("added EvioDetectorConditionsProcessor to job with detector "
-                + configurationModel.getDetectorName());
+        // Enable conditions activation from EVIO; not needed if conditions are frozen for the job.
+        if (!DatabaseConditionsManager.getInstance().isFrozen()) {
+            loopConfig.add(new EvioDetectorConditionsProcessor(configurationModel.getDetectorName()));
+            this.logger.config("added EvioDetectorConditionsProcessor to job with detector "
+                    + configurationModel.getDetectorName());
+        } else {
+            this.logger.config("Conditions activation from EVIO is disabled.");
+        }
 
         // Create the CompositeLoop with the configuration.
         this.sessionState.loop = new CompositeLoop(loopConfig);
 
-        this.logger.config("record loop is setup");
+        this.logger.config("Record loop is setup.");
     }
 
     /**
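The reworked session setup above leans on freeze semantics: once the conditions system is frozen, run numbers seen in event data no longer trigger conditions changes, which is why the `EvioDetectorConditionsProcessor` can be skipped. A toy sketch of that contract (not the real `DatabaseConditionsManager` API):

```java
// Hypothetical illustration of freeze semantics: after freeze(), run-number
// updates coming from event data are ignored. Names are stand-ins only.
final class FreezableConditions {
    private boolean frozen;
    private int run = -1;

    void setRun(int run) {
        if (!frozen) {
            this.run = run;
        }
    }

    void freeze() {
        frozen = true;
    }

    boolean isFrozen() {
        return frozen;
    }

    int getRun() {
        return run;
    }
}

public class FreezeDemo {
    public static void main(String[] args) {
        FreezableConditions c = new FreezableConditions();
        c.setRun(5772);   // user-specified run, e.g. from the JobPanel
        c.freeze();
        c.setRun(9999);   // run number from event data is ignored
        System.out.println(c.getRun()); // prints 5772
    }
}
```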

Modified: java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/MonitoringApplication.java
 =============================================================================
--- java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/MonitoringApplication.java	(original)
+++ java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/MonitoringApplication.java	Wed Feb 24 13:06:58 2016
@@ -78,7 +78,6 @@
          */
         @Override
         public void close() throws SecurityException {
-            // Does nothing.
         }
 
         /**
@@ -86,7 +85,6 @@
          */
         @Override
         public void flush() {
-            // Does nothing.
         }
 
         /**
@@ -121,8 +119,6 @@
         @Override
         public void publish(final LogRecord record) {
             super.publish(record);
-
-            // FIXME: Is this efficient? Should this always happen here?
             flush();
         }
 
@@ -295,18 +291,18 @@
             loadConfiguration(new Configuration(DEFAULT_CONFIGURATION), false);
 
             if (userConfiguration != null) {
-                // Load user configuration.
+                // Load user configuration to supplement default settings.
                 loadConfiguration(userConfiguration, true);
             }
 
-            // Enable the GUI now that initialization is complete.
+            // Enable the GUI after initialization.
             this.frame.setEnabled(true);
 
-            LOGGER.info("application initialized successfully");
+            LOGGER.info("Monitoring app initialized successfully.");
 
         } catch (final Exception e) {
-            // Don't use the ErrorHandler here because we don't know that it initialized successfully.
-            System.err.println("MonitoringApplication failed to initialize without errors!");
+            // Initialization failed so print info and die.
+            System.err.println("ERROR: MonitoringApplication failed to initialize!");
             DialogUtil.showErrorDialog(null, "Error Starting Monitoring Application",
                     "Monitoring application failed to initialize.");
             e.printStackTrace();
@@ -321,9 +317,6 @@
      */
     @Override
     public void actionPerformed(final ActionEvent e) {
-
-        // logger.finest("actionPerformed - " + e.getActionCommand());
-
         final String command = e.getActionCommand();
         if (Commands.CONNECT.equals(command)) {
             startSession();
@@ -374,7 +367,7 @@
     }
 
     /**
-     * Redirect <code>System.out</code> and <code>System.err</code> to a file chosen by a file chooser.
+     * Redirect <code>System.out</code> and <code>System.err</code> to a chosen file.
      */
     private void chooseLogFile() {
         final JFileChooser fc = new JFileChooser();
@@ -420,7 +413,7 @@
     }
 
     /**
-     * Exit from the application from exit menu item or hitting close window button.
+     * Exit from the application.
      */
     private void exit() {
         if (this.connectionModel.isConnected()) {
@@ -954,7 +947,6 @@
 
             // Add listener to push conditions changes to conditions panel.
             final List<ConditionsListener> conditionsListeners = new ArrayList<ConditionsListener>();
-            //conditionsListeners.add(this.frame.getConditionsPanel().new ConditionsPanelListener());
 
             // Instantiate the event processing wrapper.
             this.processing = new EventProcessing(this, processors, drivers, conditionsListeners);
@@ -973,7 +965,7 @@
             // Start the event processing thread.
             this.processing.start();
 
-            LOGGER.info("new session successfully initialized");
+            LOGGER.info("Event processing session started.");
 
         } catch (final Exception e) {
 
@@ -989,7 +981,7 @@
                             "There was an error while starting the session." + '\n' + "See the log for details.",
                             "Session Error");
 
-            LOGGER.severe("failed to start new session");
+            LOGGER.severe("Failed to start event processing.");
         }
     }
 

Modified: java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/SystemStatusPanel.java
 =============================================================================
--- java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/SystemStatusPanel.java	(original)
+++ java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/SystemStatusPanel.java	Wed Feb 24 13:06:58 2016
@@ -1,6 +1,3 @@
-/**
- *
- */
 package org.hps.monitoring.application;
 
 import java.awt.BorderLayout;
@@ -81,7 +78,7 @@
         
         this.statuses.clear();
     }
-
+    
     private class SystemStatusBeeper extends TimerTask {
 
         @Override
@@ -93,7 +90,7 @@
                 }
             }
             if (isAlarming) {
-                System.out.println("beep\007");
+                Toolkit.getDefaultToolkit().beep();
             }
         }
     }

Modified: java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/util/TableExporter.java
 =============================================================================
--- java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/util/TableExporter.java	(original)
+++ java/trunk/monitoring-app/src/main/java/org/hps/monitoring/application/util/TableExporter.java	Wed Feb 24 13:06:58 2016
@@ -33,7 +33,7 @@
 
         // Column headers.
         for (int columnIndex = 0; columnIndex < columnCount; columnIndex++) {
-            buffer.append("\"" + model.getColumnName(columnIndex) + "\"" + fieldDelimiter);
+            buffer.append("\"" + model.getColumnName(columnIndex) + "\"" + fieldDelimiter + ",");
         }
         buffer.setLength(buffer.length() - 1);
         buffer.append('\n');
@@ -47,6 +47,7 @@
                 } else {
                     buffer.append("\"" + value + "\"" + fieldDelimiter);
                 }
+                buffer.append(",");
             }
             buffer.setLength(buffer.length() - 1);
             buffer.append('\n');
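On the delimiter handling above: an alternative way to build a delimited row is `StringJoiner`, which places the separator only between fields, so no trailing character needs trimming and separators cannot double up. A hypothetical sketch, not the `TableExporter` code itself:

```java
import java.util.StringJoiner;

// Sketch of delimiter-safe row building with StringJoiner; quoting and null
// handling here are illustrative assumptions, not TableExporter's rules.
public class CsvRowDemo {
    static String row(Object[] values, char fieldDelimiter) {
        StringJoiner joiner = new StringJoiner(String.valueOf(fieldDelimiter));
        for (Object value : values) {
            joiner.add(value == null ? "" : "\"" + value + "\"");
        }
        return joiner.toString();
    }

    public static void main(String[] args) {
        // prints "run","5772", (empty last field for the null value)
        System.out.println(row(new Object[] {"run", 5772, null}, ','));
    }
}
```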

Modified: java/trunk/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/trackrecon/PlotAndFitUtilities.java
 =============================================================================
--- java/trunk/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/trackrecon/PlotAndFitUtilities.java	(original)
+++ java/trunk/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/trackrecon/PlotAndFitUtilities.java	Wed Feb 24 13:06:58 2016
@@ -50,7 +50,7 @@
     static void plot(IPlotter plotter, IBaseHistogram histogram, IPlotterStyle style, int region) {
         if (style == null)
             style = getPlotterStyle(histogram);
-        System.out.println("Putting plot in region " + region);
+        //System.out.println("Putting plot in region " + region);
         plotter.region(region).plot(histogram, style);
 
     }

Modified: java/trunk/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/trackrecon/V0ReconPlots.java
 =============================================================================
--- java/trunk/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/trackrecon/V0ReconPlots.java	(original)
+++ java/trunk/monitoring-drivers/src/main/java/org/hps/monitoring/drivers/trackrecon/V0ReconPlots.java	Wed Feb 24 13:06:58 2016
@@ -58,7 +58,7 @@
 
     @Override
     protected void detectorChanged(Detector detector) {
-        System.out.println("V0Monitoring::detectorChanged  Setting up the plotter");
+        //System.out.println("V0Monitoring::detectorChanged  Setting up the plotter");
 
         IAnalysisFactory fac = aida.analysisFactory();
         IPlotterFactory pfac = fac.createPlotterFactory("V0 Recon");

Modified: java/trunk/parent/pom.xml
 =============================================================================
--- java/trunk/parent/pom.xml	(original)
+++ java/trunk/parent/pom.xml	Wed Feb 24 13:06:58 2016
@@ -12,7 +12,7 @@
     <properties>
         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
         <org.lcsim.cacheDir>${user.home}</org.lcsim.cacheDir>
-        <lcsimVersion>3.1.7</lcsimVersion>
+        <lcsimVersion>3.2-SNAPSHOT</lcsimVersion>
         <skipSite>false</skipSite>
         <skipPlugin>false</skipPlugin>
     </properties>
@@ -35,17 +35,22 @@
         </repository>
         <repository>
             <id>freehep-repo-public</id>
-            <name>FreeHEP Maven Public</name>
+            <name>FreeHEP</name>
             <url>http://srs.slac.stanford.edu/nexus/content/groups/freehep-maven2-public/</url>
         </repository>
         <repository>
+            <id>srs-repo-public</id>
+            <name>SRS</name>
+            <url>http://srs.slac.stanford.edu/nexus/content/groups/srs-maven2-public/</url>
+        </repository>        
+        <repository>
             <id>lcsim-repo-public</id>
-            <name>LCSIM Public Maven Repository</name>
+            <name>LCSim</name>
             <url>http://srs.slac.stanford.edu/nexus/content/groups/lcsim-maven2-public/</url>
         </repository>
         <repository>
             <id>jlab-coda-repo-public</id>
-            <name>JLAB CODA Maven Repository</name>
+            <name>CODA</name>
             <url>https://coda.jlab.org/maven/</url>
         </repository>
     </repositories>
@@ -59,13 +64,13 @@
     <distributionManagement>
         <repository>
             <id>lcsim-repo-releases</id>
-            <name>LCSIM Releases maven repository</name>
+            <name>LCSim Releases</name>
             <!--<url>http://srs.slac.stanford.edu/nexus/content/repositories/lcsim-maven2-releases/</url>-->
             <url>http://scalnx-v01.slac.stanford.edu:8180/nexus/content/repositories/lcsim-maven2-releases/</url>
         </repository>
         <snapshotRepository>
             <id>lcsim-repo-snapshots</id>
-            <name>LCSIM Snapshots maven repository</name>
+            <name>LCSim Snapshots</name>
             <!--<url>http://srs.slac.stanford.edu/nexus/content/repositories/lcsim-maven2-snapshot/</url>-->
             <url>http://scalnx-v01.slac.stanford.edu:8180/nexus/content/repositories/lcsim-maven2-snapshot/</url>
         </snapshotRepository>
@@ -289,6 +294,11 @@
                         <artifactId>commons-math</artifactId>
                     </exclusion>
                 </exclusions>
+            </dependency>
+            <dependency>
+                <groupId>srs</groupId>
+                <artifactId>org-srs-datacat-client</artifactId>
+                <version>0.5-SNAPSHOT</version>
             </dependency>
         </dependencies>
     </dependencyManagement>

Modified: java/trunk/record-util/src/main/java/org/hps/record/AbstractRecordProcessor.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/AbstractRecordProcessor.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/AbstractRecordProcessor.java	Wed Feb 24 13:06:58 2016
@@ -10,6 +10,8 @@
  */
 public abstract class AbstractRecordProcessor<RecordType> implements RecordProcessor<RecordType> {
 
+    private boolean active = true;
+    
     /**
      * End of job action.
      */
@@ -59,4 +61,12 @@
     @Override
     public void suspend() {
     }
+    
+    protected void setActive(boolean active) {
+        this.active = active;
+    }
+    
+    public boolean isActive() {
+        return this.active;
+    }
 }

Modified: java/trunk/record-util/src/main/java/org/hps/record/AbstractRecordQueue.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/AbstractRecordQueue.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/AbstractRecordQueue.java	Wed Feb 24 13:06:58 2016
@@ -102,7 +102,7 @@
      * @throws NoSuchRecordException if there are no records available from the queue
      */
     @Override
-    public void next() throws IOException, NoSuchRecordException {
+    public synchronized void next() throws IOException, NoSuchRecordException {
         try {
             if (this.timeOutMillis > 0L) {
                 // Poll the queue for the next record until timeout is exceeded.
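Making `next()` synchronized serializes consumers of the queue; the timed-poll behavior it wraps comes from `BlockingQueue.poll(timeout, unit)`. A standalone sketch of that pattern:

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Sketch of the timed-poll pattern used by AbstractRecordQueue.next():
// poll() returns a record as soon as one is available, or null once the
// timeout expires on an empty queue.
public class TimedPollDemo {
    public static void main(String[] args) throws InterruptedException {
        LinkedBlockingQueue<String> queue = new LinkedBlockingQueue<>();
        queue.add("record-1");
        // A queued record is returned immediately.
        System.out.println(queue.poll(100, TimeUnit.MILLISECONDS)); // prints record-1
        // An empty queue yields null after the timeout.
        System.out.println(queue.poll(10, TimeUnit.MILLISECONDS));  // prints null
    }
}
```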

Modified: java/trunk/record-util/src/main/java/org/hps/record/RecordProcessor.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/RecordProcessor.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/RecordProcessor.java	Wed Feb 24 13:06:58 2016
@@ -45,4 +45,11 @@
      * Suspend processing action.
      */
     void suspend();
+    
+    /**
+     * Return <code>true</code> if processor is active.
+     * 
+     * @return <code>true</code> if processor is active
+     */
+    boolean isActive();
 }

Modified: java/trunk/record-util/src/main/java/org/hps/record/daqconfig/DAQConfig.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/daqconfig/DAQConfig.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/daqconfig/DAQConfig.java	Wed Feb 24 13:06:58 2016
@@ -1,4 +1,8 @@
 package org.hps.record.daqconfig;
+
+import java.io.ByteArrayOutputStream;
+import java.io.PrintStream;
+
 
 /**
  * Class <code>DAQConfig</code> holds all of the supported parameters
@@ -47,13 +51,23 @@
     }
 
     @Override
-    public void printConfig() {
+    public void printConfig(PrintStream ps) {
         // Print the system-specific objects.
-        fadcConfig.printConfig();
-        System.out.println();
-        gtpConfig.printConfig();
-        System.out.println();
-        sspConfig.printConfig();
+        fadcConfig.printConfig(ps);
+        ps.println();
+        gtpConfig.printConfig(ps);
+        ps.println();
+        sspConfig.printConfig(ps);
     }
-    
-}
+
+    public String toString() {
+        ByteArrayOutputStream os = new ByteArrayOutputStream();
+        PrintStream ps = new PrintStream(os);
+        printConfig(ps);
+        try {
+            return os.toString("UTF8");
+        } catch (Exception e) {
+            throw new RuntimeException(e);
+        }
+    }
+}
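The `toString()` added to `DAQConfig` captures the output of `printConfig(PrintStream)` in memory, so the same text can go either to `System.out` or into a `String`. A minimal standalone sketch of that PrintStream-to-String bridge (the method body is a stand-in for the real config dump):

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;

// Sketch of routing print logic through a PrintStream backed by a byte
// buffer, then recovering the text as a String.
public class PrintStreamToStringDemo {
    static void printConfig(PrintStream ps) {
        ps.println("FADC Configuration:");
        ps.printf("\tMode          :: %d%n", 1); // placeholder value
    }

    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream os = new ByteArrayOutputStream();
        printConfig(new PrintStream(os, true, "UTF-8"));
        System.out.print(os.toString("UTF-8"));
    }
}
```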

Modified: java/trunk/record-util/src/main/java/org/hps/record/daqconfig/FADCConfig.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/daqconfig/FADCConfig.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/daqconfig/FADCConfig.java	Wed Feb 24 13:06:58 2016
@@ -1,6 +1,7 @@
 package org.hps.record.daqconfig;
 
 import java.awt.Point;
+import java.io.PrintStream;
 import java.util.HashMap;
 import java.util.Map;
 
@@ -305,18 +306,18 @@
     }
     
     @Override
-    public void printConfig() {
+    public void printConfig(PrintStream ps) {
     	// Print the basic configuration information.
-        System.out.println("FADC Configuration:");
-        System.out.printf("\tMode          :: %d%n", mode);
-        System.out.printf("\tNSA           :: %d%n", nsa);
-        System.out.printf("\tNSB           :: %d%n", nsb);
-        System.out.printf("\tWindow Width  :: %d%n", windowWidth);
-        System.out.printf("\tWindow Offset :: %d%n", offset);
-        System.out.printf("\tMax Peaks     :: %d%n", maxPulses);
+        ps.println("FADC Configuration:");
+        ps.printf("\tMode          :: %d%n", mode);
+        ps.printf("\tNSA           :: %d%n", nsa);
+        ps.printf("\tNSB           :: %d%n", nsb);
+        ps.printf("\tWindow Width  :: %d%n", windowWidth);
+        ps.printf("\tWindow Offset :: %d%n", offset);
+        ps.printf("\tMax Peaks     :: %d%n", maxPulses);
         
         // Output the pedestal/gain write-out header.
-        System.out.println("\tix\tiy\tPedestal (ADC)\tGain (MeV/ADC)\tThreshold (ADC)");
+        ps.println("\tix\tiy\tPedestal (ADC)\tGain (MeV/ADC)\tThreshold (ADC)");
         
         // Iterate over each crystal y-index.
         yLoop:
@@ -335,7 +336,7 @@
         		
         		// Output the crystal indices, pedestal, and gain.
         		int channelID = indexChannelMap.get(new Point(ix, iy));
-        		System.out.printf("\t%3d\t%3d\t%8.3f\t%8.3f\t%4d%n", ix, iy,
+        		ps.printf("\t%3d\t%3d\t%8.3f\t%8.3f\t%4d%n", ix, iy,
         				getPedestal(channelID), getGain(channelID), getThreshold(channelID));
         	}
         }

Modified: java/trunk/record-util/src/main/java/org/hps/record/daqconfig/GTPConfig.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/daqconfig/GTPConfig.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/daqconfig/GTPConfig.java	Wed Feb 24 13:06:58 2016
@@ -1,4 +1,6 @@
 package org.hps.record.daqconfig;
+
+import java.io.PrintStream;
 
 /**
  * Class <code>GTPConfig</code> stores GTP configuration settings
@@ -57,14 +59,14 @@
     }
     
     @Override
-    public void printConfig() {
+    public void printConfig(PrintStream ps) {
         // Print the configuration header.
-        System.out.println("GTP Configuration:");
+        ps.println("GTP Configuration:");
         
         // Print the GTP settings.
-        System.out.printf("\tTime Window Before :: %d clock-cycles%n", windowBefore);
-        System.out.printf("\tTime Window After  :: %d clock-cycles%n", windowAfter);
-        System.out.printf("\tSeed Energy Min    :: %5.3f GeV%n",       seedCut.getLowerBound());
+        ps.printf("\tTime Window Before :: %d clock-cycles%n", windowBefore);
+        ps.printf("\tTime Window After  :: %d clock-cycles%n", windowAfter);
+        ps.printf("\tSeed Energy Min    :: %5.3f GeV%n",       seedCut.getLowerBound());
     }
 
 }

Modified: java/trunk/record-util/src/main/java/org/hps/record/daqconfig/IDAQConfig.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/daqconfig/IDAQConfig.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/daqconfig/IDAQConfig.java	Wed Feb 24 13:06:58 2016
@@ -1,4 +1,6 @@
 package org.hps.record.daqconfig;
+
+import java.io.PrintStream;
 
 
 /**
@@ -20,5 +22,5 @@
      * Prints a textual representation of the configuration bank to the
      * given print stream.
      */
-    public abstract void printConfig();
+    public abstract void printConfig(PrintStream ps);
 }

Modified: java/trunk/record-util/src/main/java/org/hps/record/daqconfig/SSPConfig.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/daqconfig/SSPConfig.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/daqconfig/SSPConfig.java	Wed Feb 24 13:06:58 2016
@@ -1,4 +1,6 @@
 package org.hps.record.daqconfig;
+
+import java.io.PrintStream;
 
 
 /**
@@ -98,64 +100,64 @@
     }
     
     @Override
-    public void printConfig() {
+    public void printConfig(PrintStream ps) {
         // Print the configuration header.
-        System.out.println("SSP Configuration:");
+        ps.println("SSP Configuration:");
         
         // Print the singles triggers.
         for(int triggerNum = 0; triggerNum < 2; triggerNum++) {
-            System.out.printf("\tSingles Trigger %d%n", (triggerNum + 1));
-            System.out.println("\t\tCluster Energy Lower Bound Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", singlesTrigger[triggerNum].getEnergyMinCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %5.3f GeV%n", singlesTrigger[triggerNum].getEnergyMinCutConfig().getLowerBound());
+            ps.printf("\tSingles Trigger %d%n", (triggerNum + 1));
+            ps.println("\t\tCluster Energy Lower Bound Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", singlesTrigger[triggerNum].getEnergyMinCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %5.3f GeV%n", singlesTrigger[triggerNum].getEnergyMinCutConfig().getLowerBound());
             
-            System.out.println("\t\tCluster Energy Upper Bound Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", singlesTrigger[triggerNum].getEnergyMaxCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %5.3f GeV%n", singlesTrigger[triggerNum].getEnergyMaxCutConfig().getUpperBound());
+            ps.println("\t\tCluster Energy Upper Bound Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", singlesTrigger[triggerNum].getEnergyMaxCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %5.3f GeV%n", singlesTrigger[triggerNum].getEnergyMaxCutConfig().getUpperBound());
             
-            System.out.println("\t\tCluster Hit Count Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", singlesTrigger[triggerNum].getHitCountCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %1.0f hits%n", singlesTrigger[triggerNum].getHitCountCutConfig().getLowerBound());
-            System.out.println();
+            ps.println("\t\tCluster Hit Count Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", singlesTrigger[triggerNum].getHitCountCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %1.0f hits%n", singlesTrigger[triggerNum].getHitCountCutConfig().getLowerBound());
+            ps.println();
         }
         
         // Print the pair triggers.
         for(int triggerNum = 0; triggerNum < 2; triggerNum++) {
-            System.out.printf("\tPair Trigger %d%n", (triggerNum + 1));
-            System.out.println("\t\tCluster Energy Lower Bound Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergyMinCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergyMinCutConfig().getLowerBound());
+            ps.printf("\tPair Trigger %d%n", (triggerNum + 1));
+            ps.println("\t\tCluster Energy Lower Bound Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergyMinCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergyMinCutConfig().getLowerBound());
             
-            System.out.println("\t\tCluster Energy Upper Bound Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergyMaxCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergyMaxCutConfig().getUpperBound());
+            ps.println("\t\tCluster Energy Upper Bound Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergyMaxCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergyMaxCutConfig().getUpperBound());
             
-            System.out.println("\t\tCluster Hit Count Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getHitCountCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %1.0f hits%n", pairTrigger[triggerNum].getHitCountCutConfig().getLowerBound());
+            ps.println("\t\tCluster Hit Count Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getHitCountCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %1.0f hits%n", pairTrigger[triggerNum].getHitCountCutConfig().getLowerBound());
             
-            System.out.println("\t\tPair Energy Sum Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergySumCutConfig().isEnabled());
-            System.out.printf("\t\t\tMin     :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergySumCutConfig().getLowerBound());
-            System.out.printf("\t\t\tMax     :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergySumCutConfig().getUpperBound());
+            ps.println("\t\tPair Energy Sum Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergySumCutConfig().isEnabled());
+            ps.printf("\t\t\tMin     :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergySumCutConfig().getLowerBound());
+            ps.printf("\t\t\tMax     :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergySumCutConfig().getUpperBound());
             
-            System.out.println("\t\tPair Energy Difference Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergyDifferenceCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergyDifferenceCutConfig().getUpperBound());
+            ps.println("\t\tPair Energy Difference Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergyDifferenceCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergyDifferenceCutConfig().getUpperBound());
             
-            System.out.println("\t\tPair Energy Slope Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergySlopeCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergySlopeCutConfig().getLowerBound());
-            System.out.printf("\t\t\tParam F :: %6.4f GeV/mm%n", pairTrigger[triggerNum].getEnergySlopeCutConfig().getParameterF());
+            ps.println("\t\tPair Energy Slope Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getEnergySlopeCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %5.3f GeV%n", pairTrigger[triggerNum].getEnergySlopeCutConfig().getLowerBound());
+            ps.printf("\t\t\tParam F :: %6.4f GeV/mm%n", pairTrigger[triggerNum].getEnergySlopeCutConfig().getParameterF());
             
-            System.out.println("\t\tPair Coplanarity Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getCoplanarityCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %3.0f degrees%n", pairTrigger[triggerNum].getCoplanarityCutConfig().getUpperBound());
+            ps.println("\t\tPair Coplanarity Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getCoplanarityCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %3.0f degrees%n", pairTrigger[triggerNum].getCoplanarityCutConfig().getUpperBound());
             
-            System.out.println("\t\tPair Time Coincidence Cut");
-            System.out.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getTimeDifferenceCutConfig().isEnabled());
-            System.out.printf("\t\t\tValue   :: %1.0f clock-cycles%n", pairTrigger[triggerNum].getTimeDifferenceCutConfig().getUpperBound());
-            System.out.println();
+            ps.println("\t\tPair Time Coincidence Cut");
+            ps.printf("\t\t\tEnabled :: %b%n", pairTrigger[triggerNum].getTimeDifferenceCutConfig().isEnabled());
+            ps.printf("\t\t\tValue   :: %1.0f clock-cycles%n", pairTrigger[triggerNum].getTimeDifferenceCutConfig().getUpperBound());
+            ps.println();
         }
     }
 }

Modified: java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsData.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsData.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsData.java	Wed Feb 24 13:06:58 2016
@@ -276,12 +276,13 @@
      * @param evioEvent the EVIO event
      * @return the EPICS data or <code>null</code> if it is not present in the event
      */
+    // FIXME: Not currently used.
     public static EpicsData getEpicsData(EvioEvent evioEvent) {
         
         EpicsData epicsData = null;
         
         // Is this an EPICS event?
-        if (EventTagConstant.EPICS.equals(evioEvent)) {
+        if (EventTagConstant.EPICS.matches(evioEvent)) {
 
             // Find the bank with the EPICS data string.
             final BaseStructure epicsBank = EvioBankTag.EPICS_STRING.findBank(evioEvent);

Modified: java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsEvioProcessor.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsEvioProcessor.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsEvioProcessor.java	Wed Feb 24 13:06:58 2016
@@ -45,7 +45,7 @@
     public void process(final EvioEvent evioEvent) {
 
         // Is this an EPICS event?
-        if (EventTagConstant.EPICS.equals(evioEvent)) {
+        if (EventTagConstant.EPICS.matches(evioEvent)) {
 
             LOGGER.fine("processing EPICS event " + evioEvent.getEventNumber());
 

Modified: java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsRunProcessor.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsRunProcessor.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/epics/EpicsRunProcessor.java	Wed Feb 24 13:06:58 2016
@@ -39,7 +39,7 @@
     private final EpicsEvioProcessor processor = new EpicsEvioProcessor();
 
     /**
-     * Create an EPICs log.
+     * Create a processor that collects EPICS data into a list.
      */
     public EpicsRunProcessor() {
     }
@@ -66,9 +66,9 @@
 
         // Add EPICS data to the collection.
         if (this.currentEpicsData != null) {
-            LOGGER.info("adding EPICS data for run " + this.currentEpicsData.getEpicsHeader().getRun() + " and timestamp " 
-                    + this.currentEpicsData.getEpicsHeader().getTimestamp() + " with seq " 
-                    + this.currentEpicsData.getEpicsHeader().getSequence());
+            LOGGER.fine("Adding EPICS data with run " + this.currentEpicsData.getEpicsHeader().getRun() + "; timestamp " 
+                    + this.currentEpicsData.getEpicsHeader().getTimestamp() + "; seq "
+                    + this.currentEpicsData.getEpicsHeader().getSequence() + ".");
             this.epicsDataSet.add(this.currentEpicsData);
         }
     }

Modified: java/trunk/record-util/src/main/java/org/hps/record/evio/EventTagConstant.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/evio/EventTagConstant.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/evio/EventTagConstant.java	Wed Feb 24 13:06:58 2016
@@ -63,7 +63,7 @@
         return tag == this.tag;
     }
     
-    public boolean equals(final EvioEvent evioEvent) {
+    public boolean matches(final EvioEvent evioEvent) {
         return evioEvent.getHeader().getTag() == this.tag;
     }
     

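Editor's note on the `equals` → `matches` rename above: `equals(EvioEvent)` was an overload, not an override of `Object.equals(Object)`, so calls through an `Object`-typed reference silently fell back to identity comparison. A minimal illustration with stand-in `Tag`/`Event` types (not the jevio API):

```java
public class OverloadPitfall {

    public static class Event {
        final int tag;
        public Event(int tag) { this.tag = tag; }
    }

    public enum Tag {
        EPICS(31);

        private final int tag;

        Tag(int tag) { this.tag = tag; }

        // An overload, NOT an override of Object.equals(Object); renaming it
        // to matches() makes the intent unambiguous at every call site.
        public boolean matches(Event e) { return e.tag == this.tag; }
    }

    public static void main(String[] args) {
        Event e = new Event(31);
        System.out.println(Tag.EPICS.matches(e));         // true: tag comparison
        System.out.println(Tag.EPICS.equals((Object) e)); // false: enum equals is identity
    }
}
```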
Modified: java/trunk/record-util/src/main/java/org/hps/record/evio/EvioBankTag.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/evio/EvioBankTag.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/evio/EvioBankTag.java	Wed Feb 24 13:06:58 2016
@@ -25,8 +25,10 @@
     /** Scaler data bank. */
     SCALERS(57621),
     /** Trigger configuration bank. */
-    TRIGGER_CONFIG(0xE10E);
-
+    TRIGGER_CONFIG(0xE10E),
+    /** TI trigger bank. */
+    TI_TRIGGER(0xe10a);
+    
     /**
      * The bank's tag value.
      */
@@ -46,7 +48,7 @@
      *
      * @param startBank the starting bank
     * @return the first bank matching the tag or <code>null</code> if not found
-     */
+     */    
     public BaseStructure findBank(final BaseStructure startBank) {
         BaseStructure foundBank = null;
         if (this.equals(startBank)) {
@@ -60,8 +62,8 @@
             }
         }
         return foundBank;
-    }
-
+    }    
+    
     /**
      * Get the bank tag value.
      *

Modified: java/trunk/record-util/src/main/java/org/hps/record/evio/EvioDetectorConditionsProcessor.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/evio/EvioDetectorConditionsProcessor.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/evio/EvioDetectorConditionsProcessor.java	Wed Feb 24 13:06:58 2016
@@ -39,21 +39,28 @@
      */
     @Override
     public void process(final EvioEvent evioEvent) throws Exception {
+        
        // Get the head bank from the event.
         final BaseStructure headBank = EvioEventUtilities.getHeadBank(evioEvent);
 
-        // Is the head bank present?
-        if (headBank != null) {
+        // Initialize from head bank.
+        if (headBank != null) {            
+            initializeConditions(headBank.getIntData()[1]);
+        }
+        
+        // Initialize from PRESTART.
+        if (EventTagConstant.PRESTART.matches(evioEvent)) {
+            int runNumber = EvioEventUtilities.getControlEventData(evioEvent)[1];
+            initializeConditions(runNumber);
+        }
+    }
 
-            // Get the run number from the head bank.
-            final int runNumber = headBank.getIntData()[1];
-
-            // Initialize the conditions system from the detector name and run number.
-            try {
-                ConditionsManager.defaultInstance().setDetector(this.detectorName, runNumber);
-            } catch (final ConditionsNotFoundException e) {
-                throw new RuntimeException("Error setting up conditions from EVIO head bank.", e);
-            }
+    private void initializeConditions(final int runNumber) {
+        // Initialize the conditions system from the detector name and run number.
+        try {
+            ConditionsManager.defaultInstance().setDetector(this.detectorName, runNumber);
+        } catch (final ConditionsNotFoundException e) {
+            throw new RuntimeException("Error setting up conditions from EVIO head bank.", e);
         }
     }
 
@@ -65,6 +72,7 @@
      * @param evioEvent the <code>EvioEvent</code> to process
      */
     @Override
+    // FIXME: not activated by EvioLoop
     public void startRun(final EvioEvent evioEvent) {
         // System.out.println("EvioDetectorConditionsProcessor.startRun");
         if (EvioEventUtilities.isPreStartEvent(evioEvent)) {

Modified: java/trunk/record-util/src/main/java/org/hps/record/evio/EvioEventUtilities.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/evio/EvioEventUtilities.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/evio/EvioEventUtilities.java	Wed Feb 24 13:06:58 2016
@@ -13,9 +13,6 @@
 
 import org.hps.conditions.database.DatabaseConditionsManager;
 import org.hps.record.daqconfig.EvioDAQParser;
-import org.hps.record.epics.EpicsData;
-import org.hps.record.epics.EpicsHeader;
-import org.hps.record.scalers.ScalerData;
 import org.jlab.coda.jevio.BaseStructure;
 import org.jlab.coda.jevio.EvioEvent;
 import org.lcsim.conditions.ConditionsManager.ConditionsNotFoundException;
@@ -113,23 +110,18 @@
     /**
      * Get the run number from an EVIO event.
      *
-     * @return the run number
-     * @throws IllegalArgumentException if event does not have a head bank
-     */
-    public static int getRunNumber(final EvioEvent event) {
+     * @return the run number or <code>null</code> if not present in event
+     */
+    public static Integer getRunNumber(final EvioEvent event) {
         if (isControlEvent(event)) {
             return getControlEventData(event)[1];
         } else if (isPhysicsEvent(event)) {
             final BaseStructure headBank = EvioEventUtilities.getHeadBank(event);
             if (headBank != null) {
                 return headBank.getIntData()[1];
-            } else {
-                throw new IllegalArgumentException("Head bank is missing from physics event.");
-            }
-        } else {
-            // Not sure if this would ever happen.
-            throw new IllegalArgumentException("Wrong event type: " + event.getHeader().getTag());
-        }
+            } 
+        } 
+        return null;
     }
 
     /**
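Editor's note on the `getRunNumber` change above: it now returns a boxed `Integer` that may be `null` instead of throwing `IllegalArgumentException`, so callers must null-check. A hedged sketch of the new contract, with a `Map` standing in for the EVIO head-bank lookup:

```java
import java.util.Map;

public class RunNumberDemo {

    // Stand-in for EvioEventUtilities.getRunNumber(EvioEvent): returns the
    // run number, or null when the event carries none.
    public static Integer getRunNumber(Map<String, Integer> headBank) {
        return headBank.get("run");
    }

    public static String describe(Map<String, Integer> headBank) {
        Integer run = getRunNumber(headBank);
        // The null check replaces the old IllegalArgumentException handling.
        return run != null ? "run " + run : "no run number in event";
    }

    public static void main(String[] args) {
        System.out.println(describe(Map.of("run", 5772)));
        System.out.println(describe(Map.of()));
    }
}
```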

Modified: java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileSource.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileSource.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileSource.java	Wed Feb 24 13:06:58 2016
@@ -4,6 +4,8 @@
 import java.io.IOException;
 import java.util.ArrayList;
 import java.util.List;
+import java.util.logging.Level;
+import java.util.logging.Logger;
 
 import org.freehep.record.source.NoSuchRecordException;
 import org.hps.record.AbstractRecordQueue;
@@ -12,15 +14,15 @@
 import org.jlab.coda.jevio.EvioReader;
 
 /**
- * A basic implementation of an <tt>AbstractRecordSource</tt> for supplying <tt>EvioEvent</tt> objects to a loop from a
- * list of EVIO files.
- * <p>
- * Unlike the LCIO record source, it has no rewind or indexing capabilities.
+ * A basic implementation of an <code>AbstractRecordSource</code> for supplying <code>EvioEvent</code> objects to a 
+ * loop from a list of EVIO files.
  *
  * @author Jeremy McCormick, SLAC
  */
 public final class EvioFileSource extends AbstractRecordQueue<EvioEvent> {
 
+    private static final Logger LOGGER = Logger.getLogger(EvioFileSource.class.getPackage().getName());
+    
     /**
      * The current event.
      */
@@ -40,7 +42,12 @@
      * The reader to use for reading and parsing the EVIO data.
      */
     private EvioReader reader;
-
+    
+    /**
+     * Whether to continue on parse errors or not.
+     */
+    private boolean continueOnErrors = false;
+   
     /**
      * Constructor taking a single EVIO file.
      *
@@ -60,7 +67,15 @@
         this.files.addAll(files);
         this.openReader();
     }
-
+    
+    /**
+     * Set whether to continue on errors or not.
+     * @param continueOnErrors <code>true</code> to continue on errors
+     */
+    public void setContinueOnErrors(boolean continueOnErrors) {
+        this.continueOnErrors = continueOnErrors;
+    }
+    
     /**
      * Close the current reader.
      */
@@ -135,20 +150,26 @@
         for (;;) {
             try {
                 this.currentEvent = this.reader.parseNextEvent();
-            } catch (final EvioException e) {
-                throw new IOException(e);
+                if (this.reader.getNumEventsRemaining() == 0 && this.currentEvent == null) {
+                    this.closeReader();
+                    this.fileIndex++;
+                    if (!this.endOfFiles()) {
+                        this.openReader();
+                    } else {
+                        throw new NoSuchRecordException("End of data.");
+                    }
+                } else {
+                    LOGGER.finest("Read EVIO event " + this.currentEvent.getEventNumber() + " okay.");
+                    break;
+                }                   
+            } catch (EvioException | NegativeArraySizeException e) { 
+                LOGGER.log(Level.SEVERE, "Error parsing next EVIO event.", e);
+                if (!continueOnErrors) {
+                    throw new IOException("Fatal error parsing next EVIO event.", e);
+                }
+            } catch (Exception e) {
+                throw new IOException("Error parsing EVIO event.", e);
             }
-            if (this.currentEvent == null) {
-                this.closeReader();
-                this.fileIndex++;
-                if (!this.endOfFiles()) {
-                    this.openReader();
-                    continue;
-                } else {
-                    throw new NoSuchRecordException();
-                }
-            }
-            return;
         }
     }
 
@@ -159,10 +180,9 @@
      */
     private void openReader() {
         try {
-            System.out.println("Opening reader for file " + this.files.get(this.fileIndex) + " ...");
-            // FIXME: this should use the reader directly and cached paths should be managed externally
+            // FIXME: This should use the reader directly and MSS paths should be transformed externally.
+            LOGGER.info("opening EVIO file " + this.files.get(this.fileIndex).getPath() + " ...");
             this.reader = EvioFileUtilities.open(this.files.get(this.fileIndex), true);
-            System.out.println("Done opening file.");
         } catch (EvioException | IOException e) {
             throw new RuntimeException(e);
         }
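Editor's note on the rewritten `EvioFileSource` read loop above: it logs parse failures and, when `continueOnErrors` is set, skips the bad record instead of aborting. A simplified sketch of that log-and-skip pattern (a list of strings stands in for the `EvioReader`, and it throws `UncheckedIOException` where the real code uses a checked `IOException`):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.List;

public class ResilientReader {

    // Count records that parse cleanly; NumberFormatException stands in for
    // the EvioException thrown on a corrupt EVIO record.
    public static int countParsed(List<String> records, boolean continueOnErrors) {
        int parsed = 0;
        for (String record : records) {
            try {
                Integer.parseInt(record); // stand-in for reader.parseNextEvent()
                parsed++;
            } catch (NumberFormatException e) {
                if (!continueOnErrors) {
                    throw new UncheckedIOException(
                            new IOException("Fatal error parsing record.", e));
                }
                // continueOnErrors: log and skip; the loop moves to the next record.
            }
        }
        return parsed;
    }

    public static void main(String[] args) {
        System.out.println(countParsed(List.of("1", "oops", "3"), true)); // 2
    }
}
```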

Modified: java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileUtilities.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileUtilities.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/evio/EvioFileUtilities.java	Wed Feb 24 13:06:58 2016
@@ -25,24 +25,7 @@
      * Milliseconds constant for conversion to/from second.
      */
     private static final long MILLISECONDS = 1000L;
-
-    /**
-     * Get a cached file path, assuming that the input file path is on the JLAB MSS e.g. it starts with "/mss".
-     *
-     * @param file the MSS file path
-     * @return the cached file path (prepends "/cache" to the path)
-     * @throws IllegalArgumentException if the file is not on the MSS (e.g. path does not start with "/mss")
-     */
-    public static File getCachedFile(final File file) {
-        if (!isMssFile(file)) {
-            throw new IllegalArgumentException("File " + file.getPath() + " is not on the JLab MSS.");
-        }
-        if (isCachedFile(file)) {
-            throw new IllegalArgumentException("File " + file.getPath() + " is already on the cache disk.");
-        }
-        return new File("/cache" + file.getPath());
-    }
-
+   
     /**
      * Get the run number from the file name.
      *
@@ -70,26 +53,6 @@
     }
 
     /**
-     * Return <code>true</code> if this is a file on the cache disk e.g. the path starts with "/cache".
-     *
-     * @param file the file
-     * @return <code>true</code> if the file is a cached file
-     */
-    public static boolean isCachedFile(final File file) {
-        return file.getPath().startsWith("/cache");
-    }
-
-    /**
-     * Return <code>true</code> if this file is on the JLAB MSS e.g. the path starts with "/mss".
-     *
-     * @param file the file
-     * @return <code>true</code> if the file is on the MSS
-     */
-    public static boolean isMssFile(final File file) {
-        return file.getPath().startsWith("/mss");
-    }
-
-    /**
      * Open an EVIO file using an <code>EvioReader</code> in memory mapping mode.
      *
      * @param file the EVIO file
@@ -98,7 +61,7 @@
      * @throws EvioException if there is an error reading the EVIO data
      */
     public static EvioReader open(final File file) throws IOException, EvioException {
-        return open(file, false);
+        return open(file, true);
     }
 
     /**
@@ -111,14 +74,11 @@
      * @throws EvioException if there is an error reading the EVIO data
      */
     public static EvioReader open(final File file, final boolean sequential) throws IOException, EvioException {
-        File openFile = file;
-        if (isMssFile(file)) {
-            openFile = getCachedFile(file);
-        }
+        LOGGER.info("opening " + file.getPath() + " in " + (sequential ? "sequential" : "mmap") + " mode");
         final long start = System.currentTimeMillis();
-        final EvioReader reader = new EvioReader(openFile, false, sequential);
+        final EvioReader reader = new EvioReader(file, false, sequential);
         final long end = System.currentTimeMillis() - start;
-        LOGGER.info("opened " + openFile.getPath() + " in " + (double) end / (double) MILLISECONDS + " seconds in "
+        LOGGER.info("opened " + file.getPath() + " in " + (double) end / (double) MILLISECONDS + " seconds in "
                + (sequential ? "sequential" : "mmap") + " mode");
         return reader;
     }
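Editor's note on log lines built with ternaries, as in the hunk above: string concatenation binds tighter than the conditional operator, so an unparenthesized ternary such as `sequential ? "sequential" : "mmap" + " mode"` appends `" mode"` only on the false branch. A minimal demonstration (generic Java, not the HPS logger):

```java
public class TernaryConcat {

    // Precedence pitfall: + binds tighter than ?:, so " mode" attaches
    // only to the false branch.
    public static String buggy(boolean sequential) {
        return sequential ? "sequential" : "mmap" + " mode";
    }

    // Parenthesizing the ternary appends " mode" on both branches.
    public static String fixed(boolean sequential) {
        return (sequential ? "sequential" : "mmap") + " mode";
    }

    public static void main(String[] args) {
        System.out.println(buggy(true));  // "sequential" -- missing " mode"
        System.out.println(fixed(true));  // "sequential mode"
    }
}
```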

Modified: java/trunk/record-util/src/main/java/org/hps/record/evio/EvioLoop.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/evio/EvioLoop.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/evio/EvioLoop.java	Wed Feb 24 13:06:58 2016
@@ -1,52 +1,24 @@
 package org.hps.record.evio;
 
-import org.freehep.record.loop.DefaultRecordLoop;
+import org.hps.record.AbstractRecordLoop;
+import org.jlab.coda.jevio.EvioEvent;
 
 /**
  * Implementation of a Freehep <code>RecordLoop</code> for EVIO data.
  *
  * @author Jeremy McCormick, SLAC
  */
-public class EvioLoop extends DefaultRecordLoop {
-
-    /**
-     * The record adapter.
-     */
-    private final EvioLoopAdapter adapter = new EvioLoopAdapter();
+public class EvioLoop extends AbstractRecordLoop<EvioEvent> {
 
     /**
      * Create a new record loop.
      */
     public EvioLoop() {
+        this.adapter = new EvioLoopAdapter();
         this.addLoopListener(adapter);
         this.addRecordListener(adapter);
     }
-
-    /**
-     * Add an EVIO event processor to the adapter which will be activated for every EVIO event that is processed.
-     *
-     * @param evioEventProcessor the EVIO processor to add
-     */
-    public void addEvioEventProcessor(final EvioEventProcessor evioEventProcessor) {
-        adapter.addEvioEventProcessor(evioEventProcessor);
-    }
-
-    /**
-     * Loop over events from the source.
-     *
-     * @param number the number of events to process or -1L for all events from the source
-     * @return the number of records that were processed
-     */
-    public long loop(final long number) {
-        if (number < 0L) {
-            this.execute(Command.GO, true);
-        } else {
-            this.execute(Command.GO_N, number, true);
-            this.execute(Command.STOP);
-        }
-        return this.getSupplied();
-    }
-
+  
     /**
      * Set the EVIO data source.
      *

Modified: java/trunk/record-util/src/main/java/org/hps/record/evio/EvioLoopAdapter.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/evio/EvioLoopAdapter.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/evio/EvioLoopAdapter.java	Wed Feb 24 13:06:58 2016
@@ -1,14 +1,6 @@
 package org.hps.record.evio;
 
-import java.util.ArrayList;
-import java.util.List;
-import java.util.logging.Logger;
-
-import org.freehep.record.loop.AbstractLoopListener;
-import org.freehep.record.loop.LoopEvent;
-import org.freehep.record.loop.LoopListener;
-import org.freehep.record.loop.RecordEvent;
-import org.freehep.record.loop.RecordListener;
+import org.hps.record.AbstractLoopAdapter;
 import org.jlab.coda.jevio.EvioEvent;
 
 /**
@@ -16,79 +8,5 @@
  *
  * @author Jeremy McCormick, SLAC
  */
-public final class EvioLoopAdapter extends AbstractLoopListener implements RecordListener, LoopListener {
-
-    /**
-     * Initialize the logger.
-     */
-    private static final Logger LOGGER = Logger.getLogger(EvioLoopAdapter.class.getPackage().getName());
-
-    /**
-     * List of event processors to activate.
-     */
-    private final List<EvioEventProcessor> processors = new ArrayList<EvioEventProcessor>();
-
-    /**
-     * Create a new loop adapter.
-     */
-    EvioLoopAdapter() {
-    }
-
-    /**
-     * Add an EVIO processor to the adapter.
-     *
-     * @param processor the EVIO processor to add to the adapter
-     */
-    void addEvioEventProcessor(final EvioEventProcessor processor) {
-        LOGGER.info("adding " + processor.getClass().getName() + " to EVIO processors");
-        this.processors.add(processor);
-    }
-
-    /**
-     * Implementation of the finish hook which activates the {@link EvioEventProcessor#endJob()} method of all
-     * registered processors.
-     */
-    @Override
-    protected void finish(final LoopEvent event) {
-        LOGGER.info("finish");
-        for (final EvioEventProcessor processor : processors) {
-            processor.endJob();
-        }
-    }
-
-    /**
-     * Primary event processing method that activates the {@link EvioEventProcessor#process(EvioEvent)} method of all
-     * registered processors.
-     *
-     * @param recordEvent the record event to process which should have an EVIO event
-     * @throws IllegalArgumentException if the record is the wrong type
-     */
-    @Override
-    public void recordSupplied(final RecordEvent recordEvent) {
-        final Object record = recordEvent.getRecord();
-        if (record instanceof EvioEvent) {
-            final EvioEvent evioEvent = EvioEvent.class.cast(record);
-            for (final EvioEventProcessor processor : processors) {
-                try {
-                    processor.process(evioEvent);
-                } catch (final Exception e) {
-                    throw new RuntimeException(e);
-                }
-            }
-        } else {
-            throw new IllegalArgumentException("The supplied record has the wrong type: " + record.getClass());
-        }
-    }
-
-    /**
-     * Implementation of the start hook which activates the {@link EvioEventProcessor#startJob()} method of all
-     * registered processors.
-     */
-    @Override
-    protected void start(final LoopEvent event) {
-        LOGGER.info("start");
-        for (final EvioEventProcessor processor : processors) {
-            processor.startJob();
-        }
-    }
+public final class EvioLoopAdapter extends AbstractLoopAdapter<EvioEvent> {
 }

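The boilerplate deleted here moves into the generic AbstractLoopAdapter base class; the dispatch pattern it implements can be sketched as follows (class and method names are illustrative, not the actual HPS API):

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Minimal sketch of the generic loop-adapter pattern that replaces the
 * deleted EvioLoopAdapter boilerplate. Names are illustrative only.
 */
class AdapterSketch<RecordType> {

    /** A processor with the same lifecycle hooks as EvioEventProcessor. */
    interface Processor<T> {
        default void startJob() {}
        void process(T record) throws Exception;
        default void endJob() {}
    }

    private final List<Processor<RecordType>> processors = new ArrayList<>();

    void addProcessor(Processor<RecordType> processor) {
        processors.add(processor);
    }

    /** Called once before the loop starts. */
    void start() {
        for (Processor<RecordType> p : processors) p.startJob();
    }

    /** Called for every supplied record; wraps checked exceptions. */
    void recordSupplied(RecordType record) {
        for (Processor<RecordType> p : processors) {
            try {
                p.process(record);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }

    /** Called once after the loop finishes. */
    void finish() {
        for (Processor<RecordType> p : processors) p.endJob();
    }

    public static void main(String[] args) {
        AdapterSketch<String> adapter = new AdapterSketch<>();
        final int[] count = {0};
        adapter.addProcessor(record -> count[0]++);
        adapter.start();
        adapter.recordSupplied("event1");
        adapter.recordSupplied("event2");
        adapter.finish();
        System.out.println(count[0]); // 2
    }
}
```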
Modified: java/trunk/record-util/src/main/java/org/hps/record/scalers/ScalerUtilities.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/scalers/ScalerUtilities.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/scalers/ScalerUtilities.java	Wed Feb 24 13:06:58 2016
@@ -74,7 +74,6 @@
         // [67]/[68] = CLOCK
         final double clock = (double) clockGated / (double) clockUngated;
 
-        // Compute the live times.
         final double[] liveTimes = new double[3];
         liveTimes[LiveTimeIndex.FCUP_TDC.ordinal()] = fcupTdc;
         liveTimes[LiveTimeIndex.FCUP_TRG.ordinal()] = fcupTrg;

Modified: java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetEvioProcessor.java
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetEvioProcessor.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TiTimeOffsetEvioProcessor.java	Wed Feb 24 13:06:58 2016
@@ -55,13 +55,30 @@
             }
         }
     }
+    
+    public long getMinOffset() {
+        return this.minOffset;
+    }
+    
+    public long getMaxOffset() {
+        return this.maxOffset;
+    }
+    
+    public int getNumOutliers() {
+        return this.nOutliers;
+    }
+    
+    public long getTiTimeOffset() {
+        final long offsetRange = maxOffset - minOffset;
+        if (offsetRange > minRange && nOutliers < maxOutliers) {
+            return minOffset;
+        } else {
+            return 0L;
+        }
+    }
 
     public void updateTriggerConfig(final TriggerConfig triggerConfig) {
-        final long offsetRange = maxOffset - minOffset;
-        if (offsetRange > minRange && nOutliers < maxOutliers) {
-            triggerConfig.put(TriggerConfigVariable.TI_TIME_OFFSET, minOffset);
-        } else {
-            triggerConfig.put(TriggerConfigVariable.TI_TIME_OFFSET, 0L);
-        }
+        long tiTimeOffset = getTiTimeOffset();
+        triggerConfig.put(TriggerConfigVariable.TI_TIME_OFFSET, tiTimeOffset);
     }
 }

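Extracting getTiTimeOffset() makes the offset decision testable in isolation; the rule it encodes can be restated standalone (a sketch with the instance fields passed as parameters):

```java
/**
 * Standalone restatement of the rule extracted into getTiTimeOffset():
 * report minOffset only when the observed offset range exceeds minRange
 * and the outlier count stays under maxOutliers; otherwise fall back to 0.
 */
class TiTimeOffsetRule {

    static long tiTimeOffset(long minOffset, long maxOffset, int nOutliers,
            long minRange, int maxOutliers) {
        final long offsetRange = maxOffset - minOffset;
        if (offsetRange > minRange && nOutliers < maxOutliers) {
            return minOffset;
        }
        return 0L;
    }

    public static void main(String[] args) {
        // Wide range, few outliers: the minimum offset is trusted.
        System.out.println(tiTimeOffset(1000L, 9000L, 1, 5000L, 10));  // 1000
        // Too many outliers: fall back to 0.
        System.out.println(tiTimeOffset(1000L, 9000L, 50, 5000L, 10)); // 0
    }
}
```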
Modified: java/trunk/run-database/pom.xml
 =============================================================================
--- java/trunk/run-database/pom.xml	(original)
+++ java/trunk/run-database/pom.xml	Wed Feb 24 13:06:58 2016
@@ -20,8 +20,8 @@
             <artifactId>hps-record-util</artifactId>
         </dependency>
         <dependency>
-            <groupId>org.hps</groupId>
-            <artifactId>hps-datacat-client</artifactId>
+            <groupId>srs</groupId>
+            <artifactId>org-srs-datacat-client</artifactId>
         </dependency>
     </dependencies>
 </project>

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDao.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDao.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDao.java	Wed Feb 24 13:06:58 2016
@@ -16,7 +16,7 @@
      *
      * @param run the run number
      */
-    public void deleteEpicsData(EpicsType epicsType, final int run);
+    public void deleteEpicsData(EpicsType epicsType, int run);
 
     /**
      * Get EPICS data by run.
@@ -34,5 +34,5 @@
      *
      * @param epicsDataList the list of EPICS data
      */
-    void insertEpicsData(List<EpicsData> epicsDataList);   
+    void insertEpicsData(List<EpicsData> epicsDataList, int run);
 }

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java	Wed Feb 24 13:06:58 2016
@@ -74,6 +74,9 @@
 
     /**
      * Delete all EPICS data for a run from the database.
+     * <p>
+     * Only the <code>epics_header</code> records are deleted and the child records
+     * are deleted automatically via a <code>CASCADE</code>.
      *
      * @param run the run number
      */
@@ -97,12 +100,12 @@
                 deleteEpicsData.setInt(1, headerId);
                 int rowsAffected = deleteEpicsData.executeUpdate();
                 if (rowsAffected == 0) {
-                    throw new SQLException("Deletion of EPICS data failed; no rows affect.");
+                    throw new SQLException("Deletion of EPICS data failed; no rows affected.");
                 }
                 deleteHeader.setInt(1, headerId);
                 rowsAffected = deleteHeader.executeUpdate();
                 if (rowsAffected == 0) {
-                    throw new SQLException("Deletion of EPICS header failed; no rows affect.");
+                    throw new SQLException("Deletion of EPICS header failed; no rows affected.");
                 }
             }
 
@@ -137,7 +140,7 @@
      * Get EPICS data by run.
      *
      * @param run the run number
-     * @param epicsType the type of EPICS data (1s or 10s)
+     * @param epicsType the type of EPICS data (2s or 20s)
      * @return the EPICS data
      */
     @Override
@@ -149,7 +152,7 @@
             final List<EpicsVariable> variables = epicsVariableDao.getEpicsVariables(epicsType);
             selectEpicsData = connection.prepareStatement("SELECT * FROM " + epicsType.getTableName() 
                     + " LEFT JOIN epics_headers ON " + epicsType.getTableName() + ".epics_header_id = epics_headers.id"
-                    + " WHERE epics_headers.run = ?");
+                    + " WHERE epics_headers.run = ? ORDER BY epics_headers.sequence");
             selectEpicsData.setInt(1, run);
             ResultSet resultSet = selectEpicsData.executeQuery();
             while (resultSet.next()) {
@@ -189,12 +192,14 @@
     /**
      * Insert a list of EPICS data into the database.
      * <p>
-     * The run number comes from the header information.
+     * The run number is always taken from the <code>run</code> argument rather than
+     * from the EPICS header, because the header value is occasionally wrong in a few
+     * of the data files.
      *
      * @param epicsDataList the list of EPICS data
      */
     @Override
-    public void insertEpicsData(final List<EpicsData> epicsDataList) {
+    public void insertEpicsData(final List<EpicsData> epicsDataList, int run) {
         if (epicsDataList.isEmpty()) {
             throw new IllegalArgumentException("The EPICS data list is empty.");
         }
@@ -208,9 +213,11 @@
                 if (epicsHeader == null) {
                     throw new IllegalArgumentException("The EPICS data is missing a header.");
                 }
-                insertHeaderStatement.setInt(1, epicsHeader.getRun());
+                insertHeaderStatement.setInt(1, run); /* Don't use run from bank as it is sometimes wrong! */
                 insertHeaderStatement.setInt(2, epicsHeader.getSequence());
                 insertHeaderStatement.setInt(3, epicsHeader.getTimestamp());
+                LOGGER.finer("creating EPICS record with run = " + run + "; seq = "
+                        + epicsHeader.getSequence() + "; ts = " + epicsHeader.getTimestamp());
                 final int rowsCreated = insertHeaderStatement.executeUpdate();
                 if (rowsCreated == 0) {
                     throw new SQLException("Creation of EPICS header record failed; no rows affected.");
@@ -238,11 +245,11 @@
                     insertStatement.setDouble(parameterIndex, value);
                     ++parameterIndex;
                 }
-                final int dataRowsCreated = insertStatement.executeUpdate();                
+                final int dataRowsCreated = insertStatement.executeUpdate();
                 if (dataRowsCreated == 0) {
                     throw new SQLException("Creation of EPICS data failed; no rows affected.");
                 }
-                LOGGER.info("inserted EPICS data with run " + epicsHeader.getRun() + ", seq " + epicsHeader.getSequence() + "timestamp " 
+                LOGGER.finer("inserted EPICS data with run = " + run + "; seq = " + epicsHeader.getSequence() + "; ts = " 
                         + epicsHeader.getTimestamp());
                 insertStatement.close();
             }

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/EpicsType.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsType.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/EpicsType.java	Wed Feb 24 13:06:58 2016
@@ -3,34 +3,35 @@
 import org.hps.record.epics.EpicsData;
 
 /**
- * Enum for representing different types of EPICS data in the run database, of which there are currently two (1s and
- * 10s).
+ * Enum for representing different types of EPICS data in the run database, of which there are currently two (2s and
+ * 20s).
  *
  * @author Jeremy McCormick, SLAC
  */
+// FIXME: move to record-util
 public enum EpicsType {
 
     /**
-     * 10S EPICS data.
+     * 20S EPICS data.
      */
-    EPICS_10S(10),
+    EPICS_20S(20),
     /**
-     * 1S EPICS data.
+     * 2S EPICS data.
      */
-    EPICS_1S(1);
+    EPICS_2S(2);
 
     /**
      * Get the type from an int.
      *
      * @param type the type from an int
      * @return the type from an int
-     * @throws IllegalArgumentException if <code>type</code> is invalid (not 1 or 10)
+     * @throws IllegalArgumentException if <code>type</code> is invalid (not 2 or 20)
      */
     public static EpicsType fromInt(final int type) {
-        if (type == EPICS_1S.type) {
-            return EPICS_1S;
-        } else if (type == EPICS_10S.type) {
-            return EPICS_10S;
+        if (type == EPICS_2S.type) {
+            return EPICS_2S;
+        } else if (type == EPICS_20S.type) {
+            return EPICS_20S;
         } else {
            throw new IllegalArgumentException("The type code is invalid (must be 2 or 20): " + type);
         }
@@ -44,9 +45,9 @@
     public static EpicsType getEpicsType(final EpicsData epicsData) {
         // FIXME: The type argument should be set on creation which would make this key check unnecessary.
         if (epicsData.getKeys().contains("MBSY2C_energy")) {
-            return EpicsType.EPICS_1S;
+            return EpicsType.EPICS_2S;
         } else {
-            return EpicsType.EPICS_10S;
+            return EpicsType.EPICS_20S;
         }
     }
 

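The renamed constants change the integer mapping as well; a minimal standalone mirror of the fromInt() conversion after the 2s/20s rename (a sketch, not the HPS class itself):

```java
/** Minimal standalone mirror of EpicsType.fromInt() after the 2s/20s rename. */
class EpicsTypeSketch {

    enum EpicsType {
        EPICS_2S(2), EPICS_20S(20);

        private final int type;

        EpicsType(int type) {
            this.type = type;
        }

        /** Map an integer type code to the enum constant, or fail loudly. */
        static EpicsType fromInt(int type) {
            for (EpicsType t : values()) {
                if (t.type == type) {
                    return t;
                }
            }
            throw new IllegalArgumentException("The type code is invalid (must be 2 or 20): " + type);
        }
    }

    public static void main(String[] args) {
        System.out.println(EpicsType.fromInt(2));  // EPICS_2S
        System.out.println(EpicsType.fromInt(20)); // EPICS_20S
    }
}
```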
Modified: java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariable.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariable.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariable.java	Wed Feb 24 13:06:58 2016
@@ -2,7 +2,7 @@
 
 /**
  * Information about an EPICS variable including its name in the EPICS database, column name for the run database,
- * description of the variable, and type (either 1s or 10s).
+ * description of the variable, and type (either 2s or 20s).
  * <p>
  * This class is used to represent data from the <i>epics_variables</i> table in the run database.
  *
@@ -29,7 +29,7 @@
     private final String variableName;
 
     /**
-     * The type of the variable (1s or 10s).
+     * The type of the variable (2s or 20s).
      */
     private final EpicsType variableType;
 

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/RunDatabaseCommandLine.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunDatabaseCommandLine.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/RunDatabaseCommandLine.java	Wed Feb 24 13:06:58 2016
@@ -1,62 +1,24 @@
 package org.hps.run.database;
 
 import java.io.File;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.Date;
-import java.util.HashMap;
-import java.util.HashSet;
-import java.util.List;
-import java.util.Map;
-import java.util.Set;
-import java.util.logging.Logger;
+import java.net.URISyntaxException;
 
 import org.apache.commons.cli.CommandLine;
+import org.apache.commons.cli.DefaultParser;
 import org.apache.commons.cli.HelpFormatter;
 import org.apache.commons.cli.Options;
 import org.apache.commons.cli.ParseException;
-import org.apache.commons.cli.DefaultParser;
 import org.hps.conditions.database.ConnectionParameters;
-import org.hps.datacat.client.DatacatClient;
-import org.hps.datacat.client.DatacatClientFactory;
-import org.hps.datacat.client.Dataset;
-import org.hps.datacat.client.DatasetMetadata;
-import org.hps.record.evio.EvioFileUtilities;
+import org.srs.datacat.client.Client;
+import org.srs.datacat.client.ClientBuilder;
 
 /**
- * Command line tool for updating the run database from EVIO files registered in the data catalog.
+ * Command line tool for inserting records into the run database.
  *
  * @author Jeremy McCormick, SLAC
  */
-public class RunDatabaseCommandLine {
-
-    /**
-     * Set of features supported by the tool.
-     */
-    static enum Feature {
-        /**
-         * Insert EPICS data.
-         */
-        EPICS,
-        /**
-         * Insert scaler data.
-         */
-        SCALERS,
-        /**
-         * Insert run summary.
-         */
-        SUMMARY,
-        /**
-         * Insert trigger config.
-         */
-        TRIGGER_CONFIG
-    }
-
-    /**
-     * Initialize the logger.
-     */
-    private static final Logger LOGGER = Logger.getLogger(RunDatabaseCommandLine.class.getPackage().getName());
-
+public final class RunDatabaseCommandLine {
+        
     /**
      * Command line options for the crawler.
      */
@@ -66,11 +28,18 @@
      * Statically define the command options.
      */
     static {
-        OPTIONS.addOption("f", "feature", true, "enable a feature");
-        OPTIONS.addOption("p", "connection-properties", true, "database connection properties file (required)");
         OPTIONS.addOption("h", "help", false, "print help and exit (overrides all other arguments)");
         OPTIONS.addOption("r", "run", true, "run to update");
-        OPTIONS.addOption("u", "update", false, "allow updating existing run in the database");
+        OPTIONS.addOption("p", "connection-properties", true, "database connection properties file (required)");       
+        OPTIONS.addOption("Y", "dry-run", false, "dry run which will not update the database");
+        OPTIONS.addOption("x", "replace", false, "allow deleting and replacing an existing run");
+        OPTIONS.addOption("s", "spreadsheet", true, "path to run database spreadsheet (CSV format)");
+        OPTIONS.addOption("d", "detector", true, "conditions system detector name");
+        OPTIONS.addOption("N", "no-evio-processing", false, "skip processing of all EVIO files");
+        OPTIONS.addOption("L", "load", false, "load back run information after inserting (for debugging)");
+        OPTIONS.addOption("u", "url", true, "data catalog URL");
+        OPTIONS.addOption("S", "site", true, "data catalog site (e.g. SLAC or JLAB)");
+        OPTIONS.addOption("f", "folder", true, "folder in datacat for dataset search");
     }
 
     /**
@@ -81,111 +50,74 @@
     public static void main(final String args[]) {
         new RunDatabaseCommandLine().parse(args).run();
     }
-
-    /**
-     * Allow updating of the database for existing runs.
-     */
-    private boolean allowUpdates = false;
-
-    /**
-     * The set of enabled features.
-     */
-    private final Set<Feature> features = new HashSet<Feature>();
-
-    /**
-     * The run manager for interacting with the run db.
-     */
-    private RunManager runManager;
-
-    /**
-     * Create a run processor from the current configuration.
-     *
-     * @return the run processor
-     */
-    private RunProcessor createEvioRunProcessor(final RunSummaryImpl runSummary, final List<File> files) {
-
-        final RunProcessor runProcessor = new RunProcessor(runSummary, files);
-
-        if (features.contains(Feature.EPICS)) {
-            runProcessor.addEpicsProcessor();
-        }
-        if (features.contains(Feature.SCALERS)) {
-            runProcessor.addScalerProcessor();
-        }
-        if (features.contains(Feature.TRIGGER_CONFIG)) {
-            runProcessor.addTriggerTimeProcessor();
-        }
-
-        return runProcessor;
-    }
-
-    /**
-     * Get the list of EVIO files for the run.
-     *
-     * @param run the run number
-     * @return the list of EVIO files from the run
-     */
-    private Map<File, Dataset> getEvioFiles(final int run) {
-        final DatacatClient datacatClient = new DatacatClientFactory().createClient();
-        final Set<String> metadata = new HashSet<String>();
-        final Map<File, Dataset> files = new HashMap<File, Dataset>();
-        metadata.add("runMin");
-        metadata.add("eventCount");
-        metadata.add("fileNumber");
-        metadata.add("endTimestamp");
-        metadata.add("startTimestamp");
-        metadata.add("hasEnd");
-        metadata.add("hasPrestart");
-        final List<Dataset> datasets = datacatClient.findDatasets("data/raw",
-                "fileFormat eq 'EVIO' AND dataType eq 'RAW' AND runMin eq " + run, metadata);
-        if (datasets.isEmpty()) {
-            throw new IllegalStateException("No EVIO datasets for run " + run + " were found in the data catalog.");
-        }
-        for (final Dataset dataset : datasets) {
-            files.put(new File(dataset.getLocations().get(0).getResource()), dataset);
-        }
-        return files;
-    }
-
-    /**
-     * Insert information for a run into the database.
-     *
-     * @param runManager the run manager for interacting with the run db
-     * @param runSummary the run summary with information about the run
-     */
-    private void insertRun(final RunManager runManager, final RunSummary runSummary) {
-
-        final RunDatabaseDaoFactory runFactory = new RunDatabaseDaoFactory(runManager.getConnection());
-
-        // Add the run summary record.
-        if (this.features.contains(Feature.SUMMARY)) {
-            LOGGER.info("inserting run summary");
-            runFactory.createRunSummaryDao().insertRunSummary(runSummary);
-        }
-
-        if (this.features.contains(Feature.EPICS)) {
-            LOGGER.info("inserting EPICS data");
-            runFactory.createEpicsDataDao().insertEpicsData(runSummary.getEpicsData());
-        }
-
-        if (this.features.contains(Feature.SCALERS)) {
-            LOGGER.info("inserting scaler data");
-            runFactory.createScalerDataDao().insertScalerData(runSummary.getScalerData(), runManager.getRun());
-        }
-
-        if (this.features.contains(Feature.TRIGGER_CONFIG)) {
-            LOGGER.info("inserting trigger config");
-            runFactory.createTriggerConfigDao().insertTriggerConfig(runSummary.getTriggerConfig(), runManager.getRun());
-        }
-    }
-
-    /**
-     * Parse command line options and return reference to <code>this</code>.
+    
+    /**
+     * Enable dry run which will not update the run database.
+     */
+    private boolean dryRun = false;
+    
+    /**
+     * Run number.
+     */
+    private int run;
+    
+    /**
+     * Path to spreadsheet CSV file.
+     */
+    private File spreadsheetFile = null;
+    
+    /**
+     * Name of detector for conditions system (default for Eng Run 2015 provided here).
+     */
+    private String detectorName = "HPS-EngRun2015-Nominal-v3";
+    
+    /**
+     * Allow replacement of existing records.
+     */
+    private boolean replace = false;
+    
+    /**
+     * Skip full EVIO file processing.
+     */
+    private boolean skipEvioProcessing = false;
+    
+    /**
+     * Load back run information after insert (for debugging).
+     */
+    private boolean reload = false;
+    
+    /**
+     * Database connection parameters.
+     */
+    private ConnectionParameters connectionParameters = null;
+    
+    /**
+     * Data catalog client interface.
+     */
+    private Client datacatClient = null;
+    
+    /**
+     * Data catalog site.
+     */
+    private String site = "JLAB";                             
+    
+    /**
+     * Data catalog URL.
+     */
+    private String url = "http://hpsweb.jlab.org/datacat/r";  
+    
+    /**
+     * Default folder for file search.
+     */
+    private String folder = "/HPS/data/raw";
+    
+    /**
+     * Parse command line options and return reference to <code>this</code> object.
      *
      * @param args the command line arguments
      * @return reference to this object
      */
-    RunDatabaseCommandLine parse(final String args[]) {
+    private RunDatabaseCommandLine parse(final String args[]) {
         try {
             final CommandLine cl = new DefaultParser().parse(OPTIONS, args);
 
@@ -204,43 +136,74 @@
                     throw new IllegalArgumentException("Connection properties file " + dbPropFile.getPath()
                             + " does not exist.");
                 }
-                final ConnectionParameters connectionParameters = ConnectionParameters.fromProperties(dbPropFile);
-                LOGGER.config("using " + dbPropPath + " for db connection properties");
-
-                runManager = new RunManager(connectionParameters.createConnection());
-
+                connectionParameters = ConnectionParameters.fromProperties(dbPropFile);
             } else {
                 // Database connection properties file is required.
-                throw new RuntimeException("Connection properties are required.");
-            }
-
-            Integer run = null;
+                throw new RuntimeException("Connection properties are a required argument.");
+            }
+
+            // Run number.
             if (cl.hasOption("r")) {
                 run = Integer.parseInt(cl.getOptionValue("r"));
             } else {
                 throw new RuntimeException("The run number is required.");
             }
-            runManager.setRun(run);
-
+            
+            // Dry run.
+            if (cl.hasOption("Y")) {
+                this.dryRun = true;
+            }
+            
+            // Run spreadsheet.
+            if (cl.hasOption("s")) {
+                this.spreadsheetFile = new File(cl.getOptionValue("s"));
+                if (!this.spreadsheetFile.exists()) {
+                    throw new RuntimeException("The run spreadsheet " + this.spreadsheetFile.getPath() + " is inaccessible or does not exist.");
+                }
+            }
+            
+            // Detector name.
+            if (cl.hasOption("d")) {
+                this.detectorName = cl.getOptionValue("d");
+            }
+            
+            // Replace existing run.
+            if (cl.hasOption("x")) {
+                this.replace = true;
+            }
+            
+            // Skip full EVIO processing.
+            if (cl.hasOption("N")) {
+                this.skipEvioProcessing = true;
+            }
+            
+            // Load back run info at end of job.
+            if (cl.hasOption("L")) {
+                this.reload = true;
+            }
+            
+            // Data catalog URL.
+            if (cl.hasOption("u")) {
+                url = cl.getOptionValue("u");
+            }
+            
+            // Site in the data catalog.
+            if (cl.hasOption("S")) {
+                site = cl.getOptionValue("S");
+            }
+            
+            // Set folder for dataset search.
             if (cl.hasOption("f")) {
-                // Enable individual features.
-                for (final String arg : cl.getOptionValues("f")) {
-                    features.add(Feature.valueOf(arg));
-                }
-            } else {
-                // By default all features are enabled.
-                features.addAll(Arrays.asList(Feature.values()));
-            }
-            for (final Feature feature : features) {
-                LOGGER.config("feature " + feature.name() + " is enabled.");
-            }
-
-            // Allow updates to existing runs in the db.
-            if (cl.hasOption("u")) {
-                this.allowUpdates = true;
-                LOGGER.config("updating or replacing existing run data is enabled");
-            }
-
+                folder = cl.getOptionValue("f");
+            }
+            
+            // Initialize the data catalog client.
+            try {
+                datacatClient = new ClientBuilder().setUrl(url).build();
+            } catch (URISyntaxException e) {
+                throw new RuntimeException("Bad datacat URL.", e);
+            }
+            
         } catch (final ParseException e) {
             throw new RuntimeException(e);
         }
@@ -249,90 +212,21 @@
     }
 
     /**
-     * Run the job to update the information in the run database.
+     * Configure the builder from command line options and run the job to update the database.
      */
     private void run() {
-
-        LOGGER.info("starting");
-
-        final boolean runExists = runManager.runExists();
-
-        // Fail if run exists and updates are not allowed.
-        if (runExists && !allowUpdates) {
-            throw new IllegalStateException("The run " + runManager.getRun()
-                    + " already exists and updates are not allowed.");
-        }
-
-        // Get the run number configured from command line.
-        final int run = runManager.getRun();
-
-        // Get the list of EVIO files for the run using a data catalog query.
-        final Map<File, Dataset> fileDatasets = this.getEvioFiles(run);
-        final List<File> files = new ArrayList<File>(fileDatasets.keySet());
-        EvioFileUtilities.sortBySequence(files);
-
-        // Process the run's files to get information.
-        final RunSummaryImpl runSummary = new RunSummaryImpl(run);
-        final RunProcessor runProcessor = this.createEvioRunProcessor(runSummary, files);
-        try {
-            runProcessor.processRun();
-        } catch (final Exception e) {
-            throw new RuntimeException(e);
-        }
-
-        // Set number of files from datacat query.
-        runSummary.setTotalFiles(files.size());
-
-        // Set run start date.
-        this.setStartDate(fileDatasets, files, runSummary);
-
-        // Set run end date.
-        this.setEndDate(fileDatasets, files, runSummary);
-
-        // Delete existing run.
-        if (runExists) {
-            runManager.deleteRun();
-        }
-
-        // Insert run into database.
-        this.insertRun(runManager, runSummary);
-
-        // Close the database connection.
-        runManager.closeConnection();
-
-        LOGGER.info("done");
-    }
-
-    /**
-     * Set the run end date.
-     *
-     * @param fileDatasets the run's datasets
-     * @param files the run's EVIO files
-     * @param runSummary the run summary
-     */
-    private void setEndDate(final Map<File, Dataset> fileDatasets, final List<File> files,
-            final RunSummaryImpl runSummary) {
-        final Dataset lastDataset = fileDatasets.get(files.get(files.size() - 1));
-        final DatasetMetadata metadata = lastDataset.getMetadata();
-        // System.out.println("endTimestamp: " + metadata.getLong("endTimestamp"));
-        final Date endDate = new Date(metadata.getLong("endTimestamp"));
-        // System.out.println("endDate: " + startDate);
-        runSummary.setEndDate(endDate);
-        runSummary.setEndOkay(metadata.getLong("hasEnd") == 0 ? false : true);
-    }
-
-    /**
-     * Set the run start date.
-     *
-     * @param fileDatasets the run's datasets
-     * @param files the run's EVIO files
-     * @param runSummary the run summary
-     */
-    private void setStartDate(final Map<File, Dataset> fileDatasets, final List<File> files,
-            final RunSummaryImpl runSummary) {
-        final Dataset firstDataset = fileDatasets.get(files.get(0));
-        final DatasetMetadata metadata = firstDataset.getMetadata();
-        final Date startDate = new Date(metadata.getLong("startTimestamp"));
-        runSummary.setStartDate(startDate);
-    }
+        new RunDatabaseBuilder()
+            .createRunSummary(run)
+            .setFolder(folder)
+            .setDetectorName(detectorName)
+            .setConnectionParameters(connectionParameters)
+            .setDatacatClient(datacatClient)
+            .setSite(site)
+            .setDryRun(dryRun)
+            .setReplace(replace)
+            .skipEvioProcessing(skipEvioProcessing)
+            .setSpreadsheetFile(spreadsheetFile)
+            .setReload(reload)
+            .run();
+    }        
 }
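The rewritten crawler entry point above hands all configuration to a fluent builder and then calls its terminal run() method. A minimal, self-contained sketch of that setter-chaining style (class and method names here are illustrative, not the actual RunDatabaseBuilder API):

```java
// Sketch of the fluent builder style used above: each setter returns 'this'
// so calls chain, and a terminal run() executes the configured job.
// All names are illustrative, not the real HPS classes.
public class BuilderSketch {

    static class RunJob {
        private int run;
        private boolean dryRun;

        RunJob setRun(int run) {
            this.run = run;
            return this; // enables chaining
        }

        RunJob setDryRun(boolean dryRun) {
            this.dryRun = dryRun;
            return this;
        }

        // Terminal operation: executes the fully configured job.
        String run() {
            return "run " + run + (dryRun ? " (dry run)" : "");
        }
    }

    public static String demo() {
        return new RunJob().setRun(5403).setDryRun(true).run();
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

The main advantage over a long constructor is that optional settings (dry run, replace, reload, etc.) can be supplied in any order and omitted freely.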

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/RunManager.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunManager.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/RunManager.java	Wed Feb 24 13:06:58 2016
@@ -6,37 +6,26 @@
 import java.util.logging.Logger;
 
 import org.hps.conditions.database.ConnectionParameters;
+import org.hps.record.daqconfig.DAQConfig;
 import org.hps.record.epics.EpicsData;
 import org.hps.record.scalers.ScalerData;
-import org.hps.record.triggerbank.TriggerConfig;
+import org.hps.record.svt.SvtConfigData;
+import org.hps.record.triggerbank.TriggerConfigData;
 import org.lcsim.conditions.ConditionsEvent;
 import org.lcsim.conditions.ConditionsListener;
 
 /**
- * Manages read-only access to the run database and creates a {@link RunSummary} for a specific run.
+ * Manages access to the run database.
  *
  * @author Jeremy McCormick, SLAC
  */
 public final class RunManager implements ConditionsListener {
 
     /**
-     * Simple class for caching data.
-     */
-    private class DataCache {
-
-        List<EpicsData> epicsData;
-        RunSummary fullRunSummary;
-        Boolean runExists;
-        RunSummary runSummary;
-        List<ScalerData> scalerData;
-        TriggerConfig triggerConfig;
-    }
-
-    /**
      * The default connection parameters for read-only access to the run database.
      */
     private static ConnectionParameters DEFAULT_CONNECTION_PARAMETERS = new ConnectionParameters("hpsuser",
-            "darkphoton", "hps_run_db", "hpsdb.jlab.org");
+            "darkphoton", "hps_run_db_v2", "hpsdb.jlab.org");
 
     /**
      * The singleton instance of the RunManager.
@@ -49,8 +38,7 @@
     private static final Logger LOGGER = Logger.getLogger(RunManager.class.getPackage().getName());
 
     /**
-     * Get the global instance of the {@link RunManager}.
-     *
+     * Get the global instance of the {@link RunManager}.     
      * @return the global instance of the {@link RunManager}
      */
     public static RunManager getRunManager() {
@@ -64,21 +52,11 @@
      * The active database connection.
      */
     private Connection connection;
-
-    /**
-     * The database connection parameters, initially set to the default parameters.
-     */
-    private final ConnectionParameters connectionParameters = DEFAULT_CONNECTION_PARAMETERS;
-
-    /**
-     * The data cache of run information.
-     */
-    private DataCache dataCache;
-
+   
     /**
      * Factory for creating database API objects.
      */
-    private final RunDatabaseDaoFactory factory;
+    private final DaoProvider factory;
 
     /**
      * The run number; the -1 value indicates that this has not been set externally yet.
@@ -86,34 +64,28 @@
     private Integer run = null;
 
     /**
+     * Class constructor.     
+     * @param connection the database connection
+     */
+    public RunManager(final Connection connection) {
+        try {
+            if (connection.isClosed()) {
+                throw new RuntimeException("The connection is already closed and cannot be used.");
+            }
+        } catch (SQLException e) {
+            throw new RuntimeException(e);
+        }
+        this.connection = connection;
+        factory = new DaoProvider(this.connection);
+    }
+    
+    /**
      * Class constructor using default connection parameters.
      */
     public RunManager() {
-        this.connection = DEFAULT_CONNECTION_PARAMETERS.createConnection();
-        this.openConnection();
-        factory = new RunDatabaseDaoFactory(this.connection);
-    }
-
-    /**
-     * Class constructor.
-     *
-     * @param connection the database connection
-     */
-    public RunManager(final Connection connection) {
-        this.connection = connection;
-        this.openConnection();
-        factory = new RunDatabaseDaoFactory(this.connection);
-    }
-
-    /**
-     * Check if the run number has been set.
-     */
-    private void checkRunNumber() {
-        if (this.run == null) {
-            throw new IllegalStateException("The run number was not set.");
-        }
-    }
-
+        this(DEFAULT_CONNECTION_PARAMETERS.createConnection());
+    }
+        
     /**
      * Close the database connection.
      */
@@ -129,7 +101,6 @@
 
     /**
      * Load new run information when conditions have changed.
-     *
      * @param conditionsEvent the event with new conditions information
      */
     @Override
@@ -138,21 +109,7 @@
     }
 
     /**
-     * Delete a run from the database.
-     *
-     * @param run the run number
-     */
-    public void deleteRun() {
-        // Create object for updating run info in the database.
-        final RunSummaryDao runSummaryDao = factory.createRunSummaryDao();
-
-        // Delete run from the database.
-        runSummaryDao.deleteFullRun(run);
-    }
-
-    /**
-     * Return the database connection.
-     *
+     * Return the database connection.     
      * @return the database connection
      */
     Connection getConnection() {
@@ -161,186 +118,159 @@
 
     /**
      * Get the EPICS data for the current run.
-     *
      * @param epicsType the type of EPICS data
      * @return the EPICS data for the current run
      */
     public List<EpicsData> getEpicsData(final EpicsType epicsType) {
-        this.checkRunNumber();
-        if (this.dataCache.epicsData == null) {
-            LOGGER.info("loading EPICS data for run " + this.run);
-            this.dataCache.epicsData = factory.createEpicsDataDao().getEpicsData(epicsType, this.run);
-        }
-        return this.dataCache.epicsData;
-    }
-
-    /**
-     * Get the EPICS variables.
-     *
+        return factory.getEpicsDataDao().getEpicsData(epicsType, this.run);
+    }
+
+    /**
+     * Get the list of EPICS variable definitions.
      * @param epicsType the type of EPICS data
-     * @return the EPICS data for the current run
+     * @return the list of EPICS variable definitions
      */
     public List<EpicsVariable> getEpicsVariables(final EpicsType epicsType) {
-        return factory.createEpicsVariableDao().getEpicsVariables(epicsType);
-    }
-
-    /**
-     * Get the full run summary for the current run including scaler data, etc.
-     *
-     * @return the full run summary for the current run
-     */
-    public RunSummary getFullRunSummary() {
-        this.checkRunNumber();
-        if (this.dataCache.fullRunSummary == null) {
-            this.dataCache.fullRunSummary = factory.createRunSummaryDao().readFullRunSummary(this.run);
-        }
-        return this.dataCache.fullRunSummary;
-    }
-
-    /**
-     * Get the current run number.
-     *
-     * @return the run number
-     */
-    public int getRun() {
-        return run;
-    }
-
-    /**
-     * Get the complete list of run numbers from the database.
-     *
+        return factory.getEpicsVariableDao().getEpicsVariables(epicsType);
+    }
+
+    /**
+     * Get the complete list of run numbers from the database.
      * @return the complete list of run numbers
      */
     public List<Integer> getRuns() {
-        return new RunSummaryDaoImpl(this.connection).getRuns();
-    }
-
-    /**
-     * Get the full list of summaries for all runs in the database without complex data like EPICS records.
-     *
-     * @return the full list of run summaries
-     */
-    public List<RunSummary> getRunSummaries() {
-        return this.factory.createRunSummaryDao().getRunSummaries();
-    }
-
-    /**
-     * Get the run summary for the current run not including its sub-objects like scaler data.
-     *
+        return factory.getRunSummaryDao().getRuns();
+    }
+  
+    /**
+     * Get the run summary for the current run.     
      * @return the run summary for the current run
      */
     public RunSummary getRunSummary() {
-        this.checkRunNumber();
-        if (this.dataCache.runSummary == null) {
-            this.dataCache.runSummary = factory.createRunSummaryDao().getRunSummary(this.run);
-        }
-        return this.dataCache.runSummary;
-    }
-
-    /**
-     * Get the scaler data for the current run.
-     *
+        return factory.getRunSummaryDao().getRunSummary(this.run);
+    }
+
+    /**
+     * Get the scaler data for the current run.     
      * @return the scaler data for the current run
      */
     public List<ScalerData> getScalerData() {
-        this.checkRunNumber();
-        if (this.dataCache.scalerData == null) {
-            LOGGER.info("loading scaler data for run " + this.run);
-            this.dataCache.scalerData = factory.createScalerDataDao().getScalerData(run);
-        }
-        return this.dataCache.scalerData;
-    }
-
-    /**
-     * Get the trigger config for the current run.
-     *
-     * @return the trigger config for the current run
-     */
-    public TriggerConfig getTriggerConfig() {
-        this.checkRunNumber();
-        if (this.dataCache.triggerConfig == null) {
-            LOGGER.info("loading trigger config for run " + this.run);
-            this.dataCache.triggerConfig = factory.createTriggerConfigDao().getTriggerConfig(run);
-        }
-        return this.dataCache.triggerConfig;
-    }
-
-    /**
-     * Update the database with information found from crawling the files.
-     *
-     * @param runs the list of runs to update
-     * @throws SQLException if there is a database query error
-     */
-    public void insertRun(final RunSummary runSummary) throws SQLException {
-        LOGGER.info("updating run database for run " + runSummary.getRun());
-
-        // Create object for updating run info in the database.
-        final RunSummaryDao runSummaryDao = factory.createRunSummaryDao();
-
-        // Insert run summary into database.
-        runSummaryDao.insertFullRunSummary(runSummary);
-
-        LOGGER.info("done updating run database");
-    }
-
-    /**
-     * Open a new database connection from the connection parameters if the current one is closed or <code>null</code>.
-     * <p>
-     * This method does nothing if the connection is already open.
-     */
-    public void openConnection() {
-        try {
-            if (this.connection.isClosed()) {
-                LOGGER.info("creating new database connection");
-                this.connection = connectionParameters.createConnection();
-            } 
-        } catch (final SQLException e) {
-            throw new RuntimeException("Error opening database connection.", e);
-        }
-    }
+        return factory.getScalerDataDao().getScalerData(this.run);
+    }
+    
+    /**
+     * Get SVT configuration data.     
+     * @return the SVT configuration data
+     */
+    public List<SvtConfigData> getSvtConfigData() {
+        return factory.getSvtConfigDao().getSvtConfigs(this.run);
+    }
+    
+    /**
+     * Get the DAQ (trigger) configuration for the run.
+     * @return the DAQ configuration for the run
+     */
+    public DAQConfig getDAQConfig() {
+        final TriggerConfigData config = factory.getTriggerConfigDao().getTriggerConfig(this.run);
+        if (config == null) {
+            return null;
+        }
+        return config.loadDAQConfig(this.run);
+    }
 
     /**
      * Return <code>true</code> if the run exists in the database.
-     *
      * @return <code>true</code> if the run exists in the database
      */
-    public boolean runExists() {
-        this.checkRunNumber();
-        if (this.dataCache.runExists == null) {
-            this.dataCache.runExists = factory.createRunSummaryDao().runSummaryExists(this.run);
-        }
-        return this.dataCache.runExists;
-    }
-
-    /**
-     * Return <code>true</code> if the run exists in the database.
-     *
+    public boolean runExists() {      
+        return factory.getRunSummaryDao().runSummaryExists(this.run);
+    }
+
+    /**
+     * Set the run number of the current run.
      * @param run the run number
-     * @return <code>true</code> if the run exists in the database
-     */
-    boolean runExists(final int run) {
-        if (this.dataCache.runExists == null) {
-            this.dataCache.runExists = factory.createRunSummaryDao().runSummaryExists(run);
-        }
-        return this.dataCache.runExists;
-    }
-
-    /**
-     * Set the run number and then load the applicable {@link RunSummary} from the database.
-     *
-     * @param run the run number
      */
     public void setRun(final int run) {
-
         if (this.run == null || run != this.run) {
-
-            LOGGER.info("setting new run " + run);
-
+            LOGGER.info("setting run " + run);
             // Set the run number.
             this.run = run;
-
-            // Reset the data cache.
-            this.dataCache = new DataCache();
-        }
-    }
+        }
+    }
+    
+    /**
+     * Get the currently active run number or <code>null</code>.
+     * @return the currently active run number or <code>null</code>
+     */
+    public Integer getRun() {
+        return this.run;
+    }
+    
+    /**
+     * Create or replace a run summary in the database.
+     * @param runSummary the run summary to update
+     * @param replaceExisting <code>true</code> to allow an existing run summary to be replaced
+     */
+    void updateRunSummary(RunSummary runSummary, boolean replaceExisting) {
+        final RunSummaryDao runSummaryDao = factory.getRunSummaryDao();
+        if (runSummaryDao.runSummaryExists(runSummary.getRun())) {
+            if (replaceExisting) {
+                runSummaryDao.updateRunSummary(runSummary);
+            } else {
+                throw new RuntimeException("Run already exists and replacement is not allowed.");
+            }
+        } else {
+            runSummaryDao.insertRunSummary(runSummary);
+        }
+    }
+    
+    /**
+     * Create or replace the trigger config for the run.
+     * @param triggerConfig the trigger config
+     * @param replaceExisting <code>true</code> to allow an existing trigger to be replaced
+     */
+    void updateTriggerConfig(TriggerConfigData triggerConfig, boolean replaceExisting) {
+        final TriggerConfigDao configDao = factory.getTriggerConfigDao();
+        if (configDao.getTriggerConfig(run) != null) {
+            if (replaceExisting) {
+                configDao.deleteTriggerConfig(run);
+            } else {
+                throw new RuntimeException("Trigger config already exists and replacement is not allowed.");
+            }
+        }
+        configDao.insertTriggerConfig(triggerConfig, run);
+    }
+    
+    /**
+     * Create or replace EPICS data for the run.
+     * @param epicsData the EPICS data
+     */
+    void updateEpicsData(List<EpicsData> epicsData) {
+        if (epicsData != null && !epicsData.isEmpty()) {
+            factory.getEpicsDataDao().insertEpicsData(epicsData, this.run);
+        }
+    }
+    
+    /**
+     * Create or replace scaler data for the run.
+     * @param scalerData the scaler data
+     */
+    void updateScalerData(List<ScalerData> scalerData) {
+        if (scalerData != null) {
+            factory.getScalerDataDao().insertScalerData(scalerData, this.run);
+        } 
+    }     
+    
+    /**
+     * Delete the current run from the database, including its referenced
+     * objects such as EPICS and scaler data.
+     */
+    void deleteRun() {        
+        factory.getEpicsDataDao().deleteEpicsData(EpicsType.EPICS_2S, run);
+        factory.getEpicsDataDao().deleteEpicsData(EpicsType.EPICS_20S, run);
+        factory.getScalerDataDao().deleteScalerData(run);
+        factory.getSvtConfigDao().deleteSvtConfigs(run);
+        factory.getTriggerConfigDao().deleteTriggerConfig(run);
+        factory.getRunSummaryDao().deleteRunSummary(run);
+    }
+    
 }
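This revision strips the DataCache and turns RunManager into a thin read-through layer: every accessor delegates straight to a DAO for the current run number. A compact sketch of that pattern, with a HashMap standing in for the MySQL run database (all names illustrative, not the actual HPS API):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the read-through style RunManager adopts in this commit:
// no cache layer, each query goes directly to a DAO for the active run.
public class ReadThroughSketch {

    interface RunSummaryDao {
        String getRunSummary(int run);
        boolean runSummaryExists(int run);
    }

    // In-memory stand-in for the database-backed DAO implementation.
    static class MapDao implements RunSummaryDao {
        private final Map<Integer, String> rows = new HashMap<>();

        MapDao put(int run, String summary) {
            rows.put(run, summary);
            return this;
        }

        public String getRunSummary(int run) {
            return rows.get(run);
        }

        public boolean runSummaryExists(int run) {
            return rows.containsKey(run);
        }
    }

    private final RunSummaryDao dao;
    private Integer run; // null until setRun is called, like the real class

    ReadThroughSketch(RunSummaryDao dao) {
        this.dao = dao;
    }

    void setRun(int run) {
        this.run = run;
    }

    // No caching: hits the DAO on every call.
    boolean runExists() {
        return dao.runSummaryExists(run);
    }

    public static boolean demo() {
        ReadThroughSketch mgr = new ReadThroughSketch(new MapDao().put(5403, "summary"));
        mgr.setRun(5403);
        return mgr.runExists();
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

Dropping the cache trades repeated queries for simpler invalidation: changing the run number no longer requires resetting any cached state.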

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/RunSummary.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunSummary.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/RunSummary.java	Wed Feb 24 13:06:58 2016
@@ -1,137 +1,129 @@
 package org.hps.run.database;
 
-import java.io.File;
 import java.util.Date;
-import java.util.List;
-
-import org.hps.datacat.client.DatasetFileFormat;
-import org.hps.record.epics.EpicsData;
-import org.hps.record.scalers.ScalerData;
-import org.hps.record.triggerbank.TriggerConfig;
 
 /**
- * This is an API for accessing run summary information which is persisted as a row in the <i>runs</i> table of the run
- * database.
+ * This is an API for accessing run summary information which is persisted as a row in the <i>run_summaries</i> table.
  * <p>
- * This information includes:
- * <ul>
- * <li>run number</li>
- * <li>start date</li>
- * <li>end date</li>
- * <li>number of events</li>
- * <li>number of EVIO files</li>
- * <li>whether the END event was found indicating that the DAQ did not crash</li>
- * <li>whether the run is considered good (all <code>true</code> for now)</li>
- * </ul>
- * <p>
- * It also references several complex objects including lists of {@link org.hps.record.epics.EpicsData} and
- * {@link org.hps.record.scalers.ScalerData} for the run, as well as a list of EVIO files.
+ * All timestamp fields use the Unix convention (seconds since the epoch).
  *
+ * @author Jeremy McCormick, SLAC
  * @see RunSummaryImpl
  * @see RunSummaryDao
  * @see RunSummaryDaoImpl
  * @see RunManager
- * 
- * @author Jeremy McCormick, SLAC
  */
 public interface RunSummary {
-  
+
     /**
-     * Get the creation date of this run record.
+     * Get the creation date of this record.
      *
-     * @return the creation date of this run record
+     * @return the creation date of this record
      */
     Date getCreated();
 
     /**
-     * Get the end date.
-     *
-     * @return the end date
+     * Get the END event timestamp or the timestamp from the last head bank if END is not present.
+     * 
+     * @return the last event timestamp
      */
-    Date getEndDate();
+    Integer getEndTimestamp();
 
     /**
-     * Return <code>true</code> if END event was found in the data.
-     *
-     * @return <code>true</code> if END event was in the data
+     * Get the GO event timestamp.
+     * 
+     * @return the GO event timestamp
      */
-    boolean getEndOkay();
+    Integer getGoTimestamp();
 
     /**
-     * Get the EPICS data from the run.
-     *
-     * @return the EPICS data from the run
+     * Get the livetime computed from the clock scaler.
+     * 
+     * @return the livetime computed from the clock scaler
      */
-    List<EpicsData> getEpicsData();
+    Double getLivetimeClock();
 
     /**
-     * Get the event rate (effectively the trigger rate) which is the total events divided by the number of seconds in
-     * the run.
-     *
-     * @return the event rate
+     * Get the livetime computed from the FCUP_TDC scaler.
+     * 
+     * @return the livetime computed from the FCUP_TDC scaler
      */
-    double getEventRate();
+    Double getLivetimeFcupTdc();
+
+    /**
+     * Get the livetime computed from the FCUP_TRG scaler.
+     * 
+     * @return the livetime computed from the FCUP_TRG scaler
+     */
+    Double getLivetimeFcupTrg();
+
+    /**
+     * Get the notes for the run (from the run spreadsheet).
+     * 
+     * @return the notes for the run
+     */
+    String getNotes();
+
+    /**
+     * Get the PRESTART event timestamp.
+     * 
+     * @return the PRESTART event timestamp
+     */
+    Integer getPrestartTimestamp();
 
     /**
      * Get the run number.
      *
      * @return the run number
      */
-    int getRun();
+    Integer getRun();
+   
+    /**
+     * Get the target setting for the run (string from run spreadsheet).
+     * 
+     * @return the target setting for the run
+     */
+    String getTarget();
 
     /**
-     * Return <code>true</code> if the run was okay (no major errors or data corruption occurred).
-     *
-     * @return <code>true</code> if the run was okay
+     * Get the TI time offset in ns.
+     * 
+     * @return the TI time offset in ns
      */
-    boolean getRunOkay();
+    Long getTiTimeOffset();
 
     /**
-     * Get the scaler data of this run.
+     * Get the total number of events in the run.
      *
-     * @return the scaler data of this run
+     * @return the total number of events in the run
      */
-    List<ScalerData> getScalerData();
+    Long getTotalEvents();
 
     /**
-     * Get the trigger config int values.
+     * Get the total number of EVIO files in this run.
      *
-     * @return the trigger config int values
+     * @return the total number of files in this run
      */
-    TriggerConfig getTriggerConfig();
+    Integer getTotalFiles();
 
     /**
-     * Get the start date.
-     *
-     * @return the start date
+     * Get the trigger config name (from the run spreadsheet).
+     * 
+     * @return the trigger config name
      */
-    Date getStartDate();
+    String getTriggerConfigName();
 
     /**
-     * Get the total events in the run.
-     *
-     * @return the total events in the run
+     * Get the trigger rate in kHz.
+     * 
+     * @return the trigger rate in kHz
      */
-    int getTotalEvents();
+    Double getTriggerRate();
 
     /**
-     * Get the total number of EVIO files for this run.
+     * Get the date when this record was last updated.
      *
-     * @return the total number of files for this run
-     */
-    int getTotalFiles();
-
-    /**
-     * Get the number of seconds in the run which is the difference between the start and end times.
-     *
-     * @return the total seconds in the run
-     */
-    long getTotalSeconds();
-
-    /**
-     * Get the date when this run record was last updated.
-     *
-     * @return the date when this run record was last updated
+     * @return the date when this record was last updated
      */
     Date getUpdated();
 }
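The interface now exposes raw Unix-second timestamps rather than Date objects. Since java.util.Date expects milliseconds since the epoch, callers converting these values need a factor of 1000. A small illustrative helper (not part of the HPS API):

```java
import java.util.Date;

// Helpers for the Unix-second timestamps exposed by RunSummary above.
// java.util.Date counts milliseconds, so seconds must be scaled by 1000.
public class TimestampSketch {

    // Convert a Unix-seconds timestamp (e.g. getEndTimestamp()) to a Date.
    static Date toDate(int unixSeconds) {
        // Widen to long before multiplying to avoid int overflow.
        return new Date(1000L * unixSeconds);
    }

    // Run duration in seconds from the PRESTART to the END timestamp.
    static int durationSeconds(int prestartTimestamp, int endTimestamp) {
        return endTimestamp - prestartTimestamp;
    }

    public static void main(String[] args) {
        System.out.println(toDate(1456344418));
        System.out.println(durationSeconds(1456340000, 1456344418));
    }
}
```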

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java	Wed Feb 24 13:06:58 2016
@@ -8,14 +8,7 @@
  * @author Jeremy McCormick, SLAC
  */
 interface RunSummaryDao {
-
-    /**
-     * Delete a run summary from the database including its referenced objects such as EPICS data.
-     *
-     * @param runSummary the run summary to delete
-     */
-    void deleteFullRun(int run);
-
+  
     /**
      * Delete a run summary by run number.
      *
@@ -24,26 +17,12 @@
     void deleteRunSummary(int run);
 
     /**
-     * Delete a run summary but not its objects.
-     *
-     * @param runSummary the run summary object
-     */
-    void deleteRunSummary(RunSummary runSummary);
-
-    /**
      * Get the list of run numbers.
      *
      * @return the list of run numbers
      */
     List<Integer> getRuns();
-
-    /**
-     * Get a list of run summaries without loading their objects such as EPICS data.
-     *
-     * @return the list of run summaries
-     */
-    List<RunSummary> getRunSummaries();
-
+  
     /**
      * Get a run summary by run number without loading object state.
      *
@@ -51,36 +30,13 @@
      * @return the run summary object
      */
     RunSummary getRunSummary(int run);
-
+  
     /**
-     * Insert a list of run summaries along with its referenced objects such as scaler and EPICS data.
-     *
-     * @param runSummaryList the list of run summaries
-     * @param deleteExisting <code>true</code> to allow deletion and replacement of existing run summaries
-     */
-    void insertFullRunSummaries(List<RunSummary> runSummaryList, boolean deleteExisting);
-
-    /**
-     * Insert a run summary including all its objects.
-     *
-     * @param runSummary the run summary object
-     */
-    void insertFullRunSummary(RunSummary runSummary);
-
-    /**
-     * Insert a run summary but not its objects.
+     * Insert a run summary.
      *
      * @param runSummary the run summary object
      */
     void insertRunSummary(RunSummary runSummary);
-
-    /**
-     * Read a run summary and its objects such as scaler data.
-     *
-     * @param run the run number
-     * @return the full run summary
-     */
-    RunSummary readFullRunSummary(int run);
 
     /**
      * Return <code>true</code> if a run summary exists in the database.
@@ -89,10 +45,10 @@
      * @return <code>true</code> if <code>run</code> exists in the database
      */
     boolean runSummaryExists(int run);
-
+    
     /**
-     * Update a run summary but not its objects.
-     *
+     * Update a run summary that already exists.
+     * 
      * @param runSummary the run summary to update
      */
     void updateRunSummary(RunSummary runSummary);
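With the "full" insert/delete methods removed, the insert-or-replace policy lives above this DAO, in RunManager.updateRunSummary. The decision logic reduces to a small upsert guard; a self-contained sketch with a HashMap in place of the run_summaries table (names illustrative):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the upsert guard layered on top of the slimmed-down DAO:
// update if the run exists and replacement is allowed, insert otherwise,
// and fail loudly when replacement is forbidden.
public class UpsertSketch {

    private final Map<Integer, String> table = new HashMap<>();

    void update(int run, String summary, boolean replaceExisting) {
        if (table.containsKey(run)) {
            if (!replaceExisting) {
                throw new RuntimeException("Run " + run + " already exists and replacement is not allowed.");
            }
            table.put(run, summary); // UPDATE path
        } else {
            table.put(run, summary); // INSERT path
        }
    }

    public static String demo() {
        UpsertSketch db = new UpsertSketch();
        db.update(5403, "first", true);   // inserts
        db.update(5403, "second", true);  // replacement allowed: updates
        try {
            db.update(5403, "third", false); // replacement forbidden: throws
            return "no exception";
        } catch (RuntimeException e) {
            return db.table.get(5403); // unchanged by the failed call
        }
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

Keeping the policy in one caller rather than in every DAO method means the DAO stays a thin mapping to single SQL statements.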

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java	Wed Feb 24 13:06:58 2016
@@ -5,13 +5,8 @@
 import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.util.ArrayList;
-import java.util.Calendar;
-import java.util.GregorianCalendar;
 import java.util.List;
-import java.util.TimeZone;
 import java.util.logging.Logger;
-
-import org.hps.record.epics.EpicsData;
 
 /**
  * Implementation of database operations for {@link RunSummary} objects in the run database.
@@ -21,37 +16,29 @@
 final class RunSummaryDaoImpl implements RunSummaryDao {
 
     /**
-     * SQL query strings.
-     */
-    private static final class RunSummaryQuery {
-
-        /**
-         * Delete by run number.
-         */
-        private static final String DELETE_RUN = "DELETE FROM runs WHERE run = ?";
-        /**
-         * Insert a record for a run.
-         */
-        private static final String INSERT = "INSERT INTO runs (run, start_date, end_date, nevents, nfiles, end_ok, created) VALUES(?, ?, ?, ?, ?, ?, NOW())";
-        /**
-         * Select all records.
-         */
-        private static final String SELECT_ALL = "SELECT * from runs";
-        /**
-         * Select record by run number.
-         */
-        private static final String SELECT_RUN = "SELECT run, start_date, end_date, nevents, nfiles, end_ok, run_ok, updated, created FROM runs WHERE run = ?";
-        /**
-         * Update information for a run.
-         */
-        private static final String UPDATE_RUN = "UPDATE runs SET start_date, end_date, nevents, nfiles, end_ok, run_ok WHERE run = ?";
-    }
-
-    /**
-     * Eastern time zone.
-     */
-    private static Calendar CALENDAR = new GregorianCalendar(TimeZone.getTimeZone("America/New_York"));
-
+     * Delete by run number.
+     */
+    private static final String DELETE = "DELETE FROM run_summaries WHERE run = ?";
+        
+    /**
+     * Insert a record for a run.
+     */
+    private static final String INSERT = "INSERT INTO run_summaries (run, nevents, nfiles, prestart_timestamp,"
+            + " go_timestamp, end_timestamp, trigger_rate, trigger_config_name, ti_time_offset," 
+            + " livetime_clock, livetime_fcup_tdc, livetime_fcup_trg, target, notes, created, updated)"
+            + " VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, NOW(), NOW())";
+    
+    /**
+     * Update the record for a run.
+     */
+    private static final String UPDATE = "UPDATE run_summaries SET nevents = ?, nfiles = ?, prestart_timestamp = ?,"
+            + " go_timestamp = ?, end_timestamp = ?, trigger_rate = ?, trigger_config_name = ?, ti_time_offset = ?,"
+            + " livetime_clock = ?, livetime_fcup_tdc = ?, livetime_fcup_trg = ?, target = ?, notes = ?, updated = NOW()"
+            + " WHERE run = ?";
+    
+    /**
+     * Select record by run number.
+     */
+    private static final String SELECT = "SELECT * FROM run_summaries WHERE run = ?";
+           
     /**
      * Initialize the logger.
      */
@@ -61,60 +48,17 @@
      * The database connection.
      */
     private final Connection connection;
-
-    /**
-     * The database API for EPICS data.
-     */
-    private EpicsDataDao epicsDataDao = null;
-
-    /**
-     * The database API for scaler data.
-     */
-    private ScalerDataDao scalerDataDao = null;
-
-    /**
-     * The database API for integer trigger config.
-     */
-    private TriggerConfigDao triggerConfigIntDao = null;
-
+  
     /**
      * Create a new DAO object for run summary information.
      *
      * @param connection the database connection
      */
     RunSummaryDaoImpl(final Connection connection) {
-        // Set the connection.
         if (connection == null) {
             throw new IllegalArgumentException("The connection is null.");
         }
         this.connection = connection;
-
-        // Setup DAO API objects for managing complex object state.
-        epicsDataDao = new EpicsDataDaoImpl(this.connection);
-        scalerDataDao = new ScalerDataDaoImpl(this.connection);
-        triggerConfigIntDao = new TriggerConfigDaoImpl(this.connection);
-    }
-
-    /**
-     * Delete a run from the database including its referenced objects such as EPICS data.
-     *
-     * @param runSummary the run summary to delete
-     */
-    @Override
-    public void deleteFullRun(int run) {
-
-        // Delete EPICS log.
-        this.epicsDataDao.deleteEpicsData(EpicsType.EPICS_1S, run);
-        this.epicsDataDao.deleteEpicsData(EpicsType.EPICS_10S, run);
-
-        // Delete scaler data.
-        this.scalerDataDao.deleteScalerData(run);
-
-        // Delete trigger config.
-        this.triggerConfigIntDao.deleteTriggerConfigInt(run);
-
-        // Finally delete the run summary information.
-        this.deleteRunSummary(run);
     }
 
     /**
@@ -126,7 +70,7 @@
     public void deleteRunSummary(final int run) {
         PreparedStatement preparedStatement = null;
         try {
-            preparedStatement = connection.prepareStatement(RunSummaryQuery.DELETE_RUN);
+            preparedStatement = connection.prepareStatement(DELETE);
             preparedStatement.setInt(1, run);
             preparedStatement.executeUpdate();
         } catch (final SQLException e) {
@@ -141,32 +85,7 @@
             }
         }
     }
-
-    /**
-     * Delete a run summary but not its objects.
-     *
-     * @param runSummary the run summary object
-     */
-    @Override
-    public void deleteRunSummary(final RunSummary runSummary) {
-        PreparedStatement preparedStatement = null;
-        try {
-            preparedStatement = connection.prepareStatement(RunSummaryQuery.DELETE_RUN);
-            preparedStatement.setInt(1, runSummary.getRun());
-            preparedStatement.executeUpdate();
-        } catch (final SQLException e) {
-            throw new RuntimeException(e);
-        } finally {
-            if (preparedStatement != null) {
-                try {
-                    preparedStatement.close();
-                } catch (final SQLException e) {
-                    e.printStackTrace();
-                }
-            }
-        }
-    }
-
+   
     /**
      * Get the list of run numbers.
      *
@@ -177,7 +96,7 @@
         final List<Integer> runs = new ArrayList<Integer>();
         PreparedStatement preparedStatement = null;
         try {
-            preparedStatement = this.connection.prepareStatement("SELECT distinct(run) FROM runs ORDER BY run");
+            preparedStatement = this.connection.prepareStatement("SELECT distinct(run) FROM run_summaries ORDER BY run");
             final ResultSet resultSet = preparedStatement.executeQuery();
             while (resultSet.next()) {
                 final Integer run = resultSet.getInt(1);
@@ -196,47 +115,9 @@
         }
         return runs;
     }
-
-    /**
-     * Get a list of run summaries without loading their objects such as EPICS data.
-     *
-     * @return the list of run summaries
-     */
-    @Override
-    public List<RunSummary> getRunSummaries() {
-        PreparedStatement statement = null;
-        final List<RunSummary> runSummaries = new ArrayList<RunSummary>();
-        try {
-            statement = this.connection.prepareStatement(RunSummaryQuery.SELECT_ALL);
-            final ResultSet resultSet = statement.executeQuery();
-            while (resultSet.next()) {
-                final RunSummaryImpl runSummary = new RunSummaryImpl(resultSet.getInt("run"));
-                runSummary.setStartDate(resultSet.getTimestamp("start_date"));
-                runSummary.setEndDate(resultSet.getTimestamp("end_date"));
-                runSummary.setTotalEvents(resultSet.getInt("nevents"));
-                runSummary.setTotalFiles(resultSet.getInt("nfiles"));
-                runSummary.setEndOkay(resultSet.getBoolean("end_ok"));
-                runSummary.setRunOkay(resultSet.getBoolean("run_ok"));
-                runSummary.setUpdated(resultSet.getTimestamp("updated"));
-                runSummary.setCreated(resultSet.getTimestamp("created"));
-                runSummaries.add(runSummary);
-            }
-        } catch (final SQLException e) {
-            throw new RuntimeException(e);
-        } finally {
-            if (statement != null) {
-                try {
-                    statement.close();
-                } catch (final SQLException e) {
-                    e.printStackTrace();
-                }
-            }
-        }
-        return runSummaries;
-    }
-
-    /**
-     * Get a run summary by run number without loading object state.
+   
+    /**
+     * Get a run summary.
      *
      * @param run the run number
      * @return the run summary object
@@ -246,22 +127,28 @@
         PreparedStatement statement = null;
         RunSummaryImpl runSummary = null;
         try {
-            statement = this.connection.prepareStatement(RunSummaryQuery.SELECT_RUN);
+            statement = this.connection.prepareStatement(SELECT);
             statement.setInt(1, run);
             final ResultSet resultSet = statement.executeQuery();
             if (!resultSet.next()) {
-                throw new IllegalArgumentException("No record exists for run " + run + " in database.");
-            }
-
+                throw new IllegalArgumentException("Run " + run + " does not exist in the database.");
+            }
             runSummary = new RunSummaryImpl(run);
-            runSummary.setStartDate(resultSet.getTimestamp("start_date"));
-            runSummary.setEndDate(resultSet.getTimestamp("end_date"));
-            runSummary.setTotalEvents(resultSet.getInt("nevents"));
+            runSummary.setTotalEvents(resultSet.getLong("nevents"));
             runSummary.setTotalFiles(resultSet.getInt("nfiles"));
-            runSummary.setEndOkay(resultSet.getBoolean("end_ok"));
-            runSummary.setRunOkay(resultSet.getBoolean("run_ok"));
+            runSummary.setPrestartTimestamp(resultSet.getInt("prestart_timestamp"));
+            runSummary.setGoTimestamp(resultSet.getInt("go_timestamp"));
+            runSummary.setEndTimestamp(resultSet.getInt("end_timestamp"));
+            runSummary.setTriggerRate(resultSet.getDouble("trigger_rate"));
+            runSummary.setTriggerConfigName(resultSet.getString("trigger_config_name"));
+            runSummary.setTiTimeOffset(resultSet.getLong("ti_time_offset"));
+            runSummary.setLivetimeClock(resultSet.getDouble("livetime_clock"));
+            runSummary.setLivetimeFcupTdc(resultSet.getDouble("livetime_fcup_tdc"));
+            runSummary.setLivetimeFcupTrg(resultSet.getDouble("livetime_fcup_trg"));
+            runSummary.setTarget(resultSet.getString("target"));
+            runSummary.setNotes(resultSet.getString("notes"));
+            runSummary.setCreated(resultSet.getTimestamp("created"));
             runSummary.setUpdated(resultSet.getTimestamp("updated"));
-            runSummary.setCreated(resultSet.getTimestamp("created"));
         } catch (final SQLException e) {
             throw new RuntimeException(e);
         } finally {
@@ -275,129 +162,9 @@
         }
         return runSummary;
     }
-
-    /**
-     * Insert a list of run summaries along with their complex state such as referenced scaler and EPICS data.
-     *
-     * @param runSummaryList the list of run summaries
-     * @param deleteExisting <code>true</code> to allow deletion and replacement of existing run summaries
-     */
-    @Override
-    public void insertFullRunSummaries(final List<RunSummary> runSummaryList, final boolean deleteExisting) {
-
-        if (runSummaryList == null) {
-            throw new IllegalArgumentException("The run summary list is null.");
-        }
-        if (runSummaryList.isEmpty()) {
-            throw new IllegalArgumentException("The run summary list is empty.");
-        }
-
-        LOGGER.info("inserting " + runSummaryList.size() + " run summaries into database");
-
-        // Turn off auto commit.
-        try {
-            LOGGER.info("turning off auto commit");
-            this.connection.setAutoCommit(false);
-        } catch (final SQLException e) {
-            throw new RuntimeException(e);
-        }
-
-        // Loop over all runs found while crawling.
-        for (final RunSummary runSummary : runSummaryList) {
-
-            final int run = runSummary.getRun();
-
-            LOGGER.info("inserting run summary for run " + run + " into database");
-
-            // Does the run exist in the database already?
-            if (this.runSummaryExists(run)) {
-                // Is deleting existing rows allowed?
-                if (deleteExisting) {
-                    LOGGER.info("deleting existing run summary");
-                    // Delete the existing rows.
-                    this.deleteFullRun(runSummary.getRun());
-                } else {
-                    // Rows exist but updating is disallowed which is a fatal error.
-                    throw new IllegalStateException("Run " + runSummary.getRun()
-                            + " already exists and updates are disallowed.");
-                }
-            }
-
-            // Insert full run summary information including sub-objects.
-            LOGGER.info("inserting run summary");
-            this.insertFullRunSummary(runSummary);
-            LOGGER.info("run summary for " + run + " inserted successfully");
-
-            try {
-                // Commit the transaction for the run.
-                LOGGER.info("committing transaction");
-                this.connection.commit();
-            } catch (final SQLException e1) {
-                try {
-                    LOGGER.severe("rolling back transaction");
-                    // Rollback the transaction if there was an error.
-                    this.connection.rollback();
-                } catch (final SQLException e2) {
-                    throw new RuntimeException(e2);
-                }
-            }
-
-            LOGGER.info("done inserting run summary " + run);
-        }
-
-        try {
-            LOGGER.info("turning auto commit on");
-            // Turn auto commit back on.
-            this.connection.setAutoCommit(true);
-        } catch (final SQLException e) {
-            e.printStackTrace();
-        }
-
-        LOGGER.info("done inserting run summaries");
-    }
-
-    /**
-     * Insert a run summary including all its objects.
-     *
-     * @param runSummary the run summary object to insert
-     */
-    @Override
-    public void insertFullRunSummary(final RunSummary runSummary) {
-
-        if (runSummary == null) {
-            throw new IllegalArgumentException("The run summary is null.");
-        }
-        
-        // Insert basic run log info.
-        this.insertRunSummary(runSummary);
-
-        // Insert EPICS data.
-        if (runSummary.getEpicsData() != null && !runSummary.getEpicsData().isEmpty()) {
-            LOGGER.info("inserting " + runSummary.getEpicsData().size() + " EPICS records");
-            epicsDataDao.insertEpicsData(runSummary.getEpicsData());
-        } else {
-            LOGGER.warning("no EPICS data to insert");
-        }
-
-        // Insert scaler data.
-        if (runSummary.getScalerData() != null && !runSummary.getScalerData().isEmpty()) {
-            LOGGER.info("inserting " + runSummary.getScalerData().size() + " scaler data records");
-            scalerDataDao.insertScalerData(runSummary.getScalerData(), runSummary.getRun());
-        } else {
-            LOGGER.warning("no scaler data to insert");
-        }
-
-        // Insert trigger config.
-        if (runSummary.getTriggerConfig() != null && !runSummary.getTriggerConfig().isEmpty()) {
-            LOGGER.info("inserting " + runSummary.getTriggerConfig().size() + " trigger config variables");
-            triggerConfigIntDao.insertTriggerConfig(runSummary.getTriggerConfig(), runSummary.getRun());
-        } else {
-            LOGGER.warning("no trigger config to insert");
-        }
-    }
-
-    /**
-     * Insert a run summary but not its objects.
+      
+    /**
+     * Insert a run summary.
      *
      * @param runSummary the run summary object
      */
@@ -405,13 +172,23 @@
     public void insertRunSummary(final RunSummary runSummary) {
         PreparedStatement preparedStatement = null;
         try {
-            preparedStatement = connection.prepareStatement(RunSummaryQuery.INSERT);
+            preparedStatement = connection.prepareStatement(INSERT);                       
             preparedStatement.setInt(1, runSummary.getRun());
-            preparedStatement.setTimestamp(2, new java.sql.Timestamp(runSummary.getStartDate().getTime()), CALENDAR);
-            preparedStatement.setTimestamp(3, new java.sql.Timestamp(runSummary.getEndDate().getTime()), CALENDAR);
-            preparedStatement.setInt(4, runSummary.getTotalEvents());
-            preparedStatement.setInt(5, runSummary.getTotalFiles());
-            preparedStatement.setBoolean(6, runSummary.getEndOkay());
+            preparedStatement.setLong(2, runSummary.getTotalEvents());
+            preparedStatement.setInt(3, runSummary.getTotalFiles());
+            /* Use setObject on the rest as they may be null. */
+            preparedStatement.setObject(4, runSummary.getPrestartTimestamp());
+            preparedStatement.setObject(5, runSummary.getGoTimestamp());
+            preparedStatement.setObject(6, runSummary.getEndTimestamp());
+            preparedStatement.setObject(7, runSummary.getTriggerRate());
+            preparedStatement.setObject(8, runSummary.getTriggerConfigName());
+            preparedStatement.setObject(9, runSummary.getTiTimeOffset());
+            preparedStatement.setObject(10, runSummary.getLivetimeClock());
+            preparedStatement.setObject(11, runSummary.getLivetimeFcupTdc());
+            preparedStatement.setObject(12, runSummary.getLivetimeFcupTrg());
+            preparedStatement.setObject(13, runSummary.getTarget());
+            preparedStatement.setObject(14, runSummary.getNotes());
+            LOGGER.fine(preparedStatement.toString());
             preparedStatement.executeUpdate();
         } catch (final SQLException e) {
             throw new RuntimeException(e);
@@ -425,34 +202,42 @@
             }
         }
     }
-
-    /**
-     * Read a run summary and its objects such as scaler data.
-     *
-     * @param run the run number
-     * @return the full run summary
-     */
-    @Override
-    public RunSummary readFullRunSummary(final int run) {
-
-        // Read main run summary but not referenced objects.
-        final RunSummaryImpl runSummary = (RunSummaryImpl) this.getRunSummary(run);
-
-        // Read EPICS data and set on RunSummary.
-        final List<EpicsData> epicsDataList = new ArrayList<EpicsData>();
-        epicsDataList.addAll(epicsDataDao.getEpicsData(EpicsType.EPICS_1S, run));
-        epicsDataList.addAll(epicsDataDao.getEpicsData(EpicsType.EPICS_10S, run));
-        runSummary.setEpicsData(epicsDataList);
-
-        // Read scaler data and set on RunSummary.
-        runSummary.setScalerData(scalerDataDao.getScalerData(run));
-
-        // Read trigger config.
-        runSummary.setTriggerConfig(triggerConfigIntDao.getTriggerConfig(run));
-
-        return runSummary;
-    }
-
+    
+    @Override
+    public void updateRunSummary(RunSummary runSummary) {
+        PreparedStatement preparedStatement = null;
+        try {
+            preparedStatement = connection.prepareStatement(UPDATE);                                   
+            preparedStatement.setLong(1, runSummary.getTotalEvents());
+            preparedStatement.setInt(2, runSummary.getTotalFiles());
+            preparedStatement.setObject(3, runSummary.getPrestartTimestamp());
+            preparedStatement.setObject(4, runSummary.getGoTimestamp());
+            preparedStatement.setObject(5, runSummary.getEndTimestamp());
+            preparedStatement.setObject(6, runSummary.getTriggerRate());
+            preparedStatement.setObject(7, runSummary.getTriggerConfigName());
+            preparedStatement.setObject(8, runSummary.getTiTimeOffset());
+            preparedStatement.setObject(9, runSummary.getLivetimeClock());
+            preparedStatement.setObject(10, runSummary.getLivetimeFcupTdc());
+            preparedStatement.setObject(11, runSummary.getLivetimeFcupTrg());
+            preparedStatement.setObject(12, runSummary.getTarget());
+            preparedStatement.setObject(13, runSummary.getNotes());
+            preparedStatement.setInt(14, runSummary.getRun());
+            LOGGER.fine(preparedStatement.toString());
+            preparedStatement.executeUpdate();
+        } catch (final SQLException e) {
+            throw new RuntimeException(e);
+        } finally {
+            if (preparedStatement != null) {
+                try {
+                    preparedStatement.close();
+                } catch (final SQLException e) {
+                    e.printStackTrace();
+                }
+            }
+        }
+    }
+    
+   
     /**
      * Return <code>true</code> if a run summary exists in the database for the run number.
      *
@@ -463,7 +248,7 @@
     public boolean runSummaryExists(final int run) {
         PreparedStatement preparedStatement = null;
         try {
-            preparedStatement = connection.prepareStatement("SELECT run FROM runs where run = ?");
+            preparedStatement = connection.prepareStatement("SELECT run FROM run_summaries WHERE run = ?");
             preparedStatement.setInt(1, run);
             final ResultSet rs = preparedStatement.executeQuery();
             return rs.first();
@@ -479,35 +264,4 @@
             }
         }
     }
-
-    /**
-     * Update a run summary but not its complex state.
-     *
-     * @param runSummary the run summary to update
-     */
-    @Override
-    public void updateRunSummary(final RunSummary runSummary) {
-        PreparedStatement preparedStatement = null;
-        try {
-            preparedStatement = connection.prepareStatement(RunSummaryQuery.UPDATE_RUN);
-            preparedStatement.setTimestamp(1, new java.sql.Timestamp(runSummary.getStartDate().getTime()), CALENDAR);
-            preparedStatement.setTimestamp(2, new java.sql.Timestamp(runSummary.getEndDate().getTime()), CALENDAR);
-            preparedStatement.setInt(3, runSummary.getTotalEvents());
-            preparedStatement.setInt(4, runSummary.getTotalFiles());
-            preparedStatement.setBoolean(5, runSummary.getEndOkay());
-            preparedStatement.setBoolean(6, runSummary.getRunOkay());
-            preparedStatement.setInt(7, runSummary.getRun());
-            preparedStatement.executeUpdate();
-        } catch (final SQLException e) {
-            throw new RuntimeException(e);
-        } finally {
-            if (preparedStatement != null) {
-                try {
-                    preparedStatement.close();
-                } catch (final SQLException e) {
-                    e.printStackTrace();
-                }
-            }
-        }
-    }
 }
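
The reworked insertRunSummary and updateRunSummary above bind the nullable columns with setObject rather than the typed setters (per the inline comment "Use setObject on the rest as they may be null"). A minimal stand-alone sketch (hypothetical demo class, no database required) of why the matching fields also need wrapper types such as Double rather than primitives:

```java
// Hedged sketch, not part of the commit: unboxing a null wrapper into a
// primitive throws NullPointerException, so a primitive field (or a
// setDouble/setInt bind) cannot represent a SQL NULL column value.
class NullableColumnDemo {

    // Mirrors the RunSummaryImpl style: wrapper type, may legitimately be null.
    static Double triggerRate = null;

    public static void main(String[] args) {
        boolean threw = false;
        try {
            double unboxed = triggerRate; // auto-unboxing null -> NPE
            System.out.println(unboxed);  // never reached
        } catch (NullPointerException e) {
            threw = true;
        }
        System.out.println("unboxing null threw NPE: " + threw);
    }
}
```

By contrast, PreparedStatement.setObject(i, null) maps the null reference to SQL NULL, which is why the new code uses it for every column that may be absent.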

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java	Wed Feb 24 13:06:58 2016
@@ -1,40 +1,13 @@
 package org.hps.run.database;
 
-import java.io.File;
-import java.text.DateFormat;
-import java.text.SimpleDateFormat;
-import java.util.ArrayList;
 import java.util.Date;
-import java.util.GregorianCalendar;
-import java.util.HashMap;
-import java.util.List;
-import java.util.Map;
-import java.util.TimeZone;
-
-import org.hps.datacat.client.DatasetFileFormat;
-import org.hps.record.epics.EpicsData;
-import org.hps.record.scalers.ScalerData;
-import org.hps.record.triggerbank.TriggerConfig;
 
 /**
  * Implementation of {@link RunSummary} for retrieving information from the run database.
  *
  * @author Jeremy McCormick, SLAC
  */
-public final class RunSummaryImpl implements RunSummary {
-
-    /**
-     * Default date display format.
-     */
-    private static final DateFormat DATE_DISPLAY = new SimpleDateFormat();
-
-    static {
-        /**
-         * Set default time zone for display to East Coast (JLAB) where data was
-         * taken.
-         */
-        DATE_DISPLAY.setCalendar(new GregorianCalendar(TimeZone.getTimeZone("America/New_York")));
-    }
+final class RunSummaryImpl implements RunSummary {
 
     /**
      * Date this record was created.
@@ -42,60 +15,80 @@
     private Date created;
 
     /**
-     * End date of run.
-     */
-    private Date endDate;
-
-    /**
-     * This is <code>true</code> if the END event is found in the data.
-     */
-    private boolean endOkay;
-
-    /**
-     * The EPICS data from the run.
-     */
-    private List<EpicsData> epicsDataList;
+     * Timestamp of END event.
+     */
+    private Integer endTimestamp;
+
+    /**
+     * Timestamp of GO event.
+     */
+    private Integer goTimestamp;
+
+    /**
+     * Clock livetime calculation.
+     */
+    private Double livetimeClock;
+
+    /**
+     * FCup TDC livetime calculation.
+     */
+    private Double livetimeTdc;
+
+    /**
+     * FCup TRG livetime calculation.
+     */
+    private Double livetimeTrg;
+
+    /**
+     * Notes about the run (from spreadsheet).
+     */
+    private String notes;
+
+    /**
+     * Timestamp of PRESTART event.
+     */
+    private Integer prestartTimestamp;
 
     /**
      * The run number.
      */
-    private final int run;
-
-    /**
-     * Flag to indicate run was okay.
-     */
-    private boolean runOkay = true;
-
-    /**
-     * The scaler data for the run.
-     */
-    private List<ScalerData> scalerDataList;
-
-    /**
-     * The trigger data for the run.
-     */
-    private TriggerConfig triggerConfig;
-
-    /**
-     * Start date of run.
-     */
-    private Date startDate;
+    private final Integer run;
+
+    /**
+     * Target setup (string from run spreadsheet).
+     */
+    private String target;
+
+    /**
+     * TI time offset in ns.
+     */
+    private Long tiTimeOffset;
 
     /**
      * The total events found in the run across all files.
      */
-    private int totalEvents = -1;
+    private Long totalEvents;
 
     /**
      * The total number of files in the run.
      */
-    private int totalFiles = 0;
+    private Integer totalFiles;
+   
+    /**
+     * Name of the trigger config file.
+     */
+    private String triggerConfigName;
+
+    /**
+     * Trigger rate in kHz.
+     */
+    private Double triggerRate;
 
     /**
      * Date when the run record was last updated.
      */
     private Date updated;
-    
+
     /**
      * Create a run summary.
      *
@@ -105,209 +98,174 @@
         this.run = run;
     }
 
-    /**
-     * Get the creation date of this run record.
-     *
-     * @return the creation date of this run record
-     */
+    @Override
     public Date getCreated() {
         return this.created;
     }
 
-    /**
-     * Get the end date.
-     *
-     * @return the end date
-     */
-    public Date getEndDate() {
-        return endDate;
-    }
-
-    /**
-     * Return <code>true</code> if END event was found in the data.
-     *
-     * @return <code>true</code> if END event was in the data
-     */
-    public boolean getEndOkay() {
-        return this.endOkay;
-    }
-
-    /**
-     * Get the EPICS data from the run.
-     *
-     * @return the EPICS data from the run
-     */
-    public List<EpicsData> getEpicsData() {
-        return this.epicsDataList;
-    }
-
-    /**
-     * Get the event rate (effectively the trigger rate) which is the total
-     * events divided by the number of seconds in the run.
-     *
-     * @return the event rate
-     */
-    public double getEventRate() {
-        if (this.getTotalEvents() <= 0) {
-            throw new RuntimeException("Total events is zero or invalid.");
-        }
-        return (double) this.getTotalEvents() / (double) this.getTotalSeconds();
-    }
-
-    /**
-     * Get the run number.
-     *
-     * @return the run number
-     */
-    public int getRun() {
+    @Override
+    public Integer getEndTimestamp() {
+        return endTimestamp;
+    }
+
+    @Override
+    public Integer getGoTimestamp() {
+        return goTimestamp;
+    }
+
+    @Override
+    public Double getLivetimeClock() {
+        return this.livetimeClock;
+    }
+
+    @Override
+    public Double getLivetimeFcupTdc() {
+        return this.livetimeTdc;
+    }
+
+    @Override
+    public Double getLivetimeFcupTrg() {
+        return this.livetimeTrg;
+    }
+
+    @Override
+    public String getNotes() {
+        return this.notes;
+    }
+
+    @Override
+    public Integer getPrestartTimestamp() {
+        return prestartTimestamp;
+    }
+
+    @Override
+    public Integer getRun() {
         return this.run;
     }
 
-    /**
-     * Return <code>true</code> if the run was okay (no major errors or data
-     * corruption occurred).
-     *
-     * @return <code>true</code> if the run was okay
-     */
-    public boolean getRunOkay() {
-        return this.runOkay;
-    }
-
-    /**
-     * Get the scaler data of this run.
-     *
-     * @return the scaler data of this run
-     */
-    public List<ScalerData> getScalerData() {
-        return this.scalerDataList;
-    }
-
-    /**
-     * Get the trigger config of this run.
-     *
-     * @return the trigger config of this run
-     */
-    public TriggerConfig getTriggerConfig() {
-        return triggerConfig;
-    }
-
-    /**
-     * Get the start date.
-     *
-     * @return the start date
-     */
-    public Date getStartDate() {
-        return startDate;
-    }
-
-    /**
-     * Get the total events in the run.
-     *
-     * @return the total events in the run
-     */
-    public int getTotalEvents() {
+    @Override
+    public String getTarget() {
+        return this.target;
+    }
+
+    @Override
+    public Long getTiTimeOffset() {
+        return this.tiTimeOffset;
+    }
+
+    @Override
+    public Long getTotalEvents() {
         return this.totalEvents;
     }
 
-    /**
-     * Get the total number of files for this run.
-     *
-     * @return the total number of files for this run
-     */
-    public int getTotalFiles() {
+    @Override
+    public Integer getTotalFiles() {
         return this.totalFiles;
     }
-
-    /**
-     * Get the number of seconds in the run which is the difference between the
-     * start and end times.
-     *
-     * @return the total seconds in the run
-     */
-    public long getTotalSeconds() {
-        return (endDate.getTime() - startDate.getTime()) / 1000;
-    }
-
-    /**
-     * Get the date when this run record was last updated.
-     *
-     * @return the date when this run record was last updated
-     */
+   
+    @Override
+    public String getTriggerConfigName() {
+        return this.triggerConfigName;
+    }
+
+    @Override
+    public Double getTriggerRate() {
+        return this.triggerRate;
+    }
+
+    @Override
     public Date getUpdated() {
         return updated;
     }
-    
-    /**
-     * Set the creation date of the run record.
-     *
-     * @param created the creation date of the run record
-     */
-    void setCreated(final Date created) {
+
+    /**
+     * Set the creation date of the run summary.
+     * 
+     * @param created the creation date
+     */
+    void setCreated(Date created) {
         this.created = created;
     }
 
     /**
-     * Set the start date.
-     *
-     * @param startDate the start date
-     */
-    void setEndDate(final Date endDate) {
-        this.endDate = endDate;
-    }
-
-    /**
-     * Set if end is okay.
-     *
-     * @param endOkay <code>true</code> if end is okay
-     */
-    void setEndOkay(final boolean endOkay) {
-        this.endOkay = endOkay;
-    }
-   
-    /**
-     * Set the EPICS data for the run.
-     *
-     * @param epics the EPICS data for the run
-     */
-    void setEpicsData(final List<EpicsData> epicsDataList) {
-        this.epicsDataList = epicsDataList;
-    }
-    
-    /**
-     * Set whether the run was "okay" meaning the data is usable for physics
-     * analysis.
-     *
-     * @param runOkay <code>true</code> if the run is okay
-     */
-    void setRunOkay(final boolean runOkay) {
-        this.runOkay = runOkay;
-    }
-
-    /**
-     * Set the scaler data of the run.
-     *
-     * @param scalerData the scaler data
-     */
-    void setScalerData(final List<ScalerData> scalerDataList) {
-        this.scalerDataList = scalerDataList;
-    }
-
-    /**
-     * Set the trigger config of the run.
-     *
-     * @param triggerConfig the trigger config
-     */
-    void setTriggerConfig(final TriggerConfig triggerConfig) {
-        this.triggerConfig = triggerConfig;
-    }
-
-    /**
-     * Set the start date.
-     *
-     * @param startDate the start date
-     */
-    void setStartDate(final Date startDate) {
-        this.startDate = startDate;
+     * Set the end timestamp.
+     * 
+     * @param endTimestamp the end timestamp
+     */
+    void setEndTimestamp(Integer endTimestamp) {
+        this.endTimestamp = endTimestamp;
+    }
+
+    /**
+     * Set the GO timestamp.
+     * 
+     * @param goTimestamp the GO timestamp
+     */
+    void setGoTimestamp(Integer goTimestamp) {
+        this.goTimestamp = goTimestamp;
+    }
+
+    /**
+     * Set the clock livetime. 
+     * 
+     * @param livetimeClock the clock livetime
+     */
+    void setLivetimeClock(Double livetimeClock) {
+        this.livetimeClock = livetimeClock;
+    }
+
+    /**
+     * Set the FCup TDC livetime.
+     * 
+     * @param livetimeTdc the FCup TDC livetime
+     */
+    void setLivetimeFcupTdc(Double livetimeTdc) {
+        this.livetimeTdc = livetimeTdc;
+    }
+
+    /**
+     * Set the FCup TRG livetime.
+     * 
+     * @param livetimeTrg the FCup TRG livetime
+     */
+    void setLivetimeFcupTrg(Double livetimeTrg) {
+        this.livetimeTrg = livetimeTrg;
+    }
+
+    /**
+     * Set the notes.
+     * 
+     * @param notes the notes
+     */
+    void setNotes(String notes) {
+        this.notes = notes;
+    }
+
+    /**
+     * Set the PRESTART timestamp.
+     * 
+     * @param prestartTimestamp the PRESTART timestamp
+     */
+    void setPrestartTimestamp(Integer prestartTimestamp) {
+        this.prestartTimestamp = prestartTimestamp;
+    }
+
+    /**
+     * Set the target description.
+     * 
+     * @param target the target description
+     */
+    void setTarget(String target) {
+        this.target = target;
+    }
+
+    /**
+     * Set the TI time offset in ns.
+     * 
+     * @param tiTimeOffset the TI time offset in ns
+     */
+    void setTiTimeOffset(Long tiTimeOffset) {
+        this.tiTimeOffset = tiTimeOffset;
     }
 
     /**
@@ -315,7 +273,7 @@
      *
      * @param totalEvents the total number of physics events in the run
      */
-    void setTotalEvents(final int totalEvents) {
+    void setTotalEvents(final Long totalEvents) {
         this.totalEvents = totalEvents;
     }
 
@@ -324,37 +282,59 @@
      *
      * @param totalFiles the total number of EVIO files in the run
      */
-    void setTotalFiles(final int totalFiles) {
+    void setTotalFiles(final Integer totalFiles) {
         this.totalFiles = totalFiles;
     }
 
     /**
-     * Set the date when this run record was last updated.
-     *
-     * @param updated the date when the run record was last updated
-     */
-    void setUpdated(final Date updated) {
+     * Set the trigger config file.
+     * 
+     * @param triggerConfigName the trigger config file
+     */
+    void setTriggerConfigName(String triggerConfigName) {
+        this.triggerConfigName = triggerConfigName;
+    }
+
+    /**
+     * Set the trigger rate in kHz.
+     * 
+     * @param triggerRate the trigger rate in kHz
+     */
+    void setTriggerRate(Double triggerRate) {
+        this.triggerRate = triggerRate;
+    }
+
+    /**
+     * Set the updated date of the summary.
+     * 
+     * @param updated the updated date
+     */
+    void setUpdated(Date updated) {
         this.updated = updated;
     }
-    
-    /**
-     * Convert this object to a string.
-     *
+
+    /**
+     * Convert the object to a string.
+     * 
      * @return this object converted to a string
      */
     @Override
     public String toString() {
         return "RunSummary { " 
                 + "run: " + this.getRun() 
-                + ", startDate: " + (this.getStartDate() != null ? DATE_DISPLAY.format(this.getStartDate()) : null)
-                + ", endDate: " + (this.getEndDate() != null ? DATE_DISPLAY.format(this.getEndDate()) : null) 
-                + ", totalEvents: " + this.getTotalEvents()
-                + ", totalFiles: " + this.getTotalFiles() 
-                + ", endOkay: " + this.getEndOkay() 
-                + ", runOkay: "
-                + this.getRunOkay() 
-                + ", updated: " + this.getUpdated() 
+                + ", events: " + this.getTotalEvents() 
+                + ", files: " + this.getTotalFiles() 
                 + ", created: " + this.getCreated() 
+                + ", updated: " + this.getUpdated()
+                + ", prestartTimestamp: " + this.getPrestartTimestamp()
+                + ", goTimestamp: " + this.getGoTimestamp()
+                + ", endTimestamp: " + this.getEndTimestamp()
+                + ", triggerConfigName: " + this.getTriggerConfigName()
+                + ", triggerRate: " + this.getTriggerRate()
+                + ", livetimeClock: " + this.getLivetimeClock()
+                + ", livetimeTdc: " + this.getLivetimeFcupTdc()
+                + ", livetimeTrg: " + this.getLivetimeFcupTrg()
+                + ", tiTimeOffset: " + this.getTiTimeOffset() 
                 + " }";
     }
 }
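
The removed getEventRate and getTotalSeconds methods derived a rate from the old start/end Dates; with the new schema the same quantity could be derived from the PRESTART and END timestamps instead. A hedged sketch, assuming those timestamps are Unix seconds and using illustrative names that are not part of the committed API:

```java
// Hypothetical helper, not part of the commit: derives an event rate from the
// new-schema fields, assuming prestart/end timestamps are Unix seconds.
class EventRateSketch {

    static double eventRateHz(long totalEvents, int prestartTimestamp, int endTimestamp) {
        final int seconds = endTimestamp - prestartTimestamp;
        if (totalEvents <= 0 || seconds <= 0) {
            throw new IllegalArgumentException("Invalid event count or time range.");
        }
        return (double) totalEvents / (double) seconds;
    }

    public static void main(String[] args) {
        // 2,000,000 events over a 1000 second run -> 2000.0 Hz (2 kHz).
        System.out.println("event rate: "
                + eventRateHz(2_000_000L, 1456000000, 1456001000) + " Hz");
    }
}
```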

Modified: java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDaoImpl.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDaoImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDaoImpl.java	Wed Feb 24 13:06:58 2016
@@ -18,15 +18,9 @@
 final class ScalerDataDaoImpl implements ScalerDataDao {
 
     /**
-     * SQL query strings.
+     * Insert a record.
      */
-    private static final class ScalerDataQuery {
-
-        /**
-         * Insert a record.
-         */
-        private static final String INSERT = createInsertSql();
-    }
+    private static final String INSERT = createInsertSql();    
 
     /**
      * Create insert SQL for scaler data.
@@ -102,7 +96,8 @@
         PreparedStatement selectScalers = null;
         final List<ScalerData> scalerDataList = new ArrayList<ScalerData>();
         try {
-            selectScalers = this.connection.prepareStatement("SELECT * FROM scalers WHERE run = ? ORDER BY event");
+            selectScalers = this.connection.prepareStatement(
+                    "SELECT * FROM scalers WHERE run = ? ORDER BY event");
             selectScalers.setInt(1, run);
             final ResultSet resultSet = selectScalers.executeQuery();
             while (resultSet.next()) {
@@ -139,7 +134,7 @@
     public void insertScalerData(final List<ScalerData> scalerDataList, final int run) {
         PreparedStatement insertScalers = null;
         try {
-            insertScalers = this.connection.prepareStatement(ScalerDataQuery.INSERT);
+            insertScalers = this.connection.prepareStatement(INSERT);
             for (final ScalerData scalerData : scalerDataList) {
                 insertScalers.setInt(1, run);
                 insertScalers.setInt(2, scalerData.getEventId());

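The INSERT constant above is still produced by createInsertSql(), whose body is not shown in this diff. As an illustration only (the helper below is a hypothetical reconstruction, not the real one), such a builder typically assembles the column list and a matching run of ? placeholders once, so the DAO can prepare the statement from a single constant:

```java
public class ScalerInsertSqlSketch {

    // Hypothetical sketch of a createInsertSql()-style helper: build
    // "INSERT INTO <table> (<cols>) VALUES (?, ?, ...)" with one placeholder
    // per column, for use with PreparedStatement parameter binding.
    static String createInsertSql(String table, String... columns) {
        StringBuilder sql = new StringBuilder("INSERT INTO ").append(table).append(" (");
        StringBuilder placeholders = new StringBuilder();
        for (int i = 0; i < columns.length; i++) {
            if (i > 0) {
                sql.append(", ");
                placeholders.append(", ");
            }
            sql.append(columns[i]);
            placeholders.append('?');
        }
        return sql.append(") VALUES (").append(placeholders).append(')').toString();
    }

    public static void main(String[] args) {
        System.out.println(createInsertSql("scalers", "run", "event"));
    }
}
```

Building the SQL once and binding values through setInt()/setLong(), as insertScalerData() does, keeps the query plan cacheable and avoids SQL injection from concatenated values.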
Modified: java/trunk/run-database/src/main/java/org/hps/run/database/package-info.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/package-info.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/run/database/package-info.java	Wed Feb 24 13:06:58 2016
@@ -1,4 +1,4 @@
 /**
- * API for accessing the HPS run database.
+ * API for accessing and updating the HPS run database.
  */
 package org.hps.run.database;

Modified: java/trunk/users/src/main/java/org/hps/users/holly/EcalRawConverter.java
 =============================================================================
--- java/trunk/users/src/main/java/org/hps/users/holly/EcalRawConverter.java	(original)
+++ java/trunk/users/src/main/java/org/hps/users/holly/EcalRawConverter.java	Wed Feb 24 13:06:58 2016
@@ -138,7 +138,7 @@
 					System.out.println("======================================================================");
 					System.out.println("=== FADC Pulse-Processing Settings ===================================");
 					System.out.println("======================================================================");
-					config.printConfig();
+					config.printConfig(System.out);
 				}
 			}
     	});

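Passing System.out explicitly, as the changed printConfig call does, makes the print destination injectable. A minimal sketch (the config class and its output line here are hypothetical, not the real DAQ configuration class) shows why that helps: the same method can write to the console or into a buffer for testing:

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;

public class ConfigPrintSketch {

    // Single line of "configuration" used by this sketch; purely illustrative.
    static String configLine() {
        return "mode=pulse-processing";
    }

    // Takes the destination as a parameter instead of hard-coding System.out,
    // following the printConfig(System.out) pattern in the diff.
    static void printConfig(PrintStream out) {
        out.println(configLine());
    }

    public static void main(String[] args) {
        // Print to the console...
        printConfig(System.out);
        // ...or capture the output in memory.
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        printConfig(new PrintStream(buffer, true));
        System.out.print(buffer.toString());
    }
}
```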
Modified: java/trunk/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java
 =============================================================================
--- java/trunk/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java	(original)
+++ java/trunk/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java	Wed Feb 24 13:06:58 2016
@@ -122,10 +122,10 @@
                 if (runNum != currentRun) {
                     if (useTI && !useCrawlerTI) {
                         RunManager.getRunManager().setRun(runNum);
-                        if (!RunManager.getRunManager().runExists() || RunManager.getRunManager().getTriggerConfig().getTiTimeOffset() == null) {
+                        if (!RunManager.getRunManager().runExists() || RunManager.getRunManager().getRunSummary().getTiTimeOffset() == null) {
                             continue;
                         }
-                        tiTimeOffset = RunManager.getRunManager().getTriggerConfig().getTiTimeOffset();
+                        tiTimeOffset = RunManager.getRunManager().getRunSummary().getTiTimeOffset();
                         if (tiTimeOffset == 0) {
                             continue;
                         }

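The hunk above guards against a missing run summary, a null TI time offset, and a zero offset before using the value. The combined null-and-zero condition can be factored into one helper, sketched here with hypothetical names (the real code reads the value from RunManager):

```java
public class TiOffsetGuard {

    // Returns true only when the TI time offset is present and non-zero,
    // mirroring the null check and the tiTimeOffset == 0 skip in the diff.
    static boolean usableTiOffset(Long tiTimeOffset) {
        return tiTimeOffset != null && tiTimeOffset != 0L;
    }

    public static void main(String[] args) {
        System.out.println(usableTiOffset(null));
        System.out.println(usableTiOffset(0L));
        System.out.println(usableTiOffset(57000000L));
    }
}
```

Checking for null before unboxing matters here: comparing a null Long directly against 0 would throw a NullPointerException, which is why the null check must short-circuit first.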