LISTSERV mailing list manager LISTSERV 16.5

HPS-SVN Archives


HPS-SVN@LISTSERV.SLAC.STANFORD.EDU


HPS-SVN July 2016

Subject:

r4419 - in /java/trunk: ./ crawler/ crawler/src/main/java/org/hps/crawler/ datacat/ datacat/src/ datacat/src/main/ datacat/src/main/java/ datacat/src/main/java/org/ datacat/src/main/java/org/hps/ datacat/src/main/java/org/hps/datacat/ evio/src/main/java/org/hps/evio/ evio/src/test/java/org/hps/evio/ job/src/main/java/org/hps/job/ parent/ record-util/src/main/java/org/hps/record/daqconfig/ record-util/src/main/java/org/hps/record/triggerbank/ record-util/src/main/java/org/hps/record/util/ run-database/ run-database/src/main/java/org/hps/run/ run-database/src/main/java/org/hps/rundb/ run-database/src/main/java/org/hps/rundb/builder/ run-database/src/test/java/org/hps/run/ run-database/src/test/java/org/hps/rundb/ run-database/src/test/java/org/hps/rundb/builder/ users/src/main/java/org/hps/users/meeg/ users/src/main/java/org/hps/users/spaul/

From:

[log in to unmask]

Reply-To:

Notification of commits to the hps svn repository <[log in to unmask]>

Date:

Thu, 7 Jul 2016 22:52:48 -0000

Content-Type:

text/plain

Parts/Attachments:


text/plain (2938 lines)

Author: [log in to unmask]
Date: Thu Jul  7 15:52:41 2016
New Revision: 4419

Log:
[HPSJAVA-675] Major reorg of run database and datacat modules (see JIRA item sub-tasks for details).

Added:
    java/trunk/datacat/
    java/trunk/datacat/pom.xml
    java/trunk/datacat/src/
    java/trunk/datacat/src/main/
    java/trunk/datacat/src/main/java/
    java/trunk/datacat/src/main/java/org/
    java/trunk/datacat/src/main/java/org/hps/
    java/trunk/datacat/src/main/java/org/hps/datacat/
    java/trunk/datacat/src/main/java/org/hps/datacat/DataType.java
      - copied, changed from r4415, java/trunk/crawler/src/main/java/org/hps/crawler/DataType.java
    java/trunk/datacat/src/main/java/org/hps/datacat/DatacatConstants.java
    java/trunk/datacat/src/main/java/org/hps/datacat/DatacatPrintRun.java
      - copied, changed from r4415, java/trunk/crawler/src/main/java/org/hps/crawler/DatacatPrintRun.java
    java/trunk/datacat/src/main/java/org/hps/datacat/DatacatUtilities.java
    java/trunk/datacat/src/main/java/org/hps/datacat/FileEventRange.java
    java/trunk/datacat/src/main/java/org/hps/datacat/FileFormat.java
      - copied, changed from r4415, java/trunk/crawler/src/main/java/org/hps/crawler/FileFormat.java
    java/trunk/datacat/src/main/java/org/hps/datacat/Site.java
      - copied, changed from r4415, java/trunk/crawler/src/main/java/org/hps/crawler/Site.java
    java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TriggerConfigEvioProcessor.java
      - copied, changed from r4415, java/trunk/record-util/src/main/java/org/hps/record/daqconfig/TriggerConfigEvioProcessor.java
    java/trunk/run-database/src/main/java/org/hps/rundb/
    java/trunk/run-database/src/main/java/org/hps/rundb/DaoProvider.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/DaoProvider.java
    java/trunk/run-database/src/main/java/org/hps/rundb/EpicsDataDao.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDao.java
    java/trunk/run-database/src/main/java/org/hps/rundb/EpicsDataDaoImpl.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/rundb/EpicsType.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsType.java
    java/trunk/run-database/src/main/java/org/hps/rundb/EpicsVariable.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariable.java
    java/trunk/run-database/src/main/java/org/hps/rundb/EpicsVariableDao.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariableDao.java
    java/trunk/run-database/src/main/java/org/hps/rundb/EpicsVariableDaoImpl.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariableDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/rundb/RunManager.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunManager.java
    java/trunk/run-database/src/main/java/org/hps/rundb/RunSummary.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunSummary.java
    java/trunk/run-database/src/main/java/org/hps/rundb/RunSummaryDao.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java
    java/trunk/run-database/src/main/java/org/hps/rundb/RunSummaryDaoImpl.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/rundb/RunSummaryImpl.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java
    java/trunk/run-database/src/main/java/org/hps/rundb/ScalerDataDao.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDao.java
    java/trunk/run-database/src/main/java/org/hps/rundb/ScalerDataDaoImpl.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/rundb/SvtConfigDao.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/SvtConfigDao.java
    java/trunk/run-database/src/main/java/org/hps/rundb/SvtConfigDaoImpl.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/SvtConfigDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/rundb/TriggerConfigDao.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/TriggerConfigDao.java
    java/trunk/run-database/src/main/java/org/hps/rundb/TriggerConfigDaoImpl.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/TriggerConfigDaoImpl.java
    java/trunk/run-database/src/main/java/org/hps/rundb/builder/
    java/trunk/run-database/src/main/java/org/hps/rundb/builder/AbstractRunBuilder.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/AbstractRunBuilder.java
    java/trunk/run-database/src/main/java/org/hps/rundb/builder/BuilderCommandLine.java   (with props)
    java/trunk/run-database/src/main/java/org/hps/rundb/builder/DatacatBuilder.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/DatacatBuilder.java
    java/trunk/run-database/src/main/java/org/hps/rundb/builder/EvioDataBuilder.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EvioDataBuilder.java
    java/trunk/run-database/src/main/java/org/hps/rundb/builder/EvioDataCommandLine.java   (with props)
    java/trunk/run-database/src/main/java/org/hps/rundb/builder/LivetimeBuilder.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/LivetimeBuilder.java
    java/trunk/run-database/src/main/java/org/hps/rundb/builder/SpreadsheetBuilder.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/SpreadsheetBuilder.java
    java/trunk/run-database/src/main/java/org/hps/rundb/package-info.java
      - copied, changed from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/package-info.java
    java/trunk/run-database/src/test/java/org/hps/rundb/
    java/trunk/run-database/src/test/java/org/hps/rundb/builder/
    java/trunk/run-database/src/test/java/org/hps/rundb/builder/RunBuilderTest.java
      - copied, changed from r4415, java/trunk/run-database/src/test/java/org/hps/run/database/RunBuilderTest.java
Removed:
    java/trunk/crawler/src/main/java/org/hps/crawler/DataType.java
    java/trunk/crawler/src/main/java/org/hps/crawler/DatacatPrintRun.java
    java/trunk/crawler/src/main/java/org/hps/crawler/FileFormat.java
    java/trunk/crawler/src/main/java/org/hps/crawler/Site.java
    java/trunk/record-util/src/main/java/org/hps/record/daqconfig/TriggerConfigEvioProcessor.java
    java/trunk/record-util/src/main/java/org/hps/record/util/
    java/trunk/run-database/src/main/java/org/hps/run/
    java/trunk/run-database/src/test/java/org/hps/run/
Modified:
    java/trunk/crawler/pom.xml
    java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java
    java/trunk/crawler/src/main/java/org/hps/crawler/DatacatAddFile.java
    java/trunk/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java
    java/trunk/crawler/src/main/java/org/hps/crawler/DatacatHelper.java
    java/trunk/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java
    java/trunk/crawler/src/main/java/org/hps/crawler/FileFormatFilter.java
    java/trunk/crawler/src/main/java/org/hps/crawler/MetadataWriter.java
    java/trunk/evio/src/main/java/org/hps/evio/EvioToLcio.java
    java/trunk/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java
    java/trunk/evio/src/test/java/org/hps/evio/ScalersTest.java
    java/trunk/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java
    java/trunk/parent/pom.xml
    java/trunk/pom.xml
    java/trunk/run-database/pom.xml
    java/trunk/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java
    java/trunk/users/src/main/java/org/hps/users/spaul/FindBiasOnRange.java

Modified: java/trunk/crawler/pom.xml
 =============================================================================
--- java/trunk/crawler/pom.xml	(original)
+++ java/trunk/crawler/pom.xml	Thu Jul  7 15:52:41 2016
@@ -20,8 +20,8 @@
             <artifactId>hps-record-util</artifactId>
         </dependency>
         <dependency>
-            <groupId>srs</groupId>
-            <artifactId>org-srs-datacat-client</artifactId>
+            <groupId>org.hps</groupId>
+            <artifactId>hps-datacat</artifactId>
         </dependency>
     </dependencies>
 </project>

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/CrawlerConfig.java	Thu Jul  7 15:52:41 2016
@@ -11,6 +11,9 @@
 import java.util.Set;
 
 import org.hps.conditions.database.ConnectionParameters;
+import org.hps.datacat.DatacatConstants;
+import org.hps.datacat.FileFormat;
+import org.hps.datacat.Site;
 
 /**
  * Full configuration information for the {@link Crawler} class.
@@ -78,7 +81,7 @@
     /**
      * Base URL of datacat client.
      */
-    private String baseUrl = DatacatHelper.DATACAT_URL;
+    private String baseUrl = DatacatConstants.DATACAT_URL;
         
     /**
      * Set of paths used for filtering files (file's path must match one of these).

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/DatacatAddFile.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/DatacatAddFile.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/DatacatAddFile.java	Thu Jul  7 15:52:41 2016
@@ -6,10 +6,12 @@
 import java.util.logging.Logger;
 
 import org.apache.commons.cli.CommandLine;
-import org.apache.commons.cli.PosixParser;
 import org.apache.commons.cli.HelpFormatter;
 import org.apache.commons.cli.Options;
 import org.apache.commons.cli.ParseException;
+import org.apache.commons.cli.PosixParser;
+import org.hps.datacat.DatacatUtilities;
+import org.hps.datacat.Site;
 import org.srs.datacat.model.DatasetModel;
 
 /**
@@ -23,6 +25,11 @@
     
     private List<File> paths = new ArrayList<File>();
     
+    private String folder = null;
+    private Site site = Site.JLAB;
+    private String datacatUrl = "http://hpsweb.jlab.org/datacat/r";
+    private boolean dryRun = false;
+    
     /**
      * Command line options.
      */
@@ -32,11 +39,12 @@
      * Statically define the command options.
      */
     static {
-        OPTIONS.addOption("h", "help", false, "print help and exit (overrides all other arguments)");
-        OPTIONS.addOption("f", "folder", true, "datacat folder");
-        OPTIONS.addOption("s", "site", true, "datacat site");
-        OPTIONS.addOption("u", "base-url", true, "provide a base URL of the datacat server");
-        OPTIONS.addOption("D", "dry-run", false, "dry run mode which will not updated the datacat");
+        OPTIONS.addOption("h", "help", false, "Print help and exit (overrides all other arguments).");
+        OPTIONS.addOption("f", "folder", true, "Datacat folder (required)");
+        OPTIONS.addOption("s", "site", true, "Datacat site (default is JLAB)");
+        OPTIONS.addOption("u", "url", true, "Set the URL of a datacat server (default is JLAB prod server)");
+        OPTIONS.addOption("D", "dry-run", false, "Dry run mode which will not update the datacat");
+        OPTIONS.addOption("p", "patch", false, "Allow patching of existing records in the datacat");
     }
 
     /**
@@ -49,14 +57,11 @@
     }
 
     /**
-     * The crawler configuration.
-     */
-    private CrawlerConfig config;
-
-    /**
      * The options parser.
      */
     private final PosixParser parser = new PosixParser();
+    
+    private boolean patch = false;
     
     /**
      * Parse command line options.
@@ -68,8 +73,6 @@
         
         LOGGER.config("parsing command line options");
 
-        this.config = new CrawlerConfig();
-
         try {
             final CommandLine cl = this.parser.parse(OPTIONS, args);
 
@@ -80,15 +83,15 @@
 
             // Datacat folder.
             if (cl.hasOption("f")) {
-                config.setDatacatFolder(cl.getOptionValue("f"));
-                LOGGER.config("set datacat folder to " + config.folder());
+                folder = cl.getOptionValue("f");
+                LOGGER.config("set datacat folder to " + folder);
             } else {
                 throw new RuntimeException("The -f argument with the datacat folder is required.");
             }
 
             // Dry run.
             if (cl.hasOption("D")) {
-                config.setDryRun(true);
+                this.dryRun = true;
             }
                         
             // List of paths.
@@ -103,21 +106,23 @@
             }
             
             // Dataset site (defaults to JLAB).
-            Site site = Site.JLAB;
             if (cl.hasOption("s")) {
-                site = Site.valueOf(cl.getOptionValue("s"));
+                this.site = Site.valueOf(cl.getOptionValue("s"));
             }
-            LOGGER.config("dataset site " + site);
-            config.setSite(site);
-            
+            LOGGER.config("datacat site: " + site);
+                        
             // Data catalog URL.
             if (cl.hasOption("u")) {
-                config.setDatacatUrl(cl.getOptionValue("u"));
-                LOGGER.config("datacat URL " + config.datacatUrl());
+                datacatUrl = cl.getOptionValue("u");
             }
+            LOGGER.config("datacat url: " + datacatUrl);
 
+            if (cl.hasOption("p")) {
+                this.patch = true;
+            }
+            
         } catch (final ParseException e) {
-            throw new RuntimeException("Error parsing options.", e);
+            throw new RuntimeException("Error parsing the command line options.", e);
         }
 
         LOGGER.info("Done parsing command line options.");
@@ -138,12 +143,12 @@
      * Run the job.
      */
     private void run() {        
-        List<DatasetModel> datasets = DatacatHelper.createDatasets(paths, config.folder(), config.site().toString());        
-        if (!config.dryRun()) {
-            DatacatHelper.addDatasets(datasets, config.folder(), config.datacatUrl());
-            LOGGER.info("Added " + datasets.size() + " datasets to datacat.");
+        List<DatasetModel> datasets = DatacatHelper.createDatasets(paths, folder, site.toString());
+        if (!dryRun) {
+            DatacatUtilities.updateDatasets(datasets, folder, datacatUrl, patch);
+            //LOGGER.info("Added " + datasets.size() + " datasets to datacat.");
         } else {
-            LOGGER.info("Dry run mode; skipped adding dataset.");
+            LOGGER.info("Dry run is enabled; skipped adding dataset.");
         }
      }
 }

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/DatacatCrawler.java	Thu Jul  7 15:52:41 2016
@@ -13,10 +13,13 @@
 import java.util.logging.Logger;
 
 import org.apache.commons.cli.CommandLine;
-import org.apache.commons.cli.PosixParser;
 import org.apache.commons.cli.HelpFormatter;
 import org.apache.commons.cli.Options;
 import org.apache.commons.cli.ParseException;
+import org.apache.commons.cli.PosixParser;
+import org.hps.datacat.DatacatUtilities;
+import org.hps.datacat.FileFormat;
+import org.hps.datacat.Site;
 import org.srs.datacat.model.DatasetModel;
 
 /**
@@ -63,7 +66,7 @@
         OPTIONS.addOption("t", "timestamp-file", true, "existing or new timestamp file name");
         OPTIONS.addOption("x", "max-depth", true, "max depth to crawl");
         OPTIONS.addOption("D", "dry-run", false, "dry run which will not update the datacat");
-        OPTIONS.addOption("u", "base-url", true, "provide a base URL of the datacat server");
+        OPTIONS.addOption("u", "url", true, "provide a base URL of the datacat server");
     }
 
     /**
@@ -300,7 +303,7 @@
         if (!visitor.getFiles().isEmpty()) {
             List<DatasetModel> datasets = DatacatHelper.createDatasets(visitor.getFiles(), config.folder(), config.site().toString());
             LOGGER.info("built " + datasets.size() + " datasets");
-            DatacatHelper.addDatasets(datasets, config.folder(), config.datacatUrl());
+            DatacatUtilities.updateDatasets(datasets, config.folder(), config.datacatUrl(), false);
             LOGGER.info("added datasets to datacat");
         } else {
             LOGGER.warning("No files were found by the crawler.");

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/DatacatHelper.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/DatacatHelper.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/DatacatHelper.java	Thu Jul  7 15:52:41 2016
@@ -2,23 +2,15 @@
 
 import java.io.File;
 import java.io.IOException;
-import java.net.URISyntaxException;
 import java.util.ArrayList;
-import java.util.HashMap;
-import java.util.HashSet;
 import java.util.List;
 import java.util.Map;
-import java.util.Map.Entry;
-import java.util.Set;
-import java.util.logging.Level;
 import java.util.logging.Logger;
 
-import org.srs.datacat.client.Client;
-import org.srs.datacat.client.ClientBuilder;
+import org.hps.datacat.DataType;
+import org.hps.datacat.DatacatUtilities;
+import org.hps.datacat.FileFormat;
 import org.srs.datacat.model.DatasetModel;
-import org.srs.datacat.model.DatasetView.VersionId;
-import org.srs.datacat.shared.Dataset;
-import org.srs.datacat.shared.Provider;
 
 /**
  * Datacat helper functions for the crawler.
@@ -28,39 +20,7 @@
 class DatacatHelper {
     
     private static final Logger LOGGER = Logger.getLogger(DatacatHelper.class.getPackage().getName());
-    
-    /*
-     * Default base URL for datacat.
-     */
-    static final String DATACAT_URL = "http://hpsweb.jlab.org/datacat/r";
-
-    /*
-     * Static map of strings to file formats.
-     */
-    private static final Map<String, FileFormat> FORMATS = new HashMap<String, FileFormat>();
-    static {
-        for (final FileFormat format : FileFormat.values()) {
-            FORMATS.put(format.extension(), format);
-        }
-    }
-    
-    /* 
-     * System metadata fields. 
-     */
-    static final Set<String> SYSTEM_METADATA = new HashSet<String>();
-    static {
-        SYSTEM_METADATA.add("eventCount");
-        SYSTEM_METADATA.add("size");
-        SYSTEM_METADATA.add("runMin");
-        SYSTEM_METADATA.add("runMax");
-        SYSTEM_METADATA.add("checksum");
-        SYSTEM_METADATA.add("scanStatus");
-    }
-    
-    static final boolean isSystemMetadata(String name) {
-        return SYSTEM_METADATA.contains(name);
-    }
-           
+                 
     /**
      * Create metadata for a file using its {@link FileMetadataReader}.
      *
@@ -68,7 +28,7 @@
      * @return the metadata for the file
      */
     static Map<String, Object> createMetadata(final File file) {
-        LOGGER.fine("creating metadata for " + file.getPath());
+        LOGGER.fine("creating metadata for " + file.getPath() + " ...");
         File actualFile = file;
         if (FileUtilities.isMssFile(file)) {
             actualFile = FileUtilities.getCachedFile(file);
@@ -133,7 +93,7 @@
             name = stripEvioFileNumber(name);
         }
         final String extension = name.substring(name.lastIndexOf(".") + 1);
-        return FORMATS.get(extension);
+        return FileFormat.findFormat(extension);
     }
 
     /**
@@ -172,72 +132,7 @@
         }
         return strippedName;
     }
-           
-    
-    
-    /**
-     * Create a dataset for insertion into the data catalog.
-     * 
-     * @param file the file on disk
-     * @param metadata the metadata map 
-     * @param folder the datacat folder
-     * @param site the datacat site
-     * @param dataType the data type 
-     * @param fileFormat the file format
-     * @return the created dataset
-     */
-    static DatasetModel createDataset(
-            File file,
-            Map<String, Object> metadata,
-            String folder,
-            String site,
-            String dataType,
-            String fileFormat) {
-        
-        LOGGER.info("creating dataset for " + file.getPath());
-        
-        Provider provider = new Provider();                                              
-        Dataset.Builder datasetBuilder = provider.getDatasetBuilder();
-        
-        // Set basic info on new dataset.
-        datasetBuilder.versionId(VersionId.valueOf("new"))
-            .master(true)
-            .name(file.getName())
-            .resource(file.getPath())
-            .dataType(dataType)
-            .fileFormat(fileFormat)
-            .site(site)
-            .scanStatus("OK");
-        
-        // Set system metadata from the provided metadata map.
-        if (metadata.get("eventCount") != null) {
-            datasetBuilder.eventCount((Long) metadata.get("eventCount"));
-        }
-        if (metadata.get("checksum") != null) {
-            datasetBuilder.checksum((String) metadata.get("checksum"));
-        }
-        if (metadata.get("runMin") != null) {                   
-            datasetBuilder.runMin((Long) metadata.get("runMin"));
-        }
-        if (metadata.get("runMax") != null) {
-            datasetBuilder.runMax((Long) metadata.get("runMax"));
-        }
-        if (metadata.get("size") != null) {
-            datasetBuilder.size((Long) metadata.get("size"));
-        }
-                                
-        // Create user metadata, leaving out system metadata fields.
-        Map<String, Object> userMetadata = new HashMap<String, Object>();
-        for (Entry<String, Object> metadataEntry : metadata.entrySet()) {
-            if (!SYSTEM_METADATA.contains(metadataEntry.getKey())) {
-                userMetadata.put(metadataEntry.getKey(), metadataEntry.getValue());
-            }
-        }
-        datasetBuilder.versionMetadata(userMetadata);
-        
-        return datasetBuilder.build();
-    }
-    
+                  
     /**
      * Create datasets from a list of files.
      * 
@@ -250,7 +145,7 @@
             Map<String, Object> metadata = createMetadata(file);
             DataType dataType = DatacatHelper.getDataType(file);
             FileFormat fileFormat = DatacatHelper.getFileFormat(file);
-            DatasetModel dataset = DatacatHelper.createDataset(
+            DatasetModel dataset = DatacatUtilities.createDataset(
                     file,
                     metadata,
                     folder,
@@ -260,28 +155,5 @@
             datasets.add(dataset);
         }
         return datasets;
-    }
-    
-    /**
-     * Add datasets to the data catalog.
-     * 
-     * @param datasets the list of datasets
-     * @param folder the target folder
-     * @param url the datacat URL
-     */
-    static void addDatasets(List<DatasetModel> datasets, String folder, String url) {
-        Client client = null;
-        try {
-            client = new ClientBuilder().setUrl(url).build();
-        } catch (URISyntaxException e) {
-            throw new RuntimeException("Bad datacat URL.", e);
-        }
-        for (DatasetModel dataset : datasets) {
-            try {
-                client.createDataset(folder, dataset);
-            } catch (Exception e) {
-                LOGGER.log(Level.SEVERE, e.getMessage(), e);
-            }
-        }
-    }
+    }    
 }

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/EvioMetadataReader.java	Thu Jul  7 15:52:41 2016
@@ -58,10 +58,11 @@
         Integer lastHeadTimestamp = null;
         Integer lastPhysicsEvent = null;
         Integer firstPhysicsEvent = null;
-        Integer prestartTimestamp = null;
+        Integer prestartTimestamp = null;        
         Integer endTimestamp = null;
         Integer goTimestamp = null;
         Double triggerRate = null;
+        Integer endEventCount = null;
         
         // Processor for calculating TI time offsets.
         TiTimeOffsetEvioProcessor tiProcessor = new TiTimeOffsetEvioProcessor();
@@ -168,16 +169,21 @@
                         ++physicsEvents;
                     } else if (EvioEventUtilities.isControlEvent(evioEvent)) {
                         int[] controlData = EvioEventUtilities.getControlEventData(evioEvent);
-                        if (controlData[0] != 0) {
-                            if (EventTagConstant.PRESTART.isEventTag(evioEvent)) {
-                                prestartTimestamp = controlData[0];
-                            }                        
-                            if (EventTagConstant.GO.isEventTag(evioEvent)) {
-                                goTimestamp = controlData[0];
+                        if (controlData != null) { /* Why is this null sometimes? */
+                            if (controlData[0] != 0) {
+                                if (EventTagConstant.PRESTART.isEventTag(evioEvent)) {
+                                    prestartTimestamp = controlData[0];
+                                }                        
+                                if (EventTagConstant.GO.isEventTag(evioEvent)) {
+                                    goTimestamp = controlData[0];
+                                }
+                                if (EventTagConstant.END.isEventTag(evioEvent)) {
+                                    endTimestamp = controlData[0];
+                                    endEventCount = controlData[2];
+                                }
                             }
-                            if (EventTagConstant.END.isEventTag(evioEvent)) {
-                                endTimestamp = controlData[0];
-                            }
+                        } else {
+                            LOGGER.warning("Event " + evioEvent.getEventNumber() + " is missing valid control data bank.");
                         }
                     }
 
@@ -193,9 +199,12 @@
                     tiProcessor.process(evioEvent);
                     
                 } catch (Exception e) {  
-                    // Trap all event processing errors.
+                    
+                    // Log event processing errors.
+                    LOGGER.log(Level.WARNING, "Error processing EVIO event " + evioEvent.getEventNumber(), e);
+                    
+                    // Increment bad event count.
                     badEvents++;
-                    LOGGER.warning("Error processing EVIO event " + evioEvent.getEventNumber());
                 }
             }
         } catch (final EvioException e) {
@@ -278,19 +287,23 @@
         if (goTimestamp != null) {
             metadataMap.put("GO_TIMESTAMP", goTimestamp);
         }
+        
+        if (endEventCount != null) {
+            metadataMap.put("END_EVENT_COUNT", endEventCount);
+        }
 
         // TI times and offset.
         metadataMap.put("TI_TIME_MIN_OFFSET", new Long(tiProcessor.getMinOffset()).toString());
         metadataMap.put("TI_TIME_MAX_OFFSET", new Long(tiProcessor.getMaxOffset()).toString());
         metadataMap.put("TI_TIME_N_OUTLIERS", tiProcessor.getNumOutliers());
         
-        // Event counts.
+        // Bad event count.
         metadataMap.put("BAD_EVENTS", badEvents);
         
         // Physics event count.
         metadataMap.put("PHYSICS_EVENTS", physicsEvents);
         
-        // Rough trigger rate.
+        // Rough trigger rate calculation.
         if (triggerRate != null && !Double.isInfinite(triggerRate) && !Double.isNaN(triggerRate)) {
             DecimalFormat df = new DecimalFormat("#.##");
             df.setRoundingMode(RoundingMode.CEILING);
@@ -323,7 +336,7 @@
      * @param firstTimestamp the first physics timestamp
      * @param lastTimestamp the last physics timestamp
      * @param events the number of physics events
-     * @return the trigger rate calculation in KHz
+     * @return the trigger rate in Hz
      */
     private double calculateTriggerRate(Integer firstTimestamp, Integer lastTimestamp, long events) {
         return ((double) events / ((double) lastTimestamp - (double) firstTimestamp));

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/FileFormatFilter.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/FileFormatFilter.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/FileFormatFilter.java	Thu Jul  7 15:52:41 2016
@@ -3,6 +3,8 @@
 import java.io.File;
 import java.io.FileFilter;
 import java.util.Set;
+
+import org.hps.datacat.FileFormat;
 
 /**
  * Filter files on their format.

Modified: java/trunk/crawler/src/main/java/org/hps/crawler/MetadataWriter.java
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/MetadataWriter.java	(original)
+++ java/trunk/crawler/src/main/java/org/hps/crawler/MetadataWriter.java	Thu Jul  7 15:52:41 2016
@@ -9,10 +9,11 @@
 import java.util.logging.Logger;
 
 import org.apache.commons.cli.CommandLine;
-import org.apache.commons.cli.PosixParser;
 import org.apache.commons.cli.HelpFormatter;
 import org.apache.commons.cli.Options;
 import org.apache.commons.cli.ParseException;
+import org.apache.commons.cli.PosixParser;
+import org.hps.datacat.DatacatConstants;
 
 /**
  * Creates metadata for a file and writes the results to a Python snippet that can be used as input to the SRS datacat.
@@ -95,7 +96,7 @@
     private static String toPyDict(Map<String, Object> metadata) {
         StringBuffer sb = new StringBuffer();
         sb.append("{");
-        for (String name : DatacatHelper.SYSTEM_METADATA) {
+        for (String name : DatacatConstants.getSystemMetadata()) {
             if (metadata.containsKey(name)) {
                 Object value = metadata.get(name);
                 if (value instanceof Number) {
@@ -108,7 +109,7 @@
         sb.setLength(sb.length() - 2);
         sb.append(", \"versionMetadata\" : {");
         for (Map.Entry<String, Object> entry : metadata.entrySet()) {
-            if (!DatacatHelper.isSystemMetadata(entry.getKey())) {
+            if (!DatacatConstants.isSystemMetadata(entry.getKey())) {
                Object value = entry.getValue();
                String name = entry.getKey();
                if (value instanceof Number) {

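The MetadataWriter change above swaps DatacatHelper for DatacatConstants when splitting metadata into top-level system fields and a nested "versionMetadata" dict. A self-contained sketch of that split (class name and field set are assumptions for illustration; the real list lives in DatacatConstants):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;
import java.util.StringJoiner;

public final class PyDictSketch {

    // Hypothetical stand-in for DatacatConstants.getSystemMetadata().
    private static final Set<String> SYSTEM =
            Set.of("eventCount", "size", "runMin", "runMax", "checksum", "scanStatus");

    // Numbers are emitted bare; everything else is quoted, as in the Python snippet format.
    private static String format(Object value) {
        return value instanceof Number ? value.toString() : "\"" + value + "\"";
    }

    // System fields go at the top level; the rest is nested under "versionMetadata".
    public static String toPyDict(Map<String, Object> metadata) {
        StringJoiner top = new StringJoiner(", ");
        StringJoiner user = new StringJoiner(", ");
        for (Map.Entry<String, Object> e : metadata.entrySet()) {
            String entry = "\"" + e.getKey() + "\" : " + format(e.getValue());
            (SYSTEM.contains(e.getKey()) ? top : user).add(entry);
        }
        return "{" + (top.length() == 0 ? "" : top + ", ")
                + "\"versionMetadata\" : {" + user + "}}";
    }

    public static void main(String[] args) {
        Map<String, Object> md = new LinkedHashMap<>();
        md.put("eventCount", 1000L);
        md.put("TRIGGER_RATE", 18.5);
        System.out.println(toPyDict(md));
    }
}
```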
Added: java/trunk/datacat/pom.xml
 =============================================================================
--- java/trunk/datacat/pom.xml	(added)
+++ java/trunk/datacat/pom.xml	Thu Jul  7 15:52:41 2016
@@ -0,0 +1,27 @@
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <artifactId>hps-datacat</artifactId>
+    <name>datacat</name>
+    <description>SRS data catalog utilities</description>
+    <parent>
+        <groupId>org.hps</groupId>
+        <artifactId>hps-parent</artifactId>
+        <relativePath>../parent/pom.xml</relativePath>
+        <version>3.10-SNAPSHOT</version>
+    </parent>
+    <scm>
+        <url>http://java.freehep.org/svn/repos/hps/list/java/trunk/datacat/</url>
+        <connection>scm:svn:svn://svn.freehep.org/hps/java/trunk/datacat/</connection>
+        <developerConnection>scm:svn:svn://svn.freehep.org/hps/java/trunk/datacat/</developerConnection>
+    </scm>    
+    <dependencies>
+        <dependency>
+            <groupId>srs</groupId>
+            <artifactId>org-srs-datacat-client</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.hps</groupId>
+            <artifactId>hps-logging</artifactId>
+        </dependency>
+    </dependencies>
+</project>

Copied: java/trunk/datacat/src/main/java/org/hps/datacat/DataType.java (from r4415, java/trunk/crawler/src/main/java/org/hps/crawler/DataType.java)
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/DataType.java	(original)
+++ java/trunk/datacat/src/main/java/org/hps/datacat/DataType.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.crawler;
+package org.hps.datacat;
 
 /**
  * Dataset types for HPS.

Added: java/trunk/datacat/src/main/java/org/hps/datacat/DatacatConstants.java
 =============================================================================
--- java/trunk/datacat/src/main/java/org/hps/datacat/DatacatConstants.java	(added)
+++ java/trunk/datacat/src/main/java/org/hps/datacat/DatacatConstants.java	Thu Jul  7 15:52:41 2016
@@ -0,0 +1,81 @@
+package org.hps.datacat;
+
+import java.util.HashSet;
+import java.util.Set;
+
+/**
+ * Static constants for use with the Data Catalog.
+ * @author jeremym
+ */
+public final class DatacatConstants {
+
+    /*
+     * Default base URL for datacat.
+     */
+    public static final String DATACAT_URL = "http://hpsweb.jlab.org/datacat/r";
+    
+    /*
+     * Datacat folder where the EVIO files reside.
+     */
+    public static final String RAW_DATA_FOLDER = "/HPS/data/raw";
+    
+    /*
+     * Default datacat site.
+     */
+    public static final Site DEFAULT_SITE = Site.JLAB;
+    
+    /*
+     * The set of system metadata fields that are always set for each file.
+     */
+    private static final Set<String> SYSTEM_METADATA = new HashSet<String>();
+    static {
+        SYSTEM_METADATA.add("eventCount");
+        SYSTEM_METADATA.add("size");
+        SYSTEM_METADATA.add("runMin");
+        SYSTEM_METADATA.add("runMax");
+        SYSTEM_METADATA.add("checksum");
+        SYSTEM_METADATA.add("scanStatus");
+    }
+    
+    /* metadata fields that should be included in search results */
+    public static final String[] EVIO_METADATA = {
+        "BAD_EVENTS",
+        "BLINDED",
+        "END_EVENT_COUNT",
+        "END_TIMESTAMP",
+        "FILE",
+        "FIRST_HEAD_TIMESTAMP",
+        "FIRST_PHYSICS_EVENT",
+        "LAST_HEAD_TIMESTAMP",
+        "LAST_PHYSICS_EVENT",
+        "LED_COSMIC",
+        "PAIRS0",
+        "PAIRS1",
+        "PHYSICS_EVENTS",
+        "PULSER",
+        "SINGLES0",
+        "SINGLES1",
+        "TI_TIME_MAX_OFFSET",
+        "TI_TIME_MIN_OFFSET",
+        "TI_TIME_N_OUTLIERS",
+        "TRIGGER_RATE"
+    };
+    
+    /**
+     * Get the set of system metadata field names.
+     * 
+     * @return the set of system metadata field names
+     */
+    public static Set<String> getSystemMetadata() {
+        return SYSTEM_METADATA;
+    }
+    
+    /**
+     * Return <code>true</code> if the metadata field is system metadata.
+     * @param name the metadata field name
+     * @return <code>true</code> if the metadata field is system metadata
+     */
+    public static final boolean isSystemMetadata(String name) {
+        return SYSTEM_METADATA.contains(name);
+    }
+}

Copied: java/trunk/datacat/src/main/java/org/hps/datacat/DatacatPrintRun.java (from r4415, java/trunk/crawler/src/main/java/org/hps/crawler/DatacatPrintRun.java)
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/DatacatPrintRun.java	(original)
+++ java/trunk/datacat/src/main/java/org/hps/datacat/DatacatPrintRun.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.crawler;
+package org.hps.datacat;
 
 import java.util.Map;
 import java.util.SortedSet;
@@ -15,32 +15,8 @@
  * Example of printing information from all files for a given run in the datacat.
  * @author jeremym
  */
-public class DatacatPrintRun {
-    
-    /* metadata fields that should be included in search results */
-    private static final String[] METADATA_FIELDS = {
-        "BAD_EVENTS",
-        "BLINDED",
-        "END_EVENT_COUNT",
-        "END_TIMESTAMP",
-        "FILE",
-        "FIRST_HEAD_TIMESTAMP",
-        "FIRST_PHYSICS_EVENT",
-        "LAST_HEAD_TIMESTAMP",
-        "LAST_PHYSICS_EVENT",
-        "LED_COSMIC",
-        "PAIRS0",
-        "PAIRS1",
-        "PHYSICS_EVENTS",
-        "PULSER",
-        "SINGLES0",
-        "SINGLES1",
-        "TI_TIME_MAX_OFFSET",
-        "TI_TIME_MIN_OFFSET",
-        "TI_TIME_N_OUTLIERS",
-        "TRIGGER_RATE"
-    };
-    
+public final class DatacatPrintRun {
+            
     public static void main(String[] args) throws Exception {
         if (args.length == 0) {
             throw new RuntimeException("Missing run number argument.");
@@ -61,7 +37,7 @@
                 "JLAB",
                 "fileFormat eq 'EVIO' AND dataType eq 'RAW' AND runMin eq " + run, /* basic query */
                 new String[] {"FILE"}, /* sort on file number */
-                METADATA_FIELDS /* metadata field values to return from query */
+                DatacatConstants.EVIO_METADATA /* metadata field values to return from query */
                 );
         
         /* print results including metadata */

Added: java/trunk/datacat/src/main/java/org/hps/datacat/DatacatUtilities.java
 =============================================================================
--- java/trunk/datacat/src/main/java/org/hps/datacat/DatacatUtilities.java	(added)
+++ java/trunk/datacat/src/main/java/org/hps/datacat/DatacatUtilities.java	Thu Jul  7 15:52:41 2016
@@ -0,0 +1,166 @@
+package org.hps.datacat;
+
+import java.io.File;
+import java.net.URISyntaxException;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Map.Entry;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+import org.srs.datacat.client.Client;
+import org.srs.datacat.client.ClientBuilder;
+import org.srs.datacat.model.DatasetModel;
+import org.srs.datacat.model.DatasetResultSetModel;
+import org.srs.datacat.model.DatasetView.VersionId;
+import org.srs.datacat.model.dataset.DatasetWithViewModel;
+import org.srs.datacat.shared.Dataset;
+import org.srs.datacat.shared.Provider;
+
+/**
+ * Data Catalog utility functions.
+ * @author jeremym
+ */
+public class DatacatUtilities {
+    
+    private static final Logger LOGGER = Logger.getLogger(DatacatUtilities.class.getPackage().getName());
+                  
+    /**
+     * Add datasets to the data catalog or patch existing ones.
+     * 
+     * @param datasets the list of datasets
+     * @param folder the target folder
+     * @param url the datacat URL
+     * @param patch <code>true</code> to allow patching existing datasets
+     */
+    public static final void updateDatasets(List<DatasetModel> datasets, String folder, String url, boolean patch) {
+        int nUpdated = 0;
+        Client client = null;
+        try {
+            client = new ClientBuilder().setUrl(url).build();
+        } catch (URISyntaxException e) {
+            throw new RuntimeException("Invalid datacat URL.", e);
+        }
+        for (DatasetModel dataset : datasets) {
+            try {
+                if (client.exists(folder + "/" + dataset.getName())) {
+                    
+                    // Throw an error if patching is not allowed.
+                    if (!patch) {
+                        throw new RuntimeException("Dataset " + folder + "/" + dataset.getName() + " already exists and patching is disabled.");
+                    }
+                    
+                    LOGGER.info("patching existing dataset " + folder + "/" + dataset.getName());
+                
+                    String site = 
+                            ((DatasetWithViewModel) dataset).getViewInfo().getLocations().iterator().next().getSite();                                                                               
+                    client.patchDataset(folder + "/" + dataset.getName(), "current", site, dataset);
+                    
+                } else {
+                    LOGGER.info("creating new dataset for " + folder + "/" + dataset.getName());
+                    client.createDataset(folder, dataset);                                       
+                }
+            } catch (Exception e) {
+                LOGGER.log(Level.SEVERE, e.getMessage(), e);
+            }
+            ++nUpdated;
+        }
+        LOGGER.info("Inserted or updated " + nUpdated + " datasets.");
+    }
+    
+    /**
+     * Create a dataset for insertion into the data catalog.
+     * 
+     * @param file the file on disk
+     * @param metadata the metadata map 
+     * @param folder the datacat folder
+     * @param site the datacat site
+     * @param dataType the data type 
+     * @param fileFormat the file format
+     * @return the created dataset
+     */
+    public static final DatasetModel createDataset(
+            File file,
+            Map<String, Object> metadata,
+            String folder,
+            String site,
+            String dataType,
+            String fileFormat) {
+        
+        LOGGER.info("creating dataset for " + file.getPath());
+        
+        Provider provider = new Provider();                                              
+        Dataset.Builder datasetBuilder = provider.getDatasetBuilder();
+        
+        // Set basic info on new dataset.
+        datasetBuilder.versionId(VersionId.valueOf("new"))
+            .master(true)
+            .name(file.getName())
+            .resource(file.getPath())
+            .dataType(dataType)
+            .fileFormat(fileFormat)
+            .site(site)
+            .scanStatus("OK");
+        
+        // Set system metadata from the provided metadata map.
+        if (metadata.get("eventCount") != null) {
+            datasetBuilder.eventCount((Long) metadata.get("eventCount"));
+        }
+        if (metadata.get("checksum") != null) {
+            datasetBuilder.checksum((String) metadata.get("checksum"));
+        }
+        if (metadata.get("runMin") != null) {                   
+            datasetBuilder.runMin((Long) metadata.get("runMin"));
+        }
+        if (metadata.get("runMax") != null) {
+            datasetBuilder.runMax((Long) metadata.get("runMax"));
+        }
+        if (metadata.get("size") != null) {
+            datasetBuilder.size((Long) metadata.get("size"));
+        }
+                                
+        // Create user metadata, leaving out system metadata fields.
+        Map<String, Object> userMetadata = new HashMap<String, Object>();
+        for (Entry<String, Object> metadataEntry : metadata.entrySet()) {
+            if (!DatacatConstants.isSystemMetadata(metadataEntry.getKey())) {
+                userMetadata.put(metadataEntry.getKey(), metadataEntry.getValue());
+            }
+        }
+        datasetBuilder.versionMetadata(userMetadata);       
+        return datasetBuilder.build();
+    }
+    
+    public static Client createDefaultClient() {
+        try {
+            return new ClientBuilder().setUrl(DatacatConstants.DATACAT_URL).build();
+        } catch (URISyntaxException e) {
+            throw new RuntimeException("Error initializing datacat client.", e);
+        }
+    }
+    
+    public static DatasetResultSetModel findEvioDatasets(Client client, String folder, Site site, String[] metadata, String[] sort, int run) {
+        if (client == null) {
+            client = createDefaultClient();
+        }
+        return client.searchForDatasets(
+                folder,
+                "current", /* dataset version */
+                site.toString(),
+                "fileFormat eq 'EVIO' AND dataType eq 'RAW' AND runMin eq " + run, /* basic query */
+                sort, /* sort on file number */
+                metadata /* metadata field values to return from query */
+                );
+    }
+    
+    public static DatasetResultSetModel findEvioDatasets(int run) {        
+        return findEvioDatasets(
+                null,
+                DatacatConstants.RAW_DATA_FOLDER,
+                DatacatConstants.DEFAULT_SITE,
+                DatacatConstants.EVIO_METADATA,
+                new String[] {"FILE"},
+                run
+                );
+    }
+}

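The create-or-patch logic in updateDatasets() above is an upsert: patch when the dataset path already exists (and patching is enabled), create otherwise. A toy sketch of the same control flow against a map-backed stand-in for the datacat client (all names hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

public final class UpsertSketch {

    // Toy stand-in for the datacat client: a path -> dataset map.
    private final Map<String, String> catalog = new HashMap<>();

    // Mirrors updateDatasets(): patch when the path exists (if allowed), create otherwise.
    public String upsert(String path, String dataset, boolean patch) {
        if (catalog.containsKey(path)) {
            if (!patch) {
                throw new RuntimeException("Dataset " + path
                        + " already exists and patching is disabled.");
            }
            catalog.put(path, dataset);
            return "patched";
        }
        catalog.put(path, dataset);
        return "created";
    }

    public static void main(String[] args) {
        UpsertSketch u = new UpsertSketch();
        System.out.println(u.upsert("/HPS/data/raw/run1.evio", "v1", true)); // created
        System.out.println(u.upsert("/HPS/data/raw/run1.evio", "v2", true)); // patched
    }
}
```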
Added: java/trunk/datacat/src/main/java/org/hps/datacat/FileEventRange.java
 =============================================================================
--- java/trunk/datacat/src/main/java/org/hps/datacat/FileEventRange.java	(added)
+++ java/trunk/datacat/src/main/java/org/hps/datacat/FileEventRange.java	Thu Jul  7 15:52:41 2016
@@ -0,0 +1,67 @@
+package org.hps.datacat;
+
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+
+import org.srs.datacat.model.DatasetModel;
+import org.srs.datacat.model.DatasetResultSetModel;
+import org.srs.datacat.model.dataset.DatasetWithViewModel;
+import org.srs.datacat.shared.DatasetLocation;
+
+/**
+ * Utility class for associating a file in the datacat to its event ID range.
+ * @author jeremym
+ */
+public final class FileEventRange {
+    
+    private long startEvent;
+    private long endEvent;
+    private String path;
+    
+    FileEventRange(long startEvent, long endEvent, String path) {
+        this.startEvent = startEvent;
+        this.endEvent = endEvent;
+        this.path = path;
+    }
+    
+    public String getPath() {
+        return path;
+    }
+    
+    public long getStartEvent() {
+        return startEvent;
+    }
+    
+    public long getEndEvent() {
+        return endEvent;
+    }
+    
+    public boolean matches(long eventId) {
+        return eventId >= startEvent && eventId <= endEvent;
+    }
+    
+    public static List<FileEventRange> createEventRanges(DatasetResultSetModel results) {
+        List<FileEventRange> ranges = new ArrayList<FileEventRange>();
+        for (DatasetModel ds : results) {
+            DatasetWithViewModel view = (DatasetWithViewModel) ds;
+            Map<String, Object> metadata = view.getMetadataMap();
+            long firstPhysicsEvent = (Long) metadata.get("FIRST_PHYSICS_EVENT");
+            long lastPhysicsEvent = (Long) metadata.get("LAST_PHYSICS_EVENT");
+            DatasetLocation loc = (DatasetLocation) view.getViewInfo().getLocations().iterator().next();
+            ranges.add(new FileEventRange(firstPhysicsEvent, lastPhysicsEvent, loc.getPath()));
+        }
+        return ranges;
+    }
+    
+    public static FileEventRange findEventRange(List<FileEventRange> ranges, long eventId) {
+        FileEventRange match = null;
+        for (FileEventRange range : ranges) {
+            if (range.matches(eventId)) {
+                match = range;
+                break;
+            }
+        }
+        return match;
+    }
+}

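The new FileEventRange class above maps a file path to its inclusive [first, last] physics-event range and locates the file covering a given event ID by linear search over the ranges. A minimal self-contained sketch of the pattern (class, field, and file names are illustrative assumptions):

```java
import java.util.Arrays;
import java.util.List;

public final class EventRangeSketch {

    // Minimal stand-in for FileEventRange: an inclusive [start, end] event ID range per file.
    public static final class Range {
        final long start;
        final long end;
        final String path;

        public Range(long start, long end, String path) {
            this.start = start;
            this.end = end;
            this.path = path;
        }

        public boolean matches(long eventId) {
            return eventId >= start && eventId <= end;
        }
    }

    // Linear search over the ranges; returns null when no file covers the event.
    public static String findPath(List<Range> ranges, long eventId) {
        for (Range range : ranges) {
            if (range.matches(eventId)) {
                return range.path;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<Range> ranges = Arrays.asList(
                new Range(1, 1000, "hps_005772.evio.0"),
                new Range(1001, 2000, "hps_005772.evio.1"));
        System.out.println(findPath(ranges, 1500)); // hps_005772.evio.1
        System.out.println(findPath(ranges, 5000)); // null
    }
}
```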
Copied: java/trunk/datacat/src/main/java/org/hps/datacat/FileFormat.java (from r4415, java/trunk/crawler/src/main/java/org/hps/crawler/FileFormat.java)
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/FileFormat.java	(original)
+++ java/trunk/datacat/src/main/java/org/hps/datacat/FileFormat.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,7 @@
-package org.hps.crawler;
+package org.hps.datacat;
+
+import java.util.HashMap;
+import java.util.Map;
 
 
 /**
@@ -28,7 +31,14 @@
      * Testing only (do not use in production).
      */
     TEST(null);
-                
+    
+    private static final Map<String, FileFormat> FORMAT_EXTENSIONS = new HashMap<String, FileFormat>();
+    static {
+        for (final FileFormat format : FileFormat.values()) {
+            FORMAT_EXTENSIONS.put(format.extension(), format);
+        }
+    }
+            
     /**
      * The file extension of the format.
      */
@@ -58,4 +68,13 @@
     public String extension() {
         return extension;
     }        
+    
+    /**
+     * Find a format by its file extension.
+     * @param extension the file extension
+     * @return the format for the extension or <code>null</code> if it does not exist
+     */
+    public static FileFormat findFormat(String extension) {
+        return FORMAT_EXTENSIONS.get(extension);
+    }
 }

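The FileFormat change above adds an extension-to-constant lookup map populated once in a static initializer over values(). A toy enum showing the same pattern (the constants and extensions here are invented for illustration; only EVIO is confirmed by the diff):

```java
import java.util.HashMap;
import java.util.Map;

public final class FormatLookupSketch {

    // Toy version of the FileFormat enum with an extension -> constant lookup map
    // built once in a static initializer, as in the commit above.
    public enum Format {
        EVIO("evio"), LCIO("slcio"), ROOT("root");

        private static final Map<String, Format> BY_EXTENSION = new HashMap<>();
        static {
            // Enum constants are initialized before the static block runs,
            // so values() is safe to use here.
            for (Format f : values()) {
                BY_EXTENSION.put(f.extension, f);
            }
        }

        private final String extension;

        Format(String extension) {
            this.extension = extension;
        }

        // Returns null when no format has the given extension.
        public static Format findFormat(String extension) {
            return BY_EXTENSION.get(extension);
        }
    }

    public static void main(String[] args) {
        System.out.println(Format.findFormat("evio")); // EVIO
        System.out.println(Format.findFormat("txt"));  // null
    }
}
```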
Copied: java/trunk/datacat/src/main/java/org/hps/datacat/Site.java (from r4415, java/trunk/crawler/src/main/java/org/hps/crawler/Site.java)
 =============================================================================
--- java/trunk/crawler/src/main/java/org/hps/crawler/Site.java	(original)
+++ java/trunk/datacat/src/main/java/org/hps/datacat/Site.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.crawler;
+package org.hps.datacat;
 
 /**
  * Site of a dataset (SLAC or JLAB).

Modified: java/trunk/evio/src/main/java/org/hps/evio/EvioToLcio.java
 =============================================================================
--- java/trunk/evio/src/main/java/org/hps/evio/EvioToLcio.java	(original)
+++ java/trunk/evio/src/main/java/org/hps/evio/EvioToLcio.java	Thu Jul  7 15:52:41 2016
@@ -29,7 +29,7 @@
 import org.hps.record.LCSimEventBuilder;
 import org.hps.record.evio.EvioEventQueue;
 import org.hps.record.evio.EvioEventUtilities;
-import org.hps.run.database.RunManager;
+import org.hps.rundb.RunManager;
 import org.jlab.coda.jevio.BaseStructure;
 import org.jlab.coda.jevio.EvioEvent;
 import org.jlab.coda.jevio.EvioException;

Modified: java/trunk/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java
 =============================================================================
--- java/trunk/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java	(original)
+++ java/trunk/evio/src/main/java/org/hps/evio/LCSimEngRunEventBuilder.java	Thu Jul  7 15:52:41 2016
@@ -17,7 +17,7 @@
 import org.hps.record.triggerbank.SSPData;
 import org.hps.record.triggerbank.TDCData;
 import org.hps.record.triggerbank.TIData;
-import org.hps.run.database.RunManager;
+import org.hps.rundb.RunManager;
 import org.jlab.coda.jevio.EvioEvent;
 import org.lcsim.conditions.ConditionsEvent;
 import org.lcsim.event.EventHeader;

Modified: java/trunk/evio/src/test/java/org/hps/evio/ScalersTest.java
 =============================================================================
--- java/trunk/evio/src/test/java/org/hps/evio/ScalersTest.java	(original)
+++ java/trunk/evio/src/test/java/org/hps/evio/ScalersTest.java	Thu Jul  7 15:52:41 2016
@@ -15,7 +15,7 @@
 import org.hps.record.scalers.ScalerData;
 import org.hps.record.scalers.ScalerUtilities;
 import org.hps.record.scalers.ScalerUtilities.LiveTimeIndex;
-import org.hps.run.database.RunManager;
+import org.hps.rundb.RunManager;
 import org.lcsim.event.EventHeader;
 import org.lcsim.util.Driver;
 import org.lcsim.util.cache.FileCache;

Modified: java/trunk/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java
 =============================================================================
--- java/trunk/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java	(original)
+++ java/trunk/job/src/main/java/org/hps/job/DatabaseConditionsManagerSetup.java	Thu Jul  7 15:52:41 2016
@@ -8,15 +8,13 @@
 
 import org.hps.conditions.database.DatabaseConditionsManager;
 import org.hps.detector.svt.SvtDetectorSetup;
-import org.hps.run.database.RunManager;
+import org.hps.rundb.RunManager;
 import org.lcsim.conditions.ConditionsListener;
 import org.lcsim.job.DefaultConditionsSetup;
 
 /**
  * Provides setup for HPS specific conditions manager.
- * 
- * @author Jeremy McCormick, SLAC
- *
+ * @author jeremym
  */
 public final class DatabaseConditionsManagerSetup extends DefaultConditionsSetup {
 

Modified: java/trunk/parent/pom.xml
 =============================================================================
--- java/trunk/parent/pom.xml	(original)
+++ java/trunk/parent/pom.xml	Thu Jul  7 15:52:41 2016
@@ -254,6 +254,11 @@
                 <artifactId>hps-integration-tests</artifactId>
                 <version>3.10-SNAPSHOT</version>
             </dependency>
+            <dependency>
+                <groupId>org.hps</groupId>
+                <artifactId>hps-datacat</artifactId>
+                <version>3.10-SNAPSHOT</version>
+            </dependency>
             <!-- Next are external dependencies used in multiple modules. -->
             <dependency>
                 <groupId>org.jlab.coda</groupId>
@@ -298,7 +303,7 @@
             <dependency>
                 <groupId>srs</groupId>
                 <artifactId>org-srs-datacat-client</artifactId>
-                <version>0.5-TEST3</version>
+                <version>0.5-SNAPSHOT</version>
             </dependency>
         </dependencies>
     </dependencyManagement>

Modified: java/trunk/pom.xml
 =============================================================================
--- java/trunk/pom.xml	(original)
+++ java/trunk/pom.xml	Thu Jul  7 15:52:41 2016
@@ -107,6 +107,7 @@
         <module>analysis</module>
         <module>conditions</module>
         <module>crawler</module>
+        <module>datacat</module>
         <module>detector-data</module>
         <module>detector-model</module>
         <module>distribution</module>

Copied: java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TriggerConfigEvioProcessor.java (from r4415, java/trunk/record-util/src/main/java/org/hps/record/daqconfig/TriggerConfigEvioProcessor.java)
 =============================================================================
--- java/trunk/record-util/src/main/java/org/hps/record/daqconfig/TriggerConfigEvioProcessor.java	(original)
+++ java/trunk/record-util/src/main/java/org/hps/record/triggerbank/TriggerConfigEvioProcessor.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.record.daqconfig;
+package org.hps.record.triggerbank;
 
 import java.util.HashMap;
 import java.util.Map;
@@ -8,7 +8,6 @@
 import org.hps.record.evio.EvioBankTag;
 import org.hps.record.evio.EvioEventProcessor;
 import org.hps.record.evio.EvioEventUtilities;
-import org.hps.record.triggerbank.TriggerConfigData;
 import org.hps.record.triggerbank.TriggerConfigData.Crate;
 import org.jlab.coda.jevio.BaseStructure;
 import org.jlab.coda.jevio.EvioEvent;
@@ -91,29 +90,19 @@
                                     // Add string data to map.
                                     stringData.put(crate, subBank.getStringData()[0]);
                                     LOGGER.info("Added crate " + crate.getCrateNumber() + " data ..." + '\n' + subBank.getStringData()[0]);
-                                } /*else { 
-                                    LOGGER.warning("The string bank has no data.");
-                                }*/
+                                }
                             } 
                         } catch (Exception e) {
                             LOGGER.log(Level.SEVERE, "Error parsing DAQ config from crate " + crateNumber, e);
                             e.printStackTrace();
                         }
                     }
-                } /*else {
-                    LOGGER.warning("Trigger config bank is missing string data.");
-                }*/
+                } 
             }
         }
         if (stringData != null) {
             LOGGER.info("Found " + stringData.size() + " config data strings in event " + evioEvent.getEventNumber());
-            TriggerConfigData currentConfig = new TriggerConfigData(stringData, timestamp);
-            if (currentConfig.isValid()) {
-                triggerConfig = currentConfig;
-                LOGGER.info("Found valid DAQ config data in event num " + evioEvent.getEventNumber());
-            } else {
-                LOGGER.warning("Skipping invalid DAQ config data in event num "  + evioEvent.getEventNumber());
-            }
+            triggerConfig = new TriggerConfigData(stringData, timestamp);
         }
     }
    

Modified: java/trunk/run-database/pom.xml
 =============================================================================
--- java/trunk/run-database/pom.xml	(original)
+++ java/trunk/run-database/pom.xml	Thu Jul  7 15:52:41 2016
@@ -20,6 +20,10 @@
             <artifactId>hps-record-util</artifactId>
         </dependency>
         <dependency>
+            <groupId>org.hps</groupId>
+            <artifactId>hps-datacat</artifactId>
+        </dependency>
+        <dependency>
             <groupId>srs</groupId>
             <artifactId>org-srs-datacat-client</artifactId>
         </dependency>

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/DaoProvider.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/DaoProvider.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/DaoProvider.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/DaoProvider.java	Thu Jul  7 15:52:41 2016
@@ -1,14 +1,13 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.sql.Connection;
 import java.sql.SQLException;
 
 /**
  * Provider for creating database API objects for interacting with the run database.
- *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
-final class DaoProvider {
+public final class DaoProvider {
 
     /**
      * The database connection.
@@ -28,7 +27,7 @@
      *
      * @param connection the database connection
      */
-    DaoProvider(final Connection connection) {
+    public DaoProvider(final Connection connection) {
         if (connection == null) {
             throw new IllegalArgumentException("The connection is null.");
         }
@@ -47,7 +46,7 @@
      *
      * @return the EPICS DAO
      */
-    EpicsDataDao getEpicsDataDao() {
+    public EpicsDataDao getEpicsDataDao() {
         if (epicsDao == null) {
             epicsDao = new EpicsDataDaoImpl(connection); 
         }
@@ -59,7 +58,7 @@
      *
      * @return the EPICS variable DAO
      */
-    EpicsVariableDao getEpicsVariableDao() {
+    public EpicsVariableDao getEpicsVariableDao() {
         if (epicsVariableDao == null) {
             epicsVariableDao = new EpicsVariableDaoImpl(connection); 
         }
@@ -71,7 +70,7 @@
      *
      * @return the run summary DAO
      */
-    RunSummaryDao getRunSummaryDao() {
+    public RunSummaryDao getRunSummaryDao() {
         if (runSummaryDao == null) {
             runSummaryDao = new RunSummaryDaoImpl(connection); 
         }
@@ -83,7 +82,7 @@
      *
      * @return the scaler data DAO
      */
-    ScalerDataDao getScalerDataDao() {
+    public ScalerDataDao getScalerDataDao() {
         if (scalerDao == null) {
             scalerDao = new ScalerDataDaoImpl(connection); 
         }
@@ -95,7 +94,7 @@
      * 
      * @return the SVT config DAO
      */
-    SvtConfigDao getSvtConfigDao() {
+    public SvtConfigDao getSvtConfigDao() {
         if (svtDao == null) {
             svtDao = new SvtConfigDaoImpl(connection); 
         }
@@ -107,7 +106,7 @@
      * 
      * @return the trigger config DAO
      */
-    TriggerConfigDao getTriggerConfigDao() {
+    public TriggerConfigDao getTriggerConfigDao() {
         if (configDao == null) {
             configDao = new TriggerConfigDaoImpl(connection); 
         }

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/EpicsDataDao.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDao.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDao.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/EpicsDataDao.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.util.List;
 
@@ -6,10 +6,9 @@
 
 /**
  * Database Access Object (DAO) API for EPICS data from the run database.
- *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
-interface EpicsDataDao {
+public interface EpicsDataDao {
 
     /**
      * Delete all EPICS data for a run from the database.

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/EpicsDataDaoImpl.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsDataDaoImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/EpicsDataDaoImpl.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.sql.Connection;
 import java.sql.PreparedStatement;
@@ -17,7 +17,7 @@
 /**
  * Implementation of database operations for EPICS data.
  *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
 final class EpicsDataDaoImpl implements EpicsDataDao {
 

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/EpicsType.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsType.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsType.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/EpicsType.java	Thu Jul  7 15:52:41 2016
@@ -1,14 +1,13 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import org.hps.record.epics.EpicsData;
 
 /**
- * Enum for representing different types of EPICS data in the run database, of which there are currently two (2s and
- * 20s).
+ * Enum for representing different types of EPICS data in the run database, of which there are currently two 
+ * (2s and 20s).
  *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
-// FIXME: move to record-util
 public enum EpicsType {
 
     /**

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/EpicsVariable.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariable.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariable.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/EpicsVariable.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 /**
  * Information about an EPICS variable including its name in the EPICS database, column name for the run database,
@@ -7,9 +7,10 @@
  * This class is used to represent data from the <i>epics_variables</i> table in the run database.
  *
  * @see EpicsType
- * @see org.hps.run.database.EpicsVariableDao
- * @see org.hps.run.database.EpicsVariableDaoImpl
- * @author Jeremy McCormick, SLAC
+ * @see org.hps.rundb.EpicsVariableDao
+ * @see org.hps.rundb.EpicsVariableDaoImpl
+ * 
+ * @author jeremym
  */
 public final class EpicsVariable {
 

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/EpicsVariableDao.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariableDao.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariableDao.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/EpicsVariableDao.java	Thu Jul  7 15:52:41 2016
@@ -1,11 +1,11 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.util.List;
 
 /**
  * Database interface for EPICS variables.
  * 
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
 interface EpicsVariableDao {
 

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/EpicsVariableDaoImpl.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariableDaoImpl.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EpicsVariableDaoImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/EpicsVariableDaoImpl.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.sql.Connection;
 import java.sql.PreparedStatement;
@@ -11,7 +11,7 @@
 /**
  * Implementation of database interface for EPICS variable information in the run database.
  *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
 final class EpicsVariableDaoImpl implements EpicsVariableDao {
 

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/RunManager.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunManager.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunManager.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/RunManager.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.sql.Connection;
 import java.sql.SQLException;
@@ -17,7 +17,7 @@
 /**
  * Manages access to the run database.
  *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
 public final class RunManager implements ConditionsListener {
 
@@ -112,7 +112,7 @@
      * Return the database connection.     
      * @return the database connection
      */
-    Connection getConnection() {
+    public Connection getConnection() {
         return this.connection;
     }
 
@@ -208,7 +208,7 @@
      * @param runSummary the run summary to update
      * @param replaceExisting <code>true</code> to allow an existing run summary to be replaced
      */
-    void updateRunSummary(RunSummary runSummary, boolean replaceExisting) {
+    public void updateRunSummary(RunSummary runSummary, boolean replaceExisting) {
         final RunSummaryDao runSummaryDao = factory.getRunSummaryDao();
         RunManager runManager = new RunManager();
         runManager.setRun(runSummary.getRun());
@@ -228,7 +228,7 @@
      * @param triggerConfig the trigger config
      * @param replaceExisting <code>true</code> to allow an existing trigger to be replaced
      */
-    void updateTriggerConfig(TriggerConfigData triggerConfig, boolean replaceExisting) {
+    public void updateTriggerConfig(TriggerConfigData triggerConfig, boolean replaceExisting) {
         final TriggerConfigDao configDao = factory.getTriggerConfigDao();
         if (configDao.getTriggerConfig(run) != null) {
             if (replaceExisting) {
@@ -244,7 +244,7 @@
      * Create or replace EPICS data for the run.
      * @param epicsData the EPICS data
      */
-    void updateEpicsData(List<EpicsData> epicsData) {
+    public void updateEpicsData(List<EpicsData> epicsData) {
         if (epicsData != null && !epicsData.isEmpty()) {
             factory.getEpicsDataDao().insertEpicsData(epicsData, this.run);
         }
@@ -254,7 +254,7 @@
      * Create or replace scaler data for the run.
      * @param scalerData the scaler data
      */
-    void updateScalerData(List<ScalerData> scalerData) {
+    public void updateScalerData(List<ScalerData> scalerData) {
         if (scalerData != null) {
             factory.getScalerDataDao().insertScalerData(scalerData, this.run);
         } 
@@ -264,7 +264,7 @@
      * Delete a run from the database.
      * @param run the run number
      */
-    void deleteRun() {        
+    public void deleteRun() {        
         factory.getEpicsDataDao().deleteEpicsData(EpicsType.EPICS_2S, run);
         factory.getEpicsDataDao().deleteEpicsData(EpicsType.EPICS_20S, run);
         factory.getScalerDataDao().deleteScalerData(run);

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/RunSummary.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunSummary.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunSummary.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/RunSummary.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.util.Date;
 
@@ -6,12 +6,13 @@
  * This is an API for accessing run summary information which is persisted as a row in the <i>run_summaries</i> table.
  * <p>
  * All timestamp fields use the Unix convention (seconds since the epoch).
- *
- * @author Jeremy McCormick, SLAC
+ *  
  * @see RunSummaryImpl
  * @see RunSummaryDao
  * @see RunSummaryDaoImpl
  * @see RunManager
+ * 
+ * @author jeremym
  */
 public interface RunSummary {
 

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/RunSummaryDao.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDao.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/RunSummaryDao.java	Thu Jul  7 15:52:41 2016
@@ -1,13 +1,13 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.util.List;
 
 /**
  * Database API for managing basic run summary information in the run database.
  *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
-interface RunSummaryDao {
+public interface RunSummaryDao {
   
     /**
      * Delete a run summary by run number.

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/RunSummaryDaoImpl.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryDaoImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/RunSummaryDaoImpl.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.sql.Connection;
 import java.sql.PreparedStatement;
@@ -11,7 +11,7 @@
 /**
  * Implementation of database operations for {@link RunSummary} objects in the run database.
  *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
 final class RunSummaryDaoImpl implements RunSummaryDao {
 

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/RunSummaryImpl.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/RunSummaryImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/RunSummaryImpl.java	Thu Jul  7 15:52:41 2016
@@ -1,13 +1,13 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.util.Date;
 
 /**
  * Implementation of {@link RunSummary} for retrieving information from the run database.
  *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
-final class RunSummaryImpl implements RunSummary {
+public final class RunSummaryImpl implements RunSummary {
 
     /**
      * Date this record was created.
@@ -183,7 +183,7 @@
      * 
      * @param created the creation date
      */
-    void setCreated(Date created) {
+    public void setCreated(Date created) {
         this.created = created;
     }
 
@@ -192,7 +192,7 @@
      * 
      * @param endTimestamp the end timestamp
      */
-    void setEndTimestamp(Integer endTimestamp) {
+    public void setEndTimestamp(Integer endTimestamp) {
         this.endTimestamp = endTimestamp;
     }
 
@@ -201,7 +201,7 @@
      * 
      * @param goTimestamp the GO timestamp
      */
-    void setGoTimestamp(Integer goTimestamp) {
+    public void setGoTimestamp(Integer goTimestamp) {
         this.goTimestamp = goTimestamp;
     }
 
@@ -210,7 +210,7 @@
      * 
      * @param livetimeClock the clock livetime
      */
-    void setLivetimeClock(Double livetimeClock) {
+    public void setLivetimeClock(Double livetimeClock) {
         this.livetimeClock = livetimeClock;
     }
 
@@ -219,7 +219,7 @@
      * 
      * @param livetimeTdc the FCUP TDC livetime
      */
-    void setLivetimeFcupTdc(Double livetimeTdc) {
+    public void setLivetimeFcupTdc(Double livetimeTdc) {
         this.livetimeTdc = livetimeTdc;
     }
 
@@ -228,7 +228,7 @@
      * 
      * @param livetimeTrg the FCUP TRG livetime
      */
-    void setLivetimeFcupTrg(Double livetimeTrg) {
+    public void setLivetimeFcupTrg(Double livetimeTrg) {
         this.livetimeTrg = livetimeTrg;
     }
 
@@ -237,7 +237,7 @@
      * 
      * @param notes the notes
      */
-    void setNotes(String notes) {
+    public void setNotes(String notes) {
         this.notes = notes;
     }
 
@@ -246,7 +246,7 @@
      * 
      * @param prestartTimestamp the PRESTART timestamp
      */
-    void setPrestartTimestamp(Integer prestartTimestamp) {
+    public void setPrestartTimestamp(Integer prestartTimestamp) {
         this.prestartTimestamp = prestartTimestamp;
     }
 
@@ -255,7 +255,7 @@
      * 
      * @param target the target description
      */
-    void setTarget(String target) {
+    public void setTarget(String target) {
         this.target = target;
     }
 
@@ -264,7 +264,7 @@
      * 
      * @param tiTimeOffset the TIM time offset in ns
      */
-    void setTiTimeOffset(Long tiTimeOffset) {
+    public void setTiTimeOffset(Long tiTimeOffset) {
         this.tiTimeOffset = tiTimeOffset;
     }
 
@@ -273,7 +273,7 @@
      *
      * @param totalEvents the total number of physics events in the run
      */
-    void setTotalEvents(final Long totalEvents) {
+    public void setTotalEvents(final Long totalEvents) {
         this.totalEvents = totalEvents;
     }
 
@@ -282,7 +282,7 @@
      *
      * @param totalFiles the total number of EVIO files in the run
      */
-    void setTotalFiles(final Integer totalFiles) {
+    public void setTotalFiles(final Integer totalFiles) {
         this.totalFiles = totalFiles;
     }
 
@@ -291,7 +291,7 @@
      * 
      * @param triggerConfigName the trigger config file
      */
-    void setTriggerConfigName(String triggerConfigName) {
+    public void setTriggerConfigName(String triggerConfigName) {
         this.triggerConfigName = triggerConfigName;
     }
 
@@ -300,7 +300,7 @@
      * 
      * @param triggerRate the trigger rate in KHz
      */
-    void setTriggerRate(Double triggerRate) {
+    public void setTriggerRate(Double triggerRate) {
         this.triggerRate = triggerRate;
     }
 
@@ -309,7 +309,7 @@
      * 
      * @param updated the updated date
      */
-    void setUpdated(Date updated) {
+    public void setUpdated(Date updated) {
         this.updated = updated;
     }
 

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/ScalerDataDao.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDao.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDao.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/ScalerDataDao.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.util.List;
 
@@ -7,9 +7,9 @@
 /**
  * Database Access Object (DAO) for scaler data in the run database.
  *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
-interface ScalerDataDao {
+public interface ScalerDataDao {
 
     /**
      * Delete scaler data for the run.

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/ScalerDataDaoImpl.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDaoImpl.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/ScalerDataDaoImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/ScalerDataDaoImpl.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.sql.Connection;
 import java.sql.PreparedStatement;
@@ -13,7 +13,7 @@
 /**
  * Implementation of database API for {@link org.hps.record.scalers.ScalerData} in the run database.
  *
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
 final class ScalerDataDaoImpl implements ScalerDataDao {
 

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/SvtConfigDao.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/SvtConfigDao.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/SvtConfigDao.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/SvtConfigDao.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.util.List;
 
@@ -7,9 +7,9 @@
 /**
  * Database API for accessing SVT configuration in run database.
  * 
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
-interface SvtConfigDao {
+public interface SvtConfigDao {
    
     /**
      * Insert SVT configurations.

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/SvtConfigDaoImpl.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/SvtConfigDaoImpl.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/SvtConfigDaoImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/SvtConfigDaoImpl.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.sql.Clob;
 import java.sql.Connection;
@@ -14,7 +14,7 @@
 /**
  * Implementation of SVT configuration database operations.
  * 
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
 final class SvtConfigDaoImpl implements SvtConfigDao {
 

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/TriggerConfigDao.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/TriggerConfigDao.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/TriggerConfigDao.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/TriggerConfigDao.java	Thu Jul  7 15:52:41 2016
@@ -1,13 +1,13 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import org.hps.record.triggerbank.TriggerConfigData;
 
 /**
  * Database interface for getting raw trigger config data and inserting into run db.
  * 
- * @author Jeremy McCormick, SLAC
+ * @author jeremym
  */
-interface TriggerConfigDao {
+public interface TriggerConfigDao {
     
     /**
      * Get a trigger config by run number.

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/TriggerConfigDaoImpl.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/TriggerConfigDaoImpl.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/TriggerConfigDaoImpl.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/TriggerConfigDaoImpl.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb;
 
 import java.sql.Clob;
 import java.sql.Connection;
@@ -11,6 +11,11 @@
 import org.hps.record.triggerbank.TriggerConfigData;
 import org.hps.record.triggerbank.TriggerConfigData.Crate;
 
+/**
+ * Implementation of trigger configuration database operations.
+ * 
+ * @author jeremym
+ */
 final class TriggerConfigDaoImpl implements TriggerConfigDao {
       
     private static final String INSERT =
@@ -37,26 +42,23 @@
         }
         this.connection = connection;
     }
-    
 
     @Override
     public void insertTriggerConfig(TriggerConfigData config, int run) {
-        if (!config.isValid()) {
-            throw new RuntimeException("The trigger config is not valid.");
-        }
         PreparedStatement preparedStatement = null;
         try {
             preparedStatement = connection.prepareStatement(INSERT);
             preparedStatement.setInt(1, run);
             preparedStatement.setInt(2, config.getTimestamp());
             Map<Crate, String> data = config.getData();
-            if (data.size() != TriggerConfigData.Crate.values().length) {
-                throw new IllegalArgumentException("The trigger config data has the wrong length.");
-            }
             preparedStatement.setBytes(3, data.get(TriggerConfigData.Crate.CONFIG1).getBytes());
             preparedStatement.setBytes(4, data.get(TriggerConfigData.Crate.CONFIG2).getBytes());
             preparedStatement.setBytes(5, data.get(TriggerConfigData.Crate.CONFIG3).getBytes());
-            preparedStatement.setBytes(6, data.get(TriggerConfigData.Crate.CONFIG4).getBytes());
+            if (data.get(TriggerConfigData.Crate.CONFIG4) != null) {
+                preparedStatement.setBytes(6, data.get(TriggerConfigData.Crate.CONFIG4).getBytes());
+            } else {
+                preparedStatement.setObject(6, null);
+            }
             preparedStatement.executeUpdate();
         } catch (SQLException e) {
             throw new RuntimeException(e);

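The CONFIG4 change above replaces the old hard failure (throwing when the config map was incomplete) with a null guard that binds SQL NULL when the fourth crate's config is absent. A minimal, self-contained sketch of that binding logic follows; the nested Crate enum and the config map are stand-ins for the real org.hps.record.triggerbank.TriggerConfigData types, not the committed code:

```java
import java.util.EnumMap;
import java.util.Map;

public class Config4Guard {

    // Stand-in for org.hps.record.triggerbank.TriggerConfigData.Crate.
    enum Crate { CONFIG1, CONFIG2, CONFIG3, CONFIG4 }

    /**
     * Mirrors the diff's insert logic: return the bytes to bind for a crate,
     * or null when that crate's config is absent (as CONFIG4 now may be),
     * in which case the caller binds SQL NULL via setObject(6, null).
     */
    static byte[] bytesOrNull(Map<Crate, String> data, Crate crate) {
        String value = data.get(crate);
        return value != null ? value.getBytes() : null;
    }

    public static void main(String[] args) {
        Map<Crate, String> data = new EnumMap<>(Crate.class);
        data.put(Crate.CONFIG1, "crate 1 config");
        data.put(Crate.CONFIG2, "crate 2 config");
        data.put(Crate.CONFIG3, "crate 3 config");
        // CONFIG4 intentionally absent: insert proceeds with a NULL column.
        System.out.println(bytesOrNull(data, Crate.CONFIG3) != null); // true
        System.out.println(bytesOrNull(data, Crate.CONFIG4) == null); // true
    }
}
```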
Copied: java/trunk/run-database/src/main/java/org/hps/rundb/builder/AbstractRunBuilder.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/AbstractRunBuilder.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/AbstractRunBuilder.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/builder/AbstractRunBuilder.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,6 @@
-package org.hps.run.database;
+package org.hps.rundb.builder;
+
+import org.hps.rundb.RunSummaryImpl;
 
 /**
  * Class for incrementally building records for the run database.

Added: java/trunk/run-database/src/main/java/org/hps/rundb/builder/BuilderCommandLine.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/rundb/builder/BuilderCommandLine.java	(added)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/builder/BuilderCommandLine.java	Thu Jul  7 15:52:41 2016
@@ -0,0 +1,237 @@
+package org.hps.rundb.builder;
+
+import java.io.File;
+import java.net.URISyntaxException;
+import java.sql.Connection;
+import java.sql.SQLException;
+import java.util.logging.Logger;
+
+import org.apache.commons.cli.CommandLine;
+import org.apache.commons.cli.HelpFormatter;
+import org.apache.commons.cli.Options;
+import org.apache.commons.cli.ParseException;
+import org.apache.commons.cli.PosixParser;
+import org.hps.conditions.database.ConnectionParameters;
+import org.hps.rundb.DaoProvider;
+import org.hps.rundb.RunManager;
+import org.hps.rundb.RunSummaryDao;
+import org.hps.rundb.RunSummaryImpl;
+import org.srs.datacat.client.ClientBuilder;
+
+import org.hps.datacat.DatacatConstants;
+import org.hps.datacat.Site;
+
+/**
+ * Creates a basic run database record from information in the data catalog 
+ * as well as (optionally) a CSV dump of the run spreadsheet from Google Docs.
+ * 
+ * @author jeremym
+ */
+public class BuilderCommandLine {
+    
+    private static final Logger LOGGER = 
+            Logger.getLogger(BuilderCommandLine.class.getPackage().getName());
+       
+    /**
+     * Command line options for the crawler.
+     */
+    private static final Options OPTIONS = new Options();
+    
+    /**
+     * Statically define the command options.
+     */
+    static {
+        OPTIONS.addOption("h", "help", false, "print help and exit (overrides all other arguments)");
+        OPTIONS.addOption("r", "run", true, "run to insert or update (required)");
+        OPTIONS.getOption("r").setRequired(true);
+        OPTIONS.addOption("p", "connection-properties", true, "database connection properties file (required)");
+        OPTIONS.getOption("p").setRequired(true);
+        OPTIONS.addOption("s", "spreadsheet", true, "path to run database spreadsheet CSV file (optional)");
+        OPTIONS.addOption("u", "url", true, "data catalog URL (optional)");
+        OPTIONS.addOption("S", "site", true, "data catalog site e.g. SLAC or JLAB (optional)");
+        OPTIONS.addOption("f", "folder", true, "folder in datacat for dataset search (optional)");
+        OPTIONS.addOption("D", "dry-run", false, "enable dry run with no db update (optional)");
+    }
+
+    /**
+     * Run the program from the command line.
+     *
+     * @param args the command line arguments
+     */
+    public static void main(final String args[]) {
+        new BuilderCommandLine().parse(args).run();
+    }
+        
+    /**
+     * Run number.
+     */
+    private int run;
+    
+    /**
+     * Path to spreadsheet CSV file.
+     */
+    private File spreadsheetFile = null;
+        
+    /**
+     * Data catalog site.
+     */
+    private String site = Site.JLAB.toString();
+    
+    /**
+     * Data catalog URL.
+     */
+    private String url = DatacatConstants.DATACAT_URL;  
+    
+    /**
+     * Default folder for file search.
+     */
+    private String folder = DatacatConstants.RAW_DATA_FOLDER;
+    
+    /**
+     * Database connection parameters.
+     */
+    private ConnectionParameters connectionParameters = null;
+    
+    /**
+     * <code>true</code> if database should not be updated.
+     */
+    private boolean dryRun = false;
+    
+    /**
+     * Parse command line options and return reference to <code>this</code> object.
+     *
+     * @param args the command line arguments
+     * @return reference to this object
+     */
+    private BuilderCommandLine parse(final String args[]) {
+        try {
+            final CommandLine cl = new PosixParser().parse(OPTIONS, args);
+
+            // Print help and exit.
+            if (cl.hasOption("h") || args.length == 0) {
+                final HelpFormatter help = new HelpFormatter();
+                help.printHelp("RunDatabaseCommandLine [options]", "", OPTIONS, "");
+                System.exit(0);
+            }
+            
+            // Run number.
+            if (cl.hasOption("r")) {
+                run = Integer.parseInt(cl.getOptionValue("r"));
+            } else {
+                throw new RuntimeException("The run number is required.");
+            }
+                        
+            // Run spreadsheet.
+            if (cl.hasOption("s")) {
+                this.spreadsheetFile = new File(cl.getOptionValue("s"));
+                if (!this.spreadsheetFile.exists()) {
+                    throw new RuntimeException("The run spreadsheet " + this.spreadsheetFile.getPath() + " is inaccessible or does not exist.");
+                }
+            }
+                                               
+            // Data catalog URL.
+            if (cl.hasOption("u")) {
+                url = cl.getOptionValue("u");
+            }
+            
+            // Site in the data catalog.
+            if (cl.hasOption("S")) {
+                site = cl.getOptionValue("S");
+            }
+            
+            // Set folder for dataset search.
+            if (cl.hasOption("f")) {
+                folder = cl.getOptionValue("f");
+            }
+            
+            // Database connection properties file.
+            if (cl.hasOption("p")) {
+                final String dbPropPath = cl.getOptionValue("p");
+                final File dbPropFile = new File(dbPropPath);
+                if (!dbPropFile.exists()) {
+                    throw new IllegalArgumentException("Connection properties file " + dbPropFile.getPath() + " does not exist.");
+                }
+                connectionParameters = ConnectionParameters.fromProperties(dbPropFile);
+            } else {
+                // Database connection properties file is required.
+                throw new RuntimeException("Connection properties are a required argument.");
+            }
+            
+            if (cl.hasOption("D")) {
+                this.dryRun = true;
+            }
+            
+        } catch (final ParseException e) {
+            throw new RuntimeException(e);
+        }
+
+        return this;
+    }
+
+    /**
+     * Configure the builder from command line options and run the job to update the database.
+     */
+    private void run() {
+        
+        System.out.println("connecting to " + this.connectionParameters.getConnectionString() + " ...");
+                        
+        RunManager mgr = new RunManager(this.connectionParameters.createConnection());
+        Connection connection = mgr.getConnection();
+        try {
+            connection.setAutoCommit(true);
+        } catch (SQLException e) {
+            throw new RuntimeException(e);
+        }
+        
+        mgr.setRun(run);
+        
+        RunSummaryImpl runSummary = new RunSummaryImpl(run);
+        
+        // build info from datacat
+        DatacatBuilder datacatBuilder = new DatacatBuilder();
+        try {
+            datacatBuilder.setDatacatClient(new ClientBuilder().setUrl(url).build());
+        } catch (URISyntaxException e) {
+            throw new RuntimeException("Datacat URL " + url + " is invalid.", e);
+        }
+        datacatBuilder.setFolder(folder);
+        datacatBuilder.setSite(site);
+        datacatBuilder.setRunSummary(runSummary);
+        datacatBuilder.build();
+                
+        // build info from run spreadsheet
+        if (spreadsheetFile != null) {
+            SpreadsheetBuilder spreadsheetBuilder = new SpreadsheetBuilder();
+            spreadsheetBuilder.setSpreadsheetFile(spreadsheetFile);
+            spreadsheetBuilder.setRunSummary(datacatBuilder.getRunSummary());
+            spreadsheetBuilder.build();
+        } else {
+            LOGGER.warning("No run spreadsheet provided with command line option!");
+        }
+        
+        LOGGER.info(runSummary.toString());
+                
+        // insert run summary
+        if (!dryRun) {
+            RunSummaryDao runSummaryDao = new DaoProvider(connection).getRunSummaryDao();
+            if (mgr.runExists()) {
+                System.out.println("updating existing run summary ...");
+                runSummaryDao.updateRunSummary(runSummary);
+            } else {
+                System.out.println("inserting new run summary ...");
+                runSummaryDao.insertRunSummary(runSummary);
+            }        
+        
+            try {
+                System.out.println("closing db connection ...");
+                connection.close();
+            } catch (SQLException e) {
+                throw new RuntimeException(e);
+            }
+        } else {
+            LOGGER.info("Dry run enabled.  Database was not updated!");
+        }
+        
+        System.out.println("DONE!");
+    }        
+}

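Given the options registered in BuilderCommandLine's static block above (-r and -p required; -s, -u, -S, -f, -D optional), a typical invocation might look like the following. This is an illustrative sketch only: the jar name, run number, and file names are placeholders, not values from the commit.

```shell
# Hypothetical invocation; classpath jar, run number, and file paths are placeholders.
java -cp hps-distribution-bin.jar org.hps.rundb.builder.BuilderCommandLine \
    -r 5772 \
    -p connection.properties \
    -s run_spreadsheet.csv \
    -S JLAB \
    -D    # dry run: build the run summary but skip the database update
```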
Copied: java/trunk/run-database/src/main/java/org/hps/rundb/builder/DatacatBuilder.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/DatacatBuilder.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/DatacatBuilder.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/builder/DatacatBuilder.java	Thu Jul  7 15:52:41 2016
@@ -1,8 +1,10 @@
-package org.hps.run.database;
+package org.hps.rundb.builder;
 
 import java.io.File;
+import java.util.ArrayList;
 import java.util.List;
 import java.util.Map;
+import java.util.logging.Level;
 import java.util.logging.Logger;
 
 import org.hps.record.triggerbank.TiTimeOffsetCalculator;
@@ -13,17 +15,24 @@
 import org.srs.datacat.model.dataset.DatasetWithViewModel;
 import org.srs.datacat.shared.DatasetLocation;
 
-final class DatacatBuilder extends AbstractRunBuilder {
+/**
+ * Builds information for the run database from the EVIO data catalog entries.
+ * 
+ * @author jeremym
+ */
+public final class DatacatBuilder extends AbstractRunBuilder {
     
     private static final Logger LOGGER = Logger.getLogger(DatacatBuilder.class.getPackage().getName());
     
     private static final String[] METADATA_FIELDS = {
-        "TI_TIME_MIN_OFFSET", 
-        "TI_TIME_MAX_OFFSET", 
-        "TI_TIME_N_OUTLIERS", 
+        "TI_TIME_MIN_OFFSET",
+        "TI_TIME_MAX_OFFSET",
+        "TI_TIME_N_OUTLIERS",
         "END_TIMESTAMP",
         "GO_TIMESTAMP",
-        "PRESTART_TIMESTAMP"
+        "PRESTART_TIMESTAMP",
+        "END_EVENT_COUNT",
+        "FILE"
     };
     
     private Client datacatClient;
@@ -31,7 +40,7 @@
     private String folder;    
     private List<File> files;
                 
-    private static long calculateTiTimeOffset(DatasetResultSetModel results) {
+    private long calculateTiTimeOffset(DatasetResultSetModel results) {
         TiTimeOffsetCalculator calc = new TiTimeOffsetCalculator();
         for (DatasetModel ds : results) {
             DatasetWithViewModel view = (DatasetWithViewModel) ds;
@@ -48,18 +57,21 @@
         }
         return calc.calculateTimeOffset();
     }
-    
-    private static long getTotalEvents(DatasetResultSetModel results) {
+        
+    private long countEvents(DatasetResultSetModel results) {
+        LOGGER.info("Calculating total events from file event counts ...");
         long totalEvents = 0;
         for (DatasetModel ds : results) {
             DatasetWithViewModel view = (DatasetWithViewModel) ds;
-            DatasetLocation loc = (DatasetLocation) view.getViewInfo().getLocations().iterator().next();
-            totalEvents += loc.getEventCount();
-        }
+            //Map<String, Object> metadata = view.getMetadataMap();                
+            long eventCount = ((DatasetLocation) view.getViewInfo().getLocations().iterator().next()).getEventCount();
+            totalEvents += eventCount;
+        } 
+        LOGGER.info("Calculated " + totalEvents + " total events from event counts.");
         return totalEvents;
     }
     
-    private static Integer getPrestartTimestamp(DatasetResultSetModel results) {
+    private Integer getPrestartTimestamp(DatasetResultSetModel results) {
         DatasetWithViewModel ds = (DatasetWithViewModel) results.getResults().get(0);
         if (ds.getMetadataMap().containsKey("PRESTART_TIMESTAMP")) {
             return (int) (long) ds.getMetadataMap().get("PRESTART_TIMESTAMP");
@@ -68,7 +80,7 @@
         }
     }
     
-    private static Integer getEndTimestamp(DatasetResultSetModel results) {        
+    private Integer getEndTimestamp(DatasetResultSetModel results) {        
         DatasetWithViewModel ds = (DatasetWithViewModel) results.getResults().get(results.getResults().size() - 1);
         if (ds.getMetadataMap().containsKey("END_TIMESTAMP")) {
             return (int) (long) ds.getMetadataMap().get("END_TIMESTAMP");
@@ -78,7 +90,7 @@
     }
     
     
-    private static Integer getGoTimestamp(DatasetResultSetModel results) {
+    private Integer getGoTimestamp(DatasetResultSetModel results) {
         DatasetWithViewModel ds = (DatasetWithViewModel) results.getResults().get(0);
         if (ds.getMetadataMap().containsKey("GO_TIMESTAMP")) {
             return (int) (long) ds.getMetadataMap().get("GO_TIMESTAMP");
@@ -87,7 +99,7 @@
         }
     }
     
-    private static double calculateTriggerRate(Integer startTimestamp, Integer endTimestamp, long nEvents) {
+    private double calculateTriggerRate(Integer startTimestamp, Integer endTimestamp, long nEvents) {
         if (startTimestamp == null) {
             throw new IllegalArgumentException("The start timestamp is null.");
         }
@@ -102,6 +114,18 @@
         }
         double triggerRate = (double) nEvents / ((double) endTimestamp - (double) startTimestamp);
         return triggerRate;
+    }
+    
+    private long calculateTotalEvents(DatasetResultSetModel results) {
+        DatasetWithViewModel lastDataset = 
+                (DatasetWithViewModel) results.getResults().get(results.getResults().size() - 1);
+        long totalEvents = 0;
+        if (lastDataset.getMetadataMap().containsKey("END_EVENT_COUNT")) { /* calculate from each file */
+            totalEvents = (Long) lastDataset.getMetadataMap().get("END_EVENT_COUNT");
+        } else {
+            totalEvents = countEvents(results);
+        }
+        return totalEvents;
     }
         
     void build() {
@@ -127,32 +151,46 @@
             throw new RuntimeException(e);
         }
         
-        files = DatacatUtilities.toFileList(results);
+        files = toFileList(results);
         
         if (results.getResults().isEmpty()) {
             throw new RuntimeException("No results found for datacat search.");
         }
-        
-        long tiTimeOffset = calculateTiTimeOffset(results);
-        getRunSummary().setTiTimeOffset(tiTimeOffset);
-        
-        long totalEvents = getTotalEvents(results);
+
+        try {
+            long tiTimeOffset = calculateTiTimeOffset(results);
+            getRunSummary().setTiTimeOffset(tiTimeOffset);
+        } catch (Exception e) {
+            LOGGER.log(Level.WARNING, "Error calculating TI time offset.", e);
+        }
+                
+        long totalEvents = calculateTotalEvents(results);
         getRunSummary().setTotalEvents(totalEvents);
         
         int nFiles = results.getResults().size();
         getRunSummary().setTotalFiles(nFiles);
         
-        int prestartTimestamp = getPrestartTimestamp(results);
-        getRunSummary().setPrestartTimestamp(prestartTimestamp);
-        
-        int goTimestamp = getGoTimestamp(results);
-        getRunSummary().setGoTimestamp(goTimestamp);
-        
-        int endTimestamp = getEndTimestamp(results);
-        getRunSummary().setEndTimestamp(endTimestamp);
-        
-        double triggerRate = calculateTriggerRate(prestartTimestamp, endTimestamp, totalEvents);
-        getRunSummary().setTriggerRate(triggerRate);        
+        Integer prestartTimestamp = getPrestartTimestamp(results);
+        if (prestartTimestamp != null) {
+            getRunSummary().setPrestartTimestamp(prestartTimestamp);
+        }
+        
+        Integer goTimestamp = getGoTimestamp(results);
+        if (goTimestamp != null) {
+            getRunSummary().setGoTimestamp(goTimestamp);
+        }
+        
+        Integer endTimestamp = getEndTimestamp(results);
+        if (endTimestamp != null) {
+            getRunSummary().setEndTimestamp(endTimestamp);
+        }
+        
+        try {
+            double triggerRate = calculateTriggerRate(prestartTimestamp, endTimestamp, totalEvents);
+            getRunSummary().setTriggerRate(triggerRate);
+        } catch (Exception e) {
+            LOGGER.log(Level.WARNING, "Error calculating trigger rate.", e);
+        }
     }
                          
     private DatasetResultSetModel findDatasets() {
@@ -161,11 +199,11 @@
         
         DatasetResultSetModel results = datacatClient.searchForDatasets(
                 this.folder,
-                "current",
+                "current", /* dataset version */
                 this.site,
-                "fileFormat eq 'EVIO' AND dataType eq 'RAW' AND runMin eq " + getRun(),
-                new String[] {"FILE"},
-                METADATA_FIELDS
+                "fileFormat eq 'EVIO' AND dataType eq 'RAW' AND runMin eq " + getRun(), /* basic query */
+                new String[] {"FILE"}, /* sort on file number */
+                METADATA_FIELDS /* metadata field values to return from query */
                 );
         
         LOGGER.info("found " + results.getResults().size() + " EVIO datasets for run " + getRun());
@@ -188,4 +226,17 @@
     List<File> getFileList() {
         return files;
     }
+    
+    static final List<File> toFileList(DatasetResultSetModel datasets) {
+        List<File> files = new ArrayList<File>();
+        for (DatasetModel dataset : datasets.getResults()) {
+            String resource = 
+                    ((DatasetWithViewModel) dataset).getViewInfo().getLocations().iterator().next().getResource();
+            if (resource.startsWith("/ss")) {
+                resource = "/cache" + resource;
+            }
+            files.add(new File(resource));
+        }
+        return files;
+    }
 }

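The `calculateTriggerRate` change above turns the method into an instance method and wraps its call in a try/catch so a bad timestamp no longer aborts the whole build. The arithmetic itself is a simple event count over a timestamp span with null and zero-interval guards; a minimal standalone sketch of that logic (class and method names here are illustrative, not part of the commit):

```java
/** Illustrative sketch of the trigger-rate arithmetic in DatacatBuilder. */
public final class TriggerRateSketch {

    /**
     * Events per second between two Unix timestamps (in seconds),
     * mirroring the null/invalid-interval guards shown in the diff.
     */
    static double triggerRate(Integer startTimestamp, Integer endTimestamp, long nEvents) {
        if (startTimestamp == null) {
            throw new IllegalArgumentException("The start timestamp is null.");
        }
        if (endTimestamp == null) {
            throw new IllegalArgumentException("The end timestamp is null.");
        }
        if (endTimestamp - startTimestamp <= 0) {
            throw new IllegalArgumentException("The timestamp interval is invalid.");
        }
        return (double) nEvents / ((double) endTimestamp - (double) startTimestamp);
    }

    public static void main(String[] args) {
        // 100000 events over a 50-second run -> 2000.0 Hz.
        System.out.println(triggerRate(1468000000, 1468000050, 100000L));
    }
}
```

Catching the exception at the call site, as the diff now does, lets the run summary be partially populated even when a file is missing its PRESTART or END timestamp metadata.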
Copied: java/trunk/run-database/src/main/java/org/hps/rundb/builder/EvioDataBuilder.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/EvioDataBuilder.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/EvioDataBuilder.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/builder/EvioDataBuilder.java	Thu Jul  7 15:52:41 2016
@@ -1,29 +1,28 @@
-package org.hps.run.database;
+package org.hps.rundb.builder;
 
 import java.io.File;
 import java.util.List;
-import java.util.logging.Logger;
 
 import org.hps.record.epics.EpicsData;
 import org.hps.record.epics.EpicsRunProcessor;
 import org.hps.record.evio.EvioFileSource;
-import org.hps.record.evio.EvioFileUtilities;
 import org.hps.record.evio.EvioLoop;
 import org.hps.record.scalers.ScalerData;
 import org.hps.record.scalers.ScalersEvioProcessor;
+import org.hps.record.triggerbank.TriggerConfigData;
+import org.hps.record.triggerbank.TriggerConfigEvioProcessor;
 
 /**
- * Extracts lists of EPICS and scaler data in an EVIO file and insert
- * them into the run database.
+ * Extracts EPICS data, scaler data and trigger configuration from an EVIO file.
  * 
  * @author Jeremy McCormick, SLAC
  */
 public class EvioDataBuilder extends AbstractRunBuilder {
 
-    private Logger LOGGER = Logger.getLogger(EvioDataBuilder.class.getPackage().getName());
     private File evioFile;
     private List<EpicsData> epicsData;
     private List<ScalerData> scalerData;
+    private TriggerConfigData triggerConfig;
     
     void setEvioFile(File evioFile) {
         this.evioFile = evioFile;
@@ -37,6 +36,10 @@
         return scalerData;
     }
     
+    TriggerConfigData getTriggerConfig() {
+        return triggerConfig;
+    }
+    
     @Override
     void build() {
         if (evioFile == null) {
@@ -48,36 +51,12 @@
         ScalersEvioProcessor scalersProcessor = new ScalersEvioProcessor();
         scalersProcessor.setResetEveryEvent(false);        
         EpicsRunProcessor epicsProcessor = new EpicsRunProcessor();
-        loop.addProcessor(epicsProcessor);                
+        loop.addProcessor(epicsProcessor);
+        TriggerConfigEvioProcessor configProcessor = new TriggerConfigEvioProcessor();
+        loop.addProcessor(configProcessor);
         loop.loop(-1);
         this.epicsData = epicsProcessor.getEpicsData();
         this.scalerData = scalersProcessor.getScalerData();
-    }
-    
-    public void main(String args[]) {
-        
-        if (args.length == 0) {
-            throw new RuntimeException("No command line arguments provided.");
-        }
-        String path = args[0];
-        File file = new File(path);
-        int run = EvioFileUtilities.getRunFromName(file);
-        
-        EvioDataBuilder builder = new EvioDataBuilder();
-        builder.setEvioFile(file);
-        builder.build();
-        
-        if (!builder.getEpicsData().isEmpty()) {
-            RunManager runManager = null;
-            try {
-                runManager = new RunManager();
-                runManager.setRun(run);
-                runManager.updateEpicsData(epicsData);
-            } finally {
-                runManager.closeConnection();
-            }
-        } else {
-            LOGGER.warning("No EPICS data was found to insert into run database.");
-        }
+        this.triggerConfig = configProcessor.getTriggerConfigData();
     }
 }

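`EvioDataBuilder` follows a register-then-loop pattern: each processor (scalers, EPICS, and now trigger config) is attached to one `EvioLoop`, the loop makes a single pass over the file, and each processor's accumulated results are harvested afterwards. A self-contained toy version of that pattern (all names here are hypothetical stand-ins; the real `EvioLoop` and processor APIs live in the record-util module):

```java
import java.util.ArrayList;
import java.util.List;

/** Toy illustration of the processor/loop pattern used by EvioDataBuilder. */
public class ProcessorLoopSketch {

    /** A processor sees every event once and accumulates its own results. */
    interface EventProcessor {
        void process(int event);
    }

    /** Counts even-numbered events (stands in for e.g. a scalers processor). */
    static class EvenCounter implements EventProcessor {
        int count;
        public void process(int event) { if (event % 2 == 0) count++; }
    }

    /** Tracks the largest event seen (stands in for e.g. an EPICS processor). */
    static class MaxTracker implements EventProcessor {
        int max = Integer.MIN_VALUE;
        public void process(int event) { max = Math.max(max, event); }
    }

    /** Drives every registered processor over the event stream in one pass. */
    static void loop(List<EventProcessor> processors, int[] events) {
        for (int event : events) {
            for (EventProcessor p : processors) {
                p.process(event);
            }
        }
    }

    public static void main(String[] args) {
        EvenCounter evens = new EvenCounter();
        MaxTracker max = new MaxTracker();
        List<EventProcessor> processors = new ArrayList<>();
        processors.add(evens);
        processors.add(max);
        loop(processors, new int[] {1, 2, 3, 4, 5});
        // Harvest results from each processor after the single pass.
        System.out.println(evens.count + " " + max.max);
    }
}
```

The design choice matters for large EVIO files: adding the new `TriggerConfigEvioProcessor` to the same loop costs no extra pass over the data.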
Added: java/trunk/run-database/src/main/java/org/hps/rundb/builder/EvioDataCommandLine.java
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/rundb/builder/EvioDataCommandLine.java	(added)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/builder/EvioDataCommandLine.java	Thu Jul  7 15:52:41 2016
@@ -0,0 +1,163 @@
+package org.hps.rundb.builder;
+
+import java.io.File;
+import java.sql.Connection;
+import java.sql.SQLException;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+import org.apache.commons.cli.CommandLine;
+import org.apache.commons.cli.HelpFormatter;
+import org.apache.commons.cli.Options;
+import org.apache.commons.cli.ParseException;
+import org.apache.commons.cli.PosixParser;
+import org.hps.conditions.database.ConnectionParameters;
+import org.hps.record.evio.EvioFileUtilities;
+import org.hps.rundb.RunManager;
+
+/**
+ * Extracts information from EVIO files and inserts into the run database.
+ * 
+ * @author jeremym
+ */
+public class EvioDataCommandLine {
+
+    private Logger LOGGER = Logger.getLogger(EvioDataCommandLine.class.getPackage().getName());
+
+    private boolean dryRun = false;
+    private ConnectionParameters connectionParameters = null;
+    private List<File> evioFiles = new ArrayList<File>();
+
+    /**
+     * Command line options for the crawler.
+     */
+    private static final Options OPTIONS = new Options();
+
+    static {
+        OPTIONS.addOption("h", "help", false, "print help and exit (overrides all other arguments)");
+        OPTIONS.addOption("p", "connection-properties", true, "database connection properties file (required)");
+        OPTIONS.getOption("p").setRequired(true);
+        OPTIONS.addOption("D", "dry-run", false, "enable dry run with no db update (optional)");
+    }
+    
+    public static void main(String[] args) {
+        new EvioDataCommandLine().parse(args).run();
+    }
+
+    private EvioDataCommandLine parse(String[] args) {
+        try {
+            final CommandLine cl = new PosixParser().parse(OPTIONS, args);
+            
+            if (cl.hasOption("h") || args.length == 0) {
+                final HelpFormatter help = new HelpFormatter();
+                help.printHelp("EvioDataCommandLine [options] file1 file2 ...", "", OPTIONS, "");
+                System.exit(0);
+            }
+
+            if (cl.hasOption("D")) {
+                dryRun = true;
+                LOGGER.config("Dry run enabled; database will not be updated.");
+            }
+
+            if (cl.hasOption("p")) {
+                final String dbPropPath = cl.getOptionValue("p");
+                final File dbPropFile = new File(dbPropPath);
+                if (!dbPropFile.exists()) {
+                    throw new IllegalArgumentException("Connection properties file " + dbPropFile.getPath()
+                            + " does not exist.");
+                }
+                connectionParameters = ConnectionParameters.fromProperties(dbPropFile);
+                LOGGER.config("connection props set from " + dbPropFile.getPath());
+            } else {
+                // Database connection properties file is required.
+                throw new RuntimeException("Connection properties are a required argument.");
+            }
+
+            for (String arg : cl.getArgList()) {
+                evioFiles.add(new File(arg));
+                LOGGER.config("adding file " + arg + " to job");
+            }
+
+            if (evioFiles.isEmpty()) {
+                throw new RuntimeException("No EVIO files were provided from the command line.");
+            }
+
+        } catch (ParseException e) {
+            throw new RuntimeException("Error parsing command line arguments.", e);
+        }
+        return this;
+    }
+
+    private void run() {
+        
+        RunManager runManager = new RunManager(this.connectionParameters.createConnection());
+        Connection connection = runManager.getConnection();
+        try {
+            connection.setAutoCommit(false);
+        } catch (SQLException e) {
+            throw new RuntimeException(e);
+        }
+        
+        for (File evioFile : evioFiles) {
+
+            LOGGER.info("Processing file " + evioFile.getPath() + " ...");
+            
+            int run = EvioFileUtilities.getRunFromName(evioFile);
+            
+            EvioDataBuilder builder = new EvioDataBuilder();
+            builder.setEvioFile(evioFile);
+            builder.build();
+                       
+            LOGGER.info("Found " + builder.getEpicsData().size() + " EPICS records.");
+            LOGGER.info("Found " + builder.getScalerData().size() + " scaler records.");            
+            
+            LOGGER.info("Set run " + run + " from file " + evioFile);
+            
+            runManager.setRun(run);
+                                    
+            if (!dryRun) {
+                
+                try {
+                    runManager.updateEpicsData(builder.getEpicsData());
+                } catch (Exception e) {
+                    LOGGER.log(Level.WARNING, "Problem updating EPICS data in db.", e);
+                }
+
+                try {
+                    runManager.updateScalerData(builder.getScalerData());
+                } catch (Exception e) {
+                    LOGGER.log(Level.WARNING, "Problem updating scaler data in db.", e);
+                }
+
+                try {
+                    if (builder.getTriggerConfig() != null) {
+                        runManager.updateTriggerConfig(builder.getTriggerConfig(), false /* do not replace existing config */);
+                    } else {
+                        LOGGER.info("No valid trigger config data was found.");
+                    }
+                } catch (Exception e) {
+                    LOGGER.log(Level.WARNING, "Problem updating trigger config data in db.", e);
+                }
+            } else {
+                LOGGER.info("Dry run is enabled; database will not be updated.");
+            }
+            
+            LOGGER.info("Done processing " + evioFile.getPath());
+        }
+        
+        // Commit the transaction.
+        try {
+            connection.commit();
+        } catch (SQLException e) {
+            throw new RuntimeException("Failed to commit db transaction.", e);
+        }
+        
+        try {
+            connection.close();
+        } catch (SQLException e) {
+            LOGGER.log(Level.WARNING, "Error closing db connection.", e);
+        }        
+    }
+}

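`EvioDataCommandLine` derives the run number from each file name via `EvioFileUtilities.getRunFromName` before building. A hypothetical stand-in for that helper, assuming a naming convention like `hps_005772.evio.0` with the run number between the underscore and the `.evio` extension (the actual parsing in `EvioFileUtilities` may differ):

```java
import java.io.File;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * Hypothetical sketch of EvioFileUtilities.getRunFromName, assuming EVIO
 * files are named like "hps_005772.evio.0". Not the real implementation.
 */
public class RunFromNameSketch {

    // Assumed convention: run number sits between '_' and ".evio".
    private static final Pattern RUN_PATTERN = Pattern.compile("_(\\d+)\\.evio");

    static int getRunFromName(File file) {
        Matcher m = RUN_PATTERN.matcher(file.getName());
        if (!m.find()) {
            throw new IllegalArgumentException(
                    "No run number found in file name: " + file.getName());
        }
        return Integer.parseInt(m.group(1));
    }

    public static void main(String[] args) {
        // Leading zeros in the file name parse to the plain run number.
        System.out.println(getRunFromName(new File("/mss/hallb/hps/hps_005772.evio.0")));
    }
}
```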
Copied: java/trunk/run-database/src/main/java/org/hps/rundb/builder/LivetimeBuilder.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/LivetimeBuilder.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/LivetimeBuilder.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/builder/LivetimeBuilder.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb.builder;
 
 import java.io.File;
 import java.io.IOException;
@@ -13,6 +13,11 @@
 import org.jlab.coda.jevio.EvioException;
 import org.jlab.coda.jevio.EvioReader;
 
+/**
+ * Computes livetimes from a set of EVIO files.
+ * 
+ * @author jeremym
+ */
 public class LivetimeBuilder extends AbstractRunBuilder {
     
     private List<File> files;

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/builder/SpreadsheetBuilder.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/SpreadsheetBuilder.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/SpreadsheetBuilder.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/builder/SpreadsheetBuilder.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb.builder;
 
 import java.io.File;
 import java.util.logging.Logger;
@@ -7,17 +7,11 @@
 import org.hps.conditions.run.RunSpreadsheet.RunData;
 
 /**
- * Builds a complete {@link RunSummary} object from various data sources, including the data catalog and the run
- * spreadsheet, so that it is ready to be inserted into the run database using the DAO interfaces.  This class also 
- * extracts EPICS data, scaler data, trigger config and SVT config information from all of the EVIO files in a run.
- * <p>
- * The setters and some other methods follow the builder pattern and so can be chained by the caller.
+ * Adds information to a {@link RunSummary} from the run spreadsheet.
  * 
- * @author Jeremy McCormick, SLAC
- * @see RunSummary
- * @see RunSummaryImpl
+ * @author jeremym
  */
-final class SpreadsheetBuilder extends AbstractRunBuilder {
+public final class SpreadsheetBuilder extends AbstractRunBuilder {
     
     private static final Logger LOGGER = Logger.getLogger(SpreadsheetBuilder.class.getPackage().getName());
     
@@ -34,7 +28,7 @@
      * @return this object
      */
     @Override
-    void build() {       
+    void build() {
         if (this.spreadsheetFile == null) {
             throw new IllegalStateException("The spreadsheet file was never set.");
         }

Copied: java/trunk/run-database/src/main/java/org/hps/rundb/package-info.java (from r4415, java/trunk/run-database/src/main/java/org/hps/run/database/package-info.java)
 =============================================================================
--- java/trunk/run-database/src/main/java/org/hps/run/database/package-info.java	(original)
+++ java/trunk/run-database/src/main/java/org/hps/rundb/package-info.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
 /**
  * API for accessing and updating the HPS run database.
  */
-package org.hps.run.database;
+package org.hps.rundb;

Copied: java/trunk/run-database/src/test/java/org/hps/rundb/builder/RunBuilderTest.java (from r4415, java/trunk/run-database/src/test/java/org/hps/run/database/RunBuilderTest.java)
 =============================================================================
--- java/trunk/run-database/src/test/java/org/hps/run/database/RunBuilderTest.java	(original)
+++ java/trunk/run-database/src/test/java/org/hps/rundb/builder/RunBuilderTest.java	Thu Jul  7 15:52:41 2016
@@ -1,4 +1,4 @@
-package org.hps.run.database;
+package org.hps.rundb.builder;
 
 import java.io.File;
 import java.util.List;
@@ -6,6 +6,8 @@
 import junit.framework.TestCase;
 
 import org.hps.conditions.database.ConnectionParameters;
+import org.hps.rundb.RunManager;
+import org.hps.rundb.RunSummaryImpl;
 import org.srs.datacat.client.ClientBuilder;
 
 public class RunBuilderTest extends TestCase {
@@ -38,12 +40,7 @@
         livetimeBuilder.setRunSummary(runSummary);
         livetimeBuilder.setFiles(files);
         livetimeBuilder.build();
-        
-        // trigger config
-        TriggerConfigBuilder configBuilder = new TriggerConfigBuilder();
-        configBuilder.setFiles(files);
-        configBuilder.build();
-        
+                
         // run spreadsheet
         SpreadsheetBuilder spreadsheetBuilder = new SpreadsheetBuilder();
         spreadsheetBuilder.setSpreadsheetFile(new File(SPREADSHEET));
@@ -58,7 +55,6 @@
         // update in database
         RunManager runManager = new RunManager(CONNECTION_PARAMETERS.createConnection());
         runManager.updateRunSummary(runSummary, true);
-        runManager.updateTriggerConfig(configBuilder.getTriggerConfigData(), true);
         runManager.updateEpicsData(dataBuilder.getEpicsData());
         runManager.updateScalerData(dataBuilder.getScalerData());
     }

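The test above chains several builders (datacat, livetime, spreadsheet, EVIO data), each of which enriches one shared `RunSummaryImpl` before it is written through `RunManager`. A toy version of that shared-summary builder pattern (class names are illustrative, not the actual `AbstractRunBuilder` API):

```java
/** Toy version of the shared-summary builder pattern in RunBuilderTest. */
public class BuilderChainSketch {

    /** Mutable summary that every builder enriches (stands in for RunSummaryImpl). */
    static class Summary {
        long totalEvents;
        int totalFiles;
    }

    /** Common base holding the shared summary (stands in for AbstractRunBuilder). */
    static abstract class AbstractBuilder {
        private Summary summary;
        void setSummary(Summary summary) { this.summary = summary; }
        Summary getSummary() { return summary; }
        abstract void build();
    }

    /** Each concrete builder fills in its own slice of the summary. */
    static class EventCountBuilder extends AbstractBuilder {
        void build() { getSummary().totalEvents = 100000L; }
    }

    static class FileCountBuilder extends AbstractBuilder {
        void build() { getSummary().totalFiles = 42; }
    }

    public static void main(String[] args) {
        Summary summary = new Summary();
        AbstractBuilder[] builders = {new EventCountBuilder(), new FileCountBuilder()};
        for (AbstractBuilder b : builders) {
            b.setSummary(summary);  // all builders point at the same object
            b.build();
        }
        System.out.println(summary.totalEvents + " " + summary.totalFiles);
    }
}
```

This pattern is what lets the commit drop `TriggerConfigBuilder` from the chain without touching the other builders: each one is independent and only writes its own fields.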
Modified: java/trunk/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java
 =============================================================================
--- java/trunk/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java	(original)
+++ java/trunk/users/src/main/java/org/hps/users/meeg/SvtChargeIntegrator.java	Thu Jul  7 15:52:41 2016
@@ -12,6 +12,7 @@
 import java.util.TimeZone;
 import java.util.logging.Level;
 import java.util.logging.Logger;
+
 import org.apache.commons.cli.CommandLine;
 import org.apache.commons.cli.CommandLineParser;
 import org.apache.commons.cli.HelpFormatter;
@@ -29,7 +30,7 @@
 import org.hps.conditions.svt.SvtMotorPosition;
 import org.hps.conditions.svt.SvtMotorPosition.SvtMotorPositionCollection;
 import org.hps.conditions.svt.SvtTimingConstants;
-import org.hps.run.database.RunManager;
+import org.hps.rundb.RunManager;
 
 /**
  * @author Sho Uemura <[log in to unmask]>

Modified: java/trunk/users/src/main/java/org/hps/users/spaul/FindBiasOnRange.java
 =============================================================================
--- java/trunk/users/src/main/java/org/hps/users/spaul/FindBiasOnRange.java	(original)
+++ java/trunk/users/src/main/java/org/hps/users/spaul/FindBiasOnRange.java	Thu Jul  7 15:52:41 2016
@@ -4,19 +4,18 @@
 import java.io.PrintStream;
 
 import org.hps.record.epics.EpicsData;
-import org.hps.run.database.EpicsType;
 import org.lcsim.event.EventHeader;
 import org.lcsim.util.Driver;
 
 public class FindBiasOnRange extends Driver{
-    /*tab*//*tab*//*tab*/
+    
     String svtBiasName = "SVT:bias:top:0:v_sens";
     
     String outfile = "bias_on.txt";
     @Override 
     public void process(EventHeader event){
         final EpicsData edata = EpicsData.read(event);
-        if (edata == null) 
+        if (edata == null)
             return;
         System.out.println(edata.getKeys());
         if(!edata.hasKey(svtBiasName))
