BTW, I use this utility to import into SQLite
https://github.com/dumblob/mysql2sqlite
full commands:
# dump the db from a JLAB machine such as ifarm
mysqldump -h hpsdb.jlab.org -ujeremym -pMyPasswordHere --skip-extended-insert --skip-create-options --compact --skip-comments hps_conditions > hps_conditions_for_sqlite.sql
# import into a local SQLite db
mysql2sqlite hps_conditions_for_sqlite.sql | sqlite3 hps_conditions.db
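A quick way to sanity-check the import is to count rows per table in the resulting SQLite file and compare against the MySQL side. Here is a minimal sketch using Python's stdlib sqlite3 module; the helper name and the demo table (`svt_gains`, in an in-memory db) are mine for illustration — for the real check you would connect to hps_conditions.db:

```python
import sqlite3

def table_counts(conn):
    """Return {table_name: row_count} for every user table in a SQLite db."""
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    return {t: conn.execute(f'SELECT COUNT(*) FROM "{t}"').fetchone()[0]
            for t in tables}

# For the real check: conn = sqlite3.connect("hps_conditions.db")
# The demo below uses an in-memory db with a made-up table name:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE svt_gains (id INTEGER PRIMARY KEY, gain REAL)")
conn.executemany("INSERT INTO svt_gains (gain) VALUES (?)", [(0.1,), (0.2,)])
print(table_counts(conn))  # {'svt_gains': 2}
```

Comparing these counts against `SELECT COUNT(*)` on the MySQL side would at least catch tables that were dropped or truncated by the conversion.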
It seemed to work fine, but I have not verified that the recon/readout output is exactly the same as with the MySQL db.
Note that SQLite itself is NOT installed on ifarm, so I ended up dumping the db at JLab, committing the SQL file to GitHub, pulling it at SLAC, and then importing using sqlite3 from rhel6-64.
I've also had to compress the SQL and db files lately because they are over (or close to) GitHub's 100 MB file size limit.
Where do I save this file? When the DB is updated, how do I get the latest file? Do I need to ask you to generate the file?
I have a more serious problem. As you may have heard, there was a bug in the field map generation. I need to re-run some MC as soon as the field map is corrected. How do I update HPS-MC? Once HPSJAVA is updated, do I just need to rebuild HPS-MC from scratch?
Another related question: we need to study next year's configuration. Once next year's detector is created in HPSJAVA, how do I update HPS-MC? Do I just rebuild HPS-MC from scratch?
Correction:
The file to download and use locally is actually here:
https://github.com/JeffersonLab/hps-conditions-backup/blob/master/hps_conditions.db.tar.gz
Hi,
I have a basic working version of hps-java on a branch (iss320) which can load conditions from a SQLite db that I exported from MySQL. This means that you should be able to run the recon entirely offline without needing to install MySQL locally and import the conditions db.
The issue branch is here:
https://github.com/JeffersonLab/hps-java/tree/iss320
What are the next steps? Should we attempt to run it in hps-mc?
I have only run the readout, and I did not check the results carefully (just saw that it wrote out some events); perhaps we need to run and validate the recon as well before doing anything else. It is possible, for instance, that some conditions values might not be the same, e.g. due to rounding/precision errors, so that should be checked carefully.
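To make that check concrete, one option is to compare conditions rows from the two backends with a relative tolerance, so that tiny float rounding differences do not flag as mismatches. A minimal sketch (the helper name and the example rows are made up, not from the actual conditions tables):

```python
import math

def rows_match(mysql_row, sqlite_row, rel_tol=1e-9):
    """Compare two conditions rows field by field.

    Floats are compared with a relative tolerance via math.isclose;
    everything else must match exactly.
    """
    if len(mysql_row) != len(sqlite_row):
        return False
    for a, b in zip(mysql_row, sqlite_row):
        if isinstance(a, float) or isinstance(b, float):
            if not math.isclose(a, b, rel_tol=rel_tol):
                return False
        elif a != b:
            return False
    return True

# Made-up example rows; the first pair differs only at float precision:
print(rows_match((1, "svt", 0.123456789), (1, "svt", 0.12345678900000001)))  # True
print(rows_match((1, "svt", 0.1234), (1, "svt", 0.1235)))                    # False
```

Running something like this over every table would distinguish real conversion errors from harmless precision noise.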
It is a fairly simple matter to run the bin jar and point to the local db file...
java -Dorg.hps.conditions.url=jdbc:sqlite:hps_conditions.db -jar hps-java-bin.jar [...]
The db file with the exported conditions data can be obtained from here:
https://github.com/JeffersonLab/hps-conditions-backup/blob/master/hps_conditions_for_sqlite.sql.tar.gz
It should be downloaded into the working directory and decompressed using 'tar -zxvf'.
To run in hps-mc should be pretty straightforward; I think we would just need something like this passed to the JobManager in each script:
java_args=["-Dorg.hps.conditions.url=jdbc:sqlite:hps_conditions.db"]
We would need to either copy over the db file to each batch node and read from this relative path, symlink the file from each node, or read the file from a fixed/hard-coded NFS location. (All of these options require some relatively minor changes to the python scripts.)
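Whichever deployment option we pick (copy, symlink, or fixed NFS path), the python-side glue is small. Here is a sketch of a helper that resolves the db path and builds the java_args entry; the helper name is mine and I have not tested this against the actual hps-mc scripts:

```python
import os
import tempfile

def conditions_java_args(db_path):
    """Build the JVM argument pointing hps-java at a local SQLite conditions db.

    db_path may be relative (file copied/symlinked onto the batch node) or
    an absolute NFS path; it is resolved to an absolute path so the jar
    finds it regardless of the job's working directory.
    """
    abs_path = os.path.abspath(db_path)
    if not os.path.exists(abs_path):
        raise FileNotFoundError("conditions db not found: " + abs_path)
    return ["-Dorg.hps.conditions.url=jdbc:sqlite:" + abs_path]

# Demo with a throwaway file standing in for hps_conditions.db:
fd, tmp = tempfile.mkstemp(suffix=".db")
os.close(fd)
args = conditions_java_args(tmp)
print(args[0].startswith("-Dorg.hps.conditions.url=jdbc:sqlite:"))  # True
os.remove(tmp)
```

In each script one would then pass something like `java_args=conditions_java_args("hps_conditions.db")` to the JobManager; failing fast on a missing file is nicer than a cryptic JDBC error deep inside the job.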
Let me know how you want to proceed and what I can do to help...
BTW, I will be leaving on a 3-month trip next Tuesday evening, and I do not plan on doing any HPS work during this time, so please get back to me ASAP so we can hopefully get this running successfully before I leave.
--Jeremy
Use REPLY-ALL to reply to list
To unsubscribe from the HPS-SOFTWARE list, click the following link:
https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=HPS-SOFTWARE&A=1