Stepan,

I think it’s because the previous data sets were so small that it was deemed acceptable and useful, and not a big overhead on analyzers. For 2019 (and future big runs) it’s obviously not an option.

-Nathan

On May 20, 2021, at 9:30 AM, Stepan Stepanyan <[log in to unmask]> wrote:

Nathan,

Thanks. 
So, my question is: why are we producing 2.5x the EVIO volume for physics analysis?

Stepan

On May 20, 2021, at 9:17 AM, Graf, Norman A. <[log in to unmask]> wrote:

Hello Nathan,

Thanks for this comprehensive summary. 

Norman


From: [log in to unmask] <[log in to unmask]> on behalf of Nathan Baltzell <[log in to unmask]>
Sent: Thursday, May 20, 2021 5:42 AM
To: hps-software <[log in to unmask]>
Subject: 2019 computing numbers
 
Hello Everyone,

Following up on yesterday’s discussion, I put some numbers regarding CPU/tape/disk at JLab for processing the HPS 2019 data at the link below. I also re-derived the processing time estimates from scratch and again arrived at the same numbers I reported yesterday (so that’s good :).

https://jeffersonlab-my.sharepoint.com/:w:/g/personal/baltzell_jlab_org/EcGppN7oZIBCoOv0YMDlXb8Boq8xyl2_wdSzSBSORM9K8w?e=oM9tW9

-Nathan


Use REPLY-ALL to reply to list
To unsubscribe from the HPS-SOFTWARE list, click the following link:
https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=HPS-SOFTWARE&A=1