This reminds me of an issue that we hit on pass0 of the 2016 data a year ago.

At that time the dst was also crashing on one of the first events (on 
average fewer than 50).
Then we noticed that even the LCIO C++ API was not able to read the file.
Bradley, could you run  $LCIO/bin/lcio_event_counter your_file.slcio
and see whether it counts all the events, or outputs only 10 (or so).
If it outputs a very small number, then the problem is not with the dst,
but rather with the jar file (I think).

Also, which jar did you use for creating the slcio file?
With release 3.11.1 I didn't see such an issue on pass1 data.

Rafo


On 07/21/2017 08:32 AM, Nathan Baltzell wrote:
> Maybe completely unrelated, but there was an email from Sandy earlier 
> this week that the /apps/root/PRO link is now ROOT 6.  Meanwhile the 
> dstmaker and libs were built against ROOT 5, but hps's env uses the PRO link.
>
> -Nathan
>
> On Jul 21, 2017, at 01:36, Omar Moreno <[log in to unmask] 
> <mailto:[log in to unmask]>> wrote:
>
>> Can you print out all tracks in the event and all tracks associated 
>> with final state particles for all events up to the one that fails?  
>> The above implies that a track was not used to make a final state 
>> particle, but printing out the info would also help to make sure.
>>
>> On Thu, Jul 20, 2017 at 8:20 PM, Bradley T Yale 
>> <[log in to unmask] <mailto:[log in to unmask]>> wrote:
>>
>>     Not on the first event, but after 10 or so.
>>
>>
>>     The analysis linked with the same libraries works on an old DST
>>     made from the same binary (June 1), but after rerunning the DST
>>     maker on the recon used to make it originally, that new DST no
>>     longer works.
>>
>>
>>     So it's probably not a recent recon problem.
>>
>>
>>     At first I thought it could be due to the new 2016 MC recon
>>     steering file I was testing, so I tried it on recon that
>>     definitely did not use it.
>>
>>
>>
>>     ------------------------------------------------------------------------
>>     *From:* Omar Moreno <[log in to unmask]
>>     <mailto:[log in to unmask]>>
>>     *Sent:* Thursday, July 20, 2017 7:25:12 PM
>>     *To:* Bradley T Yale
>>     *Cc:* [log in to unmask]
>>     <mailto:[log in to unmask]>
>>     *Subject:* Re: DST crash
>>     Does this happen on the first event? This seems to point to an
>>     issue with the recon.
>>
>>     On Thu, Jul 20, 2017 at 4:09 PM, Bradley T Yale
>>     <[log in to unmask] <mailto:[log in to unmask]>> wrote:
>>
>>         Hi,
>>
>>         When analyzing recent files made from the DST maker in either
>>         of these places:
>>
>>         /u/group/hps/hps_soft/hps-dst/centos7-64/bin/dst_maker
>>         /u/group/hps/hps_soft/hps-dst/build_new/bin/dst_maker
>>
>>         It causes a crash when getMomentum() or getCharge() is called
>>         on an SvtTrack object:
>>
>>         ===========================================================
>>         There was a crash.
>>         This is the entire stack trace of all threads:
>>         ===========================================================
>>         #0  0x00007efc2a30203c in waitpid () from /lib64/libc.so.6
>>         #1  0x00007efc2a287092 in do_system () from /lib64/libc.so.6
>>         #2  0x00007efc2ed64949 in TUnixSystem::StackTrace
>>         (this=0x16970c0) at
>>         /apps/root/5.34.36/root/core/unix/src/TUnixSystem.cxx:2419
>>         #3  0x00007efc2ed6658c in TUnixSystem::DispatchSignals
>>         (this=0x16970c0, sig=kSigSegmentationViolation) at
>>         /apps/root/5.34.36/root/core/unix/src/TUnixSystem.cxx:1294
>>         #4  <signal handler called>
>>         #5  HpsParticle::getMomentum (this=0x0) at
>>         /home/hps/hps_soft/hps-dst/hps-dst/src/hps_event/HpsParticle.cxx:164
>>         #6  0x00007efc2f4f8bfe in SvtTrack::getMomentum
>>         (this=0x28c3570) at
>>         /home/hps/hps_soft/hps-dst/hps-dst/src/hps_event/SvtTrack.cxx:127
>>         #7  0x00000000004078ec in main ()
>>         ===========================================================
>>
>>
>>         The lines below might hint at the cause of the crash.
>>         If they do not help you then please submit a bug report at
>>         http://root.cern.ch/bugs.
>>         Please post the ENTIRE stack trace
>>         from above as an attachment in addition to anything else
>>         that might help us fixing this issue.
>>         ===========================================================
>>         #5  HpsParticle::getMomentum (this=0x0) at
>>         /home/hps/hps_soft/hps-dst/hps-dst/src/hps_event/HpsParticle.cxx:164
>>         #6  0x00007efc2f4f8bfe in SvtTrack::getMomentum
>>         (this=0x28c3570) at
>>         /home/hps/hps_soft/hps-dst/hps-dst/src/hps_event/SvtTrack.cxx:127
>>         #7  0x00000000004078ec in main ()
>>         ===========================================================
>>
>>
>>
>>         The culprit seems to be this part of the code in SvtTrack.cxx:
>>
>>
>>         int SvtTrack::getBlah() {
>>             if (fs_particle == NULL) return 9999;
>>             return ((HpsParticle*)
>>         this->fs_particle.GetObject())->getBlah();
>>         }
>>
>>
>>         Files that were made around June 1 with the same DST maker
>>         work fine.
>>
>>         Did something change since then?
>>
>>
>>         -Bradley
>>
>>
>>
>>
>>
>> ------------------------------------------------------------------------
>>
>> Use REPLY-ALL to reply to list
>>
>> To unsubscribe from the HPS-SOFTWARE list, click the following link:
>> https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=HPS-SOFTWARE&A=1 
>>
>>
>

