Hi,
this is an update on the Killer's performance.
1st) The code compiles and the exec runs (this is not so obvious ;-) )
2nd) I've performed a trial over 50k events: it works fine.
3rd) I've run over D0 run1 data: out of 506532 events
     I've found 41661 duplicates.

The GIGArootfile produced has 464871 entries (right number !!!)
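The bookkeeping works out; a quick check (a plain Python sketch using only the numbers quoted above, nothing ROOT-specific) shows:

```python
# Numbers reported above for the D0 run1 pass.
total_events = 506532   # events read in
duplicates   = 41661    # duplicates found by the Killer

# Entries that should survive in the merged GIGA rootfile.
expected_entries = total_events - duplicates
print(expected_entries)  # 464871, matching the file's entry count
```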
The run took:

########################

Resource usage summary:

    CPU time   :  52544.90 sec.
    Max Memory :       103 MB
    Max Swap   :       129 MB

    Max Processes  :         3

########################

The file produced is 1.59 GB large (like the ensemble of all the rootfiles
for d0 run1), so we don't save any space. We only get one file instead of
hundreds...

So all seems to work fine, but....

a) the job ended with:

###############################

Writing outp3/sx-d0-run1-data.root

 *** Break *** bus error

------------------------------------------------------------
Sender: LSF System <lsf@barb0484>
Subject: Job 644308: <./Killer -c chains/d0/sx-d0-run1-data -D outp3 -w
outp3/prova.root -k list_D0> Exited

###################################

When I try to load the file in root I get:

##########################################

CINT/ROOT C/C++ Interpreter version 5.15.07, July 7 2001
Type ? for help. Commands must be C++ statements.
Enclose multiple statements between { }.
root [0] TFile f = TFile("outp3/prova.root");
Warning in <TFile::TFile>: file outp3/prova.root probably not closed,
trying to recover
Warning in <TFile::Recover>: successfully recovered 3 keys

###########################################

The recovery takes a while...
But then I can...

###########################################
root [1] f->ls();
TFile**		outp3/prova.root
 TFile*		outp3/prova.root
  KEY: TList	StreamerInfo;1	Doubly linked list
  KEY: TTree	h9;46	CompBRecoNtuple
  KEY: TTree	h9;45	CompBRecoNtuple
root [2] h9->Print();
...........................
...........................
...........................

*............................................................................*
*Br  476 :energyGam : energyGam[nGam]/F                                      *
*Entries :   464871 : Total  Size=   31445076 bytes  File Size  =   24358678 *
*Baskets :     4172 : Basket Size=       8000 bytes  Compression=   1.29     *
*............................................................................*
*Br  477 :B0RecGam  : B0RecGam[nGam]/I                                      *
*Entries :   464871 : Total  Size=   31440904 bytes  File Size  =    1555677 *
*Baskets :     4172 : Basket Size=       8000 bytes  Compression=  20.21     *
*............................................................................*
*Br  478 :chBRecGam : chBRecGam[nGam]/I                                     *
*Entries :   464871 : Total  Size=   31445076 bytes  File Size  =    3619975 *
*Baskets :     4172 : Basket Size=       8000 bytes  Compression=   8.69     *
*............................................................................*


###########################################
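As a sanity check on the recovered file, the Compression column in the TTree::Print output above is just Total Size / File Size, and recomputing it from the quoted branch numbers reproduces ROOT's figures:

```python
# Branch sizes as quoted in the Print output above:
# name -> (total size in bytes, size on disk in bytes)
branches = {
    "energyGam": (31445076, 24358678),
    "B0RecGam":  (31440904,  1555677),
    "chBRecGam": (31445076,  3619975),
}

# Compression factor = in-memory size / on-disk size.
for name, (total, on_disk) in branches.items():
    print(f"{name}: {total / on_disk:.2f}")
# energyGam: 1.29, B0RecGam: 20.21, chBRecGam: 8.69
```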

......the number of entries is correct and I can make meaningful plots:

Mes
http://www.slac.stanford.edu/~asarti/recoil/killing/prova1.eps
De
http://www.slac.stanford.edu/~asarti/recoil/killing/prova2.eps

I need an expert opinion: Urs, do we have to worry about this problem?
Do you think the problem will get worse with larger datasets?
Is there a way to find out how much data we can 'pack' into one file
without hitting this problem?
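For what it's worth: if what bites is the 2 GB large-file boundary (a common suspect for files written with 32-bit offsets in that era; this is only a guess, not a diagnosis), a back-of-the-envelope estimate from the numbers above suggests the ceiling is not far away:

```python
# Rough sizing from the run reported above. Assumption (hypothetical):
# the failure mode is the 2 GB 32-bit file-offset limit.
file_bytes = 1.59e9      # size of the merged rootfile
entries    = 464871      # entries it contains

bytes_per_event = file_bytes / entries            # ~3.4 kB per event
events_at_2gb   = (2**31 - 1) / bytes_per_event   # ~6.3e5 events
print(f"~{bytes_per_event:.0f} B/event -> ~{events_at_2gb:.0f} events per file")
```

On this hypothesis the current 1.59 GB run1 file is already close to the boundary, so packing much more data into a single output file would cross it.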

Now I'm waiting for the reprocessed run1+run2 TS to work on 50 fb-1.

Comments and questions are welcome.
Alessio

______________________________________________________
Alessio Sarti
 Universita' & I.N.F.N. Ferrara
 tel  +39-0532-781928  Ferrara

"What the barbarians did not do, Berlusconi did"

"That white is white and that black is black,
 that one and one make two and that science tells the truth....
 IT DEPENDS!"