> 
> > I noticed yesterday that there seems to be another problem. The jobs do
> > not seem to be able to read all events from those files:
> > Error in <TTree::Write>: No file open,  while reading the files. And I

If I can give my 2 cents on this error, I think it is still the one
related to dataset creation when running against a file with more than
3M events. In that case there is a problem with the ROOT version we are
currently using (directly related to the RooFit problem) that will be
fixed once a newer ROOT version is released and adopted by the RooFit
team.
I've asked Vouter several times, and the answer has been: the error
message is completely harmless...
So you can relax on that point and wait for a ROOT bug fix.
Cheers,
alessio

> > find different numbers of events in the output histos for different runs
> > of the VirVubFit. At the moment so many events are left out (roughly
> > half) that I do not feel comfortable using the results from the new
> > files. What I have done so far does not seem to be affected, but doing
> > other things like varying Lambdabar is not really possible with this.
> 
> I tried to look into this a bit more closely and found that the loss of so
> many events was due to a stupid mistake of my own. Sorry if I caused any
> confusion with this.
> It does not explain the "Error in <TTree::Write>: No file open", and I
> still do not understand where that comes from, but I do not see any harm
> caused by it so far.
> 
> Still, it would be nice if those three reduced ntuples could find some
> space on the AWG disk, since reading them with the VirVubFitter would then
> probably be smoother and not affected by the problem described in the last
> mail.
> 
> Cheers,
> Kerstin
> 
>