VUB-RECOIL Archives
VUB-RECOIL@LISTSERV.SLAC.STANFORD.EDU
VUB-RECOIL, July 2006

Subject: Re: S/P systematics
From: Concezio Bozzi <[log in to unmask]>
Reply-To: [log in to unmask]
Date: Thu, 27 Jul 2006 11:47:31 +0200
Content-Type: text/plain
Parts/Attachments: text/plain (107 lines)

Hi,
yes, this should be the one with lepton cuts. However, I am still
recomputing the S/P ratios using Antonio's latest package and will
repeat the exercise. Wolfgang, why should we apply 50% and 100%
correlations? The mX bins should be statistically independent, no?
Concezio. 

On Thu, 27-07-2006 at 11:27 +0200, Wolfgang Menges wrote:
> Hi Concezio,
> 
> 	as Heiko said, very promising. Can you run with 50% and 100% correlation?
> 
> Cheers,
> 
> 	Wolfgang
> 
> Concezio Bozzi wrote:
> > Hi all, 
> > 
> > I ran a test to estimate the systematic uncertainty due to the
> > uncertainty on S/P in mES data fits in the following way: 
> > 
> > 1) Take the S/P ratios determined as
> > 
> > S/P(data_enriched) = [S/P(MC_enriched) / S/P(MC_depleted)] * S/P(data_depleted)
> > 
> > I have used the values I have recently been playing with, i.e.
> > # mx_l  mx_h    corr  err_corr
> >  0.00   1.55   1.499 +-  0.495
> >  1.55   1.90   2.688 +-  0.655
> >  1.90   2.20   1.801 +-  0.296
> >  2.20   2.50   1.896 +-  0.611
> >  2.50   2.80   1.165 +-  0.468
> >  2.80   3.10   0.637 +-  0.311
> >  3.10   3.40  19.367 +- 34.585
> >  3.40   3.70   1.524 +-  1.610
> >  3.70   4.20   8.180 +- 31.833
> >  4.20   5.00   0.555 +-  6.639
> > 
> > No attempt to fit an n-th order polynomial; I just take the values as
> > they come out of the single bin-by-bin fits on depleted data and on
> > enriched and depleted MC.
> > Note that the relative errors are quite large (e.g. 33% on the first
> > bin, 24% on the second, 16% on the third, 32% on the 4th, and growing
> > as mX increases).
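
As a quick arithmetic check of those relative errors, in Python (the
numbers are the first four rows of the table above):

corr     = [1.499, 2.688, 1.801, 1.896]
err_corr = [0.495, 0.655, 0.296, 0.611]
for c, e in zip(corr, err_corr):
    print(f"{100 * e / c:.0f}%")    # prints 33%, 24%, 16%, 32%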
> > 
> > 2) Simultaneously randomize the 10 values above according to a Gaussian
> > distribution whose mean is the correction (column corr) and whose sigma
> > is the uncertainty (err_corr). The random number is of course different
> > for each mX bin (see the sketch after step 4).
> > 
> > 3) Fit with VVF using the randomized S/P from step 2)
> > 
> > 4) Go back to step 2), change the random seed, and repeat 100 times.
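
A minimal sketch of steps 2)-4) in Python, with a hypothetical
run_vvf_fit() standing in for the actual VVF fit (it is not a real
function in any package; the corrections and errors are the ten table
values above):

import random

# S/P corrections and uncertainties, one per mX bin (table above)
corr     = [1.499, 2.688, 1.801, 1.896, 1.165, 0.637,
            19.367, 1.524, 8.180, 0.555]
err_corr = [0.495, 0.655, 0.296, 0.611, 0.468, 0.311,
            34.585, 1.610, 31.833, 6.639]

def run_vvf_fit(sp_ratios):
    """Hypothetical stand-in for the VVF mES fit; returns the fitted BRBR."""
    raise NotImplementedError

results = []
for seed in range(1, 101):                        # step 4: 100 toys
    random.seed(seed)
    # step 2: smear each bin independently (bins taken as uncorrelated)
    smeared = [random.gauss(c, e) for c, e in zip(corr, err_corr)]
    results.append(run_vvf_fit(smeared))          # step 3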
> > 
> > Results of the 100 jobs are in 
> > http://www.slac.stanford.edu/~bozzi/scra/Ibu_SP_*
> > *=1,...,100
> > 
> > Take the (width/mean) ratio of the resulting 100 fits as the systematic
> > uncertainty:
> > 
> > yakut02(~:) grep "BRBR           " ~bozzi/scra/Ibu_SP_*/*dat | awk
> > 'BEGIN{sum=0; sum2=0}{sum+= $3; sum2+=$3*$3; num++}END{print sum/num;
> > print sqrt(sum2/num-sum*sum/num/num)}'
> > 0.0291231
> > 0.00188286
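
For readers less fluent in awk, a rough Python equivalent of that
one-liner (a sketch: like the original it assumes each job's .dat file
has a line starting with "BRBR" whose value sits in the field that grep
exposes as $3, i.e. the second field after "BRBR"):

import glob, math

values = []
for path in glob.glob("Ibu_SP_*/*dat"):           # the 100 job outputs
    with open(path) as f:
        for line in f:
            if line.startswith("BRBR"):
                # grep fuses the filename with "BRBR", so awk's $3 is
                # the second field after "BRBR" in the raw file line
                values.append(float(line.split()[2]))

mean  = sum(values) / len(values)
width = math.sqrt(sum(v * v for v in values) / len(values) - mean ** 2)
print(mean, width, width / mean)                  # width/mean = systematic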
> > 
> > The relative uncertainty is 0.00188/0.02912 = 6.46%, i.e. 3.2% on Vub
> > (BRBR scales as |Vub|^2, so the relative error on Vub is half that on
> > BRBR). This is somewhat lower than the naive argument (see below) for
> > the error on the fitted Vub events in the first bin, which gives about
> > twice (13.2%) the error on BRBR. I think the reason for this is that
> > the errors on the first 4 bins are comparable, which reduces the lever
> > arm and therefore the variation in the first bin.
> > 
> > Quite promising, isn't it? 
> > 
> > Concezio. 
> > 
> > 
> > PS: here is the naive argument on the uncertainty on the number of
> > signal events in the first bin, which translates into the uncertainty
> > on BRBR. We have
> > 
> > N_signal = N_data - N_argus - N_peaking 
> > 
> > N_peaking = N_signal * 1/corr
> > (corr is the S/P correction factor)
> > 
> > Solving for N_signal:
> > 
> > N_signal = [corr / (1+ corr)] * [N_data - N_argus] 
> > 
> > Error propagation gives: 
> > 
> > delta(N_signal) / N_signal = [delta(corr) / corr] / (1+corr) 
> > 
> > Taking corr = 1.499 and delta(corr)/corr = 0.33, we get
> > 
> > delta(N_signal) / N_signal = 13.2%
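
The same numbers as a quick Python check (first-bin values from the
table above):

corr, dcorr = 1.499, 0.495
print(f"{100 * (dcorr / corr) / (1 + corr):.1f}%")   # -> 13.2%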
> > 
> 


