Difference: Brenoorzari (32 vs. 33)

Revision 33, 2019-10-04 - brenoorzari

I can do the same thing with the -2, -1, 0, 1 and 2 sigma expected CLs, converting everything to the number of signal events and then to cross sections (would this be a brazilian plot?).


Today Pedro and I went through the class about statistics. Everything seems clear, and the exercises were extremely helpful for understanding the different types of CL calculations, especially the part about the CLs method (since I had no clue what it was doing). I'll prepare something for next week's meeting.

I'm also working on a macro, named tlimit_testes.C, that writes the TLimit results with the 1 and 2 sigma bands. The other important macro of today is tlimit_tlimit.C, where I'm testing the uncertainties in the signal and BG values. At first, the TLimit calculation with histograms without uncertainties did not match the TLimit calculation with double values. After checking, it turned out I had forgotten to turn off the statistical uncertainties when comparing the two. I'll ask Thiago whether it's important to account for both uncertainties right now.

I've created the macro brazilian_not_brazilian_plot.C with the 1 and 2 sigma bands, and already updated the macro tlimit_to_xsec_mdh_mzp.C to plot that graph. It looks very impressive, but I don't know how to make it pretty yet. I'll show this plot and some results at tomorrow's meeting.


Yesterday, while talking to Pedro, we spotted a small inconsistency in my macros. Up to the tlimit macros I was using the MG5 luminosity for all the calculations. The correct luminosity, however, comes from L = N/xsec, and madgraph reports two numbers for each of those quantities. The first pair is the number of events set in the run_card (which I chose for the Monte Carlo simulations) and the xsec assigned to the process for that number of events at a fixed luminosity. The second pair is the number of events after the pythia matching/merging and the xsec assigned to that number of events, which yield the same luminosity as the numbers before. Since Delphes uses the latter, I changed the luminosity I was calculating to Lnew = N(after matching/merging) / xsec(after matching/merging). This should not change the results significantly, but it makes the cross sections calculated through TLimit more consistent with the MG5 ones.

I've already compared the m_{J} graphs from the paper with the ones using this new quantity to validate this case, and the agreement looks pretty good.

The new xsecs (after matching/merging) for the different values of m_{d_{H}} are:

m_{d_{H}} [GeV]   approx. xsec [pb]
 50               0.536
 60               0.486
 70               0.435
 80               0.396
 90               0.359
100               0.325
110               0.291
120               0.251
130               0.198
140               0.135
150               0.0723
155               0.00368 (w/ MET cut)
156               0.00311 (w/ MET cut)
160               0.0192
170               0.00361
180               0.00223

I still need to run the 155 and 156 GeV processes without the MET cut, but with pythia and Delphes, to get the merged xsec.

The new xsecs (after matching/merging) for the different values of m_{Z'} are:

m_{Z'} [GeV]   approx. xsec [pb]
 500           3.030
 625           2.069
1000           0.594
1100           0.435
1500           0.136
1700           0.0797
2000           0.0369
2500           0.0114
2750           0.00169 (w/ MET cut)
3000           0.00378
3500           0.00131
4000           0.000462

I still need to do the process for 2750 without the MET cut, but with pythia and Delphes to get the merged xsec.

All the BG xsecs are fine; I just need to fold the 1.2 factor into their k-factors.


The cross sections for the m_{d_{H}} = 155 and 156 GeV processes without the MET cut are 0.0422 pb and 0.0354 pb; for the m_{Z'} = 2750 GeV process it is 0.00643 pb.


