boostedHWW

boosted HWW example

Search for a boosted (high transverse momentum) Higgs boson (H) decaying to two W bosons, with the decay products contained in a single large-radius jet. The majority of the analysis uses the nanoAOD-tools, coffea, and scikit-hep Python libraries to process the input tree-based NanoAOD files.
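As a flavor of this style of processing, here is a minimal sketch that reads NanoAOD-style branches with uproot/awkward and keeps high transverse momentum large-radius jets (the file name, branch names, and the pT threshold are illustrative assumptions, not the analysis selection):

import awkward as ak
import uproot

# Minimal scikit-hep-style sketch: read jagged FatJet branches from a NanoAOD file.
# The file name, branch names, and the 400 GeV threshold are illustrative.
with uproot.open("nano.root") as f:
    events = f["Events"].arrays(["FatJet_pt", "FatJet_msoftdrop"])

# Keep only boosted (high transverse momentum) large-radius jets.
boosted = events["FatJet_pt"] > 400  # GeV, example threshold
candidate_mass = events["FatJet_msoftdrop"][boosted]
print(ak.num(candidate_mass, axis=1))  # number of surviving jets per event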

Preprocessing

First, we run a set of processors to obtain the flat tree files that are ultimately used in the analysis.

MiniAOD to Ntuple (e.g., submitting condor jobs on CMS Connect)

In preprocessing/nano_to_ntuple, the XWWNano framework is used to produce ntuple files from customized NanoAOD. Note that MiniAOD is first used to derive the customized NanoAOD, which implements the ParticleTransformer taggers.

  1. Update the VOMS proxy certificate:
voms-proxy-init -voms cms -valid 192:00
  2. Create json files with the official MiniAOD paths from DAS (see the sketch after this list), e.g.,
python condor.py --DAS DAS_2018_Signal --Filesjson "./json/2018_HWW_Signal.json" --createfilejson
  3. Create condor job files, e.g.,
python condor.py --DAS DAS_2018_Signal  --Filesjson "./json/2018_HWW_Signal.json"  --outputPath "/ospool/cms-user/yuzhe/NtupleStore/V3/2018/Signal"  --year 2018  --excutable "exe_UL18_NanoNtupleChain.sh" --TaskFolder "production/NanoNtupleChain_16_Feb_2024" --submitsh "NanoNtupleChain_16_Feb_2024.sh" --Condor --AddtionalArgs "-a '-o ./ -M HWW -m --year 2018'"
  4. Submit the condor jobs.
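For orientation, here is a hedged sketch of what the file-list json creation could look like under the hood, querying DAS with the dasgoclient CLI (the dataset path and the json layout are assumptions for illustration, not the actual condor.py logic):

import json
import subprocess

# Hypothetical sketch: ask DAS for the files of a dataset and dump them to json.
dataset = "/HWW_Signal_2018/placeholder/MINIAODSIM"  # placeholder DAS path
result = subprocess.run(
    ["dasgoclient", f"--query=file dataset={dataset}"],
    capture_output=True, text=True, check=True,
)
files = result.stdout.split()
with open("./json/2018_HWW_Signal.json", "w") as fout:
    json.dump({dataset: files}, fout, indent=2)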

Ntuple to PKUTree

In preprocessing/ntuple_to_tree, a C++/ROOT framework is used to further convert the ntuples into flat trees. The Higgs candidate jet is also selected in this step (a sketch of the selection idea follows the steps below).

  1. Set up the environment:
source /cvmfs/cms.cern.ch/cmsset_default.sh
cmsrel CMSSW_10_6_27
cd CMSSW_10_6_27/src
cmsenv
  2. To compile, run:
root -l
.L EDBR2PKUTree.C++
  3. Convert the tuple files to tree files:
python TransMergedMC.py
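The exact Higgs-candidate definition lives in the C++ code; purely as an illustration of the idea, here is a Python/awkward sketch that picks, per event, the large-radius jet with the highest tagger score (the branch names and the criterion are assumptions, not the framework's actual logic):

import awkward as ak
import uproot

# Illustrative only: select, per event, the fat jet with the highest HWW tagger score.
# The branch names ("FatJet_pt", "FatJet_ParTScore") are hypothetical.
events = uproot.open("ntuple.root")["Events"].arrays(["FatJet_pt", "FatJet_ParTScore"])
best = ak.argmax(events["FatJet_ParTScore"], axis=1, keepdims=True)
higgs_candidate_pt = ak.firsts(events["FatJet_pt"][best])  # one candidate pT per event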

PKUTree to SlimmedTree

In preprocessing/tree_to_slimmed, the tree files are further cleaned. Some scale factors (SFs), e.g., HLT trigger SFs, can also be applied in this step (see the sketch after the steps below).

  1. Set up the environment:
conda env create -f slim.yml
conda activate slim
  2. Convert the tree files to slimmed tree files:
python runSlimmedTree.py
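As an illustration of what applying such an SF can look like with correctionlib (the json file name, the correction key, and the inputs are placeholders, not this repository's actual implementation):

import correctionlib

# Hypothetical example: evaluate an HLT trigger SF and fold it into the event weight.
cset = correctionlib.CorrectionSet.from_file("hlt_trigger_sf.json")  # placeholder file
sf = cset["trigger_sf"].evaluate(420.0, 1.2)  # e.g. (pT [GeV], |eta|); placeholder key/inputs
event_weight = 1.0 * sf  # multiply into the per-event weight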

Postprocessing

After processing the NanoAOD files, the distributions of various variables, e.g., the jet mass, can be plotted. We can also save the histograms as .pkl templates and use them later.
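A minimal sketch of the .pkl template idea, using the scikit-hep hist package (the binning, fill values, and dictionary layout are illustrative, not the notebooks' exact format):

import pickle
import hist

# Fill an illustrative jet-mass histogram and save it as a .pkl template.
h = hist.Hist.new.Reg(40, 0, 200, name="msd", label="Jet mass [GeV]").Double()
h.fill(msd=[85.0, 120.0, 125.0])  # placeholder values
with open("templates.pkl", "wb") as fout:
    pickle.dump({"jet_mass": h}, fout)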

Show prefit results:

Use the notebook ./postprocessing/makeplots.ipynb

Save distribution histograms:

Use the notebook ./postprocessing/templates.ipynb

Show variations with respect to the different systematic uncertainties:

Use the notebook ./postprocessing/variation.ipynb

Combine

CMSSW + Combine Quickstart

source /cvmfs/cms.cern.ch/cmsset_default.sh
cmsrel CMSSW_11_3_4
cd CMSSW_11_3_4/src
cmsenv
git clone -b v9.2.0 https://github.com/cms-analysis/HiggsAnalysis-CombinedLimit.git HiggsAnalysis/CombinedLimit
git clone -b v2.0.0 https://github.com/cms-analysis/CombineHarvester.git CombineHarvester
scramv1 b clean; scramv1 b

Packages

To create datacards, you need to use the same cmsenv as above plus these packages:

pip3 install --upgrade pip
pip3 install rhalphalib
cd /path/to/your/local/boostedHWW/repo
pip3 install -e .

Create Datacards

This requires root==6.22.6 and rhalphalib to be installed. Enter ./combine and run:

python3 -u ./create_datacard.py --model-name nTFa_3_nTFb_6 --templates-dir ../postprocessing/templates/25Jun2024_sig_qcd --nTFa 3 --nTFb 6 --cards-dir .
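For context, a heavily hedged skeleton of what a rhalphalib datacard model looks like in general (the channel/sample names, binning, and contents below are placeholders; see create_datacard.py for the actual model, including the nTFa/nTFb transfer-function orders):

import numpy as np
import rhalphalib as rl

# Generic rhalphalib skeleton (all names and numbers are placeholders).
model = rl.Model("exampleModel")
ch = rl.Channel("signalregion")
model.addChannel(ch)

bins = np.linspace(0, 200, 41)
template = (np.ones(40), bins, "msd")  # (sumw, binning, observable name)
sig = rl.TemplateSample("signalregion_hww", rl.Sample.SIGNAL, template)
ch.addSample(sig)
ch.setObservation((np.ones(40), bins, "msd"))

model.renderCombine("./cards/example")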

Run fits and diagnostics locally

All fits are run via the script below, which takes a number of options (see the script):

./combine/scripts/run_blinded.sh --workspace --bfit --limits

Perform GoF

Via the scripts ./combine/scripts/run_ftest.sh and ./combine/scripts/run_blinded.sh.
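For intuition, here is a hedged sketch of the F-test logic commonly used to choose transfer-function orders (all numbers are placeholders; run_ftest.sh derives the actual inputs from the combine goodness-of-fit results):

import scipy.stats

# Placeholder inputs: fit chi2-like statistics and parameter counts for two nested TF orders.
chi2_lo, p_lo = 105.0, 3   # lower-order transfer function
chi2_hi, p_hi = 98.0, 6    # higher-order transfer function
nbins = 80                 # number of fitted bins

# Standard F-statistic for nested models.
f_stat = ((chi2_lo - chi2_hi) / (p_hi - p_lo)) / (chi2_hi / (nbins - p_hi))
p_value = 1.0 - scipy.stats.f.cdf(f_stat, p_hi - p_lo, nbins - p_hi)
print(f"F = {f_stat:.2f}, p-value = {p_value:.3f}")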
