Search for a boosted (high transverse momentum) Higgs boson (H) decaying to two W bosons, with the decay products contained in a single large-radius jet. The majority of the analysis uses the nanoAOD-tools, coffea, and scikit-hep Python libraries to process the input tree-based NanoAOD files.
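For orientation, below is a minimal sketch of reading NanoAOD with coffea and selecting high-pT large-radius jets; the file name and the 400 GeV threshold are placeholders, not this analysis's inputs or selection.

```python
# Minimal sketch (coffea 0.7-style API): open a NanoAOD file and look at
# large-radius (FatJet) kinematics. "nano.root" and the pT cut are placeholders.
from coffea.nanoevents import NanoEventsFactory, NanoAODSchema

events = NanoEventsFactory.from_root(
    "nano.root",
    schemaclass=NanoAODSchema,
).events()

fatjets = events.FatJet
boosted = fatjets[fatjets.pt > 400]  # candidate boosted H->WW jets
print(boosted.pt)
```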
First, we run a set of processors to produce the flat tree files that are ultimately used in the analysis.
MiniAOD to Ntuple (e.g., submit Condor jobs on CMS Connect)
In preprocessing/nano_to_ntuple, the XWWNano framework is used to produce ntuple files from customized NanoAOD. Note that MiniAOD is used initially to derive the customized NanoAOD, which implements the ParticleTransformer taggers.
- Update the VOMS proxy certificate:
voms-proxy-init -voms cms -valid 192:00
- Create JSON files with the official MiniAOD paths from DAS (a sketch of this step is shown after this list), e.g.,
python condor.py --DAS DAS_2018_Signal --Filesjson "./json/2018_HWW_Signal.json" --createfilejson
- Create condor job files, e.g.,
python condor.py --DAS DAS_2018_Signal --Filesjson "./json/2018_HWW_Signal.json" --outputPath "/ospool/cms-user/yuzhe/NtupleStore/V3/2018/Signal" --year 2018 --excutable "exe_UL18_NanoNtupleChain.sh" --TaskFolder "production/NanoNtupleChain_16_Feb_2024" --submitsh "NanoNtupleChain_16_Feb_2024.sh" --Condor --AddtionalArgs "-a '-o ./ -M HWW -m --year 2018'"
- Submit condor jobs.
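As a rough illustration of the --createfilejson step above, the sketch below queries DAS with dasgoclient and writes a dataset-to-files map; the dataset path and output name are placeholders, not what condor.py actually produces.

```python
import json
import subprocess

# Hypothetical sketch: ask DAS (via dasgoclient) for the files of each MiniAOD
# dataset and store them in a JSON map. Dataset path and output name are placeholders.
datasets = ["/HWW_example/RunIISummer20UL18MiniAODv2-Example/MINIAODSIM"]

files = {}
for ds in datasets:
    out = subprocess.run(
        ["dasgoclient", "-query", f"file dataset={ds}"],
        capture_output=True, text=True, check=True,
    )
    files[ds] = out.stdout.split()

with open("2018_HWW_Signal.json", "w") as f:
    json.dump(files, f, indent=2)
```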
In preprocessing/ntuple_to_tree, a C++/ROOT framework is used to further convert the ntuples into flat trees. The Higgs candidate jet is also selected in this step.
- Set up environment:
source /cvmfs/cms.cern.ch/cmsset_default.sh
cmsrel CMSSW_10_6_27
cd CMSSW_10_6_27/src
cmsenv
- To compile, run:
root -l
.L EDBR2PKUTree.C++
- Convert the ntuple files to tree files:
python TransMergedMC.py
In preprocessing/tree_to_slimmed, the tree files are further cleaned; some scale factors (SFs), e.g., HLT trigger SFs, can also be applied in this step.
- Set up environment:
conda env create -f slim.yml
conda activate slim
- Convert the tree files to slimmed tree files:
python runSlimmedTree.py
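As a rough sketch of how a per-event SF could be applied while slimming (the actual SFs, tree name, and branch names live in runSlimmedTree.py), assuming placeholder names and toy SF values:

```python
import numpy as np
import uproot

# Hypothetical sketch of applying an HLT trigger SF while slimming a tree.
# Tree name, branch names, and SF values are placeholders.
with uproot.open("tree.root") as fin:
    arrays = fin["PKUTree"].arrays(["jet_pt", "weight"], library="np")

trig_sf = np.where(arrays["jet_pt"] > 500.0, 0.98, 0.95)  # toy pT-dependent SF
arrays["weight"] = arrays["weight"] * trig_sf

with uproot.recreate("slimmed.root") as fout:
    fout["PKUTree"] = {"jet_pt": arrays["jet_pt"], "weight": arrays["weight"]}
```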
After processing the NanoAOD files, distributions of various variables (e.g., the jet mass) can be plotted. The histograms can also be saved as .pkl templates and reused later (a sketch is shown after the notebook list below).
- Use ./postprocessing/makeplots.ipynb to make the plots.
- Use ./postprocessing/templates.ipynb to produce the .pkl templates.
- Use ./postprocessing/variation.ipynb for the variations.
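A minimal sketch of saving and reloading .pkl histogram templates with the scikit-hep hist library; the binning, values, and names are illustrative only, not the analysis choices.

```python
import pickle
import numpy as np
import hist

# Fill a toy jet-mass histogram and save it as a .pkl template.
h = hist.Hist.new.Reg(50, 0, 250, name="mass", label="Jet mass [GeV]").Weight()
h.fill(mass=np.random.normal(125, 20, 10_000), weight=np.ones(10_000))

with open("templates.pkl", "wb") as f:
    pickle.dump({"jet_mass": h}, f)

# Later, load the template back for plotting or datacard building.
with open("templates.pkl", "rb") as f:
    templates = pickle.load(f)
print(templates["jet_mass"].sum())
```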
Set up the CMSSW environment with Combine and CombineHarvester:
source /cvmfs/cms.cern.ch/cmsset_default.sh
cmsrel CMSSW_11_3_4
cd CMSSW_11_3_4/src
cmsenv
git clone -b v9.2.0 https://github.com/cms-analysis/HiggsAnalysis-CombinedLimit.git HiggsAnalysis/CombinedLimit
git clone -b v2.0.0 https://github.com/cms-analysis/CombineHarvester.git CombineHarvester
scramv1 b clean; scramv1 b
To create datacards, use the same cmsenv as above plus these packages:
pip3 install --upgrade pip
pip3 install rhalphalib
cd /path/to/your/local/boostedHWW/repo
pip3 install -e .
You need root==6.22.6 and rhalphalib installed. Enter ./combine and run, e.g.,
python3 -u ./create_datacard.py --model-name nTFa_3_nTFb_6 --templates-dir ../postprocessing/templates/25Jun2024_sig_qcd --nTFa 3 --nTFb 6 --cards-dir .
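For orientation, a minimal rhalphalib model is sketched below; the observable, binning, and templates are placeholders, while the actual model (including the nTFa/nTFb transfer-factor orders) is defined in create_datacard.py.

```python
import numpy as np
import rhalphalib as rl

# Hypothetical minimal rhalphalib model; the real model (transfer factors,
# systematics, regions) is built in create_datacard.py.
msd = rl.Observable("msd", np.linspace(40, 200, 17))  # 16 toy jet-mass bins

model = rl.Model("hwwModel")
ch = rl.Channel("passCat")
model.addChannel(ch)

# Templates are (sum of weights, bin edges, observable name) tuples.
sig = (np.ones(16), msd.binning, msd.name)
bkg = (10.0 * np.ones(16), msd.binning, msd.name)
ch.addSample(rl.TemplateSample("passCat_hww", rl.Sample.SIGNAL, sig))
ch.addSample(rl.TemplateSample("passCat_qcd", rl.Sample.BACKGROUND, bkg))
ch.setObservation((11.0 * np.ones(16), msd.binning, msd.name))

model.renderCombine("example_cards")  # writes datacards and workspace inputs
```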
Workspace creation, the background-only fit, and the limits can all be run via the script below, which takes a number of options (see the script):
./combine/scripts/run_blinded.sh --workspace --bfit --limits
F-tests: run via ./combine/scripts/run_ftest.sh and ./combine/scripts/run_blinded.sh.