----------------------- README for macro/dayone17/fastsim -----------------------

A set of macros to run some day-one FastSim studies for the channels

  - pbp -> phi phi @ 2.3 GeV
  - pbp -> J/psi pi+ pi-; J/psi -> e+ e-   @ 3.872 GeV
  - pbp -> J/psi pi+ pi-; J/psi -> mu+ mu- @ 3.872 GeV
  - pbp -> e+ e- @ p = 1.5 GeV/c

Everything is run for the two setups

  - Setup A: Full with 75% GEM, 35% FTS, no EMC Barrel Xtals, no TOF Fwd, no MVD Pixels
  - Setup B: Full with 75% GEM, 35% FTS, no EMC Barrel Xtals, no TOF Fwd, no MVD Pixels,
             no Muon, no Shashlyk

as a function of the EMC barrel crystal coverage, reduced in polar angle theta:

---------------------------------------------------------------------------------
 Supermodule           |     1 |     2 |     3 |     4 |     5 |     6 |     7 |
 num. alveoles in theta|     1 |     3 |     3 |     3 |     3 |     3 |     2 |
 coverage from 22 to   | 140.0 | 133.4 | 113.8 |  94.1 |  74.4 |  54.8 |  35.1 |
---------------------------------------------------------------------------------

############
 CONTENTS
############

  - Files in directory
  - Simulation and analysis
  - Data production
  - Submit to cluster (general info)

######################
 FILES IN DIRECTORY
######################

Simulation
  - prod_fsim.C          # run the simulation

Analysis macros
  - prod_ana.C           # run the predefined analysis
  - quickana.C           # run a quick analysis
  - quickfsimana.C       # run fast sim and analysis in one go

Decay files (in sub-directory decfiles/)
  - pp_2phi.dec                    # pbp -> 2phi, phi -> K+ K-
  - pp_4pi.dec                     # pbp -> 2pi+ 2pi-
  - pp_4pipi0.dec                  # pbp -> 2pi+ 2pi- pi0
  - pp_ee.dec                      # pbp -> e+ e-
  - pp_eepi0.dec                   # pbp -> e+ e- pi0
  - pp_Jpsi2pi_Jpsi_ee.dec         # pbp -> J/psi pi+ pi-, J/psi -> e+ e-
  - pp_Jpsi2pi_Jpsi_ee_nophot.dec  # pbp -> J/psi pi+ pi-, J/psi -> e+ e- (PHOTOS off)
  - pp_Jpsi2pi_Jpsi_ll.dec         # pbp -> J/psi pi+ pi-, J/psi -> e+ e-/mu+ mu-
  - pp_Jpsi2pi_Jpsi_mm.dec         # pbp -> J/psi pi+ pi-, J/psi -> mu+ mu-
  - pp_Jpsi2pi_Jpsi_mm_nophot.dec  # pbp -> J/psi pi+ pi-, J/psi -> mu+ mu- (PHOTOS off)

Job scripts for Kronos
  - jobfsim_kronos.sh    # submit simulation (prod_fsim.C) and optional analysis jobs (prod_ana.C)
  - jobana_kronos.sh     # submit analysis-only jobs (prod_ana.C; used by anasub.pl)
  - jobquickfa_kronos.sh # submit quickfsimana.C jobs
  - job_rootmacro.sh     # submit an arbitrary ROOT macro

Perl scripts
  - resub.pl             # re-submit jobs
  - resub_arr.pl         # re-submit jobs (as arrays if possible)
  - anasub.pl            # submit analysis jobs

Job file
  - d1study.jobs         # contains all submit commands for the data production

#########################
 SIMULATION AND ANALYSIS
#########################

--------------------
 Simulation
--------------------

> root -l -b -q 'prod_fsim.C("<prefix>", <nevt>, "<gen>", <mom>, "<opt>")'

with the parameters
  - Prefix for output files
  - Number of events to simulate
  - Generator setting (see above)
  - Beam momentum in GeV/c or, if < 0, interpreted as -E_cm
  - Option string like e.g. "SetupB:Filter2:EMC[22-120]"
    + SetupA/SetupB as described above
    + Filter0 [J/psi -> e+ e-], Filter1 [J/psi -> mu+ mu-],
      Filter2 [2 phi -> K+ K-], Filter3 [pbp -> e+ e-]
    + EMC[x-y] defines the theta range x < theta[deg] < y, with x and y being integer numbers

Examples:
> root -l -b -q 'prod_fsim.C("sig_2phi_A_EMC1",10000,"decfiles/pp_2phi.dec",-2.3,"SetupA:Filter2:EMC[22-140]")'
> root -l -b -q 'prod_fsim.C("bkg_2phi_B_EMC3",10000,"DPM",-2.3,"SetupB:Filter2:EMC[22-114]")'

--------------------
 Analysis
--------------------

> root -l -b -q 'prod_ana.C("<prefix>", <min>, <max>, <mode>, <nevt>)'

***                                                           ***
***  The macro has been pre-configured for reconstruction     ***
***  of the modes above, triggered by keywords in the prefix  ***
***                                                           ***

Parameters:
  - Prefix for output files, or a ROOT file name
    + if it contains 'phi',  reco = "phi -> K+ K-; pbp -> phi phi"
    + if it contains 'Jee',  reco = "J/psi -> e+ e-; pbp -> J/psi pi+ pi-"
    + if it contains 'Jmm',  reco = "J/psi -> mu+ mu-; pbp -> J/psi pi+ pi-"
    + if it contains 'ppee', reco = "pbp -> e+ e-"
  - Minimum run number
  - Maximum run number
  - Arbitrary mode number; default = 0. Stored in the TTree produced by PndSimpleCombinerTask.
  - Number of events to analyze; 0 (default) = all events

Examples:
> root -l -b -q 'prod_ana.C("sig_2phi_A_EMC1_fsim.root")'
> root -l -b -q 'prod_ana.C("data/bkg_Jee_B_EMC3",1,20,100)'

--------------------------
 Submitting analysis jobs
--------------------------

Submitting the analysis macro above as a job to Kronos (e.g. to split the work
into several jobs) can be done from macro/prod by

-> sbatch jobana_kronos.sh <prefix> [min] [max] [mode]

***                                                           ***
***  prod_ana.C has to be configured for reconstruction, and  ***
***  the fastsim flag has to be set properly.                 ***
***                                                           ***

Parameters:
  - Prefix for output files
  - Run from this job number (default: 1)
  - ... to this one (default: 20)
  - Arbitrary mode number; default = 0. Stored in the TTree produced by PndSimpleCombinerTask.

Example:
> sbatch jobana_kronos.sh bkg_Jee_B_EMC3 1 20

----------------------------------
 Split into multiple analysis jobs
----------------------------------

In case a large number of output files has to be analysed, so that it is
reasonable to split the work into several jobs, the Perl script 'anasub.pl'
is useful. It works for both full and fast simulation input.

-> ./anasub.pl <prefix> [mode] [num] [min] [max]

Parameters:
  <prefix> : prefix of the output file names matching the pattern 'data/<prefix>_<run>_pid.root';
             if 'check_' is prepended, the commands are only printed.
             If instead the name of a file 'some_name.jobs' containing many sbatch
             commands is given, analysis submission is performed for all of them.
  [mode]   : mode number to be written to the TTree; default: 0.
             mode = -1 counts up the mode for each line if a .jobs file name is given as input.
  [num]    : number of runs per job to be analysed; default: 50
  [min]    : first run number; if omitted, all found files are considered
  [max]    : last run number; if omitted, all runs starting from [min] are considered

When the parameters [min] and [max] are not given, the script searches for all
files with the according pattern and determines the limiting run numbers automatically.
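The bunching that anasub.pl performs can be sketched in plain shell. This is a
simplified illustration only, not the actual script logic; the function name
split_jobs is hypothetical, and the real script additionally scans the data/
directory for existing files:

```shell
# Sketch: split runs [min, max] into bunches of <num> runs and print one
# sbatch command per bunch (hypothetical helper, for illustration only).
split_jobs() {
    min=$1; max=$2; num=$3; prefix=$4
    start=$min
    while [ "$start" -le "$max" ]; do
        end=$((start + num - 1))
        [ "$end" -gt "$max" ] && end=$max
        echo "sbatch jobana_kronos.sh $prefix $start $end"
        start=$((end + 1))
    done
}

# Corresponds to: ./anasub.pl bkg_Jee_B_EMC3 0 25 201 400
split_jobs 201 400 25 bkg_Jee_B_EMC3
```

For runs 201 to 400 with 25 runs per job this prints eight sbatch commands,
covering 201-225 up to 376-400.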
Examples:
> ./anasub.pl check_bkg_Jee_B_EMC3         // prints all submit commands for the available files
> ./anasub.pl bkg_Jee_B_EMC3 0 20          // submits analysis jobs in bunches of 20 inputs each
> ./anasub.pl bkg_Jee_B_EMC3 0 25 201 400  // submits jobs in bunches of 25 for runs 201 to 400
> ./anasub.pl d1study.jobs -1              // submits jobs according to all sbatch commands
                                           // in the file d1study.jobs; the run mode is
                                           // counted up automatically

########################
 DATA PRODUCTION
########################

--------------------
 Initial production
--------------------

Simply source the file d1study.jobs by

> . d1study.jobs

or paste individual lines from the file into the shell prompt, e.g.

> sbatch -a1-10 jobfsim_kronos.sh M2phi_A_EMC3 10000 decfiles/pp_2phi.dec -2.3 SetupA:Filter2:EMC[22-114] ana 2

--------------------
 Re-submission
--------------------

Check for failed jobs by

> ./resub_arr.pl d1study.jobs check

Failed jobs can then be re-submitted by running the same command without the 'check' argument:

> ./resub_arr.pl d1study.jobs

#####################
 SUBMIT TO CLUSTER
#####################

General information about submitting jobs on the Prometheus cluster can be found here:

  https://panda-wiki.gsi.de/foswiki/bin/view/Computing/PandaRootSimulationPrometheus
  http://wiki.gsi.de/cgi-bin/view/Linux/GridEngine

** For the Kronos cluster take a look here: **

  https://panda-wiki.gsi.de/foswiki/bin/view/Computing/PandaRootSimulationKronos
  https://wiki.gsi.de/foswiki/bin/view/Linux/KronosCluster
  https://wiki.gsi.de/foswiki/bin/view/Linux/SlurmUsage

**********************************************************************************************
***  In the following, only the actions to be taken for KRONOS are given, since Prometheus ***
***  will be decommissioned soon. Refer to an older revision of this README file for       ***
***  information about Prometheus usage.                                                   ***
**********************************************************************************************

---------------
 Prerequisites
---------------

BEFORE being able to submit jobs to Kronos, you have to

1. Log in to an lxbk machine by

     ssh yourname@kronos.hpc.gsi.de

   locally from an lxpool.gsi.de machine.

2. Create a work directory on the data storage element NYX, e.g.

     /lustre/nyx/panda/yourname/pandaroot

3. Install PandaRoot in that place, since your GSI Linux home directory is not
   mounted on the Kronos nodes. There are several versions of the external
   packages and of FairRoot pre-installed under /cvmfs. Take a look with

     > ls /cvmfs/fairroot.gsi.de/fairsoft/
     > ls /cvmfs/fairroot.gsi.de/fairroot/

   and set the environment variable SIMPATH to an appropriate FairSoft version
   and FAIRROOTPATH to a corresponding FairRoot version before compiling PandaRoot.

-------------------------------
 Most important slurm commands
-------------------------------

Some nicely condensed information about slurm can be found at

  https://rc.fas.harvard.edu/resources/documentation/convenient-slurm-commands/

- Job array submission with a time limit and a partition (something like a queue) setting:

  > sbatch --time=01:00:00 --partition=debug --array=1-10 my_slurm_job_script.sh
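If the same time limit, partition and array range are reused for many submissions,
the sbatch call above can be wrapped in a small shell helper. The function name
submit_array is hypothetical and not part of this directory; the sketch only
prints the command line, so pipe its output to 'sh' to actually submit:

```shell
# Hypothetical helper: assemble the sbatch command line from a time limit,
# partition, array range and job script. Prints the command instead of
# executing it, so it can be inspected (or piped to 'sh') first.
submit_array() {
    echo "sbatch --time=$1 --partition=$2 --array=$3 $4"
}

# Reproduces the example above:
submit_array 01:00:00 debug 1-10 my_slurm_job_script.sh
```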