echidna.scripts package

Submodules

echidna.scripts.combine_hdf5 module

Example combining script

This script:
  • Reads in multiple spectra from hdf5
  • Combines the spectra into one spectrum
  • Dumps spectrum to combined.hdf5

Examples

To combine the hdf5 files example1.hdf5 and example2.hdf5:

$ python echidna/scripts/combine_hdf5.py -f /path/to/example1.hdf5
  /path/to/example2.hdf5

This will create the hdf5 file combined.hdf5. There is no limit to the number of files you can combine.
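
For instance, to combine three files (the third path here is purely illustrative), the extra path is simply appended to the -f argument list:

$ python echidna/scripts/combine_hdf5.py -f /path/to/example1.hdf5
  /path/to/example2.hdf5 /path/to/example3.hdf5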

echidna.scripts.dump_scaled module

Example scaling script

This script:
  • Reads in mc spectra from hdf5
  • Scales spectra
  • Scaled spectrum is saved to the same directory with _scaled added to the file name

Examples

To scale energy_mc of the hdf5 file example.hdf5 by a factor 1.1:

$ python dump_scaled.py -d energy_mc -f /path/to/example.hdf5 -s 1.1

This will create the scaled hdf5 file /path/to/example_scaled.hdf5.

echidna.scripts.dump_shifted module

Example shifting script

This script:
  • Reads in mc spectra from hdf5
  • Shifts spectra
  • Shifted spectrum is saved to the same directory with _shifted added to the file name

Examples

To shift energy_mc of the hdf5 file example.hdf5 by 0.1 MeV:

$ python dump_shifted.py -d energy_mc -f /path/to/example.hdf5 -s 0.1

This will create the shifted hdf5 file /path/to/example_shifted.hdf5.

echidna.scripts.dump_smeared module

Example smearing script

This script:
  • Reads in mc spectra from hdf5
  • Smears spectra; the default is the weighted Gaussian method, but the random Gaussian method can also be specified via the command line
  • Smeared spectrum is saved to the same directory with _smeared added to the file name

Examples

To smear hdf5 file example.hdf5 using the random Gaussian method:

$ python dump_smeared.py --smear_method "random" /path/to/example.hdf5

This will create the smeared hdf5 file /path/to/example_smeared.hdf5.

Note

Valid smear methods include:

  • “weight”, default
  • “random”
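
For comparison, a sketch of an equivalent run that names the default weighted method explicitly (the file path is illustrative):

$ python dump_smeared.py --smear_method "weight" /path/to/example.hdf5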

echidna.scripts.dump_spectra_ntuple module

Example script to create spectrum objects from an ntuple file and store
them in hdf5 format.

This script:
  • Reads in ntuple file of background / signal isotope
  • Creates and fills spectra objects with mc and reconstructed information
  • Plots Energy, radius and time dimensions of spectra object
  • Saves spectra objects to file in hdf5 format

Examples

To read the ntuple file “file.ntuple.root” with config file cnfg.yml:

$ python dump_spectra_ntuple.py /path/to/cnfg.yml /path/to/file.ntuple.root

This will create a corresponding hdf5 file in the current directory. To specify a save directory, include a -s flag followed by the path to the required save destination.
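
For example, a hypothetical invocation with a save directory might look like:

$ python dump_spectra_ntuple.py /path/to/cnfg.yml /path/to/file.ntuple.root
  -s /path/to/save/dir/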

echidna.scripts.dump_spectra_ntuple.plot_spectrum(spec, config)[source]

Plot spectra for each of the spectrum dimensions (e.g. energy)

Parameters:spec (echidna.core.spectra.Spectra) – Spectrum object to be plotted.
echidna.scripts.dump_spectra_ntuple.read_and_dump_ntuple(fname, config_path, spectrum_name, save_path, bipo, fv_radius, outer_radius)[source]

Creates both mc and reco spectra from ntuple files, dumping the
results as a spectrum object in an hdf5 file.

Parameters:
  • fname (str) – The file to be evaluated
  • config_path (str) – Path to the config file
  • spectrum_name (str) – Name to be applied to the spectrum
  • save_path (str) – Path to a directory where the hdf5 files will be dumped
  • bipo (bool) – Apply BiPo cuts when extracting data if True.
  • fv_radius (float) – Cut events outside the fiducial volume of this radius.
  • outer_radius (float) – Used for calculating the radial3 parameter. See echidna.core.dsextract for details.

Raises:

ValueError – If outer_radius is not None and radial3 is not in the config.
echidna.scripts.dump_spectra_ntuple.read_tab_delim_file(fname)[source]

Read file paths from text file.

Parameters:fname (str) – Name of file to be read.
Returns:List of file paths read from file
Return type:list

echidna.scripts.dump_spectra_root module

Example script to create spectrum objects from rat_ds root files
and store them in hdf5 format.

This script:
  • Reads in rat_ds root file of background / signal isotope
  • Creates and fills spectra objects with mc and reconstructed information
  • Plots Energy, radius and time dimensions of spectra object
  • Saves spectra objects to file in hdf5 format

Examples

To read rat_ds root file “example.root”:

$ python dump_spectra_root.py /path/to/config.yml /path/to/example.root

This will create the hdf5 file ./example.hdf5. To specify a save directory, include a -s flag followed by the path to the required save destination.
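
For example (the save directory path is purely illustrative):

$ python dump_spectra_root.py /path/to/config.yml /path/to/example.root
  -s /path/to/save/dir/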

echidna.scripts.dump_spectra_root.plot_spectrum(spec, config)[source]

Plot spectra for each of the spectrum dimensions (e.g. energy)

Parameters:spec (echidna.core.spectra.Spectra) – Spectrum object to be plotted.
echidna.scripts.dump_spectra_root.read_and_dump_root(fname, config_path, spectrum_name, save_path, bipo, fv_radius, outer_radius)[source]

Creates both mc and reco spectra from ROOT files, dumping the results as a spectrum object in an hdf5 file

Parameters:
  • fname (str) – The file to be evaluated
  • config_path (str) – Path to the config file
  • spectrum_name (str) – Name to be applied to the spectrum
  • save_path (str) – Path to a directory where the hdf5 files will be dumped
  • bipo (bool) – Apply BiPo cuts when extracting data if True.
  • fv_radius (float) – Cut events outside the fiducial volume of this radius.
  • outer_radius (float) – Used for calculating the radial3 parameter. See echidna.core.dsextract for details.
echidna.scripts.dump_spectra_root.read_tab_delim_file(fname)[source]

Read file paths from text file

Parameters:fname (str) – Name of file to be read.
Returns:List of file paths read from file
Return type:list

echidna.scripts.klz_majoron_limits module

KamLAND-Zen (plot-grab) Majoron limits script

This script:

  • Sets 90% confidence limit on the Majoron-emitting neutrinoless double beta decay modes (with spectral indices n = 1, 2, 3 and 7), using plot-grabbed data from KamLAND-Zen.

Examples

To use simply run the script and supply a YAML file detailing the spectra (data, fixed, floating) to load:

$ python klz_majoron_limits.py --from_file klz_majoron_limits_config.yaml

The --upper_bound and --lower_bound flags can be used to return an estimate of the error introduced through the plot-grabbing process.
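
For instance, a sketch of a run returning the upper-bound estimate, assuming --upper_bound is a simple switch that takes no value:

$ python klz_majoron_limits.py --from_file klz_majoron_limits_config.yaml
  --upper_bound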

Note

An example config would be:

data:
    data/klz/v1.0.0/klz_data.hdf5
fixed:
    {data/klz/v1.0.0/total_b_g_klz.hdf5: 26647.1077395}
floating:
    [data/klz/v1.0.0/Xe136_2n2b_fig2.hdf5]
signals:
    {
        klz_n1: data/klz/v1.0.0/Xe136_0n2b_n1_fig2.hdf5,
        klz_n2: data/klz/v1.0.0/Xe136_0n2b_n2_fig2.hdf5,
        klz_n3: data/klz/v1.0.0/Xe136_0n2b_n3_fig2.hdf5,
        klz_n7: data/klz/v1.0.0/Xe136_0n2b_n7_fig2.hdf5}
roi:
    energy:
        !!python/tuple [1.0, 3.0]
per_bin:
    true
store_summary:
    true
class echidna.scripts.klz_majoron_limits.ReadableDir(option_strings, dest, nargs=None, const=None, default=None, type=None, choices=None, required=False, help=None, metavar=None)[source]

Bases: argparse.Action

Custom argparse action

Adapted from http://stackoverflow.com/a/11415816

Checks that hdf5 files supplied via command line exist and can be read

echidna.scripts.klz_majoron_limits.main(args)[source]

The limit setting script.

Parameters:args (argparse.Namespace) – Arguments passed via command line

echidna.scripts.multi_ntuple_spectrum module

Example script to create a single spectrum file from multiple ntuple files

This script:
  • Reads all ntuple files in a directory
  • For each ntuple, both mc and reconstructed spectra are created and saved as hdf5 files in dedicated mc / reco directories (themselves automatically generated)
  • Summed spectra containing the information from all ntuples are created for both mc and reconstructed data sets in a dedicated “Summed” directory. Spectra are saved in hdf5 format.

Examples

To read all ntuples in a directory and save both individual and summed spectra to file with config cnfg.yml:

$ python multi_ntuple_spectrum.py /path/to/cnfg.yml /path/to/ntuple/direc/

To specify a save directory, include a -s flag followed by path to the required save destination.
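
For example, with a hypothetical save directory:

$ python multi_ntuple_spectrum.py /path/to/cnfg.yml /path/to/ntuple/direc/
  -s /path/to/save/dir/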

echidna.scripts.multi_ntuple_spectrum.create_combined_ntuple_spectrum(data_path, config_path, bkgnd_name, save_path, bipo, fv_radius, outer_radius)[source]

Creates mc, truth and reco spectra from a directory containing background ntuples, dumping the results as a spectrum object in an hdf5 file.

Parameters:
  • data_path (str) – Path to directory containing the ntuples to be evaluated
  • config_path (str) – Path to config file
  • bkgnd_name (str) – Name of the background being processed
  • save_path (str) – Path to a directory where the hdf5 files will be dumped
  • bipo (bool) – Apply BiPo cuts when extracting data if True.
  • fv_radius (float) – Cut events outside the fiducial volume of this radius.
  • outer_radius (float) – Used for calculating the radial3 parameter. See echidna.core.dsextract for details.
echidna.scripts.multi_ntuple_spectrum.plot_spectrum(spec, config)[source]

Plot spectra for each of the spectrum dimensions (e.g. energy)

Parameters:spec (echidna.core.spectra.Spectra) – Spectrum object to be plotted.
Returns:None

echidna.scripts.plot_summary module

Example script to plot the data stored in a floating background
summary hdf5.

This script:
  • Reads in a summary hdf5
  • Plots the total test statistic, penalty term, best-fit value, and the best-fit value expressed as the number of sigma from the prior value, each versus signal scale.

Examples

To plot the summary hdf5 “file.hdf5”:

$ python plot_summary.py -f /path/to/file.hdf5

This will create the following plots: ./file_stat.png, ./file_best_fit.png, ./file_sigma_best_fit.png and ./file_penalty_term.png. To specify a save directory, include a -s flag followed by the path to the required save destination. To plot to screen, use the --graphical flag.
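
For example, a sketch combining both options (the save path is illustrative):

$ python plot_summary.py -f /path/to/file.hdf5 -s /path/to/save/dir/ --graphical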

echidna.scripts.rebin module

Example rebinning script

This script:
  • Reads in spectrum from hdf5
  • Rebins spectrum with user input number of bins
  • Saves rebinned spectrum to given output or, by default, to the same directory with _rebin added to the filename.

Examples

To rebin the hdf5 file example.hdf5 (3 dimensions with 1000 bins in each) to 500 bins in the first dimension, saving the spectrum to example2.hdf5:

$ python rebin.py -i /path/to/example.hdf5 -o /path/to/example2.hdf5
  -b 500 1000 1000

This will create the rebinned hdf5 file /path/to/example2.hdf5.
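
If the -o flag is omitted, the rebinned spectrum is instead written next to the input with _rebin added to the file name; a sketch of the same rebinning:

$ python rebin.py -i /path/to/example.hdf5 -b 500 1000 1000

This would presumably produce /path/to/example_rebin.hdf5.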

echidna.scripts.zero_nu_limit module

*CURRENTLY NOT WORKING*

Example limit setting script

This script provides an example of how to use the limit setting tools, built into echidna, to set a 90% confidence limit on neutrinoless double beta decay.

The numbers used in scaling the signal/backgrounds should set a reasonable limit, but are not necessarily the optimum choice of parameters.

Note that this script assumes the user has already made a fiducial volume cut when creating the spectra and that the energy parameter is “energy_mc” for all spectra.

Examples

To use simply run the script:

$ python zero_nu_limit.py -s /path/to/signal.hdf5 -t /path/to/2n2b.hdf5
  -b /path/to/B8_Solar.hdf5

Note

Use the -v option to print out progress and timing information
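
For instance, the example above with verbose output enabled (a sketch only, given the script is currently flagged as not working):

$ python zero_nu_limit.py -s /path/to/signal.hdf5 -t /path/to/2n2b.hdf5
  -b /path/to/B8_Solar.hdf5 -v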

class echidna.scripts.zero_nu_limit.ReadableDir(option_strings, dest, nargs=None, const=None, default=None, type=None, choices=None, required=False, help=None, metavar=None)[source]

Bases: argparse.Action

Custom argparse action

Adapted from http://stackoverflow.com/a/11415816

Checks that hdf5 files supplied via command line exist and can be read

Module contents