# HiPeRTA_stream

Python library to manage the HiPeRTA C++ library and the different tools associated with it.
- [Steps to run the RTA](#1-steps-to-run-the-rta)
- [Steps to stream ZFITS or HDF5 data](#2-steps-to-stream-simulated-data)
- [Use the HiPeRTA chain with Real Data](#3-use-the-hiperta-chain-with-real-data)
---------------------------
## 1. Steps to run the RTA
1. Create the LST calibration files (lstchain scripts), [go to section](#11-create-the-lst-calibration-files).
2. Retrieve the pixel order of the camera, [go to section](#12-retrieve-the-pixel-order-of-the-camera).
3. Create the h5 file containing the basic observing information, [go to section](#13-create-hdf5-file-containing-basic-observing-information):
    - Gain and pedestal info of the run
    - The injection table with the pixel order
    - The general `/configuration` node (pixel position, geometry...)
4. Run the RTA command, providing the RTA configuration YAML file, [go to section](#14-launch-the-rta).
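All the run numbers in this guide follow the same fixed R0 naming scheme (`LST-1.1.RunXXXXX.YYYY.fits.fz`). As a sketch, a small hypothetical helper (not part of HiPeRTA) that reproduces it:

```python
# Hypothetical helper, NOT part of HiPeRTA: reproduces the R0 file
# naming scheme used throughout this guide.
def r0_filename(run: int, subrun: int = 0, tel: str = "LST-1.1") -> str:
    # Run number is zero-padded to 5 digits, sub-run to 4.
    return f"{tel}.Run{run:05d}.{subrun:04d}.fits.fz"

print(r0_filename(3944))  # -> LST-1.1.Run03944.0000.fits.fz
```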
#### 1.1 Create the LST calibration files
1. Run the `lstchain_data_create_drs4_pedestal_file` script.
2. Run the `lstchain_create_calibration_file` script.
Example with:
- DRS4 pedestal run = 3944
- Calibration run = 3945
```bash
$ lstchain_data_create_drs4_pedestal_file \
    --input-file /fefs/aswg/data/real/R0/20210310/LST-1.1.Run03944.0000.fits.fz \
    --output-file /fefs/aswg/workspace/thomas.vuillaume/data/real/calibration/20210310/v0.6.3/drs4_LST-1.1.Run03944.0000.fits
```
And
```bash
$ lstchain_create_calibration_file \
    --input_file=/fefs/aswg/data/real/R0/20210310/LST-1.1.Run03945.0000.fits.fz \
    --output_file=/fefs/aswg/workspace/thomas.vuillaume/data/real/calibration/20210310/v0.6.3/calibration_Run03945.h5 \
    --pedestal_file=/fefs/aswg/workspace/thomas.vuillaume/data/real/calibration/20210310/v0.6.3/drs4_LST-1.1.Run03944.0000.fits \
    --config=/fefs/home/thomas.vuillaume/software/cta-observatory/cta-lstchain/lstchain/data/onsite_camera_calibration_param.json
```
The calibration file, needed in the following stage, will then be at:
`/fefs/aswg/workspace/thomas.vuillaume/data/real/calibration/20210310/v0.6.3/calibration_Run03945.h5`
#### 1.2 Retrieve the pixel order of the camera
The file created in this stage will be used in the next one.
```bash
$ stream_get_zfits_pix_order \
    --input /fefs/aswg/data/real/R0/20210310/LST-1.1.Run03944.0000.fits.fz \
    --output ./LST-1.1.Run03944.0000_pix_order_bin.npy
```
#### 1.3 Create HDF5 file containing basic observing information
```bash
$ stream_create_base_config \
    --config_file $CONDA_PREFIX/share/HiPeRTA/default_configuration.h5 \
    --pixel_order ./LST-1.1.Run03944.0000_pix_order_bin.npy \
    --gain_pedestal ./calibration_Run03945.h5 \
    --output_file base_structure_hdf5_3945.h5
```
- `--config_file` : Once `HiPeRTA` is installed, the default configuration should be at
  `~/miniconda3/envs/$CONDA_ENV_NAME/share/HiPeRTA/default_configuration.h5`
- `--pixel_order` : Pixel order computed in the previous stage with the `stream_get_zfits_pix_order` command.
  (Default: if `HiPeRTA` is installed, the default file would be at
  `~/miniconda3/envs/$CONDA_ENV_NAME/share/HiPeRTA/LST-1.1.Run00442.0000_pixel_order_bin.npy`.)
- `--gain_pedestal` : The SAME calibration file as the run to be processed, for example
  `/fefs/aswg/data/real/calibration/20201008/v05/calibration.Run2833.0000.hdf5` for the run `LST-1.1.Run2833.0000.fits.fz`.
- `--output_file` : Default: `base_structure_hdf5.h5`
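The default locations quoted above all live under the conda environment's `share/HiPeRTA/` directory. A minimal sketch of how those paths are composed (the `hiperta_share` helper is made up for illustration):

```python
import os

# Illustrative only: compose the default HiPeRTA share paths quoted
# above from the active conda environment ($CONDA_PREFIX).
def hiperta_share(filename, conda_prefix=None):
    prefix = conda_prefix or os.environ.get("CONDA_PREFIX", "")
    return os.path.join(prefix, "share", "HiPeRTA", filename)

print(hiperta_share("default_configuration.h5", "/home/user/miniconda3/envs/rta"))
# -> /home/user/miniconda3/envs/rta/share/HiPeRTA/default_configuration.h5
```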
A "normal" calibration file should contain the following:
```
$ pttree -L 5 /fefs/aswg/workspace/thomas.vuillaume/data/real/calibration/20210310/v0.6.3/calibration_Run03945.h5
------------------------------------------------------------
[...]
Mean compression ratio: 0.80
HDF5 file size: 451.0KiB
------------------------------------------------------------
```
You can also plot the calibration distributions to check that everything worked correctly.
```bash
$ stream_plot_r0_calib -i calibration_Run03945.h5
```
#### 1.4 Launch the RTA
```bash
$ cd /fefs/aswg/workspace/enrique.garcia/rta_stream_test/
$ hiperta_stream_start --config_file config_hiperta_stream.yml
```
---------------------------
## 2. Steps to stream simulated data
- 2.1 Using the ZFITS streamer, a.k.a. `DummyCameraServer`, by running a Singularity container.
- 2.2 Using the HDF5 streamer, a.k.a. `DummyFountainHDF5`, a HiPeRTA tool.
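Which streamer applies is decided by the `stream_read_zfit` flag in `config_hiperta_stream.yml` (see 2.1.1 and 2.2.1). A rough sketch of that decision; the function is illustrative, not part of the package:

```python
# Illustrative only: map the `stream_read_zfit` flag of
# config_hiperta_stream.yml to the matching dummy streamer.
def streamer_for(stream_read_zfit: bool) -> str:
    # true  -> ZFITS streaming (Singularity DummyCameraServer)
    # false -> HDF5 streaming (hiperta_stream_r0 / DummyFountainHDF5)
    return "DummyCameraServer" if stream_read_zfit else "DummyFountainHDF5"

print(streamer_for(True))  # -> DummyCameraServer
```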
### 2.1 Stream ZFITS data (DummyCameraServer - `streamSingularity.simg`)
1. Create basic common hdf5 configuration file used by all the stream files, [go to section 1.3](#13-create-hdf5-file-containing-basic-observing-information).
2. Launch the DummyCameraServer, [go to section](#212-launch-the-dummycameraserver).
3. Launch the hiperta_stream main script, [go to section](#213-launch-the-stream).
    - Modify the `config_hiperta_stream.yml` file (the full hiperta_stream configuration).
#### 2.1.1 Base hdf5 config
- Check that the following argument in `config_hiperta_stream.yml` is set to `true`:
    - `stream_read_zfit: true`
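A minimal sketch of the corresponding fragment of `config_hiperta_stream.yml` (only the `stream_read_zfit` key is confirmed here; leave the rest of your file untouched):

```yaml
# config_hiperta_stream.yml (fragment) -- only this key is confirmed here
stream_read_zfit: true   # ZFITS input: stream through the DummyCameraServer
```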
#### 2.1.2 Launch the DummyCameraServer
Example of working dir: `/fefs/aswg/workspace/enrique.garcia/rta_stream_test/`.
Data and the container must be present in the same directory. Singularity does not like symlinks (or we are not passing the correct command).
```bash
$ singularity run -B $PWD:/Data/ --app stream streamSingularity.simg /Data/LST-1.1.Run02960.0000.fits.fz 3390 3000
```
The positional arguments are the R0 file (bound inside the container under `/Data/`), the base port (`3390`) and the event rate in Hz (`3000`).
#### 2.1.3 Launch the stream
On the same machine (in two different terminals!):
```bash
$ cd /fefs/aswg/workspace/enrique.garcia/rta_stream_test/
$ hiperta_stream -c config_hiperta_stream.yml -d True
```
- `--config_file -c` : the YAML base config file
- `--debug_mode -d` : bool
### 2.2 Stream HDF5 data (`hiperta_stream_r0` - DummyFountainHDF5)
Example of working dir: `/fefs/aswg/workspace/enrique.garcia/rta_stream_test/`
#### 2.2.1 Modify `config_hiperta_stream.yml`
- Set `stream_read_zfit: false`.

**No need to create the base common hdf5 file, as all the HDF5 files use the same node structure.**
#### 2.2.2 Launch DummyFountainHDF5
`launchDummyFountainHDF5.sh` is already copied into the working dir.
```bash
$ cd /fefs/aswg/workspace/enrique.garcia/rta_stream_test/
$ ./launchDummyFountainHDF5.sh gamma_20deg_180deg_run9___cta-prod3-demo-2147m-LaPalma-baseline-mono_off0.4.h5
```
#### 2.2.3 Launch the stream
On the same machine (in two different terminals!):
```bash
$ cd /fefs/aswg/workspace/enrique.garcia/rta_stream_test/
$ hiperta_stream -c ~/config_hiperta_stream.yml -d True
```
---------------------------
## 3. Use the HiPeRTA chain with Real Data
#### 3.1 Convert the real data into H5
For the conversion you will need the `ctapipe_io_mchdf5` package installed.
```bash
pip install https://github.com/cta-observatory/ctapipe_io_mchdf5/archive/refs/tags/0.2.1-dev.zip
```
Then you will be able to run a file conversion, for example:
```bash
$ mchdf5_simtel2r0 \
    -i /fefs/aswg/data/real/R0/20210310/LST-1.1.Run03945.0000.fits.fz \
    -o LST-1.1.Run03945.0000.h5
```
#### 3.2 Merge the file with the hdf5 base configuration structure
The base configuration file was created in [section 1.3](#13-create-hdf5-file-containing-basic-observing-information).
```bash
$ stream_realData2simulStructure \
    --input_file ./LST-1.1.Run03945.0000.h5 \
    --base_h5_structure base_structure_hdf5_3945.h5 \
    --output_file LST-1.1.Run03945.0000_complete_r0.h5
```
#### 3.3 Run the `r0_to_dl1` HiPeRTA stage with the needed args
If needed, dump (load) the default HiPeRTA configuration by running
```bash
$ hiperta_r0_dl1 --default
```
This will create a default `hiperta_configuration.yml` file.
Then run the `r0_dl1` stage:
```bash
$ hiperta_r0_dl1 -i LST-1.1.Run03945.0000_complete_r0.h5 -o ./ -c hiperta_configuration.yml
```
#### 3.4 Plot dl1 results
```bash
$ stream_plot_dl1 -i dl1_LST-1.1.Run03945.0000_complete_r0.h5
```
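The file names produced by stages 3.1-3.4 chain together deterministically; the sketch below (a made-up helper, not a HiPeRTA function) spells out that chain:

```python
# Illustrative only: the file-name chain of section 3 for a given
# R0 ZFITS file, e.g. LST-1.1.Run03945.0000.fits.fz.
def section3_filenames(r0_zfits):
    stem = r0_zfits[:-len(".fits.fz")] if r0_zfits.endswith(".fits.fz") else r0_zfits
    return {
        "converted": f"{stem}.h5",             # 3.1 mchdf5_simtel2r0
        "complete": f"{stem}_complete_r0.h5",  # 3.2 stream_realData2simulStructure
        "dl1": f"dl1_{stem}_complete_r0.h5",   # 3.3 hiperta_r0_dl1
    }

print(section3_filenames("LST-1.1.Run03945.0000.fits.fz")["dl1"])
# -> dl1_LST-1.1.Run03945.0000_complete_r0.h5
```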
---------------------------
# Installation of HiPeRTA and lstchain (versions)
## HiPeRTA
```
***** check that you are pointing to upstream/slurm_hiperta *****
1 - Clone and fetch/pull hiperta
2 - bash installAnacondaDependencies.sh
    - This should install everything correctly, however, check the binaries
      at the corresponding conda env directory
    - 06/10/202 we have done the following:
[...]
export CXX=/fefs/aswg/workspace/enrique.garcia/miniconda3/envs/rta-test/bin/x86_64-conda_cos6-linux-gnu-g++
```
Once the entry points are created, just install the `hiperta_stream` module:
```bash
$ cd YOUR_PATH_TO_HIPERTA/RAMBO_PROGRAM/hiperta_stream
$ python setup.py install
```
## lstchain
```
***** check that you are pointing to garciagenrique/dl2_dl3_for_rta_test *****
(containing v0.6.3 + own dev for hiperta_stream)
conda install numpy cython
pip install git+https://github.com/cta-observatory/ctapipe_io_lst@v0.5.3
pip install https://github.com/cta-observatory/ctapipe-extra/archive/v0.3.1.tar.gz
- install the fixed version of lstchain:
pip install https://github.com/garciagenrique/cta-lstchain/archive/refs/tags/v0.1-rta.zip
```
## Inside the singularity container
```bash
# R0 file MUST be copied in /.
Singularity streamSingularity.simg:~> /project/CamerasToACTL/trunk/Build.Debug/bin/DummyCameraServer --baseport 3390 --streams --events_type R1 --input LST-1.1.Run02960.0000.fits.fz --hertz 3000
```