Task Manager
Installation
Yuzurihara prepared the virtual environment o4_dqr_proto on k1det1.
- DQR tasks are supported by Python 3.8.
- This procedure requires authentication to git.ligo.org.
- To read KAGRA data via NDS, we need to edit the gwpy code (see the instruction).
conda create --name o4_dqr_proto python=3.8 -y
conda activate o4_dqr_proto
conda install m2crypto ciecplib python-nds2-client cryptography==3.3.1 -y
pip install git+https://git.ligo.org/o4-dqr/o4-dqr-configuration.git
pip install git+https://github.com/gwdetchar/gwdetchar.git
pip install git+https://git.ligo.org/detchar/dqrtasks.git
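As a quick sanity check of the new environment, a minimal sketch (the dqrtasks import name and the --help flag are assumed; the K1 channel name and NDS host/port are placeholders, not the real KAGRA values from the instruction above):
python -c "import gwpy, gwdetchar, dqrtasks"
dqr-create-dag --help
# placeholder channel and NDS server; replace with a real K1 channel and the KAGRA NDS host/port
python -c "from gwpy.timeseries import TimeSeries; print(TimeSeries.fetch('K1:TEST-CHANNEL', 1318260000, 1318260016, host='k1nds.example', port=8088))"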
Install ligo-proxy-init from the LDG Client package:
wget http://www.lsc-group.phys.uwm.edu/lscdatagrid/doc/ldg-client.sh -O /tmp/ldg-client.sh && sudo bash /tmp/ldg-client.sh
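To confirm the client tools ended up on the PATH (assuming the installer's default locations):
which ligo-proxy-init
which grid-proxy-info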
- X.509 certificate
- For additional permission to access GraceDB, we need to ask LIGO+Virgo colleagues.
- There are two options for a KAGRA account. Here Yuzurihara used ligo-proxy-init:
ligo-proxy-init -i shibbi.pki.itc.u-tokyo.ac.jp albert.einstein
- To test the certificate:
gracedb ping
grid-proxy-info
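The remaining lifetime of the proxy can also be checked with a standard Globus option, and the ligo-proxy-init command above rerun when it expires:
grid-proxy-info -timeleft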
Generate DAG files
By following the instruction, prepare the ini files for the DQR task manager.
- This procedure will NOT require access to GraceDB, because the public alert will be used.
- For the test code, see /users/yuzu/work/DQR/Script/o4-dqr-configuration:k1det1
mkdir ~/public_html/event
dqr-create-dag -g S190924h -v -c ex_configs/defaults.ini ex_configs/condor.ini ex_configs/stationarity1.ini ex_configs/omega1.ini
The generated DAG files are located in /home/controls/public_html/events/
- Because the O3 data are not stored at the KAGRA site, we need to run this job with the GPS option.
mkdir ~/public_html/event
dqr-create-dag -g 1318260000 -v -c ex_configs/defaults.ini ex_configs/condor.ini ex_configs/stationarity1.ini ex_configs/omega1.ini
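Before submitting, a quick listing confirms that the DAG files were generated (the subdirectory is named after the event or GPS label; exact file names may differ):
ls /home/controls/public_html/events/
ls /home/controls/public_html/events/S190924h/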
Submit DAG jobs
- For example, on k1det1:
cd /home/controls/public_html/events/
condor_submit_dag S190924h/data_quality_report_S190924h.dag
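The DAG progress can then be monitored with standard HTCondor tools (DAGMan writes its log next to the DAG file):
condor_q -dag
tail -f S190924h/data_quality_report_S190924h.dag.dagman.out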
The resulting plots and HTML pages are in ~/public_html/events/S190924h/omegascan/plots/ and ~/public_html/events/S190924h/omegascan/about/index.html