The barycentric correction depends on the flux rate received during the exposure. Exposure meter readings, both direct (channel 'PC' - off the 0th order of the echelle grating) and indirect (channel 'FRD' - from the spill-over at the pupil slicer), can be used to determine the correct exposure midpoint. The algorithm uses barycorrpy, the python version of Wright & Eastman's (2014) core BaryCorr. More specifically, we use the routine exposure_meter_BC_vel(), which computes the mean of the barycentric velocity weighted by the exposure meter readings rather than the value at the nominal midpoint of the observation, thus accounting for non-linearities in the slope of the barycentric velocity over the exposure time. Values are computed for both exposure meter input channels and written to the header of the reduced science file as well as to a log file.
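For orientation, a minimal sketch of such a call (the target name, observatory name, and timestamps below are illustrative assumptions, and the exact argument and return structure of exposure_meter_BC_vel should be checked against the barycorrpy documentation):

    import numpy as np
    from astropy.time import Time
    from barycorrpy import exposure_meter_BC_vel

    # Illustrative exposure meter time series: UTC timestamps and
    # zeropoint-corrected flux readings spanning the exposure.
    timestamps = Time(['2020-12-01T05:00:00', '2020-12-01T05:02:30',
                       '2020-12-01T05:05:00'], format='isot', scale='utc')
    flux = np.array([120.0, 150.0, 90.0])   # counts after zeropoint subtraction

    # Flux-weighted barycentric velocity; the star name is resolved online
    # (e.g. via SIMBAD) and the observatory via the astropy site registry --
    # both values here are assumptions, not MAROON-X defaults.
    out = exposure_meter_BC_vel(timestamps, flux,
                                starname='TOI-1685', obsname='gemini_north')
    weighted_vel = out[0]   # flux-weighted barycentric velocity in m/s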
Exposure meter readings covering the exposure time plus some minutes before and after as a numpy pickle data file.
Reduced science spectra whose header provides the name of the target, the time of exposure start, and the exposure length.
Barycentric corrections for the nominal exposure midpoint and for the flux-weighted midpoints of both exposure meter channels ('PC' and 'FRD'), written to the input file header and to a log file.
PDF plot with exposure meter readings.
STEP 1:
As a first step, exposure meter readings need to be collected from the influxdb database on the instrument control computer and written into a numpy pickle file. This is best done once for a complete observing run and needs to be done on a computer with access to port 8086 of mxcontrol1 (10.2.147.24).
After the cyber incident of August 2023, external access to port 8086 on mxcontrol1 is no longer permitted. The python code producing the numpy pickle file needs to be edited and executed through the Keeper interface.
(1) Use the Keeper interface to reach the instrument control computer.
(2) Go to workspace 5 (the last one, which typically has DS9 running).
(3) Open a terminal or use an open one and go to ~/MaroonX_spectra/
(4) Use nano InfluxQuery.py and edit the lines at the very bottom of the file in the __main__ section. Set the desired time range and output file name (see the sketch after this list). Use Ctrl-X to save the file and close nano.
(5) Execute the python code by calling python InfluxQuery.py
(6) Copy the resultant numpy pickle file into the latest data directory so it will be transferred alongside the data by Gemini personnel.
(7) Once the file has been copied via Google Drive to the lab server, create a new directory in the appropriate /dataxx directory, ideally matching the date range of the numpy pickle file. Example: /data10/MaroonX_spectra_reduced/Maroonx_masterframes/202405xx/expmeter contains expmeter052024.pkl
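As a rough illustration of the edit in step (4), a hedged sketch of what the bottom of InfluxQuery.py might look like; the database, measurement, and column names are placeholders and the actual query code in the file will differ:

    # Hedged sketch only -- InfluxQuery.py may use a different client,
    # database, and measurement; treat all names below as placeholders.
    from influxdb import DataFrameClient   # InfluxDB 1.x client (port 8086)

    if __name__ == '__main__':
        # Edit the time range and output file name for the desired observing run:
        start = '2024-05-01T00:00:00Z'
        stop = '2024-06-01T00:00:00Z'
        outfile = 'expmeter052024.pkl'      # matches the example in step (7)

        client = DataFrameClient(host='localhost', port=8086, database='maroonx')  # placeholder database
        query = ("SELECT * FROM exposure_meter "                                    # placeholder measurement
                 f"WHERE time >= '{start}' AND time < '{stop}'")
        result = client.query(query)
        result['exposure_meter'].to_pickle(outfile)   # DataFrameClient returns a dict of DataFrames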
STEP 2:
Subsequently, a batch processing routine handles the actual computations.
From the base directory (maroonx_reduce) call: PYTHONPATH=${PWD} python analyze/recipes/batch_barycor.py
with the following parameters:
-p EXPMETER_FILE, --expmeter_file EXPMETER_FILE Full name and path of the exposure meter file containing a pandas DataFrame.
Default = '/data2/MaroonX_spectra_reduced/Maroonx_masterframes/202011xx/expmeter/expmeter112020.pkl'.
--zp_pc ZP_PC Zeropoint for 'counts_pc' channel. Determined from data if not provided.
--zp_frd ZP_FRD Zeropoint for 'counts_frd' channel. Determined from data if not provided.
-dd DATA_DIRECTORY, --data_directory DATA_DIRECTORY Directory for hdf input files. Default ='/data2/MaroonX_spectra_reduced/'.
-d DATE, --date DATE UTC date of file, e.g. '20200901' or '202009*'.
-c CAMERA_ARM, --camera_arm CAMERA_ARM Camera arm, e.g. 'b' or 'r'. Default: '?' for both
-o OBS_TYPE, --obs_type OBS_TYPE Obs type. Default: 'SOOOE'
-t EXPTIME, --exptime EXPTIME Exposure time in sec to downselect files, e.g. '300'. Default = '*'
-n TARGET_NAME, --target_name TARGET_NAME Target name to downselect files, e.g. 'TOI-1685'
-sn TARGET_NAME, --simbad_target_name TARGET_NAME SIMBAD-resolvable target name, e.g. 'UCAC4 666-026266'
--use_coords TRUE/FALSE Use telescope pointing coordinates instead of target name.
Example 1: PYTHONPATH=${PWD} python analyze/recipes/batch_barycor.py -d '20201201'
The example call collects all blue and red science (SOOOE) frames from /data2/MaroonX_spectra_reduced whose date matches 20201201 and computes barycentric velocities using the default exposure meter file and SIMBAD coordinates for all targets. If a target is not found in SIMBAD, the file is skipped.
Example 2: PYTHONPATH=${PWD} python analyze/recipes/batch_barycor.py -d '20201201' -n 'TOI-1685' -sn 'UCAC4 666-026266'
Same as example 1, but only works on files with target name 'TOI-1685' and searches SIMBAD for 'UCAC4 666-026266' instead.
Example 3: PYTHONPATH=${PWD} python analyze/recipes/batch_barycor.py -d '20201201' -n 'TOI-1693' --use_coords True
Same as example 1, but only works on files with target name 'TOI-1693' and uses the telescope pointing coordinates instead of the catalog entry.
Calculates barycentric velocities using barycorrpy for (1) the nominal exposure midpoint and (2) the flux-weighted midpoint from the 'PC' and 'FRD' channels of the exposure meter.
Exposure meter data are automatically retrieved from the pickle file and their zeropoints are determined (from the outlier-corrected lowest value before or after the exposure). Note: 'PC' values are often not useful for fainter targets due to higher background and lower sensitivity.
The algorithm also calculates dv/dt, i.e. the average rate of change of the barycentric velocity during the exposure, to give a sense of the sensitivity to timing issues.
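The following sketch illustrates the kind of bookkeeping described above; the column names, the 5-min baseline window, and the sigma-clipping details are assumptions rather than the actual implementation in batch_barycor.py:

    import numpy as np
    import pandas as pd
    from astropy.stats import sigma_clip

    # Assumed layout: 'df' is the exposure meter DataFrame from the pickle file,
    # indexed by UTC timestamps, with a 'counts_frd' column (names are assumptions).

    def estimate_zeropoint(df, t_start, t_end, channel='counts_frd', window='5min'):
        """Outlier-corrected lowest reading within `window` before/after the exposure."""
        before = df.loc[t_start - pd.Timedelta(window):t_start, channel]
        after = df.loc[t_end:t_end + pd.Timedelta(window), channel]
        clipped = sigma_clip(pd.concat([before, after]).to_numpy(), sigma=3)
        return float(clipped.compressed().min())

    def flux_weighted_midpoint(df, t_start, t_end, zeropoint, channel='counts_frd'):
        """Exposure midpoint weighted by zeropoint-subtracted exposure meter flux."""
        flux = df.loc[t_start:t_end, channel] - zeropoint
        seconds = (flux.index - t_start).total_seconds()
        return t_start + pd.Timedelta(seconds=np.average(seconds, weights=flux.to_numpy()))

    # dv/dt as a timing-sensitivity diagnostic: average change of the barycentric
    # velocity per second of exposure (bc_start/bc_end in m/s, exptime in s):
    # dv_dt = (bc_end - bc_start) / exptime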
QC: Procedure should raise a flag if it fails (e.g., wrong exposuremeter file, no exposuremeter data, etc.).
When several exposures are taken back-to-back, there is no 'zero flux' baseline recorded within 5 min on either side of the exposure and the zeropoint is determined incorrectly. In such cases, the zeropoint must be given manually. The code should probably throw an error if automatically determined zeropoints are above a certain threshold (~4.0 for FRD and ~25 for PC).
The correct ZP matters most for faint targets with low count rates.
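A minimal sketch of such a threshold check (the limits are the rough values quoted above; the function and variable names are assumptions):

    def check_zeropoints(zp_frd, zp_pc, zp_frd_max=4.0, zp_pc_max=25.0):
        """Raise if automatically determined zeropoints look implausibly high."""
        if zp_frd > zp_frd_max or zp_pc > zp_pc_max:
            raise ValueError(
                f"Suspicious zeropoints (FRD={zp_frd:.1f}, PC={zp_pc:.1f}); "
                "likely no zero-flux baseline -- supply --zp_frd/--zp_pc manually."
            )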