BurstMon

An online DMT tool for the production of burst figures of merit (FOMs)

S. Klimenko and A. Sazonov,
University of Florida



description: BurstMon is a DMT tool for monitoring the performance of the LIGO detectors. The performance estimation is based on burst Figures of Merit (BFOMs) produced in real time for each LIGO detector. There are three types of BFOMs: 1) glitch rates, 2) detector sensitivity to injected waveforms and 3) noise variability. The monitor produces data for the DMTViewer as well as trends: 1-min trends for rate and sensitivity, and 1-sec trends for noise variability.

  1. rates: raw event rates after cluster reconstruction at a given black pixel probability (1% by default); a simple illustration of the black-pixel selection and clustering is sketched after this list.
  2. sensitivity: To estimate the detector sensitivity, BurstMon performs a real-time simulation by injecting simulated waveforms of different strength into the AS_Q data. The waveforms are provided by the user. The injected signals are detected with an algorithm similar to WaveBurst, but working at one specific time-frequency resolution. The detection efficiency is reconstructed for each type of injected waveform. The detector sensitivity is estimated as the root-sum-square (hrss) amplitude at 50% detection efficiency (an illustrative hrss50 sketch follows this list). Currently BurstMon produces an uncalibrated sensitivity.
  3. variability: BurstMon tracks the variability of the noise in selected frequency bands. The noise variability is calculated in three steps: 1) perform a wavelet transformation to obtain a time-frequency map with a resolution of 16 Hz x 1/32 sec, 2) whiten the data by normalizing each wavelet layer by the noise RMS in that layer (averaged over the BurstMon stride), and 3) calculate the RMS of the normalized wavelet coefficients with the same time stamp; this is the noise variability. It is calculated every 1/32 sec and is expected to be close to unity for stationary noise (a sketch of steps 2 and 3 follows this list).
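
As an illustration of item 1, the following Python sketch selects the loudest fraction of time-frequency pixels at a given black pixel probability and counts connected groups of such pixels. This is only a minimal sketch: the connected-component clustering and the function and variable names are illustrative assumptions, not the actual WaveBurst/BurstMon cluster reconstruction.

import numpy as np
from scipy import ndimage

def black_pixel_rate(tf_map, stride_sec, bpp=0.01):
    """Illustrative raw event rate from a time-frequency map.

    Select the fraction `bpp` of the loudest pixels ("black pixels"),
    group neighbouring black pixels into clusters, and return the
    cluster rate per second of data in the stride.
    """
    # Threshold that keeps only the loudest bpp fraction of pixels.
    threshold = np.quantile(np.abs(tf_map), 1.0 - bpp)
    black = np.abs(tf_map) >= threshold

    # Simplified cluster reconstruction: connected groups of black pixels.
    _, n_clusters = ndimage.label(black)
    return n_clusters / stride_sec

# Example: raw rate in simulated Gaussian noise (64 layers, 60 s of 1/32 s bins).
rng = np.random.default_rng(1)
print(black_pixel_rate(rng.standard_normal((64, 1920)), stride_sec=60.0))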
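
For item 2, the sketch below shows how the hrss at 50% detection efficiency could be extracted from injection results. The injection amplitudes, counts, and the interpolation in log amplitude are illustrative assumptions and do not reproduce the actual BurstMon code.

import numpy as np

# Hypothetical injection results for one waveform type:
# hrss is the root-sum-square strain amplitude of the injected waveform,
# hrss = sqrt( integral (h_plus^2 + h_cross^2) dt ).
hrss = np.array([1e-22, 3e-22, 1e-21, 3e-21, 1e-20])
n_injected = np.array([100, 100, 100, 100, 100])
n_detected = np.array([2, 15, 55, 92, 100])

efficiency = n_detected / n_injected

# hrss50: the amplitude at which the detection efficiency crosses 50%.
# The efficiency curve is roughly sigmoidal in log(hrss), so interpolate there.
log_h = np.log10(hrss)
hrss50 = 10 ** np.interp(0.5, efficiency, log_h)
print(f"hrss at 50% detection efficiency: {hrss50:.3e}")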
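
Finally, steps 2 and 3 of the variability calculation (item 3) can be illustrated with the Python sketch below, assuming the wavelet time-frequency map from step 1 is already available as a 2-D array; the function name and array shapes are hypothetical.

import numpy as np

def noise_variability(tf_map):
    """Steps 2 and 3 of the variability calculation for one stride of data.

    tf_map : 2-D array of TF coefficients with shape (n_layers, n_time_bins),
             e.g. 16 Hz x 1/32 sec resolution.  The wavelet decomposition
             itself (step 1) is assumed to be done elsewhere.
    """
    # Step 2: whiten - normalize each layer by its RMS over the stride.
    layer_rms = np.sqrt(np.mean(tf_map ** 2, axis=1, keepdims=True))
    white = tf_map / layer_rms

    # Step 3: variability = RMS over the layers sharing the same time stamp,
    # one value per 1/32 sec bin; close to unity for stationary noise.
    return np.sqrt(np.mean(white ** 2, axis=0))

# Example: stationary white noise gives variability close to unity.
rng = np.random.default_rng(0)
tf = rng.standard_normal((128, 32 * 60))   # 128 layers, 60 s of 1/32 s bins
print(noise_variability(tf)[:5])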

BurstMon results:

BurstMon documentation:

BurstMon tests:

Related links: