The LIGO Web Newsletter

LIGO Caltech News

LIGO Laboratory and LSC Successfully Complete Second Mock Data Challenge
End-to-End (E2E) Simulation Helps LIGO Scientists Achieve and Characterize Lock Acquisition of the Hanford 2-km Interferometer

LIGO Laboratory and LSC Successfully Complete Second Mock Data Challenge

- Contributed by Albert Lazzarini

The LIGO Data Analysis System is an environment for distributed parallel computing that will be used to perform pipeline analyses of LIGO interferometer data in order to search for rare, weak astrophysical signals embedded in the datastreams collected from these instruments.

Because the expected signals are weak and because the LIGO interferometers are omnidirectional antennas for gravitational radiation, we will require extensive frequency domain signal processing to extract astrophysical signatures from the datastream.

For an important class of sources, associated with the final seconds in the life of a binary neutron star system before it disappears from view into a newly formed black hole, the expected waveforms constitute chirps of well-defined amplitude and phase evolution. The chirps cross the audio band in a few seconds (example chirps may be heard here). These signals are parameterized by the masses of the pair of stars; each mass pair constitutes a unique waveform with its own frequency and phase evolution. The search strategy we will employ is to look for a large number of possible mass pairs at the same time. This is achieved by parallelizing the search algorithm over the mass parameters and using a large cluster of standard PCs running the Linux operating system to process the data with a technique known as optimal Wiener (matched) filtering.
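For reference, the matched-filter statistic at the heart of this technique can be written in a standard textbook form (generic notation, not a transcription of the LDAS code). The data s are correlated against a template h, with each frequency weighted inversely by the detector noise spectral density S_n(f); the combination of masses that governs the chirp's leading-order frequency evolution is the so-called chirp mass:

```latex
% Matched-filter signal-to-noise ratio for data s against a template h
% arriving at time t_0, weighted by the one-sided noise PSD S_n(f):
\rho(t_0) = \frac{\left|\langle s, h_{t_0} \rangle\right|}
                 {\sqrt{\langle h, h \rangle}},
\qquad
\langle a, b \rangle = 4 \, \mathrm{Re} \int_0^\infty
    \frac{\tilde{a}(f)\,\tilde{b}^{*}(f)}{S_n(f)} \, df .

% Chirp mass: the mass combination that dominates the frequency
% evolution of the inspiral waveform.
\mathcal{M} = \frac{(m_1 m_2)^{3/5}}{(m_1 + m_2)^{1/5}}
```

In practice the correlation over all possible arrival times t_0 is computed at once with a single inverse FFT, which is why the computational cost of the search is dominated by FFT throughput.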

In preparing for the science run, it is important to validate the integrity of the analysis pipeline and the software running within it. The LIGO Laboratory and its Science Collaboration, the LSC, have developed a set of "Mock Data Challenges" (MDC) to incrementally test and integrate the many software components which must run flawlessly together for days on end during the actual science run.

Our first Mock Data Challenge took place in August 2000. This pioneering effort was dedicated to testing a specific subset of LIGO software that will be used to pre-process, or condition, the interferometer datastreams. The software module responsible for this is termed the "data conditioning application programming interface," or dcAPI for short. It is a tool set of C++ codes that will be used to estimate power spectral densities of signals, to resample and decimate the data at different data rates, and to mix, or heterodyne, the raw data with signals of known frequency and phase in order to look for signal strength at particular frequencies. These are standard techniques in the signal processing field. The initial MDC was successful in that it validated many of our concepts and also exposed many issues needing to be resolved. This activity included researchers and programmers from the Pennsylvania State University, the University of Texas at Brownsville, the Australian National University in Canberra, and the LIGO Laboratory Data and Computing Group at Caltech.
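To give a flavor of what the data conditioning step does, here is a minimal sketch of the heterodyning operation described above. This is illustrative C++ only, not the dcAPI interface itself; the function name and signature are invented for the example:

```cpp
#include <complex>
#include <cmath>
#include <vector>

// Mix a real time series with a complex oscillator at frequency f0 (Hz).
// Signal content near f0 is shifted down to zero frequency, where it can
// be low-pass filtered and decimated to a much lower data rate.
std::vector<std::complex<double>> heterodyne(const std::vector<double>& x,
                                             double f0, double sampleRate) {
    const double pi = 3.14159265358979323846;
    std::vector<std::complex<double>> y(x.size());
    for (std::size_t n = 0; n < x.size(); ++n) {
        const double phase =
            2.0 * pi * f0 * static_cast<double>(n) / sampleRate;
        // std::polar builds the unit-magnitude oscillator e^{-i*phase}.
        y[n] = x[n] * std::polar(1.0, -phase);
    }
    return y;
}
```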

Now, approximately six months later, we have successfully completed our second MDC. This one was specifically targeted at validation of the parallel optimal Wiener filtering algorithms that constitute the basis of the search strategy for chirp signals embedded in the interferometer noise. This latest activity involved researchers from the University of Wisconsin at Milwaukee (UWM) and the Data and Computing Group at Caltech.

This latest MDC proved that the data analysis software system, consisting of C++ data management and parallel processing code written at Caltech on top of the Message Passing Interface (MPI) standard, could be successfully integrated with a set of C libraries, developed principally at UWM, that are loaded at run time as dynamic shared objects. The MDC was the culmination of over half a year of intense collaboration among approximately a dozen people.
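On Linux, loading such shared objects at run time is typically done through the dlopen/dlsym interface. The sketch below shows the general mechanism only; the library name "libchirpfilter.so" and the exported symbol are hypothetical, not the names used in the actual Caltech/UWM integration:

```cpp
#include <cstdio>
#include <dlfcn.h>

// Assumed C-style signature exported by the filter library (hypothetical).
typedef int (*FilterFn)(const float* data, int nPoints, float* snrOut);

int main() {
    // Load the shared object at run time rather than linking against it.
    void* handle = dlopen("libchirpfilter.so", RTLD_NOW);
    if (!handle) { std::fprintf(stderr, "dlopen: %s\n", dlerror()); return 1; }

    // Look up the entry point by name and cast to the expected type.
    FilterFn filter =
        reinterpret_cast<FilterFn>(dlsym(handle, "chirp_filter"));
    if (!filter) { std::fprintf(stderr, "dlsym: %s\n", dlerror()); return 1; }

    // ... hand data segments to filter(...) here ...

    dlclose(handle);
    return 0;
}
```

(On Linux this is compiled with -ldl.) The appeal of the approach is that the analysis libraries can be developed, built, and updated independently of the C++ framework that hosts them.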

The main component of the challenge was to demonstrate that the pipeline analysis design we developed could be run around-the-clock to process up to 1,000 different waveforms ("templates") per node, representing various mass pairs for chirp signals. The datastream used in the analysis was produced with a simulation program so that its statistical properties could be precisely controlled and known a priori. Representative waveforms were then embedded in this datastream in order to test the signal search code. The equivalent of over three days of data was processed using just nine nodes of our cluster of PCs. This analysis required performing several million Fast Fourier Transforms (FFTs), each of order half a million data points. The code has not yet been optimized, and improvements of a factor of four are expected to be possible in the C libraries.
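The parallelization itself follows a simple pattern: the template bank is divided among the nodes, each node filters the same data segment against its share of the templates, and the results are gathered back. The MPI sketch below illustrates that pattern under our own invented names (the helper filterAgainstTemplate stands in for the FFT-based matched filter); it is not the LDAS code:

```cpp
#include <mpi.h>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank = 0, size = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int nTemplates = 9000;  // e.g. ~1,000 templates on each of 9 nodes

    // Static block decomposition: each rank takes a contiguous slice of
    // the template bank (each template is one mass pair).
    const int perRank = nTemplates / size;
    const int first = rank * perRank;
    const int last = (rank == size - 1) ? nTemplates : first + perRank;

    float bestSnr = 0.0f;
    for (int t = first; t < last; ++t) {
        // filterAgainstTemplate(t) would run the FFT-based matched filter
        // for template t over the shared data segment (hypothetical helper):
        // bestSnr = std::max(bestSnr, filterAgainstTemplate(t));
    }

    // Gather the loudest candidate across all ranks onto rank 0.
    float globalBest = 0.0f;
    MPI_Reduce(&bestSnr, &globalBest, 1, MPI_FLOAT, MPI_MAX, 0,
               MPI_COMM_WORLD);

    MPI_Finalize();
    return 0;
}
```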

To learn more about the LDAS system being developed at Caltech, follow this link. There you will find interesting links to the joint software projects involving LDAS and the LSC which are at the core of these Mock Data Challenges (follow the link to "LDAS/LSC Software Development" in the bulleted list). To read about the actual trials and tribulations of the January MDC, you can find interesting electronic log book entries here. Follow the read-only instructions to access the "Mock Data Challenge Group Log" link near the bottom of the page to review the diaries for January 11th through January 19th, 2001. Finally, here are a few candid shots to enjoy, taken at the end of our MDC.


End-to-End (E2E) Simulation Helps LIGO Scientists
Achieve and Characterize Lock Acquisition of the Hanford 2-km Interferometer

- Contributed by Hiro Yamamoto

A software package has been under development over the past four years to assist us in modeling the LIGO interferometer hardware. This is a time domain model, written in C++ with a GUI front-end, and it can simulate a wide variety of optical configurations. By defining digital filters that represent the input-output behavior of the instrument, scientists can use this software to generate a time series of data that can be compared to the real instrument output. It can also be used to study the behavior of a particular subsystem, such as the effect of motion of the pre-stabilized laser's optics table on the laser frequency or amplitude noise, or to simulate the full LIGO interferometer in order to design the lock acquisition servo system or to generate pseudo-data for data analysis and other types of Monte Carlo studies. One can even use the simulation package to model and design future interferometer configurations, for those configurations in which all optical elements lie in a common plane.
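To give a feel for the time-domain approach, here is a toy example of the kind of digital filter such a simulation steps sample by sample: a second-order IIR section whose coefficients would come from discretizing a transfer function H(s) of some part of the instrument. This is an illustration of the technique only, not E2E code:

```cpp
// Direct-form-II biquad:
//   y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]
struct Biquad {
    double b0, b1, b2, a1, a2;   // coefficients from discretizing H(s)
    double w1 = 0.0, w2 = 0.0;   // two samples of internal filter state

    // Advance the filter by one time step: one input sample in, one out.
    double step(double x) {
        const double w0 = x - a1 * w1 - a2 * w2;
        const double y = b0 * w0 + b1 * w1 + b2 * w2;
        w2 = w1;
        w1 = w0;
        return y;
    }
};
```

Driving a chain of such sections with, say, a seismic-motion time series would produce the simulated mirror response one sample per time step.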

The first important application of our simulation program was to design and debug the lock acquisition servo. This was done by Matt Evans, a Caltech graduate student who is also a major contributor to, and designer of, the simulation package itself. The locking servo design was based on a model of the Hanford 2-km interferometer that includes the seismic motion, seismic isolation, and suspension systems for the full six-suspended-mirror configuration of a power-recycled Fabry-Perot Michelson interferometer. Also included in the model are the basic length control servo systems with their associated sensors and actuators.
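Within such a model, each length control loop is, schematically, a sensed error signal fed through a controller to an actuator force on a mirror. The fragment below is a deliberately simplified proportional-integral controller of that kind, shown only to illustrate the structure; the actual servo filters are far more elaborate:

```cpp
// Toy length-control loop element: a proportional-integral (PI)
// controller advanced once per simulation time step.
struct PIController {
    double kp = 0.0;         // proportional gain
    double ki = 0.0;         // integral gain
    double integral = 0.0;   // accumulated error (controller state)

    // error: sensed cavity length deviation; dt: simulation time step.
    // Returns the correction to apply to the mirror actuator.
    double step(double error, double dt) {
        integral += error * dt;
        return kp * error + ki * integral;
    }
};
```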

Figures 1 and 2 below show actual and simulated time-domain signals from the interferometer. Figure 1 presents measured data from the Hanford 2-km interferometer taken during the recent successful commissioning tests. Figure 2 is a modeled result portraying the same signals as calculated within our simulation package.

[Figure 1 | Figure 2]

Figure 2(a) depicts the time evolution of optical power at various points within the interferometer; shown is the side-band power in the Y-arm. From studying these signals, scientists are able to observe that the loss-of-lock event of the power-recycling mirror in this case is due to the resonance of the side-band in the long arm (not a nominal condition). Figure 2(c) shows the lengths of the various cavities comprising the interferometer. As may be seen from these figures, the simulation reproduces the important characteristics of the time evolution of the LIGO interferometer. This was a critical success for our tools.

The next application for LIGO I will be the generation of pseudo time-series data from the in-lock state of the interferometer, for purposes of noise source identification and reduction as well as data analysis. The simulation program also includes an implementation of the expected thermal noise and shot noise for the LIGO I interferometers; these noise sources are included in the full simulation of the suspended core optics while they are locked and controlled by realistic servos. Because of the flexible, modular design of our software, it is easy to simulate a wide range of phenomena: almost any type of suspected noise source can be included in the modeling environment in order to see its effect at the various output channels of the machine.
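A standard way to synthesize Gaussian noise with a prescribed spectrum, one ingredient of such pseudo-data, is to draw independent Gaussian samples in the frequency domain, weight them by the square root of the desired power spectral density, and inverse-FFT. The sketch below illustrates the technique using the FFTW library; it is our own illustration, not the E2E implementation, and the scaling follows one common one-sided PSD convention:

```cpp
#include <cmath>
#include <complex>
#include <random>
#include <vector>
#include <fftw3.h>

// Generate n samples (n even) of Gaussian noise whose one-sided PSD is
// psd(f). Illustrative helper only; names and conventions are our own.
std::vector<double> colouredNoise(int n, double sampleRate,
                                  double (*psd)(double f)) {
    std::mt19937 rng(42);
    std::normal_distribution<double> gauss(0.0, 1.0);

    const int nFreq = n / 2 + 1;
    const double df = sampleRate / n;
    std::vector<std::complex<double>> spec(nFreq);
    for (int k = 0; k < nFreq; ++k) {
        // Bin amplitude chosen so the output matches the one-sided PSD
        // after the final 1/n normalization below.
        const double sigma = std::sqrt(psd(k * df) * sampleRate * n) / 2.0;
        const bool edge = (k == 0 || k == nFreq - 1);  // DC and Nyquist bins
        spec[k] = {sigma * gauss(rng), edge ? 0.0 : sigma * gauss(rng)};
    }

    // FFTW documents std::complex<double> as bit-compatible with
    // fftw_complex, so the cast below is the sanctioned usage.
    std::vector<double> x(n);
    fftw_plan plan = fftw_plan_dft_c2r_1d(
        n, reinterpret_cast<fftw_complex*>(spec.data()), x.data(),
        FFTW_ESTIMATE);
    fftw_execute(plan);
    fftw_destroy_plan(plan);
    for (double& v : x) v /= n;  // FFTW's c2r transform is unnormalized
    return x;
}
```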