Alignment of an interferometric gravitational wave detector

Appl Opt. 1998 Oct 1;37(28):6734-47. doi: 10.1364/ao.37.006734.

Abstract

Interferometric gravitational wave detectors are designed to detect small perturbations in the relative lengths of their kilometer-scale arms induced by passing gravitational radiation. An analysis of the effects of imperfect optical alignment on the strain sensitivity of such an interferometer shows that achieving maximum strain sensitivity at the Laser Interferometer Gravitational Wave Observatory requires that the angular orientations of the optics be held within 10⁻⁸ rad rms of the optical axis and that the beam be kept centered on the mirrors to within 1 mm. In addition, fluctuations in the input laser beam direction must be less than 1.5 × 10⁻¹⁴ rad/√Hz in angle and less than 2.8 × 10⁻¹⁰ m/√Hz in transverse displacement for frequencies f > 150 Hz so that they do not produce spurious noise in the gravitational wave readout channel. We show that seismic disturbances limit the use of local reference frames for angular alignment at a level approximately an order of magnitude worse than required. A wave-front sensing scheme that uses the input laser beam as the reference axis is presented; it successfully discriminates among all angular degrees of freedom and permits the implementation of a closed-loop servo control that suppresses the environmentally driven angular fluctuations sufficiently.
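The alignment tolerances above can be put in perspective with the standard Hermite-Gauss mode-coupling picture (an assumed background fact, not a method taken from this paper): a Gaussian beam tilted by an angle α, or displaced by dx, relative to a cavity axis acquires a first-order (TEM01/10) mode amplitude of roughly α/Θ and dx/w₀ respectively, where w₀ is the beam waist radius and Θ = λ/(πw₀) is the far-field divergence half-angle. A minimal sketch, with illustrative beam parameters:

```python
import math

def first_order_coupling(wavelength, w0, tilt=0.0, displacement=0.0):
    """Fractional TEM01/10 amplitudes produced by a small tilt (rad) and
    transverse displacement (m) of a Gaussian beam of waist radius w0 (m).

    Standard lowest-order mode-decomposition result; valid for
    tilt << theta_div and displacement << w0.
    """
    theta_div = wavelength / (math.pi * w0)  # far-field divergence half-angle
    return tilt / theta_div, displacement / w0

# Illustrative numbers only (assumptions, not values from the paper):
# 1.064 um Nd:YAG light and a ~3.5 cm beam radius, of the order used in
# LIGO arm cavities, evaluated at the quoted alignment tolerances.
a_tilt, a_disp = first_order_coupling(1.064e-6, 0.035,
                                      tilt=1e-8, displacement=1e-3)
print(f"tilt coupling: {a_tilt:.2e}, displacement coupling: {a_disp:.2e}")
```

Even at the stated tolerances, the coupled first-order mode amplitudes are at the 10⁻³ to 10⁻² level, which is why angular motion must be actively servo-controlled rather than left to passive isolation.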