I asked this question in another thread, and was not satisfied with the answer, so I'm posing it here in the Q&A section.
Interferometers measure a phase change between two beams of light that take different paths. This phase change is interpreted as a change in distance, with the implicit assumption that the wavelength of light does not change.
I.e., a phase change of 360 degrees is interpreted as a change in distance of 1 wavelength of the light used. For example, in high-precision measurements of the coefficient of thermal expansion, an interferometer is attached to a metal rod of known length. The rod is heated by a known amount, say 10 degrees. The interferometer measures the phase change over the optical path between the ends of the rod, and that phase change is interpreted as a change in the length of the rod, under the implicit assumption that the wavelength of light does not change with temperature. In other words, the measurement rests on the assumption that the "measuring stick" (the wavelength of light) does not change with temperature, which is the very thing being measured.
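The bookkeeping described above can be sketched numerically. This is a minimal illustration, not any real experiment's setup: the rod length, laser wavelength, and expansion coefficient below are assumed, typical-looking values I picked for the example.

```python
import math

# Illustrative, assumed values (not from any actual measurement):
WAVELENGTH = 633e-9  # m, a common HeNe laser line
ROD_LENGTH = 0.1     # m, length of the rod at the starting temperature
ALPHA = 12e-6        # 1/K, a steel-like coefficient of thermal expansion
DELTA_T = 10.0       # K, the known heating step

# Change in rod length from thermal expansion.
delta_L = ALPHA * ROD_LENGTH * DELTA_T

# Single-pass phase change, assuming the wavelength itself is unchanged:
# each full 2*pi of phase corresponds to one wavelength of path change.
delta_phi = 2 * math.pi * delta_L / WAVELENGTH
fringes = delta_phi / (2 * math.pi)

print(f"delta_L = {delta_L:.3e} m, fringe count = {fringes:.1f}")
```

The point of the sketch is the conversion step: the interferometer only ever reports `delta_phi`, and turning that into `delta_L` requires dividing by a wavelength that is assumed constant.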
The interferometer used in a gravitational-wave detector works on the same principle, and rests on the same underlying assumption. Hence my question.
According to GR, does the wavelength of light not stretch or compress in exact proportion to the expansion/compression of space? How can an interferometer yield anything but a null result? If a gravity wave passes perpendicular to one arm and lengthens it by one part in a zillion, will the wavelength of light along that arm not also increase by one part in a zillion?
This will yield zero phase change, i.e. a null result, like the Michelson-Morley experiment. So far, a null result is all the interferometer G-wave detectors have offered up.
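The null-result intuition above reduces to simple arithmetic: the phase along an arm is set by the number of wavelengths that fit in it, and if both the arm length and the wavelength scale by the same factor (1 + h), that number cancels out. The arm length, laser wavelength, and strain value below are assumed, illustrative numbers.

```python
import math

# Illustrative, assumed values:
ARM_LENGTH = 4000.0   # m, a LIGO-scale arm
WAVELENGTH = 1064e-9  # m, an Nd:YAG laser line
h = 1e-6              # dimensionless strain, wildly exaggerated so the
                      # scaling is visible in floating point

# Number of wavelengths along the arm before and after the wave passes,
# under the assumption that BOTH quantities stretch by (1 + h).
n_before = ARM_LENGTH / WAVELENGTH
n_after = (ARM_LENGTH * (1 + h)) / (WAVELENGTH * (1 + h))

# The (1 + h) factors cancel, so the wavelength count (and hence the
# phase) is unchanged -- the null result the question describes.
print(math.isclose(n_before, n_after, rel_tol=1e-12))
```

This is only the arithmetic behind the question as posed; whether GR actually predicts that the light's wavelength co-stretches with the arm in this way is exactly what is being asked.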
But how can it be otherwise? How can a measuring stick be used to measure G-waves, when the measuring stick itself expands and contracts in the same proportion as that which is being measured?