Thursday, August 29, 2019

gravitational waves - Is LIGO sensitive enough to measure the "expansion of the universe"? Specifically, what is the numerical ratio of the two effects?


It would seem that LIGO measures wibbles in the metric (not manifold) of spacetime:


How is it that distortions in space can be measured as distances?


It would seem that the expansion of the universe is an expansion of the metric of spacetime:


If space is "expanding" in itself - why then is there redshift?


Imagine for a moment that LIGO is not an interferometer. (So it simply times changes along one arm, rather than comparing the phase shift between two orthogonal arms.)



If the ends of one of the arms were indeed receding from each other, at a speed consistent with the expansion of the universe, is the real-world LIGO sensitive enough to measure that shift?


In other words, ideally I'd like to know these two numbers:


(A) The left arm of LIGO is about 4 km long. It was stretched/shrunk (a few times) for roughly 0.01 seconds by the gravitational wave. How many meters was it stretched/shrunk in 0.01 seconds?


(B) Assuming the same abstract LIGO arm were affected by the expansion of the universe, how many meters is it stretched/shrunk every 0.01 seconds?




Note - of course, an interferometer is an ingenious device for measuring extremely small changes in length, assuming the changes differ between the two orthogonal arms. An interferometer, per se, can't measure the expansion of the universe at all, since that expansion is uniform in both directions. The sense of my question is: can something that measures distance changes *as accurately as* LIGO does measure the expansion of the universe? How big or small is the ongoing expansion of the universe compared to the wibble from the black-hole system in question?



Answer



Well, CuriousOne gives the most direct answer: the universe is not expanding on scales where the gravitational attraction between objects dominates, such as on the Earth (indeed, within the entire Milky Way). But let's pretend that we take LIGO, stick it out in space (far away even from our Local Group, to be sure it's in an isolated region), and ask it to measure the expansion of the universe. I'll call this XLIGO to make sure we don't confuse it with reality.


Answer A: I guess you're saying 0.01 s because the frequency of that particular observation was around 100 Hz, but in any case, the maximum strain detected by LIGO for this specific event was around $1\times 10^{-21}$ (see here, look at Figure 2). Strain is the fractional length change $\Delta L/L$, so over 4 km that's simply $(1\times 10^{-21})(4\times 10^3~\mathrm{m}) = 4\times 10^{-18}~\mathrm{m}$.
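The arithmetic above is just strain times arm length; a minimal sketch, using the peak strain and arm length quoted in the text:

```python
# Displacement from the detected gravitational wave over one LIGO arm.
# Values taken from the text: peak strain h ~ 1e-21, arm length 4 km.
strain = 1e-21        # peak dimensionless strain, Delta L / L
arm_length_m = 4e3    # arm length in meters

delta_L = strain * arm_length_m   # absolute length change in meters
print(f"Delta L = {delta_L:.1e} m")  # → Delta L = 4.0e-18 m
```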


Answer B: I'm going to do this using the Hubble constant, which tells us how fast two objects are moving away from each other depending on their distance. The approximate value of the Hubble constant is 75 km/s/Mpc. Read that as "for every megaparsec of separation, the objects recede from each other at 75 km/s."



So, over 4 km, $$\rm (75~km/s/Mpc)\left(\frac{1~Mpc}{3\times 10^{19}~km}\right)(4~km)\approx 1\times10^{-17}~km/s$$


Or an expansion speed along a single arm of XLIGO of $1\times 10^{-14}~\mathrm{m/s}$. Over a time period of 0.01 s that means an absolute shift of $$ (1\times 10^{-14}~\mathrm{m/s})(0.01~\mathrm{s})=1\times 10^{-16}~\mathrm{m}$$ So the shift due to expansion is roughly 25 times larger than the $4\times 10^{-18}~\mathrm{m}$ gravitational-wave displacement from the only positive observation so far, meaning a detector with XLIGO's sensitivity could in principle resolve it.


But there is a bigger problem here, and that is that the expansion of the universe is isotropic, identical in every direction. So in order to see the effect, we would have to try to measure the anisotropies (actually, we would need to see a higher-order multipole, specifically the quadrupole term). I found a paper (here) discussing that $\Delta H/H\approx 3\%$ is not excluded, so that actually brings us back down to the same approximate scale.
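To make that last comparison concrete, a sketch of the scaling, assuming the $\Delta H/H \approx 3\%$ bound quoted above is the largest anisotropy available to measure:

```python
# Only the anisotropic part of the expansion would be visible to an
# interferometer, so scale the full 0.01 s drift by Delta H / H ~ 3%.
expansion_shift_m = 1e-16    # full expansion-induced shift over 0.01 s (from above)
anisotropy_fraction = 0.03   # Delta H / H bound quoted in the text

measurable_m = expansion_shift_m * anisotropy_fraction
print(f"measurable shift ~ {measurable_m:.0e} m")
# ~3e-18 m, comparable to the ~4e-18 m gravitational-wave displacement
```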


