Gravitational Wave Standard Sirens
Gravitational waves from merging compact objects — neutron stars or black holes — are self-calibrating distance indicators, dubbed "standard sirens." Unlike light-based standard candles, the amplitude and frequency evolution of a gravitational-wave signal directly encode the absolute luminosity distance to the source, with no calibration chain. When LIGO and Virgo detected the neutron star merger GW170817 and its electromagnetic counterpart in the galaxy NGC 4993, astronomers extracted H₀ = 70⁺¹²₋₈ km/s/Mpc — a result with large error bars but sitting tantalizingly between the Planck value (~67) and SH0ES (~73). As the gravitational-wave event catalog grows, the statistical precision is improving, and emerging measurements continue to favor values intermediate between the two camps: they neither decisively arbitrate the tension nor converge on the low CMB value.
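The logic of the GW170817 measurement can be sketched in a few lines: at low redshift, H₀ ≈ v_H / d_L, where v_H is the host galaxy's Hubble-flow velocity (from its electromagnetic counterpart) and d_L is the luminosity distance inferred from the gravitational waveform alone. The numbers below are illustrative approximations of the published inputs (v_H ≈ 3017 ± 166 km/s for NGC 4993; a distance posterior centered near 43.8 Mpc), and the symmetric Gaussian stand-in for the skewed GW distance posterior is an assumption of this sketch, not the actual analysis:

```python
import random

random.seed(0)

# Illustrative inputs, approximating the GW170817 analysis:
# Hubble-flow velocity of NGC 4993 and the GW-inferred luminosity
# distance. The real distance posterior is asymmetric; a Gaussian
# is used here purely as a simple stand-in.
samples = []
for _ in range(100_000):
    v = random.gauss(3017.0, 166.0)  # km/s
    d = random.gauss(43.8, 5.0)      # Mpc
    samples.append(v / d)            # H0 in km/s/Mpc

samples.sort()
n = len(samples)
med = samples[n // 2]
lo, hi = samples[int(0.16 * n)], samples[int(0.84 * n)]
print(f"H0 ~ {med:.1f} (+{hi - med:.1f} / -{med - lo:.1f}) km/s/Mpc")
```

Even this toy propagation shows why a single event gives ~10% precision: the distance uncertainty dominates, and only a larger catalog (or events with better-constrained inclination) can shrink it.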
The deeper tension is structural: ΛCDM requires H₀ to be a single, universal constant, and the growing catalog of standard sirens should converge on one value regardless of the source type or redshift. Instead, different binary merger populations — and different assumptions about host galaxy identification — yield subtly different central values, suggesting either unknown systematics in the gravitational wave analysis or genuine local variations in the expansion rate. The promise of gravitational wave cosmology was to provide a clean, assumption-free anchor; the current results instead add another voice to an already discordant chorus, and show no sign of resolving cleanly onto the Planck-cosmology prediction.