A pair of recently published papers shows that the Universe is 13.772 billion (plus or minus 39 million) years old.
That's cool! It also agrees with some earlier measurements of the Universe made in a similar way. Also cool.
What's not cool is that this doesn't seem to alleviate a growing discrepancy in measurements made in different ways, which get an age a few hundred million years less. While that may not seem like a big deal, it's actually a really big problem. Both groups of methods should get the same age, and they don't. This means there's something fundamental about the Universe we're missing.
The new observations were made using the Atacama Cosmology Telescope (or ACT), a six-meter dish in Chile that is sensitive to light in the microwave part of the spectrum, between infrared light and radio waves. When the Universe was very young it was extremely hot and dense, but about 380,000 years after the Big Bang it cooled enough to become transparent. It was about as hot as the surface of the Sun at the time, and the light it emitted would have been more or less in the visible part of the spectrum, the kind of light we see with our eyes.
But the Universe has expanded since then, a lot. That light has lost a lot of energy getting to us fighting that expansion, and has redshifted; literally the wavelength has gotten longer. It's now in the microwave part of the spectrum. It's also everywhere, literally in every part of the sky, so we call it the Cosmic Microwave Background, or CMB.
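To get a feel for how much that stretching amounts to: the CMB has a redshift of roughly z ≈ 1100 (a standard textbook value, not a number from the new papers), meaning every wavelength has been stretched by a factor of about 1,100. A quick sketch of that arithmetic:

```python
# Rough sketch of how far CMB light has been stretched by cosmic expansion.
# z ≈ 1100 is the standard redshift of the CMB (an assumption here,
# not a figure quoted in the article).
z = 1100
visible_nm = 500                        # roughly green light, in nanometers
stretched_nm = visible_nm * (1 + z)     # wavelength today

print(f"{stretched_nm / 1e6:.2f} mm")   # about half a millimeter: microwaves
```

Half a millimeter is squarely in the microwave band, which is why a microwave telescope like ACT is the right tool for the job.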
A huge amount of information is stored in that light, so by scanning the sky with 'scopes like ACT we can measure conditions in the Universe when it was just 380,000 years old.
ACT covered 15,000 square degrees, more than a third of the entire sky! Looking at about 5,000 square degrees of that survey, they were able to determine a lot of the behavior of the young Universe, including its age. Combining that with results from the Wilkinson Microwave Anisotropy Probe (or WMAP) they got the age of 13.77 billion years. That also agrees with the European Planck mission's value, which also measured microwaves from the early cosmos.
They can also measure the expansion rate of the Universe. The expansion was first discovered in the 1920s, and what astronomers found is that the farther away an object is, the faster it recedes from us. Something twice as far away appeared to be moving away from us twice as fast. This rate of expansion became known as the Hubble constant, and it's measured in a speed per distance: how fast something is moving versus how far away it is.
The new observations get a value for this constant of 67.6 ±1.1 kilometers per second / megaparsec (a megaparsec, abbreviated as Mpc, is a distance unit convenient in some aspects of astronomy, equal to 3.26 million light years; a bit farther than the distance to the Andromeda Galaxy, if that helps). So, due to the cosmic expansion, an object 1 Mpc away should recede from us at 67.6 km/sec, and one 2 Mpc away twice that at 135.2 km/sec, and so on. It's a bit more complicated than this, but that's the gist.
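That proportionality is simple enough to write down directly. Here's a minimal sketch using the ACT value (the function name is my own, just for illustration):

```python
H0 = 67.6  # Hubble constant from the new ACT result, in km/s per Mpc

def recession_speed(distance_mpc):
    """Speed (km/s) at which cosmic expansion carries away an object
    at the given distance (in megaparsecs). v = H0 * d."""
    return H0 * distance_mpc

print(recession_speed(1))   # 67.6 km/s at 1 Mpc
print(recession_speed(2))   # 135.2 km/s at 2 Mpc
```

As the article notes, the real picture is more complicated (the expansion rate itself changes over cosmic time), but this linear relation is the local rule astronomers measure.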
And that's a problem. There are a lot of ways to measure the Hubble constant — looking at supernovae in distant galaxies, observing gravitational lenses, observing huge clouds of gas in distant galaxies, and so on — and many of them get a larger number, around 73 or so km/sec/Mpc. Those numbers are close, which is reassuring in some ways, but far enough apart that it's extremely puzzling. They should agree, and they don't.
They also get different ages for the Universe. A higher Hubble constant means the Universe is expanding more rapidly, so it didn't need as much time to get to its current size, making it younger. A lower constant means the Universe is older. So while the expansion rate may seem esoteric, it's directly tied to the more fundamental concept of how old the Universe is, and the two groups of methods get different numbers.
So which is right? That's a difficult question to answer, and maybe the wrong one to ask. A better one is, why don't they agree?
There's one obvious possibility: both of these methods are correct, but they're measuring two different eras of the Universe. The ones looking at the CMB are examining the Universe when it was less than a million years old. The others are looking at the Universe when it was already a few billion years old. Perhaps the expansion rate changed during that time.
In other words, maybe the Hubble constant isn't. A constant, I mean.
There could be issues in the methods themselves, but these have been checked in many ways and by so many different methods in each group that this seems very unlikely at this point.
The fault apparently is in the Universe, and not in ourselves. Or, better stated (sorry, Bard, and maybe John), the fault lies in the way we measure the Universe. It's doing what it does. We just have to figure out why.
A lot of papers have been published about this, and it's no exaggeration to say it's one of the biggest and thorniest problems in cosmology right now.
A personal thought. My first job after getting my PhD was briefly working on one part of COBE, the Cosmic Background Explorer, which looked at the CMB and confirmed the Big Bang was real. At that time the measurements were good, but there was room for improvement. Then WMAP came along, and Planck, and now ACT, and these measurements are made with incredible accuracy. Astronomers call it high-precision cosmology, a kind of an inside joke since, for a long time, we barely had any idea about these numbers.
Astronomers are so good at this now that a discrepancy of 10% is considered a huge problem, when back in the day a factor of two was considered OK. Watching this field improve over time has been a true joy, because the better we get at it the better we understand the Universe itself as a whole.
Yeah, we're having some problems. But these are great problems to have.
Nevertheless, hopefully we'll see them resolved soon. Because when we do, it means our understanding will have taken yet another big leap.