[1] https://academic.oup.com/mnras/article/544/1/975/8281988?log...
Exciting times in cosmology after decades of a standard LCDM model.
Could you help me understand this sentence: "After correcting for this age bias as a function of redshift, the SN data set aligns more closely with the cold dark matter (CDM) model"?
Whenever I read things like "This model can't explain the bullet cluster, or X rotation curve, so it's probably wrong" my internal response is "Your underlying data sources are too fuzzy to make your model the baseline!"
I think the most established models are doing their best with the data they have, but there is so much room for new areas of exploration based on questioning assumptions about the feeble measurements we can make from this pale blue dot.
Consider figure 5 of the following article for example:
https://arxiv.org/abs/1105.3470
The differently shaded ellipses represent different confidence levels. For the largest ellipse, the probability of the true values being outside of it is less than 1%. We call that 3-sigma confidence.
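For a rough sense of how sigma levels map onto probabilities (a quick sketch; note the convention differs slightly between a 1D interval and a joint 2D ellipse):

    from scipy.stats import norm, chi2

    # 1D Gaussian: chance of landing outside +/- n sigma
    for n in (1, 2, 3):
        print(f"{n}-sigma (1D): {2 * norm.sf(n):.2%} outside")  # ~31.7%, ~4.6%, ~0.27%

    # 2D ellipse drawn at Mahalanobis distance n ("n-sigma contour"):
    # chance of the true point landing outside it
    for n in (1, 2, 3):
        print(f"{n}-sigma ellipse (2D): {chi2.sf(n**2, df=2):.2%} outside")  # ~60.7%, ~13.5%, ~1.1%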
> Whenever I read things like "This model can't explain the bullet cluster, or X rotation curve, so it's probably wrong" my internal response is "Your underlying data sources are too fuzzy to make your model the baseline!"
Well, then do some error analysis and report your results. Give us sigmas, percentages, probabilities. Science isn't based on gut feelings, but cold hard numbers.
A key point in the article. From what I understand, this is the main way we measure things of vast distance and, from that, determine the universe's rate of expansion. If our understanding of these supernovae is wrong, as this paper claims, that would be a massive scientific breakthrough.
I'm really interested in the counterargument to this.
The expansion rate of the universe is not a velocity in the usual sense of distance/time. It's actually in units of velocity/distance, which reduces to 1/time. An expansion rate of r Hertz means that, if the rate stayed constant, a given span of distance would intrinsically grow by a factor of e roughly every 1/r seconds. The objects occupying the space don't "move" in any real sense due to expansion. They just wind up farther apart because space itself grew.
And, just like measurements of distance and time, measurements of the expansion rate change if you change your velocity. There is a special velocity in our universe which causes the expansion in all directions to be the same. From this special perspective, which is traveling at a kind of cosmic "rest" velocity, you can calculate the expansion rate. It turns out that the Sun is traveling at approximately 370 km/s with respect to that special "rest" velocity.
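To make the units concrete, here is a quick sketch converting a Hubble constant of roughly 70 km/s/Mpc (a round illustrative number, not a value from any particular survey) into 1/seconds, plus the corresponding e-folding time:

    KM_PER_MPC = 3.0857e19        # kilometres in one megaparsec
    SECONDS_PER_GYR = 3.156e16    # seconds in a billion years

    H0_kms_per_Mpc = 70.0                      # illustrative round number
    H0_per_s = H0_kms_per_Mpc / KM_PER_MPC     # ~2.3e-18 s^-1

    # With a constant rate H, distances grow as exp(H*t):
    # a factor of e roughly every 1/H seconds.
    e_fold_gyr = 1.0 / H0_per_s / SECONDS_PER_GYR
    print(f"H0 ~ {H0_per_s:.2e} 1/s, e-folding time ~ {e_fold_gyr:.1f} Gyr")  # ~14 Gyr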
Indeed. It's so hard to definitively prove things that are, that the most significant breakthroughs prove things that aren't (so to speak), imho.
Significant breakthroughs do both. Prove things aren’t as we thought. And are as the new model suggests.
I'll set a reminder to check back at that time to see who was right.
With 5 gigayears to work with I'm going to move a few star systems over, break down all the matter orbiting the star into a Dyson sphere made of computronium, and simulate visiting any world I could possibly ever want to.
"Prof Carlos Frenk, a cosmologist at the University of Durham, who was not involved in the latest work, said the findings were worthy of attention. “It’s definitely interesting. It’s very provocative. It may well be wrong,” he said. “It’s not something that you can dismiss. They’ve put out a paper with tantalising results with very profound conclusions.”"
https://www.theguardian.com/science/2025/nov/06/universe-exp...
I don't think so. Deceleration does not imply recollapse. AFAIK none of this changes the basic fact that there isn't enough matter in the universe to cause it to recollapse. The expansion will just decelerate forever, never quite stopping.
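A minimal sketch of why, using the Friedmann equation for a universe with matter and curvature only (no dark energy; the density value below is just an assumed input): for any matter density at or below critical, da/dt keeps falling but never reaches zero, let alone goes negative.

    import numpy as np

    def a_dot(a, omega_m, H0=1.0):
        # Friedmann equation, matter + curvature only (no dark energy):
        #   da/dt = H0 * sqrt(omega_m / a + (1 - omega_m))
        # For omega_m <= 1 this is positive at every scale factor a,
        # so the expansion decelerates forever but never reverses.
        return H0 * np.sqrt(omega_m / a + (1.0 - omega_m))

    for a in (1, 10, 100, 1000):
        print(a, a_dot(a, omega_m=1.0))  # falls toward zero, never negative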
AFAIK the previous models that all assumed that Type 1a supernovae were not affected by the age of the progenitor stars had no actual analysis to back that up; it was just the simplest assumption. This research is now actually doing the analysis.
Why would you assume this? It's not correct.
Type 1a supernovae aren't even assumed to be "standard candles" as is often claimed: rather, they're standardizable, i.e. with cross-checks and statistical analysis, they can be used as an important part of a cosmological distance ladder.
A great deal of analysis has gone into the development of that distance ladder, with cross-checks being used wherever it's possible to use them.
They look at surface brightness fluctuations in the same galaxies, Tully-Fisher distances[1], tip of the red giant branch distances[2], and even baryon acoustic oscillations[3]
Is it possible that this one single paper has upended all that? Theoretically. Is it likely? No.
[1] https://en.wikipedia.org/wiki/Tully%E2%80%93Fisher_relation
[2] https://en.wikipedia.org/wiki/Tip_of_the_red-giant_branch
[3] https://en.wikipedia.org/wiki/Baryon_acoustic_oscillations
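To make the cross-check idea above concrete, here is a toy sketch (all numbers invented for illustration): each indicator yields a distance modulus mu = 5*log10(d / 10 pc), and the consistency test is whether the distances inferred by different indicators for the same galaxy agree within their errors.

    def distance_mpc(mu):
        # distance modulus mu = 5*log10(d / 10 pc)  ->  distance in Mpc
        return 10 ** (mu / 5 + 1) / 1e6

    # Hypothetical measurements of the same galaxy by two indicators
    mu_snia = 31.50   # from a standardized Type Ia supernova (made-up)
    mu_trgb = 31.42   # from the tip of the red giant branch (made-up)

    d1, d2 = distance_mpc(mu_snia), distance_mpc(mu_trgb)
    print(f"SN Ia: {d1:.1f} Mpc, TRGB: {d2:.1f} Mpc, ratio {d1 / d2:.3f}")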
(paraphrasing George Ellis)
We’re in a bounding sphere, with a radius that’s roughly 46.5 billion lightyears, so any observation we make may be true for our local observable range, but there’s no (known) way to know what’s beyond that sphere.
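That ~46.5 billion light-year figure is the comoving radius: the present-day distance to the farthest points whose light has had time to reach us, with the expansion folded in. A rough sketch of where it comes from, using round LCDM parameters (assumed values, not taken from this paper):

    import numpy as np
    from scipy.integrate import quad

    C_KM_S = 299792.458
    H0 = 67.7                   # km/s/Mpc (assumed round value)
    om_r, om_m = 9e-5, 0.31     # radiation and matter densities (assumed)
    om_l = 1.0 - om_r - om_m    # flat universe: the rest is dark energy

    def E(a):
        # dimensionless Hubble rate H(a)/H0
        return np.sqrt(om_r / a**4 + om_m / a**3 + om_l)

    # comoving distance to the particle horizon: c * integral of da / (a^2 H(a))
    integral, _ = quad(lambda a: 1.0 / (a**2 * E(a)), 1e-10, 1.0)
    d_mpc = (C_KM_S / H0) * integral
    print(f"~{d_mpc * 3.2616e-3:.0f} billion light-years")  # roughly 46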
For claims about how the universe works at scales and timeframes so utterly beyond anything testable, it's a little difficult to say this is credible at all. Not to dunk on the researchers, but in order to validate their conclusions there's a whole chain of dependencies and assumptions you'd have to follow along with, and each of those is its own complex bird's-nest tangle of assertions. I don't see how you can really say one way or another until you have a lot more information and a much better Theory of Everything than we've got right now.
For what it's worth, for all the impact it'll have on anyone's life outside of academia, I'd say they're 100% correct and people should buy them free beers at their local pubs for at least the next year in return for explaining their ideas at length.
This study (and many others, depending on the cosmic scales they use) mainly uses supernovae of Type Ia, i.e. the energy emitted by the supernova of a binary accretion star: a white dwarf capturing mass from a very nearby companion star, increasing its mass until it reaches a critical point and undergoes runaway thermonuclear fusion, going supernova with all the added energy.
That was (and still is now, with some corrections we have found since the middle of last century) supposed to be the same everywhere. Problem is, we keep finding new corrections to it, like this study claims.
That is in fact the big claim of this study (ignore the universe expansion part): that they found a new correction to the Type Ia supernova luminosity. It's a very big claim and extremely interesting if confirmed. But, like all big claims, it needs a big confirmation. I'm a bit skeptical, TBH.
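For context on what "a new correction" means in practice: the usual standardization already corrects the raw peak brightness for light-curve width and colour (the Tripp relation used in SALT2-style fits), and an age term would be one more correction of the same kind. A schematic sketch with made-up coefficients (the real fitted values and the exact form of any age term belong to the analyses, not to this snippet):

    def distance_modulus(m_B, x1, c, age_gyr=None,
                         M=-19.3, alpha=0.14, beta=3.1, gamma=0.0):
        # Tripp-style standardization of a Type Ia supernova:
        #   mu = m_B - M + alpha*x1 - beta*c   (stretch and colour corrections)
        # A progenitor-age term (gamma * age_gyr) is the kind of extra
        # correction the paper argues for; its form here is illustrative only.
        mu = m_B - M + alpha * x1 - beta * c
        if age_gyr is not None:
            mu -= gamma * age_gyr
        return mu

    print(distance_modulus(m_B=24.1, x1=0.5, c=0.02))  # toy numbers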
Out of curiosity, what data are you drawing on, or what qualifications do you have, that support your skepticism over three different modes of analysis (as well as pretty much every recent development in the field) supporting this claim:
"Remarkably, this agrees with what is independently predicted from BAO-only or BAO+CMB analyses, though this fact has received little attention so far.""A change on the standard candles calibration would be a huge deal for cosmology and galactic astronomy (and other fields) and would not be taken lightly at all. There are all sorts of ramifications from this and if astronomers aren't all in an oof about it, it is because big proof is needed for big claims.
And a change in the standard candles calibration is indeed a very big claim.
If we subscribe to a theory of the multiverse, set theory, likelihood, and interaction-driven evolution based on gradient-type fundamental laws. Locally changing. Obviously everything shares a fundamental quality that is part of existence itself. But obviously there are sets, there is differentiation. But it is not created; the infinity of unconstrained possibilities exists in the first place and reorganizes itself, a bit like people are attracted to people who share some commonalities or have something they need from each other and form tribes. The same process kind of works for synapse connections, for molecule formation, for atoms... etc. Everything is mostly interacting data.
We could say that the concept of distance is a concept of likelihood. The closer, the more likely.
Just a little weird idea. I need to think a bit more about it. Somewhat metaphysical?
I can say the same about forgnoz, which is something I've just invented that must exist by definition.
You'd need to try a bit harder to make existence actually inevitable.
Just because it is impredicative doesn't mean it has to be hard to understand, I think. It's almost a tautology, rather.
Oh, by the way, forgnoz exists; you made it up to designate something. It doesn't have to refer to something material. It could be an idea. After all, inventions don't exist by being material in the first place. But ideas have at least material support (your brain signals) and the data acquired through your body. As far as we know.
Side-note: the ontological argument is an argument for the existence of God, which uses the same principle as the grandparent. “Imagine God. Imagine God is good. A good God should exist, because otherwise that god is not good. Therefore, the good God we imagined has the property of existence. Therefore God exists.” The issue is exactly the same: we can imagine something with property X, but that doesn't mean we can find something with property X.
Universe gong.
If you want to believe in an intelligent creator—not that I do—it's as if they were accelerating the expansion until the solar system was formed, then turned the control knob down.
But wavering around a line above y = 0.
Stars are just basic nuclear physics and gravity, that's why they're expected to be stable and predictable.
> Direct measurement of the CMB seems to be simpler with less chance of error.
Direct measurement of the CMB doesn't tell you anything on its own, you have to interpret the data in terms of a model. If you have a completely different model, say one without dark energy or without dark matter, CMB measurements would tell you something different than LCDM.
> More importantly, when the corrected supernova data were combined with BAO and CMB results, the standard ΛCDM model was ruled out with overwhelming significance, the researchers said.
I notice they're not saying that dark energy is entirely unnecessary. Do we know if that's just default caution, or are there still strong reasons to believe dark energy exists?
Now these people are saying SN actually point at zero dark energy, once the physics is accounted for properly. That doesn't invalidate the CMB and BAO results. So dark energy must have had a big influence in the early universe, and no influence in the late universe, so it must be dynamic. (Ironically, supernovae were the first evidence for dark energy, which I guess was just a coincidence, if this new research is correct.)
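For what "dynamic" usually means operationally: instead of a constant equation of state w = -1 (a cosmological constant), recent analyses fit a time-varying w, most commonly the CPL form w(a) = w0 + wa*(1 - a). A tiny sketch with invented parameter values:

    def w_cpl(a, w0=-0.7, wa=-1.0):
        # CPL parametrization of the dark-energy equation of state.
        # w0 and wa here are illustrative, not fitted values.
        return w0 + wa * (1.0 - a)

    for a in (0.33, 0.5, 1.0):   # roughly redshift z = 2, 1, 0
        print(f"a = {a}: w = {w_cpl(a):.2f}")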
At the very bottom. Weird how style guides keep putting important information like this in harder to reach places.
https://newscenter.lbl.gov/2025/03/19/new-desi-results-stren...
Roger Penrose seems to be leaning toward, or increasingly convinced of, the cyclic universe theory (conformal cyclic cosmology)...
<retracted> According to some calculations, it should in principle be possible to colonize the entire observable universe in less than a hundred million years. It's much too fast for the expansion to affect except marginally.</retracted>
The relative jump in difficulty from interstellar to intergalactic is much smaller than from interplanetary to interstellar.
Anyway, as others said, mere intragalactic (and intra-Local Group) travel is not affected by expansion in any way whatsoever.
[1] https://www.sciencedirect.com/science/article/abs/pii/S00945..., PDF at https://www.aleph.se/papers/Spamming%20the%20universe.pdf
The observable universe is ~93B LY - unless you're assuming FTL (and MUCH faster than light), I don't see how that's possible?
Interesting way to put it... This doesn't seem that accurate. With sufficiently advanced technology, some of which we already possess, we could expect to propel a minute spacecraft to a considerable fraction of the speed of light and reach nearby stars, possibly before the end of the century. Reaching the other end of the galaxy is a massively bigger undertaking. The difficulty jumps by orders of magnitude at every step of the way.
Pluto is about 38 AU from Earth. Proxima Centauri is about 2.7 × 10^5 AU away (about 4.24 ly), and that's roughly a 7 × 10^3 multiplication. The Milky Way is about 50,000 ly in radius, and the Andromeda Galaxy is about 2.5 × 10^6 ly away. Going from interplanetary distances to interstellar is roughly a 10^4 jump, and from interstellar to intergalactic closer to 10^6, give or take.
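A quick back-of-the-envelope check of those ratios (rough figures, consistent units):

    AU_PER_LY = 63241.0              # astronomical units in one light-year

    pluto_au = 38.0                  # Earth -> Pluto, roughly
    proxima_au = 4.24 * AU_PER_LY    # Earth -> Proxima Centauri, ~2.7e5 AU
    andromeda_ly = 2.5e6             # Milky Way -> Andromeda, roughly

    print(f"interplanetary -> interstellar: x{proxima_au / pluto_au:,.0f}")   # ~7,000
    print(f"interstellar -> intergalactic: x{andromeda_ly / 4.24:,.0f}")      # ~590,000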
...what? That doesn't seem right, just from a really quick gut check it looks like the observable universe has a radius of 45.7 billion light years [0]. Even if the universe wasn't expanding nobody could get to everything any faster than that number of years right? Maybe you saw something that was talking about the local (Virgo) supercluster, which I think has a radius of around 55 million light years, so that sounds more like something that could be done on that timescale "in theory". But there are millions and millions of superclusters in the observable universe overall.
----
Also note that there isn't any "container" to fill up. It could well be infinite. It's just that we will be forever limited to a finite subset, even in theory.
https://arxiv.org/pdf/1205.2281 https://ntrs.nasa.gov/api/citations/20200001904/downloads/20...
https://ia800108.us.archive.org/view_archive.php?archive=/24...
"The third mission uses a three-stage sail for a roundtrip manned exploration of Eridani at 10.8 light years distance."
Edit: yep, the universe's expansion may actually have started to slow rather than accelerate at an ever-increasing rate as previously thought, a new study suggests.
All of that without having traveled farther than one light second from its home.
It is not questioning that the universe is expanding. It is questioning how the expansion is happening. Massive difference. The rate of expansion has always been more of a "probably" and "looks like" rather than "we have very strong evidence" (unlike expansion itself, for which there is very strong evidence). This is a classic "we have tweaked our model as we've learned more" type thing (assuming it holds).
Think of trying to find a bus that could be anywhere on Earth, that is moving (so it's not easy to keep track of), and that is painted to be camouflaged against its environment.
Now instead try to imagine looking for that bus on Jupiter. Gets way harder. But it's way bigger than that: you're looking for a black dot in an area the size of millions of Jupiters, and you just hope it crosses in front of a star so you can track it.
Most problems involving space are insanely hard.
And of course, the people concerned with tracking near-earth asteroids are not connected in any way with cosmology.
While science might not have a definitive answer for everything, it does distinguish between fact and theory.
There seem to be so many fudge factors in the whole chain of analysis we won't have an idea until we can make vastly improved measurements.
Why would this be? The only physics we know is the one inside our observable universe, there could be variations beyond, or even unknowable laws that don't require conservation of matter outside the edge of the universe.
Our incredibly vast universe could be a minuscule blob feeding from an incredibly vaster parent universe, in which case it could be breaking conservation infinitely from our perspective.
Also, this discovery is still being explained with dark energy (albeit a time-varying one), so it still does not assume global energy conservation.
My favorite quote:
> I like to think that, if I were not a professional cosmologist, I would still find it hard to believe that hundreds of cosmologists around the world have latched on to an idea that violates a bedrock principle of physics, simply because they “forgot” it. If the idea of dark energy were in conflict with some other much more fundamental principle, I suspect the theory would be a lot less popular.
Because there is no shortage of 'crackpots' that have 'obvious' solutions to unsolved physics problems, and that want to publish papers about it.