At the end of the last post on falsifiability, I mentioned the possibility that the multiverse hypothesis might provide an explanation for the famous *cosmological constant problem*. Today I'm going to try to elaborate a little on that argument and why I find it unconvincing.

Limitations of space and time mean that I cannot possibly start this post as I would like to, with an explanation of what the cosmological constant problem *is*, and why it is so hard to resolve. Readers who would like to learn a bit more about this could try reading this, this, this or this (arranged in roughly descending order of accessibility to the non-expert). For my purposes I will have to simply summarise the problem by saying that our models of the history of the Universe contain a parameter $\rho_\Lambda$ – which is related to the vacuum energy density and sometimes called the dark energy density – whose expected value, according to our current understanding of quantum field theory, should be *at least* $10^{-64}$ (in units of the Planck scale energy) and quite possibly as large as 1, but whose actual value, deduced from our reconstruction of the history of the Universe, is approximately $1.5\times10^{-123}$. (As ever with this blog, the mathematics may not display correctly in RSS readers, so you might have to click through.)

This enormous discrepancy between theory and observation, of somewhere between 60 and 120 orders of magnitude, has for a long time been one of the outstanding problems – not to say embarrassments – of high energy theory. Many very smart people have tried many ingenious ways of solving it, but it turns out to be a very hard problem indeed. Sections 2 and 3 of this review by Raphael Bousso provide some sense of the various attempts that have been made at explanation and how they have failed (though this review is unfortunately also at a fairly technical level).

This is where the multiverse and the anthropic argument come in. In this very famous paper back in 1987, Steven Weinberg used the hypothesis of a multiverse consisting of causally separated universes which have different values of $\rho_\Lambda$ to explain why we might be living in a universe with a very small $\rho_\Lambda$, and to predict that if this were true, $\rho_\Lambda$ in our universe would nevertheless be large enough to measure, with a value a few times larger than the energy density of matter, $\rho_m$. This was particularly important because the value of $\rho_\Lambda$ had not at that time been conclusively measured, and many theorists were working under the assumption that the cosmological constant problem would be solved by some theoretical advance which would demonstrate why it had to be exactly zero, rather than some exceedingly small but non-zero number.

Weinberg's prediction is generally regarded as having been successful. In 1998, observations of distant supernovae indicated that $\rho_\Lambda$ was in fact non-zero, and in the subsequent decade-and-a-half increasingly precise cosmological measurements, especially of the CMB, have confirmed its value to be a little more than three times that of $\rho_m$.

This has been viewed as strong evidence in favour of the multiverse hypothesis in general, and in particular for string theory, which provides a potential mechanism for the realisation of this multiverse. Indeed, in the absence of any other observational evidence for the multiverse (perhaps even in principle), and the ongoing lack of experimental evidence for other predictions of string theory, Weinberg's anthropic prediction of the value of the cosmological constant is often regarded as the most important reason for believing that these theories are part of the correct description of the world. For instance, to provide just three arbitrarily chosen examples, Sean Carroll argues this here, Max Tegmark here, and Raphael Bousso in the review linked to above.

I have a problem with this argument, and it is not a purely philosophical one. (The philosophical objection is loosely the one made here.) Instead I disagree that Weinberg's argument still correctly predicts the value of $\rho_\Lambda$. This is partly because Weinberg's argument, though brilliant, relied upon a few assumptions about the theory in which the multiverse was to be realised, and theory has subsequently developed not to support these assumptions but to negate them. And it is partly because, even given these assumptions, the argument gives the wrong value when applied to cosmological observations from 2014 rather than 1987. Both theory and observation have moved away from the anthropic multiverse.

To see better where the problem lies, let's go through Weinberg's original argument again. It is based on the following fundamental assumptions:

- $\rho_\Lambda$ is not fixed but takes on different values in different causally disconnected regions or bubble universes
- these bubble universes are identical in *every respect* except for their different values of $\rho_\Lambda$
- at least some of these bubble universes have $\rho_\Lambda$ values compatible with our being here to puzzle over them (in a sense to be made clear in a minute)
- the distribution of $\rho_\Lambda$ values within this restricted subset favours larger absolute values $|\rho_\Lambda|$.

Weinberg then showed that the range of possible $\rho_\Lambda$ values that is compatible with the mere observation that we exist is limited from above: since vacuum energy drives an accelerated expansion of the universe, if there is too much of it the universe expands too fast, diluting the matter density to the extent that fluctuations in this density are unable to congeal under the action of gravity to form galaxies, or indeed any non-linear structures. This, we can all agree, would preclude the existence of any observers in such a universe. (It is also limited from below by the requirement that the universe not recollapse too soon in a Big Crunch, but this lower limit is a negative value with a much smaller $|\rho_\Lambda|$, so we can ignore it here.)

There exists therefore a *catastrophic boundary* in the distribution of $\rho_\Lambda$. Assumption 2 allows us to calculate where this boundary is, and assumptions 3 and 4, together with the fact that galaxies and humans exist, then predict that in our universe the value of $\rho_\Lambda$ must be very close to this catastrophic boundary – as large as it is possible to be, because all the bubble universes with larger values have no observers recording their existence. If the observed value is indeed close to the catastrophic boundary, this may be taken as vindication of assumption 1.

Note that assumption 4 is crucial here, but is also very plausible. This is because over *all* bubble universes $\rho_\Lambda$ can presumably take all values up to $\sim1$. But the catastrophic boundary is at values $\rho_\Lambda\sim10^{-121}$. We need only assume that the probability distribution does not vary much over this absolutely *tiny* allowed range (and that there is nothing special about the value $\rho_\Lambda=0$); then it follows that $\mathrm{d}P/\mathrm{d}\rho_\Lambda$ is approximately constant, and assumption 4 is valid.
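One can get a feel for just how mild this flatness assumption is with a one-line numerical check. Take any smooth prior – say $\mathrm{d}P/\mathrm{d}\rho_\Lambda \propto e^{-\rho_\Lambda}$, an arbitrary choice purely for illustration – and ask how much it varies across the anthropically allowed window:

```python
import math

# Fractional variation of an illustrative smooth prior, dP/drho ~ exp(-rho),
# between rho = 0 and the catastrophic boundary rho ~ 1e-121 (Planck units).
# math.expm1 computes exp(x) - 1 without catastrophic cancellation,
# which matters for arguments this tiny.
window = 1e-121
fractional_change = abs(math.expm1(-window))  # |f(window) - f(0)| / f(0)

print(fractional_change)  # ~1e-121: utterly negligible variation
```

Any prior without a feature deliberately placed within $10^{-121}$ of zero behaves the same way, which is why a constant $\mathrm{d}P/\mathrm{d}\rho_\Lambda$ looked like such a safe assumption.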
The first problem is one of detail. The catastrophic boundary value is clearly related to the size of the initial density fluctuations which collapse to form galaxies: smaller initial fluctuations take more time to collapse, so require a smaller $\rho_\Lambda$, whereas bigger fluctuations would collapse quicker and so be more tolerant of larger values. Knowing how big the initial fluctuations were is the tricky bit, but Weinberg argued – in a stroke of brilliant simplicity – that we could obtain a limit on how big the initial fluctuations had been by seeing how soon the first galaxies formed. The older the oldest galaxy we see today, the quicker it must have collapsed and so the larger the initial fluctuations. He derived a formula $$\rho_{\Lambda,\mathrm{max}}=\frac{\pi^2}{3}(1+z_c)^3\rho_m$$ for the catastrophic boundary in terms of the matter density $\rho_m$, where $z_c$ is the redshift of the highest-redshift galaxy we observe today.

Back in 1987, the highest redshift galaxies that had been observed were at $z_c\sim4.5$. Plugging that value in gives us the anthropic bound $\rho_\Lambda<550\rho_m$. Already the observed value $\rho_\Lambda\sim3.2\rho_m$ starts to feel uncomfortably small. But in 2014, the highest-redshift known galaxies are no longer at redshift 4.5 but instead at redshifts of at least 8.6 (and possibly even as high as 11.9). Because the redshift appears to the third power in the equation above, this difference is significant. It means that the best estimate for the location of the catastrophic boundary is now at least $$\rho_\Lambda\sim 3000\rho_m\;,$$ and possibly as large as $$\rho_\Lambda\sim7000\rho_m.$$ Weinberg himself actually stated that if the (then unmeasured) value of $\rho_\Lambda$ were to turn out to be more than 3 orders of magnitude smaller than his anthropic upper bound, "we would have to conclude that the anthropic principle does *not* explain why it is so small" (p. 8). Unfortunately, we are pretty much at that limit now.
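These numbers follow directly from Weinberg's formula; a quick sketch (in Python, using the redshifts quoted above) reproduces them:

```python
import math

def rho_lambda_max(z_c, rho_m=1.0):
    """Weinberg's anthropic bound (pi^2/3) * (1 + z_c)^3 * rho_m,
    expressed in units of the matter density rho_m."""
    return (math.pi ** 2 / 3) * (1 + z_c) ** 3 * rho_m

for z_c in (4.5, 8.6, 11.9):
    print(f"z_c = {z_c:4.1f}  ->  bound ~ {rho_lambda_max(z_c):6.0f} rho_m")
# roughly 550, 2900 and 7100 times rho_m respectively
```

The cubic dependence on $(1+z_c)$ is what makes the bound so sensitive to new high-redshift galaxy discoveries.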
But there's also a theoretical reason that Weinberg's argument does not work so well in the context of the stringy multiverse, and it is because assumption 2 fails. At the time, Weinberg did not propose any concrete theoretical framework which could produce models satisfying assumptions 1 to 3, but suggested that this might happen in the future. And indeed the future did provide the "string landscape" with its $10^{500}$ vacua and the idea of eternal inflation which would help to populate them. This is why proponents of string theory like to claim the "prediction" of the value of the cosmological constant as evidence in favour of the multiverse.

Unfortunately, however, it does not seem to be possible to obtain an ensemble of universes in the landscape which differ *only* in the value of $\rho_\Lambda$. In particular, it seems inevitable that these universes will also generically have different amplitudes of the initial fluctuations than that which we observe in ours. As soon as this quantity becomes a variable on the same footing as $\rho_\Lambda$, the whole idea of a hard catastrophic boundary melts away. There will be other bubble universes which have *much* larger values of $\rho_\Lambda$ but nevertheless contain galaxies and other non-linear structures, because they also had much larger initial fluctuations.
In this circumstance, it is still possible to argue that the anthropic multiverse is compatible with the observed cosmological constant, but only if one also abandons assumption 4, which appeared so attractive. Instead one must postulate that the probability distribution $\mathrm{d}P/\mathrm{d}\rho_\Lambda$ is *not* a constant but disfavours values of $\rho_\Lambda$ that are both "too large" and "too small", perhaps because universes with smaller $\rho_\Lambda$ contain "more observers". But this is now just another manifestation of the *measure problem* in cosmology: how to calculate probabilities of events (the existence of observers) in a (possibly) infinite collection of universes, each of infinite extent, and still get a meaningful result. The answer obtained depends on the measure (the method of treating the infinities) used, but it isn't at all clear which measure is most appropriate.
In my opinion, the argument now degenerates to *post*-dicting the choice of measure that will most closely reproduce the observed value of $\rho_\Lambda$. This may or may not be an interesting thing to do – personally I think it's not for me – but one thing is for sure: it can no longer be argued that the multiverse hypothesis *predicts* the cosmological constant in any testable or falsifiable way. Rather, the observed value of the cosmological constant is being used to work out possible measures to apply to a multiverse whose existence is already taken for granted.
Which is an entirely different thing.

Something which is rarely mentioned by the pundits is that Weinberg accepts the huge particle-physics cosmological constant and has a "bare" cosmological constant like Einstein's original one (which has nothing to do with particle physics, quantum theory etc) with a slightly smaller absolute value but negative, so that the two almost, but not quite, cancel.

Is it so hard to believe that the particle physicists are just wrong here? Their huge-lambda prediction seems to be accepted without much debate.

Note that many authors discuss the multiverse independently of string theory.

There are two problems I see with your suggestion. The first is that the effect of vacuum energy on inertial mass is confirmed by measurement of the Lamb shift, and its effect on gravitational mass has been confirmed by free fall experiments (which test the equivalence principle). So we know that the vacuum gravitates. We know that the Standard Model of particle physics works very well up to at least LHC energies. If particle physics is going to turn out to be wrong it's got to do so in some extraordinary fashion.

The second problem – perhaps of more relevance to those of an astrophysical mindset – is that if the vacuum doesn't gravitate there can be no inflation, so what about the horizon problem, the initial seed fluctuations and so on?

Has Weinberg ever expressed any opinion about this question of what one gets using modern numbers in his original argument? Also, what about his later arguments in http://arxiv.org/abs/astro-ph/9701099 ?

Is the argument you're making here, with its "three orders of magnitude" below the anthropic bound, a widely discussed one?

Thanks!

Raphael Bousso's review I mentioned above acknowledges it (in a footnote), but argues that (a) a discrepancy of three orders of magnitude is better than one of 120 orders of magnitude, and that (b) with certain approaches to the measure problem one can eliminate the three orders of magnitude. But he doesn't deal with the problem of circularity of argument that then results.

I haven't read the Martel et al paper as carefully as Weinberg's original. The value of the anthropic bound they get might be better, but they still fix the amplitude of the initial fluctuations to be the same as in our universe, which appears unlikely to be justified by theory.

Thanks

Weinberg's opinion of his original argument is presumably the same as stated in his paper that you mention, i.e. that it doesn't work precisely because of the high-redshift galaxies.

I don't understand the point you're trying to make. If it is that the multiverse is science, only it is difficult science, then I agree. Maybe I'm just overreacting to the "not even wrong" brigade when I assume that instead you're trying to claim that a multiverse has been tested and ruled out and all that is left is unscientific.

But the point to me is just that it isn't as simple as what Weinberg thought. Fine. The multiverse may be correct, it might not; we should continue testing and find out. It just so happens that those tests are hard. The measure problem means that in all likelihood one has to include one's choice of measure within one's model. Fine. After post-dicting one observation one can start predicting others.

How is this one model for how the universe works any different to any others that have come in the past? It might be right, we should try to work out if it is, just like any other scientific model!

The point I'm making is that the "prediction" of the cosmological constant is not the simple victory for the multiverse that it is made out to be.

A corollary point is that if one must retro-fit the measure to get the right cosmological constant, then the fact that one gets the right cosmological constant is not evidence that the multiverse exists.

You'll notice that other than these two points I make no comment on whether the multiverse is in general falsifiable (perhaps there are all sorts of other possibilities I'm not expert enough to know about).

As you pointed out yourself in a comment above, 1 in 1000 is much better than 1 in 10^60. The retrofitting is only for the last three orders of magnitude.

The multiverse might not be true, but I don't see the qualitative difference between it and any other scientific model in history.

That's only one in 1000 *if* nothing except $\Lambda$ can vary.

In my opinion the qualitative difference when it comes to the multiverse is that it may not be able to make any kind of predictions other than anthropic ones (I await correction on this if someone knows of some), and anthropic predictions are plagued by the measure problem.
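For what it's worth, the "one in 1000" can be made concrete with a back-of-envelope sketch: under a flat prior $\mathrm{d}P/\mathrm{d}\rho_\Lambda$ cut off sharply at the catastrophic boundary, the probability of landing at a value as small as ours is just the ratio of the observed value to the boundary (numbers as quoted in the post):

```python
# Back-of-envelope probability of a vacuum energy as small as observed,
# assuming a flat prior dP/d(rho_Lambda) with a sharp cutoff at the
# catastrophic boundary. All values in units of the matter density rho_m.
rho_obs = 3.2       # observed rho_Lambda ~ 3.2 rho_m
rho_max = 3000.0    # catastrophic boundary for z_c ~ 8.6

p_tail = rho_obs / rho_max  # P(rho_Lambda <= rho_obs) under the flat prior
print(f"P(rho_Lambda <= observed) ~ {p_tail:.1%}")  # roughly 0.1%, i.e. 1 in 1000
```

Of course, once the amplitude of the initial fluctuations is also allowed to vary, there is no single rho_max, and even this modest probability cannot be computed without choosing a measure.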

So it *may* turn out to be falsifiable, but that's a possibility that is rather less likely than for most other theories.

"... in the absence of any other observational evidence for the multiverse ..." Does string theory with the finite nature hypothesis make 3 empirical predictions, based upon restriction of string vibrations to the Leech lattice? Is a complete infinity always either (1) a mathematical convenience or (2) a physical error?

"In the physics I have learned there were many examples of where the mathematics was giving infinite degenerate solutions to a certain problem (classical mechanics problems e.g.). There the problem was always a mistake in the physics assumptions. Infinity is mathematical not physical, as far as I know." — Maria Spiropulu, Caltech physics professor http://www.edge.org/discourse/landscape.html#spiropulu

If X’s law is to string theory as Kepler’s laws are to Newtonian mechanics, then who is X?

http://www.weizmann.ac.il/weizsites/milgrom/ Mordehai (Moti) Milgrom, Weizmann Institute of Science

Have string theorists stupidly ignored Milgrom for 30 years?

Google "space roar dark energy".

Please don't view this page as the appropriate place for such nonsensical stream-of-consciousness comments.

However, I thought it worth pointing out that clicking through from the first link you provide one sees Leonard Susskind claiming "there is a constant in nature called the cosmological constant, and it's a certain number. If that number differed by the tiniest amount from what it really is, the universe could not have been born with galaxies, stars, planets, and so forth". This is exactly the kind of hyperbole I was trying to refute above. Even making favourable assumptions that the landscape picture is currently unable to justify, the number could be *more than 1000 times bigger* than it is and galaxies, stars and planets would still be able to form.

It just really annoys me when all arguments about the anthropic principle go on and on about how Weinberg correctly predicted the value of the cosmological constant. Weinberg himself was probably the first to admit that his argument does not give the correct value.

This comment has been removed by a blog administrator.

Sorry, further irrelevant discussion about MOND will be ruthlessly deleted. Comments that don't make a point but just regurgitate irrelevant quotes from elsewhere will also be deleted.

Sesh, thanks for bringing up the important question of the interplay between the measure problem and Weinberg's argument.

First, a technical point unrelated to the measure problem: even if we treated Weinberg's catastrophic boundary (galaxy formation) as sharp, it is just a theta function that multiplies the prior probability distribution, dP/d(log Lambda), which is proportional to Lambda. So the exponential growth of the probability density in log Lambda survives, up to Lambda of order 1/(galaxy formation time)^2, and the probability is zero for greater values. Thus, in this simplified treatment, the probability density is indeed highest right at the catastrophic boundary. But given a probability distribution, one never claims that the prediction is the value where the probability density is highest. Rather, we ask how many standard deviations the observed value is from the central value. If this is many sigma, we have falsified the prediction at that level of confidence. So it's not correct to say that Weinberg's argument predicts that we should find ourselves right at the catastrophic boundary. However, depending on the strength of the anthropic assumptions, one does find that the observed value is outside two or three sigma, so there is definitely some tension. Moreover, as you correctly note, this tension becomes catastrophic once we allow the initial density contrast to vary.

Now concerning the measure. You are right that the measure is an important ingredient in how predictions are made in an eternally inflating universe. (This hasn't got much to do with the multiverse. Since the observed cosmological constant is positive, we have eternal inflation, and hence a measure problem, unless our own vacuum is tuned to decay in a few billion years.)

It turns out to be very difficult to write down proposals for measures that are both well-defined and not obviously ruled out. For example, until the causal patch measure was invented, known measures predicted that even the most basic observations are 10^dozens (!) of standard deviations from the mean. This devastating conclusion obtains independently of whether there are other vacua, so these measures are ruled out. Importantly, the measure Weinberg implicitly used (observers-per-baryon) is ruled out, roughly for this reason; see e.g. hep-th/0610132 .

I proposed the causal patch measure in 2006. (Closely related measures were developed subsequently by Vilenkin, Guth, Linde and others.) I was not motivated by (and was, at the time, unaware of) the catastrophic problems with Weinberg's measure; nor by the phenomenology of the prediction for Lambda. My interest was entirely formal: it seemed to make sense to apply certain lessons from quantum black hole physics to cosmology, and this measure seemed like the most straightforward implementation. The idea is to average over all possible causally connected regions, so it's really quite simple, and it certainly contained no knobs to dial to yield some particular probability distribution for some observable. (In fact it would be very difficult to do this with any measure that is defined purely as a geometric cutoff, which all the leading proposals are.)

(... continued from above)

When Harnik, Kribs, Perez and I decided to apply this measure to the prediction of the cosmological constant, in hep-th/0702115, we had no idea that it would so dramatically improve the prediction; nor that it would permit us to drop the assumption that observers require galaxies. We set out by carefully modelling galaxy and star formation, writing Mathematica codes, etc., only to discover that the only relevant time scale for Lambda, in the causal patch approach, is the time when the observers live. We realized this only after our postdiction for Lambda, produced by our code, turned out to be in perfect agreement with observation (in the sense that the observed value was within the central 1 sigma region). Only then did we understand that we had worked way too hard, and it wasn't even necessary to keep track of galaxies. Including traditional anthropic boundaries such as galaxy formation, etc., only further suppresses the distribution in a regime where it is already negligible. (The argument is quite simple, so in hindsight this should have been obvious, as is often the case.)

It took a while to appreciate the generality of this result. It remains somewhat obscured by the discussion of entropy production in the Harnik et al. paper. Even my 2007 review, which you cite, doesn't discuss it very clearly. The point is that with the causal patch, one primarily resolves the coincidence problem that Lambda dominates around the time when the chosen class of observers live. It depends on nothing else. It is robust, e.g., against variations of the initial density contrast, and it applies even to vacua with completely different low energy physics.

For a brief and more up-to-date summary of this, see Sec. 3.7 of 1203.0307. For a more speculative discussion of where the enormous scales in our universe may originate, see 1011.0714.

To summarize, you are right that in general, it is impossible to compute a probability distribution without specifying both a measure and the structure of the landscape. However, by asking the right questions it is possible to test measures independently of other vacua, using only the existence of our own vacuum. (It is also possible to rule out certain landscapes of vacua for all known measures, such as the Brown-Teitelboim landscape.) The causal patch measure has turned out to resolve serious problems with Weinberg's seminal argument, and to make more robust predictions. But this came as quite a surprise; it was certainly not designed for this purpose.

Hi Raphael, I'm kind of being a bit of a poacher here, but if you were interested in writing a guest post about this at our blog (i.e. here), we would be super-keen to host you. It would only need to be, basically, exactly what you've written above, but in ever-so-slightly less technical language (though not too much less technical). Though you could also add a little bit more detail, if you wished.

Of course, because you wrote your comment here, at Sesh's blog, and not at Trenches, if you do want to write a guest post and Sesh would prefer to host it, I would definitely back off and look forward to reading it at Blank on the Map.

However, either way, I do think this perspective deserves to be aired at a higher level than a comment, written almost a week after the original post. So, you should definitely write a guest post about this *somewhere*!

Apparently I can't spell my own name correctly! (I wondered why it wasn't auto-completing)

"Both theory and observation have moved away from the anthropic multiverse." IMO cleverly contrived D-brane adjustments can make the string landscape, SUSY, and eternal cosmological inflation correspond to any empirical observations — whether for good or ill is unclear.

Are you there Sesh? I think there’s a need to go back to basics with the vacuum catastrophe. Start with Einstein’s stress-energy tensor. It’s got an energy-pressure diagonal. And see page 185 of the Doc 30 Foundation of the General Theory of Relativity:

“the energy of the gravitational field shall act gravitatively in the same way as any other kind of energy”. Vacuum energy density varies in a gravitational field. And conservation of energy ought to tell you that in an expanding universe, it varies over time. IMHO people miss the trick here. It’s like they don’t know about the bag model, or the shear stress term in the stress-energy tensor.

The universe expands like the balloon analogy we’ve all seen on the Discovery Channel. And a balloon is the size it is because the internal pressure is counterbalanced by the tension in the skin. Now, how do you make that balloon bigger? You can blow some more air into it. That’s like adding more energy, increasing the pressure. But when it comes to the universe, that’s creation ex nihilo. It drives a coach and horses through conservation of energy. Plus dark energy is described as negative pressure, but space has a positive energy density and a positive volume, and pressure is energy divided by volume. So the pressure can’t *be* negative. That sucks. But there is another way to make that balloon bigger: reduce the tensile strength of the skin. Think bubblegum. Tension is negative pressure, and when you reduce the tension the balloon gets bigger until the internal pressure is reduced and balances the tension again. But the balloon got bigger, so the skin is now thinner, so the tensile strength is reduced again, so the balloon gets bigger again.

And so it goes, the universe expands, and that expansion doesn’t stop, because as space expands the “strength of space” reduces. I’m no advocate of MOND, but see page 5 of http://arxiv.org/abs/0912.2678 where Milgrom mentions elasticity and strength.

IMHO looking at it in this simple mundane way makes all talk of a multiverse sound bizarre.

John Duffield

If only physics were so simple, and we could solve all problems just by thinking of analogies with bubblegum and balloons and watching the Discovery Channel! Unfortunately it isn't. A lot of people don't seem to realise that the "explanations" of physics that pop-science sources such as the Discovery Channel provide are just *analogies*; they're not the real physics. Often they're not even good analogies, and they completely fail as soon as they are examined in even the tiniest bit more detail.

Another thing that many people curiously don't seem to realise is that professional physicists are not only very smart people, they also often have to study for something like 10 years before they even get a PhD, not to mention spending a lifetime studying these issues after that. So if you think, based on reasoning from Discovery Channel analogies, that you have found a very simple solution to a problem that all professional physicists acknowledge is a serious one, it is overwhelmingly more likely that your "solution" and your analogies are wrong than that all physicists are wrong.

I haven't the time to write a whole lot about this now, but here are some things you should consider. "Energy conservation" in general relativity is a subtle concept; energy is not "conserved" according to the definition of conservation you seem to be using. In particular, the vacuum energy density most definitely does not vary with time: it remains constant even as the universe expands ("expansion creates more vacuum" is one way to think about it). The expanding balloon is simply an awful analogy, which gets almost everything about cosmology wrong (see comments here). The statement that pressure is energy divided by volume is also wrong: if anything, the pressure is $p = -dE/dV$, which is why a vacuum energy density that remains constant does correspond to a negative pressure. And thinking of negative pressure as "tensile strength" or "tension" is completely counter-productive and wrong (if negative pressure were tension, why would negative pressure cause accelerated expansion?).
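The sign argument in the reply above can be made explicit with a short derivation (standard textbook reasoning, not part of the original comment): treat the vacuum as a fluid of constant energy density $\rho_\Lambda$ filling a volume $V$, and apply the first law of thermodynamics.

```latex
% Vacuum energy contained in a volume V, with constant energy density:
E = \rho_\Lambda V
% First law with no heat exchange, dE = -p\,dV, gives the pressure:
p = -\frac{dE}{dV} = -\rho_\Lambda
% So a constant positive \rho_\Lambda implies a negative pressure,
% i.e. an equation of state w = p/\rho_\Lambda = -1.
```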

And finally, please stop with the postings about MOND. If what you meant was that the observed acceleration can be explained by modifying gravity at very large scales, there are better-motivated theories that try to do this. But these are theories trying to generate a mechanism for getting some acceleration rather than no acceleration. They don't explain the cosmological constant problem, which is the question of why the acceleration is not huge.

For further reading about the conservation of energy, go to Sean Carroll. His post also nicely conveys the sense of frustration cosmologists feel with people who think they've discovered the "simple mistake" we've all been making for so many years.

Sean's a nice guy, but he's a cosmologist, and a "senior research associate". He's no expert on relativity. Energy IS conserved in general relativity. Direct a 511 keV photon into a black hole, and the black hole mass increases by 511 keV/c². Lift a brick and you do work on it; its mass increases. Drop it, and some of that mass-energy, that which we call potential energy, is converted into kinetic energy. Discard that kinetic energy, and the mass of the brick is reduced: ergo the mass defect. The same applies to you and your measuring equipment, hence a downward photon appears to have gained energy, even though it hasn't. You know this, because when you direct a 511 keV photon into a black hole, the black hole mass increases by 511 keV/c². See Phil Gibbs' essay at http://vixra.org/abs/1305.0034.

There are other issues like this where people who promote themselves as the experts get things hopelessly wrong, and end up talking about the multiverse, or the evil-twin universe, or some other speculative nonsense that has absolutely no evidential support. And some have a bad habit of grandstanding with "problems", such that when somebody points out a problem with their problem, or a potential solution, they react badly. Sometimes they say "that can't be right because we didn't think of it". Look again at what I said, Sesh. Don't forget the bag model. Ask yourself where the strong force goes in low-energy proton-antiproton annihilation to gamma photons. And note that I'm no advocate of MOND; Milgrom was referring to f(R) gravity, that's all.

If you read Phil Gibbs' essay carefully you will see that his argument reinforces Sean's point and mine. In order to show that energy is "conserved", he has had to introduce an extra term in his definition of "energy" to account for "the energy of the gravitational field". Now Sean's argument is that such a modification of the definition is not useful, but we don't even need to get into that discussion. The definition of energy you are using to claim that energy conservation means the vacuum energy density has to vary with time is not this modified definition. So either way you are wrong.

Your attitude really annoys me, actually. What the hell are your qualifications, since you are so happy to cast ignorant aspersions on everyone else's? Your patronizing tone is a bit rich given your own obvious lack of knowledge of physics and even general reading comprehension.

I'm happy to host genuine questions here from people who don't understand something and would like to learn more about it - if it is beyond my expertise I will say so - but comments combining ignorance and arrogance will be simply deleted.

And incidentally, I was also referring to f(R) gravity models in my comment above: these theories do not solve the cosmological constant problem.

Please read with a degree of levity :)

"Many very smart people have tried many ingenious ways of solving it, but it turns out to be a very hard problem indeed"

I see.

Well... let me take a crack at it...

Nope... I got nothing.

Oh wait. The long-standing company line in the past was: "The world only *appears* designed; we're sure the parameters are wide enough to get to the complexity we see with our eyes."

OK... and the results are in... and... ta-da:

The math is infinitely worse than what we can see.

In fact, it's so mind-blowing that it makes the former claims of the "appearance of design" look childish. Our universe is impossible.

If only there were more universes... but wait, there's a problem with that: that's more than there are.

Maybe we're in a computer simulation? Maybe if we get some adhesive and patch all the loopholes, we can glue it all together and prop it up with string like a talking puppet. It would be embarrassing, but aren't we already there?

When is somebody gonna have the courage to just state the obvious? A magical everything-maker machine that just happens to pop out exactly what you need doesn't solve the problem... it defines the real problem.

Has it ever occurred to you that there would be another you who witnesses the same data, and the only difference in that world is that you (he) completely disagrees with you? There would be an infinite number of them, robbing your conclusion of any claim to reason. What I'm saying is, I hear people treating the issue as if it can be dismissed by a loophole using probability, but it introduces a reality where probability becomes meaningless. I think if people would spend a few days running through all the absurdities and contradictions that would exist if everything existed, they'd see that not only is the price too high, the cure is worse than the disease.