## Tuesday, April 9, 2013

### Celebrating Tom Lehrer

This is not a post about physics, but one to mark the birthday today of mathematician, teacher, satirist, lyricist and performer Tom Lehrer. Today he turns 85 – or, since he apparently prefers to measure his age in Centigrade, 29 (I must remember to use that one myself sometime!).
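For anyone puzzled by the joke: treat the age of 85 as a temperature in Fahrenheit and convert it to Celsius. A throwaway check (my own illustration, obviously not Lehrer's):

```python
# Lehrer's joke: an "age" of 85 Fahrenheit, measured in Centigrade instead.
def fahrenheit_to_celsius(f):
    return (f - 32) * 5.0 / 9.0

age_c = fahrenheit_to_celsius(85)  # about 29.4, which rounds to 29
```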

To commemorate the occasion, the BBC ran a half-hour long radio feature on his life and work last Saturday. This is available to listen to here for another four days; do try to catch it before then!

Even readers who have not heard of Lehrer might have heard of some of his better-known songs, such as The Elements Song. Other pieces of simple comedy gold include Lobachevsky, or New Math. But for me the best of Lehrer's songs are the ones with darkly satirical lyrics juxtaposed with curiously uplifting melodies. (These were probably also part of the reason that he never achieved the mainstream popularity he deserved.) So I want to feature one such example here:

Kim Jong-un, I hope you are listening.

## Sunday, April 7, 2013

### Unnecessary spin

A few people have asked me why I have not blogged about the recent announcement about, and publication of, results from the Alpha Magnetic Spectrometer, which were widely touted as a possible breakthrough in the search for dark matter.

The reason I have not is simply that there are many other better informed commenters who have already done so. In case you have not yet read these accounts, you could do worse than going to Résonaances, or Ethan Siegel, or Stuart Clark in the Guardian, who provide commentary at different levels of technical detail. The simple short summary would be: AMS has not provided evidence about the nature of dark matter, nor is it likely to do so in the near future. The dramatic claims to the contrary are spin, pure and simple. Siegel in fact goes so far as to say "calling it misleading is generous, because I personally believe it is deceitful" (emphasis his own).

However, since this incident brought it up again, I do want to comment on a related piece of annoying spin, which is the habit of physicists in the business of communicating science to the public of making vastly exaggerated claims about the possible practical applications of fundamental physics. The example that caught my attention this week occurred when Maggie Aderin-Pocock – who is apparently a space science research fellow at UCL – appeared on the BBC's Today programme on Thursday to discuss the significance of the AMS findings.

At one point in the discussion the interviewer John Humphrys asked a slightly tricky question: I understand that dark matter and dark energy are endlessly fascinating, he said, and that learning about the composition of the universe is very exciting. But what practical benefits might it bring? The answer Aderin-Pocock gave was that if we understood what dark matter and dark energy were, we might be able to use them to supply ourselves with energy – dark matter as a fuel source.

I'm sorry, but that is just rubbish.

Unfortunately, it's the kind of rubbish that is increasingly commonly voiced by scientists. You may argue that Aderin-Pocock was simply commenting on something she didn't understand – and if you listen to the whole interview (available here for a few days; skip to the segment between 1h 23m and 1h 26m), including the cringe-worthy suggestion that dark matter and dark energy are really the same thing (because $E=mc^2$, apparently), it's hard to avoid that conclusion. But a few weeks ago I thought I heard Andrew Pontzen and Tom Whyntie suggest something similar about the Higgs boson on BBC Radio 5 (unfortunately this episode is no longer on the iPlayer so I can't check the exact words they used). And here is Jon Butterworth seeming to suggest (in the midst of an otherwise reasonable piece) that the Higgs could be used to power interstellar travel ...

Why do people feel the need to do this? It's patently rubbish, and they know better. Do we as a scientific community feel that continued public support of science is so important that we should mislead or deceive the public in order to guarantee our future access to it? Do we feel that there is no convincing honest case to be made instead? Or are we just too lazy to make the honest case, and so rely on the catchy but inaccurate soundbite instead?

I think the sensible answer to the question John Humphrys posed would go something like this. Discovering the nature of dark matter is a fascinating and exciting adventure. Knowing the answer will almost certainly have no practical applications whatsoever. However, on the journey to the answer we will have to develop new technologies and equipment (made of ordinary matter!) which may serendipitously turn out to have spin-off applications that we cannot yet foresee. More importantly, the very fact that the search is fascinating is part of what draws talented and creative young minds to physics – indeed to science – in the first place, from where they go on to enrich our society in a myriad different ways, none of which may later be connected to dark matter at all. I tried to make this case at greater length here in the early days of this blog.

It's a more subtle argument than just throwing empty phrases about "energy source" around, and it might be hard to reduce to a soundbite. But it is justifiable, and also honest. And since science is after all about careful argumentation, let's have less spin all round please.

## Wednesday, March 27, 2013

### Explaining Planck by analogy

Explaining physics to the public is hard. Most physicists do a lousy job of conveying a summary of what their research really means and why it is important, without the use of jargon and in terms that can be readily understood. So it is not particularly surprising that occasionally non-experts trying to translate these statements for the benefit of other non-experts come up with misleading headlines such as this, or this.

Just to be clear: Planck has not mapped the universe as it was in the first tiny fraction of a second. (To be fair, most other reports correctly make this distinction, though they differ widely on when inflation is supposed to have occurred.) I think this is an important thing to get right, and I'm going to try to explain why, and what the CMB actually is.

However, I'm going to try to do so with the help of an analogy. This analogy is not my original invention – I heard Simon White use it during the Planck science briefing – but I think it is brilliant, simple to understand and not vastly misleading. So, despite the health warning about analogies above, I'm going to run with it and see how far we get.

## Thursday, March 21, 2013

### What Planck has seen

Update at 16:30 CET: I've now had a chance to listen to the main science briefing, and also to glance at some of the scientific papers released today, albeit very briefly. So here are a few more thoughts, though in actual fact it will take quite some time for cosmologists to fully assess the Planck results.

The first thing to say – and it's something easy to forget to say – is just what a wonderful achievement it is to send a satellite carrying two such precise instruments up into space, station it at L2, cool the instruments to a tenth of a degree above absolute zero with fluctuations of less than one part in a million about that, spin the satellite once per minute, scan the whole sky in 9 different frequency bands, subtract all the messy foreground radiation from our own galaxy and even our solar system, all to obtain this perfect image of the universe as it was nearly 14 billion years ago:

 The CMB sky according to Planck.

So congratulations and thanks to the Planck team!

I said all that first because I don't want to sound churlish when I now say that overall the results are a little disappointing for cosmologists. This is because, as I noted earlier in the day, there isn't much by way of exciting new results to challenge our current model of the universe. And of course physicists are more excited by evidence that what they have hitherto believed was wrong than by evidence that it continues to appear to be right.

There are however still some results that will be of interest, and where I think you can expect to see a fair amount of debate and new research in the near future.

Firstly, as I pointed out earlier, Planck sees the same large scale anomalies as WMAP, thus confirming that they are real rather than artifacts of some systematic error or foreground contamination (I believe Planck even account for possible contamination from our own solar system, which WMAP didn't do). These anomalies include not enough power on large angular scales ($\ell\leq30$), an asymmetry between the power in two hemispheres, a colder-than-expected large cold spot, and so on.

The problem with these anomalies is that they lie in the grey zone between being not particularly unusual and being definitely something to worry about. Roughly speaking, they're unlikely at around a 1% level. This means that how seriously you take them depends a lot on your personal prejudices – sorry, priors. One school of thought – let's call it the "North American school" – tends to downplay the importance of anomalies and question the robustness of the statistical methods by which they were analysed. The other – shall we say "European" – school tends instead to play them up a bit: to highlight the differences with theory and to stress the importance of further investigation. Neither approach is wrong, because as I said this is a grey area. But the Planck team, for what it's worth, seem to be in the "European" camp.
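To put that "1% level" in more familiar units: for a Gaussian, a two-tailed probability of 0.01 corresponds to roughly 2.6 standard deviations, comfortably short of the 5σ discovery convention used in particle physics. A quick sketch of the conversion (my own illustration, not a calculation from the Planck papers):

```python
from statistics import NormalDist

def p_to_sigma(p, two_tailed=True):
    """Convert a p-value to the equivalent number of Gaussian sigma."""
    if two_tailed:
        return NormalDist().inv_cdf(1.0 - p / 2.0)
    return NormalDist().inv_cdf(1.0 - p)

sigma_anomaly = p_to_sigma(0.01)       # ~2.6 sigma: the "grey zone"
sigma_discovery = p_to_sigma(5.7e-7)   # ~5 sigma: particle physics "discovery"
```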

The second surprise is the change in the best-fit values for the parameters of the simplest $\Lambda$CDM model. In particular the Hubble parameter is lower than WMAP's, which was already getting a bit low compared to distance-ladder measurements from supernovae. This will be a cause for concern for the people working on distance-ladder measurements, and potentially something interesting for inventive theorists.

And finally, something close to my own heart. A few days ago I wrote a post about the discrepancy in the integrated Sachs-Wolfe signal seen from very rare structures, and pointed out that this effect had now been confirmed in two independent measurements. Almost immediately I had to change that statement, because one of those independent measurements had been partially retracted.

Well, the Planck team have been on the case (here, paper XIX), and have now filled that gap with a second independent measurement (as well as re-confirming the first one). The effect is definitely there to be seen, and it is still discrepant with $\Lambda$CDM theory (though I'll need to read the paper in more detail before quantifying that statement).

So there's a ray of hope for something exciting.

11:30 am CET: Well, ESA's first press conference to announce the cosmological results from Planck has just concluded. The full scientific papers will be released in about an hour, and there will be a proper technical briefing on the results in the afternoon (this first announcement was aimed primarily at the media). However, here is a very quick summary of what I gathered is going to be presented:
• The standard Lambda Cold Dark Matter Model continues to be a good fit to CMB data
• However, the best fit parameters have changed: in particular, Planck indicates slightly more dark matter and ordinary (baryonic) matter than WMAP did, and slightly less dark energy. (This is possibly not a very fair comparison – my hunch is that the Planck values are obtained from Planck data alone, whereas the "WMAP values" that were quoted were actually the best fit to WMAP plus additional (non-CMB) datasets.)
• The value of the Hubble parameter has decreased a bit, to around 67 km/s/Mpc. Given the error bars this is actually getting a bit far away from the value measured from supernovae, which is around 74 km/s/Mpc. I think the quoted error bars on the measurement from supernovae are underestimated.
• The Planck value of the spectral tilt is a bit smaller than, but consistent with, what WMAP found.
• There is no evidence for extra neutrino-like species.
• There is no evidence for non-zero neutrino masses.
• There is no evidence for non-Gaussianity.
• There is no evidence for deviations from a simple power-law form of the primordial power spectrum.
• No polarisation data, and therefore no evidence of gravitational waves or their absence, for around another year.
• There is evidence for anomalies in the large-scale power, consistent with what was seen in WMAP. We'll have to wait and see how statistically significant this is – the general response to the anomalies WMAP saw could be summarised as "interesting, but inconclusive"; I don't think Planck is going to do a lot better than this (and the bigging-up of it in the press conference might have had more to do with the lack of other truly exciting discoveries), but I'd love to be surprised!
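The Hubble parameter discrepancy in the list above can be quantified with a naive two-measurement comparison. The central values are the ones quoted, but the error bars below (±1.2 km/s/Mpc for Planck, ±2.4 for the supernova distance ladder) are my own illustrative guesses, not numbers from the briefing:

```python
import math

def tension_sigma(x1, err1, x2, err2):
    """Naive significance of the difference between two independent
    Gaussian measurements, in units of the combined error."""
    return abs(x1 - x2) / math.sqrt(err1**2 + err2**2)

# H0 in km/s/Mpc; errors assumed purely for illustration
h0_tension = tension_sigma(67.0, 1.2, 74.0, 2.4)  # roughly 2.6 sigma
```

Of course, if the supernova error bars really are underestimated, as I suspect, this number shrinks accordingly.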
That's about all I got out of the media briefing. Obviously we are all waiting for more details this afternoon!

## Wednesday, March 20, 2013

### The Planck guessing game

At 10 am CET on Thursday morning, the Planck mission will hold a press conference and announce the first cosmology results based on data from their satellite, which has now been in orbit for nearly 1406 days, according to the little clock on their website. (I think the conference information will be available live here, though the website's not as clear as it could be.)

Planck is an incredible instrument, which has been measuring the pattern of cosmic microwave background (CMB) temperature anisotropies with great precision. And the CMB itself is an incredible treasure trove of information about the history of the universe, telling us not only about how it began, but what it consists of, and what might happen to it in the future. When the COBE and WMAP satellites first published detailed data from measurements of the CMB, the result was basically a revolution in cosmology and our understanding of the universe we live in. Planck will provide a great improvement in sensitivity over WMAP, which in turn was a great improvement on everything that came before it.

Another feature of the Planck mission has been the great secrecy with which they have guarded their results. The members of the mission themselves have known most of their results for some time now. Apparently on the morning of March 21st they will release a cache of something like 20 to 30 scientific papers detailing their findings, but so far nobody outside the Planck team itself has much of an idea what will be in them.

So let's have a little guessing game. What do you think they will announce? Dramatic new results, or a mere confirmation of WMAP results and nothing else? I'll list below some of the things they might announce and how likely I think they are (I have no inside information about what they actually have seen). Add your own suggestions via the comments box!

Tensors: Planck is much more sensitive to a primordial tensor perturbation spectrum than the best current limits. If they did see a non-zero tensor-to-scalar ratio, indicative of primordial gravitational waves, this would be pretty big news, because it is a clear smoking gun signal for the theory of inflation. Of course there are other bits of evidence that make us think that inflation probably did happen, but this really would nail it.

Unfortunately, I think it is unlikely that they will see any tensor signal – not least because many (and some would argue the most natural) inflation models predict it should be too small for Planck's sensitivity.

Number of relativistic species: CMB measurements can place constraints on the number of relativistic species in the early universe, usually parameterised as the effective number of neutrino species. I wrote about this a bit here. The current best fit value is $N_{\rm eff}=3.28\pm0.40$ according to an analysis of the latest WMAP, ACT and SPT data combined with measurements of baryon acoustic oscillations and the Hubble parameter (though some other people find a slightly larger number).

I would be very surprised indeed if Planck did not confirm the basic compatibility of the data with the Standard Model value $N_{\rm eff}=3.04$. It will help to resolve the slight differences between the ACT and SPT results and the error bars will probably shrink, but I wouldn't bet on any dramatic results.
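For concreteness, the pull of that best-fit value away from the Standard Model expectation is small. A back-of-envelope check using only the numbers quoted above:

```python
# How far is N_eff = 3.28 +/- 0.40 from the Standard Model value 3.04?
n_eff, err = 3.28, 0.40
n_sm = 3.04
pull = (n_eff - n_sm) / err   # 0.6 sigma: entirely unremarkable
```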

Non-Gaussianity: One thing that all theorists would love to hear is that Planck has found strong evidence for non-zero non-Gaussianity of the primordial perturbations. At a stroke this would rule out a large class of models of inflation (and there are far too many models of inflation to choose between), meaning we would have to somehow incorporate non-minimal kinetic terms, multiple scalar fields or complicated violations of slow-roll dynamics during inflation. Not that there is a shortage of these sorts of models either …

Current WMAP and large-scale structure data sort of weakly favour a positive value of the non-Gaussianity parameter $f_{\rm NL}^{\rm local}$ that is larger than the sensitivity claimed for Planck before its launch. So if it lives up to that sensitivity billing we might be in luck. On the other hand, my guess (based on not very much) is it's more likely that they will report a detection of the orthogonal form, $f_{\rm NL}^{\rm ortho}$, which is more difficult – but not impossible – to explain from inflationary models. Let's see.

Neutrino mass: The CMB power spectrum is sensitive to the total mass of all neutrino species, $\Sigma m_\nu$, through a number of different effects. Massive neutrinos form (hot) dark matter, contributing to the total mass density of the universe and affecting the distance scale to the last-scattering surface. They also increase the sound horizon distance at decoupling and increase the early ISW effect by altering the epoch of matter-radiation equality.

WMAP claim a current upper bound of $$\Sigma m_\nu<0.44\;{\rm eV}$$ at 95% confidence from the CMB and baryon acoustic oscillations and the Hubble parameter value. But a more recent SPT analysis suggests that WMAP and SPT data alone give weak indications of a non-zero value, so it is possible that Planck could place a lower bound on $\Sigma m_\nu$. This would be cool from an observational point of view, but it's not really "new" physics, since we know that neutrinos have mass.

Running of the spectral index: Purely based on extrapolating from WMAP results, I expect Planck will find some evidence for non-zero running of the spectral index. But given the difficulty in explaining such a value in most inflationary models, I also expect the community will continue to ignore this, especially since the vanilla model with no running will probably still provide an acceptable fit to the data.

Anything else? Speculate away … we'll find out on Thursday!

## Tuesday, March 19, 2013

### A real puzzle in cosmology: part II

(This post continues the discussion of the very puzzling observation of the integrated Sachs-Wolfe effect, the first part of which is here. Part II is a bit more detailed: many of these questions are real ones that have been put to me in seminars and other discussions.)

Update: I've been informed that one of the papers mentioned in this discussion has just been withdrawn by the authors pending some re-investigation of the results. I'll leave the original text of the post up here for comparison, but add some new material in the relevant sections.

Last time you told me about what you said was an unusual observation of the ISW effect made in 2008.

Yes. Granett, Neyrinck and Szapudi looked at the average CMB temperature anisotropies along directions on the sky where they had previously identified large structures in the galaxy distribution. For 100 structures (50 "supervoids" and 50 "superclusters"), after applying a specific aperture photometry method, they found an average temperature shift of almost 10 micro Kelvin, which was more than 4 standard deviations away from zero.

Then you claimed that this observed value was five times too large – that if our understanding of the universe were correct, they should not have seen a value definitely bigger than zero. Theory and observation grossly disagree.

Right again. Our theoretical calculation showed the signal should have been at most around 2 micro Kelvin, which is pretty much the same size as the contamination from random noise.
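Putting those numbers together: a signal of roughly 10 micro Kelvin detected at more than 4 standard deviations implies a measurement error of about 2.5 micro Kelvin, so a predicted signal of ~2 micro Kelvin would indeed be indistinguishable from zero, while the gap between observation and theory is itself significant. A rough sketch using only the approximate numbers quoted above:

```python
# Observed stacked ISW signal vs the naive LambdaCDM expectation,
# using the approximate numbers from the discussion above.
observed = 10.0                # micro Kelvin
n_sigma_detection = 4.0        # "more than 4 standard deviations from zero"
noise = observed / n_sigma_detection   # implied error, ~2.5 micro K

predicted = 2.0                # micro Kelvin, upper end of the theory estimate
discrepancy = (observed - predicted) / noise   # ~3 sigma, roughly
```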

But you used a simple theoretical model for your calculations. I don't like your model. I think it is too simple. That's the answer to your problem – your calculation is wrong.

That could be true – though I don't think so. Why don't we go over your objections one by one?

## Thursday, March 7, 2013

### Higgs animations

The news recently from the LHC experiments hasn't been very exciting for my colleagues on the particle theory side of things (see for instance here for summaries and discussion). But via the clever chaps at ATLAS we do have a series of very nice gif animations showing how the evidence for the existence of the Higgs changed with time, as they collected more and more data.

This example shows the development of one plot, for the Higgs-to-gamma-gamma channel:

That's pretty cool. Also nice to see the gif format being put to better use than endless animations of cats doing silly things! (Though if you are a PhD student, you might find this use of gifs amusing ... )

Here's another one, this time for the decay channel to 4 leptons:

Note that in this case the scale on the y axis is also changing with time! There's a version of this animation with a fixed axis here, and one of the gamma-gamma channel with a floating axis here.

## Tuesday, March 5, 2013

### A real puzzle in cosmology: part I

In a previous post, I wrote about recent updates to the evidence from the cosmic microwave background for extra neutrino species. This was something that a lot of people in cosmology were prepared to get excited about, but I argued that reality turned out to be really rather boring. This is because the new data neither showed anything wrong with the current model of what the universe is made of, nor managed to rule out any competing models.

Today I'd like to write about something else, which currently is a really exciting puzzle. Measurements have been made of a particular cosmological effect, known as the integrated Sachs-Wolfe or ISW effect, and the data show a measured value that is five times larger than it should be if our understanding of gravitational physics, our model of the universe, and our analysis of the experimental method are correct. No one yet knows why this should be so. The point of this post is to try to explain what is going on, and to speculate on how we might hope to solve the puzzle. It has been written in a conversational format with the lay reader in mind, but there should be some useful information even for experts.

Before beginning, I should point out that this is what a lot of my own research is about at the moment. In fact, this was the topic of a seminar I gave at the University of Helsinki last week (and much of this post is taken from the seminar). My host in Helsinki, Shaun Hotchkiss, with whom I have written two papers on this "ISW mystery", has also put up several posts about it at The Trenches of Discovery blog over the last year (see here for parts I, II, III, IV, V, and VI). I will be more concise and limit myself to just two!

Obviously you could view this as a bit of an effort at self-publicity. But at a time when, both in particle physics and cosmology, many experiments are disappointingly failing to provide much guidance on new directions for theorists to follow, this is one of the few results that could do so. (Unlike a lot of the rubbish you might read in other popular science reports, it also has a pretty good chance of being true.) So I won't apologise for it!

What is the integrated Sachs-Wolfe effect?

The entire universe is filled with very cold photons. These photons weren't always very cold; on the contrary, they are leftovers from the time soon after the Big Bang when the universe was still very young and very small and very hot, so hot that all the protons and electrons (and a few helium nuclei) formed a single hot plasma, the photons and electrons bouncing off each other so often that they all had the same temperature. And as the universe was expanding, this plasma was also cooling, until suddenly it was cool enough for the electrons and protons to come together to form hydrogen atoms, without immediately getting swept apart again. And when this happened, the photons stopped bouncing off the electrons, and instead just continued travelling straight through space minding their own business, cooling as the universe continued to expand. (The neutrinos, which only interact weakly with other stuff, had stopped bouncing and started minding their own business some time before this.)

What I've just told you is a cartoon picture of the history of the early universe. These cooling photons streaming through space form the cosmic microwave background radiation, or CMB for short. They fill the universe, and they arrive at Earth from all directions – they even make up about 1% of the 'snow' you see on an (old-school) untuned TV set.

The most important property of the CMB photons is that, to a very great degree of accuracy, they are all at the same temperature, whichever direction they come from. This is how we know that the universe used to be very hot, and how we learned that it has been expanding since then. It is also why we think it is probably very uniform. The second important property of CMB photons is that they are not all at the same temperature – by looking carefully enough with an extremely sensitive instrument, we can see tiny anisotropies in temperature across the sky. These differences in temperature are the signs of the very small inhomogeneities in the early matter-radiation plasma which are responsible for all the structure we see around us in the night sky today. When the photons decoupled from the primordial plasma, they kept the traces of the tiny inhomogeneities as they streamed across the universe. The matter, on the other hand, was subject to gravity, which took the small initial lumpiness and over billions of years caused it to become bigger and lumpier, forming stars, galaxies, clusters of galaxies and vast clusters of clusters.

 The CMB sky as seen by the WMAP satellite. The colours represent deviations of the measured CMB temperature from the mean value – the CMB anisotropies (red is hot and blue is cold). This map uses the Mollweide projection to display a sphere in two dimensions. Image credit: NASA / WMAP Science team.

Yes I knew that, but what is the integrated Sachs-Wolfe effect?

## Thursday, February 28, 2013

### The nature of publications

A paper in the journal of Genome Biology and Evolution has been doing the rounds on the internet recently and was shown to me by a friend. It is titled "On the immortality of television sets: “function” in the human genome according to the evolution-free gospel of ENCODE", by Graur et al. The title is blunt enough, but the abstract is extraordinarily so. Let me quote the entire thing here:
A recent slew of ENCODE Consortium publications, specifically the article signed by all Consortium members, put forward the idea that more than 80% of the human genome is functional. This claim flies in the face of current estimates according to which the fraction of the genome that is evolutionarily conserved through purifying selection is under 10%. Thus, according to the ENCODE Consortium, a biological function can be maintained indefinitely without selection, which implies that at least 80 – 10 = 70% of the genome is perfectly invulnerable to deleterious mutations, either because no mutation can ever occur in these “functional” regions, or because no mutation in these regions can ever be deleterious. This absurd conclusion was reached through various means, chiefly (1) by employing the seldom used “causal role” definition of biological function and then applying it inconsistently to different biochemical properties, (2) by committing a logical fallacy known as “affirming the consequent,” (3) by failing to appreciate the crucial difference between “junk DNA” and “garbage DNA,” (4) by using analytical methods that yield biased errors and inflate estimates of functionality, (5) by favoring statistical sensitivity over specificity, and (6) by emphasizing statistical significance rather than the magnitude of the effect. Here, we detail the many logical and methodological transgressions involved in assigning functionality to almost every nucleotide in the human genome. The ENCODE results were predicted by one of its authors to necessitate the rewriting of textbooks. We agree, many textbooks dealing with marketing, mass-media hype, and public relations may well have to be rewritten.
Ouch.

The paper that Graur et al. implicitly deride as "marketing, mass-media hype and public relations" is one of a series of publications in Nature (link here for those interested) by the ENCODE consortium. I'm not going to claim any expertise in genetics, though the arguments put forward by Graur appear sensible and convincing.1 But I do think it is interesting that the ENCODE papers were published in Nature.

Nature is of course a very prestigious journal to publish in. In some fields, the presence or lack of a Nature article on a young researcher's CV can make or break their career chances. It is very selective in accepting articles: not only must contributions meet all the usual requirements of peer-review, they should also be judged to be in "the five most significant papers" published in that discipline that year. It has a very high Impact Factor rating, probably one of the highest of all science journals. In fact it is apparently one of the very few journals that does better on citation counts than the arXiv, which accepts everything.

But among some cosmologists, Nature has a reputation for often publishing claims that are over-exaggerated, describe dramatic results that turn out to be less dramatic in subsequent experiments, or are just plain wrong.2 One professor even once told me – and he was only half-joking – that he wouldn't believe a particular result because it had been published in Nature.

It is easy to see how such things can happen. The immense benefit of a high-profile Nature publication to a scientist's career leads to a pressure to find results that are dramatic enough to pass the "significance test" imposed by the journal, or to exaggerate the interpretation of results that are not quite dramatic enough. On the other hand, if a particular result does start to look interesting enough for Nature, the authors may be – perhaps unwittingly – less likely to subject it to the same level of close scrutiny they would otherwise give it. The journal is then more reliant on its referees to provide the scrutiny to weed out the hype from the substance, but even with the most efficient refereeing system in the world, given enough submitted papers full of earth-shattering results, some amount of rubbish will always slip through.

I was thinking along these lines after seeing Graur et al.'s paper, and I was reminded of a post by Sabine Hossenfelder at the Backreaction blog, which linked to this recent pre-print on the arXiv titled "Deep Impact: Unintended Consequences of Journal Rank". As Sabine discusses, the authors point to quite a few undesirable aspects of the ranking of journals according to "impact factor", and the consequent rush to try to publish in the top-ranked journals. The publication bias effect (and in some cases, the subsequent retractions that follow) appear to be influenced to a degree by the impact factor of the journal in which the study is published. Another thing that might be interesting (though probably hard to check) is the link between the likelihood of scientists holding a press conference or issuing a press release to announce a result, and the likelihood of that result being wrong. I'd guess the correlation is quite high!

Of course, the only real reason the impact factor of the journal in which your paper is published matters is that it can be used as a proxy for the quality of your work, for the benefit of people who can't be bothered, or are unable, to read the original work and judge it on its merits.

The other yardstick by which researchers are often judged is the number of citations their papers receive, which at least has the (relative) merit of being based on those papers alone, rather than other people's papers. Combining impact factor and citation count is even sillier – unless they are counted in opposition, so that a paper that is highly cited despite being in a low-impact journal gets more credit, and a moderately cited one in a high-impact journal gets less!
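Just to make the tongue-in-cheek "counted in opposition" idea concrete, here is a minimal sketch in Python. The function name and the example numbers are entirely my own invention, purely for illustration – nobody actually scores papers this way:

```python
def contrarian_score(citations, impact_factor):
    """Toy 'opposition' metric: dividing by the journal's impact factor
    rewards citations earned despite a low-profile venue."""
    return citations / impact_factor

# A highly cited paper in a low-impact journal...
low_profile = contrarian_score(citations=200, impact_factor=2.0)    # 100.0
# ...outranks a moderately cited paper in a high-impact one.
high_profile = contrarian_score(citations=300, impact_factor=40.0)  # 7.5
```

Under this (deliberately silly) scheme the low-profile paper wins comfortably, which is exactly the inversion of the usual prestige ordering.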

Anyway, bear these things in mind if you ever find yourself making a reflexive judgement about the quality of a paper you haven't read based on where it was published.

1The paper includes a quote which pretty well sums up the problem for ENCODE:
"The onion test is a simple reality check for anyone who thinks they can assign a function to every nucleotide in the human genome. Whatever your proposed functions are, ask yourself this question: Why does an onion need a genome that is about five times larger than ours?"
2Cosmologists (the theoretical ones, at any rate) actually hardly ever publish in Nature. Even observational cosmology is rarely included. So you might regard this as a bit of a case of sour grapes. I don't think that is the case, simply because it isn't really relevant to us. Not having a Nature publication is not a career-defining gap for a cosmologist: it's just normal.

## Tuesday, February 19, 2013

### Things to Read, 19th February

I have just arrived in Helsinki, where I am visiting current collaborators and future colleagues at the Helsinki Institute of Physics for a few days. I will give a talk next Wednesday, about which more later. In the meantime though, a quick selection of interesting things I have read recently:
• Did you know that about 6 million years ago, the Mediterranean Sea is believed to have essentially evaporated, leaving a dry seabed? This is called the Messinian Salinity Crisis, which I first learned about from this blog. There's also an animated video showing a hypothesised course of events leading to the drying up:

Very soon after, the Atlantic probably came flooding back in over the Strait of Gibraltar – an event known as the Zanclean Flood – and, according to some models, could have refilled the whole basin in a very short time. Spare a thought for the poor hippopotamuses that got stuck on the seabed ...
• A long feature in next month's issue of National Geographic Magazine is called The Drones Come Home, by John Horgan. Horgan has written a blog piece about this at Scientific American, which he has titled 'Why Drones Should Make You Afraid'. In the blog piece he has a bullet-point summary of the most disturbing facts about unmanned aircraft (military or otherwise) taken from the main piece. Some of these include:

- "The Air Force has produced [a video showing] possible applications of Micro Air Vehicles [...] swarming out of the belly of a plane and descending on a city, where [they] stalk and kill a suspect."
- "The Obama regime has quietly compiled legal arguments for assassinations of American citizens without a trial"
- "The enthusiasm of the U.S. for drones has triggered an international arms race. More than 50 other nations now possess drones, as well as non-governmental militant groups such as Hezbollah."

Scary stuff; worth reading the whole thing.
• I wrote some time ago about Niall Ferguson's argument about economics with Paul Krugman (this was in the context of a lot of nonsense Ferguson was coming up with at the time, both in his Reith lectures for the BBC, and in other publications). I just learned (via a post by Krugman, who also just learned) that Ferguson had already apparently admitted that he got it wrong, about a year ago. Krugman's response to that is here; I'd add that I notice this admission didn't seem to stop Ferguson continuing the same economic reasoning in his Reith lectures a few months later!
• A review of John Lanchester's new novel Capital, by Michael Lewis in the New York Review of Books. Almost always with the NYRB, I read reviews of books before I have read the actual book. In this case the result was to make me resolve to buy a copy.