Quantum biology

Quantum biology is the study of whether and how quantum effects play a role in biological organisms, which have traditionally been considered too warm, large and messy for such effects to matter. Recent research has, however, started to uncover that several biological systems do exhibit interesting quantum effects.

This TED talk by Jim Al-Khalili probably won't surprise you if you have been following the field as I have, but it is a superb recap and introduction to the topic.

"How does a robin know to fly south? The answer might be weirder than you think: Quantum physics may be involved. Jim Al-Khalili rounds up the extremely new, extremely strange world of quantum biology, where something Einstein once called “spooky action at a distance” helps birds navigate, and quantum effects might explain the origin of life itself." via TED Talks

"Could a quantum computer have subjective experience?" by Scott Aaronson

People who are into physics and follow blogs actively have surely run into MIT physicist Scott Aaronson, probably best known for his critiques of the alleged D-Wave quantum computer. More recently, Scott has been writing a lot about consciousness, and his latest post – prepared talk notes from the Quantum Foundations of a Classical Universe meeting – is a real doozy. It's a long read but well worth the trouble.


Physics Nobel to Englert and Higgs

As widely expected, the 2013 Nobel Prize in Physics was awarded for the Higgs boson. The committee chose to award only the theoretical prediction, omitting the experimental teams at CERN, to the annoyance of some. Nobel tradition notwithstanding, apparently there were no strict rules preventing the inclusion of CERN as an organization in the physics prize, which would indeed better reflect how fundamental science is done these days. Admittedly, the Nobel committee gave a very visible nod to the experimentalists in awarding the prize

for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN’s Large Hadron Collider.

As always, Nature News and Ars Technica have good stories covering the prize, and there's also extensive reporting by the BBC. However, even awarding theoreticians for this discovery was tricky; as the Nobel committee puts it, the award was for the "Brout-Englert-Higgs (BEH)-mechanism", with only François Englert and Peter Higgs (both in their 80s) sharing the prize since Robert Brout died in 2011. Some have argued that very important earlier contributions from Anderson should have been recognized, as well as independent but slightly later work by Kibble, Guralnik and Hagen. As Nature News puts it:

In 1964, six physicists independently worked out how a field would resolve the problem. Robert Brout (who died in 2011) and Englert were the first to publish, in August 1964, followed three weeks later by Higgs — the only author, at the time, to allude to the heavy boson that the theory implied. Tom Kibble, Gerald Guralnik and Carl Hagen followed. “Almost nobody paid any attention,” says Ellis — mostly because physicists were unsure how to make calculations using such theories. It was only after 1971, when Gerard ’t Hooft sorted out the mathematics, that citations started shooting up and the quest for the Higgs began in earnest.

So numerous were the theorists involved, that Higgs reputedly referred to the ABEGHHK’tH (Anderson–Brout–Englert–Guralnik–Hagen–Higgs–Kibble–’t Hooft) mechanism.

...but I guess the ABEGHHK'tH-mechanism just doesn't roll off the tongue quite like "Higgs" or "BEH" :)

For additional context on the theoretical developments, see this post on the LHC's Quantum Diaries blog. However, if you have any background in quantum theory, I would most warmly recommend putting in the effort to read the Scientific Background from the Nobel committee, which describes the preceding and parallel developments, and the later significance and discovery, in great detail and as clearly as can reasonably be expected, giving full credit where it is due. The committee even visibly acknowledged the role of the US in the discovery, which some felt needed to be explicitly defended. Of course, the Popular information document is more accessible, and very well written.
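For readers who want the one-equation version, here is the standard textbook sketch of the mechanism for a single complex scalar field (a simplification, not the full electroweak treatment in the Nobel documents):

```latex
% Scalar potential with a "wrong-sign" mass term:
V(\phi) = \mu^2\,\phi^\dagger\phi + \lambda\,(\phi^\dagger\phi)^2,
\qquad \mu^2 < 0,\; \lambda > 0.
% The minimum then sits not at \phi = 0 but at
|\phi| = \frac{v}{\sqrt{2}}, \qquad v = \sqrt{-\mu^2/\lambda},
% and expanding around this vacuum expectation value leaves one
% massive excitation -- the Higgs boson -- with
m_H = \sqrt{-2\mu^2} = \sqrt{2\lambda}\,v .
```

The point of the mechanism is that the gauge bosons coupled to the field acquire masses proportional to v, while the symmetry of the underlying equations remains intact; only the vacuum breaks it.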

In the end, despite the potential for controversy, the decision seems to be reasonably well received, with Hagen showing the most emotion:

"Regarding the committee’s choice, “I think in all honesty, this is what I would have done,” says John Ellis, a theoretical physicist at CERN, Europe’s particle-physics lab near Geneva, Switzerland." (Nature News)

"The whole of CERN was elated today to learn that the Nobel Prize for Physics had been awarded this year to Professors François Englert and Peter Higgs for their theoretical work on what is now known as the Brout-Englert-Higgs mechanism." (Pauline Gagnon via Quantum Diaries)

"The discovery of the Higgs boson at Cern... marks the culmination of decades of intellectual effort by many people around the world." (Rolf Hauer via the BBC)

"My two collaborators, Gerald Guralnik and Carl Richard Hagen, and I contributed to that discovery, but our paper was unquestionably the last of the three to be published in Physical Review Letters in 1964 (though we naturally regard our treatment as the most thorough and complete) and it is therefore no surprise that the Swedish Academy felt unable to include us, constrained as they are by a self-imposed rule that the prize cannot be shared by more than three people. My sincere congratulations go to the two prize winners, Francois Englert and Peter Higgs." (Tom Kibble via BBC News)

“Faced with a choice between their rulebook and an evenhanded judgment, the Swedes chose the rulebook,” Hagen said in a blunt e-mail shortly thereafter. “Not a graceful concession by any means, but that department has never been my strong suit.” (Carl Hagen via the Washington Post)

“It stings a little,” Guralnik said. But he added: “All in all, it’s a great day for science. I’m really proud to have been associated with this work that has turned out to be so important.” (Gerald Guralnik via the Washington Post)

To wrap up, Ken Bloom via LHC's Quantum Diaries blog offers a very practical perspective on science in the trenches:

I suppose that my grandchildren might ask me, “Where were you when the Nobel Prize for the Higgs boson was announced?” I was at CERN, where the boson was discovered, thus giving the observational support required for the prize. And was I in the atrium of Building 40, where CERN Director General Rolf Heuer and hundreds of physicists had gathered to watch the broadcast of the announcement? Well no; I was in a small, stuffy conference room with about twenty other people.

[...]

So in the end, today was just another day at the office — where we did the same things we’ve been doing for years to make this Nobel Prize possible, and are laying the groundwork for the next one.

Nature News: 'Proof mooted for quantum uncertainty'

Accessibly written yet exciting and deep reporting from Nature News once again:

He suspects that a strange quantum concept known as negative probability — negative dips in the probability distribution of a particle’s location or momentum — could be at the heart of the issue. These dips may mean that a measuring device disturbs the system less than the uncertainty principle seems to allow. “The fact these two different definitions give you a different answer is telling you something about the weirdness of quantum mechanics,” says Wiseman.

Related to one of my early posts on the blog:

One of the most fascinating and to my mind central questions of contemporary physics is the ontological status of quantum objects – does the quantum wavefunction describe reality as it is, or merely our possible knowledge of it? A related question is: where is the limit between the quantum and the classical? Earlier assumptions about limiting quantum effects to extremely small systems (see also a TED talk on the topic), only non-biological systems, or extremely cold systems have all been vigorously pushed back by improvements in experimental techniques. [...] Update (12.9.2012): a new news article was just posted on Nature News on the limitations of Heisenberg’s original formulation of the uncertainty principle, which is quite relevant for this discussion. The bottom line is that the uncertainty is not a result of a perturbation of the object by the act of measurement, but rather an inherent property of quantum systems. Ars Technica again has an extremely lucid story on the study.

Quantum foundations poll

Nature News and the Quantum Frontiers blog reported on a recent poll Anton Zeilinger and colleagues conducted at a quantum foundations meeting in Vienna, and published on arXiv. The poll was designed to gauge the opinions of the participants (which included philosophers of science as well as active physicists) on the interpretation of quantum mechanics. Unsurprisingly, there was no consensus on the "correct" interpretation (from Nature News):

For example, votes were roughly evenly split between those who believe that, in some cases, “physical objects have their properties well defined prior to and independent of measurement” and those who believe that they never do. And despite the famous idea that observation of quantum systems plays a key role in determining their behaviour, 21% felt that “the observer should play no fundamental role whatsoever”.

If I had to toss in my 2 cents, I would choose the "physical objects have their properties well defined prior to and independent of measurement" and the "observer should play no fundamental role whatsoever" hats. Of course, measurement can and will affect the state of the system, but I gather the idea was to argue against the notion that the observer plays some special physical role. I'm not completely sure I agree that there should be "no fundamental limit to quantum theory" – I still find Penrose's objective reduction ideas attractive, though only experimental evidence will tell one way or the other.

Sean Carroll posted a graph of the results on his blog and called it the "most embarrassing graph in physics". He explained the reason for calling it this as follows:

Think about it — quantum mechanics has been around since the 1920′s at least, in a fairly settled form. John von Neumann laid out the mathematical structure in 1932. Subsequently, quantum mechanics has become the most important and best-tested part of modern physics. Without it, nothing makes sense. Every student who gets a degree in physics is supposed to learn QM above all else. There are a variety of experimental probes, all of which confirm the theory to spectacular precision.

And yet — we don’t understand it. Embarrassing. To all of us, as a field (not excepting myself).

I'd have to agree – it is embarrassing that there is so little consensus on the interpretation of quantum mechanics. The statements on which a majority did agree were (from Quantum Frontiers):

1. Quantum information is a breath of fresh air for quantum foundations (76%).
2. Superpositions of macroscopically distinct states are in principle possible (67%).
3. Randomness is a fundamental concept in nature (64%).
4. Einstein’s view of quantum theory is wrong (64%).
5. The message of the observed violations of Bell’s inequalities is that local realism is untenable (64%).
6. Personal philosophical prejudice plays a large role in the choice of interpretation (58%).
7. The observer plays a fundamental role in the application of the formalism but plays no distinguished physical role (55%).
8. Physical objects have their properties well defined prior to and independent of measurement in some cases (52%).
9. The message of the observed violations of Bell’s inequalities is that unperformed measurements have no results (52%).

I've always felt that the Copenhagen interpretation is an unsatisfying cop-out to cover the fact that we (still!) don't understand the theory well enough at a fundamental level. It's nice to see the amount of recent activity on the topic, though I find the thought that "there will still be conferences on the foundations of quantum theory in 50 years time" a bit depressing.

Sean agrees on Copenhagen and ends on a bit more optimistic note:

All we have to do is wrap our brains around the issue, and yet we’ve failed to do so.

I’m optimistic that we will, however. And I suspect it will take a lot fewer than another eighty years. The advance of experimental techniques that push the quantum/classical boundary is forcing people to take these issues more seriously. I’d like to believe that in the 21st century we’ll finally develop a convincing and believable understanding of the greatest triumph of 20th-century physics.

Is there any particle physics beyond the standard model?

The answer seems to be: NO.

Starts with a Bang reports on the upcoming LHCb publication on the muon–antimuon decay channel of a B meson consisting of a strange quark and a bottom antiquark. The observed decay rate exactly matches the Standard Model prediction, casting very serious doubts on the existence of supersymmetric particles below 1 TeV. This in practice constrains supersymmetry out of the physically relevant energy ranges, and string theory along with it.

See also Peter Woit's pithy take on the results, Ars Technica's ever-lucid reporting, and the story in Scientific American via Nature News. Matt Strassler offers meta-commentary that is worth a read as well.

Edit: additional reporting from Nature News, and a round-up of SUSY proponents' recent talks from Peter Woit.

Simulation: Quantum leaps

Nature News reporting on the rise of quantum simulators:

Quantum simulators are the lesser sibling of an idea in physics known as quantum computers, which have been touted for more than three decades as a way to do everything from complex modelling to code-breaking. What the simulators and computers share is an ability to operate by the rules of quantum mechanics. Where they differ is in computational power: quantum computers are general-purpose machines able to carry out any possible algorithm, whereas quantum simulators have to be tailored specifically for the problem at hand.
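As a toy illustration of the kind of problem analog quantum simulators target (for instance, ground states of interacting spin models), here is a classical brute-force diagonalization of a tiny transverse-field Ising model. The model choice and parameter values are purely illustrative; exact diagonalization like this scales exponentially with the number of spins, which is precisely why quantum simulators are interesting for larger lattices.

```python
import numpy as np

# Single-spin Pauli matrices
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def tfim_hamiltonian(J=1.0, h=1.0):
    """Two-spin transverse-field Ising model: H = -J Z1 Z2 - h (X1 + X2)."""
    return (-J * np.kron(Z, Z)
            - h * (np.kron(X, I2) + np.kron(I2, X)))

H = tfim_hamiltonian()
energies = np.linalg.eigvalsh(H)  # eigenvalues in ascending order
print(f"ground-state energy: {energies[0]:.6f}")
# For J = h = 1 the ground-state energy is -sqrt(5) ≈ -2.236068
```

A dedicated quantum simulator would prepare the corresponding spin system physically and cool it into its ground state, rather than storing the 2^N-dimensional Hamiltonian in memory as done here.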

I haven't heard of any applications in nanoscience so far, but who knows how things will develop.

On the foundations of quantum mechanics

One of the most fascinating and to my mind central questions of contemporary physics is the ontological status of quantum objects – does the quantum wavefunction describe reality as it is, or merely our possible knowledge of it? A related question is: where is the limit between the quantum and the classical? Earlier assumptions about limiting quantum effects to extremely small systems (see also a TED talk on the topic), only non-biological systems, or extremely cold systems have all been vigorously pushed back by improvements in experimental techniques. As a primer on the subject, the Brooklyn Institute for Social Research published an interesting discussion with philosopher David Albert on the foundations of quantum physics. Albert is – somewhat controversially – extremely ambitious about the ultimate scope of physical theory, which in his view should incorporate our consciousness and psychology as well. However, he interestingly does NOT consider the question "why is there something rather than nothing", which was touched on in the previous post, to be a valid scientific question. In any case, it's a fascinating and accessible talk, and the first part about the foundations of quantum mechanics and the measurement problem is particularly good.

Update (12.9.2012): a new news article was just posted on Nature News on the limitations of Heisenberg's original formulation of the uncertainty principle, which is quite relevant for this discussion. The bottom line is that the uncertainty is not a result of a perturbation of the object by the act of measurement, but rather an inherent property of quantum systems. Ars Technica again has an extremely lucid story on the study.
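To make the "inherent property" point concrete, the modern (Robertson) form of the uncertainty relation follows from the quantum state and the operators alone, with no reference to any measuring apparatus (this is a standard textbook statement, not something specific to the new study):

```latex
\sigma_A\,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle[\hat{A},\hat{B}]\rangle\bigr|,
\qquad\text{so with } [\hat{x},\hat{p}] = i\hbar:\qquad
\sigma_x\,\sigma_p \;\ge\; \frac{\hbar}{2}.
```

It is Heisenberg's original measurement-disturbance reading – the error of a measurement times the disturbance it causes – that the new work shows needs modification; the state-based inequality above is untouched.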

This brings us to the actual reason for this post, a research article which some say might be one of the most important results in quantum foundations in decades. The work itself is rather technical, but to sum it up, it offers solid evidence for the reality of the wavefunction, provided some rather modest assumptions hold. Nature News covered the publication while it was still on arXiv (that is, not yet peer reviewed and published) and later in a follow-up after the publication. Both news articles are well worth a read (pay attention to the evolution of the titles).

What prompted me to write about this now was the extremely interesting back story to the article and the news items, explained on the excellent Cosmic Variance blog in a guest post by one of the authors, Terry Rudolph. It gives a rare peek behind the curtain of the scientific method, illustrating (perhaps in a slightly negative light) the role of the editorial and peer review processes in what gets published and where. This stuff is extremely interesting to scientists, but perhaps it will be illuminating for the lay reader as well.

I'll likely have more to say on the topic from personal experience in the (hopefully) near future.