Moving to Vienna for a postdoc (VP1)

(Image source: http://www.traveldiscounters.ca/Travel-Vacations/Vienna-Austria-Travel.php)

After finishing their PhD, budding scientists typically spend a period doing research abroad. Cue Wikipedia:

Postdoctoral research is scholarly research conducted by a person who has completed doctoral studies. It is intended to further deepen expertise in a specialist subject, including integrating a team, acquiring novel skills and methods. Postdoctoral research is often considered essential while advancing the scholarly mission of the host institution; it is expected to produce relevant publications.

A single such period spent in one place is colloquially called "a postdoc", and typically lasts one to two years. It is common to do several of these, often moving from place to place, accruing experience and publications while applying for tenure-track positions in one's native country (or anywhere else). In practice, having postdoctoral experience from abroad is a de facto prerequisite for tenure positions in Finland, and in many other countries as well.

Well, I actually finished my PhD back in 2011, and have been doing a postdoc at the same university ever since. Mine was an exceptional case: I was involved in managing a couple of research projects, and responsible for a large international conference being held this June. After that is done, I'll take a well-deserved summer holiday, and then move to Vienna, Austria, for a two-year postdoc. Luckily I have also gotten some research done, but I really look forward to being able to concentrate on it more.

I thought it would be interesting to share my experiences of the process. I've lived in Helsinki my whole life, and despite having traveled quite a bit, moving abroad is not an insignificant step. Especially to a country whose language I don't speak (I had Russian instead of the typical French or German in school)! On top of that come considerations typical for my career stage: where to get funding, how to choose where to go (both professionally and personally), what to study (what is interesting to you vs. what gets funded), and so on. Luckily I was already able to secure a postdoc grant for the entire period, which will be the topic of my next post.

I am planning to write a series of blog posts on all aspects of this transition, hopefully offering some useful advice to people pondering the same issues. I'll mark the posts with a running "(VP#)" numbering in the titles for easy reference. If there is a particular aspect you would like to read about, please let me know!


Science Europe's open access statement

Science Europe is an association of European research organisations whose objective is to strengthen collaboration among national research organisations throughout Europe. The Academy of Finland, the most important Finnish research funder (which currently only "recommends" open access), is a member. I recently heard personally from the Academy's director Heikki Mannila that the Academy will closely follow Science Europe's decisions in its future policy-making.

Thus the recently announced "Science Europe Position Statement: Principles on the Transition to Open Access to Research Publications (April 2013)" is especially important for Finnish researchers. To quote from the document:

The benefits of Open Access are clear; furthermore, the technology available allows for a decisive move towards making Open Access a reality. The ultimate goal is to move to a new and sustainable system of scholarly communication of Open Access that guarantees the highest quality of publications and maximises the impact of research results. Science Europe Member Organisations acknowledge that the transition towards such a system presents challenges and that a common understanding of these challenges, and a collective approach to tackle them, is the most efficient way forward to accomplish the transition.

Open access advocate Ross Mounce warmly welcomed the statement on The Open Knowledge Foundation blog, highlighting its rejection of hybrid open access – a counterpoint to some arguments that have been put forward. It will be very interesting to see whether the Academy will interpret the position as a nudge towards implementing an actual mandate for Academy-funded research.


A few further bits and pieces:

  • I only just now ran into Cites & Insights: Crawford at Large, a web-based journal covering open access developments in great detail. The latest issue includes recent arguments for and against CC-BY licensing, a very nuanced discussion of various facets of the green vs. gold debate, and several items on article processing charges, repositories, the latest policy developments, etc., giving a very frank evaluation of the positions of various open access advocates.
  • Ars Technica published a rundown of the latest open access news.
  • A recent Scientific American blog post argues that elite journals are going to hell in a handbasket, since their share of the most highly cited articles is diminishing. Although the data is certainly interesting (and, coincidentally, shows that the plethora of Nature-branded journals is working well for NPG), I'm not quite convinced this is a major trend, at least not yet.
  • Finally, in an interview for Deutsche Welle, the editor-in-chief of Nature Medicine raises the concern that scientists may be less inclined to do tough experiments for open access journals. Although some such effect might be real (scientists being human and swayed by non-scientific motivations as well, as is well known), it's doubtful whether this has anything to do with open access per se. Furthermore, new mechanisms such as open peer review would likely have a big impact on this.

Update on open access in Finland

I've recently been involved in a few initiatives related to open access in Finland and at our university, Aalto.

First, there is an initiative to build a common Aalto Current Research Information System (ACRIS), into which all Aalto researchers will input their published research. The system will feed directly into Aaltodoc, our university's publication archive. I was graciously given a chance to offer input to the working groups of both projects, and in general the planned systems sound very sensible to me. However, ACRIS will not be adopted until 2015, while significant self-archiving is expected to start already in 2014 due to funder mandates, so the interim self-archiving workflow needs to be more streamlined than it currently is. As my colleague Jani Kotakoski and I recently wrote in Signum, the University of Helsinki open access mandate has not worked as well as it should have, and the technically overcomplicated self-archiving process is surely partly to blame. Hopefully some of the suggestions I made (such as metadata autofilling by DOI) can be implemented before the system comes into widespread use, so that Aalto will reach better levels of compliance (and awareness).
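
As an aside, DOI-based metadata autofilling is quite feasible with existing infrastructure. Here is a minimal sketch against the public CrossRef REST API – my own illustration of the general idea, not anything specified for ACRIS or Aaltodoc:

```python
import requests

def fetch_metadata(doi):
    """Look up basic publication metadata for a DOI via the CrossRef REST API."""
    response = requests.get("https://api.crossref.org/works/" + doi, timeout=10)
    response.raise_for_status()
    work = response.json()["message"]
    return {
        "title": (work.get("title") or [""])[0],
        "journal": (work.get("container-title") or [""])[0],
        "year": work.get("issued", {}).get("date-parts", [[None]])[0][0],
        "authors": [
            (author.get("given", "") + " " + author.get("family", "")).strip()
            for author in work.get("author", [])
        ],
    }

# Example usage with a well-known DOI; any DOI registered with CrossRef should work.
print(fetch_metadata("10.1371/journal.pmed.0020124"))
```

With something like this behind the submission form, an author would only need to paste a DOI and check the prefilled fields.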

I've also heard that our university is finalizing a draft of its upcoming open access policy – I've been trying to get involved with the process, but so far the people making the decisions have decreed that the policy preparation should be done behind closed doors. I'm a bit peeved by this, and it certainly doesn't help with the criticism that the Aalto administration is too closed to grass-roots participation. As we also wrote in Signum, Aalto is in a very good position to learn from the experiences of others. Furthermore, the timing is very opportune, since the international policy situation has really started to take shape in the past year. I'm personally strongly in favor of Aalto going further and adopting a strict mandate for open access self-archiving of all Aalto-funded research, and I believe I have the arguments to back up why this would be beneficial for the university. Let's hope they open the process for comments at the draft stage.

Second, on March 19 there was the second meeting of the Tutkimuksen Tietoaineistot (TTA) project (roughly, "Research Data Resources"), which looks at issues of open and linked data, metadata formats, long-term data preservation, and so on. The meeting was a bit technical for me, but it was good to hear what is being done on this front. Finally, the founding of an Open Science Finland working group was announced under the auspices of the Open Knowledge Finland association. There will be a joint meeting with the Finnish Open Access working group (FinnOA) on April 17, which I am going to attend.

Third, Bo-Christer Björk published an article on the rise of open access in the Finnish-language magazine Tieteessä Tapahtuu (loosely translatable as "Happening in Science"). The article is a recap of the results of the peer-reviewed article Björk and his colleagues published in BMC Medicine last year, which made a big impact when it came out (see e.g. the Guardian story I've linked to before). It is nice to see more articles in Finnish on a topic that seems to be very much on the table nationally at the moment.

I'll keep following the situation and will be sure to keep the blog updated on new developments.

There are also a couple of interesting links that I'd like to add here.

Michael Eisen, one of the more radical open access advocates, gives an overview of The Past, Present and Future of Scholarly Publishing on his blog. Randal Olson draws attention to a potentially ugly side of the open science movement, and Heather Morrison decries Elsevier's licensing policies.

A couple of interesting podcasts recently came out. First, we have a pair of publishers justifying the costs incurred in publishing scientific research on The Chronicle of Higher Education. Then, we have an interview with Ross Mounce for the Journal of Ecology podcast. I had the pleasure of meeting Ross, who is a Panton Fellow at the Open Knowledge Foundation, at the OKFest in Helsinki. He's very active in advocating open data and open access, and I warmly recommend subscribing to his blog at http://rossmounce.co.uk.

Finally, the Nature Publishing Group announced the upcoming launch of a data journal, Scientific Data. While on the face of it the initiative is laudable and in line with open science ideals, some of its specifics (e.g. the article fees and licensing terms) have already generated a lot of critical discussion on the Open Science mailing list.

It will be interesting to see how the debate develops when we move closer to the launch.


Nature's special issue on the future of publishing

Nature's latest issue is a special on the future of publishing. There's a ton of great content, starting with the introduction and the Editorial:

New technologies allow a much greater and faster transition to a digital future, and this week’s special issue reveals that scientists are finding a multitude of ways to publish and access their research results. As this journal has noted before, the future of research literature will ideally be an amalgam of papers, data and software that interlinks with tools for analysis, annotation, visualization and citation. The need for common standards is as great as ever.

But it is demand, not supply, that will shape how scientists and publishers grasp these opportunities. For instance, a key reason that online open-access journals are now accepted as a mainstream (if still minority) method of publishing research is because of the mandates steadily introduced since 2001 by institutions and by research funders.

The issue features stories on the rise of predatory publishing, on a truly outrageous case of journal identity theft used to scam authors, on the changing roles of libraries and on open data, on licensing issues (paywalled, alas), on the future of scholarly communication (see also this comment), and much more.

However, my favorite article by far is Richard Van Noorden's meticulously researched look at the true cost of science publishing. It covers almost all the important emerging data on the many aspects of the issue, and addresses most of the recent debates: the level of article processing charges, true publication costs, the added value provided by traditional publishers, and licensing. Richard was also a guest on the latest Nature podcast.

I'll highlight a few of the passages that most caught my eye, but I really recommend reading the whole thing. On the current debate:

The variance in prices is leading everyone involved to question the academic publishing establishment as never before. For researchers and funders, the issue is how much of their scant resources need to be spent on publishing, and what form that publishing will take. For publishers, it is whether their current business models are sustainable — and whether highly selective, expensive journals can survive and prosper in an open-access world.

On the current wide variation in the cost of publishing:

Data from the consulting firm Outsell in Burlingame, California, suggest that the science-publishing industry generated $9.4 billion in revenue in 2011 and published around 1.8 million English-language articles — an average revenue per article of roughly $5,000. Analysts estimate profit margins at 20–30% for the industry, so the average cost to the publisher of producing an article is likely to be around $3,500–4,000.... Outsell estimates that the average per-article charge for open-access publishers in 2011 was $660. ... But Philip Campbell, editor-in-chief of Nature, estimates his journal's internal costs at £20,000–30,000 ($30,000–40,000) per paper.
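
Just as a sanity check, the per-article figures quoted above are easy to reproduce with a bit of back-of-the-envelope arithmetic (my own calculation, not code from the article):

```python
# Back-of-the-envelope check of the per-article figures quoted above.
revenue_2011 = 9.4e9     # estimated industry revenue in 2011 (USD)
articles_2011 = 1.8e6    # English-language articles published in 2011

revenue_per_article = revenue_2011 / articles_2011
print(f"Average revenue per article: ${revenue_per_article:,.0f}")   # roughly $5,200

# With profit margins of 20-30%, the implied average cost per article is:
for margin in (0.20, 0.30):
    cost_per_article = revenue_per_article * (1 - margin)
    print(f"Implied cost at {margin:.0%} margin: ${cost_per_article:,.0f}")
# This gives roughly $3,650-4,200, in line with the quoted $3,500-4,000 estimate.
```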

On publisher profits:

Elsevier's reported margins are 37%, but financial analysts estimate them at 40–50% for the STM publishing division before tax. (Nature says that it will not disclose information on margins.) Profits can be made on the open-access side too: Hindawi made 50% profit on the articles it published last year, says Peters.... Commercial publishers are widely acknowledged to make larger profits than organizations run by academic institutions. A 2008 study by London-based Cambridge Economic Policy Associates estimated margins at 20% for society publishers, 25% for university publishers and 35% for commercial publishers.

On added value:

The key question is whether the extra effort adds useful value, says Timothy Gowers, a mathematician at the University of Cambridge, UK, who last year led a revolt against Elsevier. Would scientists' appreciation for subscription journals hold up if costs were paid for by the authors, rather than spread among subscribers? “If you see it from the perspective of the publisher, you may feel quite hurt,” says Gowers. “You may feel that a lot of work you put in is not really appreciated by scientists. The real question is whether that work is needed, and that's much less obvious.” ... A more-expensive, more-selective journal should, in principle, generate greater prestige and impact. Yet in the open-access world, the higher-charging journals don't reliably command the greatest citation-based influence, argues Jevin West, a biologist at the University of Washington in Seattle. Earlier this year, West released a free tool that researchers can use to evaluate the cost-effectiveness of open-access journals (see Nature http://doi.org/kwh; 2013).

On the path forward:

More than 60% of journals already allow authors to self-archive content that has been peer-reviewed and accepted for publication, says Stevan Harnad, a veteran open-access campaigner and cognitive scientist at the University of Quebec in Montreal, Canada. Most of the others ask authors to wait for a time (say, a year), before they archive their papers. However, the vast majority of authors don't self-archive their manuscripts unless prompted by university or funder mandates.

As I said, this is really excellent reporting, and it is one of those things that goes a long way towards justifying, for me, the current cost structure of the Nature Publishing Group. Go read it all.


Planck results on the CMB, and latest on the Higgs

The Planck Collaboration released the most detailed map yet of the Cosmic Microwave Background (CMB), as measured by the European Space Agency's (ESA) Planck satellite, along with an arXiv release of 30 submitted research articles analyzing the data. What boggles my mind is that, as far as I understand, these measurements represent pretty much the best possible data that can be obtained from the CMB, limited not by instrumentation but by fundamental quantum effects.

The Starts with a Bang blog gave a very good primer on what was expected from the data on the eve of the release, which I recommend reading before checking out the results. (See also Sean Carroll's anticipatory post on his blog.) For the results, you can read the official press release by ESA, the story by Nature News, or head back to Starts with a Bang for an excellent recap of the results:

So yes to inflation, no to gravitational waves from it.

Yes to three very light, standard-model neutrinos, no to any extras.

Yes to a slightly slower-expanding, older Universe, no to spatial curvature.

Yes to more dark matter and normal matter, yes also to a little less dark energy.

It thus seems the results held few surprises, although the corrections to the energy balance and the age of the universe were perhaps a bit larger than expected. See also Peter Woit's take on the implications for string theory (spoiler: no support whatsoever).


The other recent breaking news came from the Moriond particle physics conference (it seems physicists have a good thing going with these skiing conferences, as I know personally :), which saw the release of the latest data from the teams working on the Higgs boson at the Large Hadron Collider. The Quantum Diaries blog presents the results thus:

No more Higgs-like, Higgs-ish or even Higgsy boson. The CMS and ATLAS collaborations, the two large experiments operating at the Large Hadron Collider (LHC) at CERN, have now gathered sufficient evidence to say that the new boson discovered last summer is almost certainly “a” Higgs boson. Note that we are going to call it “a” Higgs boson and not “the” Higgs boson since we still need more data to determine what type of Higgs boson we have found. But all the analysis conducted so far strongly indicates that we are indeed dealing with a type of Higgs boson.

In a nutshell: it's a Higgs boson (as opposed to anything else), and in all likelihood the Higgs boson (fully as predicted by the Standard Model of particle physics), with no anomalous properties or new physics in sight. However, as stated in the press release, more data needs to be collected and analyzed before this is conclusive. Unfortunately, due to the LHC shutdown, we will have to wait two years for new, more stringent data.

For the most important plots of the data, see the viXra log, or the slides from the relevant talk at Moriond. For an analysis of the agreement with the Standard Model, see the Resonaances blog. Although many theoreticians are dismayed by the lack of any evidence for supersymmetry or other popular theories beyond the Standard Model, some think this is a good sanity check that will direct research in more fruitful directions.


New models for academic publishing

Academic publishing is changing fast. In this post, I'll describe not only the exciting recent policy developments, but also several new models for the publication of scientific research.

Open access is undoubtedly set to truly break through this year, mainly thanks to strong funder mandates (e.g. RCUK and Horizon 2020). The debate has started to shift to the relative merits of the author-pays gold model (or rather, funder-pays; see below) and the institutional-repository green model (see Peter Suber's widely accepted definitions for the terminology). Richard Poynder continues his important coverage of the developments, describing in detail the controversy over the UK's gold-first policy in the international context:

As the Committee began to explore the complaints it had received, a key issue emerged: Was RCUK’s policy in line with what the rest of the world was doing with OA, or was the country taking a risky gamble in the hope of acquiring a leadership role in the development of OA?

[...]

At a stroke, the risk the RCUK policy could pose for the UK begins to look much greater. If the rest of the world follows the US lead, rather than the lead of RCUK, the UK will likely discover that the extra transition costs it anticipates (paying both APCs and subscription) could continue indefinitely.

The "US lead" refers to the just introduced US bill dubbed FASTR, which would require free public access to all federally funded (published) research after a 6 month embargo period (green self-archiving), and also includes a lauded new open data component. Although such legislation has been introduced before, the current political climate is looking more amenable this time around. As the White House was still obligated to respond to a petition that gathered 65000 signatures demanding such a mandate, there were realistic hopes for a strong response from the Obama administration.

Thus, although not completely unexpected, the White House's response to the petitioners last Friday was very well received indeed:

The logic behind enhanced public access is plain. We know that scientific research supported by the Federal Government spurs scientific breakthroughs and economic advances when research results are made available to innovators. Policies that mobilize these intellectual assets for re-use through broader access can accelerate scientific breakthroughs, increase innovation, and promote economic growth. That’s why the Obama Administration is committed to ensuring that the results of federally-funded scientific research are made available to and useful for the public, industry, and the scientific community.

Moreover, this research was funded by taxpayer dollars. Americans should have easy access to the results of research they help support.

The US government will ensure this through an overarching mandate: any federal agency with more than $100 million in annual research and development expenditure must ensure that the research it funds is made openly available after a guideline 12-month embargo period (green self-archiving). The full memorandum of the decision is available on the White House Office of Science and Technology Policy's website. See also the reporting by Nature News (also this) and Ars Technica, and the response from the National Science Foundation. Such a strong green mandate from the US leaves the UK policy looking rather lonely in the international context. Nature's recent editorial – perhaps unsurprisingly, since Nature has a vested interest in gold – expressed some disappointment:

But in 2013, it looks as if a combination of financial constraints and a lack of firm resolve at the top of the US government is blocking movement towards the policy that ultimately benefits science the most: ‘gold’ open access, in which the published article is immediately freely available, paid for by a processing charge rather than by readers’ subscriptions.

[...]

As for Nature, we view the US position as a signal that in the longer term, for highly selective journals, fully funded gold open access is a scientific necessity.

However, Peter Suber, one of the leaders of the open access movement, warmly welcomed the decision in a statement on Google+ soon after the announcement. He highlighted the similarities and differences between the White House memo and the FASTR legislation, making the case that both are necessary. For a more in-depth analysis, see his latest SPARC newsletter, where Suber compellingly argues that the directive should be viewed as a step forward despite the 12-month embargo:

Some friends of OA have criticized the White House policy for allowing excessive embargoes. I join the criticism but praise the policy. We must distinguish a backward step from a forward step that could have been larger. The White House directive is a forward step that could have been larger. Whatever the reasons were for not taking a larger step forwards, there's no sense at all in which it's a step backwards.

However, resistance from incumbent publishers remains strong, as does the rhetoric of open access advocates not content even with the recent progress, and more wrangling over the exact course of the change looks unavoidable.

Whatever happens with legislation, it is clear we are at a tipping point, with some estimates finding as much as 17% of literature published in 2011 openly available from publishers (however, see the discussion on the Guardian report of the study). With such a major shift in the centuries-old publishing industry in full swing, lean newcomers are seeking to challenge the incumbents, while the old giants are scrambling to reinvent themselves (at least on paper). New ideas and models are cropping up at an increasing pace, and several interesting novel ways of organizing the publication of scientific research have started to emerge.

One of the more disruptive new initiatives is the recently launched PLoS ONE–like journal PeerJ, which is trying to change the old adage "publish or perish" into "publish until you perish". PeerJ offers very affordable lifetime memberships that allow members to publish one, two, or an unlimited number of papers per year for a single one-time flat fee. Additionally, PeerJ will use article-level metrics (as does PLoS ONE) and (opt-in) open peer review. The launch has garnered some very positive reactions, although questions about the sustainability of the funding model seem to remain. An important issue is the quality of the peer review, and PeerJ's approach of asking members to review one article per year is clever (and already in use elsewhere). It is also worth mentioning that a few companies are trying to decouple peer review from journals altogether.
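
To get a feel for the economics of the membership model, here is a rough comparison of a one-time fee against paying a per-article APC for every paper. The prices are purely illustrative placeholders of the right order of magnitude, not PeerJ's (or anyone else's) actual rates:

```python
# Illustrative comparison of a one-time "lifetime membership" publishing model
# against a pay-per-article APC model. All prices are made-up placeholders.
LIFETIME_MEMBERSHIP = 300.0   # one-time fee, unlimited papers per year (hypothetical)
PER_ARTICLE_APC = 1350.0      # per-article charge at a large OA journal (hypothetical)

def cumulative_cost(papers_published, model):
    """Total publishing cost after a given number of papers under each model."""
    if model == "membership":
        return LIFETIME_MEMBERSHIP            # paid once, regardless of output
    elif model == "apc":
        return PER_ARTICLE_APC * papers_published
    raise ValueError("unknown model: " + model)

for n in (1, 2, 5, 10):
    print(f"{n:2d} papers: membership ${cumulative_cost(n, 'membership'):7,.0f}  "
          f"vs. APCs ${cumulative_cost(n, 'apc'):7,.0f}")
```

This glosses over the details of the actual membership terms, but it shows why the model looks so disruptive on paper: the one-time fee is amortized over an author's whole publishing career rather than charged per article.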

Another interesting development is the concept of arXiv overlay journals:

What is an arXiv overlay journal? It is just like an electronic journal, except that instead of a website with lots of carefully formatted articles, all you get is a list of links to preprints on the arXiv. The idea is that the parts of the publication process that academics do voluntarily — editing and refereeing — are just as they are for traditional journals, and we do without the parts that cost money, such as copy-editing and typesetting.

The Episciences Project is trying to build a platform that makes it very straightforward to launch such arXiv overlay journals. For the moment this activity seems to be concentrated among mathematicians, including Elsevier boycotter Timothy Gowers. Although open access advocate Stevan Harnad has questioned the need to give this approach a new name, that seems a rather secondary concern considering the real change the approach would represent.
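
To make the overlay idea concrete, here is a toy sketch of what such a "journal" boils down to technically, namely a refereed, curated list of links to arXiv preprints. This is purely my own illustration (with placeholder arXiv IDs), not how the Episciences platform is actually implemented:

```python
# A toy illustration of the arXiv "overlay" idea: the journal itself is little
# more than a curated, refereed list of links to preprints hosted on arXiv.
# The arXiv IDs below are placeholders, not real accepted papers.

accepted_papers = [
    {"arxiv_id": "1234.5678", "title": "An example accepted paper"},
    {"arxiv_id": "2345.6789", "title": "Another accepted paper"},
]

def render_issue(papers):
    """Render a minimal HTML 'issue' that simply links to the arXiv abstracts."""
    items = "\n".join(
        f'<li><a href="https://arxiv.org/abs/{p["arxiv_id"]}">{p["title"]}</a></li>'
        for p in papers
    )
    return f"<html><body><h1>Issue 1</h1><ul>\n{items}\n</ul></body></html>"

print(render_issue(accepted_papers))
```

The editorial board and referees supply the quality control; arXiv supplies the hosting; the "journal" itself is just the curated index.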

As mentioned in the Guardian story, the article processing charges (APCs) of many commercial publishers are rather high (and the degree of openness they buy varies greatly). This offers a clear opportunity for newcomers to establish themselves. New open access journals of a more traditional type taking advantage of this include eLife, which sets out to compete with Science and Nature and is able to waive its APCs, at least for the time being, thanks to the support of its backers; the aggressively priced Forum of Mathematics, which some of the people behind the Elsevier boycott are backing; the PLoS-like Open Library of Humanities; and a series of IZA economics journals.

Consolidation among publishers is still ongoing, the latest news being that the Nature Publishing Group (NPG) has bought a majority share in the Swiss-based, community-driven open access publisher Frontiers (read the announcement and the Nature News Blog coverage). This marks a big step up in the volume of open access articles NPG publishes: all NPG journals together published 2,000 open access articles in 2012, while Frontiers had doubled its article count to 5,000, making it the world's fifth-largest open access publisher. Frontiers has also experimented with open peer review and other community features, which NPG is very interested in.

I've also personally had a good publishing experience with the Beilstein Journal of Nanotechnology, which is likewise able to waive its APCs, at least for the time being, due to society support. Although commercial publishers understandably have a great interest in defending their high APCs, the emerging data does not seem to show a strong correlation between high price and article-level citation impact – an indicator that will be very interesting to follow as the citations stack up.

A related debate concerns the claim that gold open access burdens the authors themselves through APCs. Peter Suber of the Harvard Open Access Project debunked this claim thoroughly on Google+, providing interesting data on where the costs actually fall. Commenters chimed in with important further details, such as the distribution, by scientific discipline, of open access journals that do and do not charge APCs. Many publishers (including the Forum of Mathematics) also offer fee waivers for non-affiliated or otherwise unsupported authors.

Finally, there has been lively recent discussion on the best licensing terms for open access publications. Traditionally, open access activists (and indeed most open access journals) have favored the liberal Creative Commons Attribution (CC-BY) license, which basically allows any sort of reuse with proper attribution. Some journals have opted for the CC-BY-NC license, which prohibits commercial reuse, but this can be problematic for several reasons. However, others argue that no licensing choice is entirely unproblematic.

To conclude, it seems clear that 2013 will be at least as interesting a year for open access as 2012 was. Since these issues are currently very much on the table in Finland as well as internationally, I'll be sure to keep the blog updated on any interesting new developments.

In the meantime, if you can read Finnish, check out the magazine articles my colleague Jani Kotakoski and I have written on the topic, especially the latest one that just came out in Signum, the magazine of the Finnish Research Library Association. We are also looking for new venues in which to keep following the situation in Finnish, so any tips on that front are most welcome. Please also see the slides (embedded below) of a recent talk I gave on the positive effects open access and social media can have on research impact.

My thanks to Ross Mounce and Jani Kotakoski for commenting on a draft of this post.

[slideshare id=15932694&doc=openaccessandsocialmedia-130110080010-phpapp02]


A Year in Helsinki (timelapse)

[youtube http://www.youtube.com/watch?v=zjMV3cu4tX8?rel=0&w=599&h=337]

An awesome timelapse video shot mostly during the winter in Helsinki – most places are very familiar to me. Click here to watch it fullscreen in HD.


The Overview Effect (video)

[vimeo http://www.vimeo.com/55073825 w=500&h=213]

OVERVIEW from Planetary Collective on Vimeo.

Watched this yesterday – amazingly inspirational. It truly sounds like seeing the Earth from space is a genuine spiritual experience. I'm definitely going, one way or another, before I die.


The Episciences Project

An interesting new journal platform concept called the Episciences Project was recently unveiled by Timothy Gowers on his blog and subsequently reported on by Nature News:

Many mathematicians — and researchers in other fields — claim that they already do most of the work involved in publishing their research. At no cost, they type up and format their own papers, post them to online servers, join journal editorial boards and review the work of their peers. By creating journals that publish links to peer-reviewed work on servers such as arXiv, Demailly says, the community could run its own publishing system. The extra expense involved would be the cost of maintaining websites and computer equipment, he says.

An interesting idea, although I do think the proofreading, unified formatting, and copyediting services – at least at the better journals – are not superfluous, overly expensive as they might be. Just think about the legibility of some of the manuscripts many of us have surely reviewed. A middle way sounds more appealing to me, but of course, the more approaches are tried, the better.

On a related note, Mike Taylor recently wrote a rather strongly worded blog post in the Guardian on the immorality of publishing behind paywalls; it is worth a read.


Quantum foundations poll

Nature News and the Quantum Frontiers blog reported on a recent poll that Anton Zeilinger and colleagues conducted at a quantum foundations meeting in Vienna and have published on arXiv. The poll was designed to gauge the opinions of the participants (who included philosophers of science as well as active physicists) on the interpretation of quantum mechanics. Unsurprisingly, there was no consensus on the "correct" interpretation (from Nature News):

For example, votes were roughly evenly split between those who believe that, in some cases, “physical objects have their properties well defined prior to and independent of measurement” and those who believe that they never do. And despite the famous idea that observation of quantum systems plays a key role in determining their behaviour, 21% felt that “the observer should play no fundamental role whatsoever”.

If I had to toss in my two cents, I would pick the "physical objects have their properties well defined prior to and independent of measurement" and "the observer should play no fundamental role whatsoever" hats. Of course, measurement can and will affect the state of the system, but I gather the point was to argue against the idea that the observer plays some special, fundamental role. I'm not completely sure whether I agree that there should be "no fundamental limit to quantum theory" – I still find Penrose's objective reduction ideas attractive, though only experimental evidence will tell one way or the other.

Sean Carroll posted a graph of the results on his blog, calling it the "most embarrassing graph in physics".

He explained the reason for calling it this as follows:

Think about it — quantum mechanics has been around since the 1920′s at least, in a fairly settled form. John von Neumann laid out the mathematical structure in 1932. Subsequently, quantum mechanics has become the most important and best-tested part of modern physics. Without it, nothing makes sense. Every student who gets a degree in physics is supposed to learn QM above all else. There are a variety of experimental probes, all of which confirm the theory to spectacular precision.

And yet — we don’t understand it. Embarrassing. To all of us, as a field (not excepting myself).

I'd have to agree – it is embarrassing that there is so little understanding of the interpretation of quantum mechanics. The statements on which a majority did agree were (from Quantum Frontiers):

1. Quantum information is a breath of fresh air for quantum foundations (76%).
2. Superpositions of macroscopically distinct states are in principle possible (67%).
3. Randomness is a fundamental concept in nature (64%).
4. Einstein’s view of quantum theory is wrong (64%).
5. The message of the observed violations of Bell’s inequalities is that local realism is untenable (64%).
6. Personal philosophical prejudice plays a large role in the choice of interpretation (58%).
7. The observer plays a fundamental role in the application of the formalism but plays no distinguished physical role (55%).
8. Physical objects have their properties well defined prior to and independent of measurement in some cases (52%).
9. The message of the observed violations of Bell’s inequalities is that unperformed measurements have no results (52%).

I've always felt that the Copenhagen interpretation is an unsatisfying cop-out covering up the fact that we (still!) don't understand the theory well enough at a fundamental level. It's nice to see the amount of recent activity on the topic, though I find the thought that "there will still be conferences on the foundations of quantum theory in 50 years' time" a bit depressing...

Sean agrees about Copenhagen and ends on a somewhat more optimistic note:

 All we have to do is wrap our brains around the issue, and yet we’ve failed to do so.

I’m optimistic that we will, however. And I suspect it will take a lot fewer than another eighty years. The advance of experimental techniques that push the quantum/classical boundary is forcing people to take these issues more seriously. I’d like to believe that in the 21st century we’ll finally develop a convincing and believable understanding of the greatest triumph of 20th-century physics.
