Virtually every young student of science has had a fantasy about winning a Nobel Prize. It has become deeply embedded in popular culture, representing not only having done something great but actually being great. But as we mature, these fantasies quickly take a backseat to reality. Only a small fraction of scientists even come into personal contact with Nobel laureates; for everyone else, they take on an almost mythical quality and are not relevant to their own lives. So nobody goes into an area of research with the idea that there will be a big award at the end. Rather, we enter a field out of curiosity and interest in the problem, its applications, perhaps its benefit to the world, and, more pragmatically, job prospects.

Nevertheless, scientists are only human. Like everyone else, we can be ambitious and competitive and crave recognition. Instead of inculcating a feeling that the work is its own reward, the scientific establishment feeds this desire to feel special and somehow better than our peers at virtually every stage of the process. The corruption starts early, with small prizes throughout our education, then prestigious fellowships, then early-career under-forty prizes. Later on, scientists hanker to be elected to their country’s academies and then win even grander prizes. It is the darker side of a natural human desire to feel respected by our colleagues. However, all these prizes, awarded at different career stages, affect only a tiny fraction of scientists. Most of them go to people who work in elite institutions, have had powerful mentors and networks, and are on the fast track to fame and glory.

At the pinnacle is the Nobel Prize, but it is rare for someone to suddenly be awarded the Nobel without some hints that he or she is a serious candidate.

As soon as someone does something considered significant, he or she is in contention for lots of prizes that are little known to the public. One might think that these prizes are all independent and, with the explosion of science, could be used to recognise many different scientists and discoveries.

In fact, the whole system is beset by a kind of cronyism. The various prizes often go to the same people, who are often well-known and powerful scientists. Often one bold committee makes the first award in a new field, and then other prize committees play it safe by following suit. This can quickly have a snowballing effect, with the result that the same luminaries pick up lots of awards. Moreover, the primary motive for many of these prizes becomes honouring the prize itself along with its nominating committee – rather than honouring the recipient or selecting good role models or highlighting interesting work in unfashionable areas.

So instead of differentiating themselves from the Nobel Prize and complementing it, many of these committees measure their success by how many of their awardees go on to win the Nobel Prize and proudly advertise this fact. You could consider these awards “predictors”, in the way that the BAFTA or the Golden Globe is often considered a predictor for the Oscars. In the worst kind of subservience, these prizes will not consider even a subject, let alone a person, that has already been awarded the Nobel.

How did the Nobel acquire its standing? It was instituted almost accidentally by a Swedish chemist, Alfred Nobel, who invented dynamite and parlayed it into a huge industry. Worried about his legacy, he decided that the bulk of his enormous fortune should go towards a set of prizes. Three prizes in the sciences – physics, chemistry, and physiology or medicine – were to be administered in Sweden, along with a prize in literature. A separate peace prize was to be administered in Norway. Curiously, there was no prize for mathematics.

The timing of the first Nobel Prizes in 1901 was particularly propitious. Their inception coincided with the kind of revolution in science that happens only once every few centuries. Physics had discovered quantum mechanics, subatomic particles, and relativity, changing our view of matter forever. These in turn revolutionised our understanding of the forces that hold molecules together and the mechanisms of chemical reactions, making chemistry a modern discipline. The discovery of genes and the inner structures of the cell revolutionised biology. Many of the earlier recipients of the Nobel Prize, like Planck, Einstein, Curie, Dirac, Rutherford, and Morgan, were giants who will be remembered forever. When combined with the staggering sum of money involved, which at the time was enough to guarantee the recipient financial security for the rest of his or her life, the Nobel soon became synonymous with greatness.

It was certainly not infallible, making some terrible omissions, like Mendeleev, who discovered the periodic table, which is the basis of modern chemistry; Lise Meitner, who provided an explanation for nuclear fission; or Oswald Avery, who discovered that DNA is the genetic material. Occasionally there were also blunders, like the prize for lobotomy or giving one to Johannes Fibiger for his now discredited finding that roundworms caused cancer, while simultaneously rejecting Yamagiwa Katsusaburo, who showed that chemicals in coal tar can cause cancer, which paved the way for the study of carcinogens.

The prize is also subject to some rather arbitrary rules. Nobel specified that the prize was to be for a discovery or invention made during the previous year, but because it often takes years or even decades for the importance of something to become clear, this rule was very quickly discarded.

A second arbitrary rule, that the prize would be awarded to no more than three recipients, was formalised only in 1968. In a slavish imitation of the Nobel, the Lasker Foundation adopted the same rule in 1997 for its prizes, known as America’s Nobels. The Lasker jury has been chaired for many years by Joseph Goldstein, a Nobel Prize winner who discovered the fundamental biology behind statins and has thus helped prevent millions of heart attacks and strokes. As someone on statins myself, I am personally a beneficiary of his research. In a recent article in Cell, he justified the rule of three by giving the number an almost mystical quality and comparing it to art’s three-panel triptych. I think it merely shows that years of chairing a jury can make the criterion seem so natural and reasonable that even someone of Goldstein’s stature and intellect can justify it by this exercise in numerology.

In fact, the rule of three is inappropriate today. When the Nobel Prizes started in 1901, scientists worked in relative isolation and met only once every few years. By the time they announced their findings, there was no question who had discovered what, and it was rare that more than three people would have contributed to the exact same discovery. In today’s world, the germ of an idea shared at a meeting quickly spreads throughout the world and lots of people contribute to its development.

It is not always clear whether the original idea or some later contribution was the truly groundbreaking advance. In sport, there is a clear way to measure performance – the biggest score, the fastest time, the longest jump, or the highest vault. But in science, figuring out three people who made the real difference in a particular field becomes increasingly difficult and subjective, if not impossible. Also, the explosion of science in the last half century has meant that lots of important advances never get the prize and it has increasingly become a lottery.

Increasingly, the rule of three means that year after year, there are complaints about people who have been overlooked. Although many big advances, like the discovery of the Higgs boson or sequencing the human genome, are done by large collaborative teams, the science prizes, unlike the peace prize, are not awarded to institutions. And though the Nobel amount is large, there are now prizes that dwarf it in cash value. For these and other reasons, the Nobel may be in danger of losing its unique, exalted status.


Because many of the early Nobel laureates were giants in their field, the idea has taken hold – especially among non-scientists – that Nobel laureates are geniuses. In fact, the prize is not awarded for being a great scientist but rather for making a groundbreaking discovery or invention. Some of them may be extraordinarily brilliant, but others are just good scientists who were persistent or happened to stumble onto a major finding. Being in the right place at the right time often helped enormously. What Malvolio said in Shakespeare’s Twelfth Night equally applies to Nobels: some are born great, some achieve greatness, and some have greatness thrust upon them.

But the label of genius that goes with the Nobel means that scientists, if they reach the stage where there is even a slight chance of getting one, hanker after it. To get one is – at least in the public’s perception – to join the pantheon of the greats.

Some hanker after it so much that it changes their behaviour, and their writings and public appearances take on all the hallmarks of a political campaign. They become deeply unhappy and frustrated when, year after year, they fail to get one – a disease I call pre-Nobelitis.

After the prize, there is post-Nobelitis. Suddenly, scientists are thrust into the limelight and bask in the public adulation that goes with it. They are asked for their opinion on everything under the sun, regardless of their own expertise, and it soon goes to their head. Some of them are long past their prime, having made their big discoveries decades earlier, and the renewed attention means that they spend their time wandering around the world, pontificating about all sorts of things. They become what I call professional Nobel laureates. Some laureates escape the disease, either because they are still very active scientists who simply ignore the distractions and continue to focus on doing the kind of good science that got them the prize in the first place, or because they use the prestige to do good for science in various leadership roles. A great example of the latter is Harold Varmus, who won the prize for identifying genes that in some circumstances can transform a normal cell into a cancerous one, but then became the director of the National Institutes of Health in the US and a strong advocate for biomedical research.

Prizes are often touted as a good thing for science: they increase its visibility with the public and provide good role models, especially for young people. Nacho Tinoco, the famous physical chemist who was Brian Wimberly’s mentor, once told me he thought the Nobels were good for science because they fostered competition among top scientists and spurred them to do their best work. Prizes may be good for science, but they are not so great for scientists: they distort behaviour and exacerbate competitive streaks, creating a lot of unhappiness.

Excerpted with permission from Gene Machine: The Race to Decipher the Secrets of the Ribosome, Venki Ramakrishnan, HarperCollins India.