“A busy life is a wasted life.” – Francis Crick
“The world does not belong to the meek.” – James Watson
James Watson (left) and Francis Crick (right) examining the three-dimensional model of DNA.
Credit: A. Barrington Brown/Science Photo Library
One of the most famous photographs from the era of modern science shows two men studying stick-like extensions on a crude spiral scaffold. The standing man points at a detail on the wiry mesh while his younger colleague looks up, engaged, as if in thrall to a great magician waving his wand. The two men in their loosely fitting suits exude an air of casualness, belying the moment’s scholarly gravity. In 1962, Francis Crick and James Watson, the stick-wielding sage and his equally brilliant wingman, received the Nobel Prize in Physiology or Medicine for their “discoveries concerning the molecular structure of nucleic acids” – DNA, the chemical code that underlies all life in the known universe. Their finding caused a paradigm shift in how biological life and genetic heredity were understood. Instead of letting nature take millennia to perfect living organisms, scientists could begin to engineer life’s chemical code in an evolutionary blink of an eye. Today, most biotechnological applications, from insulin production to the vaccines against Covid-19, could not exist without Crick and Watson’s discovery. Time, however, eventually showed both sides of its Janus face to these heroes of scientific history. While Crick’s esteem and eminence grew throughout his professional career, Watson ended up a complete pariah in the scientific arena.
The Ideological Human
I was not impervious to the mythology of Watson and Crick during my early days as a biochemistry major. Learning the basics of practical molecular biology meant spending painstaking hours in an ill-lit laboratory, where a prickly scent of ammonia lingered around mysterious powders in dark-brown jars. Those anonymous lab hours tended to dissolve into a shapeless haze of days and weeks, but I still recall the morning when the teaching assistant appeared uncharacteristically perky, enthusing about the “essence of life” we were about to witness. It turned into a long day, starting with a pile of bovine livers that were diced and ground, shaken and stirred, until, from the discolored mush, a few drops of transparent, watery liquid had been extracted. Then just a droplet of another chemical and – voilà! A thimbleful of white, fluffy cotton suddenly solidified inside the test tube. My first memory of DNA is distant but indelible, a true nerd’s wet dream, and I have Messrs. Crick and Watson to thank for it.
After receiving the Nobel Prize, both men continued building on their by now well-known legacies. Francis Crick moved on to the Salk Institute to study the origin of consciousness, received the Order of Merit, and, posthumously, lent his name to one of the largest biomedical research centers in the UK. James Watson, for his part, headed to Cold Spring Harbor Laboratory, one of the most prominent genetic research facilities in the United States, and in the 1990s led the Human Genome Project at the National Institutes of Health. Furthermore, Watson’s resistance to the Vietnam War and his vocal opposition to the nuclear arms race in the seventies made him an admirable, principled pacifist in my eyes. This resonated with me strongly, as I had refused my mandatory military service, opting instead for substitute service in the civil sector as a conscientious objector. My decision had brought me head-to-head with my own family, especially my grandfather, who had fought and bled in the Soviet-Finnish Winter War of 1939. By refusing the national(istic) rite of masculinity for personal reasons, I was seen as simultaneously rejecting my grandfather, the values he represented, and the sacrifices he had made. Crick’s and Watson’s characters both reflected my idealistic notion of a socially conscious scientist acting for the benefit of humankind. True, there was no money to be made in my future profession, but at least I could proudly stand on the venerable shoulders of those who had gone before me.
The Franklin Controversy
The first cracks in Crick and Watson’s stature appeared with accusations of sexism and the illicit appropriation of a colleague’s groundbreaking work conducive to the discovery of DNA’s structure. Rosalind Franklin was an experienced researcher at King’s College London in the early 1950s, while Crick and Watson worked at Cambridge. She had become a specialist in X-ray diffraction, a technology developed to decipher the three-dimensional structures of viruses and other minuscule objects invisible to the eye. Unbeknownst to her, Maurice Wilkins, a colleague with access to Franklin’s data, is claimed to have shown a pivotal photograph from her experiments to Watson, who consequently used it to ascertain the validity of his and Crick’s work. While Crick and his wife became close friends with Franklin, Watson later downplayed her role as supplementary and coincidental, even claiming she did not truly understand her own work and its importance despite her Ph.D. and years of experience.
Watson’s dismissive chauvinism became more apparent in his later writings, such as when he described parties at Cold Spring Harbor where “girls, as opposed to intellectuals and wives, would be present.” Moreover, he has always considered good academic research a product of aggressive competition. Franklin died of ovarian cancer in 1958, only four years before Crick and Watson received the Nobel Prize – which they shared with the aforementioned Wilkins, who allegedly gave them access to Franklin’s data. Did she feel at ease working as a single woman in the heavily male-dominated and competitive academic arena of the 1950s? How did she cope with the snubs labeling her a cold-natured spinster in a culture that prioritized submersion in work at the cost of relationships? Did she resent the exploitation of her work, which paved the way for the bright male stars of Cambridge? Franklin’s groundbreaking efforts to further scientific progress became renowned only after second-wave feminists unearthed her unacknowledged contributions from the campus archives. At the time, I rationalized my misgivings about Crick and Watson’s behavior as a reflection of the male-dominated academic culture of their era – a thing of the past. After all, suggestive speculations, insinuating guesses, and contradictory interpretations had no place in true science, where my passion lay.
Feet in Their Mouths
Things only got worse from there, however. Amidst the societal discussion over the ethics of prenatal genetic profiling, James Watson claimed homosexuality was a legitimate cause for terminating a pregnancy. As a gay man who had recently come out to his entire family, I felt Watson’s words hitting close to home. Homophobia was supposed to stem from inadequate education about, and exposure to, queer people. I felt invalidated, diminished, and reduced to a one-dimensional biological automaton by someone I greatly revered for his intellect. Rationalization to my rescue, then: Watson was clearly not up to date on the diversity of sexuality, nor on the heteronormative naturalization his thinking represented. Perhaps he had merely meant a pro-feminist comment on women’s inalienable right to decide over their own bodies without blame or patronizing condescension; or perhaps it was the media, more interested in controversial, paper-selling soundbites than in a nuanced discussion of the ethics of science. So I offered him the same leniency I did my parents, whose careless words disclosed their confusion and uncertainty as they came to terms with my queerness at the time of my coming out.
I was sorely mistaken, though, as my Watson slowly transformed into a Watson I no longer recognized. He said that black people were genetically predisposed to lower intelligence than whites. To compensate for such an inherent deficiency, nature had given brown-tinted men a higher libido, which explained the difference between Latin Lovers and English Patients. He said the Chinese were intelligent but not creative like Jews, because evolution had enriched features promoting conformity in the Asian population. One of Watson’s main interests was genetic engineering, which he considered a tool to cure the stupid and create “pretty girls.” Furthermore, low IQ was, for him, sufficient reason to curtail stupid people’s reproductive rights. As for Francis Crick, public controversies mostly passed him by, despite his peculiar speculations about space-traveling aliens implanting the seeds of life on a young Earth billions of years ago. He entertained the idea of encouraging wealthy people to reproduce and propagate desirable hereditary characteristics to combat society’s stupidity, but he mentioned it only in private letters. Like the complementary but antiparallel strands of DNA (each paired nucleotide thread chemically oriented to run in the opposite direction from the other), Crick’s and Watson’s ethea diverged from each other, as did mine from both of them.
The Crazy Geniuses
The cult of the scientific hero as an idiosyncratic genius, exemplified by Crick and Watson, persists throughout the annals of history. Such historical mythologies and laudatory narratives, however, rest mainly on a highly biased selection of facts, which hides their unpleasant and often misogynistic streak. Galileo Galilei, for example, has been called the father of modern science, a man who stood up for his principles despite religious repression and persecution. Based on his observations of the heavenly bodies and the flow of tides, he concluded that the Aristotelian, geocentric worldview was a fallacy, and he became the most prominent astronomer-skeptic of the pre-modern era. His theory displaced the Earth – and, more importantly, man – as the center of the universe. It also challenged the interpretation of the Holy Scriptures, which incurred the wrath of the Catholic Church. Yet Galileo also fathered two daughters out of wedlock and never recognized them, for fear of financial burdens and the shame they might bring to his social status. He sent his teenage daughters to a Catholic convent, where they lived in poverty and squalor, sexually abused by priests, as described by his daughter Virginia in letters sent to her father.
Charles Darwin, in turn, is the paragon of Victorian science, whose paradigm of evolution and natural selection shattered humans’ understanding of themselves. Despite relativizing the position of Homo sapiens in the Earth’s biological continuum, he upheld male eminence by declaring that “man has ultimately become superior to woman” on the basis of his understanding of the laws of inheritance. Drawing inspiration from his half-cousin Darwin’s revolutionary theory of evolution, Francis Galton entertained the idea of creating an “extraordinarily gifted race” through the selective breeding of humans. Despite his commendable contributions to meteorology and biological statistics, Galton remains infamous for coining the term “eugenics,” which propagated scientific racism on pseudobiological grounds and contributed to the genocidal horrors of the early 20th century. More recently, eccentric masterminds harboring unscientific ideas have garnered attention as well. After his LSD-infused Ph.D. years in California, the American biochemist Kary Mullis revolutionized genetics, forensics, and diagnostics by inventing PCR technology. After receiving the Nobel Prize in Chemistry in 1993, he garnered infamy by asserting his astrological beliefs and denying the human contribution to climate change. Most alarmingly, he became known as an AIDS denialist, repudiating the role of the human immunodeficiency virus (HIV) as the cause of AIDS until his death in 2019.
The Controversial Good
Meaningful public discourse on science and its ethical-social significance walks a fine line between consensus and confrontation. Francis Crick passed away in 2004 with his professional reputation lauded and intact. By contrast, James Watson’s stature continues to diminish as institutions revoke honorary titles and acknowledgments because of his continued pseudoscientific ramblings. Personally, Crick and Watson’s public participation and controversial legacy in science have exposed my shortcomings and my latent susceptibility to being swayed by fame, personality, and charisma. More importantly, however, they have also helped me crystallize what exactly attracted me to science. First and foremost, it is an arena for free speech, unadulterated ideas, and creative aloofness. Its controversial quandaries and boundless imagination may clash with socially and politically correct normativity, but as John Stuart Mill argued, better speech is the most efficient weapon against bad speech. Second, science aims to address questions concerning the emergence of animate life from inanimate matter without invoking a supernatural higher power (Crick and Watson have described themselves as a skeptic and an atheist, respectively). Crick focused on cognition, consciousness, and memory in his later career, propelled by his unshaken belief in their biological origin. Watson, too, became interested in the neurological basis of mental illness because of his son’s schizophrenia. Third, science is everyone’s business. Watson has opposed the commercial exploitation of human biology, stating that genes belong to people, not nations, and calling the patenting of human genes “lunacy.” In 2003, Crick and Watson signed the Humanist Manifesto, whose core tenets include life’s value emerging from “individual participation in the service of humane ideals” and maximal individual happiness created by working for society’s benefit.
I could pick out the discrepancies between the manifesto’s maxims and some of Crick and Watson’s statements over their long careers. Alternatively, I could point out the alignment between those credos and the scientists’ contributions. In the end, I choose to do both and still see us sharing the same non-black-and-white human space.
William Shakespeare wrote that it is a fool’s prerogative to utter truths that no one else will speak. Over the years, some of Crick and Watson’s (g)utterances have been beyond foolish, descending into inhumane nihilism, but the world has heard their intelligent reasoning loud and clear as well. Such provocative bluntness perfectly reflects one of Watson’s rules: equate success with the willingness to get into deep trouble, because nobody can identify the weakness in your thinking better than your enemy. I have therefore not canceled Crick, or even Watson, despite their outdated attitudes, quirky comments, and reprehensible claims. Wherever friction and dissensus occur, the realm of knowledge can expand, even at the risk of anguish and anger. Wherever people become bores or brands, opinions replace facts, and spin substitutes for content, reality becomes brittle. By categorically negating elements of Crick and Watson’s uncomfortable, subjective truths, I would deny myself the at times frustrating, at times exhilarating navigation across the grey terrain of life between comforting complacency and ethical nihilism. More than that, the story of Watson and Crick is an apt personal reminder that the truths of science should not be confused with the personal beliefs or politics of its practitioners and interpreters. I have come to learn that idolatry and authority, not science per se, are seductive and deceitful. Although my heroes’ self-inflicted fall is disheartening, the paradigms remain.
By Pasi Halonen
Thompson, Gilbert R., editor. Nobel Prizes That Changed Medicine. World Scientific Publishing Company, 2011.
Hargittai, Istvan. The DNA Doctor: Candid Conversations with James D. Watson. World Scientific Publishing Company, 2007.
Polymerase chain reaction (PCR): most contemporary gene technology applications, such as virus diagnostics and genetic manipulation, make use of PCR technology.
Mill, John Stuart. On Liberty, Chapter II: “Of the Liberty of Thought and Discussion.” Cambridge University Press, 2012.