Gregory Sloop
May 13, 2019 · 10 min read


What Do Paradigm Shifts Look Like?

H. pylori. Courtesy of Yutaka Tsutsumi, M.D., Professor, Department of Pathology, Fujita Health University School of Medicine.

How often is the mainstream proven wrong within the span of a single lifetime? We no longer believe the sun goes around the Earth, but the Copernican revolution took hundreds of years. The groundbreaking work of several other scientists, such as Gregor Mendel, who described basic principles of genetics; Ignaz Semmelweis, who proposed that physicians disinfect their hands before examining patients; and Alfred Wegener, the father of continental drift, was also not widely accepted until after their deaths.

In fact, the interdisciplinary scientist Moti Nissani argues that resistance to new ideas is common and a serious obstacle to scientific progress. He published an eclectic list of 47 innovators, including physicians, scientists, engineers, an explorer, and Henry David Thoreau, whose accomplishments received delayed recognition. Benjamin Franklin was included because of his comment on lead toxicity: “the Opinion of this mischievous Effect from Lead, is at least above Sixty Years Old; and you will observe with concern how long a useful Truth may be known and exist before it is generally receiv’d and practis’d on.”

If Nissani’s list had included painters, Vincent van Gogh, who sold only one painting during his lifetime, would have been on it. In 2017, more than a century after his death, his painting Laboureur dans un Champ sold for $81.3 million. In a wide variety of fields, important innovations, even genius, are not always recognized immediately. Renown and reward come most easily when accomplishments occur within the mainstream. Contrast van Gogh with Thomas Kinkade, the “Painter of Light.” It is estimated that one in twenty homes in the U.S. has a Kinkade, so it can be argued that his art defines the mainstream, at least in the U.S. He reportedly earned $53 million between 1997 and 2005 and received numerous awards.

The delay in appreciating the extraordinary explains why contemporaneous recognition of a paradigm shift, to use historian of science Thomas Kuhn’s phrase, is so rare. This delay causes revolutionary innovations to arrive not fortississimo, but as a crescendo. Paradigm shifts are most often recognized in real time when they arrive fortississimo and cause widespread, rapid, and overwhelming change. Gradual changes may be recognized as paradigm shifts only retrospectively.

With the rapid communication allowed by the internet, it is possible that innovations may spread faster, increasing the frequency of paradigm shifts, especially if scientists are as dispassionate, objective, and rational in processing information as computers. This possibility may never materialize because, as the twentieth-century physicist Max Planck wrote, “a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” The reaction to Harvey’s discovery of the circulation of blood varied by age, with the younger generation likelier to accept it. For a paradigm shift to occur within the span of a single lifetime, an innovation must be so compelling and undeniable it causes minds to change before they disappear by attrition.

There are non-scientific factors which make a paradigm shift less likely. As Edward S. Weiss noted in his review “Reflux, revolution, and the role of forgotten research in medical paradigms,” researchers outside the mainstream may “investigate older approaches, earning the scorn of their colleagues and collecting very little in the way of grant money and prestige.” Obviously, this decreases the chance of a paradigm shift. Weiss’ use of the word “older” is significant. Planck and Kuhn understood that conflicting data are routinely ignored by the mainstream. Thus, a theory which ultimately becomes the new mainstream is often based on forgotten, ignored, or neglected research. As a medical truism suggests, the only new data out there is the paper you haven’t read.

Because scientists aren’t dispassionate computing machines, Kuhn felt that science was an “arational” process that doesn’t have to converge with the truth. Scientific hubris has a powerful effect in retarding progress. Thus, paradigm shifts can be delayed indefinitely. Because the overwhelming bulk of scientific progress is made by incremental advances, not monumental leaps, experiencing a paradigm shift in real time may be like sighting a rare bird.

What does a paradigm shift look like when one finally occurs? A good illustration is given in the baseball movie “Moneyball.” In the movie, Billy Beane, the General Manager of the Oakland Athletics, uses statistics, not the instincts of scouts, to evaluate talent. Billy’s success threatened the mainstream, so he was scorned. At one point, the owner of the Boston Red Sox tells Billy, “The first guy through the wall always gets bloody. Always.” Beane initiated a paradigm shift, and analytics are now used throughout the big leagues.

Before the paradigm shift, the baseball establishment wasn’t known for forward thinking, which may have made change more difficult than it needed to be. Is paradigm change easier in the arts or academia? The musicologist K. Robert Schwarz described how academia maintained hegemony over the music world in the mid-twentieth century. At that time, academic composers favored an atonal, highly systematized method of composing called Serialism and dominated traditional, tonal composers, even though most audiences preferred the work of traditional composers.

Schwarz wrote, “‘Dominated’ is the key word. Many now maintain that the Serialists took over academia, insuring that their quasi-scientific method, which was ideal for the university, was the only one encouraged. As they gained prestige, the argument continues, they took control of grant-giving bodies, new-music ensembles and competitions. Everyone else was shut out, especially those reactionary tonal composers.”

Scorn for outsiders is seen in this excerpt from a 1979 textbook on composing, quoted by Schwarz: “While the tonal system, in an atrophied or vestigial form, is still used today in popular and commercial music, and even occasionally in the works of backward-looking serious composers, it is no longer employed by serious composers of the mainstream.”

Scorn from the mainstream was effective: “To be a tonal composer in the 60’s and 70’s was a deeply dispiriting experience. One was shunned as the last teen-aged virgin.” The Serialists’ dominance didn’t last, but their story shows that a select group devoted to an idea of dubious merit can temporarily hold sway by using scorn and withholding resources.

Are physicists, practitioners of perhaps the hardest of the hard sciences, above unscientific behavior? The story of Niels Bohr and the Copenhagen Interpretation of quantum mechanics, as told by journalist Peter Byrne in The Many Worlds of Hugh Everett III: Multiple Universes, Mutual Assured Destruction, and the Meltdown of a Nuclear Family (Oxford University Press, 2010), is informative.

“His [Bohr’s] favor could make a man’s career, and his disfavor could stifle it. As he grew older, he drew his personal circle of physicists ever more protectively about himself. His long-time assistant and collaborator, Leon Rosenfeld, in particular, became almost fanatical in his devotion to shielding Bohr…from attack by young Turks….”

According to the Copenhagen Interpretation, the act of measurement forces a quantum system to change from an indeterminate state to a definite one. Erwin Schrodinger, one of the major contributors to quantum mechanics, was so annoyed by this preposterous notion that he created the well-known feline thought experiment that came to be known as “Schrodinger’s cat.” In this scenario, a cat in a closed box is potentially exposed to poison gas, and whether the cat lives or dies is determined by a chance event. According to the Copenhagen Interpretation, the cat is simultaneously dead and alive until its state is determined, either by detection of the chance event or by direct observation. Although the thought experiment was intended to ridicule the Copenhagen Interpretation, adherents celebrate it because it shows how clever they are for not rejecting such an absurd conceit.

According to Byrne, “The founders and followers of the Copenhagen interpretation advocated their philosophy of physics…as the only feasible one. Attempts at basically different approaches, albeit by such prominent scientists as Einstein, [and] Schrodinger…were dismissed and ridiculed.”

The historian of science Mara Beller wrote in Quantum Dialogue: The Making of a Revolution (The University of Chicago Press, 1999): “The orthodox exaggerated the difficulties of the opposition stand while ignoring their own. As the opponents realized, not without some bitterness, their criticism of the Copenhagen interpretation was simply ‘brushed off’ with accusations of not ‘understanding Bohr.’ As in politics, so in science, the orthodox misrepresented, trivialized, and caricatured the opposition’s stand.”

One target of Bohr and his followers was Hugh Everett III, whose interpretation of quantum mechanics, the Many-Worlds Interpretation, holds that Schrodinger’s cat is alive in one universe and dead in a parallel universe, never both simultaneously. In the 1970’s, when the Many-Worlds Interpretation began gaining popularity, Rosenfeld progressed from scientific to ad hominem attacks on Everett, which Byrne described as a “mini-jihad.” Now that Bohr and his personality cult are gone, Everett’s interpretation is considered mainstream. This episode is another example of a select group devoted to an idea of dubious merit holding sway over an entire community.

Has Medicine learned not to resist paradigm shifts? This generation of physicians has seen at least one genuine paradigm shift, the recognition that peptic ulcer disease can be caused by infection with the bacterium Helicobacter pylori and treated with antibiotics. This disease was once so prevalent that the stereotypical patient was sometimes portrayed in the media: an irritable, hard-drinking male smoker who constantly complained of abdominal pain, took antacids, and worried that stress would make his ulcer bleed again.

In the 1980’s, patients with peptic ulcer disease were followed by surgeons. Medical students were taught surgical approaches to the disease and several different bland diets, each with its own proponents. The aim of these interventions was to decrease gastric acid production, which was thought to cause the disease. In 1986, a pathology lecturer at my school mentioned in passing that one faculty member thought that bacteria could cause peptic ulcer disease.

During a surgery rotation, I saw the frustrations of a crusty attending surgeon, his patient, and the patient’s family when the patient was admitted for a transfusion after his ulcer bled again. The patient and family insisted that he had followed the prescribed diet and were puzzled as to why it didn’t work. A diet that required so much devotion ought to be effective. After rounds, the surgeon told us the patient must have been non-compliant. On another occasion, a younger staff surgeon told me that no diet worked for everyone or all the time. Eventually, an operation might be necessary.

Because of antibiotic therapy, the rare patients who now require an operation usually have neglected disease. Patients with peptic ulcer disease are now followed by gastroenterologists, not surgeons. Endoscopy is used to visualize the stomach lining instead of an x-ray taken after the patient swallows barium. The stomach can be biopsied through an endoscope, which has given pathologists a larger role in managing patients with peptic ulcer disease. Practicing pathologists had to learn how to analyze a new type of specimen on the job, which is much more stressful than learning as a resident under the tutelage of faculty. The same goes for gastroenterologists. Fear of a learning curve may account for some of the resistance to a paradigm shift.

One of the most surprising developments that accompanied this paradigm shift was that an unusual type of stomach cancer, lymphoma, was treatable by eradicating Helicobacter with antibiotics. The surgical oncologist at my hospital couldn’t hide his skepticism when this was reported about three years into the paradigm shift. Now, it is also known that Helicobacter plays a role in the development of the most common type of gastric cancer, adenocarcinoma.

This paradigm shift decreased mortality, the need for surgery, and the number of office visits due to peptic ulcer disease. The shift affected multiple medical specialties and had an unexpected impact on other diseases. It is anticipated that antibiotic therapy will eventually eliminate Helicobacter from the United States.

Thanks to the paradigm shift, Medicine became tangibly more powerful and evolved even further from faith healing. A trip to the doctor’s offered real hope and required less faith. Physicians could offer more than a diet so unpleasant it had to be good for you and the assurance that they’d seen this a million times and knew what you could expect. But before any of this could happen, the physician had to overcome the fact that he’d practiced as if he’d known absolutely what was wrong and the best way to fix it. The longer he’d believed and practiced that falsehood, the harder it was to accept he was wrong. The fact that he’d embodied the falsehood in good faith comforted only those with a little humility. To those with only hubris, the paradigm shift presented an emotional and intellectual crisis.

Only 11 years elapsed between the first international presentation of the new hypothesis and the definitive statement from the U.S. National Institutes of Health in 1994 that ulcer patients with Helicobacter infections should be treated with antibiotics. This might suggest that the paradigm shift occurred as smoothly as such things can. In reality, the road to the paradigm shift was long and tortuous. It included many missed opportunities to start the paradigm shift, beginning in 1875. In 2002, Lawrence Altman, M.D., medical correspondent for The New York Times, said, “I’ve never seen the medical community more defensive or more critical of a story. They argued… with no evidence, this couldn’t be.”

The most notable abortive effort to initiate the shift was by John Lykoudis, M.D., a general practitioner who used antibiotics to cure approximately 30,000 patients with peptic ulcer disease in Greece in the 1950’s and 60’s. Lykoudis’ manuscript detailing his success was rejected by the Journal of the American Medical Association in 1966. He was also fined by the Disciplinary Committee of the Athens Medical Association for advertising and dispensing his unapproved proprietary drug.

In their brief biography of Lykoudis in Helicobacter Pioneers (Blackwell Publishing, 2002, edited by Barry Marshall, M.D., who shared the 2005 Nobel Prize with Robin Warren, M.D., “for their discovery of the bacterium Helicobacter pylori and its role in gastritis and peptic ulcer disease”), Basil Rigas and Efstathios D. Papavassiliou described his heroic effort, the envy and even hatred Lykoudis drew from lesser minds, the fierce opposition from the establishment, the almost suffocating sense of frustration he felt, and his plea to his wife to bring the proof to his gravesite when his theory was finally substantiated.

Not surprisingly, those authors concluded that the main reason Lykoudis wasn’t able to initiate a paradigm shift was that his theory ran contrary to the mainstream. Describing yet another abortive investigation, Dr. Altman wrote that the correct therapy would have cured “millions …of their ulcers and spared them from surgically parting with much of their stomachs.”

The only positive aspect of resistance to a paradigm shift is that it maximizes the chance that a mainstream paradigm is replaced only by a better one. The peptic ulcer disease saga demonstrates the very considerable downside. In that light, the profundity of this aphorism from composer and music theorist John Cage (1912–1992) is clear: “I can’t understand why people are frightened of new ideas. I’m frightened of the old ones.”



Gregory Sloop

Associate Professor of Pathology, Idaho College of Osteopathic Medicine. Always fighting the power. Thank you for reading my work.