“Are devils extinct?” my son asks one day.
“Devils don’t exist,” I tell him, intrigued by his use of the science-y word “extinct” to explain why he has never seen or heard anything in the news about devils. In his mind devils were real, maybe something like dinosaurs.
“Who made the first wolf?” he asks another time.
“No one made wolves,” I reply. “Wolves evolved along with coyotes and other wild dogs from a common ancestor.” My knowledge doesn’t go much farther back than that, so I try to explain the concepts of evolution and natural selection by going forward instead.
“We have a wolf living in our house: Lady,” I say, referring to our pet dog. “Lady is a cousin of the wolf.” Then I explain how dogs likely evolved from wolves.
“Perhaps some relatively fearless wolves scavenged food scraps around early human encampments. Then those wolves bred with each other to have babies with even less-fearful temperaments, and so on and so on over tens of thousands of years,” I suggest. He nods at the logic and raises an eyebrow at Lady’s new status.
Usually, my son’s questions come at me when we’re driving in the car, him alone in the back. In the rearview mirror I see him with his head cocked to the side, forehead to the plastic door frame, hypnotized by the rhythmic passing of the road below; with the external world moving by too fast, he retreats to the internal. Suddenly, his head will pop up and he’ll rattle off one of these koans. Often his queries have to do with the origins of things. “Who made the first house?” he once inquired. And, “I don’t remember being born,” he once said, in shock.
When I was very young, I had a firm hold on the origins of things. Each Sunday morning, I sat expectantly on the church pew in Patapsco, Maryland, with my sisters and cousins, presuming Jesus would one day throw open the doors of the vestibule, choosing our spottily attended Methodist house of worship in our tiny town—the Algonquian name of which literally translates to “backwater”—for his second coming.
Instead of God’s purported son tossing ajar the church doors, eight years of post-secondary education eased open my mind. The slow erosion of my religious underpinnings came to a head when I realized, as an undergraduate in English literature, that my suspension of disbelief was stronger reading John Milton’s Paradise Lost than reading the Bible. In the U.S., more education does correlate with less religion. According to the Pew Research Center, U.S. college graduates are less likely than those with less education to say they believe in God without question (55% vs. 66%), less likely to rate religion as “very important” in their lives (46% vs. 58%), and more likely to self-describe as atheists or agnostics (11% vs. 4%). I once believed God had all the answers. When I realized he didn’t (or he wasn’t), I threw my faith into science. But it would take twenty years, motherhood, and a global pandemic before I understood what science really was.
*
One morning in 2019, head bent over my phone to read the news, my eyes opened wide: more than 800 scientists had signed a letter in the journal Nature arguing that the concept of statistical significance be retired. I was floored. I thought about my favorite biology professor from my master’s program in Environmental Studies, from whom I had inherited a worn copy of Daly and Wilson’s Sex, Evolution, and Behavior, which my own poring over had further worn. As part of a fellowship, my cohort and I taught nature programs to middle schoolers at an environmental education center along with our professors. At lunch, the other graduate students and I would rabble with the faculty over the latest research findings in our field.
“But was it significant?” that biology professor would press us, his eyes and bald forehead shining, as we shared proclamations from interesting studies and passed food around the table. If we said no, he would shrug his shoulders, smirk, and respond, “Doesn’t matter then. The results have to be significant.”
The threshold for determining the significance of experimental results was developed in the 1920s by the British mathematical statistician R.A. Fisher. Fisher based this threshold on a quantity called the p-value and set it, in his own words, at an “arbitrary but convenient” level of .05. The lower the p-value, the less likely an observed difference—between an experimental group and a control group, for instance—could have occurred by chance, and thus the greater the statistical significance of the observed difference.
When a p-value lands above .05, the result is regarded as statistically insignificant. But according to that 2019 article on NPR’s Weekend Edition Sunday, “declaring a result to be statistically significant or not essentially forces complicated questions to be answered as true or false.” P-values shouldn’t be used in such an either/or way, to decide whether a researcher’s hypothesis is correct or not, to define what is “real.” When a p-value is above .05, this does not definitively prove that no relationship exists between the two variables. Yet, the NPR article notes, a study reported in the journal Nature revealed that of 791 scholarly, peer-reviewed articles, 51%—more than half—incorrectly interpreted a statistically insignificant result as proof of no effect.
“The whole idea of statistical significance,” one article notes, “makes science seem like it holds answers, rather than inklings. Most questions aren’t so black and white.” This information about the p-value hit me like a ton of bricks. Surely my former biology professor, who through the 1990s was the world’s premier expert on the sex lives of voles, who had mentored me through my thesis on sex differences among fifth graders in a visual-spatial problem-solving activity, knew the true, original intention of the p-value. Surely all scientists know it. Surely I had been taught it in my research methods class. So why did this article feel sacrilegious?
Like, I suspect, many college-educated Americans, I had mistakenly transferred all my expectations unmet by the fantastical stories told to me in a basement Sunday school room to the structured methods of science. I had incorrectly conceptualized these two, religion and science, as different avenues to the same destination—one wrong and one right; one foolish and one enlightened—but each delivering its believers to a similar place: a land of definitive answers.
*
I once believed I would become a poet; instead, I teach communication at a technical school. In one of my favorite lessons, I give my students three articles to read on the topic of research methods, one published in Money Magazine, one in Library Science, and one in PLOS ONE. For homework, I tell them to compare the content, purpose, authors, audience, word choice, tone, structure, and style. I assign them a few pages in the textbook that explain how to find credible sources and introduce the categories of popular magazines, trade publications, and scholarly, peer-reviewed academic journals. In class the next day, my students meet in small groups to compare their T-charts and Venn diagrams with each other.
“Which article comes from which type of publication?” I ultimately ask. “Which article would be the most credible, in-depth source?”
They always get it right, calling out, “The boring one!” like a Greek drama’s chorus.
I pass out piles of magazines and have the students group them into categories according to the publication types. I open a series of webpages that correlate with the publication types so they can transfer this knowledge of source types from print to the web. I talk in depth about the concept of peer review.
“Scholarly, peer-reviewed research,” I tell them semester after semester, year after year, holding up The Journal of Nursing with its pictureless, matte white cover, “is the closest thing we have to the truth.”
But lately, in particular after reading that 2019 article on the p-value, I can’t end my lesson without a concession paragraph, without giving them the whole story. The scientific method, I warn, does have limitations—beyond those noted in the articles themselves, beyond sources of research funding. There is an over-reliance on the concept of statistical significance, I say, and because of this, a whole host of other problems: journals may lean toward only publishing research that hits that .05 mark, and scientists, under pressure to publish for promotion, tenure, and research funding, may revise research methods or cherry-pick data to show whatever results are desired. Recalculating data in order to reach that .05 marker, I am careful to say, often includes acceptable research decisions about what will be compared or how to define a variable but makes it difficult to perform a foundational part of science: reproducing experimental results in follow-up studies. And the resulting reproducibility crisis, particularly in the biomedical and behavioral sciences, I conclude, waving The Journal of Nursing around for extra emphasis, has, in turn, lowered the public’s confidence in the entire scientific process.
The irony, of course, is that by giving them the whole story I, too, may be lowering the public’s confidence in the entire scientific process, something I decidedly don’t want to do. I don’t want to seem like I lack conviction in my own discipline, for conviction, after all, is sometimes the strongest and simplest teaching tool. My problem as a teacher has always been finding a middle ground between sincerely considering every idea floated before me and being too confident. I lean too much toward the former, and sure enough, as I ramble on, I see in my students’ eyes how the message gets muddled. Like me, they want definitive answers. What is it she wants us to take away today? they seem to be asking before they close their laptops and shuffle off to their next class, or to pick up the kids, or to work.
What am I saying? I ask myself. What have I given them? I meant to say that scholarly, peer-reviewed research is the gold standard for sources. That it’s all we can depend on. I meant to throw my students a rope, give them something to grasp in the sometimes turbulent waters of existence. But more often now I feel like we really have no lifeline, no ability to predict anything or exert any control over our lives here on Earth.
*
One day, my son clears all his Legos off the desk we have reserved for this purpose in our living room to set up my Tasco microscope, which my parents gave me as a child when I requested it for Christmas. We’ve gotten the microscope out a few times before but only for short periods and not for a year or more. Now that he’s seven, I think he might actually be able to use it. At first, he can’t remember what it’s called, so he takes me to the closet and points it out. I hand him the Styrofoam package with its taped-up plastic covering, which still contains most of the microscope’s accessories.
While I cook dinner, my son is quiet and busy. Afraid he will break the microscope, I want to interfere, to instruct him on how to focus it, but when I peek into the living room, I see him acting with care. The microscope doesn’t quite work as well as it should anyway. The lenses are intact, but the light beneath the stage no longer works due to corrosion from batteries left inside after the last time I played with it as a child. The light is located on a rotating disk which flips to reveal a circular mirror on the underside, so the microscope still functions if you set it up near a bright lamp or indirect sunlight, angle the mirror just right, and then are careful not to bump it.
My husband comes home and joins my son in the living room. I hear their quiet talk and see them go in and out of the house a few times, content in their work. After dinner, I go to see what my son has done with the microscope.
“I’ve created a science table,” he explains, as I survey the old desk in front of our large picture window. The microscope’s container and instruction booklet have been cast aside on the floor, but the table looks organized, so different from how it looks under the clutter of Legos that normally conceals its entire surface. The microscope is set up in front of his chair. He has taken the microscope’s accessories out of their Styrofoam beds—a probe, a scalpel, a glass stirring rod, an eyedropper, and a single test tube—and laid them in a parallel row near the scope. The small, tightly capped glass vials of gum media, used to stick coverslips to slides, and methylene blue and eosin, used to stain specimens for easier viewing, surround two delightfully small, plastic, square containers marked “cover slips” and “blank labels.” A cardboard-backed butterfly specimen from Southeast Asia, still in its plastic sleeve, mostly complete but with one of its wings tattered, has been placed nearby.
Then I see that my son and my husband have done something I could never do as a child. They have set up the microscope kit’s tiny brine shrimp hatchery, filled the first in a row of four plastic, one-inch-square compartments with water and sea salt and, if my son did it, probably the entire contents of the container of brine shrimp eggs. I could never begin this process as a child, though I wanted to, because I knew the end result was supposed to be killing the shrimp at various stages of their lives to make slides. And with no knowledge of a brine shrimp’s lifespan, allowing them to die a natural death seemed like it could be a long undertaking.
I smile. As an adult, I like raising things: pets, plants, students, children. I slide the hatchery to the back of the table where the warm sun shines for most of the day through the south-facing window. Maybe they will hatch, I think. Who knows how long a brine shrimp egg can lie dormant and still be viable?
*
Science can hold answers. After all, science has created telephones, electric lights, airplanes, and treatments and cures for many diseases. My husband is a case in point. A type 1 diabetic since age 16, had he been born 150 years ago, he would likely have lived for only months after the onset of his disease. But in the late 1800s researchers discovered that a lack of something they labeled “pancreatic substances” caused diabetes in dogs. In 1921, scientists extracted one of those pancreatic substances—insulin—from a healthy dog’s pancreas and injected it into a dog with diabetes, which kept the dog alive until the insulin ran out. In 1922, doctors successfully lowered the glucose levels of a 14-year-old boy dying of diabetes by injecting him with cow insulin. Then, in the late 20th century, researchers developed a more human-friendly form of insulin from E. coli bacteria, effectively returning the lifespan of type 1 diabetics to near normal. That’s a lot of science. By science I mean a lot of observations that led to a lot of testable hypotheses that led to a lot of facts—a lot of truth, if you will. And my husband is alive because of it.
Daniel Sarewitz, professor of Science and Society at Arizona State University, argues in “Saving Science” that when we apply science to finding a solution to a particular problem, rather than just to advancing knowledge, and that problem ties to simple subjects like technology or industry, the science has a good chance of being true. But when science deals with more complex subjects such as social, political, or economic issues, or living systems such as a cell, a tumor, a species, or a classroom of children, scientific findings become much more tenuous. Particularly when dealing with humans and multifaceted problems like the cause of Alzheimer’s or criminal behavior, it becomes much more difficult to produce scientific results that are indisputable facts, because there are so many non-isolatable variables and because much of the experimental design rests on assumptions to begin with. For instance, Sarewitz notes that research on causes and treatments for Alzheimer’s is performed on mice. Mice do not naturally develop dementia (the way dogs naturally develop diabetes), so the research is founded on the assumption that this artificially produced dementia in a mouse brain will correlate to a naturally progressing dementia in a human brain. And though mouse brains and human brains are alike in many ways, they are also different. Accordingly, as Sarewitz and others note, drugs developed to treat various neurological disorders like depression, schizophrenia, and Alzheimer’s often work well in mice but mostly fail miserably in humans.
I think about a photo I saw last Christmas on Facebook, posted by a relative whose husband has dementia. In the photo, three men sit side-by-side, one in a wheelchair, each with that vacant look in their eyes so common in dementia patients that comes from having their minds kidnapped from their bodies. “Look who came to visit!” proclaim the words beneath the photo. Behind the men stands a person dressed as the Grinch. The Grinch has leaned down and spread his green, furry arms across the shoulders of two of them, his head with its permanent, devilish grin bookended by their dead stares.
The photo makes me cringe. The Grinch himself could have kidnapped the men’s souls, according to this picture’s story. Maybe I’m reading too much into it, but I wonder who the Grinch was really for—the patients or the workers—and whether such a character might actually be frightening to someone who may not even be able to recognize their own children. In his article, Sarewitz interviews Susan Fitzpatrick, president of a foundation which funds research on cognition and the brain. Both believe that in its quest for gaining knowledge through inherently flawed mouse model research, neuroscience has failed at other things, like reducing suffering. “There’s not a lot of research on how best to socially, emotionally, environmentally support Alzheimer’s patients,” Fitzpatrick laments, “that might ameliorate their own anxiety, their own stress—maybe the disease, as horrible as it is, would be less horrible through a better care structure, but we do very little research on that.” I can’t stop staring at the Facebook photo. “Memory Care Café,” reads the wall benevolently behind the three men and the Grinch in the rather sinister-looking picture.
*
If experimental design in science often rests on assumptions, experimental results in science often rest on assumptions as well. There’s a lot about science that, by definition, isn’t true. Theories, of course, are an important part of science. In scientific contexts, the word “theory” means something different than it does in common speech, something more than just a guess or hunch. In science, a theory is an explanation about the natural world based on observation, facts, testable data, and deducible laws. A theory allows scientists to make predictions about what will happen. When observations and new data continue to support a theory, the theory becomes stronger. If predictions don’t come true or new data defies the theory’s laws, then the theory must be changed or thrown out. In science, no theory—no matter how strong—is ever considered invincible. In “But Is it Science?” Jim Baggott, a British popular science writer, notes: “The philosopher Karl Popper argued that what distinguishes a scientific theory from pseudoscience and pure metaphysics [a branch of philosophy] is the possibility that it might be falsified on exposure to empirical data. In other words, a theory is scientific if it has the potential to be proved wrong.” This is a major difference between those two opposing influences in my life: religion and science. Religion doesn’t ask to be proved wrong; it doesn’t predict its own evolution. And yet, every time I pass one of those “We Believe Science is Real” signs in a neighbor’s yard I wonder, do they understand what “real” means? Not true, or factual, or invincible, but more like “real until proven not”? Do they understand that science is no undefeatable god?
As an illustration of how theories aren’t true in and of themselves, Baggott gives the example of Newton’s laws of motion. Newton’s laws work well in the observable universe. But these laws don’t accurately describe how objects move at or near the speed of light or how they act in the microscopic world. Subatomic particles do not occupy a single space and maintain a single velocity at a specific point in time. Rather, their wave-like nature gives them a range of positions and a range of speeds, so for them we use a different set of laws: quantum mechanics. “And yet,” writes Baggott, “Newton’s laws remain perfectly satisfactory when applied to ‘everyday’ objects and situations, and physicists and engineers will happily make use of them. Curiously, although we know they’re ‘not true’, under certain practical circumstances they’re not false either. They’re ‘good enough.’”
Theories have been a foundational part of science since ancient Greece, when Aristotle developed a more inductive method of reasoning based on observation and empirical evidence (a forerunner of what we now know as the scientific method), in contrast to the more deductive method of reasoning based on thought and logic used by philosophers. But Baggott, who holds a doctorate in chemical physics from Oxford, labels the current era of science “post-empirical.” He calls this a perilous time in which “truth no longer matters” because science is based more on theory and logic than observation. Concerned about the way string theory and the theory of multiple universes are portrayed both in the media and in scientific papers, Baggott writes, “I, for one, prefer a science that is rational and based on evidence, a science that is concerned with theories and [my emphasis] empirical facts, a science that promotes the search for truth, no matter how transient or contingent. I prefer a science that does not readily admit theories so vague and slippery that empirical tests are either impossible or they mean absolutely nothing at all.” What makes it worse, argues Baggott, is how both scientists and the media dismiss some theories, like intelligent design, and promote others, like the theory of multiple universes, when neither is based on any data. Baggott by no means advocates for the theory of intelligent design (nor do I). Rather, he warns of a double standard in how scientists share theories with the public that could fuel a misunderstanding of what is science and what is philosophy.
Einstein, Baggott notes, had warned that the human desire to know might lead us to believe we could understand the world simply by thinking, without observation or data. “There’s a tension between the metaphysical content of the theory and the empirical data,” Baggott writes, “—a tension between the ideas and the facts—which prevents the metaphysics from getting completely out of hand. In this way, the metaphysics is tamed or ‘naturalised’, and we have something to work with. This is science.”
*
My son has a bit of trouble with the microscope. We look at several of the prepared slides that came with it. I try to choose slides of objects he’s encountered—paper, an ant, a frog muscle—rather than soybean glutamate or Oscillatoria princeps, but the specimens are so tiny and the magnification so great that the whole thing is anticlimactic; he doesn’t really understand what he is looking at or what the microscope is actually doing.
Though I played with my microscope regularly (it was a favorite toy), and despite using microscopes in high school, I didn’t properly learn how to use one until after I got my master’s degree. Having gone from a bachelor’s in English to a Master of Arts in Environmental Studies followed by a job teaching environmental education programs in a state forest, I felt I needed to beef up my science knowledge and fill in some basic gaps in my natural science education, so I enrolled in nine credits at the local community college—Biology I and II and Microbiology. In Microbiology I learned the proper steps to using a microscope: always start with the lowest power of magnification no matter how small the specimen or your end goal, use the coarse focus and then the fine, then switch to higher powers of magnification, and you will barely need to focus again. I had somehow missed these simple steps, but in that microbiology class the microscopic world finally opened up to me. I learned about gram-negative and gram-positive bacteria and could properly identify them after staining my slides due to the way their differing cell walls did or didn’t hold the color. I learned the “truth” about handwashing: how the best way to eliminate viruses and bacteria was not with some special antibacterial soap that attempts to kill them but with technique, by physically knocking the pathogens off of your hands under a running stream of soapy water (the soap makes the pathogens, chemically, less “sticky”). This was another example of science delivering clear truths, something my nursing students at the technical college are schooled and tested on, a lesson I remember sharing with a colleague on the eve of the coronavirus pandemic, the night things were suddenly going down, after which we would wake up and find all the schools in our state, from daycare centers to colleges, closed indefinitely.
My husband, who directs a residential environmental education center, solves the problem at the science table by bringing home from work a different kind of microscope, a stereo microscope, which my son and I instantly fall in love with. A stereo microscope is for magnifying objects you can already see with your naked eyes, at low powers of 10–50X. This makes all the difference, for all of us. We can finally “see” what we are seeing. And easily—there are two eyepieces instead of one, so my son can use both his eyes without having to train his brain to ignore the visual input of one of them to see what’s in the microscope’s field. Though the stereo microscope can hold slides, the base is also broad enough to hold large, three-dimensional objects. We look at rose quartz from the Black Hills of South Dakota, barbs on a bull thistle from our backyard, a dead ladybug from the windowsill. Our favorite is the underside of a chunk of Ganoderma applanatum, or artist’s fungus, a shelf-like fungus that grows on the sides of decaying trees. I had knocked one off during a hike with my son for him to etch with a small stick on the car ride home, which is how the fungus gets its common name. The decorated fungus had been sitting on our back porch, but now, under the stereo microscope, it reveals a new kind of artistry, a seemingly infinite pattern of perfectly round, tubular holes, about five per millimeter. We can barely imagine their ordered existence when we look at the fungus with our naked eyes, but the pattern the stereo microscope reveals doesn’t seem impossible either.
We go for a hike at Devil’s Lake State Park with a friend, and to keep my son occupied through the miles, I give him a Ziploc bag to fill with items to look at under the microscope. He selects lichens; the fruit of white baneberry, also known as doll’s-eyes; acorn caps; and ferns backed with spores. He moves through the woods, racking up the miles, zigzagging off the trail to fill his bag with all kinds of things. When my friend asks him what he is doing, he replies, “I’m finding science.”
*
Even when science stays entirely in the realm of empirical data, there can be problems. Michèle Nuijten, of the MetaResearch Center at Tilburg University in the Netherlands, which studies bias and error in science, developed a computer program called Statcheck to review statistical calculations. Statcheck looked at 50,000 psychology articles in 2015 and revealed that half of them contained a statistical error. According to an earlier, unrelated 2009 Stanford University study, which looked at scientific fraud, about 2% of scientists self-reported that they had at some point falsified data, so most of the errors Statcheck found were likely honest mathematical mistakes. All of this comes from Stephen Buranyi’s “The High-Tech War on Science Fraud,” where Buranyi reports that the true amount of scientific fraud occurring isn’t known; he believes it could be much higher. It’s not a coveted topic of study because, Buranyi writes, “The exposure of fraud directly threatens the special claim science has on truth, which relies on the belief that its methods are purely rational and objective.”
Though I clearly should be, I’m not as interested in scientific fraud (as Shakespeare taught, “the truth will out,” right?) as I am in the kind of fraud inherent in being human, the kind of tricks our own minds play on us, the cognitive biases we ourselves are beholden to even as we name these cognitive biases and warn ourselves of their existence. “Even an honest person,” notes science writer Regina Nuzzo, “is a master of self-deception.”
Although it isn’t one of the state-determined competencies of my English Composition I course, I teach an entire unit on one of these self-deceptions, the correlation vs. causation fallacy, because the fallacy is so common and I feel it’s important for my students, as critical thinkers, to be aware of it. We begin by discussing the statement, “This weather is making me sick,” one of my personal pet peeves. Viruses and bacteria make us sick, I chide my students, not the weather. We examine the headlines of popular media reporting on research, look at verb choices used to explain relationships—“associated with” or “tied to” vs. “makes,” “results in,” or “causes”—and analyze the argument in a documentary film on veganism which implies that eating animal products causes many diseases, from diabetes to cancer to rheumatoid arthritis to multiple sclerosis, and that becoming vegan can cure them.
In addition to confusing correlation with causation, humans also suffer from motivated reasoning: the tendency to find what we are primed to find. I am well aware of motivated reasoning as a regular hiker in a state with a strong hunting culture. One winter, I hiked with my son and dog in a state natural area near my house, one of my usual haunts. The 9-day gun deer season had ended. Because there are other, less popular hunting seasons until January—bow, antlerless, turkey—my son and I each wore a blaze orange hat, the dog wore a blaze orange collar, and I kept her on a blaze orange leash. We stopped to ice skate in our boots on a pond next to the trail, tying the dog to a tree, and I noticed a hunter on the opposite shore, standing just off the trail’s return loop. When we got back to the car he was there also. We exchanged hellos and he said, not unkindly, “Maybe wear a little more blaze orange next time.” He was sporting mostly camo, and a blaze orange vest.
Do I look like a deer? I wondered. But I wouldn’t argue with a man with a gun, nor would I argue with a man with the same evolved brain as me. According to a white paper from the United Kingdom’s University of Huddersfield, most deadly hunting mistakes are due to misidentification of the target caused by cognitive biases, primarily motivated reasoning. You are looking for deer, hoping to see a deer, so if you see or hear movement, your mind sees a deer. Hunting accidents such as these often occur at the hands of seasoned hunters, who even after the accident are 100% certain that what they saw and shot at was the animal they were looking for.
The same is true for researchers, except that the rustling in the bushes is a certain group of numbers jumping out from the page, as if they were more important than all the other groups of numbers around them. Chris Hartgerink, also of Tilburg University, as quoted in Philip Ball’s article “The Trouble with Scientists,” admits, “I was aware of biases in humans at large, but when I first ‘learned’ that they also apply to scientists, I was somewhat amazed, even though it is so obvious.” In scientific research, motivated reasoning, also known as confirmation bias, manifests in a myriad of ways—in the way we design studies that only look for evidence that confirms a hypothesis and ignore evidence that may refute it (which is, ironically, the opposite of how science is supposed to work); in the way we interpret data; and in the way we may find errors in data analysis that dispute our findings but overlook errors in data analysis that confirm them.
Like the devaluation of the p-value, this revelation of the role of cognitive bias in data analysis plagues me. But I also find it soothing. I now realize, when I get an email from my son’s school (the third in a week), this one saying he kicked the chair out from under a classmate who had stood up, causing him to fall when he tried to sit down, maybe I shouldn’t rush to EBSCO to find the latest journal articles on impulsive behavior: its causes, how best to treat it, and life outcomes for impulsive children. Maybe this text from my mother is enough: “I did the exact same thing in 1st grade to Jeffrey Myers,” she types. “He hurt his back and cried, and I cried when I was told not to do that. It has haunted me my whole life. Jeffrey was fine. Don’t despair! I turned out alright.” Her text, which comes through as I am sitting in the parking lot waiting to pick my son up from school and wondering how to address his behavior, surprises me. My loving and moral mother kicked the chair out from under someone? Why, at 46, am I still astonished that my parents have been through almost everything? Her words make me cry—in a good way. Maybe much of the time, to get us through life, all we need are the kind words of a wise elder.
*
Science found us, it seems, in 2020. I ponder this while surveying a shelf of unused bottles of bleach in my basement, thinking about how quickly formerly unfamiliar words, like “fomite,” leapt into daily usage during the coronavirus pandemic. I remember a video, shared by a friend on social media early in the pandemic, of a doctor (whose credentials I verified, of course) using sterile technique to put away groceries. I tried this once on the patio outside my house, the lids of my garbage can and recycling bin serving as counters while I pulled bags of cornflakes from their boxes and wiped down bananas. But tomatoes rolled onto the sidewalk, and I tripped over milk jugs and orange juice containers. I used my chin to hold items against my chest as I removed them from their potentially dangerous outer packages. It seemed I was introducing more germs than I was eliminating. I just couldn’t do it: learn from a single video, even in a time of crisis, this thing called sterile technique, something my health care students at the technical college practiced and were tested on over and over again through various units and semesters of a two-year program. After research revealed the virus spread more through aerosols than fomites, the doctor made additional videos which stopped short of retracting his original public service announcement (in fact, he made a few more videos about how to handle packages and mail) and instead focused on reducing fear, urging people to stay home under lockdowns, and encouraging social distancing.
But who am I to judge? For a while, at the very beginning of the pandemic, based on another suggestion I’d seen in the news, I packed a large Tupperware with nonperishable food items and hid it in a closet—jars of spaghetti sauce and boxes of rice and noodles. I’m not sure why—whether for convenience, should the virus make both my husband and me too sick for too long at the same time to even order and pick up groceries, or fear of pandemonium, the virus causing looting in the streets and no groceries to be had for anyone.
“What’s going on with the CDC right now?” one of my students raised her hand and asked during a lecture on information literacy in my English Comp. I class that fall of 2020. Her question elated but also worried me. It meant she understood the CDC as a reliable source for scientific knowledge; it also meant that we were living in a time when science was unfolding right before our eyes (not to mention in a country where politics was interfering with science). We had come to depend on scientists in our daily lives in a more overt way than we ever imagined when SARS-CoV-2 began circulating around the globe. We watched as science made purported discoveries and then left them behind following new data and new contexts: we shouldn’t wear masks, then we should wear masks; the virus spreads on surfaces, no—the virus spreads in air; we should stay six feet apart, actually three feet apart will suffice; fully vaccinated people didn’t need masks, then fully vaccinated people should still wear masks; a cloth mask worked, but suddenly, after Omicron, a cloth mask was useless; full vaccination protected you from illness but then it didn’t, unless we changed the definition of full vaccination to include a booster shot. We wanted definitive answers, and we wanted them now. But science takes time.
There was also—and still is—some clear pseudoscience that rose to the forefront during the pandemic. I think back on a lesson I developed in 2014, during the Ebola crisis in Western Africa, based on an article I’d read in The Atlantic called “How to Write a Hit Song about Ebola”—a lesson that brims with irony now. First, we’d examine a Unicef poster with a short list of do’s and don’ts and some images showing how not to get infected. Then, I would play “Ebola in Town” by Liberian artists Shadow and D-12 while the students followed along with the lyrics. The room would bubble with nervous giggles at the song’s upbeat hip-co rhythm and repeated warnings: “Don’t touch your friend! If you like the monkey, don’t eat the meat! If you like the baboon, I said don’t eat the meat! If you like the bat-o, don’t eat the meat!” When I asked my students why they were laughing they’d say they thought the song was a joke: it had a happy tune, it didn’t fit with the serious message, it made you feel like dancing, and the repetition annoyed them. Then we’d talk about the interrelatedness of structure, audience, and purpose, how this song was the best way to convince a mostly illiterate population, who after ten years of civil war largely believed Ebola to be a government conspiracy, that it was, in fact, real—and deadly. Who better than a popular musician, someone with street cred, to deliver this message (Shadow and D-12 fled Liberia as youths and grew up in a refugee camp in Ghana, returning to their home country later to make music), and what better format than a hip-co song to reach a population whose largest age group is under 15, and for whom the main media source is radio?
At the time, my students and I struggled to understand why the Unicef poster might result in disbelief and fear rather than a call to action. But just a couple years later, in 2022, people in the U.S. unvaccinated against Covid-19 were dying from the Delta variant at 11 times the rate of those vaccinated, according to an article in The BMJ. And as reported in a U.K. study of 528,000 cases, your chances of being hospitalized with the Omicron variant were 8 times higher if you were unvaccinated. Yet in my county, Portage County, Wisconsin, as of May 2023, only 57% of the eligible population has completed the Covid-19 vaccination series. I think about this as I help a nursing student with her research paper on why vaccines should not be mandated, as I muse over a Facebook post from an old high school field hockey teammate linking to ridiculous sources (links I won’t click on) about how hospitals cause death.
Eighty-one percent of U.S. citizens are literate, 87% live above the poverty line, and 77% have access to the internet. The Civil War ended 160 years ago, but are we really all that different from our friends in Liberia in our tendencies to fear institutions, to fall prey to conspiracy theories, to inaccurately weigh statistical risks and benefits? We have the same need for empowerment, for “big” explanations in frightening times—reasons for belief in conspiracy theories, according to an April 2021 PBS Newshour report—as any human on the planet.
Even I hemmed and hawed about getting my first booster shot back in September 2021, pre-Omicron, though after some research I consented within my first week of eligibility. What’s more, up until 2020 I rarely vaccinated my son against the flu, mainly because I didn’t think it was necessary. Then one night, after recovering from influenza B, he woke me up to take him to the bathroom, saying, “Mommy, my legs are broken.” He couldn’t walk. Turns out he had benign acute childhood myositis, a rare complication of the flu that is thankfully usually not life-threatening; his cleared up on its own after three days in the hospital. Despite his newfound fear of needles, due to the numerous times the nurses had to poke him to get an IV into his flu-dehydrated veins, he and I both now get the flu vaccine every year. So, there are some reasons for not getting vaccinated that I can entertain: a lack of concern for the severity of the virus, fear of the vaccine’s newness, or worry that pharmaceutical companies are taking advantage of us for money. Pseudoscience, of course, is not one of them.
But how can we expect average citizens to be immune from pseudoscience when even scientists and philosophers can’t agree on what differentiates it from real science? According to one article in the Journal for General Philosophy of Science, “the problem of demarcating science from nonscience remains unsolved.” From Aristotle to Karl Popper to Thomas Kuhn, philosophers of science have been trying to definitively state what is and what is not science for over 2,000 years. The article’s author, Damian Fernandez-Beanato of the University of Bristol, at one point notes: “Because science is a historically and culturally situated activity, the degree of pseudoscientificity of a given discipline, theory, practice, etc. can vary socially and historically because the epistemic circumstances—from which we take our reasons for believing—alter as we (appear to) learn more, or forget what we have (appeared to) learn, or as we justifiably believe differently.” In other words, science doesn’t exist in a vacuum. Science exists in our minds, which exist at a cultural point in time as influential as Earth’s own atmosphere is for the planet.
I didn’t know how to answer my student’s question in the fall of 2020 about the CDC. But let what I did in another class in a later semester serve as an answer now. In Creative Writing, a new course added to our technical college curriculum as an elective for transfer students, I was introducing my students to poetry, to how it should be heard aloud, to how it should be translated orally from the page, to how some define poetry and differentiate it from prose by its very line breaks. I explained how enjambed lines, unlike end-stopped lines, suggest ambiguities in meaning, open up or close off various interpretations and experiences of a poem.
How could I read a poem with a mask on? I wondered. By the fall of 2021, my state had long since dropped any indoor mask mandates and my institution had followed suit, only recommending them. I continued to wear my mask religiously, though. How could I not, an English teacher who taught about source credibility, when the CDC still recommended masking indoors if you lived in a community with high transmission (we did) even for those vaccinated?
I thought about all that science had taught us about masking. The CDC never got any of the facts wrong, but perhaps, as they have gradually come to recognize, they need to work on their messaging. I would also argue we non-scientists need to work on our “listening,” on what we expect to hear when science is conveyed to us: not definitive answers, but suggestions and probabilities, and data applied in a context. I thought about weighing risks and benefits. I thought about appearances and politics. I thought about my education in both the humanities and the sciences, about how at the end of one of my most successful and informative science classes—that microbiology class I took at the community college—the professor delighted me on the last day by giving all thirty of his students a copy of Upton Sinclair’s The Jungle, a work of fiction I devoured through the perspective of both my love of literature’s exploration of the human condition and my newfound scientific understanding of pathogens.
So I took my mask off. I took my mask off for the first time in two years of teaching to properly read Laura Kasischke’s “Bike Ride with Older Boys” and Mark Strand’s “Eating Poetry” and Carl Phillips’s “Somewhere Holy.”
“I’m vaccinated and boosted,” I told my seven students as I removed one ear loop and let the mask dangle from my other ear. They sat spaced throughout the room, half masked, half not. “There’s still risk,” I continued, “but I’m taking my mask off to read these poems aloud to you, because poetry is worth it.”
And they smiled.
*
Another day back in 2021: I am picking my son up from school. We’ve received another email that one of his classmates has tested positive for the virus. My son wasn’t considered a close contact, so he doesn’t have to quarantine. I ask him who had to go home, and he mumbles an answer.
“Take off your mask, so I can hear you,” I say. He’s so used to wearing it, he’ll sometimes wear it for the whole drive. He repeats the names, and I note that he won’t have to quarantine according to the CDC’s updated guidelines because he’s vaccinated.
“How does the vaccine work?” my son asks.
I think for a moment. “Well, they inject something into your body that looks like the coronavirus, so when your immune system sees it, it creates special soldiers just to fight that particular virus. Then, if you become infected with the actual virus, you already have soldiers ready to fight it.”
“Cool!” he says, in a tone I don’t often get, one that indicates he understands and is genuinely impressed. The war metaphor, the mistaken identity—he gets these literary elements. Despite his fear of needles, he was very excited to receive his two doses. “I’m fully vaccinated now,” he tells friends and family members, holding out his hands palms forward, as if pushing back the virus, for emphasis.
And then came Omicron.
So now, just as with my students, I feel compelled to give him the whole story. “Science is cool. It can tell us a lot, and with science we can accomplish some pretty cool things,” I agree, and then I force myself to say it: “But in science, there is also a lot of uncertainty. There is a new variant of the virus right now,” I continue. “One that has a lot of mutations that make it look different, so even the soldiers in a vaccinated person don’t always recognize it . . .”
“But I just got my vaccine,” he argues. “So, my soldiers probably will recognize it. . .”
That’s enough science for today, I decide. “Yes,” I reply, looking in the rearview mirror, where he sits in the back, stimulated from a day of school and play with friends, ready to go home and relax. “Your soldiers probably will.”
*
We can fix some of the problems cognitive bias causes in empirical research. Philip Ball suggests we develop a “database where scientists can, elementary school style, type in their hypothesis before they begin collecting data.” Hartgerink seconds this process, according to Stephen Buranyi, noting that scientists could also “have their results checked by algorithms during and after the publishing process.” We can “blind” data analysis, which means changing the data in systematic ways so the researchers can perform the analysis without knowing, for example, which set is the experimental group and which is the control. We can commit to publishing all results, not just those that are significant. And we can still use the p-value, but as one small part of a broader way of analyzing data, says Ron Wasserstein, the director of the American Statistical Association. Even peer review could be made more transparent and accountable by opening up the process. This means eliminating anonymity between authors and reviewers, allowing the wider public to peer review articles before publication, and including summaries of the reviews and the reviewers’ names along with the published articles. Together, all of these practices can help keep science “honest.”
Some scientific disciplines and scholarly journals are already attacking the problem of cognitive bias and its effect on how we interpret data. In the field of pharmacology, all clinical trials must be pre-registered. The Open Science Framework (OSF), founded in 2013, provides a tool for scientists to publicly and openly manage the entire research process, as well as to collaboratively attempt to replicate the empirical results of previous studies. And though closed peer review is still the norm, some journals and presses are experimenting with or have switched to open review, such as BMC, BMJ, Copernicus, eLife, EMBO Press, F1000Research, Nature Communications, Royal Society Open Science, PeerJ, and PLOS ONE.
Christie Aschwanden captures the current situation of science best in “Science Isn’t Broken,” when she quotes Brian Nosek, founder, along with Jeffrey Spies, of the OSF. “Science is hard,” Nosek says. “Science is fucking hard,” Aschwanden amends.
Perhaps what we really need to fix is not science but our own view of it. Yes, science is uncertain, but when we swallow that, ironically, it might allow us to bring in a little of the old philosophy that seems so much a part of science and so at home in the older scientists I read, such as Edwin Way Teale or Lewis Thomas. Theirs was not a metaphysics presented as science (such as intelligent design), or a pseudoscience (I refuse to give any examples of these), but a scientist’s personal commentary, a sort of personal essay interwoven with the empirical research, a record of how scientific discoveries made the researcher feel. I first became truly passionate about science through the 1971 Audubon Nature Encyclopedia. In the encyclopedia’s entry on ponds, zoologist Ann Haven Morgan writes, rather poetically: “Some ponds treat the eyes to rare beauty, others seem to be only reaches of mucky water. But any one of them can excite our curiosity, for all ponds have the charm of secrecy.” Right there, in an encyclopedia, in a work that is informative by definition, is an admission of and a great appreciation for the unknown, the “charm of secrecy,” as if nature itself were pushing back against our knowing everything.
Perhaps also, we need to reunite the sciences and the humanities, or at least have them turn and face one another again. Massimo Pigliucci, of the Departments of Ecology and Evolution and of Philosophy at Stony Brook University, argues that science doesn’t, but should, hold philosophy in higher regard, giving examples of how each has informed the other in areas such as quantum mechanics, evolutionary biology, cognitive science, and moral philosophy. Sebastian De Haro, in “Science and Philosophy: A Love-Hate Relationship,” agrees, maintaining that when scientists are on the cusp of a new paradigm, or theory-creation, they struggle with the same “why-questions” as do philosophers. Both authors write about the need for interdisciplinary studies, requiring philosophy students to take courses in science and science students to take courses in philosophy. Lewis Thomas, who was both doctor and poet, advised in “A Trip Abroad”: “We must rely on scientists to help us find the way through the near distance, but for the longer stretch of the future we are dependent on the poets. A poet is, after all, a sort of scientist, but engaged in a qualitative science in which nothing is measurable. He lives with data that cannot be numbered, and his experiments can be done only once.”
*
At one point in the fall of 2021, I complained to my mother that for the last two years I could only read news—in the morning, at lunch, after dinner, before I went to bed. I felt it was a real problem. “I just need to get away from my phone and my iPad,” I lamented.
“It’s okay,” she said. “Don’t you remember how I would listen to the news on the radio in the morning and watch it at lunch at noon, and how your father and I would watch it together after dinner and then read the newspaper and then he would stay up and watch it again after we all went to bed? It’s nothing new. Enjoy your news,” she said.
Maybe it wasn’t my phone or the warp-speed news cycle of modern times or the doom-and-gloom headlines of the pandemic; maybe it was in my genes (there’s some pseudoscience for you!). I relaxed a little at my mother’s suggestion, but the very next Friday after my classes ended, I went to the public library while my son was still in school, alone there for the first time probably since his birth. I laughed when I saw my favorite section—the 920s, biography and memoir, which gets a room of its own—closed off for cleaning. That made it easier to do what I had come to do: browse the fiction like I used to. I selected a few books, judging them only by their covers, Francine Prose’s Blue Angel and Frederick Reiken’s The Odd Sea, and over the course of the next few weeks I became totally immersed in and satisfied by the made-up lives of the made-up characters with their made-up conflicts in their made-up settings.
*
A decade ago, in my mid-thirties, I wrote about putting the idea of a creator to rest, though I had some difficulty giving up the idea of an afterlife. Now, as a mother, I see the need for a creator, because a child isn’t thinking about the end of life, but where he or she came from.
My husband and I said we would raise our son with both religion and an open mind, but in the end, I think, we’ve failed at both. As an adult, I feel grateful for having my Methodist background, that bottom to push against, to push off from to get elsewhere—even away from that background itself. What have I given my son but an empty well, a bottomless pit? And how will that affect his life’s trajectory?
One day, somewhere, he hears the phrase “the Holy Spirit” and asks me what it means. While he understands the concept and stories behind the Father and the Son, this third incarnation of God is a bit difficult for him to grasp. I tell him the Holy Spirit is something you feel within you, when you are at peace with the world.
“I feel it most when I’m in nature,” I explain.
“I feel it most,” my son replies, “when I’m watching TV.”
I laugh at this one, but sometimes, my not bringing him up with religion works decidedly against me.
“Cleanliness is next to Godliness,” I say one day as we work together to tidy his room.
“I don’t believe in God,” he replies.
Again, I chuckle at his shrewdness. But the way he says it—I don’t believe in God—the same way he might say, “We don’t eat at McDonald’s,” bothers me. So matter-of-fact. God might not be real, but he isn’t McDonald’s either. Advertising your conviction against something seems as proselytizing as advertising your conviction for it. This is never what I meant to do.
One summer day, we are eating lunch on the porch, and I look out over the soybean field. Wind blows the plants in waves and the hot air above them ripples, mirage-like. We’ve just returned from a trip back home to Maryland to visit my parents and sisters, a trip that always includes a few days jaunt to the coast where we swim in the Atlantic Ocean.
“Wouldn’t it be neat if instead of a soybean field this were an ocean?” I muse, feeling a bit homesick. “Wouldn’t that be beautiful?”
“Dinosaurs used to roam here,” my son responds.
“That’s right!” I say. “Isn’t that hard to believe?”
“Where did all this stuff come from?” he asks, looking around, palms up. The essential question, at such a young age.
“We don’t really know,” I tell him, thinking, how can we not? How can we be here and not know why or how? Dammit, what the hell is all this stuff?
None of it makes sense, perhaps because we’re this wonderful conglomeration of matter that must see the world as a story, something with a clear beginning, middle, and end. So much so that we even like our science that way—a science that is neat rather than messy, that goes full circle, confirming its own beliefs, drawing conclusions that bring us back to the beginning the way a good narrative should. I think about which story to tell him now. Normally I would start with the big bang, that grand explosion, the coalescing of matter into stars and planets, galaxies and solar systems, the first single-celled organism here on Earth forming in a deep-sea vent, then some fabulous amphibious thing creeping out of the ocean onto a landscape covered with mosses and liverworts. I’d tout the amazing diversity of life on Earth, the various intelligences of animals. The way a brine shrimp egg, in the right conditions, can still hatch (though ours never did) after lying dormant for up to 25 years. I’d remind him that the Webb telescope is up there right now, its sunshield and mirrors unfurled, looking 13.5 billion years into our history, almost to our very beginning, at the first stars to ever shine in the cosmos. The Big Bang Theory, the Theory of Stellar Evolution, the Theory of Biological Evolution—all are supported by key empirical evidence, and all are still missing some. I realize it’s not what I say, what I tell—whether scientific theory or religious story—but how I deliver the message to my son that’s important. There’s a great mystery in both that I must not skip over, whether philosopher or scientist. There’s an unknowableness I must honor, an uncertainty I must embrace rather than avoid. I must stand dumb in the awe of the universe. I must love that dumbness.
The ocean is fresh in my mind from our trip, but so is Patapsco Methodist Church, where my mother is in her fourth decade as the organist, where my husband and son and I always attend the service on Sundays when we visit, the only church my son has regularly been in, which means only once or twice a year.
His question still hangs in the air.
“We don’t really know where all this stuff came from,” I finally say, taking a deep breath. “But there are lots of different stories. Here’s one.” And I tell it as I remember it: the seven days, the beautiful garden, the tree of knowledge of good and evil, the one rule, the snake, the lie, the nakedness, the banishment. And I can see now by the way he listens how it’s a good story, and how it doesn’t need, how it never needed to be true, and how the only truth may be this: that I never needed to choose, that the best thing I can give my son in the world today, from both religion and science, isn’t facts but uncertainty, because uncertainty allows for something that facts sometimes don’t—like hope.