The questions that kids ask about science aren’t always easy to answer. Sometimes, their little brains can lead to big places that adults forget to explore. That is what inspired our series Science Question From A Toddler, which uses kids’ curiosity as a jumping-off point to investigate the scientific wonders that adults don’t even think to ask about. The answers are for adults, but they wouldn’t be possible without the wonder that only a child can bring. I want the toddlers in your life to be a part of it! Send me their science questions, and they may serve as the inspiration for a column. And now, our toddler …
“Do scientists die? Does that mean other scientists will dig them up someday?” — Althea, age 4.1
Scientists: They’re just like us. And not only does that mean they will, eventually, shuffle off this mortal coil, it also means that they (like dinosaurs, ancient civilizations and modern living humans) have all kinds of deeply interesting behaviors we don’t totally understand yet. Scientists have opinions that influence the experiments they design and the questions they ask. Scientists don’t all think about science the same way. Scientists have cultures and subcultures that interact and conflict.
Which is to say, scientists make excellent research subjects for other scientists — even before they’ve had a chance to die, get buried and fossilize into something a preschooler might dig up in a dinosaur-themed sandbox. And, unlike digging up the thunder lizards of old, studying living scientists has big implications for modern politics. How do we know we can trust expertise? When should we question data, and when should we use it as a path to truth? When we live in a knowledge economy, it’s pretty important to know how the ol’ knowledge sausage gets made.
Studying living scientists is more than just a meta exercise in navel-gazing. That’s because scientific evidence is a pretty important part of how we solve problems and make decisions, said Harry Collins, professor of social sciences at Cardiff University in the United Kingdom. Every time we choose to believe science — to decide that vaccines are safer than communicable diseases, for instance, or that it matters that 97 percent of climate scientists agree that climate change is real — we’re deciding to trust a particular group of people and a particular culture. Shouldn’t we be as interested in the culture that we trust to define truth as we are in, say, apatosaurus femurs?
But we haven’t always been, Collins told me. For most of the 20th century, he said, the idea that science was the best way to arrive at truth was taken as a given. Historians studied how science had developed, but nobody was really trying to understand how science worked in the modern world or why we should trust it more than other institutions.
It was particularly ironic considering science’s growing global importance during that time, said Karin Knorr Cetina, professor of sociology at the University of Chicago. The idea of sending sociologists to study the scientists who were studying the big questions of society grew out of that increased reliance on science as the purveyor of truth. For instance, some of the first research on scientists and the way they work came out of the process of setting up national health-care systems in Europe, Knorr Cetina said. With so much at stake, everyone wanted to know how the scientists were getting answers — not just what their answers were. “If we base political decisions on scientific evidence, it’s important to understand: What are these scientists actually doing? Are they doing what they’re supposed to do? … What sort of autonomy should scientists have?” Knorr Cetina said.
Both Collins and Knorr Cetina have been studying scientists since the 1970s. Collins focused on physicists searching for evidence of gravitational waves, while Knorr Cetina’s early work compared the cultures of molecular biology and particle physics. She has since moved on to studying the use of science in economics. Over the years, they’ve both interviewed scientists, attended conferences and spent hundreds of hours hanging out in laboratories — observing scientists in their natural settings like Jane Goodall taking notes on a troop of particularly well-educated chimps.
As a result, they’ve watched and documented behaviors that we’d otherwise have no way to see. For example, in 2014, Collins published a paper about how scientists decide what to do with edgy ideas. What makes the difference between a groundbreaking researcher and a crank?
To study this, Collins focused on a scientific research paper that had been sent to him by its author. The paper claimed there were big flaws in the tools that physicists were using to search for evidence that gravitational waves were real. If it were right, that would mean billions of dollars had been wasted on a program that could never work. Collins sent the paper — along with a survey — to 12 experts in gravitational wave physics to find out whether they would dismiss the paper or investigate its claims further.
Of the 10 experts who responded to the survey, all dismissed the paper. But the survey also showed that the decision to ignore it rested on a complex series of choices and metrics — not just what the paper said, but how it said it, where it had been published and what else the author was working on. For instance, some respondents noted that the paper had come out in a journal that specialized in unorthodox research that other journals had rejected. Others told Collins that they were concerned about how frequently the author cited himself and that, from what they could tell, his references to other people’s work were mostly papers published by the same unorthodox journal. And one respondent said the author’s other work showed a particular interest in disproving Einstein’s theory of relativity.
None of that necessarily meant the author was wrong, but it was all evidence that, taken together, made wrongness more likely. (He was wrong, though. The tools critiqued in the paper would find evidence of gravitational waves about a year later.)
The research Collins did isn’t trivial — especially at a time when a proposed presidential committee on climate science is fueled by the idea that scientific gatekeepers are preventing unorthodox knowledge from getting a fair hearing. Knowing how experts decide what is and isn’t likely to be true matters.
It’s also important to know that the way science works is seldom as straightforward and tidy as we assume. Just as experts use signposts to decide whether an unorthodox idea is more or less likely to be right, they also look at their own ideas through the lens of probability. “It’s a mistake if the public thinks you can just go ask a scientist” and solve problems easily, Knorr Cetina said. In her research on biologists, physicists and economists, she has seen that it takes a very long time — decades — to get solid answers to simple questions. And everything comes in what she calls a “bracket of uncertainty.”
But, at the same time, both Knorr Cetina and Collins said their work studying scientists has made it clearer to them that science is worth trusting. To Collins, it comes down to how much the profession values finding truth. The bracket of uncertainty still exists, but after years of watching scientists argue with one another, he came to the conclusion that the professional drive to find truth usually wins out over other motivations like money or fame. “With scientists, you can be sure that whether they reach the truth or not, they’re trying harder than anybody else, and that’s where you should place your bets,” he said.