The idea of the ivory tower—conjuring an image of the eccentric scholar, cloistered amongst stacks of books and papers, uninterested in the real world and seeking only to add to a parochial branch of human knowledge—is a well-worn cliché. It might not even be all that true; after all, one of the earliest modern references to that staid term comes from Henry James's unfinished novel The Ivory Tower, a story about excessive wealth. And for all of academia's flaws, overt wealth is no longer one of them.
Still, academics can’t escape this stereotype. It’s made us easy targets as our political discourse becomes poisonous and polarizing, giving the loudest voices—brandishing their blunt-force talking points—a convenient way to dismiss us. And as academics and experts have become marginalized, debate itself has changed: it’s less a broad and careful balancing of ideas and more about theatrically pitting one extreme against the other, or giving thoroughly discredited ideas the opportunity to have a public hearing, just as Toronto’s Munk Debates—featuring Steve Bannon and David Frum—did in November. These public, hyped-up debates are a terrible way to sort through difficult issues, on social media and off, on campus and off; they are merely about scoring points for being charming, quick on one’s feet, and delivering the best zingers.
That’s not how knowledge works. Academics must take a lead role in informing people how it’s actually created and disseminated—and in so doing, take back the torch that has been wrested from them.
Academics don’t participate in panels at conferences in order to make our fellow panelists look like fools; we present our newest ideas for consideration and discussion, as we hope to learn from others and help their work take shape. Likewise, our publications don’t waste time refuting baseless and noxious ideas such as Holocaust-denial or flat-earth theories; we stand on a foundation of the most credible and careful research of our predecessors and contemporaries in order to make our own contributions to knowledge.
Rather than trying to get a leg up on opponents in a screaming match, the best research aims to participate in an ongoing conversation. Instead of delivering the final word on a subject, or demonstrating that all previous work is junk, academics encourage the field to think about things in new ways by considering new evidence and approaches. Paradigm shifts rarely happen.
But such abrupt and radical changes to our understanding of a particular field are often the only things that enter the public’s view, through the media or otherwise. Because of that, our culture is now primed to privilege sensational news over the inherently slow and incremental work of most scholarship, which involves a peer-review system that relies on volunteer reviewers and good-faith submissions.
Exacerbating this problem is that the aura of expertise is being granted to academics who are outside their lane. For instance, for all his success as a linguist and cognitive scientist at Harvard, Steven Pinker knows very little about the Enlightenment, and it shows. Likewise, despite his background as a clinical psychologist, Jordan Peterson has become popular on the strength of often erroneous sermons about postmodernism. Of course, everyone, including academics, is free to opine on any topic they wish—my own academic expertise is much narrower than the range of fields on which I write in public, and it’s on me and my editors to be upfront about that. But more scholars need to take such figures to task for besmirching the already shrinking public belief in expertise, and more media outlets should take those academic challengers seriously, so that the public knows not to take alluring ravings as fact.
It’s easy to be sweeping in one’s generalizations and sensational in one’s findings, but the care that defines academia is hugely important. The recent scandal over an elaborately forged manuscript that claimed that Jesus had a wife is a reminder that due diligence is essential: no matter how important a potential discovery is, or how much we want to believe it, we’d better make sure we have our facts straight and have covered our bases before sensationalizing it. Still, with facts at a premium, and loud and easily digestible speculation increasingly passing for a truth that people can believe, academic research’s laggard pace and failure to fascinate have affected how the public views us.
But that’s something we can control. We can stand up and say something.
It might not be academics’ instinct to get embroiled in such battles, but it’s necessary. Outlets like The Conversation and Eidolon have worked to corral social media’s wild west into a place that values actual studied expertise. My own field, for instance, is now used to bolster nativist and chauvinistic views of Western civilization, in which the “West is best,” with the Greeks and Romans championed as heroes to be emulated rather than simply peoples to be studied. When white supremacist groups like Identity Evropa post selfies at the White House while rallying behind ancient symbols like marble statues, classicists should not stay silent—they must correct the record and bring facts to the battlefield, so that disciplines aren’t misappropriated to fuel modern-day hatred.
In 2019, Canadians will head to the polls, and the world will watch as the U.S. inches closer to a climactic presidential election. This year, then, must also be devoted to fostering public engagement and a broader understanding of who experts are and what they do. The stakes are simply too high to bury our heads in the sand, or retreat to that ivory tower that the popular imagination has built for us; academics can no longer claim impartiality or seek the luxury of being aloof. As the philosopher Marshall Berman said: “you may not be interested in politics, but politics is interested in you.”