Only certain types of people respond to certain types of propaganda in certain situations. The best reporting on QAnon, for example, has taken into account the conspiracy movement’s popularity among white evangelicals. The best reporting about vaccine and mask skepticism has taken into account the mosaic of experiences that form the American attitude toward the expertise of public-health authorities. There is nothing magically persuasive about social-media platforms; they are a new and important part of the picture, but far from the whole thing. Facebook, however much Mark Zuckerberg and Sheryl Sandberg might wish us to think so, is not the unmoved mover.
For anyone who has used Facebook recently, that should be obvious. Facebook is full of ugly memes and boring groups, ignorant arguments, sensational clickbait, products no one wants, and vestigial features no one cares about. And yet the people most alarmed about Facebook’s negative influence are those who complain the most about how bad a product Facebook is. The question is: Why do disinformation workers think they are the only ones who have noticed that Facebook stinks? Why should we suppose the rest of the world has been hypnotized by it? Why have we been so eager to accept Silicon Valley’s story about how easy we are to manipulate?
Within the knowledge-making professions there are some sympathetic structural explanations. Social scientists get funding for research projects that might show up in the news. Think tanks want to study quantifiable policy problems. Journalists strive to expose powerful hypocrites and create “impact.” Indeed, the tech platforms are so inept and so easily caught violating their own rules about verboten information that a generation of ambitious reporters has found an inexhaustible vein of hypocrisy in stories about disinformation that prompt the platforms to moderate it. As a matter of policy, it’s much easier to focus on an adjustable algorithm than on entrenched social conditions.
Yet professional incentives only go so far in explaining why the disinformation frame has become so dominant. Ellul dismissed a “common view of propaganda . . . that it is the work of a few evil men, seducers of the people.” He compared this simplistic story to midcentury studies of advertising “which regard the buyer as victim and prey.” Instead, he wrote, the propagandist and the propagandee make propaganda together.
One reason to grant Silicon Valley’s assumptions about our mechanistic persuadability is that doing so prevents us from thinking too hard about the role we play in taking up and believing the things we want to believe. It turns a huge question about the nature of democracy in the digital age—what if the people believe crazy things, and now everyone knows it?—into a technocratic negotiation between tech companies, media companies, think tanks, and universities.
But there is a deeper and related reason many critics of Big Tech are so quick to accept the technologist’s story about human persuadability. As the political scientist Yaron Ezrahi has noted, the public relies on scientific and technological demonstrations of political cause and effect because they sustain our belief in the rationality of democratic government.
Indeed, it’s possible that the Establishment needs the theater of social-media persuasion to build a political world that still makes sense, to explain Brexit and Trump and the loss of faith in the decaying institutions of the West. The ruptures that emerged across much of the democratic world five years ago called into question the basic assumptions of so many of the participants in this debate—the social-media executives, the scholars, the journalists, the think tankers, the pollsters. A common account of social media’s persuasive effects provides a convenient explanation for how so many people thought so wrongly at more or less the same time. More than that, it creates a world of persuasion that is legible and useful to capital—to advertisers, political consultants, media companies, and, of course, to the tech platforms themselves. It is a model of cause and effect in which the information circulated by a few corporations has the total power to justify the beliefs and behaviors of the demos. In a way, this world is a kind of comfort. Easy to explain, easy to tweak, and easy to sell, it is a worthy successor to the unified vision of American life produced by twentieth-century television. It is not, as Mark Zuckerberg said, “a crazy idea.” Especially if we all believe it.