Democracy Gone Astray

Democracy, being a human construct, is better thought of as a directionality than as an object. To understand it, then, requires not so much a description of existing structures and related phenomena as a declaration of intentionality.
This blog aims to create labeled lists of published infringements of that intentionality: points in time where democracy strays from its intended directionality. In addition to outright infringements, it also collects important contemporary information and discussions that shape our socio-political landscape.

All the posts here were originally published in the electronic media, mainstream as well as fringe, and retain links to the original texts.

[NOTE: Due to changes in the blogging software that I did not catch in time, all of the 'Original Article' links were broken between September 11, 2012 and December 11, 2012. My apologies.]

Friday, September 15, 2023

The End of Neutrality


We didn’t always talk about the Supreme Court in crassly partisan terms. The court had liberals and conservatives, and general voting patterns, but public analysis of the court’s activity normally centered on the legal reasoning behind decisions and dissents. People might disagree with one ruling or another, but the court’s overall authority, rooted in its role as a trusted arbiter of competing claims, enjoyed a basic respect.

Today, in contrast, it’s common to regard the court as little more than another political body. More and more, Senate votes for judicial nominees, such as Brett Kavanaugh, break down along party lines, and people tend to assume that the outcome of a given case will hinge on which bloc of justices, liberal or conservative, has the majority. Justices—like everybody else in our tribal world—are now seen as vehicles for expressing a political preference.

It’s not just the Supreme Court, either. Something similar is happening in all our institutions—the news media, universities, think tanks, the intelligence services and other technocratic offices of the government. Once respected as objective, neutral bodies that could referee claims emerging from our heterogeneous society, they are increasingly viewed as instruments of a liberal or conservative or other ideological agenda, ones that sometimes hide their partisanship behind a veneer of disinterestedness. The very idea of value neutrality that rose to prominence after World War II—the idea that individuals or institutions can fairly arbitrate among competing values in a pluralistic society—has fallen on hard times, leaving us unsure of where to turn for a reliable account of the world.

It’s easy to blame Donald Trump for this shift, with his casual mendacity and contempt for conventions. He has cavalierly trashed institutions he dislikes, from CNN to the 9th Circuit Court of Appeals to the intelligence agencies, as tools of the opposition, encouraging his followers to dismiss their pretenses of fairness. But Trump isn’t the root of this problem so much as a poisonous flower. The cynicism he exploits and deepens has been metastasizing for decades. Now it has reached stage 4.

We can already see the implications: dysfunctional Washington politics, a shrill public discourse rife with accusations of bad faith, conspiracy theories sprouting like toadstools, sharp hostility to universities. The inability to marshal a national consensus even on basic facts—like the Russian efforts to disrupt our elections—has prevented us from taking steps to secure our democracy and left many fearful about the soundness of our system. Without trust in the government and other neutral bodies to provide reliable information and to adjudicate fairly among viewpoints, we risk losing one of our democracy’s greatest virtues: the ability to wage our debates freely and contentiously while knowing that ultimately most of us will accept the resolutions as legitimate. Without such acceptance, self-government becomes like a trial without a judge, a boxing match without a referee. What happened?

***

The importance we place on neutrality in our institutions is actually somewhat new. It grew out of what the intellectual historian Edward Purcell, in the title of an influential 1973 book, called the “crisis of democratic theory,” which gripped American intellectual culture in the 1920s and ’30s. In that era, worldwide depression and a backlash against the idealism of World War I undermined the grounds for believing that democracy was necessarily the best form of government. Because philosophers and social scientists had come to embrace empiricism over rationalism—arguing that our knowledge came from experience, not reason—many intellectuals had lost their confidence in the older philosophical principles that had once seemed absolute. A relativistic outlook seeped into American culture, even affecting how people thought about democracy.

But in the crucible of World War II and the fight against totalitarianism, Purcell showed, there emerged a revised defense of democracy. In place of the fashionable cynicism of the 1920s and ’30s came the idea that democracy was superior as a system of government precisely because it wasn’t absolute. It allowed multiple viewpoints to coexist and compete, and it was capable of revision. Although this argument took place at a rarefied level, among scholars and intellectuals, their ideas crept into popular thought.

The new understanding of democracy as experimental, like science, meant government was best seen as a neutral manager of competing interests, not an instrument for imposing an ideology. A set of ideas might prevail in a given election, but victory was provisional. Democracy, one might say, was a verb; its value consisted in continuing to enact it. “The totalitarians regard the toleration of conflict as our central weakness,” Arthur Schlesinger Jr. wrote in 1949. “But we know it to be basically our central strength.”

The institutions that promote and disseminate knowledge came to rest on similar assumptions. In the late 19th century, America’s great research universities arose, and the social sciences flourished by claiming to put knowledge on a more scientific, empirical basis. Similarly, as early as the 1890s, newspaper journalism had come to value factual reporting over editorializing; by the 1920s, objectivity and commitment to facts were understood as helpful ways to avoid the pitfalls of the subjective. In the years after World War II, these tendencies settled into guiding principles. These institutions promoted inquiry, fairness, openness and the competition of ideas.

The post-war decades were hardly free of turmoil, from McCarthyism to the struggles for racial equality. But with the far right and far left in retreat, President John F. Kennedy could proclaim, as he did in 1962, that “the central domestic issues of our time … relate not to basic clashes of philosophy or ideology but to ways and means of reaching common goals—to research for sophisticated solutions to complex and obstinate issues.” Widely shared prosperity and a strong social consensus backing a liberal welfare state and an internationalist foreign policy helped Americans to trust in their political system.

By the late 1960s, however, the consensus that had prevailed was crumbling. Left and right alike made war on established authority. From both sides, one heard the same charge: that the ostensible neutrality of the government and other public institutions masked an ideology—one that was either aggressively liberal (according to the right) or cravenly conservative (according to the left).

Professional expertise came under fire. As Michael Schudson wrote in the book Discovering the News: “Critics claimed that urban planning created slums, that schools made people stupid, that medicine caused disease, that psychiatry invented mental illness, and that the courts promoted injustice.” Journalistic objectivity was now disdained not as an unattainable ideal, as it had been in the past, but as “a mystification,” in Schudson’s apt term. In academia, too, arguments against politicizing scholarship faced counterclaims that all scholarship was inherently politicized. Trust in government plummeted from its peak in the mid-1960s.

Ever since, intellectuals, journalists, civil servants and others have wrestled with questions of neutrality and bias. The awareness that scholars or judges might harbor a latent ideology didn’t, for the most part, keep them from aiming for objectivity. But in the academy—and in the wider culture—it fed the growth of what became known as postmodern thought. In journalism, dissatisfaction with the strictures of “straight news” encouraged a variety of innovations, including interpretive and analytical pieces, investigative reporting and the often-subjective New Journalism. Still, these early doubts about the professed neutrality of our knowledge-forming institutions did not fatally undermine them. Postmodern critiques of scholarly values were more often mocked than embraced. Objectivity remained a respected ideal. 

It’s hard to say when the rejection of neutrality went from being a persistent intellectual critique to a dominant belief—or even whether it’s reached that point yet. But the battle in late 2000 over the outcome of the presidential election was a symbolic watershed. That the Supreme Court voted 5-4 along ideological lines to make George W. Bush president deepened the suspicion that not only ostensibly neutral election processes but even the law itself would succumb to the political preferences of those administering it. In his dissent, Justice John Paul Stevens warned that “the loser” in that drama was “the Nation’s confidence in the judge as an impartial guardian of the rule of law.”

Under Bush, the polarization that had already begun to seize Washington politics intensified, dealing body blows to neutrality. Since Richard Nixon’s presidency, Republicans had gradually built a conservative counterestablishment: think tanks, foundations, societies, networks and news outlets to promote their ideas. By Bush’s presidency, it was possible to find “experts” who could lend a patina of authority to conservative policy positions otherwise unsupported by solid research—whether on evolution, birth control, global warming or even the origins of the universe. Even Bush’s most controversial undertaking, the invasion of Iraq, relied on an alternative set of intelligence analysts, after those at the CIA and elsewhere failed to return the findings that Bush had hoped for.

Fox News was crucial to this development. Launched in 1996 under the longtime Republican consultant Roger Ailes, it gained influence in the Bush years—especially after the 9/11 terrorist attacks—and wooed viewers away from what were now being called the mainstream media. But Fox’s claim of being an ideological alternative to the networks was misleading. Those networks aspired to neutrality; they weren’t trying to advance liberal causes (even if some perceived bias in their reports). Fox, despite claiming to be “fair and balanced,” reflected the pronounced ideological agenda of the man who ran it—a thoroughgoing conservative cultural populist. So liberals began to emulate the right’s methods. MSNBC began to turn itself—if half-heartedly—into a liberal Fox. And if Air America tried (and failed) to be the mirror of Rush Limbaugh, many podcasts are now succeeding at the task.

So, too, with think tanks. Conservative shops like the Heritage Foundation might have fancied themselves right-wing counterweights to “liberal” ones like the Brookings Institution, but Brookings, like the network news shows, adhered to a neutral scholarly ideal. Lacking a Heritage of their own, Democrats in 2003 founded the Center for American Progress, with a pronounced partisan orientation. Now, key sources of information bear a partisan stamp, undermining their claims to independent authority.

***

Under Trump, neutrality has become a difficult position for any individual or institution to maintain. Everyone is expected to take a side. Even attempts to articulate safe, bipartisan points of consensus run afoul of tribal suspicions. Journalists who serve up anodyne platitudes about a free press suddenly seem like militant anti-Trumpers, while commonsense pleas from no less than Barack Obama not to ignore someone’s ideas solely on the basis of his race or sex are mocked as wrongheaded or naive. Where the early internet, with its blogs and comments, had put pressure on the mainstream media, social media have amplified that pressure many times over—with Twitter enticing officially neutral reporters into staking out positions, voiced with sarcasm, snark or notes of partisanship that would be verboten in the news pages. No sooner does someone try to set up a new kind of neutral arbiter—like the invaluable fact-checking sites PolitiFact and FactCheck.org—than that, too, comes under fire for bias, as these sites have from the right. PolitiFact founder Bill Adair is trying to create an automated tool that conservatives will accept as neutral—though allegations that Facebook’s algorithms are politically skewed suggest that not even a computer program can attain that holy status anymore.

On campuses, departments now offer courses in “social justice,” which usually means the advocacy of left-wing politics, and university presidents feel pressure to take liberal political stands. Republicans no longer consider universities forces for good. In law, opposition is growing to widely respected concepts like viewpoint-neutrality—the idea that the government can’t punish speech based on its content. Even legal scholars on the left, as the New York Times’ Adam Liptak wrote, “have traded an absolutist commitment to free speech for one sensitive to the harms it can inflict.” The First Amendment, in some eyes, isn’t really neutral anymore.

The demise of neutrality lies behind the dominant political problems of our age. It is responsible for all the chatter about a “post-truth society” that we have heard lately. Truth still exists, of course, but agreement on the truth feels more elusive than it has in a long time. That spells danger for democracy, which depends on constructive argument and deliberation. Without trusted sources of information or respected vehicles of settling differences, there is only partisan argument and the triumph of the powerful.

The collapse of neutral institutions also feeds a vicious cycle of polarization and extremism. When institutions no longer enjoy credibility across the political spectrum, people look to more ideological sources for confirmation of what they want to believe. Conservatives drift toward the Tea Party and Trump, and progressives toward left-wing radicalism. Some long-standing neutral institutions, particularly in journalism, seem to be feeling pressure to abandon their historic role in order to please their audiences. If a reporter or scholar or judge isn’t anticipating good-faith critiques from a range of viewpoints, he is less likely to shore up his thinking to make his conclusions broadly palatable.

When we anticipate the future, we usually project trends to continue—and that promises a long dark night of discord. But it’s also conceivable that the Trump presidency, with its ceaseless chaos and disruption, will prompt a recognition of what is being lost and fuel demand for a rebirth of statesmanship and objectivity. In the 1930s and ’40s, it wasn’t just the threat of totalitarianism but its lure—to those frustrated with the imperfections of democracy—that forced Americans to justify their democracy in more durable ways. The current crisis is forcing us to examine the intellectual and practical bases of our system. With luck, we will, in seeing its weaknesses, have the wisdom to figure out how to shore it up.

Let’s just hope that this time, it won’t take a world war to change our minds.

Original Article
Source: Politico
Author: David Greenberg
