In 1990, at the age of 11, I stood in a line of sixth graders outside an imposing converted armory on Manhattan’s Upper East Side, nervously anticipating a test that would change my life. I was hoping to gain entrance to Hunter College High School, a public magnet school that runs from grades seven through twelve and admits students from all five boroughs. Each year, between 3,000 and 4,000 students citywide score high enough on their fifth-grade standardized tests to qualify to take Hunter’s entrance exam in the sixth grade; ultimately, only 185 will be offered admission. (About forty-five students, all from Manhattan, test into Hunter Elementary School in the first grade and automatically gain entrance to the high school.)
I was one of the lucky ones who made it through, and my experience there transformed me. It was at Hunter that I absorbed the open-minded, self-assured cosmopolitanism that is the guiding ethos of the current American ruling class. What animates the school is a collective delight in the talent and energy of its students and a general feeling of earned superiority. In 1982 a Hunter alumnus profiled the school in a New York magazine article called “The Joyful Elite” and identified its “most singular trait” as the “exuberantly smug loyalty of its students.”
That loyalty emanates from the deeply held conviction that Hunter embodies the meritocratic ideal as much as any institution in the country. Unlike elite colleges, which use all kinds of subjective measures—recommendations, résumés, writing samples, parental legacies and interviews—in deciding who gains admittance, entrance to Hunter rests on a single “objective” measure: one three-hour test. If you clear the bar, you’re in; if not, you’re out. There are no legacy admissions, and there are no strings to pull for the well-connected. If Michael Bloomberg’s daughter took the test and didn’t pass, she wouldn’t get in. There are only a handful of institutions left in the country about which this can be said.
Because it is public and free, the school pulls kids from all over the city, many of whom are first-generation Americans, the children of immigrant strivers from Korea, Russia and Pakistan. Half the students have at least one parent born outside the United States. For all these reasons Hunter is, in its own imagination, a place where anyone with drive and brains can be catapulted from the anonymity of working-class outer-borough neighborhoods to the inner sanctum of the American elite. “I came from a family where nobody went to college. We lived up in Washington Heights. We had no money,” says Jennifer Raab, who as president of CUNY’s Hunter College oversees the high school as well. “It was incredibly empowering.” When she surveys the student body, “it gets me very sappy about the American dream. It really can come true. These kids are getting an education that is unparalleled, and it’s not about where they come from or who they are.”
But the problem with my alma mater is that over time, the mechanisms of meritocracy have broken down. In 1995, when I was a student at Hunter, the student body was 12 percent black and 6 percent Hispanic. Not coincidentally, there was no test-prep industry for the Hunter entrance exam. That’s no longer the case. Now, so-called cram schools like Elite Academy in Queens can charge thousands of dollars for after-school and weekend courses where sixth graders memorize vocabulary words and learn advanced math. Meanwhile, in the wealthier precincts of Manhattan, parents can hire $90-an-hour private tutors for one-on-one sessions with their children.
By 2009, Hunter’s demographics were radically different—just 3 percent black and 1 percent Hispanic, according to the New York Times. With the rise of a sophisticated and expensive test-preparation industry, the means of selecting entrants to Hunter has grown less independent of the social and economic hierarchies in New York at large. The pyramid of merit has come to mirror the pyramid of wealth and cultural capital.
How and why does this happen? I think the best answer comes from the work of a social theorist named Robert Michels, who was occupied with a somewhat parallel problem in the early years of the last century. Born to a wealthy German family, Michels came to adopt the radical socialist politics then sweeping through much of Europe. At first, he joined the Social Democratic Party, but he ultimately came to view it as too bureaucratic to achieve its stated aims. “Our workers’ organization has become an end in itself,” Michels declared, “a machine which is perfected for its own sake and not for the tasks which it could have performed.”
Michels then drifted toward the syndicalists, who eschewed parliamentary elections in favor of mass labor solidarity, general strikes and resistance to the dictatorship of the kaiser. But even among the more militant factions of the German left, Michels encountered the same bureaucratic pathologies that had soured him on the SDP. In his classic book Political Parties, he wondered why the parties of the left, so ideologically committed to democracy and participation, were as oligarchic in their functioning as the self-consciously elitist and aristocratic parties of the right.
Michels’s grim conclusion was that it was impossible for any party, no matter its belief system, to bring about democracy in practice. Oligarchy was inevitable. For any kind of institution with a democratic base to consolidate the legitimacy it needs to exist, it must have an organization that delegates tasks. The rank and file will not have the time, energy, wherewithal or inclination to participate in the many, often minute decisions necessary to keep the institution functioning. In fact, effectiveness, Michels argues convincingly, requires that these tasks be delegated to a small group of people with enough power to make decisions of consequence for the entire membership. Over time, this bureaucracy becomes a kind of permanent, full-time cadre of leadership. “Without wishing it,” Michels says, there grows up a great “gulf which divides the leaders from the masses.” The leaders now control the tools with which to manipulate the opinion of the masses and subvert the organization’s democratic process. “Thus the leaders, who were at first no more than the executive organs of the collective, will soon emancipate themselves from the mass and become independent of its control.”
All this flows inexorably from the nature of organization itself, Michels concludes, and he calls it “The Iron Law of Oligarchy”: “It is organization which gives birth to the dominion of the elected over the electors, of the mandataries over the mandators, of the delegates over the delegators. Who says organization says oligarchy.”
* * *
The dynamic Michels identifies applies, in an analogous way, to our own cherished system of meritocracy. In order for it to live up to its ideals, a meritocracy must comply with two principles. The first is the Principle of Difference, which holds that there is vast differentiation among people in their ability and that we should embrace this natural hierarchy and set ourselves the challenge of matching the hardest-working and most talented to the most difficult, important and remunerative tasks.
The second is the Principle of Mobility. Over time, there must be some continuous, competitive selection process that ensures performance is rewarded and failure punished. That is, the delegation of duties cannot simply be made once and then fixed in place over a career or between generations. People must be able to rise and fall along with their accomplishments and failures. When a slugger loses his swing, he should be benched; when a trader loses money, his bonus should be cut. At the broader social level, we hope that the talented children of the poor will ascend to positions of power and prestige while the mediocre sons of the wealthy will not be charged with life-and-death decisions. Over time, in other words, society will have mechanisms that act as a sort of pump, constantly ensuring that the talented and hard-working are propelled upward, while the mediocre trickle downward.
But this ideal, appealing as it may be, runs up against the reality of what I’ll call the Iron Law of Meritocracy. The Iron Law of Meritocracy states that eventually the inequality produced by a meritocratic system will grow large enough to subvert the mechanisms of mobility. Unequal outcomes make equal opportunity impossible. The Principle of Difference will come to overwhelm the Principle of Mobility. Those who are able to climb up the ladder will find ways to pull it up after them, or to selectively lower it down to allow their friends, allies and kin to scramble up. In other words: “Who says meritocracy says oligarchy.”
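To make the dynamic concrete, here is a minimal toy simulation of my own (an illustrative sketch, not anything from Michels or from admissions data; every parameter is an assumption). Selection is purely merit-based at the start, but once membership in the elite can buy the next generation a score advantage, say test prep, the top tier becomes largely self-perpetuating:

    # Toy model of the Iron Law of Meritocracy (illustrative assumptions only):
    # talent is random and not inherited; the only persistent advantage is a
    # score "boost" that elite families can purchase for their children.
    import random

    random.seed(0)

    N = 10_000           # families per generation
    GENERATIONS = 5
    ELITE_SHARE = 0.05   # top 5 percent of scores count as the elite
    BOOST = 1.0          # purchased advantage, in standard deviations of talent

    elite = set(random.sample(range(N), int(N * ELITE_SHARE)))  # generation 0

    for gen in range(1, GENERATIONS + 1):
        scores = {}
        for family in range(N):
            talent = random.gauss(0.0, 1.0)            # merit, drawn fresh each time
            prep = BOOST if family in elite else 0.0   # the advantage money buys
            scores[family] = talent + prep
        cutoff = sorted(scores.values(), reverse=True)[int(N * ELITE_SHARE)]
        new_elite = {f for f, s in scores.items() if s > cutoff}
        stayed = len(elite & new_elite) / len(elite)
        print(f"generation {gen}: {stayed:.0%} of the old elite kept their place")
        elite = new_elite

With the purchased boost set to zero, only about 5 percent of the old elite would requalify by chance each generation; with even a modest boost, the figure climbs several-fold. That, in miniature, is the Principle of Difference overwhelming the Principle of Mobility.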
Consider, for example, the next “meritocracy” that graduates of Hunter encounter. American universities are the central institution of the modern meritocracy, and yet, as Daniel Golden documents in his devastating book The Price of Admission, atop the ostensibly meritocratic architecture of SATs and high school grades is built an entire tower of preference and subsidy for the privileged:
At least one third of the students at elite universities, and at least half at liberal arts colleges, are flagged for preferential treatment in the admissions process. While minorities make up 10 to 15 percent of a typical student body, affluent whites dominate other preferred groups: recruited athletes (10 to 25 percent of students); alumni children, also known as “legacies” (10 to 25 percent); development cases (2 to 5 percent); children of celebrities and politicians (1 to 2 percent); and children of faculty members (1 to 3 percent).
This doesn’t even count the advantages that wealthy children have in terms of private tutors, test prep, and access to expensive private high schools and college counselors. Taken together, this layered system of preferences for the children of the privileged amounts to, in Golden’s words, “affirmative action for rich white people.” It is not so much the meritocracy as idealized and celebrated but rather the ancient practice of “elites mastering the art of perpetuating themselves.”
A pure functioning meritocracy would produce a society with growing inequality, but that inequality would come along with a correlated increase in social mobility. As the educational system and business world got better and better at finding inherent merit wherever it lay, you would see the bright kids of the poor boosted to the upper echelons of society, with the untalented progeny of the best and brightest relegated to the bottom of the social pyramid where they belong.
But the Iron Law of Meritocracy makes a different prediction: that societies ordered around the meritocratic ideal will produce inequality without the attendant mobility. Indeed, over time, a society will become more unequal and less mobile as those who ascend its heights create means of preserving and defending their privilege and find ways to pass it on across generations. And this, as it turns out, is a pretty spot-on description of the trajectory of the American economy since the mid-1970s.
* * *
The sharp, continuous rise in inequality is one of the most studied and acknowledged features of the American political economy in the post-Carter age. Paul Krugman calls it “The Great Divergence,” and the economist Emmanuel Saez, who has done the most pioneering work on measuring the phenomenon, has written: “The top 1% income share has increased dramatically in recent decades and reached levels which had not been seen…since before the Great Depression.”
One of the most distinctive aspects of the rise in American inequality over the past three decades is just how concentrated the gains are at the very top. The farther up the income scale you go, the better people are doing: the top 10 percent have done well, but they’ve been outpaced by the top 1 percent, who in turn have seen slower gains than the top 0.1 percent, all of whom have been beaten by the top 0.01 percent. Adjusted for inflation, the top 0.1 percent saw their average annual income rise from just over $1 million in 1974 to $7.1 million in 2007. And things were even better for the top 0.01 percent, who saw their average annual income explode from less than $4 million to $35 million, nearly a ninefold increase.
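As a quick check on those multiples (a sketch using the approximate dollar figures quoted above, so the ratios are only rough):

    # Growth multiples implied by the inflation-adjusted averages cited above.
    top_tenth_pct_1974, top_tenth_pct_2007 = 1.0e6, 7.1e6           # top 0.1 percent
    top_hundredth_pct_1974, top_hundredth_pct_2007 = 4.0e6, 35.0e6  # top 0.01 percent

    print(f"top 0.1%:  {top_tenth_pct_2007 / top_tenth_pct_1974:.1f}x")        # ~7.1x
    print(f"top 0.01%: {top_hundredth_pct_2007 / top_hundredth_pct_1974:.1f}x")  # ~8.8x, nearly ninefold

The further up the distribution you look, the larger the multiple: inequality is fractal.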
It is not simply that the rich are getting richer, though that’s certainly true. It is that a smaller and smaller group of über-rich are able to capture a larger and larger share of the fruits of the economy. America now features more inequality than any other industrialized democracy. In its peer group are countries like Argentina and other Latin American nations that once stood as iconic examples of the ways in which the absence of a large middle class presented a roadblock to development and good governance.
So: income inequality has been growing. What about mobility? While it’s much harder to measure, there’s a growing body of evidence that, at the same time income inequality has been growing at an unprecedented rate, social mobility has been declining. In a 2012 speech, Alan Krueger, chair of President Obama’s Council of Economic Advisers, coined the term “the Great Gatsby Curve” to refer to a chart showing that over the past three decades, “as inequality has increased…year-to-year or generation-to-generation economic mobility has decreased.”
The most comprehensive attempt at divining the long-term trends in social mobility over several generations is presented in “Intergenerational Economic Mobility in the US, 1940 to 2000,” a complex paper by economists Daniel Aaronson and Bhashkar Mazumder of the Federal Reserve Bank of Chicago. After a series of maneuvers that qualify as statistical pyrotechnics, they conclude that “mobility increased from 1950 to 1980 but has declined sharply since 1980. The recent decline in mobility is only partially explained by education.”
Another pair of economists, from the Boston Federal Reserve, analyzed household income data to measure mobility over a period of three decades rather than intergenerational mobility. They found that in the 1970s, 36 percent of families stayed in the same income decile; in the 1980s, that figure was 37 percent; and in the 1990s, it was 40 percent. In other words, over time, a larger share of families were staying within their class through the duration of their lives.
This is evidence that the Iron Law of Meritocracy is, in fact, exerting itself on our social order. And we might ask what a society that has been corrupted entirely by the Iron Law of Meritocracy would look like. It would be a society with extremely high and rising inequality yet little circulation of elites. A society in which the pillar institutions were populated and presided over by a group of hyper-educated, ambitious overachievers who enjoyed tremendous monetary rewards as well as unparalleled political power and prestige, and yet who managed to insulate themselves from sanction, competition and accountability; a group of people who could more or less rest assured that now that they have achieved their status, now that they have scaled to the top of the pyramid, they, their peers and their progeny will stay there.
Such a ruling class would have all the competitive ferocity inculcated by the ceaseless jockeying within the institutions that produce meritocratic elites, but face no actual sanctions for failing at their duties or succumbing to the temptations of corruption. It would reflexively protect its worst members; it would operate with a wide gulf between performance and reward; and it would be shot through with corruption, rule-breaking and self-dealing, as those on top pursued the outsized rewards promised for superstars. In the same way the bailouts combined the worst aspects of capitalism and socialism, such a social order would fuse the worst aspects of meritocracy and bureaucracy.
It would, in other words, look a lot like the American elite in the first years of the twenty-first century.
* * *
Of all the status obsessions that preoccupy our elites, none is quite so prominent as the obsession with smartness. Intelligence is the core value of the meritocracy, one that stretches back to the early years of standardized testing, when the modern-day SAT descended from early IQ tests. To call a member of the elite “brilliant” is to pay that person the highest compliment.
Intelligence is a vitally necessary characteristic for those with powerful positions. But it isn’t just a celebration of smartness that characterizes the culture of meritocracy. It’s something more pernicious: a Cult of Smartness in which intelligence is the chief virtue, along with a conviction that smartness is rankable and that the hierarchy of intelligence, like the hierarchy of wealth, never plateaus. In a society as stratified as our own, this is a seductive conclusion to reach. Since there are people who make $500,000, $5 million and $5 billion all within the same elite, perhaps there are leaps equal to such orders of magnitude in cognitive ability as well.
In Liquidated: An Ethnography of Wall Street, anthropologist Karen Ho shows how the obsession with smartness produces “a meritocratic feedback loop,” in which bankers’ growing influence itself becomes further evidence that they are, in fact, “the smartest.” According to one Morgan Stanley analyst Ho interviewed, those being recruited by the firm “are typically told they will be working with ‘the brightest people in the world. These are the greatest minds of the century.’” Robert Hopkins, a vice president of mergers and acquisitions at Lehman Brothers, tells her of those who inhabit Wall Street: “We are talking about the smartest people in the world. We are! They are the smartest people in the world.”
And just as one would suspect, given the fractal nature of inequality at the top, hovering above those who work at big Wall Street firms is an entire world of hedge-fund hotshots, who see themselves as far smarter than the grunts on Wall Street. “There’s 100 percent no question that most people on Wall Street, even if they have nice credentials, are generally developmentally disabled,” a hedge-fund analyst I’ll call Eli told me, only somewhat jokingly, one night over dinner. Hedge funds, according to Eli and his colleagues, are the real deal; the innermost of inner rings. “I was surrounded my whole life by people who took intelligence very seriously,” Eli told me. “I went to good schools, I worked at places surrounded by smart people. And until now I’ve never been at a place that prides itself on having the smartest people and where it’s actually true.”
That confidence, of course, projects outward, and from it emanates the authority that the financial sector as a whole enjoyed (and in certain circles still enjoys). “At the end of the day,” Eli says with a laugh, “America does what Wall Street tells it to do. And whether that’s because Wall Street knows best, whether Wall Street is intelligently self-dealing, or whether it has no idea and talks out of its ass, that is the culture in America.”
This is the Cult of Smartness at its most pernicious: listen to Wall Street—they’ve got the smartest minds on the planet.
While smartness is necessary for competent elites, it is far from sufficient: wisdom, judgment, empathy and ethical rigor are all as important, even if those traits are far less valued. Indeed, extreme intelligence without these qualities can be extremely destructive. But empathy does not impress the same way smartness does. Smartness dazzles and mesmerizes. More important, it intimidates. When a group of powerful people get together to make a group decision, conflict and argumentation ensue, and more often than not the decision that emerges is that which is articulated most forcefully by those parties perceived to be the “smartest.”
It is under these conditions that destructive intelligence flourishes. Behind many of the Bush administration’s most disastrous and destructive decisions was one man: David Addington, counsel and then chief of staff to Dick Cheney. Addington was called “Cheney’s Cheney” and “the most powerful man you’ve never heard of.” A former Bush White House lawyer told The New Yorker’s Jane Mayer that the administration’s legal framework for the “war on terror”—from indefinite detention, to torture, to rejection of the 1949 Geneva Conventions, to denial of habeas corpus—was “all Addington.”
Addington’s defining trait, as portrayed in numerous profiles, is his hard-edged, ideologically focused intelligence. “The boy seemed terribly, terribly bright,” Addington’s high school history teacher told Mayer. “He was scornful of anyone who said anything that was naïve, or less than bright. His sneers were almost palpable.” A U.S. News & World Report profile of Addington observed that “his capacity to absorb complex information is legendary.” Co-workers referred to him as “extremely smart” and “sublimely brilliant.”
What emerges in these accounts is a figure who used his dazzling recall, razor-sharp logical ability and copious knowledge to implacably push administration policy in a rogue direction. Because he knew the law so well, he was able to make legal arguments that, executed by anyone else, would have been regarded as insane. He would edit briefs so that they always reflected a maximalist interpretation of presidential power, and his sheer ferocity and analytic horsepower enabled him to steamroll anyone who raised objections. Pentagon lawyer Richard Schiffrin described Addington’s posture in a meeting just after 9/11 to Mayer this way: “He’d sit, listen, and then say, ‘No, that’s not right.’… He didn’t recognize the wisdom of the other lawyers. He was always right. He didn’t listen. He knew the answers.”
This is a potent articulation of the dark emotional roots of the Cult of Smartness: the desire to differentiate and dominate that the meritocracy encourages. Ironically, for all its emphasis on standing apart, the Cult of Smartness can kill independent thought by subtly training people to defer to those they are told to “take seriously.”
* * *
But fractal inequality doesn’t just produce errors of judgment like those we saw during the run-up to the Iraq War; it also creates a system of incentives that produces an insidious form of corruption. This corruption isn’t the obvious quid pro quo of the Gilded Age—there are precious few cases of politicians taking satchels of cash in exchange for votes. What’s far more common is what Harvard Law professor Lawrence Lessig calls “institutional corruption,” in which an institution develops an “improper dependency,” one that “conflicts with the dependence intended.”
This kind of corruption is everywhere you look. Consider a doctor who receives gifts and honorariums from a prescription drug company. The doctor insists plausibly that this has no effect on his medical decisions, which remain independent and guided by his training, instincts and the best available data. And he is not lying or being disingenuous when he says this: he absolutely believes it to be the case. But we know from a series of studies that there is a strong correlation between gifts from pharmaceutical companies and doctors’ willingness to prescribe their drugs.
This basic dynamic infects some of our most important institutions. Key to facilitating both the monumental housing bubble and its collapse was the ratings agencies’ habit of giving even extremely leveraged, toxic securities a triple-A rating. The institutional purpose of the rating agencies (and their market purpose as well) is to add value for investors by using their expertise to make judgments about the creditworthiness of securities. Originally, the agencies made their money from the investors themselves, who paid subscription fees in exchange for access to their ratings. But over time the largest agencies shifted to a model in which the banks and financial entities issuing the securities would pay the agencies for a rating. Obviously, these new clients wanted the highest rating possible and often would bring pressure to bear on the agencies to make sure they secured the needed triple A. And so the ratings agencies developed an improper dependence on their clients, one that pulled them away from fulfilling their original institutional purpose of serving investors. They became corrupt, and the result was trillions of dollars in supposedly triple-A securities that became worthless once the housing bubble burst.
We see a similar destructive example of this dynamic at work in two groups we entrusted to guard the public interest when it comes to the economy: federal regulators and elite economists. In a paper about the financial crisis, Rob Johnson and Thomas Ferguson tracked the salary trends for those working in finance and those in the federal agencies tasked with regulating them and found a striking divergence between the two. The authors note:
At some point after incomes in the financial sector took off, lifetime earnings of the regulated far outstripped what any regulator could ever hope to earn. Rising economic inequality was translating into a crippling institutional weakness in regulatory structure. Not surprisingly, as one former member of a U.S. regulatory agency expressed it to us, regulatory agencies turned into barely disguised employment agencies, as staff increasingly focused on making themselves attractive hires to the firms they were supposed to be regulating.
In his film Inside Job, Charles Ferguson documents the insidious ways in which consulting fees and moonlighting gigs with financial companies created systematic conflicts of interest for some of the nation’s most prominent economists. Ferguson’s film parades through a number of the most admired names in the field, from Larry Summers to Martin Feldstein to Frederic Mishkin, who all had lucrative sidelines working for business interests with stakes in their academic work. Mishkin even took $124,000 from the Icelandic Chamber of Commerce to write a paper endorsing the country’s economic model, just a few years before it collapsed.
What we are left with is the confusion that arises from an ambiguity of roles: are our regulators attempting to rein in the excesses of those they regulate, or are they auditioning for a lucrative future job? Are economists who publish papers praising financial deregulation giving us an honest assessment of the facts and trends, or courting extremely lucrative consulting fees from banks?
In her book Shadow Elite, about the new global ruling class, Janine Wedel recalls visiting Eastern Europe after the fall of the Berlin Wall and finding the elites she met there—those at the center of building the new capitalist societies—toting an array of business cards that represented their various roles: one for their job as a member of parliament, another for the start-up business they were running (which was making its money off government contracts), and yet another for the NGO on the board of which they sat. Wedel writes that those “who adapted to the new environment with the most agility and creativity, who tried out novel ways of operating and got away with them, and sometimes were the most ethically challenged, were most rewarded with influence.”
This has an eerie resonance with our predicament. We can never be sure just which other business cards are in the pocket of the pundit, politician or professor. We can’t be sure, in short, just who our elites are working for.
But we suspect it is not us.
Original Article
Source: The Nation
Author: Christopher Hayes