Democracy Gone Astray

Democracy, being a human construct, needs to be thought of as directionality rather than as an object. As such, understanding it requires not so much a description of existing structures and related phenomena as a declaration of intentionality.
This blog aims to create labeled lists of published infringements of that intentionality, of points in time when democracy strays from its intended directionality. In addition to outright infringements, it also collects important contemporary information and discussions that shape our socio-political landscape.

All of the posts here were published in electronic media, mainstream as well as fringe, and each maintains a link to the original text.

[NOTE: Due to changes in the blogging software that I did not catch in time, all of the 'Original Article' links were nullified between September 11, 2012, and December 11, 2012. My apologies.]

Friday, August 19, 2011

Can the Middle Class Be Saved?

In October 2005, three Citigroup analysts released a report describing the pattern of growth in the U.S. economy. To really understand the future of the economy and the stock market, they wrote, you first needed to recognize that there was “no such animal as the U.S. consumer,” and that concepts such as “average” consumer debt and “average” consumer spending were highly misleading.

In fact, they said, America was composed of two distinct groups: the rich and the rest. And for the purposes of investment decisions, the second group didn’t matter; tracking its spending habits or worrying over its savings rate was a waste of time. All the action in the American economy was at the top: the richest 1 percent of households earned as much each year as the bottom 60 percent put together; they possessed as much wealth as the bottom 90 percent; and with each passing year, a greater share of the nation’s treasure was flowing through their hands and into their pockets. It was this segment of the population, almost exclusively, that held the key to future growth and future returns. The analysts, Ajay Kapur, Niall Macleod, and Narendra Singh, had coined a term for this state of affairs: plutonomy.

In a plutonomy, Kapur and his co-authors wrote, “economic growth is powered by and largely consumed by the wealthy few.” America had been in this state twice before, they noted—during the Gilded Age and the Roaring Twenties. In each case, the concentration of wealth was the result of rapid technological change, global integration, laissez-faire government policy, and “creative financial innovation.” In 2005, the rich were nearing the heights they’d reached in those previous eras, and Citigroup saw no good reason to think that, this time around, they wouldn’t keep on climbing. “The earth is being held up by the muscular arms of its entrepreneur-plutocrats,” the report said. The “great complexity” of a global economy in rapid transformation would be “exploited best by the rich and educated” of our time.

Kapur and his co-authors were wrong in some of their specific predictions about the plutonomy’s ramifications—they argued, for instance, that since spending was dominated by the rich, and since the rich had very healthy balance sheets, the odds of a stock-market downturn were slight, despite the rising indebtedness of the “average” U.S. consumer. And their division of America into only two classes is ultimately too simple. Nonetheless, their overall characterization of the economy remains resonant. According to Gallup, from May 2009 to May 2011, daily consumer spending rose by 16 percent among Americans earning more than $90,000 a year; among all other Americans, spending was completely flat. The consumer recovery, such as it is, appears to be driven by the affluent, not by the masses. Three years after the crash of 2008, the rich and well educated are putting the recession behind them. The rest of America is stuck in neutral or reverse.

Income inequality usually shrinks during a recession, but in the Great Recession, it didn’t. From 2007 to 2009, the most-recent years for which data are available, it widened a little. The top 1 percent of earners did see their incomes drop more than those of other Americans in 2008. But that fall was due almost entirely to the stock-market crash, and with it a 50 percent reduction in realized capital gains. Excluding capital gains, top earners saw their share of national income rise even in 2008. And in any case, the stock market has since rallied. Corporate profits have marched smartly upward, quarter after quarter, since the beginning of 2009.

Even in the financial sector, high earners have come back strong. In 2009, the country’s top 25 hedge-fund managers earned $25 billion among them—more than they had made in 2007, before the crash. And while the crisis may have begun with mass layoffs on Wall Street, the financial industry has remained well shielded compared with other sectors; from the first quarter of 2007 to the first quarter of 2010, finance shed 8 percent of its jobs, compared with 27 percent in construction and 17 percent in manufacturing. Throughout the recession, the unemployment rate in finance and insurance has been substantially below that of the nation overall.

It’s hard to miss just how unevenly the Great Recession has affected different classes of people in different places. From 2009 to 2010, wages were essentially flat nationwide—but they grew by 11.9 percent in Manhattan and 8.7 percent in Silicon Valley. In the Washington, D.C., and San Jose (Silicon Valley) metro areas—both primary habitats for America’s meritocratic winners—job postings in February of this year were almost as numerous as job candidates. In Miami and Detroit, by contrast, for every job posting, six people were unemployed. In March, the national unemployment rate was 12 percent for people with only a high-school diploma, 4.5 percent for college grads, and 2 percent for those with a professional degree.

Housing crashed hardest in the exurbs and in more-affordable, once fast-growing areas like Phoenix, Las Vegas, and much of Florida—all meccas for aspiring middle-class families with limited savings and education. The professional class, clustered most densely in the closer suburbs of expensive but resilient cities like San Francisco, Seattle, Boston, and Chicago, has lost little in comparison. And indeed, because the stock market has rebounded while housing values have not, the middle class as a whole has seen more of its wealth erased than the rich, who hold more-diverse portfolios. A 2010 Pew study showed that the typical middle-class family had lost 23 percent of its wealth since the recession began, versus just 12 percent in the upper class.

The ease with which the rich and well educated have shrugged off the recession shouldn’t be surprising; strong winds have been at their backs for many years. The recession, meanwhile, has restrained wage growth and enabled faster restructuring and offshoring, leaving many corporations with lower production costs and higher profits—and their executives with higher pay.

Anthony Atkinson, an economist at Oxford University, has studied how several recent financial crises affected income distribution—and found that in their wake, the rich have usually strengthened their economic position. Atkinson examined the financial crises that swept Asia in the 1990s as well as those that afflicted several Nordic countries in the same decade. In most cases, he says, the middle class suffered depressed income for a long time after the crisis, while the top 1 percent were able to protect themselves—using their cash reserves to buy up assets very cheaply once the market crashed, and emerging from crisis with a significantly higher share of assets and income than they’d had before. “I think we’ve seen the same thing, to some extent, in the United States” since the 2008 crash, he told me. “Mr. Buffett has been investing.”

“The rich seem to be on the road to recovery,” says Emmanuel Saez, an economist at Berkeley, while those in the middle, especially those who’ve lost their jobs, “might be permanently hit.” Coming out of the deep recession of the early 1980s, Saez notes, “you saw an increase in inequality … as the rich bounced back, and unionized labor never again found jobs that paid as well as the ones they’d had. And now I fear we’re going to see the same phenomenon, but more dramatic.” Middle-paying jobs in the U.S., in which some workers have been overpaid relative to the cost of labor overseas or technological substitution, “are being wiped out. And what will be left is a hard and a pure market,” with the many paid less than before, and the few paid even better—a plutonomy strengthened in the crucible of the post-crash years.


The Culling of the Middle Class


One of the most salient features of severe downturns is that they tend to accelerate deep economic shifts that are already under way. Declining industries and companies fail, spurring workers and capital toward rising sectors; declining cities shrink faster, leaving blight; workers whose roles have been partly usurped by technology are pushed out en masse and never asked to return. Some economists have argued that in one sense, periods like these do nations a service by clearing the way for new innovation, more-efficient production, and faster growth. Whether or not that’s true, they typically allow us to see, with rare and brutal clarity, where society is heading—and what sorts of people and places it is leaving behind.

Arguably, the most important economic trend in the United States over the past couple of generations has been the ever more distinct sorting of Americans into winners and losers, and the slow hollowing-out of the middle class. Median incomes declined outright from 1999 to 2009. For most of the aughts, that trend was masked by the housing bubble, which allowed working-class and middle-class families to raise their standard of living despite income stagnation or downward job mobility. But that fig leaf has since blown away. And the recession has pressed hard on the broad center of American society.

“The Great Recession has quantitatively but not qualitatively changed the trend toward employment polarization” in the United States, wrote the MIT economist David Autor in a 2010 white paper. Job losses have been “far more severe in middle-skilled white- and blue-collar jobs than in either high-skill, white-collar jobs or in low-skill service occupations.” Indeed, from 2007 through 2009, total employment in professional, managerial, and highly skilled technical positions was essentially unchanged. Jobs in low-skill service occupations such as food preparation, personal care, and house cleaning were also fairly stable. Overwhelmingly, the recession has destroyed the jobs in between. Almost one of every 12 white-collar jobs in sales, administrative support, and nonmanagerial office work vanished in the first two years of the recession; one of every six blue-collar jobs in production, craft, repair, and machine operation did the same.

Autor isolates the winnowing of middle-skill, middle-class jobs as one of several labor-market developments that are profoundly reshaping U.S. society. The others are rising pay at the top, falling wages for the less educated, and “lagging labor market gains for males.” “All,” he writes, “predate the Great Recession. But the available data suggest that the Great Recession has reinforced these trends.”

For more than 30 years, the American economy has been in the midst of a sea change, shifting from industry to services and information, and integrating itself far more tightly into a single, global market for goods, labor, and capital. To some degree, this transformation has felt disruptive all along. But the pace of the change has quickened since the turn of the millennium, and even more so since the crash. Companies have figured out how to harness exponential increases in computing power better and faster. Global supply chains, meanwhile, have grown both tighter and more supple since the late 1990s—the result of improving information technology and of freer trade—making routine work easier to relocate. And of course China, India, and other developing countries have fully emerged as economic powerhouses, capable of producing large volumes of high-value goods and services.

Some parts of America’s transformation may now be nearing completion. For decades, manufacturing has become continually less important to the economy, as other business sectors have grown. But the popular narrative—rapid decline in the 1970s and ’80s, followed by slow erosion thereafter—isn’t quite right, at least as far as employment goes. In fact, the total number of people employed in industry remained quite stable from the late 1960s through about 2000, at roughly 17 million to 19 million. To be sure, manufacturing wasn’t providing many new jobs for a growing population, but for decades, rising output essentially offset the impact of labor-saving technology and offshoring.

But since 2000, U.S. manufacturing has shed about a third of its jobs. Some of that decline reflects losses to China. Still, industry isn’t about to vanish from America, any more than agriculture did as the number of farm workers plummeted during the 20th century. As of 2010, the United States was the second-largest manufacturer in the world, and the No. 3 agricultural nation. But agriculture is now so mechanized that only about 2 percent of American workers make a living as farmers. American manufacturing looks to be heading down the same path.

Meanwhile, another phase of the economy’s transformation—one more squarely involving the white-collar workforce—is really just beginning. “The thing about information technology,” Autor told me, “is that it’s extremely broadly applicable, it’s getting cheaper all the time, and we’re getting better and better at it.” Computer software can now do boilerplate legal work, for instance, and make a first pass at reading X-rays and other medical scans. Likewise, thanks to technology, we can now easily have those scans read and interpreted by professionals half a world away.

In 2007, the economist Alan Blinder, a former vice chairman of the Federal Reserve, estimated that between 22 and 29 percent of all jobs in the United States had the potential to be moved overseas within the next couple of decades. With the recession, the offshoring of jobs only seems to have gained steam. The financial crisis of 2008 was global, but job losses hit America especially hard. According to the International Monetary Fund, one of every four jobs lost worldwide was lost in the United States. And while unemployment remains high in America, it has come back down to (or below) pre-recession levels in countries like China and Brazil.



Anxiety Creeps Upward


Over time, both trade and technology have increased the number of low-cost substitutes for American workers with only moderate cognitive or manual skills—people who perform routine tasks such as product assembly, process monitoring, record keeping, basic information brokering, simple software coding, and so on. As machines and low-paid foreign workers have taken on these functions, the skills associated with them have become less valuable, and workers lacking higher education have suffered.

For the most part, these same forces have been a boon, so far, to Americans who have a good education and exceptional creative talents or analytic skills. Information technology has complemented the work of people who do complex research, sophisticated analysis, high-end deal-making, and many forms of design and artistic creation, rather than replacing that work. And global integration has meant wider markets for new American products and high-value services—and higher incomes for the people who create or provide them.

The return on education has risen in recent decades, producing more-severe income stratification. But even among the meritocratic elite, the economy’s evolution has produced a startling divergence. Since 1993, more than half of the nation’s income growth has been captured by the top 1 percent of earners, and the gains have grown larger over time: from 2002 to 2007, out of every three dollars of national income growth, the top 1 percent of earners captured two. Nearly 2 million people started college in 2002—1,630 of them at Harvard—but among them only Mark Zuckerberg is worth more than $10 billion today; the rise of the super-elite is not a product of educational differences. In part, it is a natural outcome of widening markets and technological revolution, which are creating much bigger winners much faster than ever before—a result that’s not even close to being fully played out, and one reinforced strongly by the political influence that great wealth brings.

Recently, as technology has improved and emerging-market countries have sent more people to college, economic pressures have been moving up the educational ladder in the United States. “It’s useful to make a distinction between college and post-college,” Autor told me. “Among people with professional and even doctoral [degrees], in general the job market has been very good for a very long time, including recently. The group of highly educated individuals who have not done so well recently would be people who have a four-year college degree but nothing beyond that. Opportunities have been less good, wage growth has been less good, the recession has been more damaging. They’ve been displaced from mid-managerial or organizational positions where they don’t have extremely specialized, hard-to-find skills.”

College graduates may be losing some of their luster for reasons beyond technology and trade. As more Americans have gone to college, Autor notes, the quality of college education has become arguably more inconsistent, and the signaling value of a degree from a nonselective school has perhaps diminished. Whatever the causes, “a college degree is not the kind of protection against job loss or wage loss that it used to be.”

Without doubt, it is vastly better to have a college degree than to lack one. Indeed, on a relative basis, the return on a four-year degree is near its historic high. But that’s largely because the prospects facing people without a college degree have been flat or falling. Throughout the aughts, incomes for college graduates barely budged. In a decade defined by setbacks, perhaps that should occasion a sort of wan celebration. “College graduates aren’t doing badly,” says Timothy Smeeding, an economist at the University of Wisconsin and an expert on inequality. But “all the action in earnings is above the B.A. level.”

America’s classes are separating and changing. A tiny elite continues to float up and away from everyone else. Below it, suspended, sits what might be thought of as the professional middle class—unexceptional college graduates for whom the arrow of fortune points mostly sideways, and an upper tier of college graduates and postgraduates for whom it points progressively upward, but not spectacularly so. The professional middle class has grown anxious since the crash, and not without reason. Yet these anxieties should not distract us from a second, more important, cleavage in American society—the one between college graduates and everyone else.

If you live and work in the professional communities of Boston or Seattle or Washington, D.C., it is easy to forget that nationwide, even among people ages 25 to 34, college graduates make up only about 30 percent of the population. And it is easy to forget that a family income of $113,000 in 2009 would have put you in the 80th income percentile nationally. The true center of American society has always been its nonprofessionals—high-school graduates who didn’t go on to get a bachelor’s degree make up 58 percent of the adult population. And as manufacturing jobs and semiskilled office positions disappear, much of this vast, nonprofessional middle class is drifting downward.


The Bottom 70 Percent


The troubles of the nonprofessional middle class are inseparable from the economic troubles of men. Consistently, men without higher education have been the biggest losers in the economy’s long transformation (according to Michael Greenstone, an economist at MIT, real median wages of men have fallen by 32 percent since their peak in 1973, once you account for the men who have washed out of the workforce altogether). And the struggles of men have amplified the many problems—not just economic, but social and cultural—facing the country today.

Just as the housing bubble papered over the troubles of the middle class, it also hid, for a time, the declining prospects of many men. According to the Harvard economist Lawrence Katz, since the mid-1980s, the labor market has been placing a higher premium on creative, analytic, and interpersonal skills, and the wages of men without a college degree have been under particular pressure. “And I think this downturn exacerbates” the problem, Katz told me. During the aughts, construction provided an outlet for the young men who would have gone into manufacturing a generation ago. Men without higher education “didn’t do as badly as you might have expected, on long-run trends, because of the housing bubble.” But it’s hard to imagine another such construction boom coming to their rescue.

One of the great puzzles of the past 30 years has been the way that men, as a group, have responded to the declining market for blue-collar jobs. Opportunities have expanded for college graduates over that span, and for nongraduates, jobs have proliferated within the service sector (at wages ranging from rock-bottom to middling). Yet in the main, men have pursued neither higher education nor service jobs. The proportion of young men with a bachelor’s degree today is about the same as it was in 1980. And as the sociologists Maria Charles and David Grusky noted in their 2004 book, Occupational Ghettos, while men and women now mix more easily on different rungs of the career ladder, many industries and occupations have remained astonishingly segregated, with men continuing to seek work in a dwindling number of manual jobs, and women “crowding into nonmanual occupations that, on average, confer more pay and prestige.”

As recently as 2001, U.S. manufacturing still employed about as many people as did health and educational services combined (roughly 16 million). But since then, those latter, female-dominated sectors have added about 4 million jobs, while manufacturing has lost about the same number. Men made no inroads into health care or education during the aughts; in 2009, they held only about one in four jobs in those rising sectors, just as they had at the beginning of the decade. They did, however, consolidate their hold on manufacturing—those dwindling jobs, along with jobs in construction, transportation, and utilities, were more heavily dominated by men in 2009 than they’d been nine years earlier.

“I’m deeply concerned” about the prospects of less-skilled men, says Bruce Weinberg, an economist at Ohio State. In 1967, 97 percent of 30-to-50-year-old American men with only a high-school diploma were working; in 2010, just 76 percent were. Declining male employment is not unique to the United States. It’s been happening in almost all rich nations, as they’ve put the industrial age behind them. Weinberg’s research has shown that in occupations in which “people skills” are becoming more important, jobs are skewing toward women. And that category is large indeed. In the working paper “People People,” Weinberg and two co-authors found that interpersonal skills typically become more highly valued in occupations in which computer use is prevalent and growing, and in which teamwork is important. Both computer use and teamwork are becoming ever more central to the American workplace, of course; the restructuring that accompanied the Great Recession has only hastened that trend.

Needless to say, a great many men have excellent people skills, just as a great many men do well in school. As a group, men still make more money than women, in part due to lingering discrimination. And many of the differences we observe between the genders may be the result of culture rather than genetics. All of that notwithstanding, a meaningful number of men have struggled badly as the economy has evolved, and have shown few signs of successful adaptation. Men’s difficulties are hardly evident in Silicon Valley or on Wall Street. But they’re hard to miss in foundering blue-collar and low-end service communities across the country. It is in these less affluent places that gender roles, family dynamics, and community character are changing in the wake of the crash.



A Cultural Separation


In the March 2010 issue of this magazine, I discussed the wide-ranging social consequences of male economic problems, once they become chronic. Women tend not to marry (or stay married to) jobless or economically insecure men—though they do have children with them. And those children usually struggle when, as typically happens, their parents separate and their lives are unsettled. The Harvard sociologist William Julius Wilson has connected the loss of manufacturing jobs from inner cities in the 1970s—and the resulting economic struggles of inner-city men—to many of the social ills that cropped up afterward. Those social ills eventually became self-reinforcing, passing from one generation to the next. In less privileged parts of the country, a larger, predominantly male underclass may now be forming, and with it, more-widespread cultural problems.

What I didn’t emphasize in that story is the extent to which these sorts of social problems—the kind that can trap families and communities in a cycle of disarray and disappointment—have been seeping into the nonprofessional middle class. In a national study of the American family released late last year, the sociologist W. Bradford Wilcox wrote that among “Middle Americans”—people with a high-school diploma but not a college degree—an array of signals of family dysfunction have begun to blink red. “The family lives of today’s moderately educated Americans,” which in the 1970s closely resembled those of college graduates, now “increasingly resemble those of high-school dropouts, too often burdened by financial stress, partner conflict, single parenting, and troubled children.”

“The speed of change,” wrote Wilcox, “is astonishing.” By the late 1990s, 37 percent of moderately educated couples were divorcing or separating less than 10 years into their first marriage, roughly the same rate as among couples who didn’t finish high school and more than three times that of college graduates. By the 2000s, the percentage in “very happy” marriages—identical to that of college graduates in the 1970s—was also nearing that of high-school dropouts. Between 2006 and 2008, among moderately educated women, 44 percent of all births occurred outside marriage, not far off the rate (54 percent) among high-school dropouts; among college-educated women, that proportion was just 6 percent.

The same pattern—families of middle-class nonprofessionals now resembling those of high-school dropouts more than those of college graduates—emerges with norm after norm: the percentage of 14-year-old girls living with both their mother and father; the percentage of adolescents wanting to attend college “very much”; the percentage of adolescents who say they’d be embarrassed if they got (or got someone) pregnant; the percentage of never-married young adults using birth control all the time.

One stubborn stereotype in the United States is that religious roots are deepest in blue-collar communities and small towns, and, more generally, among Americans who do not have college degrees. That was true in the 1970s. Yet since then, attendance at religious services has plummeted among moderately educated Americans, and is now much more common among college grads. So, too, is participation in civic groups. High-school seniors from affluent households are more likely to volunteer, join groups, go to church, and have strong academic ambitions than seniors used to be, and are as trusting of other people as seniors a generation ago; their peers from less affluent households have become less engaged on each of those fronts. A cultural chasm—which did not exist 40 years ago and which was still relatively small 20 years ago—has developed between the traditional middle class and the top 30 percent of society.

The interplay of economic and cultural forces is complex, and changes in cultural norms cannot be ascribed exclusively to the economy. Wilcox has tried to statistically parse the causes of the changes he has documented, concluding that about a third of the class-based changes in marriage patterns, for instance, are directly attributable to wage stagnation, increased job insecurity, or bouts of unemployment; the rest he attributes to changes in civic and religious participation and broader changes in attitudes among the middle class.

In fact, all of these variables seem to reinforce each other. Nonetheless, some of the most significant cultural changes within the middle class have accelerated in the past decade, as the prospects of the nonprofessional middle class have dimmed. The number of couples who live together but are not married, for instance, has been rising briskly since the 1970s, but it really took off in the aughts—nearly doubling, from 3.8 million to 6.7 million, from 2000 to 2009. From 2009 to 2010, that number jumped by nearly a million more. In six out of 10 of the newly cohabitating couples, at least one person was not working, a much higher proportion than in the past.

Ultimately, the evolution of the meritocracy itself appears to be at least partly responsible for the growing cultural gulf between highly educated Americans and the rest of society. As the journalist Bill Bishop showed in his 2008 book, The Big Sort, American communities have become ever more finely sorted by affluence and educational attainment over the past 30 years, and this sorting has in turn reinforced the divergence in the personal habits and lifestyle of Americans who lack a college degree from those of Americans who have one. In highly educated communities, families are largely intact, educational ideals strong, and good role models abundant. None of those things is a given anymore in communities where college-degree attainment is low. The natural leaders of such communities—the meritocratic winners who do well in school, go off to selective colleges, and get their degrees—generally leave them for good in their early 20s.

In their 2009 book, Creating an Opportunity Society, Ron Haskins and Isabel Sawhill write that while most Americans believe that opportunity is widespread in the United States, and that success is primarily a matter of individual intelligence and skill, the reality is more complicated. In recent decades, people born into the middle class have indeed moved up and down the class ladder readily. Near the turn of the millennium, for instance, middle-aged people who’d been born to middle-class parents had widely varied incomes. But class was stickier among those born to parents who were either rich or poor. Thirty-nine percent of children born to parents in the top fifth of earners stayed in that same bracket as adults. Likewise, 42 percent of those whose parents were in the bottom fifth remained there themselves. Only 6 percent reached the top fifth: rags-to-riches stories were extremely rare.

A thinner middle class, in itself, means fewer stepping stones available to people born into low-income families. If the economic and cultural trends under way continue unabated, class mobility will likely decrease in the future, and class divides may eventually grow beyond our ability to bridge them.

What is most worrying is that all of the most powerful forces pushing on the nonprofessional middle class—economic and cultural—seem to be pushing in the same direction. We cannot know the future, and over time, some of these forces may dissipate of their own accord. Further advances in technology may be less punishing to middle-skill workers than recent advances have been; men may adapt better to a post-industrial economy, as the alternative to doing so becomes more stark; nonprofessional families may find a new stability as they accommodate themselves to changing norms of work, income, and parental roles. Yet such changes are unlikely to occur overnight, if they happen at all. Momentum alone suggests years of trouble for the middle class.


Changing the Path of the American Economy


True recovery from the Great Recession is not simply a matter of jolting the economy back onto its former path; it’s about changing the path. No single action or policy prescription can fix the varied problems facing the middle class today, but through a combination of approaches—some aimed at increasing the growth rate of the economy itself, and some at ensuring that more people are able to benefit from that growth—we can ameliorate them. Many of the deepest economic trends that the recession has highlighted and temporarily sped up will take decades to fully play out. We can adapt, but we have to start now.

The rest of this article suggests how we might do so. The measures that I propose are not comprehensive, nor are they without drawbacks. But they are emblematic of the types of proposals we will need to weigh in the coming years, and of the nature of the national conversation we need to have. That conversation must begin with a reassessment of how globalization is affecting American society, and of what it will take for the U.S. to thrive in a rapidly changing world.

In 2010, the McKinsey Global Institute released a report detailing just how mighty America’s multinational companies are—and how essential they have become to the U.S. economy. Multinationals headquartered in the U.S. employed 19 percent of all private-sector workers in 2007, earned 25 percent of gross private-sector profits, and paid out 25 percent of all private-sector wages. They also accounted for nearly three-quarters of the nation’s private-sector R&D spending. Since 1990, they’ve been responsible for 31 percent of the growth in real GDP.

Yet for all their outsize presence, multinationals have been puny as engines of job creation. Over the past 20 years, they have accounted for 41 percent of all gains in U.S. labor productivity—but just 11 percent of private-sector job gains. And in the latter half of that period, the picture grew uglier: according to the economist Martin Sullivan, from 1999 through 2008, U.S. multinationals actually shrank their domestic workforce by about 1.9 million people, while increasing foreign employment by about 2.4 million.

The heavy footprint of multinational companies is merely one sign of how inseparable the U.S. economy has become from the larger global economy—and these figures neatly illustrate two larger points. First, we can’t wish away globalization or turn our backs on trade; to try to do so would be crippling and impoverishing. And second, although American prosperity is tied to globalization, something has nonetheless gone wrong with the way America’s economy has evolved in response to increasingly dense global connections.

Particularly since the 1970s, the United States has placed its bets on continuous innovation, accepting the rapid transfer of production to other countries as soon as goods mature and their manufacture becomes routine, all with the idea that the creation of even newer products and services at home will more than make up for that outflow. At times, this strategy has paid off big. Rapid innovation in the 1990s allowed the economy to grow quickly and create good, new jobs up and down the ladder to replace those that were becoming obsolete or moving overseas, and enabled strong income growth for most Americans. Yet in recent years, that process has broken down.

One reason, writes the economist Michael Mandel, is that America no longer enjoys the economic fruits of its innovations for as long as it used to. Knowledge, R&D, and business know-how depreciate more quickly now than they did even 15 years ago, because global communication is faster, connections are more seamless, and human capital is more broadly diffused than in the past.

As a result, domestic production booms have ended sooner than they used to. IT-hardware production, for instance, which in 1999 the Bureau of Labor Statistics projected would create about 155,000 new jobs in the U.S. over the following decade, actually shrank by nearly 500,000 jobs in that time. Jobs in data processing also fell, presumably as a result of both offshoring and technological advance. Because innovations now depreciate faster, we need more of them than we used to in order to sustain the same rate of economic growth.

Yet in the aughts, as an array of prominent economists and entrepreneurs have recently pointed out, the rate of big innovations actually slowed considerably; with the housing bubble fueling easy growth for much of that time, we just didn’t notice. This slowdown may have been merely the result of bad luck—big breakthroughs of the sort that create whole categories of products or services are difficult to predict, and long droughts are not unknown. Overregulation in certain areas may also have played a role. The economist Tyler Cowen, in his recent book, The Great Stagnation, argues that the scientific frontier itself—or at least that portion of it leading to commercial innovation—has been moving outward more slowly, and requiring ever more resources to do so, for many decades.

Process innovation has been quite rapid in recent years. U.S. multinationals and other companies are very good at continually improving their operational efficiency by investing in information technology, restructuring operations, and shifting work around the globe. Some of these activities benefit some U.S. workers, by making the jobs that stay in the country more productive. But absent big breakthroughs that lead to new products or services—and given the vast reserves of low-wage but increasingly educated labor in China, India, and elsewhere—rising operational efficiency hasn’t been a recipe for strong growth in either jobs or wages in the United States.

America has huge advantages as an innovator. Places like Silicon Valley, North Carolina’s Research Triangle, and the Massachusetts high-tech corridor are difficult to replicate, and the United States has many of them. Foreign students still flock here, and foreign engineers and scientists who get their doctorates here have been staying on for longer and longer over the past 15 years. When you compare apples to apples, the United States still leads the world, handily, in the number of skilled engineers, scientists, and business professionals in residence.

But we need to better harness those advantages to speed the pace of innovation, in part by putting a much higher national priority on investment—rather than consumption—in the coming years. That means, among other things, substantially raising and broadening both national and private investment in basic scientific progress and in later-stage R&D—through a combination of more federal investment in scientific research, perhaps bigger tax breaks for private R&D spending, and a much lower corporate tax rate (and a simpler corporate tax code) overall.

Edmund Phelps and Leo Tilman, professors at Columbia University, have proposed the creation of a National Innovation Bank that would invest in, or lend to, innovative start-ups—bringing more money to bear than venture-capital funds could, and at a lower cost of capital, which would promote more investment and enable the funding of somewhat riskier ventures. The broader idea behind such a bank is that because innovation carries so many ambient benefits—from job creation to the experience gained by even failed entrepreneurs and the people around them—we should be willing to fund it more liberally as a society than private actors would individually.

Removing bureaucratic obstacles to innovation is as important as pushing more public funds toward it. As Wall Street has amply demonstrated, not every industry was overregulated in the aughts. Nonetheless, the decade did see the accretion of a number of regulatory measures that may have chilled the investment climate (the Sarbanes-Oxley accounting reforms and a proliferation of costly security regulations following the creation of the Department of Homeland Security are two prominent examples).

Regulatory balance is always difficult in practice, but Michael Mandel has suggested a useful rule of thumb: where new and emerging industries are concerned—industries that are at the forefront of the economy and could provide big bursts of growth—our bias should be toward light regulation, allowing creative experimentation and encouraging fast growth. The rapid expansion of the Internet in the 1990s is a good example of the benefit that can come from a light regulatory hand early in an industry’s development; green technology, wireless platforms, and social-networking technologies are perhaps worthy of similar treatment today.

Any serious effort to accelerate innovation would mean taking many other actions as well—from redoubling our commitment to improving U.S. schools, to letting in a much larger number of creative, highly skilled immigrants each year. Few such measures will be without costs or drawbacks. Among other problems, a mandate of light regulation on high-potential industries requires the government to “pick winners.” Tilting government spending toward investment and innovation probably means tilting it away from defense and programs aimed at senior citizens. And because the benefits of innovation diffuse more quickly now, the return on national investment in scientific research and commercial innovation may be lower than it was in previous decades. Despite these drawbacks and trade-offs, the alternative to heavier investment and a higher priority on national innovation is dismal to contemplate.

As we strive toward faster innovation, we also need to keep the production of new, high-value goods within American borders for a longer period of time. Protectionist measures are generally self-defeating, and while vigilance against the theft of intellectual property and strong sanctions when such theft is discovered are sensible, they are unlikely to alter the basic trends of technological and knowledge diffusion. (Much of that diffusion is entirely legal, and the long history of industrialization and globalization suggests that attempts to halt it will fail.) What can really matter is a fair exchange rate. Throughout much of the aughts and continuing to the present day, China, in particular, has taken extraordinary measures to keep its currency undervalued relative to the dollar, and this has harmed U.S. industry. We must press China on currency realignment, putting sanctions on the table if necessary.

Given some of the workforce trends of the past decade, doubling down on technology, innovation, and globalization may seem wrongheaded. And indeed, this strategy is no cure-all. But without a vibrant, innovative economy, all other prospects dim. For the professional middle class in particular, an uptick in innovation and a return to faster economic growth would solve many problems, and likely reignite income growth. While technology is eating into the work that some college graduates do, their general skills show little sign of losing value. Recent analysis by the McKinsey Global Institute, for instance, indicates that demand for college grads by American businesses is likely to grow quickly over the next decade even if the economy grows very slowly; rapid economic growth would cause demand for college grads to far exceed supply.

Still, even in boom times, many more people than we would care to acknowledge won’t have the education, skills, or abilities to prosper in a pure and globalized market, shaped by enormous labor reserves in China, India, and other developing countries. Over the next decade or more, even if national economic growth is strong, what we do to help and support moderately educated Americans may well determine whether the United States remains a middle-class country.



Filling the Hole in the Middle Class


In The Race Between Education and Technology, the economists Claudia Goldin and Lawrence Katz write that throughout roughly the first three-quarters of the 20th century, most Americans prospered and inequality fell because, although technological advance was rapid—and mostly biased toward people with relatively high skills—educational advance was faster still; the pool of people who could take advantage of new technologies kept growing larger, while the pool of those who could not stayed relatively small.

There would be no better tonic for the country’s recent ills than a resumption of the rapid advance of skills and abilities throughout the population. Clearly there is room for improvement. About 30 percent of young adults finish college today, yet that figure is 50 percent among those with affluent parents. It follows that with improvements in the K–12 school system, more-stable home environments, and widespread financial access to college, we eventually could move to a 50 percent college graduation rate overall. And because IQ worldwide has been slowly increasing from generation to generation—a somewhat mysterious development known as the “Flynn effect”—higher rates still may eventually come within reach.

Yet the past three decades of experience suggest that this upward migration, even to, say, 40 percent, will be slow and difficult. (From 1979 to 2009, the percentage of people ages 25 to 29 with a four-year college degree rose from 23.1 percent to 30.6 percent—or roughly 1 percentage point every four years.) And ultimately, of course, the college graduation rate is likely to hit a substantially lower ceiling than that for high school or elementary school. For a time, elementary school was the answer to the question of how to build a broad middle class in America. And for a time after that, the answer was high school. College may never provide as comprehensive an answer. At the very least, over the next decade or two, college education simply cannot be the whole answer to the woes of the middle class, since even under the rosiest of assumptions, most of the middle of society will not have a four-year college degree.

Among the more pernicious aspects of the meritocracy as we now understand it in the United States is the equation of merit with test-taking success, and the corresponding belief that those who struggle in the classroom should expect to achieve little outside it. Progress along the meritocratic path has become measurable from a very early age. This is a narrow way of looking at human potential, and it badly underserves a large portion of the population. We have beaten the drum so loudly and for so long about the centrality of a college education that we should not be surprised when people who don’t attend college—or those who start but do not finish—go adrift at age 18 or 20. Grants, loans, and tax credits to undergraduate and graduate students total roughly $160 billion each year; by contrast, in 2004, federal, state, and local spending on employment and training programs (which commonly assist people without a college education) totaled $7 billion—an inflation-adjusted decline of about 75 percent since 1978.

As we continue to push for better K–12 schooling and wider college access, we also need to build more paths into the middle class that do not depend on a four-year college degree. One promising approach, as noted by Haskins and Sawhill, is the development of “career academies”—schools of 100 to 150 students, within larger high schools, offering a curriculum that mixes academic coursework with hands-on technical courses designed to build work skills. Some 2,500 career academies are already in operation nationwide. Students attend classes together and have the same guidance counselors; local employers partner with the academies and provide work experience while the students are still in school.

“Vocational training” programs have a bad name in the United States, in part because many people assume they close off the possibility of higher education. But in fact, career-academy students go on to earn a postsecondary credential at the same rate as other high-school students. What’s more, they develop firmer roots in the job market, whether or not they go on to college or community college. One recent major study showed that on average, men who attended career academies were earning significantly more than those who attended regular high schools, both four and eight years after graduation. They were also 33 percent more likely to be married and 36 percent less likely to be absentee fathers.

Career-academy programs should be expanded, as should apprenticeship programs (often affiliated with community colleges) and other, similar programs that are designed to build an ethic of hard work; to allow young people to develop skills and achieve goals outside the traditional classroom as well as inside it; and ultimately to provide more, clearer pathways into real careers. By giving young people more information about career possibilities and a tangible sense of where they can go in life and what it takes to get there, these types of programs are likely to lead to more-motivated learning, better career starts, and a more highly skilled workforce. Their effect on boys in particular is highly encouraging. And to the extent that they can expose boys to opportunities within growing fields like health care (and also expose them to male role models within those fields), these programs might even help weaken the grip of the various stereotypes that seem to be keeping some boys locked into declining parts of the economy.

Even in the worst of scenarios, “middle skill” jobs are not about to vanish altogether. Many construction jobs and some manufacturing jobs will return. And there are many, many middle-income occupations—from EMTs, lower-level nurses, and X-ray technicians, to plumbers and home remodelers—that trade and technology cannot readily replace, and these fields are likely to grow. A more highly skilled workforce will allow faster, more efficient growth; produce better-quality goods and services; and earn higher pay.

All of that said, the overall pattern of change in the U.S. labor market suggests that in the next decade or more, a larger proportion of Americans may need to take work in occupations that have historically required little skill and paid low wages. Analysis by David Autor indicates that from 1999 to 2007, low-skill jobs grew substantially as a share of all jobs in the United States. And while the lion’s share of jobs lost during the recession were middle-skill jobs, job growth since then has been tilted steeply toward the bottom of the economy; according to a survey by the National Employment Law Project, three-quarters of American job growth in 2010 came within industries paying, on average, less than $15 an hour. One of the largest challenges that Americans will face in the coming years will be doing what we can to make the jobs that have traditionally been near the bottom of the economy better, more secure, and more fulfilling—in other words, more like middle-class jobs.

As the urban theorist Richard Florida writes in The Great Reset, part of that process may be under way already. A growing number of companies have been rethinking retail-workforce development, to improve productivity and enhance the customer experience, leading to more-enjoyable jobs and, in some cases, higher pay. Whole Foods Market, for instance, one of Fortune magazine’s “Best Companies to Work For,” organizes its workers into teams and gives them substantial freedom as to how they go about their work; after a new worker has been on the job for 30 days, the team members vote on whether the new employee has embraced the job and the culture, and hence whether he or she should be kept on. Best Buy actively encourages all its employees to suggest improvements to the company’s work processes, much as Toyota does, and favors promotion from within. Trader Joe’s sets wages so that full-time employees earn at least a median income within their community; store captains, most of them promoted from within, can earn six figures.

The natural evolution of the economy will surely make some service jobs more productive, independent, and enjoyable over time. Yet productivity improvements at the bottom of the economy seem unlikely to be a sufficient answer to the problems of the lower and middle classes, at least for the foreseeable future. Indeed, the relative decline of middle-skill jobs, combined with slow increases in college completion, suggests a larger pool of workers chasing jobs in retail, food preparation, personal care, and the like—and hence downward pressure on wages.

Whatever the unemployment rate over the next several years, the long-term problem facing American society is not that employers will literally run out of work for people to do—it’s that the market value of much low-skill and some middle-skill work, and hence the wages employers can offer, may be so low that few American workers will strongly commit to that work. Bad jobs at rock-bottom wages are a primary reason why so many people at the lower end of the economy drift in and out of work, and this job instability in turn creates highly toxic social and family problems.

American economists on both the right and the left have long advocated subsidizing low-wage work as a means of social inclusion—offering an economic compact with everyone who embraces work, no matter their level of skill. The Earned Income Tax Credit, begun in 1975 and expanded several times since then, does just that, and has been the country’s best anti-poverty program. Yet by and large, the EITC helps only families with children. In 2008, it provided a maximum credit of nearly $5,000 to families with two children, with the credit slowly phasing out for incomes above $15,740 and disappearing altogether at $38,646. The maximum credit for workers without children (or without custody of children) was only $438. We should at least moderately increase both the level of support offered to families by the EITC and the maximum income to which it applies. Perhaps more important, we should offer much fuller support for workers without custody of children. That’s a matter of basic fairness. But it’s also a measure that would directly target some of the biggest budding social problems in the United States today. A stronger reward for work would encourage young, less-skilled workers—men in particular—to develop solid, early connections to the workforce, improving their prospects. And better financial footing for young, less-skilled workers would increase their marriageability.
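To make the phase-out arithmetic above concrete, here is a minimal sketch in Python of how a credit shaped like the 2008 two-child EITC described in this paragraph could be computed. It is an illustration, not the actual IRS schedule: the maximum-credit figure is a stand-in for the “nearly $5,000” cited above, the phase-out is assumed to be linear between the $15,740 and $38,646 thresholds, and the phase-in range that applies at very low incomes is omitted.

    # Illustrative sketch of the 2008 two-child EITC phase-out described above.
    # Assumptions: a stand-in maximum credit, a linear phase-out between the two
    # thresholds, and no phase-in range at very low incomes.

    MAX_CREDIT = 4800.0        # stand-in for the "nearly $5,000" maximum cited above
    PHASEOUT_START = 15740.0   # income at which the credit begins to shrink
    PHASEOUT_END = 38646.0     # income at which the credit disappears entirely

    def approx_eitc(income):
        """Return the approximate credit for a two-child family at a given income."""
        if income <= PHASEOUT_START:
            return MAX_CREDIT
        if income >= PHASEOUT_END:
            return 0.0
        # Assumed linear phase-out between the two thresholds.
        fraction_remaining = (PHASEOUT_END - income) / (PHASEOUT_END - PHASEOUT_START)
        return round(MAX_CREDIT * fraction_remaining, 2)

    if __name__ == "__main__":
        for income in (12000, 20000, 30000, 40000):
            print(income, approx_eitc(income))

The point of the sketch is simply to show how the credit tapers with income: raising the maximum credit or the income ceiling, as suggested above, widens the band of low-wage workers who see a meaningful reward for staying attached to the workforce.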

A continued push for better schooling, the creation of clearer paths into careers for people who don’t immediately go to college, and stronger support for low-wage workers—together, these measures can help mitigate the economic cleavage of U.S. society, strengthening the middle. They would hardly solve all of society’s problems, but they would create the conditions for more-predictable and more-comfortable lives—all harnessed to continuing rewards for work and education. These, ultimately, are the most-critical preconditions for middle-class life and a healthy society.


The Limits of Meritocracy


As a society, we should be far more concerned about whether most Americans are getting ahead than about the size of the gains at the top. Yet extreme income inequality causes a cultural separation that is unhealthy on its face and corrosive over time. And the most-powerful economic forces of our times will likely continue to concentrate wealth at the top of society and to put more pressure on the middle. It is hard to imagine an adequate answer to the problems we face that doesn’t involve greater redistribution of wealth.

Soaking the rich would hardly solve all of America’s problems. Holding all else equal, we would need to raise the top two tax rates to roughly 90 percent, then unrealistically assume no change in the work habits of the people in those brackets, merely to bring the deficit in a typical year down to 2 percent of GDP. But even with strong budget discipline and a reduction in the growth of Medicare costs, somewhat higher taxes for most Americans—in one form or another—seem inevitable. If we aim to increase our national investment in innovation, and to provide more assistance to people who are falling out of the middle class (or who can’t step up into it), that’s even more true. The professional middle class in particular should not expect exemption from tax increases.

Over time, the United States has expected less and less of its elite, even as society has oriented itself in a way that is most likely to maximize their income. The top income-tax rate was 91 percent in 1960, 70 percent in 1980, 50 percent in 1986, and 39.6 percent in 2000, and is now 35 percent. Income from investments is taxed at a rate of 15 percent. The estate tax has been gutted.

High earners should pay considerably more in taxes than they do now. Top tax rates of even 50 percent for incomes in the seven-figure range would still be considerably lower than their level throughout the boom years of the post-war era, and should not be out of the question—nor should an estate-tax rate of similar size, for large estates.

The rich have not become that way while living in a vacuum. Technological advance, freer trade, and wider markets—along with the policies that promote them—always benefit some people and harm others. Economic theory is quite clear that the winners gain more than the losers lose, and therefore the people who suffer as a result of these forces can be fully compensated for their losses—society as a whole still gains. This precept has guided U.S. government policy for 30 years. Yet in practice, the losers are seldom compensated, not fully and not for long. And while many of the gains from trade and technological progress are widely spread among consumers, the pressures on wages that result from these same forces have been felt very differently by different classes of Americans.

What’s more, some of the policies that have most benefited the rich have little to do with greater competition or economic efficiency. Fortunes on Wall Street have grown so large in part because of implicit government protection against catastrophic losses, combined with the steady elimination of government measures to limit excessive risk-taking, from the 1980s right on through the crash of 2008.

As America’s winners have been separated more starkly from its losers, the idea of compensating the latter out of the pockets of the former has met stiff resistance: that would run afoul of another economic theory, dulling the winners’ incentives and squashing their entrepreneurial spirit; some, we are reminded, might even leave the country. And so, in a neat and perhaps unconscious two-step, many elites have pushed policies that benefit them, by touting theoretical gains to society—then ruled out measures that would distribute those gains widely.

Even as we continue to strive to perfect the meritocracy, signs that things may be moving in the other direction are proliferating. The increasing segregation of Americans by education and income, and the widening cultural divide between families with college-educated parents and those without them, suggests that built-in advantages and disadvantages may be growing. And the concentration of wealth in relatively few hands opens the possibility that much of the next generation’s elite might achieve their status through inheritance, not hard or innovative work.

America remains a magnet for talent, for reasons that go beyond the tax code; and by international standards, none of the tax changes recommended here would create an excessive tax burden on high earners. If a few financiers choose to decamp for some small island-state in search of the smallest possible tax bill, we should wish them good luck.

In political speeches and in the media, the future of the middle class is often used as a stand-in for the future of America. Yet of course the two are not identical. The size of the middle class has waxed and waned throughout U.S. history, as has income inequality. The post-war decades of the 20th century were unusually hospitable to the American middle class—the result of strong growth, rapid gains in education, progressive tax policy, limited free agency at work, a limited pool of competing workers overseas, and other supportive factors. Such serendipity is anomalous in American history, and unlikely to be repeated.

Yet if that period was unusually kind to the middle class, the one we are now in the midst of appears unusually cruel. The strongest forces of our time are naturally divisive; absent a wide-ranging effort to constrain them, economic and cultural polarization will almost surely continue. Perhaps the nonprofessional middle class is rich enough today to absorb its blows with equanimity. Perhaps plutonomy, in the 21st century, will prove stable over the long run.

But few Americans, no matter their class, will be eager for that outcome.

Origin
Source: The Atlantic
