Democracy Gone Astray

Democracy, being a human construct, needs to be thought of as directionality rather than an object. As such, understanding it requires not so much a description of existing structures and related phenomena as a declaration of intentionality.
This blog aims to create labeled lists of published infringements of that intentionality: points in time where democracy strays from its intended direction. In addition to outright infringements, this blog also collects important contemporary information and discussions that impact our socio-political landscape.

All the posts here were published in electronic media, mainstream as well as fringe, and maintain links to the original texts.

[NOTE: Due to changes in the blogging software that I didn't catch in time, all of the 'Original Article' links were nullified between September 11, 2012 and December 11, 2012. My apologies.]

Saturday, February 07, 2015

The Moral Hazard of Big Data

If you are a person in America, then there are equations trying to learn more about you. Some of these equations work for private companies and some of them work for the government, but they all generate correlations based on your behavior. Google “Ways to keep New Year's resolutions” and buy a sweatband on Amazon, and your Facebook ads all turn to gym memberships. Search wikiHow for “Join al-Nusra Front” and buy a hunting knife at Target, and alarms go off at the NSA. Interacting with the world now involves an implicit agreement to be watched, and not just by surveillance cameras, global positioning satellites, and browser cookies, but increasingly by algorithms designed to predict and manage our future conduct.

University of Maryland law professor Frank Pasquale’s new book The Black Box Society is a tour of how computational intelligence has come to dominate three important parts of American life: reputation, search, and finance. Pasquale is invoking a couple of different concepts with the title. Like a black box on an airplane, these algorithms take in information from the noise around them; like a black box in computer science, they are hidden systems, observable from the outside only through their inputs and outputs. But more like black holes, the algorithms are visible in their effects on their surroundings. Our economy—and the many vital life processes it manages—twists and turns based on the say-so of inscrutable mathematical processes.

Reputation in this context means more than what people say about us when we’re not there. Our every watchable choice, from shopping to clicking, paints part of a portrait. The computers know us as the pupil at the center of a giant Venn diagram, the intersection of an uncountable mass of circles, each one labeled “Middle child” or “Honey Bunches of Oats eater” or “Bisexual,” etc. (Rob Horning calls this identity we leak as we move through the digitally connected world the “data self.”) Retailers, advertisers, and data brokers nip around the edges, identifying meaningful correlations where they can. Often this is relatively harmless, or even helpful—as when Netflix suggests a new movie or Pandora a new song. Other times it reveals the clumsiness of its own methodology, as when Twitter inserts an ad for a black professionals’ dating service into my timeline. The algorithms don’t need to get it right every time, just often enough to make the correlation worthwhile in the aggregate.
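The Venn-diagram image can be made concrete in a few lines of code. Below is a minimal sketch of label-based targeting; the labels, campaigns, and match threshold are all hypothetical, invented for illustration rather than drawn from Pasquale's book or any real ad platform.

```python
# Minimal sketch of label-based ad targeting (hypothetical labels and
# thresholds; illustrates the "intersection of circles" idea, not any
# real platform's system).

user_profile = {"middle child", "Honey Bunches of Oats eater", "gym searcher"}

campaigns = {
    "gym membership": {"gym searcher", "sweatband buyer", "resolution maker"},
    "cereal coupon": {"Honey Bunches of Oats eater", "grocery app user"},
}

def matching_campaigns(profile, campaigns, min_overlap=1):
    """Return campaigns whose target labels overlap the profile.

    A low threshold is tolerable: the system is judged on aggregate
    yield, not on being right about any one person.
    """
    matches = []
    for name, targets in campaigns.items():
        overlap = profile & targets  # set intersection = the Venn overlap
        if len(overlap) >= min_overlap:
            matches.append((name, sorted(overlap)))
    return matches

print(matching_campaigns(user_profile, campaigns))
```

The design point is in the docstring: the correlation machine only has to pay off in the aggregate, which is exactly why its individual misses (the mistargeted dating-service ad) are a cost of doing business rather than a bug.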

But the digital reputation system’s subterranean operations can be downright vicious. Civil rights protections in hiring, housing, and credit are no match for algorithms that can guess your race or propensity for future pregnancy. Pasquale writes that “a surprising proportion of digital marketing is about finding marks for dubious loans, pharmaceutical products, and fly-by-night for-profit educators.” Gambling scammers target lists of recovering addicts, and, of course, there are the penis enlargement ads. Whether or not it’s in our interests, we are constantly telling computers the best ways to wring money and time (which is also money) out of us. We inform on ourselves, and we collectively provide the mass of data necessary for them to make guesses about people like us. It’s the “know your enemy” school of marketing, and we’re the enemy.

When it comes to search, Google isn’t a result, it’s who you ask in the first place. But the concept is much bigger than any would-be hegemonic firm. The Internet without a search engine is a whole different creation, like a telephone network without a phone book. Search enables us to connect to what we don’t already know, which is almost everything in the world. Pasquale likens Google’s search dominance to the spread of the English language: There’s nothing that makes it necessarily the best, but it was on top at an important moment and has been able to retain its spot. But Google’s owners (like English speakers) have interests that are not identical to those of the general population; Pasquale cites a handful of situations where their central algorithm seemed to reflect the company’s financial interests more than an “objective” measure of the best results. Google has endeavored to appear to consumers as a public utility, a benign if somewhat paternal presence, but their books aren’t open to us. Ranking algorithms are as arbitrary and temperamental as the corporate interests that refine them.
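Pasquale's point about arbitrariness is easy to see in miniature. A ranking function is, at bottom, a weighted score, and the weights are editorial choices. The toy sketch below uses entirely hypothetical features and weights (emphatically not Google's actual algorithm) to show how a small nudge to one coefficient quietly reorders results in the house's favor.

```python
# Toy ranking function: a weighted sum over hypothetical features.
# Not any real search engine's algorithm -- just an illustration that
# rankings are only as "objective" as the chosen weights.

results = [
    {"url": "independent-review.example", "relevance": 0.9, "house_property": 0.0},
    {"url": "our-own-service.example",    "relevance": 0.7, "house_property": 1.0},
]

def rank(results, w_relevance=1.0, w_house=0.0):
    score = lambda r: w_relevance * r["relevance"] + w_house * r["house_property"]
    return sorted(results, key=score, reverse=True)

print([r["url"] for r in rank(results)])               # relevance wins
print([r["url"] for r in rank(results, w_house=0.3)])  # a small nudge flips the order
```

Nothing in the output discloses that the second ranking was tilted; from the outside, both look like "the results."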

Compared to reputation and search, finance is the area where Americans are most aware of the dangers of algorithms run amok. It was algorithms that made it possible for banks to combine sub-prime mortgages into respectable-looking investments, and another set of rating algorithms that projected they would be safe places to put money. At its root, finance is supposed to be the part of the economy devoted to the efficient allocation of resources. The opportunity for profit keeps capital flowing from investors who have it to people who can make good use of it. But as the finance sector has developed, we see that these incentives alone don’t ensure good outcomes.
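One way to see how pooling made sub-prime debt look respectable is a toy Monte Carlo experiment. The sketch below uses invented numbers throughout (the default rates, the attachment point, and a crude "common shock" standing in for correlated defaults are my own assumptions, not any bank's or rating agency's model): if defaults are assumed independent, the senior slice of the pool almost never loses money; once defaults can move together, the safety evaporates.

```python
import random

# Monte Carlo sketch of tranching (toy numbers, not a real model):
# pool 100 risky loans; the "senior" tranche only loses money if more
# than 30% of the pool defaults.

def senior_loss_rate(trials=10_000, loans=100, p=0.2, attach=0.30, rho=0.0):
    """Fraction of trials in which defaults breach the attachment point.

    rho is the probability of a common shock that pushes every loan's
    default probability to 0.6 -- a crude stand-in for correlation.
    """
    hits = 0
    for _ in range(trials):
        p_eff = 0.6 if random.random() < rho else p
        defaults = sum(random.random() < p_eff for _ in range(loans))
        if defaults / loans > attach:
            hits += 1
    return hits / trials

random.seed(0)
print(senior_loss_rate(rho=0.0))  # near zero: the tranche "rates" as safe
print(senior_loss_rate(rho=0.2))  # correlated defaults expose the risk
```

The trick, in miniature: the "safe" rating is an artifact of the independence assumption, and the assumption lives inside the black box.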

Meanwhile financiers can hide the nature of their moves in impenetrable math. As Pasquale puts it, “Extreme complexity doesn’t merely anesthetize the public. It also extends an open invitation for quants or traders or managers to bully their way past gatekeepers, like rating agencies, accountants, and regulators.” Finance professionals have, like corrupt Party apparatchiks, used their central position in the nation’s system of resource allocation to direct those same resources straight into their own pockets. Hyper-complex algorithms have accelerated this process, allowing financiers to generate massive growth for themselves that is disconnected from growth for everyone else.

Pasquale devotes the final chapters of The Black Box Society to regulation and reform. Despite the demonstrated ability of corporations to effectively co-opt, evade, and neutralize government oversight in all three domains he analyzes, the author has a sort of resigned faith in socialist solutions. As a scholar of digital law, Pasquale offers a number of intriguing reforms, like immutable audit logs and a taxpayer-funded voucher system to support independent artists. They all feel a bit half-hearted. The underlying code for every black box is to generate profit. By design, the best the government can do is clean up afterward and try to ameliorate the worst consequences. Even these basic caretaking functions have corroded; Washington has its own black boxes, and they rely heavily on the private sector’s cooperation. This relationship is coziest in new “fusion centers,” where corporate and state data miners gather to share notes, protected by complementary excuses: trade secrets and national security.
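Of the reforms Pasquale mentions, the immutable audit log is the most concrete, and it has a well-known construction: a hash chain, in which each entry commits to the one before it, so any after-the-fact edit or deletion is detectable. The sketch below is one minimal way to build such a log; the design and the sample records are my own assumptions, since the book names the goal rather than an implementation.

```python
import hashlib
import json

# Hash-chained append-only log: a minimal sketch of one way to realize
# an "immutable audit log" (my own construction, not Pasquale's spec).

def entry_hash(prev_hash: str, record: dict) -> str:
    payload = prev_hash + json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class AuditLog:
    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []  # list of (record, hash) pairs

    def append(self, record: dict) -> str:
        prev = self.entries[-1][1] if self.entries else self.GENESIS
        h = entry_hash(prev, record)
        self.entries.append((record, h))
        return h

    def verify(self) -> bool:
        """Recompute the chain; any edited or deleted entry breaks it."""
        prev = self.GENESIS
        for record, h in self.entries:
            if entry_hash(prev, record) != h:
                return False
            prev = h
        return True

log = AuditLog()
log.append({"actor": "quant-desk-3", "action": "reprice", "asset": "MBS-17"})
log.append({"actor": "regulator", "action": "inspect"})
assert log.verify()
log.entries[0][0]["action"] = "nothing to see"  # tamper with history...
assert not log.verify()                         # ...and verification fails
```

The log is "immutable" only in the sense that tampering is detectable, and only if someone independent holds the chain's head hash, which is precisely the kind of third-party oversight the fusion-center arrangement forecloses.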

Pasquale does an excellent job making the case against these shady practices, but short of a coup by fair-minded digital law scholars, it’s not clear what safeguard mechanisms, if any, exist to put the country on a better course. In the near term, even withholding our cooperation from the black boxes is irrelevant: as long as enough people click “accept,” they can make close-enough guesses about the rest of us. The problem-problem-solution structure of books like The Black Box Society doesn’t give Pasquale much room to maneuver or imagine; he ends up suggesting that the answer to corporate-state cooperation is corporate-state cooperation.

The black box society means computers are playing a central role in our individual and social lives, but it also stands for particular consequences. “The interpenetration of state and business in finance and law enforcement,” Pasquale writes, “serves an ever narrower set of interests.” Sublime amounts of wealth, power, and information flow to a tiny and unaccountable minority. Don’t blame the computers; the broken feedback loop is hardwired. As long as we engineer profit machines, they will classify non-market concerns as obstacles, and they will continue to circumvent them. The claim that profit-seeking is the most efficient way to manage and develop society’s collective resources—human and material—is no longer plausible. But with the surfeit of information we have already elicited from each other, we have more than enough to begin work on a new program.

Original Article
Source: newrepublic.com/
Author: Malcolm Harris
