we keep messing it up, that maybe our communication media are such spam-filled, dick-pic-laden, Nazi-promoting cesspools because we’re somehow doing them wrong. This chapter isn’t about the incredible promise of consciousness expansion that our new communication technologies, often quite justifiably, fuel. It is about our disappointment with them and the uses to which that disappointment has been put.

Marshall McLuhan appears to have been critical of what came to be popularized as the Shannon-Weaver model of communication, as well as of Norbert Wiener’s cybernetics, because they paid insufficient attention to “how people are changed by the instruments they employ.” In a strange way, McLuhan and Claude Shannon predicted the two central features that define our twenty-first-century media landscape. Shannon pointed out that by the management of redundancy, almost any content could be beamed across the planet. And McLuhan sensed that, because people would be producing, receiving, and enjoying that content, they would immediately load up every available channel with redundancy. In other words: we can communicate better, and therefore we will actually communicate worse.
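Shannon’s point about redundancy can be made concrete with a toy sketch, offered purely as an illustration rather than as anything Shannon or McLuhan wrote: the short Python functions below (encode, noisy_channel, and decode are hypothetical names, and the repetition scheme is the crudest one available) repeat each bit of a message several times, push it through a channel that randomly flips bits, and recover the original by majority vote.

    import random

    def encode(bits, r=5):
        # Add redundancy: repeat each bit r times (a simple repetition code).
        return [b for b in bits for _ in range(r)]

    def noisy_channel(bits, flip_prob=0.1):
        # Flip each transmitted bit with probability flip_prob.
        return [b ^ 1 if random.random() < flip_prob else b for b in bits]

    def decode(bits, r=5):
        # Recover each original bit by majority vote over its r copies.
        return [1 if sum(bits[i:i + r]) > r // 2 else 0
                for i in range(0, len(bits), r)]

    message = [1, 0, 1, 1, 0, 0, 1, 0]
    received = decode(noisy_channel(encode(message)))
    print(message == received)  # usually True; raise r to make failures rarer

Shannon’s coding theorems promise far more efficient schemes than brute repetition, but the principle is the same: spend extra, carefully managed bits to survive the noise.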

This isn’t exactly a new problem. In his book Speaking into the Air, John Durham Peters argues that “communication” has always been a concept that brims with potential, a potential that actual acts of communication nearly always fall short of. Behind the concept lies an almost mystical fantasy of perfect transparency, community, and directness, one that draws on imagery of religious visions and divine inspiration. Though the word is quite old, the concept of communication became compelling to philosophers and theorists only once it was both imperative that messages travel with little distortion and clear that they very rarely did. The concept designates, as Peters puts it, both a bridge and a barrier. Or, put another way, communication was often expected to solve the very problems that communication had created in the first place.

This has allied both the problem and the promise of communication with technological progress. The more networked we become, for instance, the more abuse of our systems of communication becomes a dangerous issue. Fake news on social media matters a great deal more than, say, a monk writing a fake chronicle in twelfth-century England, or someone drawing a slanderous cartoon in eighteenth-century France. But the fact that discourse about communication has traditionally pulled from mystical or religious language has allowed the media of the internet age to hide behind a convenient sense of disappointment—a dodge that has shadowed acts of communication since well before Huxley took his first gulp of mescaline. As a result, we aren’t able to communicate very well about our systems of communication.

If you require documentation for that claim, simply ask @jack—the Twitter CEO, Jack Dorsey—about banning neo-Nazis from his platform. You’ll get back an ever-changing cloud of verbiage, abuzz with ideals and high hopes. He’ll elide the fact that in countries where showing certain content would expose Twitter to legal liability, the company is perfectly happy to let those ideals and hopes be damned and get busy censoring. He’ll elide the incredibly tricky and deeply political choices his company makes to decide what content to take down. He’ll even elide exactly how this is done. At most, you get a sense of profound disappointment: We built you kids this amazing toy, and all you can think to do with it is be Nazis or call each other Nazis. This, as people so often remark on @jack’s platform, is why we can’t have nice things.

This space of disappointment is one that the right and the left, capitalists and their Marxist critics, largely occupy together, at times quite amicably. The company Palantir Technologies is widely regarded as one of the more dangerous in Silicon Valley when it comes to the possible violation of civil rights and threats to free speech. It creates technologies that aggregate and cross-reference massive data banks and try to predict threats to national security or whether, you know, an individual is an undocumented immigrant. And independent of whether you like the idea of the NSA or the FBI having access to such technologies, other companies have already created similar tools for far more authoritarian governments.

Two of the founders of Palantir are Peter Thiel and Alex Karp. Karp is a rarity among Silicon Valley CEOs, as he has a Ph.D. in social theory, having studied with Jürgen Habermas. (His dissertation adviser was another Frankfurt professor, Karola Brede, who isn’t nearly as well-known and whom Karp usually doesn’t mention.) This is often noted as something of a contradiction: Karp likes to invoke Habermas, one of the great theorists of the liberal order and of rights and transparency, yet he now builds technologies widely seen as being deeply dangerous to all of those things. Palantir Technologies, after all, is named after the great seeing stones of Tolkien’s The Lord of the Rings—like them, its technology is meant to allow the powerful to see what others cannot. That kind of imbalance contradicts the central idea of a public sphere, which, according to Habermas, we are all supposed to enter as equals. And yet, it isn’t as though Habermas were altogether bullish on the public sphere. In his great book The Structural Transformation of the Public Sphere (1962), he instead traces a gradual decline of the public sphere under pressure from mass media and consumer culture. While Habermas is not a cultural pessimist, he uses a narrative of decline characteristic of cultural pessimism. The public sphere is an ideal, and we’ve spent the last two hundred years falling increasingly short of it.

Karp’s dissertation uses the work of the sociologist Talcott Parsons to analyze what he calls “jargon”—speech that is used more for the feelings it engenders and transports in certain quarters than for its informational content; it’s language that, in a sense, makes its home in the space between the promises of the public sphere and its actuality. His example is a speech by the German author Martin Walser complaining about a supposed social compulsion in Germany to constantly refer back to the Holocaust.
