whom? The public affairs manager on the other end of the phone sounded confused. “You mean privacy?” No, I said, I wanted to know how Google thought about its editorial power. “Oh,” he replied, “we’re just trying to give people the most relevant information.” Indeed, he seemed to imply, no ethics were involved or required.

I persisted: If a 9/11 conspiracy theorist searches for “9/11,” was it Google’s job to show him the Popular Mechanics article that debunks his theory or the movie that supports it? Which was more relevant? “I see what you’re getting at,” he said. “It’s an interesting question.” But I never got a clear answer.

Much of the time, as the Jargon File entry claims, engineers resist the idea that their work has moral or political consequences at all. Many engineers see themselves as interested in efficiency and design, in building cool stuff, rather than in messy ideological disputes and inchoate values. And it's true that if there are political consequences of, say, a somewhat faster video-rendering engine, they're pretty obscure.

But at times, this attitude can verge on a "Guns don't kill people, people do" mentality—a willful blindness to how their design decisions affect the daily lives of millions. The fact that Facebook's button is named Like prioritizes some kinds of information over others. And Google's move from pure PageRank—designed to surface the societal consensus about which results matter—to a mix of PageRank and personalization represents a shift in how Google understands relevance and meaning.

This amorality would be par for the corporate course if it didn’t coincide with sweeping, world-changing rhetoric from the same people and entities. Google’s mission to organize the world’s information and make it accessible to everyone carries a clear moral and even political connotation—a democratic redistribution of knowledge from closed-door elites to the people. Apple’s devices are marketed with the rhetoric of social change and the promise that they’ll revolutionize not only your life but our society as well. (The famous Super Bowl ad announcing the release of the Macintosh computer ends by declaring that “1984 won’t be like 1984.”)

Facebook describes itself as a "social utility," as if it's a twenty-first-century phone company. But when users protest Facebook's constantly shifting and eroding privacy policy, Zuckerberg often shrugs it off with the caveat emptor posture that if you don't want to use Facebook, you don't have to. It's hard to imagine a major phone company getting away with saying, "We're going to publish your phone conversations for anyone to hear—and if you don't like it, just don't use the phone."

Google tends to be more explicitly moral in its public aspirations; its motto is "Don't be evil," while Facebook's unofficial motto is "Don't be lame." Nevertheless, Google's founders also sometimes play a get-out-of-jail-free card. "Some say Google is God. Others say Google is Satan," says Sergey Brin. "But if they think Google is too powerful, remember that with search engines, unlike other companies, all it takes is a single click to go to another search engine. People come to Google because they choose to. We don't trick them."

Of course, Brin has a point: No one is forced to use Google, just as no one is forced to eat at McDonald’s. But there’s also something troubling about this argument, which minimizes the responsibility he might have to the billions of users who rely on the service Google provides and in turn drive the company’s billions in advertising revenue.

To further muddle the picture, when the social repercussions of their work are troubling, the chief architects of the online world often fall back on the manifest-destiny rhetoric of technodeterminism. Technologists, Siva Vaidhyanathan points out, rarely say something "could" or "should" happen—they say it "will" happen. "The search engines of the future will be personalized," says Google Vice President Marissa Mayer, using the passive voice.

Just as some Marxists believed that the economic conditions of a society would inevitably propel it through capitalism and toward a world socialist regime, it’s easy to find engineers and technodeterminist pundits who believe that technology is on a set course. Sean Parker, the cofounder of Napster and rogue early president of Facebook, tells Vanity Fair that he’s drawn to hacking because it’s about “re-architecting society. It’s technology, not business or government, that’s the real driving force behind large-scale societal shifts.”

Kevin Kelly, the founding editor of Wired, wrote perhaps the boldest book articulating the technodeterminist view, What Technology Wants, in which he posits that technology is a “seventh kingdom of life,” a kind of meta-organism with desires and tendencies of its own. Kelly believes that the technium, as he calls it, is more powerful than any of us mere humans. Ultimately, technology—a force that “wants” to eat power and expand choice—will get what it wants whether we want it to or not.

Technodeterminism is alluring and convenient for newly powerful entrepreneurs because it absolves them of responsibility for what they do. Like priests at the altar, they’re mere vessels of a much larger force that it would be futile to resist. They need not concern themselves with the effects of the systems they’ve created. But technology doesn’t solve every problem of its own accord. If it did, we wouldn’t have millions of people starving to death in a world with an oversupply of food.

It shouldn’t be surprising that software entrepreneurs are incoherent about their social and political responsibilities. A great deal of this tension undoubtedly comes from the fact that the nature of online business is to scale up as quickly as possible. Once you’re on the road to mass success and riches—often as a very young coder—there simply isn’t much time to fully think all of this through. And the pressure of the venture capitalists breathing down your neck to “monetize” doesn’t always offer much space for rumination on social responsibility.

The $50 Billion Sand Castle

Once a year, the Y Combinator start-up incubator hosts a daylong conference called Startup School, where successful tech entrepreneurs pass wisdom on to the aspiring audience of bright-eyed Y Combinator investees. The agenda typically includes many of the top CEOs in Silicon Valley, and in 2010, Mark Zuckerberg was at the top of the list.

Zuckerberg was in an affable mood, dressed in a black T-shirt and jeans and enjoying what was clearly a friendly crowd. Even so, when Jessica Livingston, his interviewer, asked him about The Social Network, the movie that had made him a household name, a range of emotions crossed his face. “It’s interesting what kind of stuff they focused on getting right,” Zuckerberg began. “Like, every single shirt and fleece they had in that movie is actually a shirt or fleece that I own.”

Where there was an egregious discrepancy between fiction and reality, Zuckerberg told her, was how his own motivations were painted. “They frame it as if the whole reason for making Facebook and building something was that I wanted to get girls, or wanted to get into some kind of social institution. And the reality, for people who know me, is that I’ve been dating the same girl since before I started Facebook. It’s such a big disconnect…. They just can’t wrap their head around the idea that someone might build something because they like building things.”

It’s entirely possible that the line was just a clever bit of Facebook PR. And there’s no question that the twenty-six-year-old billionaire is motivated by empire building. But the comment struck me as candid: For programmers as for artists and craftsmen, making things is often its own best reward.

Facebook’s flaws and its founder’s ill-conceived views about identity aren’t the result of an antisocial, vindictive mind-set. More likely, they’re a natural consequence of the odd situation successful start-ups like Facebook create, in which a twenty-something guy finds himself, in a matter of five years, in a position of great authority over the doings of 500 million human beings. One day you’re making sand castles; the next, your sand castle is worth $50 billion and everyone in the world wants a piece of it.

Of course, there are far worse business-world personality types with whom to entrust the fabric of our social lives. With a reverence for rules, geeks tend to be principled—to carefully consider the rules they set for themselves and to stick to them under social pressure. "They have a somewhat skeptical view of authority," Stanford professor Terry Winograd said of his former students Page and Brin. "If they see the world going one way and they believe it should be going the other way, they are more likely to say 'the rest of the world is wrong' rather than 'maybe we should reconsider.'"

But the traits that fuel the best start-ups—aggression, a touch of arrogance, an interest in empire building,

You are reading The Filter Bubble