[NA-Discuss] Don’t Be Evil, but Don’t Miss the Train
mcknight.glenn at gmail.com
Sun Apr 22 02:21:22 UTC 2012
BACK in 2004, as Google prepared
to go public, Larry Page and Sergey Brin celebrated the maxim that was
supposed to define their company: “Don’t be evil.”
But these days, a lot of people — at least the mere mortals outside the
Googleplex — seem to be wondering about that uncorporate motto.
How is it that Google, a company chockablock with brainiac engineers, savvy
marketing types and flinty legal minds, keeps getting itself in hot water?
Google, which stood up to the Death Star of Microsoft? Which changed the
world as we know it?
The latest brouhaha, of course, involves the strange tale of Street View,
Google’s project to photograph the entire world, one street at a time, for
its maps feature. It turns out Google was collecting more than just images:
federal authorities have dinged the company for lifting personal data off
Wi-Fi systems, too, including e-mails and passwords.
Evil? Hard to know. But certainly weird — and enough to prompt a small fine
of $25,000 from the Federal Communications Commission and, what could prove
far more damaging, howls from Congress and privacy advocates. A Google
spokeswoman called the hack “a mistake” and disagreed with the F.C.C.’s
contention that Google “deliberately impeded and delayed” the commission’s
investigation.
Many people might let this one go, were it not for all those other
worrisome things at Google. The company has been accused of flouting
copyrights, leveraging other people’s work for its benefit and violating
European protections of personal privacy, among other things. “Don’t be
evil” no longer has its old ring. And Google, an underdog turned overlord,
is no humble giant. It tends to approach any controversy with an air that
ranges somewhere between “trust us” and “what’s good for Google is good for
the world.”
But ascribing what’s going on here solely to the power or arrogance of a
single company misses an important dimension of today’s high-technology
business, where there are frequent assaults, real or perceived, on various
business standards and practices.
Mark Zuckerberg, Facebook’s chief executive, has apologized multiple times for
changing policies on privacy and data ownership. Last year, he agreed to a
20-year audit of Facebook’s privacy practices.
Jeffrey P. Bezos has been criticized for how Amazon shares customer
data with other companies, and what information it stores in its browser.
Apple, even before it drew fire for the labor practices at Foxconn in China, had
trouble over the way it handled personal information in making music
recommendations.
When such problems arise, executives often stare blankly at their accusers.
When a company called Path was recently found to be collecting the digital
address books of its customers, for instance, its founder characterized the
process as an “industry best practice.” The company
reversed the policy after a storm of criticism.
WHAT’S going on, when business as usual in such a dynamic industry makes
the regulators — and the public — nervous?
Part of Google’s problem may be no more than an ordinary corporate
quandary. “With ‘Don’t be evil,’ Google set itself up for accusations of
hypocrisy anytime they got near the line,” says Roger McNamee, a longtime
Silicon Valley investor. “Now they are on the defensive, with their
business undermined especially by Apple. When people are defensive they can
do things that are emotional, not reasonable, and bad behavior starts.”
But “Don’t be evil” also represents the impossibility of a more nuanced
social code, a problem faced by many Internet companies. Nearly every tech
company of significance, it seems, is building technologies that are
producing an entirely new kind of culture. EBay, in theory, can turn anyone
on the planet into a merchant. Amazon Web Services gives everyone a cheap
supercomputer. Twitter and Facebook let you publish to millions. And tools
like Google Translate allow us to transcend old language barriers.
“You want a company culture that says, ‘We are on a mission to change the
world; the world is a better place because of us,’” says Reid Hoffman,
founder of LinkedIn and a venture capitalist with Greylock Partners.
“It’s not just ‘we create jobs.’ A tobacco company can do that.”
“These companies give away a ton of value, a public good, with free
products like Google search, that transforms cultures,” Mr. Hoffman says.
“The easy thing to say is, ‘If you try to regulate us, you’ll do more harm
than good, you’re not good social architects.’ I’m not endorsing that, but
I understand it.”
The executives themselves don’t know what their powerful changes mean yet,
and they, like the rest of us, are dizzied by the pace of change. Sure,
automobiles changed the world, but the roads, gas stations and suburbs grew
over decades. Facebook was barely on the radar five years ago and now has a
community of more than 800 million, doing things that no one predicted.
When the builders of the technology barely understand the effect they are
having, the regulators of the status quo can seem clueless.
Moreover, arrogance can come easily to phenomenally well-educated people
who have always been at the top of the class. Success, though sometimes
fickle, comes fast, and is registered in millions and billions of dollars.
The world applauds, so it’s easy to see yourself as a person who can choose
well for the world.
In the “people like us” haze of the rarefied realms of tech, it’s easy to
forget that, well, not everyone *is* like us. Not everyone is comfortable
with the idea of sharing personal information, of living in full view on
the Web. And, of course, ordinary people have more downside risk than a
26-year-old Harvard dropout billionaire.
Another hazard is also one of the great strengths of the Silicon Valley: a
tolerance of failure. Failing at an interesting project is seen as an
important kind of learning. In the most famous case, Steve Jobs was driven
from Apple, then failed in his NeXT Computer venture and for a while
floundered at Pixar. But he picked up vital skills in management and
technology along the way. There are a thousand lesser such stories.
If tech is building a new culture, with new senses of the private and the
shared, the failure of overstepping boundaries is also the only way to
learn where those boundaries have shifted.
It is a self-serving point, but that doesn’t mean it’s entirely wrong. To
the outsiders, it can look a lot as if the companies are playing “catch us
if you can” by continually testing, and sometimes exceeding, boundaries.
IS there a better way? Mr. Hoffman says he thinks the tech industry has to
acknowledge how much its products are shaping society. “We need something
more than, ‘We’re good guys, trust us,’ ” he says. “There should be an
industry group that discusses overall issues around data and privacy with
political actors. Something that convinces them that you are good guys, but
gives them a place to swoop in.”