
Don’t be evil. Really, don’t.

culture, founder experience, leadership · Feb 01, 2022

One of the many marvels of Japan is Nikkō Tōshō-gū shrine, nestled in verdant mountains 150km north of Tokyo. It was built in the 17th century to venerate the spirit of Tokugawa Ieyasu, founder of the last dynasty of Shoguns. Overlooked by a cypress forest and a five-storey pagoda, the shrine is part of a temple complex that manages to evoke perfect tranquillity despite the constant train of passing tourists. It is also the site of an unexpected joke. The stable building bears an intricate wood carving of the three wise monkeys, sometimes called the mystic apes, who see, hear and speak no evil. Popular in Japanese culture, this proverb suggests that evil can be contained or lessened if we close our senses to it. What would the great Ieyasu have thought?

Evil became a currency in tech circles in 2004, when an IPO letter from Google’s founders said the following: "Don't be evil. We believe strongly that in the long term, we will be better served—as shareholders and in all other ways—by a company that does good things for the world even if we forgo some short term gains." In so doing they perfectly captured the optimistic view of humanity that most tech founders aspire to.

This phrase became part of the company’s code of conduct until it was removed in 2018. By and large Google has lived up to the promise, even though it has been guilty of monopolistic and anti-competitive behaviour, tax evasion and stretching the limits of privacy and acceptable data usage. These have, unfortunately, become part of the Big Tech playbook. Still, no one would accuse Google of being particularly evil. Nor Apple, whose zealous control-freakery has distinctly puritanical overtones. Many booksellers consider Amazon to be evil for the way it has distorted and dominated publishing. Some of its underpaid warehouse employees may agree, and we all have our secret doubts about Jeff Bezos and his phallic Blue Origin rocket. But is Amazon really evil? Or is it just very good at giving us what we want: ever cheaper stuff?

Which brings us to Facebook. All is not well in the world of Zuckerberg. This week’s catastrophic outage wiped $40bn off Facebook’s value and cost it $100m in lost ad revenue. No doubt other tech execs were amused to hear of the cascading system meltdown, and of Facebook engineers being locked out of their own offices and computers. Any schadenfreude would be misplaced, because Facebook will recover. Such mega-outages have a negative impact on the entire sector, as everyone suddenly realises, and starts to question, their increasing dependence on Big Tech. Already there are calls for Facebook to be broken up.

No, Facebook’s real issue is whistleblowers lifting the bonnet on how the company operates. Some of it sounds ugly. And some of it sounds evil. Not evil in the sense of deliberately doing harm, but evil in the sense of not doing good: allowing, enabling and perhaps even inadvertently encouraging evil to spread. At best, Facebook cynically operates on the three-monkey principle: see, hear and speak no evil. If a problem isn’t big enough, they don’t need to fix it. At worst, they have created a virtual domain in which they have lost their moral compass. This was seen in the ramblings of their latest CTO, who argued that connecting people is inherently good, irrespective of who those people are and what they use those connections for. By that logic, terrorists using the platform to plan an attack are, a priori, doing good. Most of us would call this something closer to evil.

Facebook is unusual in that it is a giant social experiment. What happens when you connect 3bn people so they can all chat with each other? No one knew the answer when Facebook was created, but we have a pretty good idea now. Horrible things happen, because there are some horrible people in the world and many more who behave horribly. One man’s racism is another’s freedom of speech. Facebook is a mirror to our society, reflecting all that is good, and evil, within it. On a conceptual level, it is unfair to expect Facebook, or any company, to police what has proved beyond traditional law enforcement. When Twitter tried, and sensibly suspended Donald Trump, there was outrage that a tech company should censor an elected politician. It triggered unwelcome regulatory scrutiny. If there is a receptive audience for fake news, misinformation and bigoted opinion, why should they be denied access to their corrupter-in-chief? Left unchecked, this can aid a breakdown in law and order, as in the Capitol riots, which were organised largely on Facebook. But was that the fault of the rioters or of Facebook?

We can debate this on a philosophical level until the cows come home. But what is becoming clear from the ‘Facebook files’ is that the company is failing in its basic duty of care. Facebook is more than a societal mirror: it is a bridge. It has created infrastructure that did not exist before, and in so doing has created problems that did not exist before. We have never before had 3bn people connected to each other, so any new problems that arise from this are Facebook’s responsibility. Encountering abuse from strangers at this scale is one of those new things, and no one should ever have to. The company has to divert more of its technological wizardry to protecting its users rather than exploiting their data for advertisers.

Moreover, Facebook is reinforcing and escalating behaviours that can be considered evil. Not terror plots so much as various forms of extremist content. Extremists have always existed within, and been influenced by, a wider, more moderate society. Not on Facebook, where news-feed algorithms are deliberately tuned so people are constantly immersed in their warped worlds. Why? To generate maximum engagement and advertising revenue. The result is that the very people who need to be exposed to different points of view are living in online echo chambers. Their views are validated and encouraged, not challenged. This is good neither for them as individuals nor for society.

Finally, Facebook is not addressing the issues it knows it is creating. Frances Haugen, the whistleblower, has said Facebook’s platforms ‘harm children, stoke division and weaken our democracy’. The company’s own studies have shown the toxic effect its platforms have on teenage mental health, politics and child safety. Millions live under the social scrutiny of peers or feel the pressure to live model lives, adding unnecessary stress to an already stressful existence. Facebook concluded that increases in anxiety and depression were ‘unprompted and consistent across all groups’. 13% of suicidal British teenagers traced their thoughts to Instagram, which ‘exacerbates downward spirals’. There have been numerous cases of online bullying and of attempts to push vulnerable teens towards self-harm or suicide. 775 million antivax comments were posted each day despite a pledge from Mark Zuckerberg to crack down on them. People, drugs and human organs are all trafficked on Facebook. Evil runs rampant.

That Facebook knows about all this, and has been hiding and withholding the evidence, is as shocking as its inaction. It appears to have lied about the effectiveness of its efforts to curb harmful experiences. It has set up an independent Oversight Board that is powerless to do anything, while some six million celebrities with special VIP status have been made all-powerful on the platform, free of any editorial restraint. All this is reminiscent of Big Tobacco knowing about the harmful effects of smoking well before consumers did. To make matters worse, we now know that Facebook is actively researching and developing products for children aged under 13 (including under-4s!), seeing them as a ‘valuable but untapped audience’. Ye Gods.

There is no doubt that Facebook does some good. The outrage at the outage proves this. Millions of people get access to the net via Facebook. WhatsApp is critical infrastructure in several countries, with 90% of the populations of Kenya, South Africa, Nigeria, Argentina, Malaysia, Colombia and Brazil reliant on it. But Facebook is failing the Google evil test. It is not trying to be evil, but it is allowing evil to fester on its platform. The urge to connect is powerful. If Facebook wants to be the connective tissue of our digital society, it must take bold steps to restore its reputation. It needs to adopt a three foolish monkeys approach: listen out for evil (as defined by law-making bodies); see and recognise evil in all its forms (which can be codified in its own code of conduct for acceptable behaviour); and speak out against the evil it encounters. Don’t bury reports. Publish them and act to counter, mitigate or prevent harm.

And while Facebook is doing this, we need to ask ourselves the same questions. Is what we are doing increasing goodness or evil? If there are unintended, harmful side effects to our businesses (economic, social, environmental or psychological), what can we do to reduce or eliminate them? How willing are we to change our business models and revenue forecasts to do less evil?

Don’t be evil.    
