An Ethics Guide for Tech Gets Rewritten With Workers in Mind

The Ethical Explorer Pack is designed to help Silicon Valley’s rank and file—not just its CEOs—steer products away from harm.

In 2018, Silicon Valley, like Hamlet’s engineer, was hoist with its own petard. Citizens were panicking about data privacy, researchers were sounding alarms about artificial intelligence, and even industry insiders rebelled against app addiction. Policymakers, meanwhile, seemed to take a renewed interest in breaking up big tech, as a string of congressional hearings put CEOs in the hot seat over the products they made. Everywhere, techies were grasping for answers to the unintended consequences of their own creations. So the Omidyar Network—a “philanthropic investment firm” created by eBay founder Pierre Omidyar—set out to provide them. Through the firm’s newly minted Tech and Society Solutions Lab, it issued a tool kit called the EthicalOS, to teach tech leaders how to think through the impact of their products ahead of time.

Two years later, some things have changed. But it’s not CEOs who are leading the charge. It’s the workers—engineers, designers, product managers—who have become the loudest voices for reform in the industry. So when it came time for the Omidyar Network to refresh its tool kit, the firm decided it needed a new target audience. “We realized how much the scene had changed,” says Sarah Drinkwater, Omidyar Network’s director of beneficial tech. “We believe really firmly that the people who are going to force the change through are the workers, not the leaders.”

Now, the Omidyar Network has a new tool kit, designed to get tech workers talking about the way their products shape society, democracy, and more. The Ethical Explorer Pack, as it’s called, covers many of the same topics and ideas as EthicalOS, but with added guidance on how workers can bring these issues up on their teams—whether to identify red flags early on, to brainstorm solutions to potential problems, or to set boundaries around things like data control, surveillance, or disinformation. The kit, which comes as a free digital download or a physical deck of cards, provides exercises, activities, and prompts that can be used alone or with a group to guide conversations.

The Ethical Explorer Pack fits into a broader push for companies to think about social and cultural impacts the way they think about user engagement or profits. Some companies in Silicon Valley have even created internal corporate positions to focus on those issues, like Salesforce’s Office of Ethical and Humane Use. (Salesforce’s chief ethical and humane use officer, Paula Goldman, was poached from the Omidyar Network; she helped to create the original EthicalOS.) There are also other tool kits designed to help people go much deeper on specific problems, like the Open Data Institute’s Data Ethics Canvas. But Drinkwater says there weren’t enough resources to simply help rank-and-file workers raise ethical concerns within their own teams.

The past several years have seen tech workers grow more outspoken about their employers’ products and policies. In 2018, thousands of Googlers signed a petition objecting to the company’s involvement in Project Maven, a controversial military program that used AI to analyze drone footage; after the backlash, Google declined to renew its Pentagon contract and created a code of ethics for AI. Last fall, Amazonians staged a walkout to demand the company take more steps to combat climate change, leading to a series of sustainability initiatives. More recently, hundreds have protested working conditions at Amazon facilities during the pandemic. Even unsuccessful protests have brought awareness—and public shame—to tech companies. Facebook CEO Mark Zuckerberg has stood firm in his decision not to moderate political speech (specifically, Donald Trump’s) on the platform, even after hundreds of employees staged a virtual walkout last month; now hundreds of advertisers say they’re boycotting Facebook over hate speech and misinformation.

While those conversations have played out most loudly in the behemoths of the tech industry, Drinkwater says they are just as important in smaller companies and nascent startups that are still figuring out their ethical blueprint. Those startups may one day become huge companies themselves, or find themselves at an impasse sooner than they think. (Take Clubhouse—a Silicon Valley darling that remains in private beta—which has recently caught flak from users and many tech reporters for not having a protocol to deal with harassment.) The Ethical Explorer Pack, then, is meant to be a resource for anyone within a tech company to jump-start their thinking about potential issues, long before they become problems.

The kit includes a “field guide” for navigating eight risk zones: surveillance, disinformation, exclusion, algorithmic bias, addiction, data control, bad actors, and outsize power. There’s also a deck of cards with prompts, each related to at least one of the risk zones, that an engineer or designer, say, might bring up during meetings or keep in mind while they work: “How did we decide who our target audience should be, and how can we benefit from more diversity in our audience?” “Can our tech unintentionally reinforce or amplify existing biases?” “What other tools should we consider to help all of our users feel safe?” Some of the prompts seem elementary (“What data are we collecting from users?”) and others overly broad (“How might our tech be co-opted to undermine trust in societal institutions?”). But Drinkwater says that those types of questions still aren’t being asked early or often enough in the product-creation process.

Drinkwater has some experience with dissent: She worked at Google for nearly seven years, including a stint on Google Maps. She remembers once voicing concerns about an upcoming feature release—she won’t say which one—but failing to persuade her team that it was a real problem. The feature came out, “the press had a field day, and it got pulled,” she says. Looking back at that experience, Drinkwater thinks she would have benefited from a template for thinking through the ethical problems with the feature. “We know these are topics that many workers in tech worry about. We know these are topics they want to talk about,” she says. “I’m confident that if they had a structure, they would find they’re not alone.”

The Ethical Explorer Pack might introduce some employees to questions they’d never considered. For others, the prompts might reinforce ideas they were too nervous to bring up. “People can feel like the joy-kill, the person getting in the way of the cool innovation,” says Drinkwater. The kit includes some sample language for making the case to managers and colleagues. That might not be enough to get buy-in when an ethical concern stands at odds with a business’s bottom line, or when a marginalized worker is the only one speaking up. Still, the tool kit encourages workers to make their point in the parlance of Silicon Valley. Slowing down to think through future consequences is no different from slowing down to debug code. It might even save startups from a headache, or something bigger, down the road.

