In the late 1980s, law professors Ian Ayres and Peter Siegelman set out to learn whether blacks and women got the same deals as white men when buying a new car. They trained 38 people to negotiate a purchase using a script, and uncovered disturbing differences: Across 153 dealerships, black and female buyers paid more for the same cars than white men did.
Fast-forward a dozen years to the early days of internet commerce. Entrepreneurs were experimenting with web-based sales of everything, including automobiles. Several economists analyzed this new mode of selling cars and found that it did away with the racial and gender disparities that Ayres and Siegelman had documented.
Indeed, the first generation of online marketplaces made it hard for sellers to discriminate. Transactions were conducted with relative anonymity. A user could negotiate a purchase without providing any identifying information. As a New Yorker cartoon famously put it, “On the internet, nobody knows you’re a dog.”
Except that platforms — and now their users — do know whether you’re black or white, male or female. And the internet has recently been revealed as a source of discrimination, not an end to it: With their identities uncovered, disadvantaged groups face many of the same challenges they have long confronted in the offline world.
What happened, and what can we do about it?
THE EMERGENCE OF DIGITAL DISCRIMINATION
Many online sellers now have discretion over whom they do business with on the basis of looks or a name. The availability of such information is platform-specific, with some sites preserving a fair amount of anonymity while others hark back to practices long banned in offline markets. Similarly, on many sites potential buyers see not only products but also the names and photos of sellers. Although having details about prospective transaction partners may make people more comfortable, a growing body of evidence shows that it facilitates discrimination.
The short-term-rental marketplace Airbnb is a case in point. When a would-be renter searches listings, he sees descriptions and pictures of both the property and the host. And hosts can see the names — and in many instances the pictures — of potential tenants before accepting or rejecting them.
One of us (Luca) has investigated racial discrimination on Airbnb. In a study focused on the U.S. market, he and two colleagues constructed 20 user profiles and sent rental requests to roughly 6,400 hosts. The profiles and requests were identical except for one detail — the user’s name. Half the profiles had names that (according to birth records) are common among whites, while half had names common among blacks. Requests with black-sounding names were 16% less likely than those with white-sounding names to be accepted.
Online racial discrimination is enabled by two features: markers of race, most obviously photographs but also subtler indicators, such as names; and discretion on the part of market participants over whom they transact with. Both are choices made by platform designers.
Another feature of online commerce has at times nurtured rather than suppressed discrimination: the use of algorithms and big data. In an eye-opening study, computer science professor Latanya Sweeney sought to understand the role of race in Google ads. She searched for common African-American names and recorded the ads that appeared with the results. She then searched for names that are more common among whites. The searches for black-sounding names were more likely to generate ads offering to investigate possible arrest records.
TOWARD SMARTER MARKET DESIGN
Our aim is to offer a framework for companies that want to design and manage a thriving marketplace while minimizing the risk of discrimination. We offer two guiding principles for platforms struggling with this challenge. We then evaluate four design choices that are likely to affect discrimination.
PRINCIPLE 1: Don’t ignore the potential for discrimination.
Most platforms don’t know the racial and gender composition of their transaction participants. A regular report on the race and gender of users, along with measures of each group’s success on the platform, is a step toward confronting problems.
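As a minimal sketch of what such a report could compute, assuming a simple log of (group label, outcome) records rather than any platform's actual data model, the core of it is just a success-rate tally per group:

```python
from collections import defaultdict

def group_success_report(records):
    """Compute each group's success rate from (group, accepted) pairs.

    `records` is an iterable of (group_label, bool) tuples, a stand-in
    for whatever booking or transaction log a platform keeps.
    """
    totals = defaultdict(int)
    successes = defaultdict(int)
    for group, accepted in records:
        totals[group] += 1
        if accepted:
            successes[group] += 1
    return {g: successes[g] / totals[g] for g in totals}

# Illustrative, made-up log entries:
log = [("A", True), ("A", True), ("A", False),
       ("B", True), ("B", False), ("B", False)]
report = group_success_report(log)
```

Running such a report on a regular schedule, and flagging large gaps between groups, is what turns "don't ignore the potential for discrimination" into a routine rather than an aspiration.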
PRINCIPLE 2: Maintain an experimental mindset.
Companies including Facebook and eBay have baked experimental thinking into their development of new products and features. To test design choices that may influence the extent of discrimination, companies should conduct randomized controlled trials.
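A hedged sketch of how such a trial might be evaluated: assume users are randomly assigned to the current design (arm a) or a modified one (arm b), and compare acceptance rates with a standard two-proportion z-test. The function and the trial numbers below are illustrative, not any company's actual pipeline.

```python
import math

def two_proportion_ztest(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference in success rates between
    two randomized arms (e.g. old design vs. new design)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-approximation p-value via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_b - p_a, z, p_value

# Hypothetical trial: 420 of 1,000 requests accepted under the old
# design, 465 of 1,000 under the new one.
diff, z, p = two_proportion_ztest(420, 1000, 465, 1000)
```

The same test can be run separately for each demographic group, which is how a platform would learn whether a design change narrowed or widened a gap.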
DESIGN DECISION 1: Are you providing too much information?
In many cases, the simplest, most effective change a platform can make is to withhold potentially sensitive user information until after a transaction has been agreed to. In addition to choosing what information to reveal, platforms choose how salient to make it. By reducing the salience of race, platforms could reduce discrimination.
DESIGN DECISION 2: Could you further automate the transaction process?
When using Uber, you tap the screen to order a ride; only after confirming do you learn who will pick you up. In theory, you can then cancel if you don’t like the driver’s rating or looks. But that takes effort, and this small “transaction cost” is probably just enough to deter most looks-based cancellations.
Having transactions occur before race and gender are revealed makes it more difficult for people to discriminate. Consider Airbnb’s “instant book” feature. A host using it allows renters to book her property without her having first approved them. Instant book is an opt-in feature: Hosts must sign up for it. If Airbnb switched its default to instant book, discrimination would likely be lessened.
DESIGN DECISION 3: Could you make discrimination policies more top-of-mind?
Most platforms have policies prohibiting discrimination, but they’re buried in fine print. For example, Airbnb hosts must agree not to discriminate — but they do so when signing up. By the time a host is deciding whether to accept a renter, she has probably forgotten that agreement. Marketplaces could present anti-discrimination policies at a more relevant moment — and have the host’s agreement not to discriminate occur during the transaction process.
DESIGN DECISION 4: Should your algorithms be discrimination-aware?
Thus far many algorithm designers have ignored factors such as race and gender and hoped for the best. But a designer who cares about fairness needs to track how race or gender affects the user experience and set explicit fairness objectives for the algorithm.
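One common starting point, offered here only as an illustration among several possible fairness metrics, is to monitor demographic parity: the gap in positive-outcome rates across groups. The function name and data format below are assumptions, not an established API.

```python
def demographic_parity_gap(outcomes_by_group):
    """Spread between the highest and lowest positive-outcome rates
    across groups; 0.0 means parity on this particular metric."""
    rates = [sum(outcomes) / len(outcomes)
             for outcomes in outcomes_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical binary outcomes (1 = request accepted) for two groups:
gap = demographic_parity_gap({"group_x": [1, 1, 0, 0],
                              "group_y": [1, 0, 0, 0]})
```

A discrimination-aware designer would track a metric like this over time and treat a widening gap as a bug to be investigated, not as noise.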
A LESSON FROM SYMPHONY ORCHESTRAS
In the mid-1960s, less than 10% of the musicians in the “big five” U.S. orchestras were women. In the 1970s and 1980s, the orchestras changed their audition procedures to eliminate potential bias. Instead of conducting auditions face to face, they seated musicians behind a screen. A landmark 2000 study found that the screen increased the success rate of female musicians by 160%.
We’ve evolved far enough that platform designers can choose where and when to place virtual screens. We hope they will use that power to create a more inclusive society.
(Ray Fisman is the Slater Family Professor in Behavioral Economics at Boston University and a co-author of “The Inner Lives of Markets: How People Shape Them — And They Shape Us.” Michael Luca is an assistant professor at Harvard Business School and a visiting assistant professor at Stanford University.)
SIDEBAR
Idea in Brief
THE PROBLEM
Online marketplaces such as eBay, Uber and Airbnb have the potential to reduce racial, gender and other kinds of discrimination that affect transactions in the offline world. But recent research shows that the opposite has occurred.
THE REASON
Early platforms kept the identities of buyers and sellers relatively anonymous. But the addition of photos, names and other means of identification to listings has inadvertently encouraged discriminatory behavior.
THE ANSWER
To create markets that are both efficient and inclusive, platform designers need to be mindful of the potential for discrimination and open to experimentation as they make choices about automation, algorithms and the use of identifying data.


