For this post, I interviewed Professor Michael Luca,
Assistant Professor at Harvard Business School. Professor Luca’s research and teaching focus
on the economics of digitization and on using data to improve policy and managerial
decisions. In his work he has collaborated with organizations ranging from Yelp
to the City of Boston.
Professor Luca wrote “Racial Discrimination in the Sharing Economy: Evidence from a Field Experiment” with Benjamin Edelman and Dan Svirsky, both colleagues at Harvard Business School. I first heard about this
study when it was a working paper, and Professor Luca was interviewed on the
podcast Hidden Brain. In this study,
Professor Luca and his coauthors examined online marketplaces, in particular what academics
call the sharing economy. They wanted to see how the differing
policies of these sites play out for the consumers who use them.
Think about the transactions you carry out on eBay or Amazon.
They are pretty much anonymous: you usually know nothing about the people you
are doing business with. But, what about Airbnb? If you’ve used it, you know
that this site requires users to share personal information in order to
participate. The requirements have changed over time, and they are different
for hosts vs. guests. You must share at least your name, and hosts must post
their pictures. Looking at a random sample of guests, Professor Luca and his coauthors
found that 44% had posted profile pictures. Airbnb’s stated rationale for requiring this
personal information is that it builds trust between host and guest. But, as heard on the
Hidden Brain podcast, sharing this personal information has had the
unintended consequence of racial discrimination against African American
guests. In addition to the anecdotes you hear on the podcast and elsewhere (search
#airbnbwhileblack on Twitter to see more), this paper provides evidence that
discrimination on the platform is widespread.
How did the authors find evidence of racial discrimination
on Airbnb? They exploited the fact that Airbnb users must share personal
information, in particular their names. They created 20 Airbnb accounts, 10
whose names sounded distinctively African American and 10 whose names sounded
distinctively white. Half of each group were female and half were male. Examples of
female African American-sounding names included Tanisha Jackson and Latoya
Williams, and examples of male white-sounding names included Brent Baker and Brad
Walsh. In total, they sent about 6,400 messages to hosts in 5 cities requesting
bookings. Guests with white-sounding names were accepted
about 50 percent of the time, while those with African American-sounding names
were accepted only 42 percent of the time.
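To get a sense of how stark that 8-point gap is given the sample size, here is a minimal back-of-envelope check in Python. It is only a sketch under assumptions of mine: it treats the roughly 6,400 inquiries as split evenly between the two groups of names (the exact per-group counts aren't given here) and runs a simple two-proportion z-test on the reported 50 percent and 42 percent acceptance rates. The paper's own analysis is more careful than this, controlling for host and listing characteristics.

```python
import math

# Back-of-envelope check of the reported gap in acceptance rates.
# Assumption (mine, for illustration): the ~6,400 inquiries were split
# evenly between white-sounding and African American-sounding names.
n_white, n_black = 3200, 3200
p_white, p_black = 0.50, 0.42   # reported acceptance rates

# Two-proportion z-test for the difference in acceptance rates.
p_pool = (n_white * p_white + n_black * p_black) / (n_white + n_black)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_white + 1 / n_black))
z = (p_white - p_black) / se

# Two-sided p-value from the standard normal CDF.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"difference in acceptance rates: {p_white - p_black:.2f}")
print(f"z = {z:.1f}, two-sided p-value = {p_value:.1e}")
```

Under these illustrative assumptions the z-statistic comes out above 6, so a gap of that size on that many inquiries is far too large to be chance.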
But, you might wonder, who is carrying out this
discrimination? The authors found that even hosts they expected might not
discriminate did: hosts with multiple listings (many of whom are subject to,
and violating, existing anti-discrimination laws), hosts in diverse neighborhoods,
hosts with more experience, and hosts with lower-priced listings all
discriminated. The only hosts who didn’t discriminate were the ones who had
hosted an African American guest in the past. (The authors were able to gather
this last bit of information from pictures of the hosts’ previous guests.)
Airbnb is aware of these findings and has begun to take
action. A 32-page report responding to the mounting evidence is posted on its
website. A team of data scientists is now responsible for
evaluating this issue internally. Professor Luca has met with members of the
staff and shared possible solutions. One would be to eliminate the sharing of personal
information entirely. This, however, is not something Airbnb is willing to do:
the company clearly wants to maintain a culture of sharing personal details, in
contrast with Priceline and other, more anonymous booking sites. Airbnb does have an
option called “Instant Book,” which, to be clear, existed before these changes, but the
goal now is to get one third of properties booked through it. From the Airbnb website: “Instant
Book listings don't require approval from the host before they can be booked.
Instead, guests can just choose their travel dates, book, and discuss check-in
plans with the host.” So, it makes Airbnb more like your standard hotel site. Professor
Luca argues that the effectiveness of instant booking in curbing discrimination
depends largely on which hosts are using it: if those who were discriminating
start to adopt it, it will make more of a difference, but if it is mainly
adopted by people who were already less likely to discriminate, then the impact
will be more limited. Airbnb has also added an anti-discrimination policy to its
terms of service, which every host must read and sign.
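To see why it matters so much which hosts adopt Instant Book, here is a toy calculation of my own (the numbers are hypothetical, not from the paper). It assumes some share of hosts discriminate, each with the same gap in acceptance rates, and that switching a listing to Instant Book eliminates the gap for that host; the market-wide gap that remains then depends entirely on whether the adopters are the discriminating hosts or the hosts who never discriminated.

```python
def market_gap(share_discriminating, gap_per_host, instant_book_share,
               adopters_discriminate_first):
    """Market-wide acceptance gap left after some hosts adopt Instant Book.

    Toy model: each discriminating host contributes `gap_per_host` to the
    overall gap, and Instant Book removes the gap for any host who adopts it.
    """
    if adopters_discriminate_first:
        # Best case: Instant Book is adopted by discriminating hosts first.
        still_discriminating = max(share_discriminating - instant_book_share, 0.0)
    else:
        # Worst case: non-discriminating hosts adopt first; discriminating
        # hosts switch only once everyone else already has.
        spillover = max(instant_book_share - (1.0 - share_discriminating), 0.0)
        still_discriminating = share_discriminating - spillover
    return still_discriminating * gap_per_host

# Hypothetical inputs: 30% of hosts discriminate, each with a 25-point gap,
# and Airbnb reaches its target of one third of properties on Instant Book.
print(market_gap(0.30, 0.25, 1 / 3, adopters_discriminate_first=True))   # -> 0.0
print(market_gap(0.30, 0.25, 1 / 3, adopters_discriminate_first=False))  # -> 0.075
```

In the first scenario Instant Book wipes out the gap entirely; in the second, the market-wide gap is untouched even though a third of listings are on Instant Book.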
But, what if the hosts don’t even realize that they are
discriminating? Professor Luca explained to me that this could be part of what
is happening here: it is what
psychologists call unconscious bias. Hosts
make decisions about who to accept, and don’t notice that they are systematically
turning down African American guests. Maybe
these new policies will make them more aware. Hopefully they will now at least stop
and think about the possibility that they are discriminating, and realize that they
can take action to prevent it. (As a teacher, this reminded me of
the evidence that teachers, including female teachers, tend to call on boys in
the classroom more than they do on girls. I suspect this is unconscious bias
as well.)
Beyond his research on discrimination, Professor Luca has
other work on information in online platforms. For example, in his working
paper, “Survival of the Fittest: The Impact of the Minimum Wage on Firm Exit,”
with Dara Lee Luca, he
uses data from Yelp to study how increases in the minimum wage affect
restaurants. The authors find that lower-rated restaurants are the most likely
to close when the minimum wage increases.
Let’s talk! I would love to know what you think about this
example of unintended consequences. Please submit comments and questions.