Study: How new Airbnb nondiscrimination policy may make discrimination worse

New research co-authored at Washington University in St. Louis shows more information about users and guests is key when it comes to fighting discrimination in the sharing economy. Credit: Washington University in St. Louis

The sharing economy is a booming industry, with companies such as Uber and Airbnb generating billions of dollars in value each year. Technology, combined with informal peer-to-peer business practices, has made it easier than ever to hail a ride or rent a living space.

However, the sharing economy hasn't been without its share of controversy, including accounts of discrimination by Airbnb hosts against guests. In response, Airbnb instituted a new nondiscrimination policy, which included reducing the size of guest photographs in an effort to curb host bias. Yet new research co-authored by a faculty member at Washington University in St. Louis's Olin Business School shows that more information about guests, not less, is key to eliminating potential bias in the sharing economy.

"We know discrimination exists in the sharing economy," said Dennis Zhang, assistant professor of Operations and Manufacturing Management at Olin. "We wanted to find out how do we prevent it, and how do we mitigate it?"

In a working paper, Zhang and his co-authors, Ruomeng Cui, assistant professor at Indiana University's Kelley School of Business, and Jun Li, assistant professor at the University of Michigan's Stephen M. Ross School of Business, conducted two randomized field experiments among more than 1,200 Airbnb hosts in Boston, Chicago and Seattle, creating fictitious guest accounts and sending accommodation requests to hosts from those accounts.

They found that requests from guests with African American names, chosen using name-frequency data published by the U.S. Census Bureau, were 19 percent less likely to be accepted than requests from guests with Caucasian names.

However, when the researchers posted a single positive review from a host on each fictitious guest's profile, the tables turned: acceptance rates for both sets of guests evened out. Zhang says this is strong evidence of a concept called statistical discrimination at work on Airbnb.

"When hosts don't have complete information about a possible guest, they might infer race and make rental decisions based on that," said Zhang. "When enough information is shared, hosts don't need to make those inferences, and we found discrimination was statistically eliminated."

It wasn't just positive reviews that swayed the hosts. In the second portion of the experiment, the researchers posted a negative review on each fictitious guest's profile. Here, too, the acceptance rates for both sets of names were statistically indistinguishable: 58.2 percent for Caucasian names and 57.4 percent for African American names.

"We thought a negative review might create distortion for the hosts," said Zhang. "However, based on our experiments, any and all information about a guest is important to fight discrimination."

Zhang suggests that Airbnb incentivize hosts to write reviews of new users and provide a more structured way for guests to communicate their travel plans, making the rental transaction far more transparent. He added that at least two of the changes Airbnb made to its nondiscrimination policy last September were shown by this research to be counterproductive.

"Hiding user information and making profile pictures smaller doesn't solve the problem, and may make it even more severe. Airbnb really has to think about how to provide more information instead of cutting it from guest profiles," said Zhang.
