Discrimination on sharing-economy platforms such as Airbnb is well documented. An earlier study found that black property owners fared measurably worse on the site than other hosts.
That study explored how property owners' photos influenced both rental prices and profile success on Airbnb, and found that black owners charged approximately 12% less than owners of comparable properties from other ethnic backgrounds.
“Moreover, black hosts receive a larger price penalty for having a poor location score relative to nonblack hosts,” the researchers write in the paper. “These differences highlight the risk of discrimination in online marketplaces, suggesting an important unintended consequence of a seemingly routine mechanism for building trust.”
Righting the wrong
A new Stanford-based study suggests this situation can be rectified with a relatively straightforward website redesign. The authors argue that foregrounding each user's reputation can more than offset users' social biases.
Platforms like Airbnb have thrived in large part because of the more personal and intimate feel they provide. When you feel like you’re staying in someone’s home rather than an impersonal hotel, it creates a strong appeal. Of course, that personal touch can open the process up to bias.
The Stanford researchers set out to test which factors influence homophily, the tendency to trust people who are similar to ourselves. Is there anything that can counter that tendency?
Participants in the study were real Airbnb users. They were placed on a digital platform and shown mocked-up profiles of other Airbnb users, with each profile displaying various demographic and reputation information.
Participants were split into two groups: the first group was shown profiles similar to them in terms of age, gender, and so on, while the second group was shown profiles that were different from them but carried higher reputation scores than those seen by group one.
Testing for bias
The researchers tested for bias by asking participants to play a behavioral game in which they invested credits in the people whose profiles they were shown. The more credits a profile accumulated, the more trust the participant placed in it.
In group one, homophily was evident, with a clear correlation between participant–profile similarity and the amount invested in each profile.
This was not the case in the second group, however: participants were happy to invest in profiles markedly different from themselves, provided those profiles had a strong reputation.
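To make the analysis concrete, here is a minimal sketch of how the correlation between profile attributes and credits invested could be computed. The numbers below are invented for illustration only; the actual study used live Airbnb users and its own data.

```python
# Hypothetical illustration of the trust-game analysis. All figures
# are made up; they are not the study's data.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Group one: profiles similar to the participant. Under homophily,
# credits invested rise with participant-profile similarity.
similarity_g1 = [0.2, 0.4, 0.6, 0.8, 1.0]   # similarity scores
invested_g1   = [10, 18, 27, 41, 50]        # credits invested

# Group two: dissimilar profiles with higher reputation scores.
# Here credits invested track reputation instead of similarity.
reputation_g2 = [3.0, 3.5, 4.0, 4.5, 5.0]   # reputation scores
invested_g2   = [12, 20, 30, 38, 49]        # credits invested

print(pearson(similarity_g1, invested_g1))  # strong positive correlation
print(pearson(reputation_g2, invested_g2))  # strong positive correlation
```

A correlation near 1 in group one is the signature of homophily; the same pattern against reputation in group two is what would indicate that a strong reputation score overrides similarity.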
The researchers then tested this out in real life by analyzing some 1 million real interactions between hosts and guests on Airbnb. Just as in the lab environment, users with high reputations didn’t suffer from any of the biases that their less reputable peers did.
“The fundamental question we wanted to answer is whether technology can be used to influence people’s perception of trust,” the authors say. “These platforms can engineer tools that have great influence in how people perceive each other and can make markets fairer, especially to users from underrepresented minorities.”