Interesting in part because it reinforces my priors. From the Abstract (emphasis added):
Recent research has found widespread discrimination by hosts against guests of certain races in online marketplaces. In this paper, we explore ways to reduce such discrimination using online reputation systems. We conduct four randomized field experiments among 1,801 hosts on Airbnb by creating fictitious guest accounts and sending accommodation requests to them. We find that requests from guests with African American-sounding names are 19.2 percentage points less likely to be accepted than those with white-sounding names. However, a positive review posted on a guest's page significantly reduces discrimination: When guest accounts receive a positive review, the acceptance rates of guest accounts with white-sounding and African American-sounding names are statistically indistinguishable. We further show that a non-positive review and a blank review without any content can also help attenuate discrimination, but self-claimed information on tidiness and friendliness cannot reduce discrimination, which indicates the importance of encouraging credible peer-generated reviews. Our results offer direct and clear guidance for sharing-economy platforms to reduce discrimination.

I have long argued three key points: 1) much of the academic research using racially distinctive names reflects underlying negative stereotypes held by academics about their fellow Americans; 2) most of it does not take into account incentive structures, costs, and risks; and 3) most of it is structurally flawed because it fails to distinguish race from class.
In these studies, most academics come across as prejudiced themselves: they appear to believe that all white Americans are biased against all black Americans, they treat the pursuit of business profitability as unworthy of attention, they ignore that everyone responds to incentives, they ignore that race serves as a proxy for group-average information, and they fail to take into account that race and class are intertwined categories of information.
The scenario we are looking at is a low-information, high-consequence transaction. The hospitality industry is, in most times and places, a fiercely competitive, low-margin business. One bad guest can put the landlord in the financial red for the next several guests. Landlords are well rewarded financially for accurately assessing whether a guest is likely to steal, destroy property, or skip out without paying.
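A back-of-the-envelope sketch of that incentive (all figures below are hypothetical, chosen only to illustrate the arithmetic of a low-margin business; nothing comes from the study):

```python
# Hypothetical figures, chosen only for illustration.
nightly_revenue = 150.0            # revenue from one booked night
margin = 0.10                      # net margin in a competitive market
profit_per_booking = nightly_revenue * margin   # $15 of profit per night

bad_guest_loss = 1200.0            # damage, theft, or an unpaid stay

bookings_to_recover = bad_guest_loss / profit_per_booking
print(f"One bad guest erases the profit of {bookings_to_recover:.0f} bookings.")
# -> One bad guest erases the profit of 80 bookings.
```

On numbers like these, a single bad guest wipes out the profit from dozens of good ones, which is why screening is worth real money to the landlord.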
In a world where landlords are precluded from viewing accurate, individual-specific information about a guest, they fall back on less desirable proxies. They are indeed attempting to discriminate, but not necessarily for invidious reasons.
Attributes that might usually be seen as carrying positive or negative information include race, gender, age, family status, profession, religion, educational attainment, criminal history, credit score, etc. Virtually all landlords, if they could do so knowingly, would rent unstintingly to a 55-year-old black female physician affiliated with a mainline Protestant church and holding a good credit score, and would not rent to an 18-year-old unemployed white male with no high school diploma, a bad credit score, and a criminal record. Race is not the critical factor for the landlord except to the extent that it provides information about the probability of being paid and of avoiding issues or damages.
If all we know about an individual is their race, there is inherently an associated, inferred profile of income, crime, wealth, and class that goes with it. But the same is true of age, gender, religion, etc.
As long as no individual-specific information can be shared about a given renter, landlords will use what limited proxies they have, making the best income-maximizing decision they can from statistical group averages.
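A minimal sketch of that decision logic, assuming made-up probabilities and dollar figures (none of these numbers come from the study):

```python
# Statistical discrimination as expected-value screening.
# Every number below is invented for illustration.

def expected_profit(p_bad: float, margin: float = 15.0, loss: float = 1200.0) -> float:
    """Expected profit from accepting a guest whose bad-outcome probability is p_bad."""
    return (1 - p_bad) * margin - p_bad * loss

# No individual information: the landlord can only price the group average
# inferred from whatever proxies are visible (name, photo, etc.).
p_bad_group_average = 0.02

# A credible individual signal, such as a positive review, shifts the
# estimate toward the individual and away from the group average.
p_bad_with_review = 0.005

for label, p_bad in [("group average only", p_bad_group_average),
                     ("with a positive review", p_bad_with_review)]:
    ev = expected_profit(p_bad)
    print(f"{label}: expected profit ${ev:.2f} -> {'accept' if ev > 0 else 'decline'}")
```

On these invented numbers, a purely profit-seeking landlord declines the unreviewed guest and accepts the reviewed one, which is consistent with the paper's finding that a single review closes the acceptance gap.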
One solution is to provide a richer palette of individual-specific information about the renter, allowing landlords to make more targeted and informed decisions. I believe that to be the better approach. It benefits the well-intentioned landlord and the well-behaved renter. It disadvantages the ill-behaved renter.
For the past three decades or so we have gone a different route. Instead of making accurate and specific information available to the landlord/decision-maker, we have attempted to preclude them from knowing much at all about the renter. What has been documented time and again is that landlords will then use whatever proxies usefully advance their goal of being paid and not having to deal with issues and damages.
This is hard on landlords, and it is even harder on well-behaved, reliable individuals within groups that have negative average group attributes.
We see this most clearly in labor-market studies. There is a widespread, and well-intentioned, movement to "ban the box" in hiring. Many states preclude employers from asking about criminal history, on the theory that it carries no useful information and that it disproportionately harms some racial groups over others.
What most studies find is that black hiring rates go up when employers are able to ask about criminal history and go down when they cannot. What appears to be happening is that employers want reliable, trustworthy employees; a past conviction by itself is a weak predictor of that, and when employers cannot ask about it at all, they fall back on cruder group proxies instead.
Consider the same candidate in both regimes. If the employer is able to ask about criminal history, they can discover that even though you are black and have a record, the conviction was for stealing a car at 17, and that you then turned your life around, served in the Marines, attend church, and so on. With that information, the employer can make a better-informed judgment about the probability of your being a reliable, trustworthy employee and extend an offer of employment.
In contrast, the employer precluded from asking about criminal history can rely on only two pieces of information: you are black, and group-average conviction rates are much higher for blacks. If they can get away with it, they will extend job offers at a higher rate to groups with better group averages rather than assessing individuals.
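A small sketch of why the information ban pushes employers toward group averages, again with invented rates (these are illustrative assumptions, not estimates from any study):

```python
# Why an information ban pushes hiring decisions toward group averages.
# All rates below are invented for illustration.

p_record = {"group_a": 0.05, "group_b": 0.25}   # believed conviction rates

def p_unreliable(has_record: bool) -> float:
    # Invented link between a record and the employer's perceived risk.
    return 0.40 if has_record else 0.04

def risk_under_ban(group: str) -> float:
    # The employer cannot ask, so they average over the group's record rate.
    p = p_record[group]
    return p * p_unreliable(True) + (1 - p) * p_unreliable(False)

def risk_when_asking(has_record: bool) -> float:
    # The employer sees the individual answer; group membership drops out.
    return p_unreliable(has_record)

for group in p_record:
    print(f"{group} under the ban: perceived risk {risk_under_ban(group):.3f}")
print(f"any group, no record, employer may ask: {risk_when_asking(False):.3f}")
```

Under the ban, a record-free applicant from the higher-rate group is scored more than three times as risky as the same applicant when the employer can simply ask, which is the mechanism behind the hiring-rate findings above.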
I like Alex Tabarrok's summary of this study:
In other words, taste based discrimination is weak but statistical discrimination is common. Statistical discrimination happens when legitimate demands for trust are frustrated by too little information. Statistical discrimination is a second-best solution to a problem of trust that both owners/sellers/employers and renters/buyers/workers want to solve. Unfortunately, many people try to solve statistical discrimination problems as if they were problems of invidious prejudice.

If you think the problem is invidious prejudice, it’s natural to try to punish and prevent with penalties and bans. Information bans and penalties, however, often have negative and unintended consequences. Airbnb, for example, chose to hide guest photos until after the booking. But this doesn’t address the real demands of owners for trust. As a result, owners may start to discriminate based on other cues such as names. Instead market designers and regulators should approach issues of discrimination by looking for ways to increase mutually profitable exchanges. From this perspective, providing more information is often the better approach.

The comments on Tabarrok's post are worth reading for a robust discussion.