Friday, March 1, 2019

So it’s a system at maximum scale with very imperfect information.

This is looking very interesting. From “Men Are Scum”: Inside Facebook's War On Hate Speech by Simon Van Zuylen-Wood.

I am only a third of the way through but just hit a quote which I want to capture and not risk losing it.
The differences between policing the real world and policing the Internet became manifest. “The level of context you have when you’re looking at criminal laws—there’s fact-finding, you provide evidence on both sides, you actually look at the problem in a 360-degree way,” Bickert says. “We don’t. We have a snippet of something online. We don’t know who’s behind this. Or what they’re like. So it’s a system at maximum scale with very imperfect information.”
Bickert is Facebook's content czar, the person deciding who is allowed to say what, i.e. deciding what the expectation of free speech means in the boundaryless and normless world of the digital internet.

In some ways Van Zuylen-Wood's reporting is alarming. He indicates, referring to the team:
Like Bickert, a number are veterans of the public sector, Obama-administration refugees eager to maintain some semblance of the pragmatism that has lost favor in Washington.
Yikes. These are veterans of an administration that established new levels of animosity towards free speech and towards journalists (animosity as in jailing them and spying on them, not animosity in terms of tweeting ugly things about them), that was infamous for its impenetrability and lack of transparency, noted for punishing whistleblowers, and notorious for repurposing independent agencies for partisan purposes.

On the other hand, Bickert does seem to have some real-world exposure, though not necessarily to real-world America.
Bickert spent the first phase of her career at the Justice Department in Chicago, prosecuting gang violence and public corruption. She spent the second phase at the U.S. Embassy in Bangkok, where she extradited child sex traffickers. While her primary focus was protecting kids, she also began to think more about freedom of speech, thanks to strict laws against criticizing the Thai monarchy. She was, in other words, already weighing versions of the fundamental tension—“safety” vs. “voice”—that undergirds all of Facebook’s policy decisions.
But all that is background.

This is the passage which leapt out at me. My emphasis added.
The differences between policing the real world and policing the Internet became manifest. “The level of context you have when you’re looking at criminal laws—there’s fact-finding, you provide evidence on both sides, you actually look at the problem in a 360-degree way,” Bickert says. “We don’t. We have a snippet of something online. We don’t know who’s behind this. Or what they’re like. So it’s a system at maximum scale with very imperfect information.”
Yes, precisely. I argue around this point frequently in these blog posts, though from an epistemic perspective.

I strongly oppose the concept of hate speech because it is so close to the idea of thought crime. Punish the person for the actual crime, not for what we infer, without knowing, might have been their motivation. As soon as you base punishment on value-laden inferences rather than on objective evidence, you are on a slippery slope towards illegal and extrajudicial punishment rather than the rule of law.

Relatedly, we need to get rid of the concept of disparate impact, under which any statistical disparity is presumed to arise from conscious and deliberate bias rather than from the myriad other, benign and non-prejudicial sources of variance.

Both of these topics are related to the fact that each of us lives in a world with our own cognitive limits and our own constrained experiential path dependence while facing "a system at maximum scale with very imperfect information." We don't know the full details and we don't know the full context, so we make quick decisions with limited and imperfect information based on past experiences. Absent particular knowledge, we default to knowledge of system averages, even when it involves individual people. We know that system averages and stereotypes and tropes are not good for optimal decision-making. However, when information is expensive or hard to obtain, and when we have to make snap decisions, we default to the second-best approach: system averages and the amalgamation of specific experiences. It would be less than optimal decision-making were we unconstrained by time or cost, but it is the second-best approach when those constraints are real.

The striking aspect is that organizations which tend to be inimical to diversity of thought or freedom of speech, such as Facebook based on its pattern of behavior, also tend to be harshest on those whose quality of decision-making is most suboptimal, whether due to limited cognitive capacity or experience, or limited time, or limited access to quality information. These are individuals who make less than ideal decisions because they are constrained when facing a system at maximum scale with very imperfect information.

And yet here is that very same organization, Facebook, with effectively unlimited access to money, bright minds, experienced people, quality information, etc., stymied by the same "system at maximum scale with very imperfect information" and making a similar hash of the challenge.

Facebook is pleading the very complexity, a maximum-scale system handicapped by imperfect information, that drives the same behaviors among those whom it most condemns.

It is a striking admission.
