Analysis: What we learned from the Facebook whistleblower
The list goes on. Facebook abandoned some efforts to cut down on misinformation after the presidential election. It knows the effect Instagram has on some young minds.
Key takeaway from Haugen: Misinformation is the main weapon in an ongoing war of ideas, and the massive company has chosen the profits it reaps from capturing eyeballs and engaging users over its moral responsibility to cut down on the toxic stuff spread on its platform.
Haugen gave the example that if a new user signed up and followed Donald Trump, it wouldn’t be long before the platform’s algorithm was pushing QAnon conspiracy theories at them.
She told the Wall Street Journal that she does not want people to dislike or stop using the platform, but that it needs to be fixed.
“If people just hate Facebook more because of what I’ve done, then I’ve failed,” she told the Journal. “I believe in truth and reconciliation — we need to admit reality. The first step of that is documentation.”
Here are some quotes from Brian Stelter’s excellent interview with Nick Clegg, Facebook’s vice president of global affairs, which includes Clegg agreeing that there should be some regulation.
Is Instagram toxic to teenage girls? Not to all teenage girls, Clegg argued.
STELTER: For teenage girls, is the world better with Instagram in it or is it worse?
CLEGG: Well, the vast majority of teen girls and, indeed, boys who have been covered by some of the surveys that you referred to say that for the overwhelming majority of them, it either makes them feel better or it doesn’t make a difference one way or the other.
Why doesn’t Facebook release the kind of research that Haugen leaked? Clegg argued that Facebook has more than 1,000 Ph.D.s on staff and does a lot of research, not all of which is meant to be public. This research, he argued, was meant to help Facebook improve its platforms.
CLEGG: So we do a huge amount of research. We share it with external researchers as much as we can. But do remember, there is a — and I’m not a researcher, but researchers will tell you that there’s a world of difference between doing a peer-reviewed exercise, in cooperation with other academics, and preparing papers internally to provoke an informed internal discussion.
Is Facebook like the tobacco companies? Drawing the much-repeated comparison between Facebook trying to hook users and tobacco companies trying to hook smokers, Stelter said he enjoys Instagram but he does feel the pull of an addiction to it.
CLEGG: Let me give you one very simple reason why this is such a misleading analogy. The people who pay our lunch are advertisers. Advertisers don’t want their content next to hateful, extreme or unpleasant content.
Is Facebook a monster that’s too big to control? Possibly.
CLEGG: Even with the most sophisticated technology, which I believe we deploy, even with the tens of thousands of people that we employ to try and maintain safety and integrity on our platform, you’re right, Brian. We’re never going to be absolutely on top of this 100% of the time, because this is an instantaneous and spontaneous form of communication, where billions of human beings can express themselves as they want, when they want, to each other.
Is Facebook responsible for the divisions that led to the January 6 insurrection? No, Clegg argued.
CLEGG: I think it gives people false comfort to assume that there must be a technological or a technical explanation for the issues of political polarization in the United States.
STELTER: You think it’s too easy — it’s too easy to say it’s Facebook’s fault?
CLEGG: Well — well, I think it would be too easy, surely, to suggest that with a tweak to an algorithm, somehow all the disfiguring polarization in US politics would suddenly evaporate. I think it absolves people of asking themselves the harder questions about the historical, cultural, social and economical reasons that have led to the politics that we have in the US today.