Facebook and radicalisation: how can we regulate the internet to prevent harm? – Stuff

Posted: April 6, 2022 at 1:55 am

Oh, hi! Long time no see! My name is Robbie and I'm another white man behind a desk, and things are pretty bad, huh?

In case you never go outside and have these videos sent to your door on a USB delivered by a carrier pigeon, you might have missed a wee bit of a scuffle outside New Zealand Parliament. It turns out there were some folks there with some really interesting opinions! Would you like to meet them? I bet you wouldn't!

It's time to play Meet That Protestor!

[Spanish Flea starts playing. Robbie pulls out a thin old-school microphone.]

Bachelorette number one is Chantelle Baker! With nearly 100,000 Facebook followers, Chantelle well outnumbers her dad, former leader of the New Conservatives, who only has 52,000 Facebook followers. Embarrassing! Chantelle enjoys sharing live streams of peace and love that include other protestors saying, "We're not leaving till we hang them." Baker disavows those protestors in the strongest possible terms by joining them and supporting their protest.


Bachelor number two! It's the Freedom and Rights Coalition with 44,000 followers on Facebook. No wonder people love them! After all, they stand for Freedom and Rights. This might be Brian Tamaki's group, but unfortunately, he can't make it because of all that breaking-the-law business. But don't worry! He could be your pen pal! The man writes beautiful letters warning New Zealanders that we're heading down the path of "UN ideology of socialism". After a walk down the path of socialism, why not go for a romantic walk down the beach with the Freedom and Rights Coalition!

Supplied/WhiteManBehindADesk

Satirist Robbie Nicol AKA White Man Behind A Desk details Facebook and radicalisation.

Bachelor number three is Counterspin Media NZ! Why not form a throuple with Kelvyn Alp and Hannah Spierer? Together you can fight the Deep State and the transhumanist agenda, and you're bound to win, because those things don't exist! Alp has lots to teach you about the moon landings, and he'd love to buy you dinner with the money he makes selling weight-loss pills and fraudulent vaccine passports. Maybe he'll even tell you about the violent coup he's got planned. Shhh! It's a secret, but he'll probably yell it to you in a livestreamed video being watched by police.

Congratulations! You've made it to the protest! Now that you're here, why not get to know someone new? Introducing Bachelor number four, far-right white supremacist group Action Zealandia! Parliament changed security arrangements after video appeared to show these folks gaining access to a construction site in Bowen House. When they're not helping protestors become Nazis, Action Zealandia enjoys organising terror cells, fighting hate speech laws, and grooming teenagers to join their white supremacist organisation! Action Zealandia isn't allowed to have a Facebook page any more, and I'll give you 10 guesses why.

What a friendly bunch.

Obviously, that's just a sample. There's also Billy Te Kahika, NZ Doctors Speaking Out with Science (or, more accurately, speaking without science), and a bunch of people exhausted and angry with Covid and the world in general who didn't know what else to do.

But! We're not going to be talking about police decisions in response to the protest, or Speaker of the House Trevor Mallard playing games with the sound system. Instead, we're going to talk about how the protestors got there, because while a number of different factors led people there, one thing stands out above all else: Facebook.

Founder, CEO, and controlling shareholder of Facebook Mark Zuckerberg was determined to get people vaccinated. He and his wife, Priscilla Chan, have invested a lot of time and money into vaccination programmes, which makes it even sadder that his wee side-hustle, Facebook, turned so many people against them. It's like if Ronald McDonald became obsessed with encouraging healthy eating, but refused to give up his day job.

Facebook makes money in a similar way to a newspaper: users write content, Facebook publishes that content, they editorialise that content with an algorithm, and then they sell ad space around that content. For context, these ad sales accounted for nearly all of Facebook's US$86 billion revenue in 2020.

Unlike a newspaper, Facebook doesn't pay its writers, fact-check its content, or spend much money on editorial oversight. The good part of this system is that you get to hear from people that newspapers wouldn't normally publish. The bad part is that you get to hear from people that newspapers wouldn't normally publish.

For example, last month was the three-year anniversary of the Christchurch shooting, which was streamed live on Facebook and seen by thousands of people.

In response to the shooting, Facebook re-established the GIFCT as an independent organisation with their frenemies Microsoft, YouTube, and Twitter. The GIFCT is focused on removing TVEC. Acronyms are fun! Unless they stand for Global Internet Forum to Counter Terrorism or Terrorist and Violent Extremist Content. Then they're no fun at all.

The operating board of the GIFCT is made up of Facebook, Microsoft, Twitter, and YouTube (owned by Google). It's sort of like a cigarette company setting up a council to help us deal with all the lung cancer. Thanks, corporations! Where would we be without you helping us deal with the problem you cause for profit?

When one of these companies identifies a piece of content as TVEC through vague and unknowable means, they give it a hash, pop it in the database of hashes, and it goes out to the Hash Sharing Consortium. Now, the Hash Sharing Consortium might sound like a nerdy group of stoners, but it isn't. It's different.

Basically, tech companies give TVEC a digital fingerprint, so all the other major online service providers can be like, "Oop, someone's trying to upload a terrorist video." And they stop them from uploading the content.

People can then slightly edit that content and reupload it; then someone has to report the new content, get the new hash added to the database, and the whole process starts again. It's kind of like whack-a-mole, only instead of a mole, it's the worst thing you've ever seen in your life.
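The fingerprint-and-block loop can be sketched in a few lines of Python. Big caveat: the GIFCT's actual system uses perceptual hashing and its internals aren't public, so the function names here are made up, and the use of exact SHA-256 hashes is illustrative. Exact hashing does, however, show exactly why the whack-a-mole problem exists: change one byte and you get a completely different fingerprint.

```python
import hashlib

# Illustrative sketch of hash-based blocking. The real GIFCT database
# uses perceptual hashes; every name and detail here is invented.
blocked_hashes = set()

def fingerprint(content: bytes) -> str:
    """Give a piece of content a digital fingerprint."""
    return hashlib.sha256(content).hexdigest()

def flag_as_tvec(content: bytes) -> None:
    """Share the fingerprint with the (pretend) Hash Sharing Consortium."""
    blocked_hashes.add(fingerprint(content))

def allow_upload(content: bytes) -> bool:
    """Reject any upload whose fingerprint is already in the database."""
    return fingerprint(content) not in blocked_hashes

video = b"the worst thing you've ever seen"
flag_as_tvec(video)

print(allow_upload(video))         # False: an exact copy is blocked
print(allow_upload(video + b"!"))  # True: a one-byte edit slips through
```

Perceptual hashes survive small edits better than this, but the arms race is the same: edit the file enough and the fingerprint no longer matches.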

The way these OSPs, or Online Service Providers, decide what counts as terrorist content is haphazard at best. When New Zealand's Chief Censor, David Shanks, decides what content should be illegal, it's an extremely delicate process. He has to decide whether sharing extremist imagery is an important part of exposing New Zealanders to the horrors of the world, or whether it may cause further harm. According to the Guardian, "You've got to protect freedom of expression," he says.

"You've got to protect this vital ability to have opinions, to spread them, to access information of any kind."

Dawid Sokołowski/Unsplash

In response to the shooting, Facebook re-established the GIFCT as an independent organisation with their frenemies Microsoft, YouTube, and Twitter, writes Robbie Nicol.

"The only reason to diverge from that principle, ever," he says, "is to prevent harm", something he consults groups ranging from medical experts to high school students about.

Facebook has a different approach.

As of 2019, Facebook was paying people US$15 an hour to look at up to 400 posts a day of the worst stuff imaginable: beheadings, animal abuse, hate speech, and pictures of you with the flash on.

It was then up to these underpaid, traumatised workers to decide the cultural and political context of each post from countries all over the world, sometimes in languages they didn't understand. This led to genuine Nazi content being left up and people getting blocked for sharing photos of Taika Waititi from Jojo Rabbit. That's not a joke, by the way; that actually happened.

This work is supplemented by machine learning that works perfectly. According to the Wall Street Journal, in leaked documents scientists pointed out the company's ability to detect the content in comments was "bad in English, and basically non-existent elsewhere". So, as I said, perfect.

This work is also supplemented by a list of dangerous individuals and organisations, the terrorist section of which seems to be heavily copy-and-pasted from the US government's list of terrorists. This is good because the US government is totally unbiased and completely trusted by every country in the world.

And while these companies often leave up content they claim to have banned, they're also taking down content that they shouldn't. In June 2017, YouTube announced a plan to combat terrorist content online, and it worked well. In fact, you could argue it worked too well.

To quote Wired: "The quick flagging and removal of content appears successful. Unfortunately, we know this because of devastating false-positives: the removal of content used or uploaded by journalists, investigators, and organisations curating materials from conflict zones and human rights crises for use in reporting, potential future legal proceedings, and the historical record."

The thing is, censorship is hard. It's complicated and political and has enormous ramifications for democracy. Facebook isn't doing it properly, and they're not being nearly transparent enough about how they do it.

But even worse than Facebook's inability to effectively remove harmful content is how they editorialise all the stuff that's left.

To understand why Facebook is such a great platform for radicalisation, it helps to look at the changes they made to their algorithm in 2018. When I say "the Facebook algorithm", I mean the code they use to decide what pieces of content they promote.

In the same way that a newspaper uses an editor to decide what goes on the front page, Facebook uses an algorithm to decide what goes at the top of your newsfeed.

What's fun about the changes they made in 2018 is that Facebook pretended they were making these changes for the greater good. Unfortunately, internal documents suggest that instead of making these changes for the good of humankind, they were actually making the changes to increase profit, which was shocking to everyone involved.

In 2018, Facebook was worried about declining "engagement". To quote the Wall Street Journal, the fear was that eventually, users might stop using Facebook altogether. A terrifying thought.

[Insert Lionel Hutz shuddering at the idea of a world without lawyers.]

So, Facebook switched things up. Now, an angry reaction was worth five times a like; a long, passionate comment was worth twice a short comment saying, "Good job!"; an angry rant sharing the original content was worth 30 times a like; and so on.
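Those weightings make the incentive easy to see in a toy scoring function. The weights are the ones reported by the Wall Street Journal; the scoring function itself is my guess at how they'd be combined, not Facebook's actual code.

```python
# Weights from the reported 2018 ranking change; the rest is invented.
WEIGHTS = {
    "like": 1,
    "angry": 5,         # an angry reaction: five times a like
    "short_comment": 1,
    "long_comment": 2,  # a long, passionate comment: twice a short one
    "reshare_rant": 30, # an angry rant resharing the post: 30 times a like
}

def engagement_score(post: dict) -> int:
    """Sum weighted interactions to rank a post in the newsfeed."""
    return sum(WEIGHTS[kind] * count for kind, count in post.items())

calm_post = {"like": 100, "short_comment": 10}
angry_post = {"angry": 30, "long_comment": 10, "reshare_rant": 5}

print(engagement_score(calm_post))   # 110
print(engagement_score(angry_post))  # 320
```

With far fewer total interactions, the outrage post wins the front page. That's the whole business model in two dictionaries.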

Glen Carrie/Unsplash

Facebook uses an algorithm to decide what content appears at the top of your newsfeed.

In a public letter to Facebook, BuzzFeed CEO Jonah Peretti, the elder brother of Chelsea Peretti (FUN FACT!), complained that the change to the algorithm was forcing them to post increasingly controversial content to generate arguments in the comments.

In an internal report investigating the effects of the new algorithm on the politics of Poland, Facebook researchers wrote, "One party's social media management team estimates that they have shifted the proportion of their posts from 50/50 positive/negative to 80 per cent negative, explicitly as a function of the change to the algorithm."

Staff at Facebook tried to come up with solutions, but again, quoting the Wall Street Journal, "Mr Zuckerberg said he didn't want to pursue it if it reduced user engagement, according to the documents." What a piece of s....

The world's most popular newspaper, Facebook, is a lawless hellscape that chooses its top stories based on how many complaints they get, and it turns out that has some negative consequences for the planet Earth and the people who live there.

A landmark study in Germany looked at thousands of anti-refugee hate crimes and compared them to variables that might be relevant. These included wealth, demographics, support for far-right politics, newspaper sales, the number of refugees, history of hate crime and the number of protests. One variable stood out: Towns where Facebook use was higher than average reliably experienced more attacks on refugees.

Sometimes Facebook's ability to fuel hate crimes is even more extreme. Facebook was forced to admit that it played a role in inciting violence during the military's genocidal campaign against the Rohingya in Myanmar. Just for some context here, inciting violence in a genocide is bad.

Similarly, in Ethiopia, an investigation by Vice said violence had been "supercharged by the almost-instant and widespread sharing of hate speech and incitement to violence on Facebook", which whipped up people's anger.

And then there's Covid-19. Again, Facebook wanted to stop the spread of misinformation. They gave free ads to the World Health Organisation and added links to accurate information on posts about the pandemic, but it wasn't enough to counteract the fundamental business model of Facebook: publish content without editorial oversight and promote anything that drives engagement.

So, cool. Great. Facebook, the world's most popular newspaper, might eventually stop publishing neo-Nazis like the folks at Action Zealandia, but they will actively promote anti-vaxxers because of all the people arguing in the comments, and once you're at the rally, I'm sure there are some friendly dudes in brown shirts ready to say hi.

The problem is big, but, surprisingly, governments seem willing to tackle it anyway.

Law-makers in the EU proposed a law that would require online service providers to remove illegal content within one hour. It turned out to be pretty controversial, but as with most controversial acts, the French did it anyway. And this law doesn't just cover TVEC!

The BBC writes: "Failure to remove content could attract a fine of up to €1.25m (£1.1m). France's regulator, the Superior Council of the Audiovisual (CSA), will have the power to impose heftier fines of up to 4 per cent of global turnover for continuous and repeated violations."

For context, 4 per cent of Facebook's revenue in 2021 was nearly US$5 billion. That's quite a big fine.
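You can check that figure yourself, assuming Meta's reported 2021 revenue of roughly US$117.9 billion (the figure from its annual report):

```python
# 4 per cent of global turnover, using Meta's reported 2021 revenue
# of about US$117.9 billion.
revenue_usd = 117.9e9
max_fine = 0.04 * revenue_usd
print(f"US${max_fine / 1e9:.1f} billion")  # US$4.7 billion
```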

The UK has put forward a white paper on a statutory duty of care, arguing it is "the platform that should be regulated not the content, including the design of the platform and the operation of the business. Secondly, the duty of care implies a risk assessment so that reasonably foreseeable harms are avoided where possible or mitigated."

Its written like that because the British are, unfortunately, British.

Here in New Zealand, we're undergoing a review of content regulation and working on hate speech reform.

The Department of Internal Affairs has been put in charge of removing TVEC, forcing them to draw the line between radical politics and terrorism, and the line between important journalism on the topic of terrorism and media used to promote terrorist acts.

Maybe that's something our Chief Censor and the Classification Office should decide, but there you go. Apparently, we've decided to give it to the Department of Internal Affairs, the department of government work that nobody else wanted to do.

Ultimately, we know that the GIFCT is insufficient, because social media companies are not going to voluntarily invest enough money to monitor what they share. We know that the Facebook algorithm is a worse editor than Rupert Murdoch, willing to throw anything on the front page that riles people up. And we know that this problem is not limited to Facebook; this s...show runs across multiple platforms that all basically work the same way.

But it took us a long time to figure out how to regulate television and radio and newspapers. And it's going to take us a long time to figure out how to regulate the internet. These are enormous questions of democracy, and free speech, and protecting people from harm.

Everyone needs to be a part of this discussion.

So, if you wouldn't mind, maybe start a long, pointless argument in the comments. Do all the different kinds of reactions you can think of. Reshare this video with a long speech you've copy-and-pasted from the internet.

Maybe that way we can get an important story on the front page.

White Man Behind A Desk is the work of satirist Robbie Nicol and playwright Finnius Teppett. See more at Patreon.com/WhiteManBehindADesk.
