Business Daily Media


Jacinda Ardern calls for 'ethical algorithms' to combat online extremism. What this means

  • Written by Nathalie Collins, Academic Director (National Programs), Edith Cowan University

New Zealand’s prime minister Jacinda Ardern has called for “ethical algorithms” to help stop online radicalisation.

She made her call on the weekend at the second summit of the “Christchurch Call[1]” for action to eliminate terrorist and violent extremist content online.

The first Christchurch Call summit was convened by Ardern and French president Emmanuel Macron in May 2019. It took place two months after New Zealand’s first and worst mass shooting in decades, the Christchurch mosque shootings, in which a 28-year-old Australian gunman killed 51 men, women and children.

The Christchurch Call is a voluntary compact between governments and technology companies. So far 55 nations have signed on – with the most notable new signatory being the United States[2], which refused to join under Donald Trump.

Google (which owns YouTube), Facebook, Twitter, Microsoft and Amazon have also signed on, as well as Japanese messaging app LINE, French search engine Qwant and video-sharing sites Daily Motion and JeuxVideo.

Trump supporters, believing false claims that the election was stolen, try to break through a police barrier at the US Capitol on January 6 2021. John Minchillo/AP

In light of clear examples of extremist behaviour still being fomented online – the storming of the US Capitol in January being a case in point – one might question how much has been achieved.

Read more: Two years on from the Christchurch terror attack, how much has really changed?[3]

On the weekend Ardern, while noting the progress made in areas such as the platforms’ protocols for moderating and removing extremist content, singled out the need for ethical algorithms[4]. Here’s why.

How social media platforms serve content

Imagine a vast restaurant. Service here works in an interesting way.

The waiters dash around the restaurant to bring diners as much food as they can eat. They don’t take orders but effectively direct you to what you will eat by putting that food in front of you.

The restaurant owner has designed it this way, to keep you eating as much as possible.

How do the waiters know what you like? They have a record of what you ate last time. They listen in on your table conversation. You mention you feel like French fries? They will bring you buckets of fries over and over.

At first you think: “Isn’t this wonderful, these waiters know just what I like.”

But the waiters don’t care about what you like. They just want you to keep eating. Even if the food is unhealthy and increases your risk of disease or death. No matter. They’ll keep bringing it as long as you keep eating.

If these waiters were ethical, if they cared about your well-being, they might bring you healthy alternatives. They might put a salad before you. If the restaurant owner was ethical, the service would not be designed to encourage overeating. It would seek to interest you in something else.

But then you might stop eating. You might leave the restaurant. That would hurt profits.

Algorithms are designed to decide what we see

Social media algorithms work much like the service in our metaphorical restaurant. Algorithms are tech companies’ secret recipes[5] to keep users on their platforms.

The easiest way to do that is to serve you content you like – perhaps with even more salt, sugar and fat.

On YouTube it’s more of the same type of content you’ve been watching. Like videos of stray dogs being rescued? You’ll get more of those recommended to you. If it’s videos about governments hiding alien technology, you’ll get more of those.

Facebook works a little bit differently. It will recommend groups for you to join based on your interests. If you’ve joined a group about native birds, or ascending to the fifth dimension, more such groups will be recommended to you. Those groups enable you to interact with and make “friends” with others who share your interests and beliefs.
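The “more of the same” logic described above can be sketched in a few lines of code. This is a purely illustrative toy – real platform algorithms are proprietary and vastly more complex – but it shows how simply counting what you have already watched produces a feed that keeps feeding you the same topic:

```python
# Toy sketch of an engagement-maximising recommender (illustrative only;
# real recommendation systems are proprietary and far more sophisticated).
from collections import Counter

def recommend(watch_history, catalogue, k=3):
    """Return the k unseen items whose topic the user has watched most."""
    # Count how often each topic appears in the user's history.
    topic_counts = Counter(item["topic"] for item in watch_history)
    seen = {item["id"] for item in watch_history}
    candidates = [c for c in catalogue if c["id"] not in seen]
    # Score candidates by past engagement with their topic:
    # the more you watched, the more of the same you are served.
    return sorted(candidates,
                  key=lambda c: topic_counts[c["topic"]],
                  reverse=True)[:k]

history = [{"id": 1, "topic": "dog rescues"},
           {"id": 2, "topic": "dog rescues"},
           {"id": 3, "topic": "alien cover-ups"}]
catalogue = [{"id": 4, "topic": "dog rescues"},
             {"id": 5, "topic": "alien cover-ups"},
             {"id": 6, "topic": "gardening"}]

print(recommend(history, catalogue, k=2))
```

Note what never enters the score: whether the content is true, healthy or harmful. Only past consumption counts, which is exactly the restaurant-waiter problem described above.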

Read more: Why Facebook created its own ‘supreme court’ for judging content – 6 questions answered[6]

Repetition and normalisation

These strategies reinforce and normalise our interests and views. They are crucial reasons for the viral-like spread[7] of extremism.

An idea, no matter how absurd or extreme, becomes more acceptable if repeated over and over again[8]. Advertisers know this. So do propagandists. The more we view videos and posts pushing the same ideas, and connect with people who share the same views, the more we feel we’re normal and it’s those who disagree with us who are deluded.

This radicalisation is a social phenomenon. It is also a business.

Those pushing or holding radical ideas often think they are opposing Big Tech and other corporate interests. They couldn’t be more wrong. Extremist content is a lucrative market segment. Keeping your eyes on a page, enthralling you and reinforcing your views is a way for content creators, social influencers and the platforms themselves to make bank, boost their ego and spread their message. Which, in turn, legitimises their message.

Remember the fundamental business model: for Big Tech it is about selling your attention to advertisers, no matter the message.

Read more: Reddit removes millions of pro-Trump posts. But advertisers, not values, rule the day[9]

New Zealand Prime Minister Jacinda Ardern, third right, at the Christchurch Call summit on May 15 2021, discussing how to combat violent extremism being spread online. Christchurch Call/AP

Can math be made ethical?

Ardern’s call is for algorithms designed with intent – the intent to reduce the promotion of content which can harm you, kill you or – given the right conditions – someone else.

An ethical algorithm[10] would encourage a more balanced diet, even if it meant you would stop consuming.
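One way to picture what such an algorithm might do is a re-ranking step layered on top of the engagement score. The sketch below is hypothetical – the harm scores, weights and field names are invented for the example, not drawn from any real platform – but it shows the basic idea: demote content flagged as potentially harmful and give unfamiliar topics (the “salad”) a boost:

```python
# Hypothetical sketch of an "ethical" re-ranking layer. The harm scores,
# penalty weights and item fields are invented for illustration.
def ethical_rerank(feed, harm_scores, diversity_bonus=1.0, harm_penalty=2.0):
    """Re-score an engagement-ranked feed: subtract a penalty proportional
    to an item's estimated harm, add a bonus for unfamiliar topics."""
    def score(item):
        base = item["engagement"]            # what platforms optimise today
        harm = harm_scores.get(item["id"], 0.0)
        novelty = diversity_bonus if item.get("new_topic") else 0.0
        return base - harm_penalty * harm + novelty
    return sorted(feed, key=score, reverse=True)

feed = [{"id": "a", "engagement": 0.9},                     # viral but flagged
        {"id": "b", "engagement": 0.6},
        {"id": "c", "engagement": 0.5, "new_topic": True}]  # the "salad"
harm = {"a": 0.4}   # hypothetical moderation score in [0, 1]

print([item["id"] for item in ethical_rerank(feed, harm)])
```

The most engaging item no longer wins automatically; the healthier alternative is served first. Of course, as the article goes on to note, this only relocates the hard question: someone still has to decide what counts as harmful.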

Limiting what the waiters can serve you doesn’t avoid the need for important discussions. For example, who should decide what “healthy” means? But this would be a less contentious, more productive debate than a stale argument about free expression versus censorship – especially when the real discussion is the promotion and convenience of “junk” thinking.

Limiting consumption by making things[11] harder to find, not delivered on a platter, is preferable to any outright ban.

Authors: Nathalie Collins, Academic Director (National Programs), Edith Cowan University

Read more https://theconversation.com/jacinda-ardern-calls-for-ethical-algorithms-to-combat-online-extremism-what-this-means-160986
