
Jacinda Ardern calls for 'ethical algorithms' to combat online extremism. What this means

New Zealand’s prime minister Jacinda Ardern has called for “ethical algorithms” to help stop online radicalisation.

She made her call on the weekend at the second summit of the “Christchurch Call[1]” for action to eliminate terrorist and violent extremist content online.

The first Christchurch Call summit was convened by Ardern and French president Emmanuel Macron in May 2019. It took place two months after New Zealand’s first and worst mass shooting in decades, the Christchurch mosque shootings, in which a 28-year-old Australian gunman killed 51 men, women and children.

The Christchurch Call is a voluntary compact between governments and technology companies. So far 55 nations have signed on – with the most notable new signatory being the United States[2], which refused to join under Donald Trump.

Google (which owns YouTube), Facebook, Twitter, Microsoft and Amazon have also signed on, as well as Japanese messaging app LINE, French search engine Qwant and video-sharing sites Daily Motion and JeuxVideo.

Trump supporters, believing false claims the election was stolen, try to break through a police barrier at the US Capitol on January 6 2021. John Minchillo/AP

In light of clear examples of extremist behaviour still being fomented online – the storming of the US Capitol in January being a case in point – one might question how much has been achieved.

Read more: Two years on from the Christchurch terror attack, how much has really changed?[3]

On the weekend Ardern, while noting the progress made in areas such as the platforms’ protocols for moderating and removing extremist content, singled out the need for ethical algorithms[4]. Here’s why.

How social media platforms serve content

Imagine a vast restaurant. Service here works in an unusual way.

The waiters dash around the restaurant bringing diners as much food as they can eat. They don’t take orders; instead, they effectively decide what you will eat by putting food in front of you.

The restaurant owner has designed it this way, to keep you eating as much as possible.

How do the waiters know what you like? They have a record of what you ate last time. They listen in on your table conversation. You mention you feel like French fries? They will bring you buckets of fries over and over.

At first you think: “Isn’t this wonderful, these waiters know just what I like.”

But the waiters don’t care about what you like. They just want you to keep eating. Even if the food is unhealthy and increases your risk of disease or death. No matter. They’ll keep bringing it as long as you keep eating.

If these waiters were ethical, if they cared about your well-being, they might bring you healthy alternatives. They might put a salad before you. If the restaurant owner was ethical, the service would not be designed to encourage overeating. It would seek to interest you in something else.

But then you might stop eating. You might leave the restaurant. That would hurt profits.

Algorithms are designed to decide what we see

Social media algorithms work much like the service in our metaphorical restaurant. Algorithms are tech companies’ secret recipes[5] for keeping users on their platforms.

The easiest way to do that is to serve you content you like – perhaps with even more salt, sugar and fat.

On YouTube it’s more of the same type of content you’ve been watching. Like videos of stray dogs being rescued? You’ll get more of those recommended to you. If it’s videos about governments hiding alien technology, you’ll get more of those.

Facebook works a little differently. It will recommend groups for you to join based on your interests. If you’ve joined a group about native birds, or ascending to the fifth dimension, more such groups will be recommended to you. Those groups enable you to interact with and make “friends” with others who share your interests and beliefs.
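To make the mechanism concrete, here is a deliberately simplified sketch in Python of an engagement-driven recommender. It is purely illustrative, with made-up data and function names, and is not how YouTube’s or Facebook’s actual systems work: it ranks candidate items solely by how much their topics overlap with what the user has already consumed, so “more of the same” always rises to the top.

```python
# Toy illustration only: an engagement-driven recommender that ranks
# candidates purely by overlap with the user's past viewing.
from collections import Counter

def topic_profile(watch_history):
    """Build a user profile as topic frequencies from items already watched."""
    return Counter(topic for item in watch_history for topic in item["topics"])

def engagement_score(item, profile):
    """Score a candidate purely by overlap with the user's existing interests."""
    return sum(profile[topic] for topic in item["topics"])

def recommend(candidates, watch_history, k=3):
    """Rank candidates so 'more of the same' always rises to the top."""
    profile = topic_profile(watch_history)
    ranked = sorted(candidates, key=lambda c: engagement_score(c, profile), reverse=True)
    return ranked[:k]

history = [{"topics": ["conspiracy"]}, {"topics": ["conspiracy", "aliens"]}]
candidates = [
    {"title": "Governments are hiding alien technology", "topics": ["conspiracy", "aliens"]},
    {"title": "Stray dog rescued from a storm drain", "topics": ["animals"]},
    {"title": "Five easy weeknight dinners", "topics": ["food"]},
]
print([c["title"] for c in recommend(candidates, history)])
# The conspiracy video tops the list every time; nothing in the objective
# ever nudges the ranking toward anything else.
```

Run on the toy data, the conspiracy video wins every time, because the only thing the score rewards is repetition of what the user already engages with.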

Read more: Why Facebook created its own ‘supreme court’ for judging content – 6 questions answered[6]

Repetition and normalisation

These strategies reinforce and normalise our interests and views. They are crucial reasons for the viral-like spread[7] of extremism.

An idea, no matter how absurd or extreme, becomes more acceptable if repeated over and over again[8]. Advertisers know this. So do propagandists. The more we view videos and posts pushing the same ideas, and connect with people who share the same views, the more we feel we’re normal and it’s those who disagree with us who are deluded.

This radicalisation is a social phenomenon. It is also a business.

Those pushing or holding radical ideas often think they are opposing Big Tech and other corporate interests. They couldn’t be more wrong. Extremist content is a lucrative market segment. Keeping your eyes on a page, enthralling you and reinforcing your views is a way for content creators, social influencers and the platforms themselves to make bank, boost their egos and spread their message, which in turn legitimises it.

Remember the fundamental business model: for Big Tech it is about selling your attention to advertisers, no matter the message.

Read more: Reddit removes millions of pro-Trump posts. But advertisers, not values, rule the day[9]

New Zealand Prime Minister Jacinda Ardern, third right, at the Christchurch Call summit on May 15 2021, discussing how to combat violent extremism being spread online. Christchurch Call/AP

Can math be made ethical?

Ardern’s call is for algorithms designed with intent – the intent to reduce the promotion of content which can harm you, kill you or – given the right conditions – someone else.

An ethical algorithm[10] would encourage a more balanced diet, even if it meant you would stop consuming.
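As a contrast to the engagement-only sketch above, here is an equally simplified sketch of what a “balanced diet” re-ranking might look like. It is an assumption made purely for illustration, not a description of any real platform’s system or of the Christchurch Call’s proposals: the same engagement signal, but with a penalty for content flagged as harmful and a small bonus for topics the user has not already been saturated with.

```python
# Toy illustration only: re-rank candidates with a harm penalty and a
# novelty bonus, rather than rewarding engagement alone.
from collections import Counter

def ethical_recommend(candidates, watch_history, harm_flags, k=3,
                      harm_weight=5.0, novelty_bonus=1.0):
    """Score = engagement, minus a penalty for flagged items,
    plus a small bonus for topics the user has not seen before."""
    profile = Counter(t for item in watch_history for t in item["topics"])

    def score(item):
        engagement = sum(profile[t] for t in item["topics"])
        harm = harm_weight if item["title"] in harm_flags else 0.0
        novelty = novelty_bonus * sum(1 for t in item["topics"] if profile[t] == 0)
        return engagement - harm + novelty

    return sorted(candidates, key=score, reverse=True)[:k]

# Reusing the toy history and candidates from the earlier sketch:
history = [{"topics": ["conspiracy"]}, {"topics": ["conspiracy", "aliens"]}]
candidates = [
    {"title": "Governments are hiding alien technology", "topics": ["conspiracy", "aliens"]},
    {"title": "Stray dog rescued from a storm drain", "topics": ["animals"]},
    {"title": "Five easy weeknight dinners", "topics": ["food"]},
]
harm_flags = {"Governments are hiding alien technology"}
print([c["title"] for c in ethical_recommend(candidates, history, harm_flags)])
# The flagged video drops down the list and unfamiliar topics get a look-in,
# even though that almost certainly means less time spent on the platform.
```

The point of the sketch is the trade-off in the last comment: any such weighting deliberately sacrifices some engagement, which is exactly why platforms have little commercial incentive to adopt it on their own.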

Limiting what the waiters can serve you doesn’t completely avoid the need for important discussions. Who, for example, should decide what “healthy” means? But this would be a less contentious, more productive debate than a stale argument about free expression versus censorship, especially when the real issue is the promotion and convenience of “junk” thinking.

Limiting consumption by making things[11] harder to find, rather than serving them up on a platter, is preferable to any outright ban.

Authors: Nathalie Collins, Academic Director (National Programs), Edith Cowan University

Read more https://theconversation.com/jacinda-ardern-calls-for-ethical-algorithms-to-combat-online-extremism-what-this-means-160986
