
How tech companies are failing women workers and social media users – and what to do about it

  • Written by Lilia Giugni, Research Associate, University of Bristol

From Elon Musk’s erratic start[1] as Twitter’s new owner to Meta’s recent decision[2] to lay off more than 11,000 employees, and an ongoing downturn for tech stocks[3], the social media sector is once again in turmoil.

But while these latest shockwaves have attracted a great deal of public attention, considerably less has been said about their repercussions for women. Big tech companies are failing women on both sides of the screen: their employees and the users of their services. This is why recent moves to regulate social media firms[4] should include specific protections for women.

Online abuse, as has been repeatedly confirmed by academic research[5] and civil rights groups[6], often targets women users. One of Musk’s first acts after buying Twitter was to introduce verification to reduce the number of fake accounts. Such accounts are often cited[7] among the main causes of social media violence. But the authentication process[8] (since withdrawn after protests from the Twitter community) simply relied on “certified” profiles paying a monthly fee.

As such, the move seemed more like a way to raise revenues than an effective online safety strategy. To make things worse, and more or less simultaneously, Musk also controversially restored the accounts[9] of several high-profile figures previously banned for misogynistic discourse. This included self-defined “sexist” influencer Andrew Tate[10].

Read more: Elon Musk's 'Twitter Blue' gives verification for a fee – this could make Twitter even less safe for women[11]

Beyond the tycoon’s chaotic approach to leadership, these decisions indicate wider trends within the social media industry with far-reaching ramifications for women.

Over the last few years, in fact, platforms such as Twitter, Facebook, YouTube and TikTok have all responded to mounting public pressure by adopting more stringent guidelines against gender-based hate speech[12]. These changes, however, have been mostly achieved through self-regulation[13] and voluntary partnerships with the public sector. This approach leaves companies free to reverse previous decisions in the way Musk has.

Besides, censoring individual internet personalities and promoting account verification doesn’t actually address the core causes of social media violence. The actual design of these platforms and the business models these companies employ play a more central role.

Social media platforms want to keep us all online to produce profitable data and maintain audiences for advertisements. They do this with algorithms that create an echo chamber: we keep seeing content similar to whatever attracted our clicks in the first place. But research shows this also facilitates the circulation of “divisive” messages[14]. It also supports the spread of online sexism[15], and pushes users who view problematic materials into a “black hole[16]” of related updates.
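
The underlying logic can be illustrated with a deliberately simplified sketch. The short Python example below is not any platform’s actual recommendation system; the posts, topics and scoring rule are invented purely to show how ranking content by its similarity to past clicks narrows what a user sees next.

from collections import Counter

# Hypothetical catalogue: each post is tagged with topics (invented for illustration).
POSTS = {
    "post_a": {"sports", "fitness"},
    "post_b": {"politics", "outrage"},
    "post_c": {"politics", "memes"},
    "post_d": {"cooking", "family"},
    "post_e": {"outrage", "memes"},
}

def rank_feed(click_history):
    """Rank posts by how many topics they share with posts the user already clicked."""
    interests = Counter()
    for post_id in click_history:
        interests.update(POSTS[post_id])

    def score(post_id):
        # More overlap with past clicks means a higher place in the feed.
        return sum(interests[topic] for topic in POSTS[post_id])

    return sorted(POSTS, key=score, reverse=True)

# After two clicks on outrage-adjacent posts, similar posts dominate the feed:
print(rank_feed(["post_b", "post_e"]))
# -> ['post_b', 'post_e', 'post_c', 'post_a', 'post_d']

Each click feeds back into the ranking, so the more a user engages with a given kind of content, the more of it they are shown. That feedback loop is the mechanism the research cited above describes.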

While the platforms themselves have become problematic for women who use them, many of the companies behind them are also failing the women workers who build and manage online social media networks.

Tech company redundancies

Social media companies’ treatment of employees should also be examined through a gender lens, particularly now that they are reacting to a market downturn[17] with mass layoffs and other cost-cutting strategies.

A particularly at-risk category (which I have examined, among others, in my recently published book[18]) is that of social media moderators[19]. These employees are tasked with clearing platforms of content that violates community standards. They are constantly exposed to misogynistic hate speech, images of sexual violence and non-consensual pornography. Female staff[20] tend to be particularly affected, and many develop mental health issues[21] as a result, including depression, anxiety and post-traumatic stress disorder.

Social media firms and their international subcontractors (to which a large part of moderation operations is outsourced) make other choices that infringe on employees’ rights, particularly those of female moderators. One of the latest has been placing AI-powered cameras[22] in the homes of moderators who work remotely. This is a particularly brutal intrusion for women, since they already often face harassment or safety issues in more public spaces.

Online abuse and workers’ treatment concern people of all genders. Women, however, pay a unique price for social media violence. Recent research from The Economist[23] shows that fear of further abuse pushed nine out of ten female victims surveyed to alter their digital habits, and 7% even quit their jobs.

Moderators delete posts that violate community standards on social media and so are regularly exposed to disturbing content. fizkes/Shutterstock

Specific solutions to online hate

Just as women workers and users encounter specific issues as a result of social media policies – or lack thereof – the interventions designed to improve their safety and wellbeing should also be specific.

My book looks at how digital capitalists – including but not limited to social media corporations – fail female users and workers, and how to remedy this[24]. Among the reforms I suggest are interventions to make platforms more accountable.

The UK Online Safety Bill[25] is set to give regulators the power to fine or prosecute companies that neglect to remove harmful materials, for example. It is important, though, that policy change in this area specifically identifies women as a protected category, which this bill currently fails to do[26]. Transparency commitments for platforms’ algorithms and regulations around data-mining business models could also help, but are so far not, or not fully, integrated into most national and international legislation.

And since workers must be protected as much as technology users, it is vital that they can organise via trade unions[27], and that there is a push to ensure employers respect their duty of care towards the workforce. This might involve prohibiting invasive workplace surveillance, for example.

There is one solution to both issues: it is time for social media giants to implement specific strategies to safeguard women on both sides of the screen.

References

  1. ^ erratic start (theconversation.com)
  2. ^ Meta’s recent decision (www.nytimes.com)
  3. ^ downturn for tech stocks (www.ft.com)
  4. ^ regulate social media firms (www.theguardian.com)
  5. ^ repeatedly confirmed by academic research (arxiv.org)
  6. ^ civil rights groups (www.amnesty.org)
  7. ^ often cited (www.compassioninpolitics.com)
  8. ^ authentication process (www.cleanuptheinternet.org.uk)
  9. ^ restored the accounts (www.euronews.com)
  10. ^ Andrew Tate (www.theguardian.com)
  11. ^ Elon Musk's 'Twitter Blue' gives verification for a fee – this could make Twitter even less safe for women (theconversation.com)
  12. ^ gender-based hate speech (www.epe.admin.cam.ac.uk)
  13. ^ self-regulation (mckinneylaw.iu.edu)
  14. ^ circulation of “divisive” messages (intpolicydigest.org)
  15. ^ spread of online sexism (research-information.bris.ac.uk)
  16. ^ black hole (www.theguardian.com)
  17. ^ react to a market downturn (www.businessinsider.com)
  18. ^ recently published book (septemberpublishing.org)
  19. ^ social media moderators (www.theverge.com)
  20. ^ Female staff (www.washingtonpost.com)
  21. ^ develop mental health issues (www.theverge.com)
  22. ^ placing AI-powered cameras (www.theguardian.com)
  23. ^ research from The Economist (onlineviolencewomen.eiu.com)
  24. ^ how to remedy this (gen-pol.org)
  25. ^ UK Online Safety Bill (bills.parliament.uk)
  26. ^ currently fails to do (demos.co.uk)
  27. ^ they can organise via trade unions (www.wired.co.uk)

Read more https://theconversation.com/how-tech-companies-are-failing-women-workers-and-social-media-users-and-what-to-do-about-it-199324
