San Francisco – Meta deliberately provides less help, less reporting of online abuse, and less safety on its Facebook platform for people living outside the US in order to save on costs, says Facebook whistleblower Frances Haugen.
Speaking to Australia’s Select Committee on Social Media and Online Safety on Thursday morning, Haugen testified that Facebook takes down only the “bare minimum” of harmful content, reports ZDNet.
This is especially the case when content is in languages that are not widely spoken in developed countries, as criticism from these underrepresented users is minimal, the report said.
“It can consistently underinvest in safety, and particularly, it really underinvests in safety outside the US because, disproportionately, their safety budget is spent on paving the US,” Haugen was quoted as saying.
“I am sure on a per capita basis there is less help, less support and less safety for Australians because Facebook knows it operates in the dark. Where they do not have to, they do not apologise about anything,” Haugen added.
Providing an example of Facebook doing the bare minimum, Haugen claimed that an intervention screen for eating disorder and self-harm content, previously touted by Meta global safety head Antigone Davis, was being shown only hundreds of times per day as of last year.
On Monday, the Department of Home Affairs shared similar findings with the committee, singling out Meta as being “frequently the most reluctant to work with government” when it comes to promoting a safe online environment, adopting a safety-by-design approach, and taking adequate proactive measures to prevent online harms.
Haugen provided the testimony as part of the committee’s inquiry into the practices major technology companies use to curb toxic online behaviour.
The inquiry was approved by the federal government at the end of last year with the intention of building on the proposed social media legislation to “unmask trolls”.
As in her previous appearances before governments in other jurisdictions, Haugen flagged the Facebook platform’s core issues as its algorithms, which push extreme content, and its decision to let a higher rate of inappropriate content remain online rather than risk mistakenly removing appropriate content. (IANS)