Meta faces lawsuits over its algorithm

Meta has been hit with eight complaints, filed in several states, alleging that its algorithms contributed to mental health problems in younger users, including eating disorders, insomnia, and suicidal thoughts or tendencies.


The complaints allege that excessive time on Instagram and Facebook poses serious mental health risks. One plaintiff claimed that Meta misrepresented the safety, utility, and non-addictive properties of its platforms.


Rather than targeting the content itself, these complaints target the algorithm. That may blunt Meta's strongest defense: Section 230 of the Communications Decency Act, which shields the company from legal liability for third-party content posted on its platforms.


A federal appeals court ruled last year that Snap could be held liable for its speed filter, which allegedly encouraged reckless driving and contributed to a fatal car crash in 2017.


"This ruling opened the door to lawsuits like the one against Meta," said Eric Goldman, co-director of the High-Tech Law Institute at Santa Clara University.


The plaintiffs in the Snap case argued that the speed filter was not third-party content but a design choice made by Snap itself.


Because the court ruled that Snap was not protected by Section 230 in that case, other plaintiffs are trying to get around the law in a similar way.


But Goldman argued that the Snap ruling and the cases against Meta are qualitatively different, because in Meta's case the algorithm and the content it serves cannot be separated.


"This idea that we can distinguish between dangerous algorithms and dangerous externalities via the algorithm is in my opinion an illusion," he said. The algorithm directs people to only view the content. Content is the problem.


Goldman also noted that the lawsuits against Meta were filed in the hope that at least one of the eight judges overseeing the cases will rule against the company. But if one judge rules in Meta's favor, the others are likely to follow suit.


Can Meta be sued for its algorithm?

Regardless of Section 230, social media companies cannot escape scrutiny over platform addiction. A bill passed by the California State Assembly in late May would give parents the right to hold social media platforms accountable if their children become addicted.


Several platforms are also under investigation by prosecutors across the country. In November, a coalition led by attorneys general from several states began investigating Instagram to determine how it keeps young people engaged. The group expanded its investigation to include TikTok in March.


Whistleblower Frances Haugen testified before Congress last year that Meta has not been forthcoming about Instagram's effects on young people.


Internal research showed that the app exacerbated mental health problems for teenage girls in particular.


“The platforms incorporate addictive design, features, and algorithmic amplification of disturbing content,” said Jim Steyer, CEO of Common Sense Media, describing tactics that social media platforms like Meta use.


Amid public pressure, social media companies are showing signs of wanting to curb user addiction.


TikTok recently announced additional screen-time controls aimed at helping users limit how long they spend scrolling.


Instagram has implemented similar daily time limits, though it quietly removed mobile users' ability to set a daily time-limit reminder of less than 30 minutes.


These moves seem beneficial, but time limits like these are easy to override, and they may simply be a way for these companies to save face while still keeping users on their platforms for as long as possible.


If states continue to pass legislation holding social media companies responsible for content posted on their platforms, Congress or the Supreme Court may step in to amend or clarify Section 230, which could fundamentally reshape the way social media companies operate.
