Thursday, February 3, 2022

Social Media Accountability - An Interview

Once again, the spotlight is being shone on the impact social media has on adolescents: two new lawsuits have been filed against Instagram and Snapchat claiming the platforms contributed to severe anxiety in young people and, in the case of one 11-year-old girl, to her death.

Filed by Matthew Bergman, founder of the Social Media Victims Law Center, the lawsuits claim the platforms played an integral part in one young girl's suicide and in another girl's serious eating disorder. While whistleblowers from Meta (Instagram's parent company) have testified before Congress regarding these issues, the companies have, to date, not faced a courtroom over the accusations.

According to Aron Solomon, chief legal analyst for Esquire Digital, who recently spoke with NBC News regarding the lawsuits, “I think cases like this give courts an opportunity to remind both social media companies and parents of their individual and joint responsibilities. However, by saying that these applications, whether it’s Snap or Meta or Twitter, or anything else are so inherently dangerous that even with parental control and supervision, their child ended up dying from suicide is going to be very difficult to prove in the courtroom.”

I had a chance to learn more in an email interview.

Can you share a little bit about the background on the social media lawsuits?
A Seattle lawyer, Matthew Bergman, founded the Social Media Victims Law Center. He's representing two high-profile cases: one in Connecticut, where a mother, Tammy, alleges that her 11-year-old daughter Selena died by suicide after developing an addiction to Instagram and Snapchat, and one in Oregon, where a mother says her 14-year-old daughter developed an eating disorder and ran away from home several times after being harassed on Instagram and Snapchat.

Can social media companies be held responsible and accountable?
To an extent, yes. When we use any application, there is a Terms of Service to which we have to agree. It's obviously very different when a minor is involved.

If we look at Snap's terms of service, for example, they bar use by children under a certain age and (as most ToS do) make the parent responsible for anyone under 18:

No one under 13 (or, if greater than 13, the minimum age at which a person may use the Services in your country) is allowed to create an account or use the Services. If you are under 18 (or the legal age of majority in your country), you may only use the Services with the prior consent of your parent or legal guardian.

BUT, the argument will be that even with the Terms of Service, these providers can't get out from under legal responsibility. Snap and Meta know that their services are used by children and facilitate rather than prevent that.

What implications could this case have, both for parents and for social media companies?
Ultimately, the law will recognize a shared responsibility between the parents of minor children who use these services and the providers themselves. These apps are not so inherently dangerous that they shouldn't be used. However, it's a parent's responsibility to ensure that any child under 18 is using technology in a responsible way. When they don't, the parent can't simply turn around and say it's all the fault of the technology provider.
