
Musk’s X must face part of lawsuit over child pornography video
A U.S. court allows negligence claims against X to proceed, holding that Safe Harbor protection has limits.
Roots of the lawsuit: allegations of negligence and the legal framework
The lawsuit was filed in a California federal court by the parents of two minor girls who were threatened by a blackmailer with the posting of their explicit photos online. When the blackmailer carried out his threat and posted the photos on Twitter, now X, the victims’ families sued the company, accusing it of negligence. The central allegation was that X violated its own Terms of Service, which explicitly prohibit child sexual abuse material (CSAM) on the platform. The victims’ lawyers argued that X knowingly or negligently allowed this content to remain on its platform, causing further harm to their children.
Legal hurdles and the doctrine of Safe Harbor
A major legal shield for social media companies in such lawsuits is the principle known as Safe Harbor, which comes from Section 230 of the U.S. Communications Decency Act (CDA). Under Section 230, an online platform is not liable for content posted by a third party: if a user posts objectionable content, the platform cannot be legally blamed for it. This Safe Harbor principle acts as a shield for social media platforms, and it is the reason the court dismissed parts of this case. The court held that, under Safe Harbor, X could not be held directly responsible for content that someone else had posted.
The appeals court’s landmark decision to allow the negligence claim
Recently, however, a federal appeals court gave the case a new twist. The court ruled that although X cannot be held directly responsible for the content itself, the plaintiffs can sue it for negligence: when a company is negligent in enforcing its own terms of service, it can be held to account. The court effectively split the case into two parts: direct responsibility for the content, which it dismissed under the Safe Harbor principle, and responsibility for negligence, on which it allowed the suit to proceed. The negligence allegation is that X knew about the objectionable content but did not take adequate steps to remove it or to help the victims.
This decision shows that the Safe Harbor principle is not an absolute shield: a social media platform can still face liability if it knowingly or negligently violates its own policies.
Limits of platform responsibility
Should social media companies act only when they receive complaints, or should they actively seek out child sexual abuse material on their platforms? The terms of service matter too: when a company prohibits child sexual abuse material in its terms of service, does it become legally bound to enforce them? There is also the question of AI and moderation: can such content be identified and removed in time using artificial intelligence, and did X deploy enough technical and human resources to do so? Finally, there is the human impact: such content leaves victims with lifelong mental and emotional trauma. Don’t social media companies have a moral responsibility to do everything possible to prevent such crimes?
Legal battle in India
X has faced a legal battle in India as well, this time against government orders. The company filed a suit in the Karnataka High Court alleging that the government was arbitrarily blocking content. The government countered that X was treating Safe Harbor as a right when it is in fact a responsibility, and that the company has to follow the laws of the country. As for Elon Musk’s stance and X’s policies, Musk has made many changes since buying Twitter.
He has voiced his views on content moderation policies many times, but on the issue of child sexual abuse material the company’s stated position has been consistently clear: such content will not be tolerated at all. Critics, however, point out that under Musk’s leadership the company carried out large-scale layoffs in its content moderation team, which has weakened its ability to stop such content.
What will happen next?
Now that the appeals court has revived parts of the case, it will return to the lower court, where the victims’ families and X’s lawyers will argue over whether X was negligent in enforcing its terms of service. The final outcome could set a precedent not only for X but for all social media companies: it will help define the limits of the Safe Harbor principle and how much responsibility social media platforms must take for online child safety. This case is not just a legal dispute. It raises many important questions about child safety and corporate accountability, and it could prove to be a turning point for the safety of children in the online world.