

Sunday, February 26, 2023

The Mother of All Privacy Battles Part 24 - Shutting Us Up

 image: see no evil, hear no evil, speak no evil

[T]he 85-page briefing, titled THE GOOD CENSOR, admits that Google and other tech platforms now control the majority of “online conversations” and have undertaken a shift towards “censorship” in response to unwelcome political events around the world.
- Democracy Hollowed Out Part 35 - Censorship

We look at two cases before the Supreme Court that could reshape the future of the internet. Both cases focus on Section 230 of the Communications Decency Act of 1996, which backers say has helped foster free speech online by allowing companies to host content without direct legal liability for what users post. Critics say it has allowed tech platforms to avoid accountability for spreading harmful material. On Tuesday, the justices heard arguments in Gonzalez v. Google, brought by the family of Nohemi Gonzalez, who was killed in the 2015 Paris terror attack. Her family sued Google claiming the company had illegally promoted videos by the Islamic State, which carried out the Paris attack. On Wednesday, justices heard arguments in the case of Twitter v. Taamneh, brought by the family of Nawras Alassaf, who was killed along with 38 others in a 2017 terrorist attack on a nightclub in Turkey. We speak with Aaron Mackey, senior staff attorney with the Electronic Frontier Foundation, who says Section 230 “powers the underlying architecture” of the internet.
- Video - Free Speech on Trial: Supreme Court Hears Cases That Could Reshape Future of the Internet, Democracy Now February 27, 2023


The Supreme Court case that could fundamentally change the internet

By Jessica Melugin
Washington Examiner
February 24, 2023

The Supreme Court recently heard oral arguments in a case that could fundamentally alter social media.

GONZALEZ V. GOOGLE, heard by the justices on Feb. 21, asks the highest court in America to determine whether the longtime internet liability shield known as SECTION 230 covers content that the platforms’ algorithms recommend to users, and how they present those recommendations.

The case stems from the killing of then-23-year-old Nohemi Gonzalez, who was studying in Paris when she became the only American victim of a terrorist attack that claimed 129 other lives in the city. The Islamic State later took responsibility for the acts.

Back in the United States, Gonzalez’s family sued multiple tech platforms, accusing them of radicalizing users into terrorists by promoting pro-ISIS third-party content. Google’s YouTube video-sharing platform is the only defendant that remains, and that’s the case the Supreme Court heard oral arguments on. The case was paired with a similar suit, TWITTER V. TAAMNEH, which was heard the next day.

Both cases address the possible limits of the protection social media platforms have from liability under SECTION 230 OF THE COMMUNICATIONS DECENCY ACT. The law, now commonly shortened to “Section 230,” clarified liability for online sites hosting third-party content at a time when there was uncertainty about their legal responsibility. Legal precedent dealt with traditional publishers, including newspapers, and distributors such as bookstores.

But the online hosts were different in that they were not filtering user content before it was posted, like a traditional publisher did, but only after it was posted, if at all. At the time, CompuServe’s chat board did not moderate posts at all and, because of the precedent in liability law, was therefore not legally responsible for the content it hosted.

A rival service, Prodigy, wanted to take down potentially offensive posts from its users in order to make a family-friendly online environment, but it worried that doing so would trigger legal liability. That’s because, in the past, bookstores were not held liable if they didn’t know of illegal content in the materials they were selling but were on the hook if they did know about it and carried it for sale anyway. Moderating content seemed to be an admission that they knew what they were hosting.

In practice, Section 230 means that host sites cannot be sued for content posted by their users and that taking down any of that content will not trigger liability for the platform. Legal responsibility stays with the creator of the content, not the online host.

Those same issues are at play today, applying to small sites and to social media platforms with billions of users alike. More than 500 hours of third-party content is posted to YouTube every minute; that’s 720,000 hours per day.

Google and other major social media platforms argue that hosting at this volume can only be maintained because of the legal protections Section 230 affords them. Without it, the threat of legal expenses from a flood of litigation would cause sites either to take down much more content (just to be safe) or to allow everything (so they could claim the old bookstores’ hands-off protections). That would make for an internet devoid of anything the least bit controversial, or one polluted with violence, spam, and pornography, largely unusable for most people.

The plaintiff’s argument in the case has changed from its initial petition to the court. Legal counsel for the Gonzalez family at first asked whether Section 230’s safe harbor covers third-party content when it is algorithmically recommended by the host site, arguing that it should not. But at this week’s oral argument, the Gonzalez family’s lawyer Eric Schnapper concentrated more on whether the thumbnail links in YouTube’s “up next” suggestion for the next video constitute content created by the host, rather than by a third party, making them ineligible for Section 230’s protections.

During oral arguments, multiple justices volunteered that they were “confused” about what argument Schnapper was trying to make during their exchange. Chief Justice John Roberts noted that the YouTube algorithm operated the same way across the platform and that nothing special was employed in recommending the ISIS content. Several justices expressed concerns about the resulting economic upset if the court were to rule in favor of curtailing Section 230’s liability shield and exposing online platforms to increased litigation.

The oral arguments went on for almost three hours on Feb. 21, an unusually long amount of time for the justices to spend on a case. More than 70 briefs were filed in the Supreme Court, where interested parties, including other social media platforms, think tanks, and advocacy groups, weighed in on the case.

The court’s ruling could have profound and widespread implications for social media platforms and their users. But depending on how the court decides the related Twitter case and its intersection with the Anti-Terrorism Act’s provisions for liability, the Supreme Court may be able to sidestep weighing in on the parameters of Section 230 altogether. America’s highest court is expected to rule on both cases in June.


Posted by Elvis on 02/26/23 •
Section Privacy And Rights • Section Broadband Privacy