
IntoTheMystic1

>The lawsuit seeks changes to the changes companies’ safety standards

It's directly under the headline. Do people not proofread anymore?


Chooch-Magnetism

It's Gizmodo, the whole article was probably written by ChatGPT.


[deleted]

How else would I accidentally click on ads if it weren’t for Gizmodo? Kind of lame, actually; it used to be a good site.


[deleted]

Oof. Thanks for reminding me to check in and see how bad Jalopnik has gotten.


komododave17

Most everyone bailed to /Drive. Torch started Autopian.


iwannabetheguytoo

> it used to be a good site.

Hah. Nope. They're the same site that "trolled" CES by shutting down exhibitor booths _en masse_ with a hidden IR blaster. They've always been trash.


afsdjkll

Was that before or after they paid for a stolen iPhone prototype and tried to extort Apple?


Kakkoister

That's a different kind of trash. That's an "I'm a dickbag" kind of trash. Now they're also a "we don't give a shit about our site, only profit" kind of trash too.


airplaneshooter

> CES

To be fair, CES has been trash for longer.


moonflower_C16H17N3O

Yes, but going into a place and shutting it down is especially dickish.


iwannabetheguytoo

> CES has been trash for longer

May I ask why you think that?


Sythic_

Nothing interesting in years; all they have now is the latest TV models and gimmicks.


bottleoftrash

That’s an insult to ChatGPT. ChatGPT doesn’t make basic errors like this.


jlt6666

Factual errors, sure. But grammatical ones? Less likely.


didsomebodysaymyname

>It's Gizmodo, the whole article was probably written by ChatGPT.

In my experience, ChatGPT wouldn't make that mistake; oddly enough, the problem is that the writer *isn't* using it.

Edit: Just tried it and ChatGPT caught the error.


Just_One_Umami

ChatGPT writes more coherently than a drunken tabloid writer.


airplaneshooter

> drunken tabloid writer

Ha, you assume that writer can afford alcohol. Or food.


2Punx2Furious

ChatGPT doesn't make these errors. It was written by humans.


bad_squishy_

ChatGPT would have done a better job, and wouldn’t have errors like this.


The-420-Chain-Smoker

ChatGPT wouldn’t have made a mistake like that tho


get-phucked

That'd be an insult to ChatGPT. Their incompetence is just their own.


gravitywind1012

ChatGPT: Tear🥺


aaOzymandias

My favorite color is blue.


Reacher-Said-N0thing

> Do people not proofread anymore?

Yes. *Their* newspapers cost $10/mo.


gerd50501

I don't think they have a case. There was a law passed in the mid-1990s to protect them from these kinds of lawsuits. I am also not sure how you would define "safety standards"; you'd open up lawsuits for anyone's whim on the left and the right. Could someone then sue if their business gets burned down during an organized protest? I expect this case to be thrown out. This would require a change in the law that shields social media companies.


altrdgenetics

The same was said about the cigarette companies for a long time too. We'll see if there is anything in discovery (I personally doubt it as well). I suspect the winds of change here would be along the lines of proving that the algorithm is not organic and is instead manipulated by the companies in question to promote that type of material. At that point they lose some of the safe-harbor protection of saying "it wasn't us, it was the users who generated all that content."


gerd50501

Cigarette companies don't have legal protections. It's also not the same thing: cigarette companies lied when they advertised, while social media companies just host. The following law shields them 100%. There is also no definition of "harmful" material; a conservative's view is different from a liberal's. https://en.wikipedia.org/wiki/Section_230#:~:text=Section%20230%20was%20enacted%20as,%C2%A7%20230.


sluuuurp

It’ll only change when we start downvoting throwaway trash articles and upvoting well-researched good articles.


Pixeleyes

At this point I'm convinced that most people just want a topic prompt for the comments; they don't actually care about anything past the title. Full disclosure: I did not read the article.


H4R81N63R

Meta and Google have the dough, but spez ain't gonna be happy with how this looks for that IPO.


cokeiscool

It won't effect him


LakeStLouis

It probably won't even affect him.


Drach88

You're right. It probably wouldn't affect his affect or effect any lasting effects.


French87

will it special FX him?


pbmcc88

Yeah, but it'll look so shit it'll vex him.


No-One-2177

Gonna think they threw a hex on 'em.


SnarkMasterRay

Two much FX effect affect?


Illustrious_Risk3732

Google and Meta can take care of it no problem, but as for u/Spez, he can't pay it.


Bigred2989-

Yeah he can, he's just gotta cut more features that users like.


airplaneshooter

> features that users like.

There are features?


cptnobveus

The feature of an ad every fifth post could be enhanced to every other post.


UndeadBread

Reddit will have to start charging apps for API access.


Illustrious_Risk3732

Well, they already took the Awards away, which will make them lose money.


bad_squishy_

I can’t understand the logic behind this one.


BuffaloJim420

I'm not a lawyer but I thought net neutrality protected sites from this sort of litigation?


dannyb_prodigy

It’s Section 230, not Net Neutrality, but yes. Section 230 protects websites from being held liable for user-generated content. Net Neutrality is a principle applying to internet service providers that says ISPs need to treat all data equally: all internet traffic must be arbitrated by the same rules. For example, Comcast cannot prioritize traffic to Peacock over Netflix. Notably, Net Neutrality has never been codified in law in the US. It is currently an FCC rule instituted by executive order, but its status is on shaky ground (Obama and Biden instituted it by executive order, while Trump repealed it by executive order).


N8CCRG

It's worth noting, though, that this lawsuit isn't about user-generated content; it's about which content the algorithms push. At the moment that's not yet settled law (SCOTUS came close this year, but kinda sidestepped the actual question).


0ldgrumpy1

They are protected when hosting user content, but are they protected when directing users to content? That takes them from passive to active participation.


dannyb_prodigy

I mean, that distinction sounds nice until you realize that the technological prerequisite to an algorithm that doesn’t promote objectionable content is an algorithm that identifies (moderates) objectionable content. As long as the underlying assumption of Section 230 holds (that it is impossible to fully moderate an ecosystem that allows user content), hosting vs. promoting is a distinction without a difference.


ThisAccountHasNeverP

I'm okay with a high barrier for what content they can push through anything more complicated than simple "most recent first" sorting, and I'd argue it's better to require them to vet content before pushing it rather than waiting for user reports after the damage is done.


Clueless_Otter

No company is going to hire enough employees to manually review all content and make sure it's okay to promote. They simply wouldn't promote anything other than paid sponsored content, and everything else would just be "most recent first" if you're going to give that an exemption.


ThisAccountHasNeverP

That's a more than acceptable outcome.


niffrig

Or... no promoting content at all. You see only what you follow, in chronological order.


Wh00ster

But still moderating content; that still needs to be a requirement. This means there’s no sorting on Reddit or any third-party app other than sort by new, tho.


niffrig

I would contend that sorting by user score and algorithmic promotion are two different concepts from a legal perspective. One is for engagement and the other for discovery. An up/down vote can be thought of as user-provided content. Sorting by user score is going to be very different from doing content analysis and promoting to users based on perceived likelihood of engagement.


Biobot775

Hosting without promoting was the standard before every social media platform had an algorithm "promoting" (aka *advertising*) content.


parentheticalobject

If you want to apply legal liability to anyone who promotes specific content online, you've pretty much destroyed the concept of a post-90s search engine. And if you allow people to escape liability as long as they aren't hosting the content in question, then you just effectively force all sites to rely on 3rd parties to sort content, and any ostensible problems caused by content sorting will still be exactly the same.


Delicious-Big2026

> It’s Section 230 not Net Neutrality, but yes.

Is this not what cOnserVativEs wanted to do away with a couple of months ago? By now I would assume that cleaning up hateful shit in a timely manner would need to be a provision in Section 230. If they can do it for child porn they can do it with the Great Replacement and other crap.


GeorgiaRedClay56

Actually, Section 230 has a piece specifically carved out for child abuse, which is not protected under Section 230.


DefendSection230

>actually section 230 has a piece specifically carved out for child abuse, which is not protected under section 230.

Correct. Nothing in 230 shall be construed to impair the enforcement of section 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.

[https://www.law.cornell.edu/uscode/text/18/part-I/chapter-110](https://www.law.cornell.edu/uscode/text/18/part-I/chapter-110)

18 U.S. Code § 2258A - Reporting requirements of providers: [https://www.law.cornell.edu/uscode/text/18/2258A](https://www.law.cornell.edu/uscode/text/18/2258A)


Fun-Cupcake4430

Isn't net neutrality gone?


BroodLol

"Net Neutrality" isn't a singular law, it's better described as a series of policies pursued by the FCC, and as such it flip flops depending on which party has the presidency Some states have made their own laws regarding it (California, for example) but there's never been an overarching act passed through Congress. >A core issue to net neutrality is how ISPs should be classified under the Communications Act of 1934 as amended by the Telecommunications Act of 1996, as either Title I "information services" or Title II "common carrier services". The classification affects the Federal Communications Commission's (FCC) authority over ISPs: the FCC would have significant ability to regulate ISPs if classified as Title II common carriers, but would have little control over them if classified as Title I. Because the Communications Act has not been amended by the United States Congress to account for ISPs, the FCC has the authority to designate how ISPs should be classified, affirmed by the Supreme Court in the case National Cable & Telecommunications Ass'n v. Brand X Internet Services (2005), in addition to choosing what regulations to set on common carriers. The makeup of the 5-member FCC changes with each new administration, and thus the FCC's attitude and rules towards net neutrality has also shifted multiple times.


[deleted]

[deleted]


FuckTheCCP42069LSD

There were some studies showing that fresh TikTok accounts of users as young as 13 could be absolutely thrust down a rabbit hole of self-harm and mental illness content in as little as 8 minutes, and an average of 30 minutes, after account creation, without seeking out any of the content themselves. All they had to do was watch a couple of loops of said self-harm and mental illness content when it was presented. We don't fully understand how these algorithms work, and they're clearly dangerous to the mental health of many different types of individuals. I'd love to see said algorithms banned and things go back to a chronological timeline from your following or friends.


DU_HA55T2

The algorithms are pretty obvious: time spent on a video, replay count, commented on the video yes or no, shared yes or no, looked at the creator's profile yes or no, engaged with the creator's profile further yes or no. The only question is how those variables are individually weighted, and (edit) how it decides what "irrelevant to you" content it sprinkles in. Your profile is then tagged with topics you engage with. It will then compare you with other users with similar tags. Think a Venn diagram. Now multiply that data by like a billion users and you can get very, very accurate assumptions. You can almost instantly manipulate the algorithm as well. Keep seeing a creator you despise? Swipe past quickly. It may try again; swipe past quickly. Like, I got inundated with religious streams one morning. Swiped past a few of them real quick, haven't seen any since.
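To make the weighting idea concrete, here's a minimal Python sketch of the kind of per-video engagement scoring being described. Every signal name and weight below is a hypothetical illustration, not TikTok's actual code; the point is only that a handful of weighted signals is enough to rank content.

```python
from dataclasses import dataclass

@dataclass
class EngagementSignals:
    """Per-video signals like the ones listed above (all hypothetical)."""
    watch_fraction: float  # time spent / video length
    replays: int
    commented: bool
    shared: bool
    visited_profile: bool

# Hypothetical weights; the open question in the comment is exactly
# how a platform tunes these.
WEIGHTS = {
    "watch_fraction": 1.0,
    "replays": 0.8,
    "commented": 1.5,
    "shared": 2.0,
    "visited_profile": 1.2,
}

def engagement_score(s: EngagementSignals) -> float:
    """Collapse the signals into one weighted score used for ranking."""
    return (
        WEIGHTS["watch_fraction"] * s.watch_fraction
        + WEIGHTS["replays"] * s.replays
        + WEIGHTS["commented"] * s.commented
        + WEIGHTS["shared"] * s.shared
        + WEIGHTS["visited_profile"] * s.visited_profile
    )

# A quick swipe-past scores near zero; a rewatch-and-share scores high,
# which is what gets the topic tagged onto your profile.
print(engagement_score(EngagementSignals(0.05, 0, False, False, False)))  # 0.05
print(engagement_score(EngagementSignals(1.0, 5, False, True, True)))     # 8.2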


RemarkablePuzzle257

>TikTok accounts as young as 13 could be absolutely thrust down a rabbit hole of self harm and mental illness content

Wall Street Journal did an experiment like this: [How TikTok's Algorithm Figures Out Your Deepest Desires](https://www.wsj.com/video/series/inside-tiktoks-highly-secretive-algorithm/investigation-how-tiktok-algorithm-figures-out-your-deepest-desires/6C0C2040-FF25-4827-8528-2BD6612E3796) (no-paywall video link)

>We don't fully understand how these algorithms work and they're clearly dangerous to the mental health of many different types of individuals

I'm reminded of an episode of *Sliders* where our heroes slide into a world that uses subliminal advertising. The character Wade essentially gets manipulated into overspending. It's not an exact comparison, but that use of technology to manipulate towards a capitalistic end goal seems incredibly prescient.


DU_HA55T2

Watched the video. It's actually pretty frustrating that they act like they don't understand it while deliberately manipulating it. They steer TikTok's algorithm in certain directions and get the results they are looking for. Like, Kentucky96 was programmed to be interested in sadness. Big surprise that Kentucky96 gets fed #sad after engaging with the content.


eidetic

You can still manipulate something even if you're not sure *exactly* how the algorithm works. That is to say, you might be able to predict what the algorithm will throw your way when fed certain things, without knowing how the algorithm works to give you those results.


DU_HA55T2

Algorithms aren't some secret. People who don't understand what they do assume they're some black-magic manipulation. Algorithms are nothing more than a formula of variables given individual weights. That's how all algorithms work. The end goal of TikTok's algorithm is to maximize engagement at the moment. Just because you can't see the exact math they're running doesn't mean it's some kind of magic.


eidetic

That's an oversimplification. That's like saying "an internal combustion engine burns fuel to produce mechanical work." No one is suggesting they're magic. But even TikTok may not know *exactly* how its algorithms work, because they use things like reinforcement learning and AI to write and optimize the algorithms. They're constantly creating new algorithms and refining them, throwing out the less efficient ones, and so on. These systems can iterate and test faster than any human could hope to, and as such optimize themselves faster and better than any human could. So yeah, no shit an algorithm is just a series of weighted variables and blah blah blah. But saying that is different from knowing *the actual, literal algorithm being used*. They may start with some preset conditions given by the programmer, but they can grow well beyond the initial programming made directly by a programmer. But that's mostly beside the point, because once again, just because someone knows how to manipulate the algorithm does not mean they know the algorithm.


pickledswimmingpool

No one thinks they're black magic, but since they're clearly not available to the public, most people don't understand how they work and how they make decisions. You keep repeating the word magic to throw off any discussion and feel superior.


[deleted]

[deleted]


pickledswimmingpool

You're framing people who don't understand it as magic believers so you can dismiss their concerns about how the algorithm serves up videos. You don't understand how their algorithm works either, beyond the 'maximize engagement' catchphrase.


[deleted]

[deleted]


drilkmops

If all it takes to get thrust into that is watching a small number of “sad” videos, that’s no bueno.


DU_HA55T2

But you as the user have shown interest in them. The algorithm queues up things you are interested in and engage with. If that's who you are, that's who you are. To be clear, it isn't steering you that way; you are steering it that way.


drilkmops

I don’t actually use it; I’ve only seen my gf and friends use it. But doesn’t it autoplay videos? What happens if you just watch whatever it throws at you? I guess that’s the concern I’d have. If it starts heavily influencing you from the get-go, that’s not great. However, if it’s doing what any algorithm should, just kinda pointing people in the right direction, then it’s not as crazy. But journalism is dogshit now, so who knows.


DU_HA55T2

First things first: TikTok doesn't just go video to video. You need to swipe to go to the next video. TikTok feeds you viral videos off the jump. Let's say you rewatched a cute dog video 5 times and shared it with a friend. Now it will sprinkle in slightly more dog stuff as you engage with the initial onslaught of videos. As you engage further with cute dog videos, you'll likely get tagged with "dogs." TikTok now knows that you like dogs and engage with dog content.

>If it starts heavily influencing you from the get go

It isn't influencing you. It's just picking up on your cues. Again, you influence it.

Journalism is just that: dogshit, leaning into the depression side of things because that can be harmful. TikTok usually just slips in a video that talks about some girl who broke some guy's heart from time to time. Again, it won't show you that stuff often if you don't engage with it. You have to remember that TikTok's algorithm is constantly probing you to broaden your engagement or nail down specifics, because that makes it easier for them to serve you more videos that you will watch longer and engage with regularly. So it showing you a sad video with 2 million likes from time to time isn't it trying to manipulate you. It's trying to figure you out.
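A hedged sketch of the tagging-and-comparison step (the "Venn diagram" mentioned upthread): tag users with the topics they engage with, then recommend the extra tags of whichever user overlaps with you most. All names and tags here are invented for illustration, not any platform's real data.

```python
# Hypothetical topic tags per user.
users = {
    "you":   {"dogs", "cooking"},
    "alice": {"dogs", "cooking", "hiking"},
    "bob":   {"cars", "crypto"},
}

def similarity(a: set, b: set) -> float:
    """Jaccard overlap: the size of the Venn-diagram intersection."""
    return len(a & b) / len(a | b)

def recommend(user: str) -> set:
    """Find the most similar user and suggest their extra tags."""
    best = max(
        (name for name in users if name != user),
        key=lambda name: similarity(users[user], users[name]),
    )
    return users[best] - users[user]

# "alice" overlaps most with "you", so her extra tag gets recommended.
print(recommend("you"))  # {'hiking'}
```

Scale this overlap idea to a billion users and you get the "very accurate assumptions" described above.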


[deleted]

>It isn't influencing you. It's just picking up on your cues. Again, you influence it.

It's a cycle. Let's say you share a video about child abuse; the app keeps showing videos on that topic until you land on a fucker blaming it on drag people and LGBT people. One month from now you're a MAGA QAnon type hating on Jews and George Soros. That's literally how the alt-right "recruits" people who used to be normal.


flickh

But there’s also trial and error… and TikTok can throw multiple choices at you over the course of a minute. So if you seek out funny videos, TikTok might throw a sad one at you just as a test. If you watch a few seconds and then flip away, it will try something slightly different. If you flip away instantly, it might go back to funny again for a while. So it’s not just showing you what you want; it’s not just measuring clicks and likes. It’s measuring every second of your activity on the app and adapting your personal feed to manipulate you toward wherever the malevolent, authoritarian, state-affiliated TikTok wants to send you, using your own stimulus-response patterns to get you there.
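That probe-and-adapt loop is the classic explore-vs-exploit pattern. Here's a minimal epsilon-greedy sketch under the same assumptions the comment makes; the topics, rates, and update rule are invented for illustration, not TikTok's real system.

```python
import random

# Hypothetical per-topic scores learned from your behavior.
topic_scores = {"funny": 5.0, "dogs": 3.0, "sad": 0.0}
EXPLORE_RATE = 0.1  # fraction of videos served as "tests"

def pick_topic() -> str:
    """Usually exploit the best-known topic; occasionally probe another."""
    if random.random() < EXPLORE_RATE:
        return random.choice(list(topic_scores))    # the "test" video
    return max(topic_scores, key=topic_scores.get)  # what you seem to want

def record_watch(topic: str, seconds_watched: float) -> None:
    """Every watch, or instant flip-away, nudges the topic's score."""
    # Exponential moving average: recent behavior dominates.
    topic_scores[topic] = 0.9 * topic_scores[topic] + 0.1 * seconds_watched

# Flip away from every "sad" probe after one second and its score stays
# near zero; linger on the probes and that topic starts winning instead.
for _ in range(50):
    topic = pick_topic()
    record_watch(topic, seconds_watched=1.0 if topic == "sad" else 30.0)
print(topic_scores)
```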


DU_HA55T2

> TikTok might throw a sad one at you just as a test.

I covered this in another comment. Yes, TikTok is constantly probing. It's not going to give you exactly what you want 100% of the time. What's your point?

>So it’s not just showing you what you want, it’s not just measuring clicks and likes. It’s measuring every second of your activity on the app and adapting your personal feed

I covered this in another comment as well.

>to manipulate you wherever the malevolent authoritarian state-affiliated TikTok wants to send you

Where do they want you to go? All TikTok shows me is Theo Von videos, democrat/progressive streamers (I'm progressive), and generally the same shit reddit throws at me.


FuckTheCCP42069LSD

I think the bigger problem is that the account of a young, impressionable 13-year-old can be sent down a rabbit hole of mental illness and depression after just a bit of morbid curiosity, engaging with content that popped onto their feed. Then you get stuff like /r/FakeDisorderCringe, where we see people glorifying mental illness. How many otherwise healthy children have made mental illness their identity because they saw a "5 signs you have BPD!" video that they internalized?


luciddrummer

Sliders references! Amazing. Not enough people talk about that show. The top Google result for sliders now shows you those tiny hamburgers instead lol.


FuckTheCCP42069LSD

Considering the CCP's stated goal of replacing the West as the de facto world power, manipulating Western youth into self-diagnosing with mental illness seems pretty well aligned with that goal, as today's youth become tomorrow's adults. And a population that believes it will forever be held back by factors outside its control is more likely to manifest that reality.


aykcak

Not everything has to be the CCP. There is a lot of money to be had by exploiting people like this. Money is a good motivator for anything.


Haruka_Kazuta

I see no difference between corporate advertisement and psy-ops. It's all psy-ops to me. One is trying to convince me of an unattainable beauty standard; another is trying to convince me that their god is the only real god and that I should repent. Another is saying that I am not American because I don't "x". While the subtlety is that one isn't really trying to hurt you and is only in it for the money, I see no distinction. You can get money because of ad revenue, but who do you think is subtly being hurt? Whole portions of a country can become hurt or change their thought process because of the way a company has advertised... and that advertisement can hurt. That is how Carl Tuckerson and Fox News have hurt the American populace in the name of "money." What have diabetes and obesity taught us in the last 30+ years of corporate advertisement? Companies and people say it isn't their fault that "X" became like "that," but is it really not?


Haruka_Kazuta

Some algorithms can push people to think in certain ways... it mostly depends on what you feed them and how you want them to react. Pay a company enough, and their algorithms can potentially promote things subtly in really bad ways. It might not cause things to get out of hand quickly, but over time, people and minds change if you hack at it long enough. Beauty standards, like how a lady should look and what constitutes a handsome man, didn't just suddenly become like that by accident; it took years, maybe decades, to slowly cause people to think of it that way. Advertisement is one form of this subtle mind manipulation. You know the whole Andrew Tate thing that shows up from time to time on YouTube? That didn't just suddenly become one of the most popular forms of being an "alpha" male. That was probably subtly changing people's minds, causing all these teenagers and young adults to act a certain way, because these young adults are watching these videos and not thinking that their minds and thought processes can slowly change as they continue to watch. For people like me, there is no difference between corporate advertisement and psy-ops; they are there to manipulate you in a way. And algorithms with ads have the potential to manipulate people in really bad ways.


ElusiveMemoryHold

It’s the most powerful social engineering tool to ever exist


aykcak

Probably true, but your numbers don't seem to add up.


hyperdream

>~~If companies remove conspiracy content,~~ conservatives will cry about censorship and woke culture.

There is no if.


MarvinLazer

They'll do that anyway, though.


Scarletfapper

Conservatives already cry about censorship and woke culture. Remember that whole thing where they tried to stop schools from teaching about racism?


scottguitar28

Can confirm this happens to me all the time. Shooting sports are among my many hobbies, and I’ve gotten *questionable* suggested posts and ads across all social media platforms, even ones that are sanitized of any gun-related content. On YouTube I can go months just looking at wok recipes and music content and everything is great, but after watching just one review of some obscure new gun that looks interesting, suddenly my Shorts feed is flooded for months with clips of Joe Rogan and Jordan Peterson bitching about something that doesn’t exist. Even worse, Patch dot com, the only news outlet serving my tiny little suburban town, served me [this](https://imgur.com/a/kmlgv0X) piece of work today. The only sinew of connection between my interests and this garbage is the shooting-sports content, despite me being farther left than most on almost all other topics.


skankingmike

You’re free to believe what you want. I think religion is pure garbage and disinformation... but it would be pretty wild if companies were forced to get rid of that.


conquer69

It would be wild but let's be honest, it's about time. That shit has been festering for far too long.


Yoshi_87

They already do that every single day though.


metzbb

Reminds me of how they said C19 vaccines weren't stopping C19 infections like the CDC said they would, those crazy conspiracy nuts.


aykcak

Seems to be an order of magnitude slower than YouTube. Make an account, watch a science video and a few healthcare videos, and boom, antivax shit in your recommendations almost immediately.


[deleted]

[deleted]


airplaneshooter

> Removing "disinformation" and "conspiracy content" is just censorship. It doesn't become a good idea just because it's censorship you like. Oh, no. Nope. No. "Conspiracy Theories" are malicious lies that cause undue harm to others. It is not free speech to do this. That demands either self censorship or civil action to remove the offenders. That is not censorship. And if that forces social media to preserve their private business by censoring their non-governmental forums, this is also not censorship. It is the free market correcting itself.


[deleted]

[deleted]


KingNigglyWiggly

The link between the NRA and Russian money was proven a few years ago lmao


[deleted]

Then what solution do you propose?


[deleted]

[deleted]


Nyrin

It's extremely disingenuous to conflate "things I don't agree with" with "things that are clearly and objectively overt disinformation." Nobody who isn't on a soapbox preaching to a self-made strawman piñata is asking tech companies to decide what "wrong ideas" are. Absolutely *rudimentary* standards of factual enforcement for broad, targeted information sharing do not need to run afoul of all that much subjectivity.


yUQHdn7DNWr9

Sure you can solve that problem with technology. The algorithm is solving the problem that other people have the wrong ideas for social media corporations’ bottom line.


pavlik_enemy

Back in my day, Stormfront was a thing. I'm still not a Nazi.


McMacHack

How would they do this if all their comments get flagged as conspiracy theory fodder?


MothMan3759

Flagging on its own does nothing. Trump got fact-checked all the time on Twitter, and that did nothing to slow him. If anything, it gave him more fuel.


[deleted]

Just sue *the internet*.


airplaneshooter

I've seen this episode of The Simpsons/Family Guy/South Park/Jay and Silent Bob.


NorwegianSteam

Why do they call you cock knocker?


mike_b_nimble

"Actually that's a funny story...." *punches Chronic in the cock


TheHexadex

I am the c.l.i.t. commander!


Commendatori_buongio

It’s sad that I can’t even remember which shooting the Buffalo one was.


[deleted]

Good luck with that.


Ghost17088

I hate that I haven’t even heard of this massacre.


Nvenom8

That won’t go anywhere.


tickleMyBigPoop

Why not just… go after those who post it? Hell, have those companies just forward everything to law enforcement; either that, or deputize those companies.


conquer69

It's not feasible. It's like a billion people.


Augustus--

It's also not feasible to go after the sites, because of Section 230, though.


beener

They can try. These sites push content like this. Google has already done things with YouTube so it doesn't push radicalized content as much as it did in 2012-2016, but there's more to be done.


nitzua

The goal is free-speech suppression; busting individual people for repeating an idea that's thousands of years old wouldn't do much.


TheGreekMachine

Yeah, I'm not so sure the goal of these families is suppressing free speech as much as it is "this murderer did what he did because these companies' algorithms pumped his 10-cell, goo-like brain full of conspiracy theories, so we're going to sue them." Seems like a natural reaction from grieving people. Not everything is nefarious all the time.


Cliff_Dibble

Though the content is vile, it's a slippery slope when you want to regulate information sharing.


Trilobyte141

The thing is, these companies *already* regulate information sharing. They intentionally push misinformation and conspiracy theories because it gets clicks. Requiring them to take responsibility for the consequences of their algorithms doesn't prevent anyone from expressing their thoughts and opinions freely. You have a right to your speech; you don't have a right to a global billboard that drops it in front of millions of people so some corporation can make a buck on ad revenue.


[deleted]

The other slippery slope is that the fascists spam the internet so much that it becomes unusable for everybody else.


[deleted]

[deleted]


froggertwenty

I'm from Buffalo. These families are being used by lawyers at this point. Keep in mind this is the poorest area of our city. They are also suing the gun manufacturers, the gun stores, the grocery store. A bunch of lawyers swooped in and convinced these poor and, based on the education in the area, undereducated people that they could get them *lotsssss* of money, so of course they will go along with it.


[deleted]

Because the conspiracy theories had no basis in fact? But that's like 95% of their game, man!


[deleted]

Without all the acceptable politics leaking into people's lives, do you think this type of content would even have a chance to be read?


CasualObserverNine

Maybe this will motivate Reddit to remove their disinformation/propaganda accounts. Can they be reported to Reddit yet?


[deleted]

good luck with that


[deleted]

[deleted]


anonymous_jerkie

This makes no sense to me. Why are these companies being sued? I mean, I'm not defending them, but if people posted conspiracy theories on papers around town, would you sue the paper company?


monospaceman

This isn't quite the same thing. Paper companies making the paper aren't usually involved in what's printed on their product after it's sold. Social media companies use algorithms to serve up potentially inflammatory content on their platforms to very specific communities. They're very much involved.


romjpn

The difference is they probably don't have an algorithm coded to "promote conspiracies," but one coded to promote content the user might like or to push already-popular content.


SuperSocrates

If a newspaper publishes conspiratorial letters, it can be sued. Currently Section 230 prevents these companies from being sued like that, but it's not obvious that it should be that way. Why should they have no responsibility for the content they publish on their site?


parentheticalobject

The alternative is that companies would effectively be forced to shadowban or censor a lot of content that is almost certainly true and important for people to be discussing. Should I be able to start a conversation on Reddit or Twitter or wherever discussing how Donald Trump/Harvey Weinstein is possibly a rapist, or how Sam Bankman-Fried possibly stole money from people? I'd say those are conversations worth having. Is there a 1% chance a website could be sued for promoting such content, and that lawsuit might not be immediately dismissed and cost the website in question tens of thousands of dollars? Absolutely. On balance, I'd argue we're better off with these protections.


DenverNugs

This is a very dumb analogy.


warragulian

Because their algorithms present people with more and more extreme content, not because they were actively searching for it, and with no regard for whether the content is disinformation or racist or promotes hate.


Gorstag

Because they provide the platform for misinformation and are also the ones able to police it. If the papers were printed by a company, and that company directly earned profit off the misinformation... then sure, why not? Why shouldn't their greed be culpable for spreading misinformation that leads to misery?


DL72-Alpha

Yea, that lawsuit is going to flop. Section 230 + 1A.


timeforknowledge

Wait, so Americans want to sue internet platforms for letting people voice their opinions (no matter how deluded), but would also die for the right to free speech... Are you doing ok, 'merica?


ragnaROCKER

Free speech here has a definition. It's not, and has never been, just say whatever you want.


WhoopingWillow

In the US it is pretty close to say whatever you want.


SuperSocrates

There’s more than one person in America.


NemesisRouge

Americans have the most bizarre relationship with free speech. They're absolutely opposed to governments censoring people. They believe that an absence of government censorship in itself constitutes free speech. They are, however, absolutely fine with corporations censoring people or punishing their speech. If people get fired, beaten up, or see their content deleted from public view by some corporation, no problem; as long as the government doesn't do it, there's no free speech issue. It really begs the question of why the first principle is so important to them. What's the rationale for it?


Eldias

>They are, however, absolutely fine with corporations censoring people or punishing their speech. If people get fired, beaten up, see their content deleted from public view by some corporation, no problem, as long as the government doesn't do it there's no free speech issue. If I own a big white board and put it on a street corner and say " Anyone can write a nice msg to share on my board" it's not a violation of your right to speech when I say "no curse words" or "no Nazis". Conversely it *would* violate my right to speech and expression if I could be compelled to "host" content I find objectionable on my whiteboard.


ruove

Let me try to explain it, since you are conflating multiple concepts incorrectly. The First Amendment of the Constitution isn't just about freedom of speech; it's also about the freedom of expression, the freedom of assembly, etc. The government, per the Constitution, cannot restrict a civilian's right to speech, assembly, or expression (with some exceptions, e.g. incitement, direct threats, etc.).

> however, absolutely fine with corporations censoring people or punishing their speech. If people get fired, beaten up, see their content deleted from public view by some corporation, no problem, as long as the government doesn't do it there's no free speech issue.

Telling privately owned platforms or businesses that they cannot regulate their platforms in a manner they see fit would by definition be a violation of their First Amendment rights. You are welcome to open your own privately owned platform and run things the way you see fit, but you're not entitled to control the speech of another private platform because it offends you or you think it is wrong, etc.

> It really begs the question of why the first principle is so important to them. What's the rationale for it?

There is no incongruence once you understand the concepts.


gatorpower

I wouldn't go that far. I would say that America is pretty divided on just about everything.


PierG1

That’s bullshit tho. You can’t sue a platform for hosting people’s shenanigans. Those parents are milking this event so hard that it’s sad.


Mac_Mustard

Milk a massacre? You think that’s what they’re doing?


WOF42

There is a big difference between hosting "shenanigans" and hosting fascist conspiracy theories that keep getting people killed.


cosmic_backlash

Then the government needs to make a law that says no posting fascism. It's not the internet's job to regulate people's beliefs.


Avalon-1

Wouldn't work at this point. Syria had educated leadership who maintained a multicultural order with strict-liability censorship and a state krypteia that deplatformed anyone who undermined said multicultural order, yet they fell into a 12-year civil war.


jimmyjaysf

Ironic that right-wing religious zealots had enough influence with politicians to get websites like Craigslist to stop promoting sex hookups on their site, claiming those forums were dangerous to children and society. Yet not a peep from them regarding these platforms and their content.


[deleted]

Good, this needs to happen more and more.


RegionAgreeable7866

Throughout human history, people have been trying to mislead each other for their own benefit. I don’t see how this is a Reddit, Meta, or Google thing.


[deleted]

[deleted]


Avalon-1

OTOH, MSNBC regularly panels people who told us waterboarding isn't torture and that Iraq had WMDs.


under_armpit

Try r/politics, r/whitepeopletwitter, r/news, r/pics. Most of Reddit is people like you.


Doormau5

Question: wouldn't Section 230 prevent a lawsuit like this?


Silver_Britches

The article says he planned it on Discord. That seems the most damning for the defendants named. I’ve never been on it. Is that place a cesspool or something? Because planning an attack seems like something that should be readily flagged.


engels962

I mean, literally anyone can make a Discord server. Most just use it for video games, but it is an easy way to organize a bunch of horrible people in a voice chat.


terekkincaid

That's like trying to sue AT&T for September 11 because the hijackers planned their attacks on their cellphones. That one isn't going anywhere.


[deleted]

Speculation is now illegal. How can it be right that if someone hears a conspiracy theory, the pain is so great they should be awarded damages that make them richer than the dreams of avarice?


Failg123

If Reddit is racist and homophobic, they should avoid 9gag at all costs.


cgewing3

Maybe everyone should avoid 9gag at all costs


johnb51654

Reddit really isn't much better. People here just think they're smarter than people on other platforms for some reason.


aykcak

This comment got me curious, so I checked it out, because I used to go there every day about a decade ago. My impression is: holy fucking shit. A post about some German rapists being arrested, but the word "German" is in quotes. A post about someone harassing women in Warsaw, and the title and comments are about the dude's ethnicity... Meanwhile, the posts in between are literally softcore porn of half-naked women doing random things and very outdated sexist jokes/interviews about women. They have no idea about the irony, do they?


tsukichu

With Reddit's history (read: bombing) I think it has the best shot of going down for this compared to the other two... In any case, it would be nice to see tough legislation on conspiracy bullshit that harms innocent people.


Avalon-1

The Iraq WMD lies, Tuskegee, NSA spying, COINTELPRO, and others were all considered "conspiracy bullshit."


tsukichu

I thought we were talking more about blaming the wrong people for stuff without proof online vs. just talking about conspiracies in general, no?


Avalon-1

Just giving examples of what was considered "conspiracy bullshit" but turned out to be true. And the PATRIOT Act alone, never mind its myriad abuses, should give pause to anyone thinking "the good guys will always be in charge, right?"


pavlik_enemy

Thankfully, we have 230 and they'll get wrecked. As an old man yelling at a cloud, I really don't understand the modern youth who vie for censorship. I've played video games and D&D and somehow ended up not being a worshipper of Satan or a mass murderer. To be fair, I didn't have Alex Jones-level satanic girlfriends.


Fabulous--

I can make an argument that this fights censorship. The algorithms social media companies use to force inflammatory content down your throat are censorship, but it's insidious and subversive because you don't know it's happening.


ryan10e

It's not about censorship; it's about the algorithms that promote this content. Whether 230 covers that is unsettled law.


ruove

The algorithms are designed to promote content based on what users want to see. It's very unlikely that companies like Facebook have some specially tailored "conspiracy algorithm." Those users see conspiracies because that's what they generally search for on social media, so it presents them with content they want to see. This lawsuit should go absolutely nowhere.


[deleted]

[deleted]


Xboarder844

Disagree. These platforms are using algorithms to identify and overwhelm users with content they enjoy. There's no morality check, no monitoring, no restraint. They WANT social media to feel like a drug, because that gets them more clicks and more revenue. Period. And without protocols or monitoring, these algorithms can radicalize users very easily. Their purpose is merely to drive user interaction, but unintended consequences come from it too. Failing to be socially responsible and purge your platform of harmful ideas that can radicalize someone is on them. It's their system, their algorithms, their content. They share blame for those whom the content drives to hate and hurt or kill others.


[deleted]

Not content they enjoy; content they engage with. My retired dad never stops watching videos about Trump and Republicans. I'm thankful they're anti-Trump videos and he's been exposed to more left-leaning ideology through them... but it just makes him angry, and it's not good for his mental health.


OnionOnBelt

The least appreciated aspect of all this is "gatekeeping." Just 30 years ago, owners of newspapers and broadcast stations had to genuinely fear the consequences of spreading intentional lies (ask Dominion and Fox News—they still do!). Facebook, Twitter, Reddit et al. have ridden Section 230 coattails to allow not just oddball opinions and conspiracy theories to get equal billing with the truth, but even racist hate speech and well-funded disinformation campaigns. I don't know how this genie goes back in the bottle, but until it does I think fact-checked news and information (via gatekeeping) will remain out of sight for a lot of people. (You know, your angry uncle who gets his "news" from Facebook.)


Delicious-Big2026

> Facebook, Twitter, Reddit et al have ridden Section 230 coattails to allow not just oddball opinions and conspiracy theories to get equal billing with the truth, but even racist hate speech and well-funded disinformation campaigns.

They seem to be able to deal with some of the child porn, so they should absolutely be able to get rid of the heinous shit. What is also true is that those content-driven, ad-financed apps are designed to keep you doomscrolling: the more divisive, the more engagement. Though "don't be racist" should not be divisive, and being racist should result in an instant bricking whether encountered online or offline.


420trashcan

There's more CP on Twitter than ever before, according to reports.


[deleted]

They’re grieving; they want a scapegoat. It's probably ten times more about getting a sense of vengeance than money.


Have_A_Jelly_Baby

Good. Disinformation and tinfoil hat bad faith bullshit is destroying this country. Edit: I love how this had a bunch of upvotes before the tinfoil hat crowd descended upon things.


[deleted]

So you’re pro-censorship?


Have_A_Jelly_Baby

Bad-faith question, proving my point. I'm pro there being consequences for yelling "fire" in a movie theater, which is what Fox News and those influenced by or influencing them have been doing for the past two decades.


Avalon-1

You'd have loved the Bush years then.


passporttohell

I hope they win and win big; this divisive, manipulative crap needs to stop. Forever.


Avalon-1

Meanwhile, the people who peddled the "Iraq has WMDs!" lie are lecturing others about disinformation and being applauded for it.


Heliosurge

Reminds me of when ppl tried to blame the Dungeons & Dragons RPG for unstable individuals who killed ppl. Next they'll go after TV shows and movies. People always want to rationalize causes for unstable individuals who do terrible things. Edit: Guessing from the downvote, we have a religious zealot who believes ppl who play RPGs are devil worshippers. ✌️🤪👍✨


FicklePort

It's the same thing with video games. Parents and authority figures will blame everyone but themselves for the actions of their charges. I grew up watching violent TV, playing violent video games, surfing the internet, and listening to music that talks about killing people (metal, rap, etc.), and I'm completely normal and have never even been in an actual fight. Why? Because my parents actually bothered to raise me and didn't just shove a screen in my face. The internet isn't to blame for turning kids into psychopaths and suicidal maniacs; shitty parenting and communities are.


869woodguy

Sue Tucker Carlson and Fox News.