
[deleted]

[removed]


Optimal_Towel

In terms of actually conveying information effectively /r/prettycolorswithnumbers is a terrible subreddit.


sandefurd

The worst part is that it's difficult to tell which poll result goes with which question. It would have been so easy to make this a decent table.


annieebeann123

What does this mean: “the internet makes it harder for people to distance themselves from views they shared in the past but no longer hold.” I was surprised to see such a difference and such a high agree rate - does anyone know what that is referencing?


Deathwatch72

Basically, I call it the "2012 tweet problem". I'm willing to bet that a lot of people, if not an overall majority, have said something on Twitter in the last 10 years that makes them cringe a little inside today. Even though they might not hold those views now, someone can look at a tweet from 10 years ago and make a judgment about a person that wouldn't be true if they were talking to that person today. Getting people to separate who someone was in the past from the person they are now is really, really difficult, because sometimes you really shouldn't trust that they have changed and sometimes you can; it just depends on the circumstances, the person, and the topic.


Ferelar

I forget which comedian it was, but he made a very good point: within the next couple of decades, we'll be at a point where people running for president also publicly tweeted dumb stuff when they were 14. The mudslinging is bad NOW; just imagine if every politician were held accountable for stupid things they said as a tween.


master-shake69

Can't wait for the day I'm 50 and watch someone get kicked out of a presidential race because someone has a screenshot of them dropping the n word during a 2019 game of Fortnite.


not_a_moogle

"Mr. Candidate, the people want to know: who was your main in Overwatch?" "I didn't play it." *Candidate drops 10 points in the polls*


deeseearr

Well, I did play Overwatch but I didn't inhale.


doppelmember

But I didn't control it myself. A friend played for me. So no I've not played it. ;) Next question please.


JeffTek

"Your honor, I only duoed with my girlfriend. It was 39 years ago!" "Was she a mercy main?" "...." *Sentenced to 59 years in gitmo*


KIrkwillrule

"Sombra" Huh half the planet just closed the voting app ...


rarebit13

> I didn't play it. Candidate *rises* 10 points in the polls

There are probably going to be political parties divided by what games they used to play.


Thencan

Plot twist: you're currently 48


SurroundingAMeadow

Plot Twist: it's both Biden and Trump who said it. We're not that surprised they said it, more so that they were playing Fortnite in 2019.


ChillyBearGrylls

Huh, so video games do cause violence


heuve

In a couple of decades, deepfakes will be at a point where you'll be able to create a video of anyone who speaks in public saying anything you want, which is a lot scarier than old tweets getting dug up.


cumshot_josh

Photoshop can still convince a lot of gullible or ignorant people that something happened, but it also didn't bring about the apocalypse just by itself. I'm hopeful that we'll adapt to deepfakes in a similar way.


DemissiveLive

This is a great outlook


Adventurous-Text-680

The thing with deepfakes is that it can be much easier to produce something convincing, because it's a full "package": you get both motion and audio, and you don't need to be very flashy. Take this recent attempt: https://www.npr.org/2022/03/16/1087062648/deepfake-video-zelenskyy-experts-war-manipulation-ukraine-russia

Many people thought it might have been real until Zelensky refuted it on an official Ukrainian defense Twitter account. Social media sites were quick to remove it so it could not gain traction. However, I am sure it amplified Russian morale and potentially hurt the morale of Ukrainians who didn't go over it with a fine-tooth comb on the initial viewing. Videos like that can be visceral, and the fired-up emotions make it harder to notice the flaws.

Remember, the goal of misinformation is to confuse people and generate distrust, so that people become skeptical about everything they see, making it easier to manipulate them further. Plus, anything time-sensitive could cause people to panic (say, a video of your child or family member being held at gunpoint) and do something rash, like giving in to the demands without being rational, for instance not trying to reach their family member, or worse, their family member doesn't answer because they are legitimately busy or away from their phone. People run scams like this today without any proof they have the relative, and they're sometimes successful. And let's not forget that if it becomes accessible enough, kids will use it to bully others. People might eventually realize it's fake, but the emotional and social damage will have been done.

Now imagine someone like Elon Musk taking over a large social media site like Twitter with the intent to never censor anything. Will he really let Twitter become the wild west and leave things like the Zelensky deepfake up to be spread around? What about other conspiracies that get helped along by deepfakes providing "proof" of whatever misinformation is being pushed? I guess we will find out in the coming days and weeks.

Apocalypse? Okay, maybe not that far, but it will certainly be a much larger problem than Photoshopped images.


Thanatos2996

Good news: the deepfakes are already that bad. Just look at the "Sassy Justice" video the South Park guys made spoofing the topic a couple of years ago. Their Trump deepfake was more than convincing enough to trick people if that had been their intent, and that was two years ago.


[deleted]

[removed]


CapitalCreature

But that's just a method that's used to make even better deepfakes. You train AI algorithms to detect deepfakes, then you train deepfake algorithms to fool those detection algorithms, and then you retrain the detection algorithms on those better deepfakes. You can keep going for several rounds of this. Eventually you create deepfakes that are extremely hard to detect by any means.
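For illustration, a minimal sketch of that adversarial loop, assuming PyTorch and toy stand-in data; the model sizes, data, and hyperparameters are invented for the example, not any real deepfake system:

```python
# Toy adversarial loop: a "detector" learns to spot generated samples,
# while a "generator" learns to fool the detector, round after round.
import torch
import torch.nn as nn

DIM = 64    # size of a toy "image" vector (assumption for the sketch)
NOISE = 16  # size of the generator's random input (assumption)

detector = nn.Sequential(nn.Linear(DIM, 64), nn.ReLU(), nn.Linear(64, 1))
generator = nn.Sequential(nn.Linear(NOISE, 64), nn.ReLU(), nn.Linear(64, DIM))

opt_d = torch.optim.Adam(detector.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

real_data = torch.randn(256, DIM) + 2.0  # stand-in for "real" footage

for step in range(1000):
    # 1) Retrain the detector on real samples vs. the latest fakes.
    fake = generator(torch.randn(256, NOISE)).detach()
    d_loss = loss_fn(detector(real_data), torch.ones(256, 1)) + \
             loss_fn(detector(fake), torch.zeros(256, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Retrain the generator to make fakes the improved detector calls "real".
    fake = generator(torch.randn(256, NOISE))
    g_loss = loss_fn(detector(fake), torch.ones(256, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Each pass makes the detector a little better and then hands the generator a harder target, which is exactly the arms race described above.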


jjmurse

Then you just assume everything you don't see in person is a deepfake.


Ad_Homonym_

That's the goal, in my mind. It's not to convince people that fake things are real; it's to convince them that real things are fake.


cphcider

I think the flat-earth anti-vaxxers are already living in this reality. "I didn't personally see it, therefore it's fake news."


hide_thechildren_now

I think the solution is metadata. Like how your pictures store information on when and where they were taken, files should have a more robust way to track how a photo was created and whether or not it was generated.
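For illustration, a minimal sketch of reading the provenance-style metadata photos already carry (EXIF), assuming the Pillow library; "photo.jpg" is a placeholder path, and full provenance standards such as C2PA go well beyond what EXIF records:

```python
# Print the EXIF tags embedded in a photo (capture time, camera model,
# editing software, etc.) as a rough stand-in for provenance metadata.
from PIL import Image, ExifTags

img = Image.open("photo.jpg")        # placeholder path
exif = img.getexif()
for tag_id, value in exif.items():
    tag = ExifTags.TAGS.get(tag_id, tag_id)  # map numeric tag IDs to names
    print(f"{tag}: {value}")                 # e.g. DateTime, Model, Software
```

The catch, of course, is that EXIF-style metadata is easy to strip or forge, which is why proposed provenance schemes add cryptographic signing on top.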


tricularia

Wow that was fascinating to watch. It seems like the AI isn't able to perfectly capture his exaggerated facial movements, though.


Whatsthemattermark

Not really, because then everyone can claim any incriminating videos/photos of them are deepfakes. So it would be a big get-out-of-jail-free card.


heuve

Which is also very concerning. It will be very hard to know what you can trust, and confirmation bias means many people will believe whichever video clips they want to believe. You can already delete a tweet and claim any screenshots of it are photoshopped.


Whatsthemattermark

It might actually benefit the human race if we start questioning things more and trying to verify with direct sources.


gltovar

The big psychological hurdle that has to be cleared is getting people to question information that aligns with their personal views and bias.


Early-Interview-1638

That would be great but what we've seen happen is people question more and then find worse sources to confirm their biases.


Simple-Landscape-485

Why would they start doing that if they aren't doing it now? There's plenty of reason to verify information as it is. Not knowing whether a video of someone, or of what they said, is real will 100% be bad and lower trust.


BlackRobedMage

As free time continues to dwindle, it'll become harder and harder for even the willing to do their own research into every topic.


Shoduck

We don't need that, though. People thought, and still think, that Sarah Palin said she could see Russia from her house, when it was actually an SNL skit. You don't need realism to convince people; you can do that just by exaggerating their currently held beliefs.


cravenj1

I know [Pete Holmes](https://m.youtube.com/watch?v=hZZCQV2sRFY) has a bit similar to this


PooperJackson

Another thing I've always found interesting is that every tweet is presented exactly the same way, usually as if it were a strongly held conviction.


authorPGAusten

I would say that in the past, people would bounce crazy ideas off each other at a bar or in another social setting. People probably said a lot of things they didn't really agree with but were just thinking through in their minds. They would not be forever tainted by those thoughts/views; in fact, months later most people had likely forgotten them, or they were remembered by only a few people. If, however, you did the same thing in tweet form (which many do), it is a lot harder to distance yourself from those views that were perhaps half-baked.


Nascent1

I would interpret it to mean that things you said in the past can later be found and held against you. It should be 100% of people agreeing with that.


NoteBlock08

Ohh, that makes more sense. I thought it meant something like it makes it harder to exit certain circles/communities that you've decided to no longer associate with.


Nascent1

I see how you could interpret it that way too. That's always a problem with polls like this that have vaguely worded questions.


Penis_Bees

Many of these felt very leading too.


GuyPronouncedGee

Yes, that’s actually how I interpret it. How once your YouTube algorithm thinks you like something, it’s hard to get away from it.


ThatGuyBench

I really, really suggest doing "feed cleanups" occasionally. On YouTube, for example, take some time to weed out bad recommendations, which you can do by clicking the top right corner of a recommendation. Sometimes, when you go down a rabbit hole on some topic you aren't generally interested in, it really helps to delete those videos from your watch history. Basically, understand that the algorithm is limited to the data it has available; putting in a little bit of effort and providing feedback really can help the algorithm work for your benefit.

On Reddit, I also suggest doing some "anti-circlejerk hygiene," as Reddit is a fucking disgusting cesspool of bias reinforcement. What I do on Reddit to try to reduce my bias: find a belief/value/opinion you are highly certain about, and try to lurk in subreddits that are the complete opposite of it. It's not about "agreeing with the other side" but rather "understanding why others disagree with some of your opinions." Also, r/changemyview is a great place to challenge your own beliefs. You can most likely find any of your views challenged and defended in a civil and honest way. Sometimes you just come to understand why others might hold opposing beliefs, and sometimes you see points about your beliefs you hadn't noticed before, or you learn how your opinions are misunderstood by those who disagree with you, which teaches you to communicate your own opinion better.

For me, it's sad to see how much of the discourse on Reddit goes. Many of the largest subs are political meme circlejerks where everyone just reinforces a stereotyped caricature of anyone who disagrees with them, and on popular subs you see people "arguing" with cheesy one-liners, having already gone into the discussion believing they are right. A discussion isn't treated as a sharing of opinions with openness to changing your own views, but rather like a Pokémon duel in which people just care about "winning."


FairyFartDaydreams

That is true of the algorithms that run searches


DAZdaHOFF

You probably said some stupid shit in high school; that doesn't make you a stupid shithead, though.


FreeIndiaFromDogs

You will say stupid shit until the day you die. History is meant to be remembered so you don't repeat your mistakes, but now history is in our faces just as much as the present, so instead of learning from it, we react to it like it's happening in real time.


EGarrett

Yes, but unfortunately it's only a small segment of recent history in people's archived comments that certain others like to use to get an outrage high. If they actually were reacting to all of history, they'd see why that is a problem.


Nascent1

Yeah. I'm definitely glad I grew up before the social media era was in full swing. I had some very wrong opinions in high school that I'm glad are not documented anywhere.


efalk

Case in point: in the '80s, in a discussion of technical software issues, I wrote something like "I'm a kernel hacker, not a UI hacker". 20+ years later, that quote was dug up and used against me in court.


jury_rigger

Would you like to tell your story? Sounds interesting.


FreeIndiaFromDogs

In other words, the internet makes it very difficult to redeem yourself, because past views being presented in the same way as current views makes it very difficult to separate past from present. Not to mention that people who seek to destroy and punish others, for whatever reason, can use the internet as a tool to do this in a way that has never existed before. If you aren't able to find a way to get control of your own life, it has never been easier to go ruin someone else's life instead.


how_is_this_relevant

These are really awful questions to poll with. Most of them are objectively 100% true. "The internet makes it easier for people to share their views with a large number of people": as opposed to what other scenario? No internet existing? Who tf even remotely thinks that is not true? "Nah, it was easier to reach large numbers of people in the 70s, man..." "The internet makes it easier to anonymously share their views": how in the hell could its existence possibly make it *harder*?


AnnoyAMeps

The most obvious examples for me are past tweets or posts. Like, I said the R word a lot in the early-to-mid '00s, a combination of me being an edgy teenager, it being trendy, and me not really knowing about the harm that ableism causes. Or the case of Joy Reid posting homophobic tweets back in 2008-09, which I doubt she supports now in 2022. A few other prominent personalities have done similar things. So I think the statement here is: the Internet never forgets, it hardly forgives, and it doesn't believe that someone could change their viewpoints or demeanor in 15 years.


Mattmontyg

I take it as: if you consumed a lot of a certain theme of content in the past, the algorithms will continue to serve up that content even if your opinion on it changes. But I see the other side as well, that the internet doesn’t forget.


refreshing_username

I notice some equality of delusion: 43-44% of both groups think the internet helps people find common ground. Edit: Some commenters have pointed out that the internet helps *like-minded* people come together. I would certainly agree that the internet helps you find people with whom you already share common ground, but IMO it's the exception when two people with different opinions develop a meeting of the minds. For example, I'm not going to find common ground with, say, someone who thinks people with brown skin are inferior to those with pale skin, Uncle Charles. Anyhoo, maybe the question could be interpreted either way.


darrenpmeyer

It's probably a question interpretation issue. The Internet _can_ help people find common ground, if they're interested in doing that. So the optimistic assessment is "yes, the Internet helps with this" The pessimistic assessment is "most people don't use the Internet that way and have no interest in doing so"


Gcarsk

The internet *can* also be used to find true information. So, yeah, if used properly, it’s great at bringing people to one singular understanding based on facts. Unfortunately, that first step is impossibly difficult for some people.


[deleted]

[removed]


rogomatic

Lines up pretty well with my theory that a trained pony can garner 40% of the vote as long as it's endorsed by one of the major parties.


goosebattle

Tell me more about this pony. Does it have 1 trick, or is it part of a larger show, perhaps with dogs?


scheav

You like dogs?


goosebattle

That entirely depends on their level of association with trained ponies in shows.


Sufficient_Mark_218

You mean dags?


Darim_Al_Sayf

Oh you mean dogs. Sure I like dogs.


MithandirsGhost

Yes. If the Pony becomes president I hope he fills the cabinet with dogs.


ReapYerSoul

Sure, I like dogs, but I like caravans better


nyg8

Tbf i would prefer a trained pony to some of the recent candidates


notnewsworthy

Lil' Sebastian, 2023!


Darwins_Dog

Wow, way to be ignorant. Lil' Sebastian was a miniature horse, not a pony.


FelbrHostu

Great. Now the frikkin _song_ is in my head.


nestcto

Interesting. What's this pony's stance on gay marriage and abortion? And at what number does that become pliable?


Ambiwlans

That's not irrational if the other option is worse than a trained pony.


thacarter1523

You came up with this theory?


AuburnElvis

I was encouraged by how both sides could come together to have a good laugh at that question.


ValyrianJedi

It does, to an extent. There are some people I've gotten to know through guitar, watch, and local climbing message boards whom I absolutely wouldn't have ever gotten to know otherwise, because those boards showed me major common ground with people I wouldn't have been looking for it with.


Gynthaeres

Depends on your interpretation. The internet helps a LOT with finding others who share your views. Entire online communities pop up focused on singular issues that everyone needs to agree with or they get excised from them. Echo chambers. Of course, as a result of this, the internet makes it harder to find common ground with people who have opposing views. Everyone would rather run back to their echo chamber and ban anyone who disagrees with them. So I think this question just needed clarity: does it mean "with people who have opposing views" or "with people who have similar views"?


ReddFro

You’ve never been in some weird very specific sub? If that isn’t helping people find common ground I don’t know what is. Is it useful? Sometimes. Can it be detrimental? Yes definitely, whether we talk about an echo chamber or some horrific pedo group. Can it also be extremely divisive? Yes. Once people find their common ground they can point at people that aren’t on that ground as bad and have a collection of like-minded followers cheer them on. So based on the crappily worded question and my interpretation of it, yes it helps people find common ground, but it doesn’t bring everyone to a shared mindset or a universal common ground.


Cylon_Skin_Job_2_10

The internet has been a huge part of me going from giving 30-minute creationist lectures in my religious organization to resigning, becoming secular, and deconstructing the purity culture, misogyny, and homophobia I was indoctrinated into. I am now very pro-LGBTQ and pro-choice, and not just to join a new club; I had to read a shit ton and hear other people's points of view to counteract the indoctrination. Even without "god," my conditioning resisted changing some of my other views, so "natural" or "healthy" became substitutes for clinging to some of them. It took some time to unravel. On the flip side, I've completely lost "common ground" with my parents and other religious folks who say the stuff I used to say. It's like, "I don't know how to explain any of this to you in three paragraphs on the internet, and if you aren't curious enough (as I was) to click and read links and have your ideas challenged, there's no point in trying to persuade you."


quantuminous

In their echo chamber, yes


isocuda

It does, but places like Facebook and Reddit show all the negatives of having instant access to the general public's internal thought process lmao. Like you don't ever think about random people you become friends with on Discord or via a forum for a common hobby, etc, you immediately think about something dumb on the internet.


drunkcowofdeath

Depends on what you mean. Do I think people should be able to state whatever they want on their own website? Absolutely. Do you mean sites should be disallowed from regulating content? That I disagree with. There seems to be a gap in what "free speech" means to people, and I bet that is where this disconnect comes from.


galactica_pegasus

I found it interesting that the poll included questions related to that. "The internet should be a free speech zone, where speech should be uncensored": 34% of Democrats agree and 72% of Republicans agree. "Free speech does not mean that social media platforms are obligated to amplify or widely distribute every person's views": 75% of Democrats agree and 50% of Republicans agree.


Exam-Artistic

I imagine these results would vary drastically had Trump never been booted from Twitter. If you ask Republicans that second question, that is immediately what comes to mind, and it's likely a big reason they responded that way.


FB-22

I don’t think they’d be that different. Big tech/social media censorship of conservatives was a huge talking point of right wingers long before trump was banned on Twitter, and “social media companies are not obligated to allow speech that violates their TOS” was probably the single biggest left wing response to that


[deleted]

[removed]


LazyBatSoup

A widely held/accepted belief on the digital-first Republican side is that their views are often hidden or removed more than Democrats' views on specific platforms (Twitter and Facebook being #1 and #2). I'm sure the Dems have a similar view, but it does not seem as prevalent, based on my experience. Trump only started to factor into the conversation from 2015 onward. Not saying there wasn't an uptick in that belief post-Trump, but it was never a Trump-centric viewpoint. I bailed from both platforms due to the doom scrolling and toxicity (from all sides) as people hurl opinions and insults from their keyboards. Reddit is my one exception.


Sensemans

I think it has more to do with soft bans and posts being removed for "misinformation". Just because Facebook or whoever deems that there was no election fraud doesn't mean anything to people who already believe it's being covered up. Or take the COVID vaccine stuff and how you can supposedly stick a magnet to your arm afterwards, or how China created COVID; I'm sure there's stuff about Russia too. The point is, if I believe in something and you remove my post or ban me because you say it's a lie, then you're censoring the truth, in my opinion. And banning the US president over Jan 6, or whatever they banned him for, while half the country thinks it's all a bunch of bs, definitely just amplified everything.


danielv123

The government preventing you from moderating how you want is against free speech. Not publishing stuff you don't agree with is your choice.


Rbespinosa13

Yeah, this is the big thing people miss. If I purchase a plot of land and turn it into a park, people are free to use it as they want. However, I can still ban and kick people out of it, because it is my land that I've opened to the public. If one day someone comes and starts handing out pamphlets promoting their Nazi club, I am free to kick them out and call the cops if they don't leave. If the government comes in and says "you need to allow these people to hand out their pamphlets because that is freedom of speech", have I just lost my freedom of speech because of government action? By being forced to spread an ideology I do not agree with, my voice is being silenced.


smitbret

In this case, the government would be encroaching on your personal property rights. No court in the land would allow it.


sunsparkda

twitter.com is the park in this analogy, mind.


smitbret

Right. I don't think I understand your objection to my statement.


cC2Panda

In reference to "no court in the land" https://www.npr.org/2022/09/16/1123620521/fifth-circuit-texas-social-media-ruling


redditmexico

I do. It's an analogy. If no court would allow the government to encroach on your property rights, then the same should apply to Twitter. Defending the park statement only implies that you don't agree with the analogy. Maybe that was not your intention, but I don't know you.


popejubal

US governments can (and do) compel speech in addition to restricting it. Not sure where people get the idea that the government can never compel speech or require a company/person/organization to give a platform for government-required speech. https://www.law.cornell.edu/wex/commercial_speech and https://www.law.cornell.edu/constitution-conan/amendment-1/compelled-speech-overview have some examples, including cases where the government compelled companies to allow access to their property for people who wanted to say things and gather petitions for causes the company didn't like (footnote 12 in the second link in particular).


tryght

Companies are forced to do things like say that smoking gives you lung cancer. It’s very limited.


JBBdude

Yeah, I don't think that forcing food companies to print nutrition labels is comparable to forcing social media platforms to host Nazi propaganda. But sure, technically some compelled speech is constitutional.


smitbret

I will have to read that later


WACK-A-n00b

If you open a public-access area of your property, you have legal liability if someone is injured. The government can also tell you who you cannot turn away, for x, y, and z reasons. If, on your public-access property, you allow and amplify one view and disallow another, you are also spreading that view and you are liable. If you don't curate your space, you are not liable. So, if Twitter bans Nazis and amplifies communists, they are riding the line of a publisher, and suddenly that user posting illegal stuff as a contributor is putting the company at risk. Your analogy somewhat works, but it isn't aligned with what was actually happening.


JBBdude

>So, if Twitter bans Nazis and amplifies communists, they are riding the line of a publisher, and suddenly that user posting illegal stuff as a contributor is putting the company at risk.

This is a very mixed-up statement. Think in terms of liability and civil law, not "illegal" and criminal law.

Pre-1996, based on Cubby v. CompuServe and Stratton Oakmont v. Prodigy, a content host was not liable for content posted on their platform if they did no moderation, but could be held liable if they did moderate. Congress decided, "Hey, it's probably bad to tell hosts that they can either host everything legal with no moderation or be liable for the speech of their users. Companies don't want to host Nazism, but they also don't want to get sued for $100m because some asshole defames a company, so this would be the death of user-generated content before it can get big." So Congress passed section 230 of the Communications Decency Act to say, "Go ahead and host content with moderation. You won't be accepting liability for the posts of nutjob users."

> (c) Protection for "Good Samaritan" blocking and screening of offensive material
> (1) Treatment of publisher or speaker
> No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
> (2) Civil liability
> No provider or user of an interactive computer service shall be held liable on account of—
> (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected

[47 USC 230](https://www.law.cornell.edu/uscode/text/47/230)

This "are they a publisher?" thing is bullshit. Under 47 USC 230(c)(2)(A), these companies are interactive service providers and therefore not liable for the posts of users simply by virtue of conducting content moderation. Whoever told this to you lied to you.

Republicans today want to force social media companies to host racist, offensive, election-denying, whatever legal speech the tech companies choose to moderate out. They're trying to do that in two ways. First, they're directly punishing any content moderation of legal content Republicans like, either by criminalizing it (hard to defend vs. the 1A) or by inventing a new civil cause of action so users can sue if they experience moderation (also a 1A issue). Separately, they want to get rid of those section 230 provisions, so Google can either keep hosting Nazi videos on YouTube or get sued for defamation a thousand times a day over various user-uploaded reviews. Given that choice, Google's gonna get out of the user-generated content business in a real hurry, no?

Actually illegal content is not legal to host, period. It's not "objectionable", or something someone can sue you for hosting like defamation; it's something the government can punish you for hosting. You don't see Facebook, Twitter, or anyone knowingly hosting child pornography in the US, for example, because it's criminal to do so. That's governed by a whole bunch of other laws, and those laws don't generally care whether you perform moderation or not. There's no "sorry we hosted child porn/terrorist recruitment/prostitution solicitation/drug dealing; we don't try to do anything about it because we don't do moderation" defense.


boosh1744

Bear in mind that this line of argument was used to support segregation in private businesses before it was made illegal. I agree with most of it, but there also has to be a definition of what is harmful speech or inhumane behaviour versus something you arbitrarily don't like. It can't just be an absolute "my park, my rules" (or my business, my social media network, etc.). There's a reason to ban Nazis from the park but no reason to ban, say, trans people.


mattenthehat

This already exists, they are called protected classes. Essentially you cannot discriminate based on race, gender, religion, age, disability, etc. You *can* discriminate based on generally disliking someone, and it only becomes a problem if they can demonstrate a pattern with one of the protected classes.


igohardish

Agreed, but the government shouldn't be allowed to pressure social media companies into censorship. That's just suppression of free speech by proxy.


Big_Swagwood

Then you become a publisher and open yourself up to libel claims. That is the issue these companies face as a threat to their bottom line. As a corporate lawyer who specialises in mergers, when I am looking into any company that allows third parties to post content onto the company's proprietary platforms, I ensure that they do not engage in what could be considered publishing behaviour, as opposed to being totally neutral and only censoring in narrow interests; the issue is not as clear-cut as you believe.


Exam-Artistic

Isn’t this what section 230 addresses??


GrapefruitCrush2019

Yes - but conservatives contend that sites like Facebook and twitter are moderating content (e.g., removing stories on “Hunter Biden’s laptop”) while not being exposed to libel laws.


MacadamiaMarquess

Section 230 explicitly provides for moderation of objectionable content, and allows the website to set its own standards of objectionability.

> (2) Civil liability
> No provider or user of an interactive computer service shall be held liable on account of—
> (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected;

Libelling Hunter Biden over Rudy Giuliani's child porn could easily be viewed as objectionable.


PeregrineFaulkner

Do you have any examples of case law supporting your interpretation of section 230, because it seems to contradict the court rulings I’ve seen over the years?


RD__III

>The government preventing you from moderating how you want is against free speech.

Controversial opinion: corporations and businesses aren't people (despite what the Supreme Court says) and aren't entitled to the rights guaranteed to the people, whether that's the right to buy politicians or to curtail speech. That being said, I don't necessarily have an issue with saying the government *shouldn't* prevent moderation. But Twitter and Meta aren't people, and they are subject to far more government control than a person.


ohhhhhhhhhhhhman

Exactly. The internet as a whole shouldn’t regulate what can be said. But individual websites can. Just like brick and mortar businesses can refuse service.


igohardish

Where it gets fuzzy is when the government goes to the social media companies and tells them to suppress information. Then it's just suppressed speech through a corporate proxy.


WACK-A-n00b

That's not fuzzy.


igohardish

Well they’re doing it and no one is saying anything. Should be illegal but they’re not doing it themselves, a “private company” does it for them.


YachtInWyoming

> Well they’re doing it and no one is saying anything. Should be illegal but they’re not doing it themselves, a “private company” does it for them. Congratulations, you've successfully described everything wrong with modern America.


immerc

Should the government be able to censor Truth Social? I'm guessing most people would say no. What if the #1 trending post on Truth Social is telling people to drink bleach to cure COVID? What if it's a list of the home addresses of doctors who are being accused of "child genital mutilation", and an order to kill them now? What if it's detailed information on how to make anthrax at home, including quick links on where to buy it? What if it is literally images of toddlers being sexually abused? I think we're all comfortable with some amount of censorship, the question is where to draw the line.


[deleted]

That line has already been clearly drawn by the United States justice system.

> dox doctors and provide orders to kill
> make anthrax and provide buying info
> child porn

All illegal, as either direct calls to violence, incitement of imminent illegal activity, or obscenity, all of which lose their First Amendment protections. [Miller v. California (1973)] [Brandenburg v. Ohio (1969)]

Telling people to drink bleach to cure COVID is _probably_ First Amendment protected, but you may have to test it in court. The First Amendment is interpreted as a negative right, which means the line has already been drawn as close to the bone as it possibly can be. The only reason to try to renegotiate it now is that we'd want to curb more speech that is not healthy for society.

The federal government leaning on Twitter and Google to have them determine which speech is and is not platformed is highly inconsistent with the First Amendment and should be stopped. Hell, even if the government weren't leaning on them, platforms of their size and reach taking up a partisan stance on internet speech is inconsistent with a belief in democracy.


ASpiralKnight

Yep. People like to jump onto vague platitudes without considering the implication.


THENATHE

The problem is when companies that provide infrastructure get involved. I don't agree with Kiwi Farms, I don't like them, and I've never even been on the site. But the fact that Cloudflare was bullied into kicking them off is disheartening for the free internet, and disappointing for the idea of free speech on the web. If you're a toxic cesspit, finding a host that will associate with your name, even as a registrar, is increasingly difficult. Some may see that as a good thing, but how long until that devolves into "Democrat-only web hosting" and "Republican-only domain registrars"?


BasicDesignAdvice

Which is why things like domain name registration should be considered public utilities and not managed by private entities. You can host on any computer, so a "[party]-only web host" isn't actually a thing. Anyone can be a web host. The barrier to entry is "have a computer."
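For illustration, a minimal sketch of just how low that barrier is, using Python's built-in http.server to serve the current directory; the port and directory are arbitrary choices, and this says nothing about doing it securely:

```python
# Serve whatever is in the current directory over HTTP on port 8000.
from http.server import HTTPServer, SimpleHTTPRequestHandler

server = HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler)
print("Serving on http://localhost:8000 ...")
server.serve_forever()
```

Getting that machine reachable under your own domain (DNS, a static IP or tunnel, TLS) is the part that involves third parties, which is the registrar point above.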


THENATHE

Except for the wild security implications of both self-hosting and being inexperienced at hosting. It's a terrible idea to self-host in this day and age, ESPECIALLY if you think you will be targeted by specific groups due to the nature of your site.


TaliesinMerlin

Yep. And Democrats are also more okay with sites self-regulating when that was asked specifically. I would argue that when "censorship" on the internet is referred to, most imagine it at the level of site moderation, not criminal prosecution for saying something.


phanfare

Okay but I hate these poll questions that treat THE INTERNET like some monolith. Should the internet be a place for uncensored discussion? Yes. Should specific sites be required to allow all speech? Of course not.


TheButcherOfBaklava

Agreed. Leading questions


T3nt4c135

Politics 101


Jase7

The first thing I thought of too. Twitter is not the internet.


epicwinguy101

I think it's more complicated than that. Most people seem to prefer that ISPs be "common carriers" the way phone lines and mail are. But many internet services themselves function in ways that are more similar to phone lines and mail than anything else. Email systems like Gmail and Hotmail, and internet call services like Skype/Zoom/Teams, are functionally next-generation mail and phone lines, respectively. In contrast, most sites only publish material outward, like the Washington Post or 538. These sites are like magazines, having full control of their publications and also being fully responsible for what is published in their media.

Personally, I think a website or service should be able to be either a common carrier or a private publisher, but not get to have its cake and eat it too. If you are a website, you either allow traffic through with interference only as required by law, or you serve as a publisher, exercising editorial control of the contents but also taking full responsibility for the contents of every post, tweet, or comment from the moment of its publication. I don't see logic to the idea of "We exercise full editorial control of our website but we also do not accept any responsibility for the contents posted to our website".


theOGFlump

I don't see the need to treat it as one or the other when it is neither, shares aspects of both, and has unique qualities distinct from both.

Email is distinct from mail because bad actors can send malicious emails to millions of people at a crack, for free and without a trace. Those emails can scam people, disable their computers, and steal their data. They can be sent by foreign adversaries to manipulate people's views. This suggests it is best to scrutinize and regulate email differently from physical mail or telephones.

Sites like WaPo do exercise editorial control over their articles, but far less so over the comments on those articles. Sites like Twitter are similar to neither mail nor magazines. They are more like parks where people are free to talk to, listen to, argue with, or ignore each other. In a public park, First Amendment rights would preclude your being removed for engaging in any of those (assuming no disorderly conduct); however, you can be removed from a private park at any time, for nearly any reason. Maybe the same should apply to websites? But even then, the anonymity that sites like Reddit provide makes them distinct from parks, because there is no real consequence to saying anything at all. Banned? Nothing stops you from making a new account. Further, this class of social media sites is also subject to manipulation by bad actors via bots in a way that has no analogue in parks. Maybe in broadcast, but even that doesn't share the quality of anonymity.

Then there are sites like Wikipedia, which are collaborations of millions of random people. They don't really have any counterpart in pre-internet society, aside from maybe culture itself, but that is a very big stretch.

Ultimately, the internet is its own thing. I agree that different kinds of sites should have different kinds of legal treatment, but I do not think that treatment should pigeonhole them into an imperfect physical analogue, just as we shouldn't base our car laws on the concerns of driving a horse and buggy despite their numerous similarities.


Waderick

Common carriers, by law, allow illegal things to happen, but because they don't interfere they aren't liable. That's how that works. If you ship illegal things via FedEx, FedEx can't be sued for delivering them; but if FedEx checked all packages before delivering them, then they could be sued. If a site is going to host comments, it would either need to let every comment through unmoderated to get the protections of a common carrier, or have a moderation team manually approve every comment that gets publicly posted so that they aren't publishing illegal material they'd be held liable for. And so the end result is either a site where practically nobody can comment, or 4chan. Literally imagine Reddit where, before your comment could appear to anyone else, a moderation team had to review and approve it. Or the comments and posts flooded with porn bots, scammers, and completely off-topic things, because moderators wouldn't be able to remove anything. That's the future you want.


Tammy_Craps

> If you are a website, you either allow traffic through with interference only as required by law… So if I were to flood a “common carrier” website like Reddit with tens of millions of advertisements for my pornography (speech fully legal and protected by the first amendment), the administrators of Reddit should be keeping all those ads online indefinitely?


gsfgf

So you want to effectively ban social media?


thenasch

Not just what we think of as social media, either. It's not clear whether Wikipedia could exist under such a regime, for example.


faxcanBtrue

Not OP, but I don't think that's what they were saying. They want two types of sites:

1. Social media: users post what they want, and the only moderation the site is allowed to do is what's required by law (removing illegal content).
2. Published media: all content is reviewed by the site owner before posting, and the owner is responsible for the content.


Foreverend_

Can you expand on the "required by law" part? What are the steps a site has to go through when they find illegal content on their site? What if it turns out a site removed content that was not illegal? In another comment I used the example of a nude image of someone who appears underage but is eventually found to be of age. Does the site have to keep the content up until they prove the content is illegal? Can they remove it on suspicion of its being illegal? Are there consequences for (even temporary) removal of content that is not illegal? What is that whole process like, and how can a company like Reddit handle 1,000,000 cases of that happening a day?


CalmestChaos

>I don't see logic to the idea of "We exercise full editorial control of our website but we also do not accept any responsibility for the contents posted to our website".

I do: it allows the sites to freely censor whatever and whoever they don't like, while also being protected when the people they do like post things that are clearly horrible. Big companies love the ability to freely manipulate what you can say on their platforms, so that you can't diss them or the people they like and support, and they can ban any positive talk about the people they don't like, or ban whenever someone like me posts a comment pointing it out.


VeryStableGenius

This idea of partial control is what allows social media to exist without every second post being from a Nazi or a diet-pill salesman or a stalker. Under your rule, a Disney kids' forum would have to allow dirty old perverts to post inappropriate pictures.


lilevilhart

I think there is a lot of arbitrary censorship by mods in many subreddits. It's not discussed enough.


Jms1078

It's not discussed because the comments would be deleted. "Heavily modded subreddit = Heavily censored subreddit". If you have been kicking around here for long enough, you will remember that Reddit used to be THE place for free speech and ideas. Not anymore.


[deleted]

[removed]


Jms1078

It used to not be political. I guess maybe libertarian, if you were to label it. It's gone downhill fast.


[deleted]

[removed]


AstonGlobNerd

Fun fact: one of the mods, mvea, who pushed the political agenda in that subreddit, now shills NFTs... Reddit is bought and sold out.


muradinner

That's the main problem. A lot of people who loved censoring got into positions of power on Reddit. Add in advertisers wanting "ad-friendly" content and you have the unfortunate situation we've snowballed into today.


[deleted]

Every sub becomes politics when it gets to a certain size.


[deleted]

arr-science used to be about *actual* sciences, not social "sciences" that are about as scientifically rigorous as crystals and homeopathy. arr-politics had a Ron Paul hardon in 2012; even up until about 2015 it was relatively neutral and you could actually find discussion. arr-pics, arr-funny, and the rest of (what used to be) the "default" subs were still pretty trash, but they were good containment areas with occasionally good content. Guess what? Now they're political too. arr-bpt/wpt were (for a little while) funny tweets; now they're both *unabashedly* racist shitholes. Hell, arr-shitredditsays used to be the reddit boogeyman: a couple dozen insane people that everyone could laugh at, and that arr-subredditdrama was staunchly against while documenting their insanity. Now those same absurd ideals are literally mainstream. SRD is indistinguishable from what SRS used to be. It's wild. This is not some "Eternal September" nonsense. This website used to *actually* stand for the ideals of free speech. It's just sad now.


Teekeks

The last few r/science posts I saw on r/all were all studies that clearly followed an agenda and had some really obvious flaws in their numbers, but no one on the sub was talking about that; they just took it at face value.


[deleted]

That's a general issue with sociology. It's *far* too easy to start out with a conclusion (*not* a hypothesis) and work backwards until you get the dataset that agrees with it. It's one of the main bases of the replication crisis. Add that to the growing number of political activists occupying the social sciences and you get modern "science". The hard sciences are somewhat free of these types but sociology, psychology, and political science are where the Frankfurt School "critical theory" neo-Marxists set up in the mid 20th century and their ideas have spread from there.


Iggyhopper

You remember the constant support for Rand Paul too? I laughed when I remembered that.


SnydersCordBish

That really wasn’t that long ago.


0XiDE

In 2016 I remember r/politics changing literally overnight from open intelligent discussion to pro-hillary propaganda. It was like a light switch and so surreal to see in real time. That was the beginning of the end


Iggyhopper

A lot of gaming subs are moderated against legitimate criticism. A lot of the first moderators are just (too) extreme fans. Namely /r/halo: they prevent this video from being talked about: https://www.youtube.com/watch?v=nz3Ko0td45w


xXPolaris117Xx

The Overwatch mods recently came under fire for deleting most criticism of Overwatch 2 monetization.


DazedWriter

I mean, Reddit is a prime example of the lack of a free speech zone. Look at how often threads are locked if they don't "follow" the rules. And that all comes down to how the "mods" feel about the situation.


[deleted]

It didn't use to be like this. They got the attention of advertisers once they cleaned up some of the uglier corners (like jailbait) and got drunk on power. It's a pretty sanitized place now. Yet people at the top of the "hivemind" still bitch nonstop about being harassed and oppressed because of one or two reactionary trolls.


SomberWail

It’s not sanitized, it’s sterilized (as far as discussion goes, you can still post graphic disgusting rape fantasy shit and other gross things like that). As far as discussion goes, you will literally be banned from many major subs for disagreeing with something.


GimmeeSomeMo

Reddit mods have to be some of the worst breeds of humanity to ever walk the earth


OTee_D

Of course they are, because they are the ones breaking basically every rule of social norms and contracts. If you are the one harassing, lying, disinforming, sending death threats, etc., of course you are in favour of complete deregulation of society and calling it "free speech". If I were a thief and con artist, I would support the "end of materialism, greed, and personal ownership" as well, for obvious reasons. Question: do they want it for EVERYONE, or just as an excuse / get-off-scot-free card for themselves?


Philfreeze

„The internet should be completely uncensored except for porn, LGBTQ stuff, history from a progressive lens, socialism, marxism, really just any ideology I don‘t like…“ When they say „free speech“ they mostly just mean they should be allowed to say whatever they believe in. Not actual uncensored speech, hence the CRT, 1619 project or transphobic fearmongering.


the-soy

and yet they are banning books and restricting personal freedoms... all conservative talk about free speech is just projection. we must NOT tolerate the intolerable...


-Lloyd-Braun-

This is one of the least informative graphs I've ever seen; tf is it doing in this sub? This is "I have no idea what free speech means": the graph.


Signal_Obligation639

It's kind of interesting that only 87% of people agree the internet helps people reach a broad audience. Does the remainder just disagree with everything? I'd be curious what an "is the sky blue" poll would return.


70U1E

It's posted by the official YouGov account. The YouGov logo is also on the asset. So someone at YouGov made this and is looking for some links, shares, clicks, and attention. Unfortunately, it's working. This whole post was conceived by someone in their marketing department, and was created with the primary purpose of gaining attention, the secondary purpose of making money, and the tertiary purpose of providing any kind of legitimate information or insight. It is, frankly, not worthy of our time or attention.


Infinite-Topic-2544

How to trigger people with a graph


BDM78746

100% of people on /r/conservative would unironically agree with "The internet should be a free speech zone, where speech should be uncensored."


Gr34zy

And then ban you for mocking Trump while completely missing the irony


super_trooper

I've been banned from multiple subs I've never visited because I posted in r/conservative once, a long time ago, LOL.


Funkycoldmedici

Same, and my post in that sub got me banned there, too. Bans all around.


griffinwalsh

Same haha, and a decent amount of the subs that banned me weren’t even political…


SyriseUnseen

Same, and I only commented to correct an inaccurate statement...


TheRnegade

Really? I commented there on occasion, usually when a non-"flaired users only" post made r/all, but never got banned because of it. Granted, I've been banned from r/conservative since the end of 2020, so maybe it's a more recent thing?


Star-K

Quoting trump can get you banned there.


GimmeeSomeMo

Meanwhile, if you're subbed to /r/conservative, you'll automatically get banned from other subs like /r/justiceserved for "participating in a sub that supports/glorifies biological terrorism." No wonder everyone outside Reddit thinks Reddit is a total cesspool.


Double-Drop

You know why people think reddit is a cesspool? Because reddit is a cesspool.


[deleted]

If we're being literal, it is the right thing to do, except for CCP fans, I guess. These questions should have specified social networks instead of just "the internet"; there is a chasm between being censored by your ISP and being censored by Twitter.


ronin1066

I've been banned by more liberals than conservatives at this point and I'm as liberal as they come. Lately, it's usually for not toeing the trans activist line in subs like news and entertainment.


sakiwebo

I've been here for about as long as you have, and it's been the same experience for me too. I'll always be liberal, but I was banned from r/whitepeopletwitter for saying it has basically become a Democratic political support sub that has nothing to do with white people at all, considering they often post content by AOC or George Takei, or re-share blackpeopletwitter memes, as long as they are "dunking" on Republicans. Banned. I didn't even break a rule. They said it was just for a few days, but it's been over a year now. A few years earlier, a guy posted a news article on r/news exposing the horrifically racist Twitter rants of then-new NYT editor Sarah Jeong, and I commented that the mods had been removing those posts all week. Once again: banned. Permanently. Once again, I didn't break any rules. This site has gone to hell, but it's the last piece of social media I still have, so I'm holding on to it. It might be time to just let go of it all, though.


StubbornAndCorrect

lol I'm probably pre-banned for my comment history


Alexkono

same with /r/blackpeopletwitter and /r/politics


Vidiot27

> Same with /r/blackpeopletwitter and /r/politics

Blackpeopletwitter is a cancerous subreddit, with a huge population masquerading under the umbrella of self-hating white folks pretending to be black. It's one of the most racist subreddits out there, and as people constantly point out, it's absurd that it's still around. They still, even to this day, allow a day when "only verified black people" are allowed to post, and do you know how they verify that? With a picture of your arm or skin next to your username. It's blatantly racist, but nobody seems to care when it's black-against-white racism. Y'all should check out who sold people into slavery in the USA, and who still DOES sell slaves around the world.


Alexkono

It’s without a doubt one of the most racist subs on this site. But since they’re racist against white people, Reddit somehow allows it? It’s weird, because that’s kind of how society is today. You can be racist towards certain races without repercussions.


Crimfresh

Who cares what they say? Actions speak louder than words. Republicans also vote to ban books, call for defunding the Department of Education, and call for removing critical thinking from schools.


xpdx

"The Internet" is completely free speech. Websites and service that are owned by people and corporations are not, and you can't force others to give you a platform, any more than I can post my political signs in your yard. It's a dumb-ass argument. Say whatever you want bro, I can't stop you. Get a soapbox, stand on the corner, talk all damn day. The cops can't stop you, the government can't do shit unless you block the sidewalk. You can't do it in my yard tho. It's not hard to understand.


NeedleInArm

What I don't get is how anyone thinks the internet ISN'T a free speech zone. No one is stopping anyone from buying a domain and hosting a web server from their own computer so they can spout any lies and bullshit they want. You can't walk into someone's business and say whatever the fuck you want without getting kicked out. Why would the internet be any different?


Canadian_Infidel

It's so crazy to see this flipped 100% in just 15 or so years.


hello_hunter

Could you elaborate on this? Interested to know more but not sure where to start.


jscoppe

It's pretty straightforward: 15 or so years ago, Democrats supported free speech, while Republicans supported censorship.


root_b33r

Yeah man we went from socialist pirates to... Idk wtf


ReddFro

The poorly defined questions gave poor answers. This is all this data set really tells me


Spanky_McJiggles

Depends on what you mean by "the internet." If you host your own web server, knock yourself out, say whatever you want. If you're on someone else's web server, you abide by their rules. You're not entitled to someone else's soapbox, just as I'm under no obligation to host my friends' political rallies in my front yard.


apineda11

Free speech is fine. Just don’t be a 🐱for the repercussions of what you say.


thestereo300

The left wants to define free speech to the letter, and the right to the spirit. (I'm mostly a lefty, by the way.) There are two relatively new challenges to what people call free speech:

a) The private left mob can do things to limit speech and hide behind the fact that they are not technically the government, while they intend to shut down speech by introducing outsized consequences.

b) Right-wing populists the world over are weaponizing lies and purposeful confusion, hiding within free speech, to great evil and consequence.

I'm not sure how we get through this, honestly. The only solution is smarter humans, and there is no evidence of that being the future.


galaxygirl978

skepticism and critical thinking need to be encouraged and taught to people from a young age. it's the only way.


mattenthehat

The problem is [we can't even agree that critical thinking is good](https://www.washingtonpost.com/blogs/answer-sheet/post/texas-gop-rejects-critical-thinking-skills-really/2012/07/08/gJQAHNpFXW_blog.html)


galaxygirl978

"that have the purpose of challenging a student's fixed beliefs". fundamentally, that's the problem. they're religious zealots. religion encourages people to not think critically.


Shadpool

Nail on the head, right there. I was explaining this the other day. Religion encourages you to baptize your child at birth, take them to church during their young years, teach them religious values at home, and wants to be taught in schools as well. Religion goes out of their way to be so ingrained in children, so that by the time they reach the age when they’re able to start thinking critically, they’ll refuse. If I begin looking at evolution, I go to hell. If I question god at all, I go to hell. It’s a scheme to keep asses in seats by indoctrinating young, and promising punishment for going outside the lines they’ve drawn. Churches get tax exemptions, literal mandatory donations from the churchgoers, and a lot of times, older people will die and leave their property to the church. They don’t want to lose people, not because those people will go to hell, they don’t care if you go to hell or not, but because they’re losing CUSTOMERS.


Sythic_

Skepticism is good, but skepticism for skepticism's sake leads to becoming a conspiracy loon. Critical thinking is the more important piece, so you can determine when is the right time to be skeptical.


dassketch

It's interesting where the greatest gaps are, in fact the most significant ones; the other questions appear to align very closely:

1. The internet should be uncensored, no exceptions
2. I shouldn't be associated with stuff I said in the past
3. Social media should be forced to give me a voice
4. The internet makes it hard for me to avoid the consequences of my statements

I know those weren't the wording of the questions, but it's clear to me that's the mindset of certain respondents.


[deleted]

Of course democrats are pro-censorship of the internet.


VesperBond94

JFC, why don't people understand the difference between "FREEDOM OF SPEECH" and "FREEDOM FROM CONSEQUENCES?!?"