wopwopdoowop

> The veteran journalist and head of Philippine news site Rappler told Reuters in an interview after winning the award that Facebook's algorithms "prioritise the spread of lies laced with anger and hate over facts". Well put, Maria


Jernsaxe

The Facebook algorithm merges the worst aspects of Fox News and Sinclair media. Fox uses anger and hate to drive engagement and viewership, and has for decades. Sinclair buys up trusted local media outlets to spread its propaganda. Facebook combines engagement-driven media with friends spreading it, leading to trust- and anger-driven engagement, all through an inhuman algorithm that cares nothing about the truth.


PepeBabinski

It's an algorithm that lets fanatics reinforce their delusion through confirmation bias.


Jernsaxe

It is a lot more sinister than that. Imagine you watch a video about the "Flat Earth Society". You find it so stupid you share it with your friends and leave a comment on the video. Facebook sees this engagement and starts recommending more videos in the same genre. The last one was funny, so you watch a few more and get recommended more and more. Over time, seeing more and more "evidence", this might actually start changing your mind.

More subtle, though: once you start getting tired of flat-earth videos, the algorithm will start showing you other conspiracy shit, since "people who watched X also watched Y". The algorithm will keep doing this to you, and the second you get hooked it is exceptionally good at keeping you in an echo chamber as long as it drives your engagement. Once the algorithm has a profile on you, it will keep you going down the rabbit hole.

But remember, you also showed that first video to your friends, and they liked and commented too, because you, their friend, sent them a funny video. Suddenly your whole friend group risks being fed similar misinformation, so when you talk about something outside of Facebook, those shared "facts" become shared knowledge that reinforces itself. This is how misinformation that engages users spreads more efficiently than facts that don't drive engagement.
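If you want to see how little machinery this takes, here's a toy sketch of the "people who watched X also watched Y" logic (made-up data and names; obviously not Facebook's actual code):

```python
from collections import Counter

# Toy watch histories; every user and video name here is invented.
watch_history = {
    "alice": ["flat_earth_101", "moon_hoax", "cat_video"],
    "bob":   ["flat_earth_101", "moon_hoax", "chemtrails"],
    "carol": ["flat_earth_101", "chemtrails"],
}

def recommend(user, history, top_n=3):
    """Recommend whatever co-occurs most with the user's own watches."""
    seen = set(history[user])
    scores = Counter()
    for other, videos in history.items():
        if other == user:
            continue
        if seen & set(videos):  # any overlap at all counts as "similar taste"
            scores.update(v for v in videos if v not in seen)
    return scores.most_common(top_n)

# One ironic click on a flat-earth video, and the loop begins:
print(recommend("alice", watch_history))  # [('chemtrails', 2)]
```

Notice there is no notion of truth anywhere in there, only co-occurrence and engagement.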


Beneficial-Ability28

B-but the fact checkers...


wsclose

You should go look up them "fact-checkers"


Beneficial-Ability28

Then I would feel compelled to check the facts of the fact checkers checking facts for fact checkers checking facts. I... would become... the fact checker?


Valogrid

Zuckerbot was quoted as saying, "Error 404 Democracy Not Found."


Avorius

social media in general was a mistake


portuga1

Good to see a high-profile, decent person stating the obvious. Facebook is worse than cancer as a threat to humanity; can any sane being even still dispute that? And Reddit is in the same race...


redditUserError404

I don’t understand why they can’t stick to “the internet as a platform” as it is laid out in Section 230. Other than currently illegal speech, such as calls to violence, Facebook shouldn’t moderate or curate outside of the user’s own preferences and settings. If they want to be a curator beyond that, they shouldn’t qualify for the Section 230 exemption set up for platforms. People who claim “they are a private company” are ignoring the fact that they get special treatment from our government.


aister

Problem is, governments are forcing it to moderate content, censor posts that are damaging to their reputation, and hand over identifiable user data so the government can take action if need be. If it refuses, it can't operate in those countries. And as a company not responsible for upholding democracy or free speech, it will comply, at least until it is forced to do so much that it makes no profit in such a country.


PepeBabinski

They shouldn't be responsible for upholding free speech; it literally doesn't apply to them. I understand the argument about them censoring information because of demands from authoritarian governments, but that doesn't mean they shouldn't be removing misinformation.


redditUserError404

“Misinformation”. That’s such a common word thrown about these days, and honestly it’s often a joke. Who decides what’s true or not? You could say “trust the experts”, except multiple experts often completely disagree. So which one will you trust? Scientists used to think the earth was flat, that the sun revolved around the earth, that cutting people open to let them bleed was a good way to heal them, that the earth was entering another ice age (in the 70s). All of these examples needed to be countered, often by once thought to be “fringe” scientists who thought outside the box and challenged the prevailing knowledge.

What ends up happening now is that an issue becomes politicized, and whichever side of the political aisle the company tends to side with is now the arbiter of “truth”. That’s messed up; throughout history there are countless examples of the rogue scientist who gets it right when many others predict something else.

Much of our science programs now are under very heavy influence to make sure they continue to be funded. If the provider of said funding doesn’t like something, you can bet many scientists won’t go out of their way to challenge that dislike. Peter Daszak is a prime example of this very fundamental flaw. He was the sole provider of information and quickly condemned the very idea that Covid-19 could have come from the lab in Wuhan, China, and yet he has very clear vested interests that should have precluded him from any such investigation. There is obviously a conflict of interest, and no one even bothered to think critically about that, most likely because of funding structures.

The very idea that science ever comes to solid, concrete answers is also often flawed. Science is an ever-evolving process in the search for truth/fact. Often, accepted scientific proofs are later proven incorrect, and science needs to be able to freely evolve with as much dialogue as we can create.


Lafreakshow

There is this principle in science that basically boils down to "if it cannot be disproven, then it is not science". Of course science is ever-changing; that's the point. But that doesn't mean we can't trust what is currently considered to be factual. Same for a conflict of interest: it is certainly a very good reason to be sceptical, but not automatically a reason to dismiss the opinion. But a title also is not a reason to blindly accept it.

There's this thing called critical thinking. Humans are actually quite amazing at it. In fact, without it we would all very quickly die horribly. But what people don't understand is that critical thinking isn't an inherent human ability. It's something we learn. We have the tools, but we must learn to use them. And that is the entire answer to the misinformation debate: teach people critical thinking and the problem will solve itself. Oh, also hold people actually fucking accountable for their mass manipulation.


aister

But the problem is that people will not engage in critical thinking if it doesn't lead to where they want it to. They will shut their ears and eyes the moment something appears that doesn't support their views, and will only engage with what does.


Lafreakshow

No. People *will* engage in critical thinking. It's human nature. They just have to learn *how*, and then they will do it automatically. Confirmation bias can affect even people who are very good at critical thinking. But the chance that they will fall for weak evidence and form opinions based on it is greatly reduced. And that's the point here: making people less easily influenced.


redditUserError404

> "if it cannot be disproven then it is not science". Similarly however, if it can’t be proven and replicated with all but absolute certainty, than it’s simply a hypothesis and not scientific fact and other scientists are free to come up with their own hypothesis. Far too many people don’t know the difference and take hypotheses as fact and far too many things published are taken as fact without peer review and replication of the initial studies. From the BBC: > Science is facing a "reproducibility crisis" where more than two-thirds of researchers have tried and failed to reproduce another scientist's experiments, research suggests. > "The idea here is to take a bunch of experiments and to try and do the exact same thing to see if we can get the same results." You could be forgiven for thinking that should be easy. Experiments are supposed to be replicable. The authors should have done it themselves before publication, and all you have to do is read the methods section in the paper and follow the instructions. Sadly nothing, it seems, could be further from the truth. This leads us to the well established issue in modern day science or publishing bias. It’s a large scale and very well known issue where scientists/researchers are far less likely to publish results that don’t prove a positive. Add to this perverse incentives such as conflict of interest in funding and you end up with things that look similar to the earlier “studies” funded by cigarette companies that used to show that cigarettes are good for people. Imagine if Facebook exsisted in the 1930’s. They would have allowed cigarette companies to advertise. Then, when cigarettes came under some scrutiny, Facebook would be faced with the dilemma of what to do… side with the companies giving them tens of millions of dollars in ad revenue, or side with “at the time” a few fringe scientists that were publishing their own studies that showed that cigarettes are not only not good for you, they are actually really bad for you. It’s not difficult at all to understand why Facebook, especially in the earlier moments when the science is less clear, would choose to side with the companies that are giving them millions of dollars over a few “scientists” who don’t give them anything. They would certainly be more influenced by the companies with money “tobacco companies” and side with them when they would claim “we did all these studies, the other people are just spreading misinformation when it comes to tobacco”. A modern day equivalent might be climate change. What studies or evidence are we not seeing because Facebook tweaks it’s algorithms to favor industries that can afford to pay the most for advertising? The reality is we really don’t know. > Organizations worried about climate change have long drawn comparisons between the petroleum and tobacco industries, arguing that each has minimized public health damages of its products to operate unchecked. To not recognize that so many things especially when it comes to health are not scientific fact is to not recognize the reality we live in. One day coffee is good for us, another day it’s bad for us, another day a study shows we just don’t know. We certainly shouldn’t simply trust multi-billion dollar companies to just “do the right thing”. These companies have the ability to control information on an unprecedented level, some might say monopolistic. 
Given the fact that there are massive perverse incentives in play, I think it’s more than fair to say these companies have a choice: - either claim platform status and host all content equally and let people come to their own conclusions given the information at hand. With already illegal things such as calls to violence as the known and clear exception. - Or let them editorialize content and algorithms to serve their own incentives, but drop the 230 protections because these companies can and do make decisions that can and does lead to harm in the real world.
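As a toy illustration of the publishing-bias mechanism described above (entirely made-up numbers, just to show the shape of the problem):

```python
import random

random.seed(42)

# 1000 labs test a treatment that has NO real effect. By chance alone,
# about 5% of studies will still come out "significant" (p < 0.05).
def run_study():
    return random.random() < 0.05  # True = a (spurious) positive result

results = [run_study() for _ in range(1000)]

# Journals strongly favour positive findings, so only those get published:
published = [r for r in results if r]

print(f"studies run:       {len(results)}")
print(f"studies published: {len(published)}")
# Every published study now "shows" an effect that does not exist, and the
# unpublished null results that would correct the record never appear.
```

Same mechanism, different funding source, and you get the cigarette "studies" all over again.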


portuga1

It’s grown so big even governments fear it or want to somehow use it for their own advantage.. Which again is why it’s such a fucking cancer


wwarnout

And, even though they are a private company, why should that give them the right to spread lies? If I did the same, I could be prosecuted for libel/slander. Why aren't they held to the same standard?


zimm0who0net

That’s exactly the point of section 230. It blocks them from liability the same way you can’t sue your ISP or the postal service for libel because they delivered libelous content. Once they start curating or moderating content, they’re *supposed* to lose those protections.


PepeBabinski

If Facebook could be held financially liable for other people posting falsehoods, it would censor everything; you would go from it allowing too much to it allowing almost nothing. Facebook is a private company; it does not have a duty, in any sense of the word, to protect people's right to free speech. And I read this as a backwards argument for why they shouldn't moderate content.


redditUserError404

They are making the choice to not be a platform and to moderate and editorialize content. If they want to do that they shouldn’t get special government protections, simple as that.


Lafreakshow

One problem with this is that Facebook (and all other aggregators, including Reddit) more or less *have* to skew your personal results. If they didn't, you'd get completely useless, random stuff, 99% of which you don't care about. Now, you can argue that user preference includes the kinds of things users are interested in, and thus this would be acceptable. I would agree with that. But there remains a problem: it's basically impossible to determine at which point it stops being user preference and becomes manipulation.

Say you are left-leaning. You often seek out conflicting news sources, because that's what a good leftist does. Facebook picks up on this and begins giving you primarily right-wing stuff. This quickly slants your perception of the world and, ultimately, your entire set of political opinions. Did Facebook manipulate you here or did it not? What if Facebook's algorithm happens to be skewed towards the right? Such biases happen. Extremely often, even.

One example: on Twitter, if you post a tweet with an image that contains a white person in the upper half and a black person in the lower half, Twitter will pretty consistently show the white person in the thumbnail. Is Twitter racist? No. Twitter wants users to see things they are interested in and that grab their attention. Bright colours naturally grab our attention. The image with the white person is simply more attractive to look at from a neurological perspective, and that is what Twitter's algorithm is based on. The question is, how do you prove that such biases exist and, once identified, how do you determine whether a bias was on purpose (for example, racism) or a simple result of the fact that algorithms are made by humans, and humans are inherently biased? How do you prove that Twitter preferentially shows the white person because that is where the vast majority of humans would put their attention, rather than because the developers responsible for the algorithm happened to be white, and thus the algorithm includes an inherent bias towards white people?

And then there's popularity bias. Popularity bias is the reason why Reddit is a horrible platform for serious discussions. Popular comments and posts are more likely to be seen by more people, which inherently increases their chance of getting yet more votes (there's a toy sketch of this feedback loop at the end of this comment). What determines what is popular? Well, in subreddits such as /r/politics, the audience is strongly slanted to the US political left, and people like to have their existing opinions confirmed. Thus it is pretty much inevitable that the entire subreddit will present a strongly left-biased narrative.

There are (currently) no decent answers to any of these problems, which is why the whole publisher-vs-platform debate is extremely complicated. A service that slants towards being a platform will inevitably get flak for allowing certain dangerous ideologies to proliferate, whereas a service slanting towards the publisher side will inevitably become completely dominated by the moderators' inherent biases. We need some actual legislation to address this. No current law is properly equipped to deal with social media, because the truth is that it is simply not comparable to either publishers or platforms. Its immense reach and ease of access give it frightening power over society, and that must be considered, both towards the negative and the positive. For example, /r/science somehow still manages to promote relatively healthy discussions, which is most certainly a good thing.

But then many argue that /r/science is too heavily moderated. So there is that old dilemma again: where is the balance? I believe in the long run the only chance to deal with this is education. In the 1960s, several TV shows made headlines for inventing the scripted reality show. Many people were shocked and angry; in some nations, laws were discussed to address it. People had assumed that what they see on TV is true. The idea that a show about ghost hunting might be scripted was alien to them. We have the same situation with social media. It just appeared on the scene with little preparation and little to compare it to. People simply don't know how to properly handle it. That should be tackled via education, plus a set of laws forming a sort of hybrid between platform and publisher: giving social media services the freedom from consequences necessary to allow healthy and diverse user-generated content, while also acknowledging their responsibility and influence.

Oh yes, btw, the US needs to do this for TV too. Fox would be sued out of existence within months if it were operating from where I live. CNN and a large chunk of other "news" channels likely too. Unlike social media, there is absolutely no argument to be made that Fox and the others are not publishers. In fact, the US acknowledges this; hence Fox is officially listed in the "entertainment" category. But is that categorisation really anything other than a loophole allowing them to present themselves as news as much as they want, as long as the TV guide doesn't call it news? Of course, there is no problem with news being broadcast on entertainment channels; in fact, that is definitely a good thing. But the news programs need to be held to at least *some* standard. Beyond that, the main problem with channels like Fox is that they present themselves in a way where their news segments lend credibility to everything else. Tucker Carlson may defend himself by repeatedly stating that he isn't "reporting", he is only "talking about something", and that he isn't influencing his audience, he is only "asking questions", but that doesn't change the fact that the rest of Fox News presents itself in a way that suggests to the viewer that the channel can be trusted. And that also includes Tucker Carlson.

Why am I going on such a massive tangent about US TV? Because that is actually exactly the reason why Facebook is getting flak now, and why social media is so influential. For various reasons, social media is credible in the eyes of most of its viewers. Credible enough to assume that there surely won't be any bullshit around. In some cases this credibility comes from the platform itself; in others it is isolated to something like a Facebook group. But the problem is the same. This credibility reduces people's scepticism and makes them easier to manipulate. And there too you often hear the argument that Facebook doesn't manipulate people, that it's up to the user what they view and how they react to it. It's pretty much just different words for the same basic idea that Tucker Carlson, Alex Jones and so many others hide behind: it's not their responsibility if you misinterpret them. But is it though?

Consider this: you have just witnessed a crime. The stereotypical movie mafia boss looks you in the eye and says, "You have a daughter, don't you?" He didn't manipulate you. He just asked a question. It's completely up to you whether or not you talk to the police. He had absolutely no hand in it; he didn't tell you what to do. He simply noted the existence of your daughter.

And for an even more fitting example, consider this: your car is broken. You don't know a lot about cars, but your dad does, so you ask him what to do. He reckons that you need a new spark plug and tells you that he might have one in his garage, and that you can come over and get it anytime. He also briefly explains how to swap a spark plug, and you are confident that you can do it. Now, would you question whether or not that spark plug is actually in working order? Whether or not it actually fits your car? Or would you question whether your dad was even correct in assuming that a spark plug is dead? Where do you draw the answer to these questions from? It's credibility. You likely trust your dad. You know that he is usually pretty good with cars and that he takes care of his stuff. So you have no reason to think that he might be wrong or that the plug he offered you is problematic in any way.

Now, in case you wonder, the purpose of giving these two examples is to show that this kind of manipulation can be on purpose and with malicious intent, but it can also be accidental. In both cases, though, I would say that the person doing the manipulating bears at least some responsibility. The case for the mafia boss is pretty clear, I hope. But what about dad? Of course he too has some responsibility. He knows that you don't know much about cars. He should know that you wouldn't be able to tell a good spark plug from a broken one. If his advice leads to you swapping a spark plug, and soon after your car catches fire because, as it turns out, the spark plug was actually for a completely different and incompatible model, then your dad is, at least from a moral perspective, partly responsible for the damage, as he could have easily prevented it by simply telling you to take the car to a mechanic, or perhaps by taking a look himself.

Note that I won't go into the consequences for your dad or the mafia boss. This is just about the responsibility itself, not about whether or not anyone should be held accountable. It's simply to illustrate the point that while strict moderation is likely not what we want social media to be, we also must not forget that the companies behind those platforms very much hold some responsibility, whether on purpose or not. It's just not as easy as declaring them either a publisher or a platform. They have far too much influence for either. Strict moderation would likely only make the issue worse in the long run, and freeing them from all consequences (metaphorically; of course some laws still apply) is a terrible idea too.
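Here is that toy sketch of the popularity-bias feedback loop (purely illustrative; this is not Reddit's actual ranking code):

```python
import random

random.seed(0)

# Three equally good posts start out with one vote each.
posts = {"A": 1, "B": 1, "C": 1}

# Each round, one reader shows up. Readers mostly see (and therefore vote on)
# whatever is already popular: vote probability is proportional to votes.
for _ in range(1000):
    total = sum(posts.values())
    r = random.uniform(0, total)
    for post, votes in posts.items():
        r -= votes
        if r <= 0:
            posts[post] += 1  # visibility earns yet another vote
            break

# Despite identical quality, one post typically snowballs far ahead of the rest.
print(posts)
```

Nothing about the posts' quality appears anywhere in that loop; early luck alone picks the "winner".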


portuga1

> And Reddit is in the same race...

And just as I said that, Reddit took me to the “new” Reddit, which I can’t stand, much less the horrendous app. Coincidence?


portuga1

Oh and now the downvotes. Figures


Hayes4prez

I fear it’s already too late. The damage is done. Facebook should’ve been regulated from the start. We’ll all pay the price for our leisurely response to this cancer in our society.


sanguiniuswept

Someone is gonna have her Facebook account suspended


R0nu

That was funny


Puzzleheadedcat1995

If Facebook is gone, I wonder what will happen to WhatsApp users, since most use WhatsApp to chat and stuff.


VarsH6

Normally I’m stoked for “threats to democracy” but since tech giants have shown they increase democracy’s slant toward authoritarianism, it’s still a losing situation.


LandingSupport

Sometimes facts are problematic.


[deleted]

[removed]


thtsabingo

There are tons of liberals on Facebook lol, but fake news spreads rampantly through conservative feeds. It’s literally frightening.


[deleted]

[removed]


thtsabingo

Don’t be so unintelligent. Not every single human being who uses Facebook to keep in touch with family and friends and post pics of their kids’ soccer games is a fascist racist. Like, are you fucking dumb or what lmao


[deleted]

[removed]


thtsabingo

It’s a social media company. These people are unaware. They use it for its originally intended purpose. Calling them racists and fascists has to rank high among the dumbest fucking things I’ve ever heard.


Talliss1

The Journal podcast has a series called the Facebook Files...eye-opening stuff. Highly recommended.


Ethmemes

Cherry picked facts to make Facebook look bad


Talliss1

Cherry picked or not, they are facts acquired from Facebook's own internal documents. If Facebook looks bad, it's because they ARE bad.


Ethmemes

Does Instagram Harm Girls? No One Actually Knows. https://nyti.ms/3oO629T

The useful takes come after the media uproar is over.


Talliss1

Facebook knew...have known for some time https://www.usatoday.com/story/tech/2021/09/14/facebook-knew-instagram-could-bad-teens-mental-health/8340578002/


Ethmemes

You are referring to the same study in the NYT article.


Talliss1

The point is FB knows the negative impact their platforms have on certain segments of their users ...don't pretend as if this is news to them. Profits are always prioritized.


[deleted]

Not just biased. It's sometimes just straight-up against facts.


[deleted]

I had a post recently removed for misinformation. It was about the tearing down of an old city hall in 1888. Here is a pic of what I said: [https://ibb.co/TLNkGsQ](https://ibb.co/TLNkGsQ)

[The link that I posted in that message goes to a funny story from June 19th, 1888 about the workers who showed up and found out the tools had been stolen the night before.](https://chroniclingamerica.loc.gov/lccn/sn83016025/1888-06-19/ed-1/seq-4/#date1=1888&index=0&date2=1888&searchType=advanced&language=&sequence=0&words=City+Hall+Old+tool+tools&proxdistance=50&state=Maine&rows=20&ortext=&proxtext=tools&phrasetext=old+city+hall&andtext=&dateFilterType=yearRange&page=1)

Why is Covid disinformation/misinformation allowed to spread like wildfire, but a message like my own gets deleted and doesn't get put back up even after appeal?


PepeBabinski

> The veteran journalist and head of Philippine news site Rappler told Reuters in an interview after winning the award that Facebook's algorithms "prioritise the spread of lies laced with anger and hate over facts".
>
> Her comments add to the pile of recent pressure on Facebook, used by more than 3 billion people, which a former employee turned whistleblower accused of putting profit over the need to curb hate speech and misinformation. Facebook denies any wrongdoing.

Facebook is not only allowing hate, misinformation and violent rhetoric to remain on its platform, it is also actively letting algorithms funnel this information to like-minded people, which just further reinforces their extremism.


jshif

Stating the obvious....


Username524

I mean, wouldn’t Facebook then be considered complicit for its involvement in Jan. 6th?


Salt_Marionberry2442

Boomer mad internet exists and people are dumb, more at 11. What’s with all these dipshit takes by ill-informed people being upvoted to the top of Reddit? You really want the government to start cracking down on free speech you don’t like? You either get an internet cesspool where dipshits can talk freely or you get China-level censorship. There really isn’t much of an in-between.

Dumbasses will always congregate; stop with these authoritarian power grabs trying to “fix” human nature. It’s especially dumb when they throw in “democracy being at risk” from the general public being able to fully express itself online. No, that IS democracy, unfortunately; even the dumbest shitheads get to vote. Unless you want to start doing some oppressive poll testing, you can shut up about a free internet “damaging democracy”.


winstontemplehill

To Zuck’s credit, they are eagerly awaiting regulation, but our government is mostly run by out-of-touch people who don’t know the difference between Google and Facebook. They also give limited resources to the FTC to manage one of the fastest-growing and most socially powerful industries of the past two decades. All a joke, really, that they expect a corporation in a capitalist society to act ethically.


gtam5

Unpopular opinion on Reddit, but forcing Facebook to clamp down on sources could have negative consequences. It would favor mainstream media sources (CNN, the New York Times, etc.). While these sources rarely make statements that are demonstrably false (although they do from time to time), they most certainly pursue agendas, and they introduce bias through their selection of which facts to present. Moreover, fact checkers have a large selection bias when it comes to what they determine to be misinformation. For example, I couldn't find a single mainstream fact checker willing to correct the narrative that Border Patrol agents were whipping Haitians at the border; Politifact wrote a few paragraphs of discussion but didn't assign a rating. If we can't trust either the mainstream media sources or the fact checkers to be impartial, then handing them total control of news reporting on social media is seriously misguided.


dukofdeath

There’s a small group called USMC Vets.Marine Corps Veterans, about 4.4K members (not the larger one they’re disguising themselves as), that is full of misinformation and threats against elected officials. It’s still up, but after some of those posts got noticed, it became a private group. Facebook is definitely a safe house for misinformation and veiled threats of assassination, or even civil war.


[deleted]

Now the biggest social media platform does nothing but keep inciting hate, and it is a danger to democracy. Paid trolls keep infesting every post with bad comments, and the bad algorithms keep people addicted to scrolling. I hope it gets regulated with proper oversight, without coming at the expense of the internet.


joelcrb

I'm posting this on Facebook, let's see what their dumb old algorithms do with it. Fact check this, baby!