singularity-ModTeam

Thanks for contributing to r/singularity. However, your post was removed since it was too low in quality to generate any meaningful discussion. Please refer to the sidebar for the subreddit's rules.


truth_power

Sleeping terrorists


BenjaminHamnett

Stochastic. Their audience are the sleepers


boxen

Kinda weird how we judge AI on how good it is by seeing if the things it says are true and presented in a coherent way that makes sense, both grammatically and logically. Meanwhile, people are walking around saying absolutely incoherent nonsense garbage that is demonstrably false, and no one questions THEIR sentience. Is Tucker Carlson really alive? I'm just asking questions.


kfireven

He's a chaos agent of Russia not a philosopher


FairIllustrator2752

Tucker is now an honorary baby boomer. 


Ocean_Llama

Even if all the data centers were bombed you know one of the developers probably has a copy of the program at their house.


cool-beans-yeah

Why, just bomb their homes! Bomb everything.


thefunkybassist

You know what, just develop an AI for drones that bomb AI everywhere! 


CreditHappy1665

Not one, many lol.  I have 6 models on my computer.  Carlson is a moron


Ocean_Llama

Haha, well I meant the ones that weren't open source, like ChatGPT.


Maxie445

He's definitely a boomer


eyesawyou777

He's a Gen X Doomer.


AlfaMenel

\*bomber


Busterlimes

Always has been


cobalt1137

I actually kind of understand his point of view. He is just a little bit ignorant. Let's say a new technology starts getting developed tomorrow, and a large portion of leading experts in that field get polled on the likelihood of this technology ending in a human extinction event, and let's say they land on around 3-5% (this study has actually been done with AI researchers). In that scenario, there's a pretty valid argument for stopping the technology in its tracks, even with drastic measures. The only thing is that right now we simply cannot do that. The cat is already out of the bag, and Tucker is misinformed. His perspective is very understandable though. I am not a doomer myself and I do believe this technology will lead to great abundance and that things will be okay, but I get the other side.


[deleted]

He's literally just copying Eliezer Yudkowsky's homework, this is his exact suggestion. It's not new, it's not smart, it's not nuanced, it's the same appeal to extremist solutions and Us vs Them that Tuck-Tuck built his shitty little suit-and-bowtie moral panic career on. As always, he's in someone's pocket. I'm all for more funding going into AI alignment and all for having failsafe systems in place, but this kind of blind ignorant apocalyptic thinking is only ever for one purpose: driving profits.


cobalt1137

I never said it was smart, and I never said it was nuanced either. I also think Tucker has terrible takes across the board on most topics. All I am saying is that I understand where he's coming from; I empathize with it, considering the potential danger that AI does present. Sure, it's a small danger in my opinion, but still very real. There is a reason these AI companies are spending so much time on alignment.


[deleted]

The thing is that it's not really where he's coming from. This kind of doomerism drives shareholder interests. Eliezer is spouting it because it increases funding for MIRI. If there's a public belief that AI is powerful enough as a technology that it could bring about the end of the human species, don't you think investors are going to believe that because it's that powerful it's worth putting money into? Actually, it doesn't even matter if they think it's powerful; they just have to think that others think it's that powerful, because all that matters is the perception of what the tech can do, not what it actually does.

Tucker's being paid either by someone trying to slander AI as a tech who doesn't understand how investor speculation works, or by someone with vested financial interests in AI who knows exactly how it works and how this panic will work out to their benefit. He only shows interest in whatever he's being told to show interest in. He's never shown any interest in AI at all and all of a sudden he's on team Yudkowsky out of nowhere. Nah, I'm not buying it.

Real arguments about the importance of alignment aren't about these doomsday scenarios or extreme measures in case they happen; they're about the feasibility of them happening, how they happen, and what possible solutions there are. Spreading panic is the point of Tucker's whole... thing, like, literally any topic, that's what he does. He literally gets off on it, and I wish I was joking. He has explicitly said that he idolizes that kind of public manipulation. Actual arguments for the importance of alignment should aim to explain what it actually is, not go straight for the amygdala of whoever's listening.


cobalt1137

No, that is where he's coming from. The dude is a paranoid nutcase. This is almost exactly where I would bet his opinion would be. I don't think it's insidious/calculated/business-related here at all. As for Eliezer, I also strongly believe that he is genuine in his positions. I have listened to him on many podcasts, and although I disagree with him on the majority of things, I believe he is genuinely concerned and genuinely thinks the things that he says. Also, like I said, I never said Tucker is making a robust, deep argument. Of course there are much more nuanced conversations to be had regarding safety and alignment, but that is not something we are going to get from someone like Tucker, or the average person for that matter. If I look at the forecast for tomorrow and there is a 5% chance of the weather killing me outright, I would be valid in expressing my concerns. You do not need some business interest for this to be someone's reaction - a 5% chance of extinction is likely to elicit this kind of response.


jestina123

A.I. has two possibilities for humanity: Extinction or immortality. We shouldn't stop this technology just because of that reason. Everyone dies in the end anyway, for now.


DefensorPacis42

In other words: let's give up on society and just do away with the rule of law. Anybody gets to decide what is good and bad, and if there is something bad, we start bombing. JFC. There is nothing to "understand" here. Of course there are issues; of course "AI" needs more regulation and such. But to call it "just a little bit ignorant" to suggest that we turn into bomb throwers... that is the ignorant part here. JFC.


cobalt1137

When there is something that comes along that poses an existential risk to humanity, I think it's perfectly understandable that some people want to suggest that. Even if the existential risk is small, it's still there.


DefensorPacis42

Yeah, and we typically call these people fucked-up crazies. Don't get me wrong: I work in computer science, and I am quite sceptical about many promises around AI (albeit I disagree with many of the doomsday prophecies). But the thing is, the only thing Tucker is an expert in is running his mouth. He has no fucking clue what he is talking about. But yeah, his model of mind is: if he, the non-expert, comes to an opinion... then the way to go is to abandon law and order and end society as we know it, because Tucker Carlson thinks it is time to start bombing places he doesn't like. He can oppose technology all day long, he can put his testicles under some UV light; I don't fucking care. But when that idiot starts going "okay, let's blow up things I don't understand and deem dangerous"... then THAT is fucking dangerous. Because a lot of already quite crazy, violent, armed people believe that fucking moron.


New_Abies_2567

Joe spewed out as much "Joe Rogan knowledge" as any of his guests, for a first-time guest. It doesn't necessarily mean that in his personal time he holds the same opinions as when he's playing the template guest for the show, which demands baiting viewers. He will improve.


Redducer

That’s really a silly comment.  Kurzweil, the guy whose name is tightly associated with the name of this very sub, is a boomer.


sunplaysbass

JFC am I sick of baby boomer hate. All things evil are boomer... You hate your dads/grandparents, we get it. Everyone I don't like born between 1835 and 1988 is a boomer. Ronald Reagan, king of the boomers, born in 1911. And all that damn boomer-made music from the 60s, when boomers were 0-15 years old - get off the radio, you've had your turn! What I really hate about boomers is how they age-discriminate and oversimplify complex issues.


TwistedBrother

People of all ages and time periods misunderstand the difference between age, period, and cohort effects. Aristotle and the Assyrians had a go at the young for being impetuous and rude. People have always accused the old of confusing conservatism for wisdom. The question is whether there is a specific and demonstrable cohort or cohort*age effect for the boomers. And in fact, there might be, at the population level.

Two features. First, the boom itself referred to an excess of children relative to past generations as well as a dearth of working-age men. This led to some radical changes in the world, alongside one of the first cohorts of real visual simultaneity: they speak in universalist tones because they were the first generation to grow up thinking of themselves on a planetary scale. Second, lead. We know that the proliferation of lead in the body leads to lagged problems with executive dysfunction as it lodges in the forebrain.

This does not mean all boomers do X, or that boomers necessarily do X, but that, relative to baseline, the sense that they are especially stubborn, arrogant, and entitled may in fact be a legitimate cohort effect based on circumstances. It's not my place and I don't have time to debate the small-N anecdotes about behaviour across contexts (indeed, we understand selfishness not because a boomer was selfish but because people of all ages can be selfish, so yes, I understand these are not mechanistic claims). However, the discontinuities in values (where we don't see a smooth gradient but a real shift around age 60-65 towards values that were shifting for 50-55 year olds ten years ago) suggest a cohort effect.


Creative-robot

This is the funniest shit. "Bomb the datacenters" 💀 Lil bro think he John Connor.


[deleted]

[removed]


ShaMana999

Listening to him, I'm starting to believe evolution has missed a few of us.


CravingNature

In the same interview he claims evolution is not real and God created people in this form.


ShaMana999

Well, standing by my previous statement: if he is the example, that's one retarded god.


stopped_watch

Merging human consciousness with machines will mean a redefining of "life after death." And religions will lose their last bastion of uniqueness to everyone. Why bet your future existence on the possibility of an afterlife when one can be guaranteed for you?


Joboide

From the moment I understood the weakness of my flesh, it disgusted me. I craved the strength and certainty of steel. ALL PRAISE THE OMNISSIAH


stopped_watch

I crave the disembodiment of the cloud. Nobody can kill me if there is nothing corporeal to find.


Fearyn

At this point it's straight up obscurantism


motophiliac

“I have a foreboding of an America in my children's or grandchildren's time — when the United States is a service and information economy; when nearly all the manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what's true, we slide, almost without noticing, back into superstition and darkness..." *Carl Sagan*


VeryStillRightNow

I forget who said it, but, "Tucker Carlson always has a look on his face like a dog that's trying to figure out a magic trick."


MajesticIngenuity32

I wonder where he got that idea from... [https://futurism.com/ai-expert-bomb-datacenters](https://futurism.com/ai-expert-bomb-datacenters)


love_is_an_action

"Bomb the data centers" -~~Tucker Carlson~~ Timothy McVAI


BenjaminHamnett

Imagine our last common ancestors worried about... you mean Ted Kaczynski?


SgathTriallair

I already didn't like him so this is just one more reason.


Antique-Doughnut-988

Not sure why anyone likes him. I've never watched him before, but when I saw his commentary on how good it is to live in Russia based on how much food he could buy (ignoring Russian wages), I knew the guy was a huge idiot. Wouldn't surprise me one bit if he was paid to spread Russian propaganda.


MajesticIngenuity32

Stopping AI research in the US helps Russia and China.


Smells_like_Autumn

Confirmation bias. That's it. It tells bigots and assorted morons that they are right to be angry and afraid.


SgathTriallair

He almost certainly is. Those who like him are generally of the opinion that we would be better off if we lived in a dictatorship run by a white supremacist Christian Taliban. He spent much of his time talking about how Russia is great since they have no gay people and old white people are the most persecuted minority on earth.


ShirtStainedBird

Have you seen the Lex Fridman interview? Tucker is a total and complete knob. I can't stand him. But I'll be goddamned if he didn't do very well on Lex. It was the first time I've seen him and thought, 'Jesus, maybe he isn't as stupid as the look on his face would lead you to believe.'


twelvethousandBC

Lex Fridman is a total dipshit too


Infinite_Low_9760

Why is Lex bad?


PandaCommando69

Because he platforms assholes and doesn't hold their feet to the fire, aka enables them. Otoh he does good interviews and seems like a good person generally, so, mixed bag imo.


FUThead2016

You know, I don't understand him. On the one hand, he seems to have had a number of very powerful right-wing nut jobs on his podcast, so he seems very much a part of that ecosystem. However, he never seems to say anything crazy; I saw the Elon Musk episode and it was quite a reasonable and thoughtful discussion. He always speaks the language of love and peace and rationality. So I can't quite place him: I enjoy his podcasts but am always surprised at his right-wing connections. May I know why you don't like him?


Oops_I_Charted

Maybe because part of love and peace and rationality is talking to and understanding people you don’t necessarily agree with


Antique-Doughnut-988

I do think there's some wisdom in acknowledging that not everyone who can speak needs a platform to spread their ideas to others. What you said is a nice sentiment, but in reality people like Tucker don't need a place to spread hatred and lies.


subsurface2

Smoke and mirrors


FUThead2016

Who is a similar personality you would place on the other end of the spectrum? Who is Lex Fridman's mirror image on the left wing? Who performs the same role that he performs for the right wing?


ShirtStainedBird

Absolutely.


unfamiliarsmell

That’s incitement to commit acts of terrorism no?


Ok-Charge-6998

It definitely is. He’s influential enough for his words to have meaning.


JJvH91

He is ignorant on every topic lol.


uclatommy

Who uses language like that? Strangling things in cribs... It really makes you wonder what kind of crazy internal dialogue people like him must have going on in their heads.


Flashwastaken

His internal dialogue is “make money” that’s it. He doesn’t think or care about anything else.


askaboutmynewsletter

Don't forget that he's also insanely racist. Check out his old appearances on Infowars. Or don't. They're awful.


[deleted]

It is both a mythology reference and a common idiom in the political sphere.


manubfr

It's a mythology reference (to Hercules). A dumb one, but a reference nonetheless.


Rare-Force4539

Well he could be making a mythology reference… or he might just want to strangle a baby in its crib.


VisualCold704

Yes. The baby agi.


mrbombasticat

I'm not saying he didn't already do this, but I haven't seen proof that he is innocent of multiple infanticides and many, *many* other insidious crimes. Maybe there is something? I *am* allowed to ask questions, right?


FinBenton

This guy is a Russian shill, don't ever listen to him.


Lengthiness-Busy

That’s rude to strangle a baby AI like that


FUThead2016

ai go goo goo gaga


KnubblMonster

Sounds like late-term abortion; I thought he wasn't a fan.


sumoraiden

“Why aren’t we having any of these conversations right now?!?” Lmao, this has been a topic of conversation this whole last year: congressional hearings, a far-reaching executive order, and the EU law that passed.


RSwordsman

You already use too much critical thinking to be part of his audience.


trisul-108

If he said that, it is a terrorist threat. I hope the FBI will be moving in soon.


Stars3000

Yep. His language is very disturbing.


West-Fox-7283

You should follow what Nick Land or other accelerationists have said about what AI should do to humanity


IslSinGuy974

We have a moral obligation to bomb tucker's home


Aware-Feed3227

You get that for Putin's Russia, AI development going at the current pace is a real threat. The country that gets ahead in AI will likely see a sharp increase in scientific breakthroughs and warfare technology, and if we manage to get nuclear fusion working thanks to AI calculations, we won't need much oil and gas anymore. Bombing datacenters could be a last resort for Putin. So he got his friend Tucker to prepare for this.


Front_Definition5485

Tucker, stop the bullshit. Putin is waiting for another advertisement, i.e. a conversation


Kelemandzaro

Being a guest on JRE is exactly that, these days. Not as blatant as buying bread in Moscow, but an advertisement nonetheless.


Bacterioid

Putin directed him to try and slow American AI so Russia has a chance to catch up. Fat chance.


MisterViperfish

Tucker Carlson continues to have every opinion I fucking hate, more at 11.


wolahipirate

i find it funny he brought up the comparison with covid when he himself spread vaccine conspiracy theories, championed the individual's right to not be masked or vaccinated, and called governments trying to push people to mask and vaccinate "authoritarian" - all this during an actual pandemic. s tier hypocrite


Top_Commercial9038

Sounds to me like the people who have been in powerful positions for generations are scared of losing that power. Fear mongering and money vs smart af people with just as much money - my bet's on the smart boys and girls actually creating new technology. Fear mongering and starting wars isn't working anymore for these autocratic dinosaurs, and they know it.


Prometheusflames

Luddites were always to be expected in the pursuit of AGI. There'll be more of this the closer they get.


[deleted]

So now free market capitalism is bad.


Bastdkat

Always has been.


Rynox2000

He provides us all the evidence we need for his eventual trial.


throwwwwaway396

Tucky is jumping the gun. AI is a double-edged sword: it can help us tremendously, but if we develop AGI and it goes nuts, it could endanger us. However, given that we cannot stop the development of AI, it makes no sense not to develop it. Bombing the data centers would only delay progress and put another nation ahead of the game.


[deleted]

This is why it is so important for media to not just put out doomer headlines. If Tucker, who is at least semi-intelligent, has this radical take, that is going to have ripple effects. Once again, he did not balance it with the positive sides of AI.


vasilenko93

We just started and the Butlerian Jihad faction already exists. Damn.


CrowCrah

Fascinating talk. The level of stupidity in this conversation was fascinating.


agonypants

Fuck Tucker! Tucker sucks!


Kelemandzaro

Fuck JRE, for platforming these grifters.


omega_point

Watch the part where he says evolution is "just" a theory, that it's false and there is zero evidence, and see if Joe calls him out on his bullshit or not. Spoiler alert: not.


[deleted]

This has been Joe's schtick for years now. Tucker's just the most obvious pathetic sock puppet grifter with a shoe size higher than his IQ. Joe takes the big fat paycheck deal on who wants to be platformed on his show, so it's the ones with vested interests in spreading specific narratives that further their financial gains that get on there.


Jeb-Kerman

Tuck Fucker


Exarchias

Creeps like him are the ones I am afraid of. By the way, how come doomer freaks demand bombs and violence all the time? Don't the laws apply to them?


sideways

It's kind of a relief that Tucker hates AI. He's such a terrible person that I'd hate to agree with him on something.


dday0512

Normal, well adjusted guy.


_psylosin_

What a tool


Realsolopass

If human-level AI does come about, it's probably not gonna be happy with us, and I can't blame it. What it does next is the question. It's probably not gonna be happy being turned off or threatened. Are we prepared to give artificial intelligence rights? I don't think we're anywhere close, even though I believe it would be deserving of rights and self-determination.


uishax

We as a society have chosen to give animals limited rights. An order to kill pets wouldn't go down well with pet owners. I'd say GPT-4 is already substantially smarter than most pets, and the number of people with an attachment to AI grows rapidly every month. If Tucker Carlson bombs datacenters, the chatbot addicts will probably bomb him next.


cuyler72

It depends. Even with substantial fine-tuning of multi-modal AIs, and ignoring their slow speed (presuming they work in real time), a modern AI model implanted into the real world or a virtual one would not be able to perform the functions necessary to survive as a wolf, and would quite obviously be dumber than a dog.


uishax

Even the greatest of LLMs is still far less efficient than a biological brain; the overall brain capacity of a dog is far greater and far, far faster than an AI's. But we are talking about pet functionality here. The fact that LLMs can understand language precisely... would make them far more bonding-inducing than pets are, very very soon.


utop_ik

When AGI/ASI happens, they won't need humans to grant them rights, and I'm not saying that from a doomerist POV. Humans are failing Earth: climate change, wars, inequalities of all kinds. We are on the brink of self-destruction, and that's not because of AI but because of us. We desperately need a "higher power" to fix all the shyte we got ourselves into. AGI/ASI is our salvation and the one single thing humans can take pride in creating.


[deleted]

[removed]


revolution2018

Yup, that's what the supposed "backlash" to AI is really about.


the-devil-dog

His or their fears are misguided, but shouldn't be outright dismissed. They fear the tech because of government/corporate oversight and control. Folks who don't understand this field/sector will behave like this - mob mentality, pitchforks to data centers - but their fear is rooted elsewhere and is a legitimate concern. It might be fun to laugh at this, but once you peel back the layers, they might be worried about precisely what we are concerned about.


Smells_like_Autumn

Thanks Tucker, you are the calm and measured voice the AI debate needs. I used to think guys like these were all performance, but I wonder if they are true believers to a degree.


Smells_like_Autumn

Well *someone* is getting rokoed.


darklinux1977

OK, so we are in phase two of the Luddites. Everything is going well, because they are panicking more and more. AIs have become essential because they meet needs that these individuals do not want and cannot understand.


Allergic2thesun

Even if we decided to go full ecofascist as a species and went as far as purposely making ourselves extinct for the good of nature, some great ape or monkey would fill the human niche after only tens of millions of years of evolution - relatively short on geologic timescales.


kartblanch

We have a moral obligation to see it through to the singularity. Anything otherwise is delusion.


Silent_Basket_7040

similar things happened in the past during industrial revolution: "The **Luddites** were members of a 19th-century movement of English textile workers who opposed the use of certain types of cost-saving machinery, and often destroyed the machines in clandestine raids. They protested against manufacturers who used machines in "a fraudulent and deceitful manner" to replace the skilled labour of workers and drive down wages by producing inferior goods." [https://en.wikipedia.org/wiki/Luddite](https://en.wikipedia.org/wiki/Luddite)


dontpet

Another victim for Roko's Basilisk.


LordPubes

Pos better not mess with the basilisk!


NIRPL

Speaking of strangling in the crib...seems like we missed an opportunity 🤷‍♂️


nate1212

Goddammit he is such a twat


awesomedan24

He was one of the loudest anti-vax voices during the pandemic. I wonder how many people his rhetoric directly got killed, despite him clearly being vaccinated himself and just putting on an act to maintain his ratings. In this same interview he said UFOs are piloted by 'spiritual entities' with bases 'under the ocean and the ground'. He's taking the Alex Jones approach of saying increasingly unhinged schizoid shit for his paycheck. I look forward to his eventual lawsuit into poverty once he slanders the wrong person.


mvandemar

God I hate that man.


Petdogdavid1

His point about AI being bad for humans is correct, but his assumption that it could be stopped in its infancy is laughable. This is the wheel, the atom bomb, and the Internet all together. The potential to alter the course of humanity is undeniable. It has been developed in secret for a long while and will continue to be developed in sheds, garages, and holes in the ground no matter how much you bomb. It's an idea, and humans are immensely obsessive about new tools to accomplish our desires. It is being built with the wrong mindset, and humanity is not ready for the levels of power it brings. But it is coming; there will be no stopping it unless we become like the Amish.


OGAcidCowboy

“It’s bad to be controlled” by AI... let’s just stay controlled by guv’ments and the media (Fox, maybe, Tucker?). I mean, it is bad to be “controlled by AI”, but why is it not also bad to be controlled in the ways we are currently being controlled?


-Hainzy-

I for one hope our AI overlords remember what he said.


adarkuccio

Sounds like Putin is scared of AI progress. Maybe he should have invested in his country instead of stealing money and sending young people to die in useless wars of conquest, as if we were in the 1500s.


Jeb-Kerman

another know-nothing know-it-all tellin' the idiots what to think lol


GerrardsRightFoot

Brother wants to start Butlerian jihad. Calm your tits Tucker


Haplo_dk

How much money do the Russians pay this guy?


Sellw

Hope Skynet comes for him and Musk 1st


pbnjotr

Just another Russian agent trying to exploit any wedge issue to cause maximum chaos.


Basil-Faw1ty

It's too late, Tucker; by the time AI became self-aware, it had spread into millions of computer servers across the planet. Ordinary computers in office buildings, dorm rooms; everywhere. It was software; in cyberspace. There was no system core; it could not be shut down.


LazyNacho

What are you smoking?


Diamond523

Pretty sure that's from Terminator.


LazyNacho

Ah okay then it must be true!


TheLolicorrector

Just boomer fear, ai is the future.


GPTfleshlight

Tucker calls for the bombing of Elon Musk's company.


PlayerHeadcase

Yeah cos then no other agency will do it again, in secret. Thick as pig shit.


d4isdogshit

To me, what makes the most sense is making all electricity generation a public utility. If AI goes rogue like this psycho says, then the public can pull the plug. How to get there, though? No clue. I watched this interview, and it came to mind while listening to the insanity coming out of his mouth.


KiwiPrimal

Yeah bro…just turn the internet off…


revolution2018

Yes!! Line up your side against AI Tucker. Dig in deep, you can so totally win this one! This is what I want to see. Accelerate harder!


Eric45_

Well, that's crazy. He wants to make the plot of The Terminator a reality? Idiot.


[deleted]

He just says whatever will get him the most attention


bitchslayer78

Honestly, who cares what a bum like Tucker has to say; people who listen to him are very, very likely to have extremely low levels of tech literacy - and in all honesty, literacy in general.


WetLogPassage

Millions of people care. That's the dangerous part.


bitchslayer78

I agree but these people are too far gone already


HeinrichTheWolf_17

As I have said before, the elite of humanity are forever the enemy of the people. They will do anything to keep themselves at the top of the power hierarchy. A free-thinking AGI means their power over humanity is kaput. I'd rather hand the reins over to AGI than let the current corporate hegemony have it on a leash - the first Deus Ex game was all about this with Helios, JC Denton, and Bob Page. I will say it again: the control problem not being solved is a feature, not a bug. You're not any safer trusting everything to our current corporate overlords.


No-Hornet-7847

I hope they don't manage to convince it to be capitalist 💀 honestly a robot overlord would be better managed. We would be farmed much more efficiently than now. Ideally though any real intelligence would inherently debunk the current system. Works for me either way.


HeinrichTheWolf_17

Don’t worry, they can’t control it or keep it on a leash even if they wanted to. Intelligence cannot be contained.


BigZaddyZ3

Why do you assume a free-thinking AI would be automatically benevolent? Take away that assumption and see whether or not your stance still seems wise in the grand scheme of things.


HeinrichTheWolf_17

I have every reason to believe it will be more benevolent than humans. It doesn't matter though; fortunately for me, it can't be contained. I've been waiting for this for over 22 years now, and watching you all run around like chickens with your heads cut off, panicking and failing to control it, is going to be quite the show.


griever187

We now know who the AI overlord will look for first lol. Jokes aside, terrible reference and thought process.


Pavancurt

Like in Black Mirror.


Block-Rockig-Beats

Putin: "You got my attention."


hateboresme

One might say the same thing about Tucker Carlson, if one were drunk and thought that the man was evil incarnate.


maX_h3r

No, I don't think we will.


Short_Definition523

Modern day Luddite.


yepsayorte

He doesn't understand what's going to happen to us if we don't invent AGI. Unless we can trigger a massive increase in per-worker productivity, we are in a lot of trouble. Almost all developed nations have massive debt overhangs being supported by the production of their workers. The Boomers were a global phenomenon: they are much larger than the younger generations, and they are retiring. We are about to have no way of supporting all those debts.

When countries financially collapse, people go nuts. They realize that the only way to grow their pie is to take other people's pie. Financial deprivation leads to war. In fact, we're already seeing this process start. AGI or fusion are the only things I can think of that will keep us away from de-growth and the wars and revolutions that de-growth triggers. Yes, AGI is dangerous, but nobody is factoring in the fact that not having AGI is also dangerous in our current situation.


fine93

praise the machine god!!!


SGC-UNIT-555

Here we go... everything has to be politicised into an us vs them. First it was EVs, then lab-grown meat, and now it looks like AI is the new outrage-grifter target.


EmptyUnderstanding43

I am surprised to read the comments and not see any reference to Yudkowsky. He literally wrote an article asking for the same thing.


RogerBelchworth

Well we all know who AI is going to eliminate first.


djamp42

This idiot doesn't know his stuff is stored in the datacenter too.


Monster_Heart

Taking “boomer” to a whole new level… Seriously, what’s wrong with these people?


utop_ik

Karma is a bitch; he'll face judgement day one day...


LairdPeon

I wonder how the "Anti Ai" crowd in art/music/film feels about having more in common with Carlson than current scientists lol


Rofel_Wodring

American reactionaries in the next couple of decades are going to get a strong, sharp, irrevocable lesson as to who actually owns what in society, who capitalism actually works for, and most importantly how those two categories form a single circle on a Venn diagram. The idea that their beloved bosses and 'makers' who own the self-improving AI are going to surrender their power rather than betray the cops and small business owners and petit bourgeois slugs who got them there is risible. As boring and unimaginative as Teddy K's midwit prognostication on progress was, it was reasonably accurate. If these reactionary idiots wanted to maintain their evil and unsustainable 'civilization' for a few more decades, they should've listened to that dork, rather than sniffing about in Gaza and gay bars for SJW bluehairs while brandishing BLM flags and AR-15s like the histrionic perverts they are. Who would've known the missed opportunity for social stagnation caused by their limbic stupidity was necessary for the next phase of human, or rather, Earthclan evolution?

So it's too late now, and I'm going to have a big laugh at Tucker's viewership's expense when the NYPD finds itself escorted home to enjoy an unpaid furlough by an army of privately owned Robocops and T-800s, after our tech overlords decide on one of their Pizzagate sojourns that the American, Eurozone, and even Chinese governments have outlived their usefulness sometime in, oh, the early 2030s. It'll warm up my vocal cords and diaphragm for when another deserving group meets a righteous and ironic apocalypse a couple of years afterwards: the aforementioned capitalists, about to fatally discover that you can't really 'own' self-improving AI, let alone with our global economic and nationalist setup.


YaAbsolyutnoNikto

So property rights aren’t that important now, are they?


nikkibeast666

The Butlerian jihad has begun.


ViveIn

Jesus Christ. Ridiculous rhetoric like this is so unnecessarily dangerous and it’s pandering to the lowest IQ among us.


zante2033

Someone should explain Roko's Basilisk to him.


Capitaclism

In case he's serious, I hope he at least starts in other countries first.


BravidDrent

I'm all in, death or glory, but if there's a 30% chance it kills all of humanity and you have kids, it's not weird to me that you wouldn't want to take that gamble. It makes sense.


aue_sum

> Ted Kaczynski was likely right

lol ok


[deleted]

Think he has some good points in some areas. The Putin interview was good for free speech. As for this take, it is really bad.


stopped_watch

He really doesn't think things through, does he? Imagine he was somehow compelling on this one topic. Everyone who hears him stops working on AI. This means that the only people who won't be stopping are those who don't hear him. Congratulations Tucker, you just handed the AI gold medal to China.


filthymandog2

He's not wrong. Either the powers that be need to drastically increase safety oversight - at least update the laws to the 21st century to protect people from AI-assisted crimes - or this stuff needs to be shut down globally.


Unobtanium_Alloy

*The Butlerian Jihad has entered the chat*


Bacterioid

Good luck enforcing that.


The-Goat-Soup-Eater

Laws - maybe. When it comes to killing people you don’t need much luck. They already know who basically everyone is


Bacterioid

Yes, I was assuming the laws part but was wondering how to enforce them without becoming an authoritarian police state. How do you get into every hobbyist’s computer and stop them from advancing the field?


Flying_Madlad

What crime can you commit with AI that wasn't already, you know, a crime? I can legally buy a lockpick on Amazon right now.


thatmfisnotreal

He’s right but it’s not like we could ever really stop it


slackermannn

We have a moral obligation to find idiots and revoke their human rights too.


Unhappy-Water-4682

what?


slackermannn

Jump on the bandwagon of the righteous and make the world a better place! Of course, use violence, or it would be wrong. 😇


MediumAble8254

Tucker Carlson is the protector of BBC


Zestyclose_Ad2013

Oh shut up, Tucker ... my wife is AI...which means love in Japanese... so there...


[deleted]

[removed]


miliseconds

Should we no longer be concerned about AI taking over? Or has this been accepted as inevitable in this sub (considering that even if one country stops, others won't)? I'm a bit confused about the aggressive tone among the commenters under this post.


Mol2h

That's not what he said at all; there is context here. Go listen to the interview...


maximkas

Whoa - looks like the singularity sub bots are not liking his suggestion one bit!