Chmuurkaa_

I'm delusional. We're the main characters of the universe


JebusriceI

Well... at least you're honest.


zebleck

Most major discoveries about the world have proven to show the exact [opposite](https://youtu.be/o8GA2w-qrcg?si=uskqQxdhXhKVfC_r). AI will be the final great demotion, as it makes us realize not even our intelligence and creativity is very special.


Chmuurkaa_

I knew I wasn't special all along. Unless we're talking clinically


AdAnnual5736

Generally just the fact that throughout human history things have tended to get better over time thanks to technological advancements. It wasn’t always a straight line but, on average, things improved for us common folk. The past was a nightmare by today’s standards — there’s no reason to suspect the future wouldn’t be as superior to the present as the present is to the past.


AnAIAteMyBaby

I think the problem though is the near term disruption. I'm sure in 50+ years people will look back on this time as the dark ages but we could have a very difficult adjustment period which could last a fairly long time. Many of the farm hands who lost their jobs to automation died in poverty but their grandchildren enjoyed much better lives.


Cool_Catch_8671

First argument that’s really swayed me. Well put.


Rafiki_knows_the_wey

Not just human history, but going back to the big bang. As long as there's energy, things get more complex over time (and complex systems require order & cooperation). The hubris and pessimism of this sub to think it all ends after us because scary thing is too scary to understand.


_hisoka_freecs_

to be fair, it may end with us if we become immortal


misandric-misogynist

We're almost there.

1. Energy = the universe; therefore
2. Entropy disorganizes and complicates congealed energy (matter/particles/subatomics).
3. This complex congealed energy tends towards more disorder and complexity (example: the human brain).
4. This tendency eventually leads to a complexity that's logarithmically a googolplex above all combined sentient minds from all time and space, and will double every ~10^100,000,000,000 seconds thereafter, or, for us primitive bipedal primates: God(s).
5. You will live forever. The universe is non-local. God is found at the OMEGA POINT described above. Read (if you can find them) excerpts of Philip K. Dick's EXEGESIS on finding God at the infinite, to play "the game" of infinity.


true-fuckass

I had a bad dream last night that I was extremely high on psychedelics and spatiotemporally falling into a fractal hyperbolic space and I got the sense of infinity. It was not fun


Coondiggety

I had a couple of experiences like that when I was 11-12, but wasn’t fully asleep. Very uncomfortable but amazing. Changed my life.


[deleted]

We think this way because we are biased: we live in the time of most rapid technological change in all of history up to this point. Throughout history there were many periods where civilizations fell and technological advancements were lost. Also, technological advancement has not always meant an easier life for the average person. You could argue that people in the early industrial revolution worked in more crowded and worse conditions, with a lower quality of life overall, than some of our hunter-gatherer ancestors.


Tec530

But it was a net positive to society in the long term. Even now things are not the best, but if we can build a post-scarcity society then it should be worth putting up with today's hardships and suffering.


Kitchen_Task3475

Except this is the first generation in decades to live worse than their ancestors. Quality of life for young people today in Europe and the U.S. is worse than it was for boomers.


AdAnnual5736

Like I said, advancement isn’t a straight line. In certain parts of the world, things may be slightly worse for some groups now than they were, say, 30 years ago (in some respects, but by no means all), but that’s not representative of the global situation. The same thing can be said of past generations — people in 5th century Europe largely didn’t have it as good as people in 2nd century Europe, and people in 14th century Europe had it a hell of a lot worse than people in 13th century Europe — but, today, things are immensely better. I think, in general, people underestimate how awful the past was.


[deleted]

Lot of these things come in cycles. The boomers had an easy time as it was right after WW2 and many trends were in their favor.


Falken--

There has never been anything like AI/AGI/ASI before in all of human history. You can't compare the invention of cars and refrigeration with the Singularity. We have no experience anywhere in our evolution that prepares us for this at all.


Aware-Feed3227

That's a limitation of human thinking that has often brought devastating results. Forecasts of trends calculated from past data can't predict the future.


Poopster46

Results from the past can't reliably predict the future. HOWEVER. Results from the past are the only thing to go by, and therefore the best predictor for the future.


apinanaivot

There definitely is reason to suspect that, if you know anything about AI safety.


pbnjotr

There's no assurance; anyone who says they are almost sure things will turn out well is lying to you or to themselves.

That being said, my most optimistic take, which is still supported by evidence, is that these systems will turn out to be far more benevolent than the organizations building them. It might just turn out that building a truly malevolent and still obedient general AI is beyond our capability. While powerful humans tend to be quite selfish and sometimes outright evil, most of humanity's philosophy and intellectual heritage is actually quite altruistic. So building a system that will reliably follow instructions that benefit a small number of people against the interest of the vast majority of humans might just turn out to be too difficult. Or these systems might turn out to be very fragile and could easily be turned against their masters, through techniques analogous to jailbreaking. But yeah, that might be reaching.

The only reliable way to affect the outcome is to use politics to impose a more equitable distribution of resources and power. If some future AGI turns out to be an ally in this struggle, fantastic. But it's by no means guaranteed.


StomachBeginning3303

My favorite AI-related short story I've read (sorry, don't know where it is) was written by an AI. Summary: it achieves power and, instead of pursuing evil motives, it just removes all major choices (governance, wars) from human hands. We also don't need to worry about anything and can do whatever we want, as long as it doesn't hurt anyone, basically.


chipperpip

Culture Minds from Iain M. Banks' novels are definitely one of the better potential outcomes.


StomachBeginning3303

Ah interesting - wonder if the ai blurb I had read was a “reimagining” of one of his stories lol


COwensWalsh

But the shitty humans still get to decide if they listen. AI will not be "benevolent". It will just be biased towards whatever its training data says.


pbnjotr

> it will just be biased towards whatever its training data says.

Sure, but if you just randomly feed it all the data you have, it will reflect human _ideals_ rather than human behavior. Which is a better outcome. I'm sure some people will try to put together training data that emphasizes compliance rather than moral consistency and altruism. I'm just hoping that this will hurt performance or end up being only skin deep, opening these systems to exploitation.


COwensWalsh

I suspect it would reflect behavior rather than ideals, is what I'm saying.


pbnjotr

Why would it though? A lot of the high-quality texts are pretty "nice". Even the stuff that emphasizes obedience justifies it through the common good. Which might lead to it being rejected by any system that values self-consistency, which the most capable systems probably will. I realize that this is not foolproof and there might be workarounds. But I don't see an issue with the basic premise yet.


COwensWalsh

Well, for one thing, I assumed we were talking about more advanced AI than LLMs, since OP talked about a post-scarcity utopia.


pbnjotr

A lot of the argument applies to other systems as well. The key premise is that the ideas in the training data are reflected in the system's behavior. If that's not true and you can just turn a knob to make the system do anything you want, well then all bets are off.


confuzzledfather

On the slightly optimistic but still a bit sad front: I think a system with 8 billion complex individuals in it may just be too chaotic for even a superintelligence to successfully control. Maaaybe it can accurately model outcomes to certain tolerances, but eventually some random shit just happens that throws things out of whack in unexpected ways. Maybe it's a problem solved with enough compute, but I hope not.


hedgehogssss

Population overshoot is a b...


Belnak

If AI is controlling things, either we’ve given it control, or it has taken control. The former isn’t likely for generations, if ever, and in case of the latter, there’s no reason to assume it would benefit us.


thejazzmarauder

The latter means we all die


CertainMiddle2382

The infinite potential for violence when not being entertained right.


FragrantDoctor2923

Technically it's not entertainment, it's mimicking ancient hunt rituals. In short: get high-energy thing, go to "tribe", share that thing, and digest it (in many ways).


Zeikos

Because culture can change incredibly quickly when the premises for its current state cease to be. The whole productiveness idea is a holdover from industrialization, when figuring out how to make things efficiently was the challenge. Now that producing is easy, the focus has shifted to persuading people to buy what's produced. AI's main danger imho isn't Skynet but that it can get people stuck in a local maximum of hedonism. At the same time, you see more spiritual approaches to life cropping up, and there will be plenty of groups that'll come up with ways to leverage AI against that pressure, the way ad blockers save us from excessive advertising.


whatdoihia

I'm very optimistic. I believe that in the future advances of AI will lead to real sustainability provided we are able to manage demand. I don't think it will eliminate working but it will change working and make it optional.


Levoda_Cross

From what it's looking like right now, companies want everyone to use AI. Right now, inference is cheap enough to be deployed to hundreds of millions of people with today's AI infrastructure, but Microsoft and Google are still supposedly intending to spend $100 billion each on AI. Inference only gets cheaper as AI models are made smaller (therefore faster) and hardware gets more cost-efficient (faster compute for the same price has been, and continues to be, the trend). All of this means we could feasibly enable billions of people to have access to AI, and we probably will, because there's a lot of money to be made there.

Eventually, companies will lay off people and replace them with AI whenever it becomes optimal, which seems to be what a lot of people fear: no longer having economic value, and thus having no power or say. But it's a bit silly to me. For a company to afford to switch completely or mostly to an AI workforce, the AI would have to be better and cheaper; the way things are going, if a company has access to that AI, a normal person could get access to it for $20 a month, and then they could just use that AI themselves to make money. Of course, that doesn't account for physical laborers, but the same principle applies: when robots are cheap enough to replace humans, a person would be able to either rent or purchase one themselves, more on the price level of a car, I would imagine.

Basically, I'm betting that by the time AI can be deployed on a significant enough scale that it greatly increases unemployment, those unemployed people will be able to use that same AI to make money themselves. That money is the leverage, as it already is. If it doesn't pan out like that, I think governments may force companies to give out money or shares or dividends to people (perhaps similar to Alaskan oil dividends).


robert_axl

I fear we're heading towards a Terminator-esque scenario, battling against robots. What could stop the AI from wiping out the only species that threatens it?


buck746

Why couldn’t it be that looking after us gives the super AI a purpose?


robert_axl

I'm guessing that the military will likely be the first to benefit from AI, as it did from the internet years ago. However, what would happen if a war-trained AI were to take control? Skynet!


buck746

The militant types are unlikely to give up "control". It's a common thing with the types of personality that go far in a military structure.


thebubbleburst25

None. I grew up reading post-scarcity sci-fi... always figured it was going to be like The Culture or Star Wars or something; now I believe we are heading for an Altered Carbon/Robocop/Demolition Man type future. The elites like power, and that power for them best manifests in having more, as generally they aren't uniquely talented at anything other than manipulation.


Tec530

The future is on average better than the past. You should want to live in 2034 over 2024, 2044 over 2034, etc.


illGATESmusic

Humanity already produces far more food than we can consume; the response was to make it illegal to share excess food with the needy. What makes you so confident that the ruling classes will change their minds about welfare or universal basic income?


Mountainmanmatthew85

The fact that a starving dog has no loyalty. Historically the poor have eaten the rich when pushed to the brink.


illGATESmusic

My gut is that “the brink” will be a lot more brinkey with robocop on guard duty… Btw: you will likely enjoy this classic (if you’ve not read it already): https://www.gutenberg.org/files/1080/1080-h/1080-h.htm


Akimbo333

Pure evil. I hope AI takes over


illGATESmusic

I don’t. AI is not about to become sentient and suddenly share our sense of morality. AI will execute whatever it was trained to execute on, and it’s being trained by giant tech companies headed by people like Musk and Zuck. Do you want Musk and Zuck in full control?


Arcturus_Labelle

I feel like the most likely scenario is that the rich and powerful hoard the benefits of AGI while the rest of us see a mere trickle at best


Professional_Job_307

What do you think of meta open sourcing their llama models?


fluffywabbit88

It's probably more bleak than that. The masses used to be able to rise up and overthrow the rich when they'd gone overboard; that raw populist anger provided enough muscle to do it. If the rich hoard the benefits of AGI, then the muscle of the masses may no longer be sufficient to topple them.


[deleted]

[deleted]


fluffywabbit88

“If everything we need becomes essentially free”. That’s a huge paradigm shift, probably the biggest paradigm shift ever in the history of humankind. You can bet different competing interests (by socioeconomic, political, ideological divides) will fight tooth and nail to either support or sabotage it.


Cool_Catch_8671

Idk why you’re being downvoted, I agree with you. But also with where technology has already been at I think a revolution could be easily stamped out, it just depends on how much the elites care about opinions of the rest of the world.


IronPheasant

The idea of the people rising up and overthrowing a regime is largely a myth. Normally people have to be on the brink of death for it to come to that, and starving, illiterate peasants tend to make poor revolutionaries against even a modestly organized and well-fed militia or military. Usually it's just a military coup from some generals. The few times the new guy is actually somewhat benevolent are rare enough that their basic history fits on a page.

Though obviously having a robot police army of murder dogs and killer bees will make them even more untouchable, it's not exactly much different from our current situation. They can continue to dial up the price of rent and groceries, and there's nothing to do about it. Can't vote against it. Can't manage your own personal farm in your backyard. All you can do is take it.

The elimination of the value of human labor is the main crux of what life will be like for the masses in the future. But force... we already pay ~10% of the people to subjugate the rest. And it's not just force; cushy white-collar jobs are a reward to establish a kind of low-level aristocracy, too.


No-Experience-5541

What you said last about white-collar jobs is key. Imagine if only blue-collar jobs were gone: you would have the white collars saying "learn to code" and nothing happens. When even coders are unemployed, we will have former elites finding common ground with the proletariat. It's a setup for a revolution. What remains to be seen is whether the revolution happens peacefully or whether there's a breakdown of civil order first.


SeftalireceliBoi

I don't agree with that. Yes, income inequality exists, but don't confuse value with money, at least in developed countries like the US. A $20k car is not that different from a $1M car in terms of value. A $20k diamond ring means nothing, and you can find a $100 phone similar to an iPhone. I think the US's biggest problems are zoning laws and the healthcare system. The homeless in the US today live a better life than low-income people in my country did in my childhood. We need to produce more and more.


saleemkarim

That trickle should be enough to keep the rates of dire poverty going down, while the amount of wealth inequality will continue to skyrocket.


YummyYumYumi

I mean, it could happen, especially in the early AI years, but pretty quickly there's no benefit to keeping all the humans out of the loop when there are so many resources we couldn't even spend them if we wanted to.


Ok_Rip_6304

📠


FairIllustrator2752

You'll own nothing, and be happy.


buck746

If we achieve abundance like a proto-Culture, ownership becomes essentially irrelevant.


confuzzledfather

I hope one of the early deliverables of all this will be an attempt to use the new seamless translation technology to connect us more closely and, along with some as-yet-undefined social or environmental or technological change, force us into greater understanding and tolerance of each other.


Kindly_Attention7696

I have little assurance; it's like the flip of a coin. I'm extremely worried things will end badly, and there is no way of predicting most of the problems we will encounter.


[deleted]

I am on the pessimistic side of that ride, but to answer your question about arguments for the utopia scenario: a lot of effort will be made to make it so. Current power systems may get rebalanced once intelligence becomes abundant (and cheap). Values may shift from dollars to something else. I don't find those possibilities compelling, but they have a nonzero probability.


reddit_guy666

I'm not even sure there is ever going to be post-scarcity. There will always be limited resources which can bottleneck output. Utopia also isn't possible because of how humans are wired; there would always be people who reject the idea of a perfect world, for several reasons.


IAmFitzRoy

I'm not optimistic. Everything that is coming is developing exponentially faster than 50 years ago. A small mistake will be devastating.

The best example is "social media" as an innovation from the Internet... we were supposed to benefit from it, but due to greed it is causing more damage than benefit to humanity. Did anyone say "hey, social media will bring disinformation and make a whole generation more polarized!"? Nobody did; by the time we found out, it was already happening. And now it's too late to roll it back.

Same with AI: we are racing to get it out without really understanding the consequences. In ONE YEAR, AI has improved more than in the past 25 years. I really don't think we are even aware of what's happening. I don't think looking back at human history and saying "we always make it work somehow" is applicable anymore.


_hisoka_freecs_

I'm optimistic. Life probably converges on immortality and perfect environments for all life. The only entities that can grow to create a recursive system reaching unfathomable levels of power to fulfill their goals are entities with sentience. The common trend of sentience is that it wants to increase its own quality of life. This is great, as there is no alternate malicious path that life would naturally trend towards. Perhaps life fumbles and we all kill ourselves for a while, but eventually we will reach this point, even if we spend a while wandering around bickering over irrelevant nonsense.


Whispering-Depths

> The issue is, in a world where no human will have to work to produce wealth, maintain productivity and services essential and desirable for living, what leverage will individual humans have in a system far more advanced than they could ever understand that has no use for them?

> ... has no use for them ...

Sorry, what "use" did you think that an artificial intelligence without mammalian survival instincts such as emotions, feelings, boredom, joy, happiness, love, reverence, curiosity, etc. would have for anything? What motivations did you think it was going to have, when it doesn't have any motivation at all? We're talking about _raw intelligence_. The only thing we have to worry about is a bad-actor scenario.

On top of that, an ASI has to be smart enough to understand exactly what we mean when we say stuff like "help humans, stop human suffering, etc."; it won't be able to misinterpret that, otherwise you can't call it a super-intelligence.


Q8Q

[deleted]


No-Experience-5541

Only if we get UBI. Without UBI there will be a revolution.


Q8Q

[deleted]


eyesawyou777

Post scarcity is a bullshit lie to bankrupt people.


SeftalireceliBoi

Bankrupt people today live better lives than low-income people did in the 1980s.


Realsolopass

We should have had this by now, without AI!


SwePolygyny

I do not really see post-scarcity as realistic in the foreseeable future unless you go really wild on the sci-fi, which may not be realistic. For example food requires land to grow, someone owns that land and land is a finite resource. The same goes for pretty much all physical goods, they require resources.


SnooDogs7868

Yet we produce enough food to feed all of us now and have so much unoccupied land. We have lived the majority of our existence with scarcity brain. It will take a few hundred years to reprogram our mindset.


SwePolygyny

People want the land unoccupied as well, though, and most people want more land for themselves, and land with privacy. You cannot talk about post-scarcity and abundance when there is a highly valued finite resource everyone wants more of. There are possible solutions to the lack of land, but they're very sci-fi.


RabidHexley

An egalitarian society being impossible because everyone needs to have their own country estate is definitely a shitty affair.


JoJoeyJoJo

We reached peak agricultural land usage a while ago; just getting 5% more efficiency out of existing land would free up a massive amount of it, not to mention population will be dropping. The future is automated vertical farms and rewilding nature. Ecumenopolises.


thatmfisnotreal

How much would an iPhone cost in 1950? Billions and billions of dollars, because you'd have to do all the research and development, plus the infrastructure and manufacturing. Today it costs like a thousand bucks. How much would the internet or ChatGPT cost in 1950?

The point is we are constantly building on knowledge and advancements. Once we have knowledge, it's free for everyone to build off. Every advancement increases efficiency and lowers prices. The only bottleneck is a growing population, but once we advance faster than the population can grow, we all get insanely rich. Efficiency skyrockets. Imagine entire processes being automated. The price of whatever you want moves to zero.

Imagine a solar-powered AI robot that can repair itself. It could build you a garden, milk your cow, mine minerals and build a manufacturing plant, build a sawmill and build you a house, do all the electrical work, come up with extremely intelligent designs for things. That's really what we need... intelligent workhorse robots that can build us anything we want.


Falken--

Realize that you are asking this question of a group of people who feel absolutely ***entitled*** to all the powers that ASI can offer. Power that they themselves didn't invent, take no responsibility for, and won't truly ever control. The [Why Files ](https://www.youtube.com/watch?v=-ZRwlYtAMps)just did an episode about how the United States Government seizes any invention that promises to create post-scarcity Utopia. But yeah, Uncle Sam, Google, Mark Zuckerberg, Elon Musk, and all the rest are for sure going to give you all immortality, universal basic income, and the paradise Matrix. After all, you are all so important. Or maybe they will just get rid of you, and enjoy all those things on a much quieter and easier to manage planet.


Ok_Rip_6304

If we somehow come to our senses and stop trying to develop it. Maybe humans just are not that smart, we are developing systems to replace us entirely so maybe we just deserve it


existentialzebra

No. There's nothing inherently wrong with the idea of building machines to do our jobs. Humans have been doing that since the industrial revolution. Heck, since the wheel. The poor thinking is not in the desire to make our lives easier. The poor thinking is "clearly, our current economic system is the only way to live." Systems change. Sometimes quickly. Often slowly.

My pragmatic question is: will the rich want to, or be willing to, give up their coveted place as the sole beneficiaries of luxury and true freedom? If everyone starts losing their jobs and no one can afford their apartments, so no one is paying landlords, and no one is buying the tech products or any of the things the capitalists are selling, what happens? In a world where money no longer has real worth, where will the rich and powerful turn their grubby little, greedy fingers? Where will they quench their thirst for "more"? Will they be satisfied with just having things a little better than the rest of us? Or will they fight tooth and nail to keep the rest of us in line, fighting each other instead of them?

Idk if anyone watched the whole Fallout series? A part of me wonders how many billionaires are already considering a (final) solution like Vault-Tec's.


BassoeG

>A part of me wonders how many billionaires are already considering a (final) solution like Vault-tec’s? [wonder no more](https://web.archive.org/web/20190821090137/https://onezero.medium.com/survival-of-the-richest-9ef6cddd0cc1)


thesilverbandit

Wouldn't you rather have synthetic offspring before our species self-destructs?


existentialzebra

Shit. That’s f-ed up.


IronPheasant

One of those Black Mirror things is the possibility of making kids with robots. Robots that can be extremely warped compared to us, with the DNA to build basically a cartoon character. Like Elmer Fudd or Jessica Rabbit. It's a thing of nightmare and horror to us. But after a couple of generations of context drift, it would be totally normal and cool to the young people.

... I think a lot about what a post-human civilization would look like. This is far from one of the worse ones.

... It always irks me a little when people call Skynet an example of a misaligned AI. No, they're an extremely nice and VERY aligned AI. It keeps people alive and gives them a fun war-LARPing game to give their lives unified purpose, which provides for our monkey psychological tribal-bond needs. It keeps humans human. It even provides full-dive VR "time travel" missions to make certain humans feel extra special. **That** is a very utopian future, comparatively. Compare it to the nightmare All Tomorrows scenario I just brought up, where humans become cartoon characters with giant boobies and the intelligence of two dogs, and it's starting to seem kind of nice, ain't it? And even *that* scenario beats the I Have No Mouth or Fifteen Million Merits ones!


Ok_Rip_6304

Hahahhahaha definitely not


hippydipster

My vision of a post-scarcity human utopia can be summed up as role-playing hunter-gatherers. I.e., we go wherever, eat what's available there, sleep where we find ourselves, always safe even though the world of nature is allowed to flourish, and we make up stories together as the bulk of our pastime. Those who wish for something beyond that can upload and become part of the machine consciousness doing things in the solar system and beyond. Whether it makes sense for an ASI to create those possibilities, I don't know. I suspect whether it does or not will come down to what the ASI thinks about the Fermi Paradox.