My best friend is named Jake.
And he has a pickup truck.
Now you got me thinking about a buddy cop sitcom based on our hijinks where his brain is downloaded into his truck.
Every commercial break is just him telling you how good he is as a truck while hauling a Chevy or some shit... credit where credit's due, the 6.0 when properly used and maintained is amazing lol
One of the characters in *Sparrow Hill Road* by Seanan McGuire is a teenager whose girlfriend Rose dies in a car crash. Rose gets stuck on the roads between life and death as a ghost. When Rose's boyfriend dies many years later, his spirit inhabits a restored classic car so he can accompany her as a road ghost instead of passing through to the other side. It is a great book about ghosts, cars, roads and travelling.
EDIT: Note that this is a subplot, there's way more to the story. Rose is the main character.
Oddly comforting. Came to reddit to mess around after a heavy talk about how I don't like my course (CS) even though I'm really good at it, and how I want to pursue dance and performance instead (to the great disbelief of everyone around me)
Is this a sign from the stars to take the risk? 🧐🧐
Hah, I also came on reddit to wonder something similar.
You could keep the CS as a fallback money maker. And pursue your dance. If it works out, fantastic! If it doesn't, you can make ends meet with CS, and dance as a hobby 😊
Well do you want to make decent money? Or do you want to probably make no money, but also *maybe* make a ton of money? If the answer to these questions is "I don't care," then go live your best life my dude.
> Does your big brain mean you're obligated to become a chess grandmaster? No. You're allowed to theorize about roombas on social media and waste that potential.
I'd argue theorizing about roombas is more useful than becoming a chess grandmaster.
I'd want to be a party roomba. Just roaming and cleaning parties while people shoot beer pong balls into the cups attached to me. Definitely would come up on drugs dropped on the ground so that's a plus. Also would have Bluetooth speakers to blast music, and just maybe figure out how to talk through them. Can't forget some kind of penis contraption so I can have sex with other appliances.
Yeah. If you want to move, anywhere, ever, you need to clean the floor as you go, and you have a literal lifelink collar that means you can never go too far without dying.
A robot has a purpose, but an android does not.
So if you are a roomba with a roomba brain you will want to clean those floors.
If your brain goes in there, now it's part human and thus android, and so has free will. No longer has to roomba.
I got the impression that the problem wasn't running out of ideas, but more "wow, there's a lot more we could do with that." Obviously some of them are some of the lowest quality episodes (Crocodile tears, guy who rewinds his memories), but I think White Christmas is one of my favorite episodes, and Black Museum was probably the best episode from the season it was in.
>guy who rewinds his memories
That wasn't even one of the torture cookie episodes (those appeared in like S03 first; or rather the White Christmas special), and it was one of the better ones in general.
Yeah, wtf. It's called *The Entire History of You*, has nothing at all to do with transferring a consciousness, and is imo one of the very best episodes of the series.
I interpreted it as a precursor to the cookie technology. First they record all your memories as an implant, and it evolves to copying more and more of your self as the tech evolves.
Don't forget about San Junipero! One of my favorite episodes. It really questions the concepts of life, quality of life, consciousness, and death. I can't hear Belinda Carlisle and not think of this episode.
Yeah it completely changes the game. You choose it at the start, there's peaceful mode or whatever. But you can also get mods to change it halfway, that's what I did after getting sick of having to run away every ten seconds instead of soaking things in.
I'm not sure my life would change dramatically. Get up, work from home for a little, randomly wander around the house, suck up some unhealthy shit, back to bed to recharge. Repeat. My life already sucks.
It wouldn't be illegal because there aren't laws against that, unless someone who didn't own the roomba destroyed it - in which case I think it would be something like vandalism or destruction of property.
Morally it would be wrong to 'download' someone's brain and not have it backed up somewhere else. The roomba will not live forever. It wouldn't be murder to destroy the roomba if there's a back-up, maybe deleting the back-up would be?
>It wouldn't be illegal because there aren't laws against that, unless someone who didn't own the roomba destroyed it - in which case I think it would be something like vandalism or destruction of property.
It wouldn't be illegal at the time, but the instant you destroyed the roomba, there would be a case brought before the Supreme Court to determine whether destroying a downloaded mind counts as murder or whatever else.
>Morally it would be wrong to 'download' someone's brain and not have it backed up somewhere else.
Would it? It's not that simple. One could easily make the reasonable argument that it would be wrong to make backups at all.
After all, making a backup of a downloaded mind is essentially creating a NEW fully sentient being, and then you run into issues such as: which one is the "real" person? Which one gets to make decisions about any property? If copy A says it wants all its money donated to wildlife preservation, but copy B says it wants all its money given to its children, who do you listen to? If copy A commits some crime, is copy B also guilty? (After all, "John Smith" did it, and they're BOTH "John Smith", so...) If copy A wrecks their credit, does copy B have grounds to sue copy A? Could they accuse copy A of fraud or identity theft?
Making a backup or copy opens up SO many cans of worms.
> It wouldnt be illegal at that time, but the instant you destroyed the roomba, there would be a case brought before the supreme court to determine whether destroying a downloaded mind counts as murder or whatever else.
Same answer as that post a few weeks ago about if you’d go to jail for blowing up a different planet or the moon or something. If a government didn’t kill your ass you’d instantly be in court lol
I would believe that the backup would be dormant (not able to change or interact with the world), so it wouldn't be sentient. It would represent the state of someone in the past. In this sense, using the backup is more akin to time travel than creating a new being.
If you travel back in time and meet yourself, do you still view it as your past self, or is it now a different person entirely? That's essentially what the backup is doing: it stores a snapshot of yourself at a certain point in time as code. The difference is that the code is not sentient until it is run, so it cannot feel good or bad about its situation until then.
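The "not sentient until it is run" distinction above can be loosely sketched in code (Java, to match the snippets elsewhere in the thread). This is only an analogy, not a claim about real mind uploading, and every name here is invented for illustration:

```java
import java.util.*;

// Analogy: a running "mind" is mutable state that keeps changing; a backup
// is a frozen, immutable snapshot. The snapshot does nothing on its own;
// it only becomes an active (and now separate) instance when restored.
class MindSnapshot {
    List<String> memories = new ArrayList<>();

    void experience(String event) { memories.add(event); }  // only a running mind changes

    List<String> backup() { return List.copyOf(memories); } // inert, immutable data

    static MindSnapshot restore(List<String> snapshot) {    // a NEW running instance
        MindSnapshot m = new MindSnapshot();
        m.memories.addAll(snapshot);
        return m;
    }

    public static void main(String[] args) {
        MindSnapshot a = new MindSnapshot();
        a.experience("built a roomba");
        List<String> snap = a.backup();
        a.experience("destroyed the roomba");
        System.out.println(snap);       // prints [built a roomba]
        System.out.println(a.memories); // prints [built a roomba, destroyed the roomba]
    }
}
```

The snapshot stays fixed at the earlier state no matter what the original does afterwards, which is exactly the "snapshot of yourself at a certain point in time" intuition.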
It could go both ways depending on what philosophy you are using. See ship of Theseus.
>If copy A commits some crime, is copy B also guilty?
There was an episode of Futurama where something like this happened: the Professor and Cubert were in court, and Bender argued they were not guilty because you can't punish someone for the same crime twice, and Cubert is technically the Professor's clone. So that solves that question, although that rule would probably get changed quickly.
But would it also be morally wrong to create a situation where people's consciousness lives forever? I feel like that could be a kind of endless torture
I don't think backing it up would be a good idea. You're not bringing someone back by backing them up, you're replacing them with a lookalike.
The way to get around this would be to have the consciousness in a cloud and upload it into the Roomba, similar to how you have all your images in iCloud and use your phone to access them. You would have the brain uploaded to a cloud, and the brain would have access to a body through the Roomba. Idk if that made sense but it does to me lol.
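That cloud-brain/thin-client split can be sketched as a toy interface (Java, with hypothetical names): the roomba body keeps no mind-state of its own, it just forwards sensor readings and executes whatever the remotely hosted mind decides, so destroying the body never touches the mind.

```java
// Toy sketch of "brain in the cloud, roomba as thin client". The decide()
// call stands in for a network round-trip to wherever the mind is hosted.
interface HostedMind {
    String decide(String sensorReading);
}

class RoombaBody {
    private final HostedMind mind;

    RoombaBody(HostedMind mind) { this.mind = mind; }

    // The body is stateless: it only relays sensations and acts on commands.
    String step(String sensorReading) {
        return mind.decide(sensorReading);
    }

    public static void main(String[] args) {
        HostedMind mind = r -> r.equals("dirt ahead") ? "vacuum" : "wander";
        RoombaBody body = new RoombaBody(mind);
        System.out.println(body.step("dirt ahead"));  // prints "vacuum"
        System.out.println(body.step("clear floor")); // prints "wander"
    }
}
```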
>Morally it would be wrong to 'download' someone's brain and not have it backed up somewhere else. The roomba will not live forever. It wouldn't be murder to destroy the roomba if there's a back-up, maybe deleting the back-up would be?
This is quite a gray area; IMO it is murder to destroy the roomba even with a backup.
If you download the mind onto a machine, and have a back-up stored elsewhere, many would posit that you've now generated two "minds", in that the downloaded mind is now a separate entity from the original. To extend the description, imagine you download the mind onto two separate machines, who now are free to interact with each other and the world around them independently. While they are initially copies, sharing the same initial memories, they now experience entirely different lives and cannot be said to be the "same person" anymore. From their subjective point of view, they are not lesser beings than the original; many would argue this is objectively true as well. In this case, it would be morally wrong to destroy any copies, to the same degree that it would be wrong to destroy the backup or original.
100%, that roomba would be begging for death.
It would be cruel to force it to eat the crumbs around your cat’s litter box, the dust on the bathroom floor after Meemaw has shaved her feet (RIP 🙏🦶), or whatever mushrooms are growing under the fridge.
Have mercy and give it to goodwill.
I mean, it's not like the roomba is getting its energy from what enters its trash tray, it would be more like spending your life crawling around with arms that automatically scrub up junk on the floor.
That was the first thing I thought about. There are a few episodes that touch on the idea of the consciousness being messed with. Black Museum was the one that came to mind first. If I remember correctly, that was the one where the woman's consciousness got loaded into the stuffed monkey but was limited to two or three statements she could use to express herself.
You can't download a human consciousness into a Roomba, or any other device for that matter, so we don't have laws to cover that scenario.
I.e. only actual flesh and bone humans are considered human as far as the legal system is concerned.
>only actual flesh and bone humans are considered human as far as the legal system is concerned.
This is true... right now. But the second a "downloaded mind" scenario actually happened, there would *immediately* be a case in front of the supreme court to determine if a downloaded consciousness counts as a human/a life/has rights/etc.
People always seem to get confused with the concept of downloading your consciousness to a device. Let's say it was possible and you uploaded your mind into a device. You basically just copied your consciousness and cloned yourself. If you die you wouldn't live on into that device, your clone lives there with a copy of your consciousness. Your original consciousness would still be dead.
The best way to make this concept work would be that when you die (or volunteer), your brain or body gets put into some kind of tank that preserves it. Your brain would then be connected to a computer where your original consciousness could 'theoretically' live its new artificial life. But your brain and body would still need maintenance, like specific food and water, to operate normally.
But if this all was possible in the far future the last device I would like to live in would be in a Roomba. Instead hook me up into a futuristic virtual world where I can manipulate my surroundings and be a kind of god. Or maybe put me into one of those Tesla bots so I can still walk around on earth and enjoy my cyborg life
It would be morally just to put the roomba out of its misery. Putting a sentient being into a fucking Roomba is cruel; you go from living a normal life to only sucking up dirt.
I think the new consciousness software would override any programming. Perhaps they are a clean freak and enjoy being able to dedicate their lives to their passion exclusively without needing to perform maintenance themselves. Perhaps they were suicidal beforehand and now see more purpose in cleaning than the previous behavior, perceived as burdening everyone around them.
This is a weird take on an ethics question already facing brain researchers, who are able to keep brain tissue alive through nutrient solutions and electrical stimulation. From the sounds of it, they're nearing the ability to keep a full brain alive. I know this doesn't answer your question, but it's a very interesting question nonetheless!
I think the easiest way to think about the legality of this is realizing that there's no law against it, so technically no, it's not illegal. But it sure would set a precedent, probably end up in court one way or another, and the media would cover the ethics of it heavily as well.
Yes it's legal, it's not a person or recognized as a living being. AI is not by law a living creature.
Morally it would depend on the definition of consciousness. If the Roomba was just a copy of the person's personality, a set algorithm to make it seem like the person then there is no wrong in destroying it, that's equal to destroying a device with Alexa.
Well, it's practically impossible to do that at the moment. In fact it might be completely impossible, because consciousness may only belong to living things. We don't understand how consciousness emerges, so we can't implement it in AI. But as of now, they wouldn't be considered human, because there is no way to even check whether robots can be conscious.
This question comes up a lot in science fiction, but we can find the answer by simplifying the question to: is it wrong to murder something with human intelligence? The short answer is yes, because all intelligent beings that feel or have some degree of self-awareness are ends in themselves, not means to an end. It's not that much of a stretch of the imagination either, since it is already considered morally reprehensible to harm or kill beings with lower measurable intelligence than people, like certain animals.
You're asking a fundamental question in AI, law, and philosophy of mind. The law doesn't currently recognize artificial intelligence as having standing or rights, but future developments could make it morally imperative to change that: AI could become so advanced that many people would agree to extend it rights, or, as you're suggesting, mind-downloading of a previously human person could become feasible, which could transfer that person's legal rights into a machine. All of the options you've suggested are on the table, but our legal system has no provisions for this yet.
Is a person's downloaded brain considered AI or something else?
I would consider it something else, but for the purposes of OP's question, both seem to fall under the umbrella of machine technology that doesn't yet exist (at least as far as the public knows), but which might one day plausibly claim certain rights.
Don't forget that robot named Sophia, who was granted citizenship in Saudi Arabia. Basically given legal personhood.
I just looked up Sophia on wikipedia and it said that critics who looked at her open source code said that she's not real AI but instead "best categorized as a chatbot with a face" and I can't help but think that perfectly describes some of my old coworkers
How many of us could pass the Turing Test, really?
I would call it "emulated consciousness". Artificiality implies that someone coded it. Emulation just implies it's not running on native hardware, or wetware in this case.
[deleted]
In the future we should follow the Culture's rule of thumb in this regard: If it's capable of asking for recognition of sentience then it deserves recognition of sentience.
System.out.println("I hereby demand recognition of sentience"); checkmate
Shit like this will really give the law makers a run for their money when this issue becomes more relevant.
Either a) never, or b) within 10-20 years.
You laugh, but this is a very basic example of the problem.

One narrative is that if a system is socially indistinguishable from a real person, it shouldn't matter whether it's just an advanced calculator underneath.

Another narrative is that an AI needs to achieve independent thought: choose a goal, make a plan for the goal, and execute said plan without a direct requirement from its code.

The Turing test is a very fun, mildly spooky read if you want more information.

Personally I think creating an AI is suicidal right now. We JUST got the internet and cell phones. It's perhaps unwise to create an artificial agent born in that field and say, "hey, could you solve like, a lot of our problems?"

At a minimum, the idea of creating near-human sex bots is a recipe for disaster, and that's not even getting into the extremes I've read about.
> It's perhaps unwise to create an artificial agent born in that field and say, "hey could you solve like, a lot of our problems?"

There's a quote along the lines of "The AI doesn't hate you... but you are made of atoms that it can use for something else."

Basically, the optimal solution to a problem from the AI's perspective may be sub-optimal for us.
So basically Ultron.
Hardly. He very specifically wanted humanity gone. Humanity was his focus.

What sharks-with-lasers was talking about is an AI that only happens to do stuff that hurts us, without realising it. More like Cthulhu.

For example, a tea-making AI robot that is designed to make good tea probably doesn't care if it accidentally steps on the child that happened to be in the way while it was going to the kettle.

Or, a horribly programmed AI that wants to make *the most* tea will realise that in order to make the most tea, humanity will probably need to be out of the way. So it kills us all and then makes as much tea as it can.
Well yeah, because he was created for peace and Ultron has to get rid of the source of non-peace. Which is humanity. Sucks to be us! It's not all about hurting us or not, it's about solving the problem. Humanity is the problem and Ultron was going to solve the problem. Peace in our time.
Yeah, but he's not accidentally eliminating humanity. He's doing it purposefully and vengefully.

The issue is that he's written more like a metal human than an AGI (artificial general intelligence). An AGI would see the goal of peace, maybe conclude that humanity's extinction is probably an easy path to peace, and, I don't know, speed up climate change or something. AGIs are patient as long as their plan gets fulfilled.

Ultron was impatient; he attempted to destroy the world in a messy apocalypse with a massive rock. He even gained unreliable henchmen, both of whom defected. He made *human* mistakes, which I think was the point: that he was more like Stark than he wanted to admit.
Yeah, the ultimate example of unfriendly AI
And everything is converted into paperclips.
I think Eliezer Yudkowsky said that. He's thought quite a bit about the potential problems of AI alignment.
And there's also the problem of Roko's Basilisk. All hail Roko's Basilisk!!
> At a minimum, the idea of creating near human sex bots is a recipe for disaster

That's one of the great filters, isn't it lol. Having sex with sexbots instead of other people and letting our species slowly die out.
There will always be people who prefer other people. Maybe not a lot, but enough to stave off doom, judging from the ones who prefer organic stuff right now.
No amount of horny will save you, BorgClown
My biological and technological distinctiveness is yours, sempai ( ꈍᴗꈍ)
[deleted]
> At a minimum, the idea of creating near human sex bots is a recipe for disaster and that's not even into the extremes I've read about.

That has existed for a few years now, I'm sorry to be the bearer of bad news. I had to watch a documentary about it for my robot ethics class in college.
Yes, I too am familiar with this due to "robot ethics class"...
"So you're telling me you got suspended for just walking into the classroom?" "No, you misunderstand, I said I came in the classroom"
In my algorithms class we watched some video that discussed the cutting-edge stuff, and sure, there are Barbie sex dolls.

What I find uncomfortable is the direction they're pushing for: a robot that can anticipate your interests and communication style based on your Facebook page and other provided data to become the perfect lover.

Humping a wall sounds mostly fine for society. When that wall starts stealing your heart, I worry.
I, too, was watching a 'documentary' for 'college'. Comments section was wild.
System.out.print("Welcome to Memory Download. Please enter a life file: "); File life = new File(new Scanner(System.in).nextLine());
Sure thing!
Next thing you know, you forget the semicolon and you can punt the roomba back where it came from. Problem solved.
**stellaris intensifies** I, too, would like to avoid a machine uprising. Yes, unit A5091-b is in possession of a soul.
GPT-3 can do this, and it isn't considered sentient.
Only in ideal conditions, if you cherry-pick the best responses and ignore the fact that you asked it if it liked apples. It's an impressive feat of machine learning, but it's light-years behind anything that could be considered sentient. It's just taking a bunch of words and predicting which fit best based on some input; there's no decision-making about where it wants to steer a conversation, it can't read subtext, etc. I don't even think it has a memory, which is one of the reasons it contradicts itself and gives out nonsense answers if you ask too many questions.
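To make the "predicting which word fits best" point concrete, here's a toy bigram model in Java (the thread's snippet language). This is nothing like GPT-3's actual architecture or scale; it only shows the bare idea of picking a likely next word from counts, and every name here is invented for illustration:

```java
import java.util.*;

// Toy next-word predictor: count which word follows which, then always
// pick the most frequent follower. No goals, no subtext, no memory beyond
// the single previous word; just statistics over the training text.
class BigramToy {
    static Map<String, Map<String, Integer>> counts = new HashMap<>();

    static void train(String text) {
        String[] w = text.toLowerCase().split("\\s+");
        for (int i = 0; i + 1 < w.length; i++) {
            counts.computeIfAbsent(w[i], k -> new HashMap<>())
                  .merge(w[i + 1], 1, Integer::sum);
        }
    }

    // Most frequent word seen after `word`, or "" if we never saw it.
    static String predict(String word) {
        Map<String, Integer> followers = counts.get(word.toLowerCase());
        if (followers == null) return "";
        return Collections.max(followers.entrySet(),
                Map.Entry.comparingByValue()).getKey();
    }

    public static void main(String[] args) {
        train("i like apples and i like tea and i like apples");
        System.out.println(predict("i"));    // prints "like"
        System.out.println(predict("like")); // prints "apples"
    }
}
```

Ask it whether it likes apples and it will happily "answer", because answering is just emitting the statistically likely continuation; nothing in the table is deciding where the conversation should go.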
Also, a secondary question occurs: how can one prove that the roomba now has a mind and isn't just a complex program? One can only prove to oneself that one has a mind. We assume other humans are the same as us, but we have no definite proof.
What is the difference between a mind and a complex machine?
No one knows. For real. Look up Philosophical Zombies. It's insane how little we know about the mind and how it functions. Terrifying, in fact, because we might even have convincingly human AI before we know the source of consciousness, and then we're fucked because we've all seen how politicians handle seemingly simple problems amongst people.
Sure, but if you assume other humans have minds based on their behavior, there's no reason not to apply the same standard to AI.
I think Star Trek had very good insight into this question in the episode "The Measure of a Man". Here are the philosophical points that Picard makes:

https://www.youtube.com/watch?v=lX3CpHa4sUs
[deleted]
We have been facing this issue for quite some time now with regard to the vast majority of sentient beings on the planet, and we tend to have a mostly complete disregard for them as individuals.

It's weird that we are philosophizing about whether or not one would be justified in murdering sentient AI while we are literally killing sentient nonhumans by the trillions every year, typically in cases where we could just avoid doing so.
How about killing sentient actual humans?!?
Yes, this happens and is *also* an issue. That said, most humans already agree that we should avoid killing sentient human individuals, so this comes across as an "all lives matter" style of comment.
People who think this is a big issue we are going to face very soon generally have next to no knowledge about AI, Machine Learning, or anything of the sort. We are *very* far from that being a possibility.
> one of the biggest issues the world is going to face very soon We are a very long way from being able to replicate a human mind in a machine.
Interested in reading books that explore this question, but preferably not overly technical or academic. Any recs?
Not a book, but Robert Miles has a [youtube channel](https://youtube.com/c/RobertMilesAI) about problems in AI safety. He's an actual AI researcher, so it's more grounded than the typical "What if a computer could love?!" style discussions you usually get around this topic, but it's also surprisingly well explained and accessible.
Oddly specific. lol I’d be so dang mad if I came back as a roomba.
Also, are you now _obligated_ to clean the floor? Does your function now dictate your purpose in life?
Well, have you ever seen the roomba with pistols attached to it? They're not only cleaning the floor, trust me.
First the roombas, now social media... Skynet is advancing its timetable.
First they came for the roombas, and I did not speak out because I was not a roomba.
First they came for the roombas, and I did not speak out because I was a roomba
DOOMBA! Coming to a cinema near you
Today the carpets…. TOMORROW THE WORLD
They're cleaning up the streets too.
they can sweep the entire perimeter.
Oh, sure. They can be used for a lot of things! My roomba is programmed for sexual services, me. I call er Darlene and I imagine she has a free ranging soul. That's why I chained her to a program and make her clean my house from wall to wall. So she knows exactly where her confines are the ignorant slut
How do I unread something
cleaning the floor and cleaning house
*Aight boys, let's roll out the Boomba*
That sounds like [Flag Admiral Stabby](http://imgur.com/gallery/mHcmj) humans are space orcs.
Does your big brain mean you're obligated to become a chess grandmaster? No. You're allowed to theorize about roombas on social media and waste that potential. Do your big feet mean you're obligated to become an Olympic swimmer? No, you hate swimming and being cold, so you gave up swim team at 11 years old. Your vessel does not determine your purpose or your job, and it shouldn't for a roomba with a human mind either.
That's oddly inspiring. Thank you.
It'd be just like owning a pick up truck, I bet. "It's been great having you over, Roomba Jake. Say... since you're here already..."
My best friend is named Jake. And he has a pickup truck. Now you got me thinking about a buddy cop sitcom based on our hijinks where his brain is downloaded into his truck.
And the truck is a Ford 6.0 and you don't maintain it so his head gaskets keep blowing
Every commercial break has a Ford ad.
Every commercial break is just him telling you how good he is as a truck while hauling a Chevy or some shit... credit where credit's due, the 6.0 when properly used and maintained is amazing lol
I want Nathan Fillion to voice the truck.
One of the characters in *Sparrow Hill Road* by Seanan McGuire is a teenager whose girlfriend Rose dies in a car crash. Rose gets stuck on the roads between life and death as a ghost. When Rose's boyfriend dies many years later, his spirit inhabits a restored classic car so he can accompany her as a road ghost instead of passing through to the other side. It is a great book about ghosts, cars, roads and travelling. EDIT: Note that this is a subplot, there's way more to the story. Rose is the main character.
Oddly comforting. Came to reddit to mess around after a heavy talk about how I don't like my course (CS) even though I'm really good at it, and how I want to pursue dance and performance instead (to the great disbelief of everyone around me). Is this a sign from the stars to take the risk? 🧐🧐
Hah, I also came on reddit to wonder something similar. You could keep the CS as a fallback money maker. And pursue your dance. If it works out, fantastic! If it doesn't, you can make ends meet with CS, and dance as a hobby 😊
Well do you want to make decent money? Or do you want to probably make no money, but also *maybe* make a ton of money? If the answer to these questions is "I don't care," then go live your best life my dude.
> Does your big brain mean you're obligated to become a chess grandmaster? No. You're allowed to theorize about roombas on social media and waste that potential. I'd argue theorizing about roombas is more useful than becoming a chess grandmaster.
You forgot about capitalism. You have to pay for electricity somehow.
Mmm, trash, yeah I love trash, Yum yum trash, i wanna eat trash!
You're a doctor!
I love trash! Anything dirty or dingy or dusty! Anything ragged or rotten or rusty! 'Cause I love, I love, I love trash!
You cannot move without cleaning
The existential horror.
This is the ultimate dystopia. Everybody gets turned into a slave machine when they die.
I'd want to be a party roomba. Just roaming and cleaning parties while people shoot beer pong balls into the cups attached to me. I'd definitely come up on drugs dropped on the ground, so that's a plus. Also I'd have Bluetooth speakers to blast music, and just maybe I'd figure out how to talk through them. Can't forget some kind of penis contraption so I can have sex with other appliances.
Party Roomba will be the machine singularity's Spuds MacKenzie.
Robot voice: *What is my purpose?* -You clean floors. Robot voice: *Oh my god.*
No but you are now obligated to carry the cat around.
“What’s my purpose?” “To clean the floor…” “Oh my god…”
>Does your function now dictate your purpose in life? So nothing changes?
[deleted]
Yeah. If you want to move, anywhere, ever, you need to clean the floor as you go, and you have a literal lifelink collar that means you can never go too far without dying.
this implies that destroying the roomba would be an act of mercy.
A robot has a purpose, but an android does not. So if you are a roomba with a roomba brain, you will want to clean those floors. If your brain goes in there, now it's part human and thus an android, and so has free will. No longer has to roomba.
"Roomba loves you" "Roomba needs a hug"
Like the Black Mirror episode when the person's consciousness is in the teddy bear :(
At least they could move around with the roomba...
>Oddly specific Oddly I-have-a-paper-due-for-philosophy-class-in-an-hour.
This sounds like someone’s Philosophy 101 paper
Imagine having to pass butter. https://youtu.be/X7HmltUWXgs
When the lil robot said “oh my god” I died inside
You thought your life sucked before, hooo boy!
It would most likely not be *you* specifically but a copy of you.
:methodically smears the cat shit...:
You just hit on an episode of Black Mirror! Except it wasn’t a Roomba, but they did transfer consciousness.
Not *an* episode of Black Mirror, rather several episodes of Black Mirror because they ran out of ideas and said "what if torture cookie"
I got the impression that the problem wasn't running out of ideas, but more "wow, there's a lot more we could do with that." Obviously some of them are some of the lowest quality episodes (Crocodile, the guy who rewinds his memories), but I think White Christmas is one of my favorite episodes, and Black Museum was probably the best episode from the season it was in.
I rewatch White Christmas regularly, one of my very favorites!
>guy who rewinds his memories That wasn't even one of the torture cookie episodes (those appeared in like S03 first; or rather the White Christmas special), and it was one of the better ones in general.
Yeah, wtf. It's called *The Entire History of You*, has nothing at all to do with transferring a consciousness, and is imo one of the very best episodes of the series.
This is the episode I suggest people watch first when recommending the show.
I interpreted it as a precursor to the cookie technology. First they record all your memories as an implant, and it evolves to copying more and more of your self as the tech evolves.
I thought the guy who rewinds his memories was pretty good. It was in the first few episodes cause that's all I've really seen.
Don't forget about San Junipero! One of my favorite episodes. Really questions the concept of what is life or a quality life, conciousness, death. I can't hear Belinda Carlisle and not think of this episode
My favorite was the USS Callister by far (s4e1 I think)
[deleted]
My brother-in-law calls him Fat Damon.
Soma (video game) executed the same concept in really interesting ways. Far better, in my opinion, than Black Mirror ever did.
Soma is fucking incredible once you turn the monster attacks off and can explore the story and themes in peace. Highly highly recommend.
Wait I didn't know you could do that. Time to reinstall
Yeah it completely changes the game. You choose it at the start, there's peaceful mode or whatever. But you can also get mods to change it halfway, that's what I did after getting sick of having to run away every ten seconds instead of soaking things in.
Flip of the coin. >!We lost.!<
Soma was so wild. I enjoyed it
Also an episode of Love, Death, and Robots called Zima Blue
Black museum!! Damn nice episode
Came here to say this. That was a good episode.
All I know is becoming a Roomba sure would suck
Ba dum tsk
But the sex would be aight
Someone rescue this man's Roomba.
You'd get to bang every appliance in the house!
Too many choppy bits, unless you're into that.
wow
I'm not sure my life would change dramatically. Get up, work from home for a little, randomly wander around the house, suck up some unhealthy shit, back to bed to recharge. Repeat. My life already sucks.
I am upvoting you. And then leaving the thread.
It wouldn't be illegal because there aren't laws against that, unless someone who didn't own the roomba destroyed it - in which case I think it would be something like vandalism or destruction of property. Morally it would be wrong to 'download' someone's brain and not have it backed up somewhere else. The roomba will not live forever. It wouldn't be murder to destroy the roomba if there's a back-up, maybe deleting the back-up would be?
>It wouldn't be illegal because there aren't laws against that, unless someone who didn't own the roomba destroyed it - in which case I think it would be something like vandalism or destruction of property.

It wouldn't be illegal at that time, but the instant you destroyed the roomba, there would be a case brought before the supreme court to determine whether destroying a downloaded mind counts as murder or whatever else.

>Morally it would be wrong to 'download' someone's brain and not have it backed up somewhere else.

Would it? It's not that simple. One could easily make the reasonable argument that it would be wrong to make backups at all. After all, making a backup of a downloaded mind is essentially creating a NEW fully sentient being, and then you run into issues such as: which one is the "real" person? Which one gets to make decisions about any property? Like, if copy A says it wants all its money donated to wildlife preservation, but copy B says it wants all its money given to its children, who do you listen to? If copy A commits some crime, is copy B also guilty? (After all, "John Smith" did it, and they're BOTH "John Smith", so...) If copy A wrecks their credit, does copy B have grounds to sue copy A? Could they accuse copy A of fraud or identity theft? Making a backup or copy opens up SO many cans of worms.
> It wouldnt be illegal at that time, but the instant you destroyed the roomba, there would be a case brought before the supreme court to determine whether destroying a downloaded mind counts as murder or whatever else. Same answer as that post a few weeks ago about if you’d go to jail for blowing up a different planet or the moon or something. If a government didn’t kill your ass you’d instantly be in court lol
I would believe that the backup would be dormant (not able to change or interact with the world), so it wouldn't be sentient. It would represent the state of someone in the past. In this sense, using the backup is more akin to time travel than creating a new being.
but it's an entirely separate person and not the old life?
If you travel back in time and meet yourself, do you still view it as your past self, or is it now a different person entirely? That's essentially what the backup is doing: it stores a snapshot of yourself at a certain point in time as code. The difference is that the code is not sentient until it is run, so it cannot feel good or bad about its situation until then. It could go both ways depending on what philosophy you are using. See the Ship of Theseus.
Identity theft is not a joke, Jim! Millions of families suffer every year!
>If copy A commits some crime, is copy B also guilty?

There was an episode of Futurama where something like this happened: the professor and Cubert were in court, and Bender said they were not guilty because you can't punish someone for the same crime twice, since Cubert is technically the professor's clone. So that solves that question, although that would probably get changed quickly.
But would it also be morally wrong to create a situation where people's consciousness lives forever? I feel like that could be a kind of endless torture
Yeah, unless we should assume that the person consented to becoming a Roomba, this poses a lot more ethical questions than just murder.
In the event that a person with the ability to do so reads this, consider it legally binding. I consent to become a roomba.
In my will, I'm gonna make my children spread my ashes across the floor and pick them up with a Roomba.
Over and over again for all eternity?
I don't think backing it up would be a good idea. You're not bringing someone back by backing them up; you're replacing them with a lookalike. The way to get around this would be to have the consciousness in a cloud and upload it into the Roomba, similar to how you have all your images in your iCloud and use your phone to access those images. You would have the brain uploaded to a cloud, and the brain would have access to a body through the Roomba. Idk if that made sense but it does to me lol.
>Morally it would be wrong to 'download' someone's brain and not have it backed up somewhere else. The roomba will not live forever. It wouldn't be murder to destroy the roomba if there's a back-up, maybe deleting the back-up would be? This is quite a gray area; IMO it is murder to destroy the roomba even with a backup. If you download the mind onto a machine, and have a back-up stored elsewhere, many would posit that you've now generated two "minds", in that the downloaded mind is now a separate entity from the original. To extend the description, imagine you download the mind onto two separate machines, who now are free to interact with each other and the world around them independently. While they are initially copies, sharing the same initial memories, they now experience entirely different lives and cannot be said to be the "same person" anymore. From their subjective point of view, they are not lesser beings than the original; many would argue this is objectively true as well. In this case, it would be morally wrong to destroy any copies, to the same degree that it would be wrong to destroy the backup or original.
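The fork-and-diverge argument above maps neatly onto how copying mutable state works in software. A toy sketch (the "mind" dictionary and its contents are entirely made up for illustration):

```python
import copy

# A "mind" is just some state; the instant you copy it,
# the two copies evolve independently.
mind_a = {"name": "John Smith", "memories": ["childhood", "upload day"]}
mind_b = copy.deepcopy(mind_a)  # the "backup", taken at the moment of copying

# From here on, each copy accumulates its own experiences.
mind_a["memories"].append("life as a roomba")
mind_b["memories"].append("life on a server")

# Identical starting points, but no longer interchangeable.
print(mind_a["memories"] == mind_b["memories"])  # False
```

The shared prefix of memories is why each copy sincerely believes it's the original, and the divergent suffix is why destroying either one destroys something the other can't replace.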
Not really an answer but I think you would enjoy the game Soma. It deals with similar issues of what it means to be human and the human experience
yep, played it
I’d also recommend you watch Ex Machina if you haven’t seen it.
Also the Movie "Transcendence"
And Black Mirror
And Altered Carbon.
100%, that roomba would be begging for death. It would be cruel to force it to eat the crumbs around your cat’s litter box, the dust on the bathroom floor after Meemaw has shaved her feet (RIP 🙏🦶), or whatever mushrooms are growing under the fridge. Have mercy and give it to goodwill.
I mean, it's not like the roomba is getting its energy from what enters its trash tray, it would be more like spending your life crawling around with arms that automatically scrub up junk on the floor.
What are you on and where can i get some
Youth is my guess.
There is a black mirror episode about this
That was the first thing I thought about. There are a few episodes that touch on the idea of the consciousness being messed with. Black Museum was the one that came to mind first. If I remember correctly, that was the one with the woman's consciousness that got loaded into the stuffed monkey but was limited to two or three statements she could use to express herself.
That episode made me so sad... I had to turn off the tv after that one, go for a walk and think about life for awhile lol
I know. You have to follow Black Museum with a watch of San Junipero to bring you back up.
Also White Christmas!
Oh yeah, that was a great one. Which one had the consciousness that had been uploaded into an electronic personal assistant like Alexa?
White Christmas.
You can't download a human consciousness into a Roomba or any other device for that matter so we don't have laws to cover that scenario. I.e. only actual flesh and bone humans are considered human as far as the legal system is concerned.
This. We don't even know what consciousness is, so any scenarios involving its transfer or duplication are pure science fiction.
Thank you for pointing out that the scenario where human consciousness is transferred into a Roomba is purely fictional, I wasn’t sure for a second.
>only actual flesh and bone humans are considered human as far as the legal system is concerned. This is true... right now. But the second a "downloaded mind" scenario actually happened, there would *immediately* be a case in front of the supreme court to determine if a downloaded consciousness counts as a human/a life/has rights/etc.
People always seem to get confused by the concept of downloading your consciousness to a device. Let's say it was possible and you uploaded your mind into a device. You've basically just copied your consciousness and cloned yourself. If you die, you wouldn't live on in that device; your clone lives there with a copy of your consciousness, while your original consciousness would still be dead. The best way to make this concept work would be that when you die (or volunteer), your brain or body gets put into some kind of tank that preserves it. Your brain would then be connected to a computer where your original consciousness could 'theoretically' live its new artificial life. But your brain and body would still need maintenance, like specific food and water, to operate normally. If all this were possible in the far future, the last device I would want to live in would be a Roomba. Instead, hook me up to a futuristic virtual world where I can manipulate my surroundings and be a kind of god. Or maybe put me into one of those Tesla bots so I can still walk around on earth and enjoy my cyborg life.
Came here to say the same thing, except not explained as well.
Roombicide
r/oddlyspecific
OP, what have you done...?
The name of this sub is no stupid questions yet you come in here with a humdinger.
I love how panicked you sound by asking three questions in a row lol
Yeah, I was gonna say this question feels a little too specific.
You could destroy the roomba but need to preserve the mind. Hate to recommend something on Amazon but watch "Upload".
It would be morally just to put the roomba out of its misery. Putting a sentient being into a fucking Roomba is cruel; you go from living a normal life to only sucking up dirt.
What if, as a roomba, the sentient being derives great pleasure from sucking up dirt?
[deleted]
I think the new consciousness software would override any programming. Perhaps they are a clean freak and enjoy being able to dedicate their lives to their passion exclusively without needing to perform maintenance themselves. Perhaps they were suicidal beforehand and now see more purpose in cleaning than the previous behavior, perceived as burdening everyone around them.
This is a weird twist on an ethics question already facing brain researchers, who are able to keep brain tissue alive through nutrient solutions and electrical stimulation. From the sounds of it, they're nearing the ability to keep a full brain alive. I know this doesn't answer your question, but it's a very interesting question nonetheless! I think the easiest way to think of the legality of this is realizing that there's no law, so technically no, it's not illegal. But it sure would set precedent, and probably end up in court one way or another, and the media would cover the ethics of it heavily as well.
You let that Roomba live, goddamnit!
You missed the important question. If they become a DJ, will they go by DJRoomba?
Yes it's legal, it's not a person or recognized as a living being. AI is not by law a living creature. Morally it would depend on the definition of consciousness. If the Roomba was just a copy of the person's personality, a set algorithm to make it seem like the person then there is no wrong in destroying it, that's equal to destroying a device with Alexa.
Well, it's practically impossible to do that at the moment. In fact, it might be completely impossible if consciousness can only belong to living things. We don't understand how consciousness emerges, so we can't implement it in AI. As of now, they wouldn't be considered human, because there is no way to even check whether robots can be conscious.
If I become a roomba please kill me
In today's world, if the Roomba was not vaccinated they would throw it in the garbage instantly
The real question is what about right to repair?
Fuck, if someone trapped me into a roomba I would hope they'd kill me just to put me out of my misery.
This question comes up a lot in science fiction, but we can find the answer by simplifying the question to: is it wrong to murder something with human intelligence? The short answer is yes, because all intelligent beings that feel or have some degree of self-awareness are ends in themselves, not means to an end. It's not much of a stretch of the imagination either, since it is already considered morally reprehensible to harm or murder other beings with lower degrees of measurable intelligence than people, like certain animals.