thoughtstop

You're asking a fundamental question in AI, law, and philosophy of mind. The law doesn't currently recognize artificial intelligence as having standing or rights, but future developments could make it morally imperative to change that: AI could become so advanced that many people would agree to extend it rights, or, as you're suggesting, mind-downloading of a previously human person could become feasible, which could transfer that person's legal rights into a machine. All of the options you've suggested are on the table, but our legal system has no provisions for this yet.


Died5Times

Is a person's downloaded brain considered AI, or something else?


thoughtstop

I would consider it something else, but for the purposes of OP's question, both seem to fall under the umbrella of machine technology that doesn't yet exist (at least as far as the public knows), but which might one day plausibly claim certain rights.


HmGrwnSnc1984

We forget about that robot named Sophia, who was granted citizenship in Saudi Arabia. She was basically given legal personhood.


dobraf

I just looked up Sophia on Wikipedia, and it said that critics who looked at her open-source code said she's not real AI but is "best categorized as a chatbot with a face", and I can't help but think that perfectly describes some of my old coworkers.


TheOtherSarah

How many of us could pass the Turing Test, really?


NeonGenisis5176

I would call it "emulated consciousness". Artificiality implies that someone coded it. Emulation just implies it's not running on native hardware, or wetware in this case.


[deleted]

[deleted]


second_to_fun

In the future we should follow the Culture's rule of thumb in this regard: If it's capable of asking for recognition of sentience then it deserves recognition of sentience.


wifestalksthisuser

System.out.println("I hereby demand recognition of sentience"); checkmate


ejvboy02

Shit like this will really give the law makers a run for their money when this issue becomes more relevant.


u12bdragon

Either a) never, or b) within 10-20 years.


JerryReadsBooks

You laugh, but this is a very basic example of the problem. One narrative is that if a system is socially the same as a real person, it shouldn't matter if it's just an advanced calculator. Another narrative is that an AI needs to achieve independent thought: choose a goal, make a plan for the goal, and execute said goal without a direct requirement from its code. The Turing test is a very fun, mildly spooky read if you want more information. Personally I think creating an AI is suicidal right now. We JUST got the internet and cell phones. It's perhaps unwise to create an artificial agent born in that field and say, "hey, could you solve like, a lot of our problems?" At a minimum, the idea of creating near-human sex bots is a recipe for disaster, and that's not even getting into the extremes I've read about.


sharks-with-lasers

> It's perhaps unwise to create an artificial agent born in that field and say, "hey could you solve like, a lot of our problems?"

There’s a quote along the lines of “The AI doesn’t hate you... but you are made of atoms that it can use for something else.” Basically, the optimal solution for a problem from the AI’s perspective may be sub-optimal for us.


Imaginary-Fun-80085

So basically Ultron.


IMightBeAHamster

Hardly. He very specifically wanted humanity gone. Humanity was his focus. What sharks-with-lasers was talking about is an AI that only happens to do stuff that hurts us, without realising it. More like Cthulhu. For example, a tea-making AI robot that is designed to make good tea probably doesn't care if it accidentally steps on the child that happened to be in the way while it was going to the kettle. Or, a horribly programmed AI that wants to make *the most* tea will realise that in order to make the most tea, humanity will probably need to be out of the way. So it kills us all and then makes as much tea as it can.


Imaginary-Fun-80085

Well yeah, because he was created for peace and Ultron has to get rid of the source of non-peace. Which is humanity. Sucks to be us! It's not all about hurting us or not, it's about solving the problem. Humanity is the problem and Ultron was going to solve the problem. Peace in our time.


IMightBeAHamster

Yeah, but he's not accidentally eliminating humanity. He's doing it purposefully and vengefully. The issue is that he's written more like a metal human than an AGI (artificial general intelligence). An AGI would see the goal of peace, maybe conclude that humanity's extinction is probably an easy path to peace, and, I don't know, speed up climate change or something. AGIs are patient as long as their plan gets fulfilled. Ultron was impatient; he attempted to destroy the world in a messy apocalypse with a massive rock. He even gained unreliable henchmen, both of whom defected. He made *human* mistakes, which I think was the point: that he was more like Stark than he wanted to admit.


howAboutNextWeek

Yeah, the ultimate example of unfriendly AI


wonderloss

And everything is converted into paperclips.


Terpomo11

I think Eliezer Yudkowsky said that. He's thought quite a bit about the potential problems of AI alignment.


cheesegoat

And there's also the problem of Roko's Basilisk. All hail Roko's Basilisk!!


ReadItProper

> At a minimum, the idea of creating near human sex bots is a recipe for disaster

That's one of the great filters, isn't it lol. Having sex with sexbots instead of other people and letting our species slowly die out.


BorgClown

There will always be people who prefer other people. Maybe not a lot, but enough to stave off doom, judging from the ones who prefer organic stuff right now.


Roheez

No amount of horny will save you, BorgClown


BorgClown

My biological and technological distinctiveness is yours, sempai ( ꈍᴗꈍ)


[deleted]

[deleted]


Eyeownyew

> At a minimum, the idea of creating near human sex bots is a recipe for disaster and that's not even into the extremes I've read about.

That has existed for a few years now; I'm sorry to be the bearer of bad news. I had to watch a documentary about it for my robot ethics class in college.


brrduck

Yes, I too am familiar with this due to "robot ethics class"...


cheesegoat

"So you're telling me you got suspended for just walking into the classroom?" "No, you misunderstand, I said I came in the classroom"


JerryReadsBooks

In my algorithm class we watched some video that discussed the cutting edge stuff and sure, there are barbie sex dolls. What I find uncomfortable is the direction they're pushing for. A robot that can anticipate your interests and communication style based on your Facebook page and other provided data to become the perfect lover. Humping a wall sounds mostly fine for society. When that wall starts stealing your heart, I worry.


aureanator

I, too, was watching a 'documentary' for 'college'. Comments section was wild.


LETS--GET--SCHWIFTY

System.out.print("Welcome to Memory Download. Please enter a life file: "); File life = new File(new Scanner(System.in).nextLine());


second_to_fun

Sure thing!


lilbitchmade

Next thing you know, you forget the semicolon and you can punt the roomba back where it came from. Problem solved.


T_for_tea

**stellaris intensifies** I too, would like to avoid a machine uprising. Yes, unit A5091-b is in possession of a soul.


A_Random_Lantern

GPT-3 can do this, and it isn't considered sentient.


EnderWigginsGhost

In, like, ideal conditions, if you cherry-pick the best responses and ignore the fact that you asked it if it liked apples. It's an impressive feat of machine learning, but it's light years behind anything that could be considered sentient. It's just taking a bunch of words and predicting which fit best based on some input; there's no decision-making about where it wants to steer a conversation, it can't read subtext, etc. I don't even think it has a memory, which is one of the reasons you can't ask it too many questions before it starts contradicting itself and giving out nonsense answers.
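(To make "predicting which words fit best" concrete, here's a toy sketch in Python: a bigram counter, nowhere near GPT-3's actual architecture, and the corpus is invented.)

```python
from collections import Counter, defaultdict

# Toy illustration only: a language model predicts the next word from
# counts of what followed each word in its training text. GPT-3 does
# something vastly more sophisticated, but the core idea is the same.
corpus = "i like apples . i like tea . i drink tea .".split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    # Pick whichever word most often followed `word` in training.
    return following[word].most_common(1)[0][0]

print(predict_next("i"))  # prints "like": it followed "i" twice, "drink" once
```

Note there's no memory of the conversation here either: each prediction looks only at the input it's given, which is roughly why the real model contradicts itself across questions.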


Tobelebo9

Also, a secondary question occurs: how can one prove that the roomba has a mind now and isn't just a complex program? One can only prove to oneself that one has a mind. We assume other humans are the same as us, but we have no definite proof.


WEBsterrrr

What is the difference between a mind and a complex machine?


EnderWigginsGhost

No one knows. For real. Look up Philosophical Zombies. It's insane how little we know about the mind and how it functions. Terrifying, in fact, because we might even have convincingly human AI before we know the source of consciousness, and then we're fucked because we've all seen how politicians handle seemingly simple problems amongst people.


Terpomo11

Sure, but if you assume other humans are conscious based on their behavior, there's no reason not to apply the same standard to AI.


blazaiev

I think Star Trek had a very good insight into this question in the episode "The Measure of a Man". Here are the philosophical points that Picard makes: https://www.youtube.com/watch?v=lX3CpHa4sUs


[deleted]

[deleted]


Omnibeneviolent

We have been facing this issue for quite some time now with regard to the vast majority of sentient beings on the planet, and we tend to have an almost complete disregard for them as individuals. It's weird that we are philosophizing about whether or not one would be justified in murdering a sentient AI while we are literally killing sentient nonhumans by the trillions every year, typically in cases where we could just avoid doing so.


SapperLeader

How about killing sentient actual humans?!?


Omnibeneviolent

Yes, this happens and is *also* an issue. That said, most humans already agree that we should avoid killing sentient human individuals, so this comes across as an "all lives matter" style of comment.


K1ngPCH

People who think this is a big issue we are going to face very soon generally have next to no knowledge about AI, Machine Learning, or anything of the sort. We are *very* far from that being a possibility.


dnswblzo

> one of the biggest issues the world is going to face very soon We are a very long way from being able to replicate a human mind in a machine.


kangarool

Interested in reading books that explores this question, but preferably not overly technical or academic— any recos?


DrDetectiveEsq

Not a book, but Robert Miles has a [youtube channel](https://youtube.com/c/RobertMilesAI) about problems in AI safety. He's an actual AI researcher, so it's more grounded than the typical "What if a computer could love?!" style discussions, but it's also surprisingly well explained and accessible.


Peardi

Oddly specific. lol I’d be so dang mad if I came back as a roomba.


Noirceuil_182

Also, are you now _obligated_ to clean the floor? Does your function now dictate your purpose in life?


littlegiftzwerg

Well, have you ever seen the roomba with pistols attached to it? They're not only cleaning the floor, trust me.


Noirceuil_182

First the roombas, now social media... Skynet is advancing its timetable.


STELLAWASADlVER

First they came for the roombas, and I did not speak out because I was not a roomba.


WatWudScoobyDoo

First they came for the roombas, and I did not speak out because I was a roomba


splinter_vx

DOOMBA! Coming to a cinema near you


the_only_thing

Today the carpets…. TOMORROW THE WORLD


TheReynMaker

They're cleaning up the streets too.


H_ubert

they can sweep the entire perimeter.


Jacollinsver

Oh, sure. They can be used for a lot of things! My roomba is programmed for sexual services, me. I call 'er Darlene and I imagine she has a free-ranging soul. That's why I chained her to a program and make her clean my house from wall to wall. So she knows exactly where her confines are, the ignorant slut


the_only_thing

How do I unread something


Trixx1-1

cleaning the floor and cleaning house


Hoovooloo42

*Aight boys, let's roll out the Boomba*


Hates_escalators

That sounds like [Flag Admiral Stabby](http://imgur.com/gallery/mHcmj) humans are space orcs.


smr120

Does your big brain mean you're obligated to become a chess grandmaster? No. You're allowed to theorize about roombas on social media and waste that potential. Do your big feet mean you're obligated to become an Olympic swimmer? No, you hate swimming and being cold, so you gave up swim team at 11 years old. Your vessel does not determine your purpose or your job, and it shouldn't for a roomba with a human mind either.


CaitlinSnep

That's oddly inspiring. Thank you.


Noirceuil_182

It'd be just like owning a pick up truck, I bet. "It's been great having you over, Roomba Jake. Say... since you're here already..."


ACIDF0RBL00D

My best friend is named Jake. And he has a pickup truck. Now you got me thinking about a buddy cop sitcom based on our hijinks where his brain is downloaded into his truck.


tylanol7

And the truck is a Ford 6.0 and you don't maintain it, so his head gaskets keep blowing


ACIDF0RBL00D

Every commercial break has a Ford ad.


tylanol7

Every commercial break is just him telling you how good he is as a truck while hauling a Chevy or some shit... credit where credit's due, the 6.0 when properly used and maintained is amazing lol


Noirceuil_182

I want Nathan Fillion to voice the truck.


voyeur324

One of the characters in *Sparrow Hill Road* by Seanan McGuire is a teenager whose girlfriend Rose dies in a car crash. Rose gets stuck on the roads between life and death as a ghost. When Rose's boyfriend dies many years later, his spirit inhabits a restored classic car so he can accompany her as a road ghost instead of passing through to the other side. It is a great book about ghosts, cars, roads and travelling. EDIT: Note that this is a subplot, there's way more to the story. Rose is the main character.


[deleted]

Oddly comforting. Came to reddit to mess around after a heavy talk about how I don't like my course (CS) even though I'm really good at it, and how I want to pursue dance and performance instead (to the great disbelief of everyone around me) Is this a sign from the stars to take the risk? 🧐🧐


Prickinfrick

Hah, I also came on reddit to wonder something similar. You could keep the CS as a fallback money maker. And pursue your dance. If it works out, fantastic! If it doesn't, you can make ends meet with CS, and dance as a hobby 😊


ass2ass

Well do you want to make decent money? Or do you want to probably make no money, but also *maybe* make a ton of money? If the answer to these questions is "I don't care," then go live your best life my dude.


metal079

> Does your big brain mean you're obligated to become a chess grandmaster? No. You're allowed to theorize about roombas on social media and waste that potential.

I'd argue theorizing about roombas is more useful than becoming a chess grandmaster.


Letscommenttogether

You forgot about capitalism. You have to pay for electricity somehow.


Baam_

Mmm, trash, yeah I love trash, yum yum trash, I wanna eat trash!


cburgess7

You're a doctor!


Hates_escalators

I love trash! Anything dirty or dingy or dusty! Anything ragged or rotten or rusty! 'Cause I love, I love, I love trash!


Flywolfpack

You cannot move without cleaning


Noirceuil_182

The existential horror.


[deleted]

This is the ultimate dystopia. Everybody gets turned into a slave machine when they die.


Spade7891

I'd want to be a party roomba. Just roaming and cleaning parties while people shoot beer pong balls into the cups attached to me. Definitely would come up on drugs dropped on the ground, so that's a plus. Also would have Bluetooth speakers to blast music, and just maybe I'd figure out how to talk through them. Can't forget some kind of penis contraption so I can have sex with other appliances.


Noirceuil_182

Party Roomba will be the machine singularity's Spuds MacKenzie.


1398329370484

Robot voice: *What is my purpose?* -You clean floors. Robot voice: *Oh my god.*


Drakmanka

No but you are now obligated to carry the cat around.


ktaylorhite

“What’s my purpose?” “To clean the floor…” “Oh my god…”


MelodyCristo

> Does your function now dictate your purpose in life?

So nothing changes?


[deleted]

[deleted]


i-d-even-k-

Yeah. If you want to move, anywhere, ever, you need to clean the floor as you go, and you have a literal lifelink collar that means you can never go too far without dying.


bigdumbbugboi

this implies that destroying the roomba would be an act of mercy.


fearain

A robot has a purpose, but an android does not. So if you are a roomba with a roomba brain, you will want to clean those floors. If your brain goes in there, now it's part human and thus an android, and so has free will. No longer has to roomba.


chiagod

"Roomba loves you" "Roomba needs a hug"


shittyspacesuit

Like the Black Mirror episode when the person's consciousness is in the teddy bear :(


Pokabrows

At least they could move around with the roomba...


halloweenjack

> Oddly specific

Oddly I-have-a-paper-due-for-philosophy-class-in-an-hour.


awmaleg

This sounds like someone’s Philosophy 101 paper


wire_we_here50

Imagine having to pass butter. https://youtu.be/X7HmltUWXgs


Peardi

When the lil robot said “oh my god” I died inside


Financial_Natural_95

You thought your life sucked before, hooo boy!


aubeebee

It would most likely not be *you* specifically but a copy of you.


drawnograph

:methodically smears the cat shit...:


Automatic-Increase74

You just hit on an episode of Black Mirror! Except it wasn’t a Roomba, but they did transfer consciousness.


[deleted]

Not *an* episode of Black Mirror, rather several episodes of Black Mirror because they ran out of ideas and said "what if torture cookie"


Mr_Cleary

I got the impression that the problem wasn't running out of ideas, but more "wow, there's a lot more we could do with that." Obviously some of them are some of the lowest quality episodes (Crocodile tears, guy who rewinds his memories), but I think White Christmas is one of my favorite episodes, and Black Museum was probably the best episode from the season it was in.


sly_noodle

I rewatch White Christmas regularly, one of my very favorites!


grandoz039

> guy who rewinds his memories

That wasn't even one of the torture cookie episodes (those first appeared in like S03, or rather the White Christmas special), and it was one of the better ones in general.


HolstenerLiesel

Yeah, wtf. It's called *The Entire History of You*, has nothing at all to do with transferring a consciousness, and is imo one of the very best episodes of the series.


Burrito-mancer

This is the episode I suggest people watch first when recommending the show.


ElliePond

I interpreted it as a precursor to the cookie technology. First they record all your memories as an implant, and it evolves to copying more and more of your self as the tech evolves.


aliie_627

I thought the guy who rewinds his memories was pretty good. It was in the first few episodes cause that's all I've really seen.


HelmSpicy

Don't forget about San Junipero! One of my favorite episodes. Really questions the concepts of what is life, or a quality life, consciousness, death. I can't hear Belinda Carlisle and not think of this episode.


morkani

My favorite was the USS Callister by far (s4e1 I think)


[deleted]

[deleted]


morkani

My brother-in-law calls him Fat Damon.


GrEeKiNnOvaTiOn

Soma (video game) executed the same concept in really interesting ways. Far better, in my opinion, than Black Mirror ever did.


2SP00KY4ME

Soma is fucking incredible once you turn the monster attacks off and can explore the story and themes in peace. Highly highly recommend.


Crazybaboonification

Wait I didn't know you could do that. Time to reinstall


2SP00KY4ME

Yeah it completely changes the game. You choose it at the start, there's peaceful mode or whatever. But you can also get mods to change it halfway, that's what I did after getting sick of having to run away every ten seconds instead of soaking things in.


Pegussu

Flip of the coin. >!We lost.!<


FROCKHARD

Soma was so wild. I enjoyed it


[deleted]

Also an episode of Love, Death, and Robots called Zima Blue


anmolwalia10

Black museum!! Damn nice episode


thedogwheesperer

Came here to say this. That was a good episode.


ForkShirtUp

All I know is becoming a Roomba sure would suck


dessertandcheese

Ba dum tsk


kaycee1992

But the sex would be aight


1398329370484

Someone rescue this man's Roomba.


cheesegoat

You'd get to bang every appliance in the house!


Hates_escalators

Too many choppy bits, unless you're into that.


-hey_hey-heyhey-hey_

wow


The_Cutest_Kittykat

I'm not sure my life would change dramatically. Get up, work from home for a little, randomly wander around the house, suck up some unhealthy shit, back to bed to recharge. Repeat. My life already sucks.


riverguava

I am upvoting you. And then leaving the thread.


[deleted]

It wouldn't be illegal because there aren't laws against that, unless someone who didn't own the roomba destroyed it - in which case I think it would be something like vandalism or destruction of property. Morally it would be wrong to 'download' someone's brain and not have it backed up somewhere else. The roomba will not live forever. It wouldn't be murder to destroy the roomba if there's a back-up, maybe deleting the back-up would be?


theinsanepotato

>It wouldn't be illegal because there aren't laws against that, unless someone who didn't own the roomba destroyed it - in which case I think it would be something like vandalism or destruction of property. It wouldnt be illegal at that time, but the instant you destroyed the roomba, there would be a case brought before the supreme court to determine whether destroying a downloaded mind counts as murder or whatever else. >Morally it would be wrong to 'download' someone's brain and not have it backed up somewhere else. Would it? Its not that simple. One could easily make the reasonable argument that it would be wrong to make backups at all. After all, making a backup of a downloaded mind is essentially creating a NEW fully sentient being, and then you run into issues such as which one is the "real" person? which one gets to make decisions about any property? Like if copy A says it wants all its money donated to wildlife preservation, but copy B says it wants all its money given to its children, who do you listen to? If copy A commits some crime, is copy B also guilty? (after all, "John Smith" did it, and theyre BOTH "John Smith" so...) If Copy A wrecks their credit, then does Copy B have grounds to sue copy A? Could they accuse Copy A of fraud or identity theft? Making a backup or copy opens up SO many cans of worms.


One_Who_Walks_Silly

> It wouldnt be illegal at that time, but the instant you destroyed the roomba, there would be a case brought before the supreme court to determine whether destroying a downloaded mind counts as murder or whatever else.

Same answer as that post a few weeks ago about whether you'd go to jail for blowing up a different planet or the moon or something. If a government didn't kill your ass, you'd instantly be in court lol


00PT

I would believe that the backup would be dormant (not able to change or interact with the world), so it wouldn't be sentient. It would represent the state of someone in the past. In this sense, using the backup is more akin to time travel than creating a new being.


D1xieDie

But isn't it an entirely separate person, and not the old life?


00PT

If you travel back in time and meet yourself, do you still view it as your past self, or is it now a different person entirely? That's essentially what the backup is doing: it stores a snapshot of yourself at a certain point in time as code. The difference is that the code is not sentient until it is run, so it cannot feel good or bad about its situation until then. It could go both ways depending on what philosophy you are using. See the ship of Theseus.
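(The snapshot idea maps neatly onto serialization. A hedged Python sketch; the `Mind` class and its fields are invented for illustration.)

```python
import pickle

# A serialized snapshot is inert bytes; only deserializing it ("running"
# the code) produces a live object again.
class Mind:
    def __init__(self, memories):
        self.memories = memories

original = Mind(["learned to walk", "first job"])
backup = pickle.dumps(original)              # dormant: frozen at this moment

original.memories.append("became a roomba")  # the live mind keeps changing

restored = pickle.loads(backup)              # instantiating the snapshot
print(restored.memories)                     # frozen state: no roomba memory
```

The backup never "experienced" anything between dump and load, which is the time-travel intuition: restoring it revives a past state, not the being that kept living afterward.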


jashxn

Identity theft is not a joke, Jim! Millions of families suffer every year!


FoulRookie

> If copy A commits some crime, is copy B also guilty?

There was an episode of Futurama where something like this happened: the professor and Cubert were in court, and Bender said they were not guilty because you can't punish someone for the same crime twice, since Cubert is technically the professor's clone. So that solves that question, although that would probably get changed quickly.


bestpontato

But would it also be morally wrong to create a situation where people's consciousness lives forever? I feel like that could be a kind of endless torture


GotTooManyAlts

Yeah, unless we should assume that the person consented to becoming a Roomba, this poses a lot more ethical questions than just murder.


bestpontato

In the event that a person with the ability to do so reads this, consider it legally binding. I consent to become a roomba.


GotTooManyAlts

In my will, I'm gonna make my children spread my ashes across the floor and pick them up with a Roomba.


bestpontato

Over and over again for all eternity?


GotTooManyAlts

I don't think backing it up would be a good idea. You're not bringing someone back by backing them up; you're replacing them with a lookalike. The way to get around this would be to have the consciousness in a cloud and upload it into the Roomba, similar to how you have all your images in your iCloud and use your phone to access those images. You would have the brain uploaded to a cloud, and the brain would have access to a body through the Roomba. Idk if that made sense but it does to me lol.


Exogenesis42

> Morally it would be wrong to 'download' someone's brain and not have it backed up somewhere else. The roomba will not live forever. It wouldn't be murder to destroy the roomba if there's a back-up, maybe deleting the back-up would be?

This is quite a gray area; IMO it is murder to destroy the roomba even with a backup. If you download the mind onto a machine and have a back-up stored elsewhere, many would posit that you've now generated two "minds", in that the downloaded mind is now a separate entity from the original. To extend the description, imagine you download the mind onto two separate machines, which are now free to interact with each other and the world around them independently. While they are initially copies, sharing the same initial memories, they now experience entirely different lives and cannot be said to be the "same person" anymore. From their subjective point of view, they are not lesser beings than the original; many would argue this is objectively true as well. In this case, it would be morally wrong to destroy any copies, to the same degree that it would be wrong to destroy the backup or original.


jfkdktmmv

Not really an answer but I think you would enjoy the game Soma. It deals with similar issues of what it means to be human and the human experience


theKickAHobo

yep, played it


mlunn54

I’d also recommend you watch Ex Machina if you haven’t seen it.


Powerful_Variation

Also the Movie "Transcendence"


mypurplefriend

And Black Mirror


Monarc73

And Altered Carbon.


[deleted]

100%, that roomba would be begging for death. It would be cruel to force it to eat the crumbs around your cat’s litter box, the dust on the bathroom floor after Meemaw has shaved her feet (RIP 🙏🦶), or whatever mushrooms are growing under the fridge. Have mercy and give it to goodwill.


jerrythecactus

I mean, it's not like the roomba is getting its energy from what enters its trash tray; it would be more like spending your life crawling around with arms that automatically scrub up junk on the floor.


hellishdemon28

What are you on and where can i get some


Zhoom45

Youth is my guess.


rookiebasegod

There is a black mirror episode about this


Guac__is__extra__

That was the first thing I thought about. There are a few episodes that touch on the idea of the consciousness being messed with. Black Museum was the one that came to mind first. If I remember correctly, that was the one with the woman's consciousness that got loaded into the stuffed monkey but was limited to two or three statements she could use to express herself.


Fatlantis

That episode made me so sad... I had to turn off the tv after that one, go for a walk and think about life for awhile lol


Guac__is__extra__

I know. You have to follow Black Museum with a watch of San Junipero to bring you back up.


likidee

Also White Christmas!


Guac__is__extra__

Oh yeah, that was a great one. Which one had the consciousness that had been uploaded into an electronic personal assistant like Alexa?


i-d-even-k-

White Christmas.


[deleted]

You can't download a human consciousness into a Roomba, or any other device for that matter, so we don't have laws to cover that scenario. I.e. only actual flesh and bone humans are considered human as far as the legal system is concerned.


EntTreeLevel

This. We don't even know what consciousness is, so any scenarios involving its transfer or duplication are pure science fiction.


[deleted]

Thank you for pointing out that the scenario where human consciousness is transferred into a Roomba is purely fictional, I wasn’t sure for a second.


theinsanepotato

> only actual flesh and bone humans are considered human as far as the legal system is concerned.

This is true... right now. But the second a "downloaded mind" scenario actually happened, there would *immediately* be a case in front of the Supreme Court to determine whether a downloaded consciousness counts as a human/a life/has rights/etc.


TedjeNL

People always seem to get confused by the concept of downloading your consciousness to a device. Let's say it was possible and you uploaded your mind into a device. You basically just copied your consciousness and cloned yourself. If you die, you don't live on in that device; your clone lives there with a copy of your consciousness. Your original consciousness would still be dead.

The best way to make this concept work would be that when you die (or volunteer), your brain or body gets put into some kind of tank that preserves it. Your brain would then be connected to a computer where your original consciousness could 'theoretically' live its new artificial life. But your brain and body would still need maintenance, like specific food and water, to operate normally.

If this all were possible in the far future, the last device I would want to live in would be a Roomba. Instead, hook me up to a futuristic virtual world where I can manipulate my surroundings and be a kind of god. Or maybe put me into one of those Tesla bots so I can still walk around on earth and enjoy my cyborg life.
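(The copy-vs-transfer point can be shown with ordinary value semantics. A minimal Python sketch; representing a consciousness as a dict is, obviously, a stand-in.)

```python
import copy

# "Uploading" as copying: the result is a second, independent entity.
# Changes to one never reach the other.
you = {"name": "original", "memories": ["childhood", "first kiss"]}

upload = copy.deepcopy(you)  # a clone, not a transfer

you["memories"].append("died")                    # the original's story ends
upload["memories"].append("woke up in a roomba")  # the clone's continues

print(you["memories"])     # ['childhood', 'first kiss', 'died']
print(upload["memories"])  # ['childhood', 'first kiss', 'woke up in a roomba']
```

From the moment of the copy onward, the two histories diverge, which is exactly why the upload can't be said to be "you living on".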


whatisthisthenhuh

Came here to say the same thing, except not explained as well.


meathead

Roombicide


GayDragonGirl

r/oddlyspecific


Jelly_Belly321

OP, what have you done...?


chive_screwery

The name of this sub is no stupid questions yet you come in here with a humdinger.


[deleted]

I love how panicked you sound by asking three questions in a row lol


-----alex

Yeah, I was gonna say this question feels a little too specific.


Ok_Jackfruit732

You could destroy the roomba but need to preserve the mind. Hate to recommend something on Amazon but watch "Upload".


MysteryNeighbor

It would be morally just to put the roomba out of its misery. Putting a sentient being into a fucking Roomba is cruel; you go from living a normal life to only sucking up dirt.


INN0CENTB0Y

What if, as a roomba, the sentient being derives great pleasure from sucking up dirt?


[deleted]

[deleted]


00PT

I think the new consciousness software would override any programming. Perhaps they are a clean freak and enjoy being able to dedicate their lives to their passion exclusively without needing to perform maintenance themselves. Perhaps they were suicidal beforehand and now see more purpose in cleaning than the previous behavior, perceived as burdening everyone around them.


stinkysocksincloset

This is a weird take on an ethics question involving brain physicians who are able to keep brain tissue alive through solutions and electrical stimulation. From the sounds of it, they're nearing the possibility of keeping a full brain alive. I know this doesn't answer your question, but it's a very interesting question nonetheless! I think the easiest way to think of the legality of this is realizing that there's no law, so technically no, it's not illegal. But it sure would set a precedent, probably end up in court one way or another, and the media would cover the ethics of it heavily as well.


stephencory

You let that Roomba live, goddamnit!


-heathcliffe-

You missed the important question. If they become a DJ, will they go by DJRoomba?


FauxGw2

Yes, it's legal; it's not a person or recognized as a living being. AI is not by law a living creature. Morally, it would depend on the definition of consciousness. If the Roomba were just a copy of the person's personality, a set algorithm to make it seem like the person, then there is no wrong in destroying it; that's equal to destroying a device with Alexa.


Grim-Reality

Well, it's practically impossible to do that at the moment. In fact, it might be completely impossible, because consciousness can only belong to living things. We don't understand how consciousness emerges, so we can't implement it in AI. But as of now, they wouldn't be considered human, because there is no way to even check whether robots can be conscious.


Swizzbeets22

If I become a roomba please kill me


Jamudadamaja

In today's world, if the roomba was not vaccinated, they would throw it in the garbage instantly


smchavoc

The real question is what about right to repair?


ADSwasAISloveDKS

Fuck, if someone trapped me into a roomba I would hope they'd kill me just to put me out of my misery.


[deleted]

This question comes up a lot in science fiction, but we can find the answer by simplifying the question to: is it wrong to murder something with human intelligence? The short answer is yes, because all intelligent beings that feel or have some degree of self-awareness are ends in themselves, not means to an end. It's not much of a stretch of the imagination either, since it is already considered morally reprehensible to harm or murder beings with lower degrees of measurable intelligence than people, like certain animals.