MIT Researcher Proposes Rights for Robots


If robots ever do become indistinguishable from humans, then are they not themselves alive? Consider an AI so advanced that it could feel emotion and act upon it. Is that not also what defines us as humans?

Daverson:

CardinalPiggles:
As for robots with rights: Call me old fashioned, but I just think that's pants on head retarded. Even if they can think independently (which I imagine is a way away) they're still not living creatures. Does a TV have rights? Does a PC have rights? Does an android phone have rights? No because they aren't alive. Sounds stupid to me.

I know! It's just like that time they gave those savages from Africa rights! It's like they don't even know that only true Christians have souls.

(That was sarcasm, by the way.)

Was it??? Because I couldn't tell?!?!?!?!?!?!?!?!

Why not? The way I see it, both computers and the human brain run on electrical impulses taken in from external stimuli, so one could argue robots have intelligence, however limited. So yeah, yay for robot rights.

Spartan1362:
Humans are nothing more than highly advanced biological robots.
As such, I support this.

Stephen Hawking:
Though we feel that we can choose what we do, our understanding of the molecular basis of biology shows that biological processes are governed by the laws of physics and chemistry and therefore are as determined as the orbits of the planets. Recent experiments in neuroscience support the view that it is our physical brain, following the known laws of science, that determines our actions, and not some agency that exists outside those laws. For example, a study of patients undergoing awake brain surgery found that by electrically stimulating the appropriate regions of the brain, one could create in the patient the desire to move the hand, arm, or foot, or to move the lips and talk. It is hard to imagine how free will can operate if our behavior is determined by physical law, so it seems that we are no more than biological machines and that free will is just an illusion.

I agree strongly with this.

Machines don't live? Chemistry and physics teach us that the human body is basically nothing more than a biological machine.

Yes, machines are programmed to act the way they do. Sociology teaches us that so are we humans. We are programmed from birth by our parents, siblings, teachers, peers, the media... to think and feel the way we do.

The logical conclusion is to support our robotic overlords.

Wait, that means that Ultimate Robot Fighting would never have the right to exist!

Fuck that, I want future generations to be able see Optimus Prime fighting Mecha-Godzilla for sport!

I think it's a good idea to get something in place before we create any sort of robotic AI but that is a long long time coming :|

I have a crazy idea. Don't make a true AI and you won't have to worry about the ethical and legal ramifications of robot rights.

I got this brilliant mental image of the battle-hardened colonel with manly tears streaming down his face as the bomb-detecting praying mantis robot takes one for the team. :P

Everything has got to have rights these days, even if it's not even a living thing... Jesus...

Could someone tell me why? Why the fuck would we make robots that deserve or desire rights? What the fuck is the point in that?

Quaxar:

FREE THE ROOMBA!

On an unrelated note... I wonder if she's related to former head of MI6 Darling. Probably not. I don't know why I even brought that up...

WHY DID WE GIVE THEM GUNS!?


It's ever so slightly premature, but I would have thought that if robots ever get to the point of using genuine AI, we'd have to give them rights closer to those of humans than of animals.

Sorry, but that's bullshit. Just because we project emotions onto things doesn't mean they have them or deserve to be treated as if they do.

We may project onto some animals, but we know that other animals do in fact have feelings. The law reflects this. No one is going to prosecute you for torturing an insect or sticking a sharp hook through the face of a fish, but you will be punished for abusing something like a dog or an ape. These animals do have feelings and thoughts; we don't just project them. When they are abused they show obvious signs of emotional response and even mental damage. A fish, however, may feel fear but little beyond that. We don't suddenly say that fish need rights because we feel sympathy for Nemo's father in Finding Nemo.

When robots get more advanced perhaps they will deserve rights, but we are a very long way off from that and merely our own projection is not a reason for us to give them rights.

Wondering how many people actually read the paper, read the article about the paper...or just saw the title and decided to post.

"My toaster doesn't need to vote...Hurrrr!"

The sheer amount of knee-jerk technofear shown here is quite frankly one of the better cases for promoting this type of incremental advancement. As a species/society we simply aren't going to be willing to share our "god-given rights" (their words, not mine), and it's going to be easier to make small changes as necessity demands than to ignore the issue outright until it's too late.

"Animal Rights" and "Children's Rights" are forged by, well... Human grownups (!) but A.I. is by definition able to communicate.

It's gonna get real when the robots speak their piece.

It seems a bit premature; robots can't think (yet). But when there are sentient, self-aware robots (like Data from Star Trek), then yeah, they deserve all the rights of a self-aware being.

We should focus on equal rights for humans first though I think, maybe when we get that right we can move on to robots.

Not sure how this is really news... many people, including myself, have published articles and discussions about this very subject in great detail. Nothing really new, but hopefully it's something we will continue to look into, especially as computers and robotics start getting closer to emulating human/living characteristics.

So where are our rights for viruses[1], then? Since we're apparently giving rights to things that are barely self-aware (maybe not even that) and only react to stimuli rather than out of any actual intellect. And I'd say viruses need rights far more, since unlike robots they're the victims of a near-constant genocide.

She is right that people tend to get emotionally attached to robots though. Anyone heard of Sergeant Talon?

[1] Biological ones, though I suppose this could apply to digital ones as well.

"Shepard Commander, does this unit have a soul?"

Personally, I say it couldn't hurt to have a few basic rights for robots. They may not be advanced enough to care yet, but it could help smooth over any issues later on if they do indeed gain human-level intelligence (or if alien robots that may or may not look like Earth vehicles show up).

gritch:

sethisjimmy:
Pets are actually alive, robots just mimic life. Just because we can get emotionally attached to something doesn't mean it's now an entity on the level of animal life.

Actually, I would argue it's our ability to project our own feelings and emotions onto another object/animal/being that defines our entire conception of ethics and rights. Ethics, for me, have always been derived from our own personal desires. I would not like to die; therefore I consider you killing me to be bad. I am also able to project my own thoughts onto other objects, animals, or people. The logic would go: I don't like to die, so I bet that guy over there wouldn't like to die either. The ease with which I can empathize with the object determines the degree of "badness".

As a human I can easily empathize with another human, therefore I consider killing another human very bad. Most humans are also able to empathize with animals of higher intelligence (dogs, cats, etc) and those that share many common features with humans (chimps, apes). Most people would consider killing these animals to be bad, at least far worse than killing animals of lesser intelligence and alien features (such as insects). Often people can form strong familial bonds with such animals (pets as they're called).
Anything that people can empathize with strongly enough will incite an ethical response and thus deserves rights. If people can empathize with non-living robots as well as they can with animals, those robots deserve rights on par with animals. Claiming something deserves rights only because it's alive is nothing more than a rationalization.

Geez sorry about the long post. I guess I just find this topic very interesting.

You actually really made me think with this. I came in here wanting to say it was stupid, but now I'm not sure I believe that. Somebody actually convinced somebody of something online. That's crazy. Anyway, I'm going to ramble for a while. What defines morality and ethics, and what should grant rights? Is it intelligence? If so, shouldn't hunting and the entire meat industry be shut down? But I don't feel that way personally, yet I would be appalled at someone killing a dog for no reason. Perhaps, then, it is the projection of myself onto a relatable object or organism, as the person I quoted has said. If so, then you are correct that ethics should be considered for non-living things. But what would the repercussions be? Video gaming might become inhumane in the eyes of the law, machinery couldn't be used in factories, and hitting your computer when it isn't working would be illegal. Obviously these ideas are ridiculous. Perhaps, then, a set of rights should be created that isn't that of a human or of a pet, but of something completely different.

Winthrop:

You actually really made me think with this. I came in here wanting to say it was stupid, but now I'm not sure I believe that. Somebody actually convinced somebody of something online. That's crazy. Anyway, I'm going to ramble for a while. What defines morality and ethics, and what should grant rights? Is it intelligence? If so, shouldn't hunting and the entire meat industry be shut down? But I don't feel that way personally, yet I would be appalled at someone killing a dog for no reason. Perhaps, then, it is the projection of myself onto a relatable object or organism, as the person I quoted has said. If so, then you are correct that ethics should be considered for non-living things. But what would the repercussions be? Video gaming might become inhumane in the eyes of the law, machinery couldn't be used in factories, and hitting your computer when it isn't working would be illegal. Obviously these ideas are ridiculous. Perhaps, then, a set of rights should be created that isn't that of a human or of a pet, but of something completely different.

Glad I got you thinking about it. I guess some people actually DO read my posts!
You bring up some interesting scenarios, but I believe that with some careful consideration we can eliminate some confusion. While it's true an individual might feel strong affection toward a completely inanimate object (such as a named car), they don't actually seem to be projecting their emotions onto that object; rather, they're considering that object a possession of theirs. Harming or stealing another's possession is indirectly harming the person in possession of said object. These objects do not deserve rights themselves, but harming them indirectly harms a person instead. Harming the object itself isn't bad, but harming that person's object is bad.

To address your issue with video games, we have to make a distinction between imaginary and real constructs. When someone reads a book or plays a videogame, one is conscious of the fact that characters and plots within are not real, no matter the degree of emotional attachment they incite. We can always flip back a few pages or restart the chapter and these characters are back to where they were. For imaginary characters recorded in media there is a finite amount of actions they can/will perform. Most people are able to realize this and to a degree disassociate their own emotions. Killing off your most beloved character might cause anger or frustration towards the writer but most people would never think the writer morally corrupt for it.

Also, machines that don't remotely resemble humans/animals (such as in your examples of factories or computers) aren't likely to cause emotional responses in people. You'll probably never see yourself jailed for harming your own computer.

And this post is even longer than the first. Oh well. I could talk about this topic all day.

