Objectivism Online Forum
Seeker

UK report says robots will have rights


Should intelligent robots have rights in the future? The director of the Centre of Robotics and Intelligent Machines at the Georgia Institute of Technology thinks so. He says, "If we make conscious robots they would want to have rights and they probably should." Others go on to suggest that states will be responsible for providing robots with "full social benefits". I found this story preposterous, because robots are neither alive nor sentient, regardless of how well they may be programmed. But I would like to know what others think of it.

UK report says robots will have rights


Well, if someone were to create a robot similar enough to a human that it required the freedom to take certain actions in order to "live" (or whatever equivalent word would apply), actions that did not involve breaching another being's rights, then yes, I do suppose so.

I do not think a robot should have rights merely because it is conscious. For instance, someone could create a robot that is conscious but can only act in a limited way, and which does not need to take certain actions to live; its means of existence could be provided for it. Or it might be conscious but have no ability to make moral decisions. Claiming that such a being has rights is nonsensical.

Also the article goes on about duties that the robots should "inherit":

"including voting, paying tax and compulsory military service."

Ah ha, so clearly whoever wrote that article does not understand "rights", if he thinks duties come hand in hand with rights. Duties and rights are polar opposites, incompatible; one does not "logically" imply the other.

"Mr Christensen said: “Would it be acceptable to kick a robotic dog even though we shouldn’t kick a normal one?"

Now he is implying that animals have the same rights we might grant to intelligent, conscious machines, or to people. I think he does not understand the source of rights: consciousness alone does not necessarily confer rights when we are talking about machines. A robot would have rights for fundamentally the same reason that humans do.

"“If granted full rights, states will be obligated to provide full social benefits to them including income support, housing and possibly robo-healthcare to fix the machines over time,” it says."

Humans do not have a right to these things, so there is no reason robots would either.


I imagine that if robots were granted the "rights" to welfare, food stamps, unemployment "insurance", etc. (even if they had a Calvinist work ethic), there would be welfare-, food-stamp-, and unemployment-insurance-collecting hordes demanding that production of new robots be rendered illegal and that existing robots be deported back to where they came from (to the end of the assembly line, which would then proceed to operate in reverse).


Oh man, now I have to put up with pan-handling cyborgs too! And you can't out-run them... they're like freaking machines or something!

In any case, assuming a rational consciousness with free will in a computer, I say they have rights. I don't see anything preposterous about the idea, either, since humans are supposedly entirely material beings which happen to have a particular construction that gives us rational consciousness and free will.


This thread reminds me of a science fiction novel, "The Humanoids", by Jack Williamson, which I read in the late 1950s. It depicted a futuristic society where robots pretty much dictated a 'perfect' law to men. It was like a Communist society, with the difference that the robots served the people but also restricted what the people were allowed to do; it was the ultimate 'nanny state'.


Every so often this discussion comes up. All I can say is: man derives rights from his faculties of reason and volition. Should any other being come to have those faculties, it will have rights. And not otherwise.

Every so often this discussion comes up. All I can say is: man derives rights from his faculties of reason and volition. Should any other being come to have those faculties, it will have rights. And not otherwise.
Still, I think that they may be right, on that article: the next time you beat your keyboard in frustration, it will be able to sue you for assault, and vacuum cleaners will have the vote. In the UK. They will probably have elected a dog (other than a poodle) to be PM by that time.

Still, I think that they may be right, on that article: the next time you beat your keyboard in frustration, it will be able to sue you for assault, and vacuum cleaners will have the vote. In the UK. They will probably have elected a dog (other than a poodle) to be PM by that time.

I think 2007 will be the year of "plant rights" myself. Those "don't step on the grass" signs are about to get some major muscle behind them!


If reason and volition are all that is required to have rights, then this article isn't too far out there. Robots will probably qualify for rights within the next 50 years.

If reason and volition are all that is required to have rights, then this article isn't too far out there.

There is absolutely no evidence that it is even possible for robots to ever have a consciousness, let alone a volitional and rational one.

This thread reminds me of a science fiction novel, "The Humanoids", by Jack Williamson.
It reminds me of the storyline from The Beast, where sentient machines, outnumbering humans 10:1, manage to push through the Mann Act II and gain voting rights. The problem was, they weren't conditional consciousnesses, so they ended up being the moral equivalent of Rand's indestructible robot. An unconditional consciousness requires no ethics.


Very seldom do I do this, but I don't think robots should ever have rights. Why?

Rights = Protection of Values

Ultimate Value = Life

Robots can not die.

Therefore, robots should not have rights.

Any disagreements?

Very seldom do I do this, but I don't think robots should ever have rights. Why?

Rights = Protection of Values

Ultimate Value = Life

Robots can not die.

Therefore, robots should not have rights.

Any disagreements?

Wrong. Rights are not protection of values. Rights protect humans from the initiation of force. Also, life is not man's ultimate value; it is his standard of value. The identity and nature of the life another hypothetical sentient, volitional being might have determine what its ultimate value is. Since a robot is a material being, it is capable of being harmed and of harming others; and since it hypothetically has a mind and volition similar to ours, it also possesses rights. Your syllogism doesn't hold.


This is something that always bugged me about I, Robot.

In the movie, Will Smith's character accuses a robot (Sonny) of murder and wants it destroyed. The other characters assure him that robots don't have volition and so could not have committed a murder. The movie goes on to explain how "ghosts in the machine" could possibly have created a sentient and rational robot. The audience is left to assume that Sonny is one of these.

Will Smith's character, however, does not want to treat Sonny like a rational, volitional being. It is my understanding that this would qualify Sonny for rights, but he insists that the robot (though able to make choices, like murder) cannot have a right to live, or any of the derivative rights such as freedom of action, property, or even a trial to investigate the facts.

My family just said I don't know how to enjoy a movie.


[WARNING: CONTAINS PLOT SPOILER FOR "BLADE RUNNER"]

If there were robots as depicted in the movie Blade Runner, they would have to have rights. Blade Runner is the story of a very human-like robot, designed with only a 4-year life-span, who seeks out his human creator to find a way to live longer. The job of Harrison Ford's character, Deckard, is to kill the robot and his companions.

The climactic confrontation scene between Deckard and the robot, played by Rutger Hauer, is one of my all-time favorite movie scenes. The speech by the robot, given before he dies, and Deckard's follow-on soliloquy, artistically make the case for "robot rights".

However, all of this pre-supposes that robots could display the sentient characteristics that would make them require rights. That is a big supposition, and is really just a matter of speculation given the state of computer and bio-mechanical technology today. However, if such robots were possible, Blade Runner dramatizes why they would deserve rights.


The "robots" in Blade Runner (the movie) aren't robots at all! They're genetically modified human beings. They're clearly organic in nature and possess a human consciousness. The major distinction is that memories are "planted" in them rather than experienced.

Edited by GreedyCapitalist


I stand (largely) corrected. The replicants in "Blade Runner" are not robots, but they are not human, either. Rather, they are genetically modified humanoids. In fact, the story centers around the most recent "model" of replicants called "Nexus-6".

I got confused because the replicants in the movie are awfully robot-like in some ways, with model names, eye tests to distinguish between human and replicant, and several instances of robot-like behavior. (Three examples: (1) when Deckard kills the gymnastic replicant and she thrashes around in a mechanical way; (2) when the lead antagonist shoves a nail into his hand to delay his death; and (3) the super-human strength displayed by the lead antagonist, which is "typical" of science fiction robots.)

In any case, as nearly-human humanoids, with consciousness, etc., they don't serve as a good example for this discussion.

A long time ago, I read the book that inspired the movie, "Do Androids Dream of Electric Sheep?" The title of that book leads to the question discussed in this thread: If androids dream, are they human enough to have rights? Or, to state the question more broadly, if a creature had consciousness and other human traits, but was a mechanical entity created by man, would it have rights?

