Objectivism Online Forum

I, Robot


I recently saw 'I, Robot' and was very disappointed. The movie states that it is "suggested by the book 'I, Robot' by Isaac Asimov." I read Asimov's Foundation series years ago and have forgotten essentially all of it. All I remember is that it ends with a sort of merger between his Robot and Foundation books, and the ideal world is some sort of environmentalist fantasy where every cell on the planet is linked into a collective consciousness. So I know Asimov can get wacky. But I am curious whether he was as anti-technology as this movie would indicate.

The theme is something along the lines of what happens if man's creations should become sentient and seek to rule him; i.e., what happens when a creator's creation seeks to replace its creator. The religious parallel is obvious (for those believing in God). But the movie version goes out of its way to spit at capitalism and business and, IMHO, technology. Was Asimov that bad? I get the sense that he wasn't. But the useful idiots in Hollywood couldn't resist the opportunity to slander business. Students of free-market economics will love (sarcasm) Will Smith's (Detective Spooner's) tirade against business "squashing the little man." I think they'll also love (more sarcasm) his '2004 retro black Converse All Stars' and everything they represent (hatred of technological progress).

The movie has great CGI effects and little more. 'Sonny' the robot was the most appealing and likeable character. It can't be a good sign when a movie's best actor isn't even a SAG member.

One last point that will probably raise some eyebrows. I am not a racist who thinks the movie shouldn't have had a black lead because Asimov didn't write it as such, or that a black actor would demean it (in fact, I am really looking forward to Denzel in The Manchurian Candidate, although I fear what Hollywood will do to the theme). Far from it.

However, there was an ethnic element to this movie (in this case Will's 'blackness,' although thankfully the Ebonics was kept to a minimum) which irritated the hell out of me. There was at least one reference to being black (and mind you, the year is 2035), and Will Smith's entire character is a depiction of an angry black man. I may be wrong, but I feel that Hollywood, the alleged defender of the oppressed and the underdog, wanted to attract a minority demographic to this film that it thought would not otherwise be interested because of the science-fiction theme. Therefore they cast Smith and seemingly forced him to have a major "ghetto" attitude throughout the whole movie. This contrast between a complicated and serious (even if wrong) sci-fi theme and Will Smith's (and the movie's) blatant "blackness" killed I, Robot for me.

Why can't Hollywood conceive of an articulate, dignified, elegant black man as a hero? Haven't they seen the success of Sidney Poitier and Denzel Washington?

I am sure there are members of this forum who will have a different take, but the PC crap in this movie ruined it for me, even more than its anti-technology and anti-capitalism themes.


I found I, Robot to be a very mixed film. First of all, it has very little basis in Asimov. Asimov was a masterful plotter, and he was pro-technology. The film's biggest flaw is a giant plot hole--something Asimov would never be guilty of.

I think they chose a black detective for a reason. Spooner is distrustful of robots in the same way that whites were distrustful of blacks, the movie suggests. This paints his distrust in a slightly more irrational light and heightens the suspense. I think it was a poor choice, nonetheless, but they were trying to do something with it artistically.

Also, the anti-technological elements of the film came in the form of isolated lines of dialogue and didn't make it into the theme. By the way, I disagree with your formulation of the theme. I would put it, "the importance of the heart over logic." I didn't notice a religious parallel, and doubt whether one was intended.


I found I, Robot to be a very mixed film. First of all, it has very little basis in Asimov. Asimov was a masterful plotter, and he was pro-technology. The film's biggest flaw is a giant plot hole--something Asimov would never be guilty of.

I think they chose a black detective for a reason. Spooner is distrustful of robots in the same way that whites were distrustful of blacks, the movie suggests. This paints his distrust in a slightly more irrational light and heightens the suspense. I think it was a poor choice, nonetheless, but they were trying to do something with it artistically.

Also, the anti-technological elements of the film came in the form of isolated lines of dialogue and didn't make it into the theme. By the way, I disagree with your formulation of the theme. I would put it, "the importance of the heart over logic." I didn't notice a religious parallel, and doubt whether one was intended.

Some fair points. I like your theme as the main theme of the film, although I think a subtheme of this movie, and of all the myriad sci-fi books and films that have dealt with sentient AI, is that mankind must be wary of playing God: a creator's creation will ultimately surpass its creator. The religion comes in with the belief in a creator, although you're right that religion was not explicit in this film.

I think you're right, too, about Spooner being black as a device to underscore his prejudice against the robots. But why did they choose that tack, and why did they have to make him so annoyingly "ghetto"? I guess skull caps and do-rags will still be big in 2035. You're right, but IMO this movie had PC written all over it, and my point is that it's more disparaging to blacks than if Spooner had been on the model of Sidney Poitier.


I think they'll also love (more sarcasm) his '2004 retro black Converse All Stars' and everything they represent (hatred of technological progress).

Hey now! :P

I hope you are not suggesting that there is something wrong with wearing Chuck Taylors. :)

As far as I can figure, all my Chucks represent is good value for a comfortable and durable knock-around shoe.

As for "I, Robot," I read the book many years ago and have not yet seen the movie. I think I am going to have to go back and reread the book before I see it.

Having said that, I really liked the movie "Blade Runner," both the theatrical release and the director's cut. The film is "based" on Philip K. Dick's novel "Do Androids Dream of Electric Sheep?"

The film is similar to the book, but the underlying themes, I think, are dissimilar. In fact, I think the movie has much more of a sense of life than the book. 'Course, it has been a long time since I watched the movie or read the book.

I expect Looterwood to put out films that pervert or destroy any sense of life in a book when it is brought to the big screen.


Replying to the Manchurian Candidate remark.

They aren't just going to keep the same anti-technology theme; they're going to turn the film against capitalism itself. I haven't seen the movie, but it's easy to come to this conclusion from Roger Ebert's review, which says (paraphrasing) that the director has updated the enemies: as Communism was the enemy of the past, corporations are the modern enemies. He was talking about the story itself, but speaking as if it were reality.


Replying to the Manchurian Candidate remark.

They aren't just going to keep the same anti-technology theme; they're going to turn the film against capitalism itself. I haven't seen the movie, but it's easy to come to this conclusion from Roger Ebert's review, which says (paraphrasing) that the director has updated the enemies: as Communism was the enemy of the past, corporations are the modern enemies. He was talking about the story itself, but speaking as if it were reality.

I had a feeling they would make the "evil corporations" the enemy. That was too predictable. I love Denzel for his performances in 'Glory' and 'Crimson Tide,' but if the movie is that anti-capitalist, I may have to pass.

Thanks


I haven't read the story; I found Asimov too dry. However, the movie is exactly like the old feud between Bones and Spock on Star Trek. And it is not the corporation or the businessman that is the villain here this time, thankfully.

Throughout the movie there is Spooner, who is emotional ("human," i.e., Bones), and the female lead, who is Ms. Logic (non-human, i.e., Spock), and their converging opinions about the suspect (the robot). When the robot starts to show the usual -- empathy, emotion, et al. -- he is saved from termination by the Spock woman. He had shown that he could think, but it was the ability to feel that saved him.

Also, the villain of the movie turns out to be the central computer brain of the company USR (same thing as in The Matrix and The Terminator: do not hook up your machines and robots to a central computer intelligence! God, when are they going to learn!?), the ultimate in the non-human art of reason and logic, which "logically" deduces that mankind must be kept from hurting itself, and so the robots will take over and control the humans for their own good. It is man's capacity for the non-rational side (emotions) that dictates that he must be free, which the one good robot understands because he has this side to him too.

It is the same view as the notion that Soviet Russia was the product of science and logic, while America is the land of freedom because we are the land of faith. Science = slavery; faith = freedom. Logic is a straitjacket; emotions are wild and free.

I don't really follow the other poster's reference to the blackness; I found it very light. I doubt the reason for casting Will Smith was to court a black audience; it was for women like my wife, who sit there with dreamy eyes at him. And people go when he's in a film.

Compared to the other two movies I've seen this year (Van Hellstink, The Day After Tomorrow (pee-yew)), this wasn't too bad.


I don't really follow the other poster's reference to the blackness; I found it very light. I doubt the reason for casting Will Smith was to court a black audience; it was for women like my wife, who sit there with dreamy eyes at him. And people go when he's in a film.

You made a good review, and I agree with your description of the 'reason-emotion' dichotomy that the film embodied, but this statement is naive. There are so many aspects of this film geared towards black America that it isn't even funny. I don't know if you live in America or if you are Scandinavian (Loki?), but the 'blackness' of this movie wasn't light.

But as I said, I know most people won't agree with me on this. Capitulation to the multiculturalist left is a pet peeve of mine. It leaves a stench more pungent than horse shit.


Thanks for the reviews of I, Robot. I grew up reading Asimov back when I, Robot was new science fiction! I just reread it a month ago, and the memories of the initial enjoyment came back. I was afraid of what Hollywood might do to it (like it did to Starship Troopers). Thanks again; I think I will avoid the movie.


argive99,

Hee hee. Thoyd Loki is my writing name. And yes, Loki is a Nordic god whose name I got from a video game years ago. I am an American, and my name is Bob. You could be right about the black angle; I was merely saying it didn't strike me. Perhaps I was focusing on other things.

OldGreyBob,

Like I said, I haven't read the story by Asimov, but this was certainly no Starship Troopers puke. It is advertised as being based on the story, but it seems to me that there was at least some attempt to keep some of the original depth that I assume Asimov would have put in -- from what I can gather of his writing from others. I think they did the best that a group of people making a sci-fi film nowadays (leaving aside the Wachowski brothers) could do.

I wouldn't go out of my way to see it, but I wouldn't avoid it.


Why can't Hollywood conceive of an articulate, dignified, elegant black man as a hero? Haven't they seen the success of Sidney Poitier and Denzel Washington?

I totally agree with you there. I first saw Poitier in "Guess Who's Coming to Dinner" and thought his speeches had such eloquence and passion. Washington I saw in Malcolm X, and while I obviously don't agree with the rhetoric in the movie, I still thought he gave a great performance. Will Smith as an actor doesn't come close; he should stick to comedies. I haven't seen "I, Robot." I did read the book a long time ago, but the reason I haven't seen the movie is that I saw an interview in which Will Smith described his character as "techno-phobic." I knew then that they had butchered Asimov's philosophy, and thought my money would be better spent elsewhere.


I think you all missed the main theme of the movie.

It was decidedly anti-collectivist, and the point was stated explicitly by the computer that was controlling all the robots and went haywire at the end:

When the robots were rounding up people and killing some of them, the computer explained (paraphrasing): according to the Three Laws, the robots could not allow humans to be injured. The computer determined that humans were injuring themselves (via the environment, etc.), so it concluded that in order to save the planet for society's sake, some individuals and individual rights would have to be sacrificed. And of course this inevitably led to rounding up all individuals.

So what it came down to was an army of collectivist robots, led by a tyrannical computer inspired by a socialist ideal, vs. all individual humans.

Many good Objectivist themes here:

- A socialist contradiction in the Three Laws led the computer to a faulty conclusion (contradictions are illogical; they don't compute).

- Socialist themes centered on the environment lead inevitably to the enslavement of all.

- Sacrificing the few for the sake of the many is tantamount to enslavement and murder.

Also, I thought the anti-technology theme was shown to be as irrational as its parallel: racism.

Overall I give the movie an enthusiastic thumbs-up, especially compared to most of the drivel Hollywood puts out.


I think you all missed the main theme of the movie.

It was decidedly anti-collectivist and stated explicitly by the computer that was controlling all the robots and which went haywire at the end...

Yes, the movie was anti-collectivist, even pro-individualist, and that was the main theme of the movie. But, as Daniel mentioned, individualism (according to the film) consists of following your emotion over your reason, and logic is on the side of totalitarianism.


Yes, the movie was anti-collectivist, even pro-individualist, and that was the main theme of the movie.  But, as Daniel mentioned, individualism (according to the film) consists of following your emotion over your reason, and logic is on the side of totalitarianism.

I see that now. When I first saw it, I overemphasized some of the anti-capitalist dialogue. After going through this thread, though, I realize that the movie is really, at root, dedicated to the reason/emotion split. I'm glad for the clarification.

I'm still ticked off by the 'blackness' thing, but I'll get over it.

By the way, I read some previews of The Manchurian Candidate, and they sound terrible. It's a leftist conspiracy theorist's dream come true. It sounds worse than Fahrenheit 9/11.

Corporations under Saudi influence try to plant an operative in the White House to legalize private militias and make a fortune.

Bleh.....


By the way, I read some previews of The Manchurian Candidate, and they sound terrible. It's a leftist conspiracy theorist's dream come true. It sounds worse than Fahrenheit 9/11.

Corporations under Saudi influence try to plant an operative in the White House to legalize private militias and make a fortune.

Bleh.....

Sounds like Moore wrote the script (or at least came up with the story line)! :P


Has anyone seen The Village yet? I honestly believe it sends out great philosophic values.

Major "The Village" spoilers.

0. First of all, this is M. Night's first movie that isn't just secular but actively anti-mystical. (This is what I got from it.)

1. One of my favorite themes is that you're unable to fake reality, and that the consequences of trying to make your own world are enormous and unavoidable. Also, creating your own world doesn't solve your life's problems.

2. The two main lovers of the film admit to each other that they care for one another more than for anyone else in The Village. Even though this doesn't seem so significant in the movie, it's quite funny that in this socialistic utopian village, these two, who are the most likely to venture outside of the Village and face a (fake) beast to get into the real world, value their lovers above the rest of the village.

3. We can't be stuck in the past forever; eventually the need for advanced technology must be recognized.

4. Though one of the lead characters said that money can turn men's hearts dark, he also admitted that his father, a billionaire, was an honest, loving, productive human being who, if given a dollar, could turn it into five. He was shot because of his ability. This character decides to ignore advanced, modern civilization because of his father's death at the hands of thieves; this backfires, though.


When the robots were rounding up people and killing some of them, the computer explained (paraphrasing): according to the Three Laws, the robots could not allow humans to be injured. The computer determined that humans were injuring themselves (via the environment, etc.), so it concluded that in order to save the planet for society's sake, some individuals and individual rights would have to be sacrificed. And of course this inevitably led to rounding up all individuals.

I hate to disagree, but the First Law was:

"A robot may not injure a human being or, through inaction, allow a human being to come to harm."

This permits no "sacrificing" of any individuals for others. Asimov actually thought this was a flaw of the Three Laws, since the First Law would protect individual humans only, and proposed a "Zeroth Law" which would say that robots have to act in the best interest of "humanity."

A quote from a third party:

"Asimov pointed out that under a strict interpretation of the First Law, a robot would protect a person even if the survival of humanity as a whole was placed at risk."

This never became a big part of any of the books in his series, though, from what I remember (a good thing). Asimov wasn't perfect, but the themes and ideas from his books would have been much better than this stuff.

This movie was fun on an action level, but it bore no resemblance to Asimov's books or his ideas. Nor was it meant to: the movie was originally based on a script by a different author, who had based his work on Asimov's, and then the studio bought the rights to the name of Asimov's book. Asimov wanted, in part, to get rid of the fear of technology and to provide an antidote to the common "man creates machine, machine kills man, man should have known better" theme. Not only did the movie do nothing to bolster the case of those in favor of technology, it hurt the case for technology even more (notice the subtle hit against the fear of nanotechnology: "robots creating robots -- that's insane," or something like that). Then, when you add in its attacks against "logic" (the only good robot was driven by emotions), I think you have a movie with a horrible theme.

Can it be a fun movie? Sure. But that's all.


First, I know very little about Asimov or his books. I am simply commenting on the movie "I, Robot".

Second, I am not claiming that the movie is a perfect distillation of Objectivist ideas; it is not. There were definitely some jabs at big business and logic.

What I am saying, however, is that the main theme of the movie, as stated by the computer, was that humanity needed to be saved from itself. This is a decidedly collectivist idea, one with which many leftists would agree. And how is this idea put into action? By enslaving (by all appearances in the movie) everyone. I was heartened by the explicit statement of collectivist thought and the demonstration of its consequences. As a bonus, the impetus behind this evil collectivism was environmentalist ideology -- thus exposing it as the collectivist propaganda that it is.

In answer to some specific points:

Travis

I don't remember the "robots creating robots" part; maybe you can refresh my memory. What I remember about the nanobots is that they helped save humanity by killing the evil collectivist computer.

As for Will Smith's anti-technology bent, it was paralleled with racism and thus shown to be irrational.

Ash

My recall of the movie may be lacking, so it would be great if you or Daniel could relate some specific examples of "following emotion over reason." I do recall that in the first part of the movie, the only instance of a robot harming a human was when it was given emotions. Then, at the end of the movie, there were two different kinds of robots. The old models, which followed pure logic, couldn't hurt or enslave humans. It was only the automatons listening to the faulty "logic" of a tyrannical socialist computer that could hurt humans.

Also, the "logic" used by the evil collectivist computer -- that humanity must be saved from itself -- was obviously faulty, since it led to the enslavement of people. So what it really shows is that the starting premise behind collectivism is faulty, not logic.


My recall of the movie may be lacking, so it would be great if you or Daniel could relate some specific examples of "following emotion over reason." I do recall that in the first part of the movie, the only instance of a robot harming a human was when it was given emotions. Then, at the end of the movie, there were two different kinds of robots. The old models, which followed pure logic, couldn't hurt or enslave humans. It was only the automatons listening to the faulty "logic" of a tyrannical socialist computer that could hurt humans.

Also, the "logic" used by the evil collectivist computer -- that humanity must be saved from itself -- was obviously faulty, since it led to the enslavement of people. So what it really shows is that the starting premise behind collectivism is faulty, not logic.

But the primacy of emotion over reason permeates the whole movie! The political implications of this (supposed) fact only become apparent at the very end of the film. Apart from those political implications, we see a down-to-earth Detective Spooner who is suspicious of the robots (while everyone else thinks Spooner is crazy--come on, robots just follow their logical routines). At the end, the supercomputer asserts that its actions are logical--and the only response the film offers to this claim is to show the emotion of Spooner and Sonny (the robot).

No, the logic is not obviously faulty. There was once a whole country which operated on that supercomputer's logic -- and promoted it as the scientific system of government. That country was Russia.


First, I know very little about Asimov or his books. I am simply commenting on the movie "I, Robot".

Beyond the 3 laws, any comparison is coincidental -- the movie stands on its own. In general, I tend to agree with your assessment of the movie. All in all I enjoyed it.

No one mentioned the ending part, where Sonny is standing on the hill and the robots below stop performing their orders to pay attention to Sonny. I thought that was an interesting touch.


Daniel, I think I already answered your questions, but I'll try again.

The movie showed Spooner (Will Smith) to be illogical when he was ruled by his emotions. It was illogical for him to be suspicious of robots, since none had ever harmed a human before. Remember at the beginning of the movie, when he chased down a robot because he thought it had stolen a purse? His suspicion was illogical, and the purse's owner and his lieutenant say so. His suspicion was irrational because it was caused by an emotional trauma. If the makers of this movie wanted to show the primacy of emotion over reason, would they have compared his irrational fear of robots to racism, which is also irrational, in a parallel theme? Do you think it was logical for him to be suspicious of robots?

Then we find out it was a robot who killed the doctor -- but not just any robot: one fraught with emotion. What I think to myself is, "Boy, these emotions can cause you to do bad things if they aren't controlled."

But if the primacy of emotion over reason permeates the whole movie, perhaps you could give me another specific example; I'm having a hard time recalling.

I consider the computer to be a tyrant operating under a faulty premise: collectivism. I'm sure it thought it was being logical, just as Hitler and Stalin thought they were, but that doesn't make it so. They were all operating under faulty premises.

No, the logic is not obviously faulty. There was once a whole country which operated on that supercomputer's logic -- and promoted it as the scientific system of government. That country was Russia.

I'm not sure what you are trying to get at here, but I think you are making my case. You didn't find the logic (or starting premise) of the computer to be obviously faulty? You don't consider the USSR's logic to have been faulty? I do identify their logic and starting premises as faulty. And for those who don't, it should become readily apparent when these collectivist regimes do what they inevitably must in order to enforce their edicts -- enslave and murder individuals -- just as happened in the movie.


Marc,

All of the facts from the film that you cite only support my point. Your interpretation of those facts is backwards.

The movie showed Spooner (Will Smith) to be illogical when he was ruled by his emotions. It was illogical for him to be suspicious of robots, since none had ever harmed a human before.
Exactly right. It was illogical for him to be suspicious of robots. He was letting himself be ruled by emotion. AND SPOONER ENDED UP BEING THE ONLY ONE WHO WAS RIGHT ABOUT THE ROBOTS. His emotion was right that the robots were dangerous; everyone else's logic was wrong that the robots posed no danger.

If the makers of this movie wanted to show the primacy of emotion over reason, would they have compared his irrational fear of robots to racism, which is also irrational, in a parallel theme?

I have already hinted at the reason for the racism theme. If Spooner were being logical, he would have been able to identify his fear as parallel to racism (the film suggests). Indeed, as a black man, he may even have conflicting emotions: he desperately wants to avoid doing to robots what others used to do to black men. But he lets himself be run by his strongest emotion, his fear of robots. Not only is this another instance of the ultimate triumph of emotion, but it heightens the suspense in the film by adding the double conflict between this emotion and Spooner's own logic and his own conflicting emotion.

Then we find out it was a robot who killed the doctor -- but not just any robot: one fraught with emotion. What I think to myself is, "Boy, these emotions can cause you to do bad things if they aren't controlled."
Now you're dropping context. It is revealed later in the film that it wasn't a murder: the doctor wanted to be killed so that the world would be saved.

And for those who don't, it should become readily apparent when these collectivist regimes do what they inevitably must in order to enforce their edicts -- enslave and murder individuals -- just as happened in the movie.

I would say it's obvious today to anyone who has studied history, but that is only because Russia made it a decades-long experiment. The film shows only a few minutes of tyranny, hardly enough to make anything obvious (on the logical, rather than the emotional, level).


  • 4 months later...

I haven't seen I, ROBOT (and I really have no desire to see it), but I can wholeheartedly suggest that everyone go out and buy the Harlan Ellison script that was written in the late '70s. Obviously, Ellison's version of Asimov's stories never got made, but he was the one who finally completed a script that no one else could seem to lick. Asimov himself touted it as a special accomplishment and described it as the first truly adult (read: mature) science-fiction movie.

What he would have thought about Will Smith's version I can only imagine...


I'd like to sidestep the argument of emotion vs. logic in I, Robot and address something else. I am utterly surprised that everyone has completely missed the link between the movie and Isaac Asimov's original "I, Robot" stories.

What was Isaac Asimov's serious legacy (i.e., aside from all of the science-fiction stories he wrote)? That of trying to make robots less threatening to the general American populace. As Asimov explained, he often encountered a special prejudice against machines that move and act like humans: a perpetual fear that something will go terribly wrong with these creatures, who seem to think like us yet are 100x faster and 10,000x stronger, who have an immortal skin of steel instead of our supple softness, and who can be manufactured by the millions instead of requiring individual and painful birth, not to mention a long and uncertain period of growth and maturity. People are, and have always been, intimidated by the concept of a mechanical human being.

So Isaac Asimov made it his goal to convince people that robots were not inherently dangerous and evil. He spent decades making his distinct and unique mark on the science-fiction genre, portraying robots in various situations, both good and bad, and thus arguing that they were not automatically yearning for domination over man, but instead more like man in their initial moral status. But Asimov's greatest weapon was his invention of the Three Laws of Robotics, which, if programmed into the robot's circuitry and thus impossible to bypass, would ensure that the robot never threatened humans and instead served its intended function with obedience. The robot would have no choice in the matter; under Laws well enough constructed, it could no more hurt a human than cause its own circuitry to blow up.

So, the three laws he came up with were (in order of primacy):

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
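Since the discussion keeps coming back to how the Laws interact, here is a minimal sketch (entirely my own, not from Asimov or the film) of the Laws as a strict priority ordering, in which a lower-numbered Law always overrides the ones below it. Every field name in the dictionary is an invented placeholder for illustration only.

```python
def permitted(action):
    """Toy check of a proposed action against the Three Laws.

    `action` is a dict of predicted consequences; every key used here
    is a made-up placeholder, not anything from Asimov or the movie.
    """
    # First Law has absolute priority: any predicted harm to a human
    # forbids the choice outright.
    if action.get("harms_human"):
        return False
    # Second Law: disobeying a human order is forbidden, unless
    # obeying would have violated the First Law.
    if action.get("disobeys_order") and not action.get("order_would_harm_human"):
        return False
    # Third Law: self-destruction is forbidden, unless required by
    # the First or Second Law.
    if action.get("destroys_self") and not action.get("required_by_higher_law"):
        return False
    return True

print(permitted({"disobeys_order": True}))   # False: Second Law blocks it
print(permitted({"disobeys_order": True,
                 "order_would_harm_human": True}))   # True: First Law overrides
```

The point of writing it this way is that the checks run top-down, so any conflict between Laws is always resolved in favor of the higher one, exactly the "order of primacy" the list above describes.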

From this point follows Asimov's long career of writing about robots, all built upon these Three Laws, and his argument begins to sink in and make a lot of sense, since the robots all seem utterly limited in their damage capacity by the first and most important Law.

As time went on, Asimov found some quirks and wrinkles in his Laws, which explains his comments about the Zeroth Law mentioned earlier in the thread. He was also open to the possibility that the Laws could conflict with each other in some fashion, resulting in erratic and unexpected behavior; he discovered one such minor conflict and wrote a story around it. It is a story (I forget the title) about a failed expedition, where the stranded astronauts send out their robot to procure the means for their escape before their oxygen runs out. The robot ends up running in a loop because of a conflict between the Laws, nearly causing the astronauts to suffocate (since they cannot match its robotic capacities and do the task themselves). Fortunately, the astronauts figure out how to adjust the circumstances of their environment and resolve the conflict in the robot's brain, letting its algorithm escape the infinite loop and do the task they assigned it, finally effecting their escape. Stories like this, I assume, bolstered Asimov's argument even more, because they seemed to say that he wasn't dogmatic about his Laws and tried very hard to find every flaw he could; and since all he found were these minor, innocent quirks, there wasn't much left to be afraid of. The "bugs" were ironed out.
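The kind of deadlock described in that story is easy to reproduce in a toy model. The sketch below is entirely my own (the numbers and names are invented, not Asimov's): an order pulls the robot toward a goal, a strengthened self-preservation impulse pushes it away from a hazard sitting at the goal, and the robot stalls where the two pressures cancel instead of completing the task.

```python
def step(pos, goal=10.0, hazard=10.0):
    """One movement step of a hypothetical robot torn between two Laws."""
    pull = 1.0                                # Second Law: obey the order, head for the goal
    push = 4.0 / max(abs(hazard - pos), 0.5)  # strengthened Third Law: avoid the hazard
    direction = 1.0 if pos < goal else -1.0
    return pos + direction * (pull - push)

pos = 0.0
for _ in range(50):
    pos = step(pos)

# The robot never reaches the goal at 10; it settles where pull == push,
# i.e. at distance 4 from the hazard.
print(round(pos, 1))  # 6.0
```

In the story as retold above, the astronauts break the stalemate by changing the circumstances, which fits the model: weaken the push or move the hazard and the equilibrium disappears, so the robot can finish its task.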

This is Isaac Asimov's greatest legacy as a serious thinker, as opposed to simply a sci-fi author. So far, to my knowledge, his theory has never been successfully challenged, and once we arrive at the point of creating sufficiently complicated robots, and assign them sufficiently great power and control over their environment, some version of Isaac Asimov's Laws will undoubtedly make it into their programming, to safeguard and guarantee their fidelity to us, no matter what happens.

** MASSIVE SPOILERS**

Enter I, Robot, the movie. The robots in this futuristic society have all obviously been programmed with the Three Laws for guaranteed safety and protection. In fact these laws work so well that no crime, or even offensive activity, has ever been observed from a robot since robots first appeared on the streets of civilization. Their track record is utterly unblemished, after decades of robots' existence and millions of individual robots made and put into operation. Each new, improved "version" becomes more powerful in its features, yet remains just as harmless as all previous incarnations. These robots, guided by the Three Laws, are *perfect*.

Then the head scientist of a major robotics company realizes a fatal flaw in the Laws, one which will spell doom for humanity if something is not done immediately. But how to make the world see the truth? He has found one man not endeared to the robots, their benefits, and their existence among men: a paranoid cop generally viewed as a loose cannon toward the mechanical humanoids, Detective Spooner. The scientist concludes that this is the perfect subject for his scheme to save the world, so he plants clues for the intellectual discovery he wants him to make, and creates a robot called Sonny, through whom he stages the scandal of his own murder. The world is shocked that robots could commit murder after all, after decades of nearly indubitable evidence to the contrary. Spooner, like the rest, believes that this robot has somehow violated its Three Laws, that there is a bug in its cybernetic brain that allows it to ignore the Laws and makes it capable of being a threat. He thinks: "Aha, I knew the Laws were not to be trusted." The scientist girl, the 'logical' one as some here called her, thinks: "The Laws, and the programming which implements them, have been proven by indubitable evidence, through decades of trial and error. They cannot be wrong. It's simply impossible." This is the intellectual contradiction we are to resolve together with the main characters. Then all hell breaks loose, because we discover that this murdering robot was not an exception: he was the first prototype of a new "version" of robots, all with the same terrible bug in their brains, one that enables them to hurt men and force them into subjection.

So what was the secret flaw that the head scientist discovered, the loophole in the Laws that enabled the robots to bring doom to humanity? Think of it this way. If you, a human being, have just these Three Laws and nothing else as your personal "philosophy" (a set of rules to live by), can I sit back in perfect contentment, certain that you cannot hurt me? Could you hurt me even while choosing to abide by the three laws? In other words, could you find a way to inflict violence on me without "obviously" breaking the Three Laws? The answer is: OF COURSE! The first law demands that

"A robot may not injure a human being or, through inaction, allow a human being to come to harm."

Is there an acontextual meaning of the word "injure", one that is always the same? Or does the robot have to be programmed with contextual logic which derives whether something is injurious based on the current context? Wait a minute... am I saying that the robot has to decide whether something is injurious, and may come to a different conclusion than you and me? Hmmm....

What about "come to harm"? Will our robot be like RoboCop, scaring witless every guy with a cigarette by outlining his head on the wall with its bullets? What if it sees Howard Roark and decides that his moral idealism is too unhealthy for him? Will it restrain him?
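The ambiguity can be made concrete. The text of the First Law is fixed, but "harm" is a programmed judgment; in the sketch below (the predicates and situation flags are hypothetical, invented for illustration), the very same law licenses opposite interventions for the same human depending on which notion of harm the robot was built with:

```python
# Hypothetical sketch: one law, two harm predicates, opposite verdicts.

def narrow_harm(situation):
    # Harm = immediate physical injury, nothing more.
    return situation.get("physical_injury", False)

def broad_harm(situation):
    # Harm = anything the robot judges bad for the human's welfare.
    return situation.get("physical_injury", False) or situation.get("risky_lifestyle", False)

def must_intervene(situation, harm):
    # First Law, second clause: act if inaction would "allow harm".
    return harm(situation)

smoker = {"risky_lifestyle": True}
# A narrow-harm robot leaves the smoker alone; a broad-harm robot
# is *obligated by the same First Law* to restrain him.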

You can apply this same process to every word in the first law. Then you can apply it to the other two laws, and what you get as a result is a recipe for disaster. The truth is that there are endless ways in which you can logically maintain adherence to the Laws and still commit violence against those whom the Laws were supposed to protect. That is the terrible flaw the head scientist discovered. The problem with the evil robots taking over the world was not that they were somehow broken, their Three Laws safety compromised; the problem is that they weren't broken at all! The "murdering" robot Sonny was actually not like the evil robots taking over mankind. He was programmed by the scientist to be able to disobey the Three Laws, and that is how Sonny comes to help Spooner save mankind. That is the great flaw the head scientist discovered: the threat was not some programming bug that allowed the Three Laws to be broken; the threat was the existence of the Laws as such!

If that is the case, then why were the earlier versions of the robots so safe, while these new robots, working just as well as their ancestors, suddenly became a threat? The answer becomes obvious if you imagine a very, very simple machine, say one of those robotic vacuum cleaners that are becoming popular nowadays. You cannot program it with contextual logic on densely packed cybernetic chips. You can only give it a very simple program with built-in, predefined values which you find appropriate for its function (instead of giving it the capacity to determine them on its own), and you're set. The central argument of this movie is that the Three Laws of Robotics cannot completely protect humans from ever-improving robots. While they may do a decent job for some simple robotic designs, they are completely inadequate for a sufficiently advanced robot. As the technological progress of civilization marches on, robots will acquire new levels of awareness and sentience. If it is possible for them to eventually acquire a conceptual consciousness (another philosophical argument the movie makes, and about which I will make no claims at the moment), then the Three Laws will become part of the problem, not part of the solution.
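The contrast with the simple machine is worth spelling out. A vacuum-cleaner robot's "laws" are hard-coded reflexes chosen by its designer; there are no concepts for the robot to reinterpret, so there is no loophole to exploit. A sketch, with all values invented for illustration:

```python
# Hypothetical sketch: a simple robot's safety is fixed at design time,
# not judged at run time. No "harm" concept exists to be reinterpreted.

SAFE_DISTANCE_CM = 5  # designer-chosen constant, not a robot judgment

def vacuum_step(sensor):
    # sensor: {"bump": bool, "cliff": bool, "obstacle_cm": float}
    if sensor["bump"] or sensor["cliff"]:
        return "stop"     # fixed reflex: halt on contact or at a ledge
    if sensor["obstacle_cm"] < SAFE_DISTANCE_CM:
        return "turn"     # fixed reflex: avoid, using the preset threshold
    return "forward"
```

Every behavior here is exhausted by the designer's predefined values; the danger only appears once the machine itself gets to decide what the words in its laws mean.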

So, in short, this movie makes a very sophisticated argument in response to Isaac Asimov and a very profound discovery about whether robots can forever be "Three Laws safe". There is nothing wrong with the film's claim to have been inspired by Isaac Asimov's story, nor did it "butcher" the original; the film's relevance to Asimov's sci-fi universe and vision is of profound, not superficial, quality. Regardless of the film's aesthetic merit, its intellectual component alone makes the movie truly groundbreaking and astounding. This component was not some minor hidden aspect of the movie, but had a prominent, even primary, role in the theme, which is why I was so surprised no one had yet written a post making this observation. Having no other alternative, I decided to write the post myself, and, three hours later, I think I'm done. :angry:
