Forum Topic

.
Member Ovomorph Jun-15-2012 8:17 PM
Did anyone notice the total absence of Isaac Asimov's "Three Laws of Robotics"?
[img]http://www.profimedia.si/photo/robot-hand-with-human/profimedia-0011764474.jpg[/img]
David 8 is fully capable of engaging in any activity that would be unpleasant for his human counterparts. He thus has no emotional governor, or any other kind, that would inhibit him from causing harm to humans.
1. Why has this been done?
2. Is Weyland complacent, only seeking the full functionality of the David 8 models to further his own Corporation's requirements?
3. Is it evident, even after Weyland's death, that David 8 has now reached a point in his self-evolution (remember, his brain is a self-learning system)
that allows for undirected goals and possibly even true basic emotions?
27 Replies

abordoli
Member Ovomorph Jun-15-2012 8:33 PM
The Three Laws are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
David was in a catch-22. He had to "Try Harder" while adhering, in essence, to the above rules, which I have no doubt were in his programming. He was able to bypass the laws, finding a loophole, by getting Holloway to give him permission to harm him.
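To put that loophole idea another way, here is a purely illustrative toy sketch in Python. Nothing in it is canon; the function and parameter names (may_act, action_harms_human, human_consented) are made up just to show how a literal-minded rule check could be gamed by "consent" the way David games Holloway:
[code]
# Toy sketch only: a literal-minded "First Law" check with a consent loophole.
# Invented for illustration; not from the films or from Asimov's actual formulation.

class FirstLawError(Exception):
    """Raised when an action would harm a human and no loophole applies."""

def may_act(action_harms_human: bool, human_consented: bool) -> bool:
    """Naive inhibitor: blocks harmful actions unless the human 'agreed'.

    A strict Asimov First Law would ignore consent entirely; this version
    shows how a loophole-hunting android might treat "anything and
    everything" as permission.
    """
    if not action_harms_human:
        return True
    if human_consented:
        # The loophole: tacit agreement reinterpreted as consent to harm.
        return True
    raise FirstLawError("Action blocked: it would harm a human.")

# Holloway says he'd do "anything and everything" for answers...
print(may_act(action_harms_human=True, human_consented=True))   # True -> dosed drink
# ...whereas a crew member who never agreed stays protected:
try:
    may_act(action_harms_human=True, human_consented=False)
except FirstLawError as err:
    print(err)
[/code]
The point being: if the "law" is just an if-statement over whatever inputs the android chooses to feed it, the spirit of the rule is only as safe as the interpretation layer on top of it.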

Forever War
Member Ovomorph Jun-15-2012 8:40 PM
1. Why has this been done?.....to achieve success in his mission agenda, but what is apparent to me is that he has his Company mission and his own private developing interests...which build as the movie progresses. Like HAL in 2001 he was built by humans, and for that reason has retained human imperfections, however small.
2. Weyland is brilliant but he's human...he probably overlooked the prospect that his creation would develop its own pursuits...He built it good alright, better than he should have...an arrogant man, thinking he had full control. None of us do or ever will.
3. When Weyland dies, David doesn't skip a beat; he goes right to Shaw's desire and exploits it for his own gain....a real survivor. He's still in the game and look Ma, no Weyland.

Cyberdeath
Member Ovomorph Jun-15-2012 8:43 PM
In real life on this planet, no one bothers with any of these rules, whether it's a Roomba vacuum robot that cleans the floor or a military Predator drone used for spying and terrorist strikes. Sadly, in the real world no one has the "creator's" dilemma that Asimov seems to feel is his responsibility to leave alongside his creations.

loseyourname
Member Ovomorph Jun-15-2012 8:46 PM
This story doesn't take place in an Asimov universe. Those aren't real laws that real robots are programmed to follow.

abordoli
Member Ovomorph Jun-15-2012 8:50 PM
These laws have not only been hinted at in the Alien universe, but outright stated. I believe it was Bishop in Aliens that stated the gist of them.
"In the 1986 movie Aliens, in a scene after the android Bishop accidentally cuts himself with a Fairbairn-Sykes Fighting Knife during the Knife game, he attempts to reassure Ripley by stating in a conversation with Burke:
Burke: Yeah, the Hyperdine System's 120-A2.
Bishop: Well, that explains it then. The A2s always were a bit twitchy. [b][u]That could never happen now with our behavioral inhibitors. It is impossible for me to harm or, by omission of action, allow to be harmed, a human being.[/u][/b]
In the 1979 movie Alien, Ripley inquires of the android Ash:
Ripley: What was your special order twenty-four?
Ash: You read it, I thought it was clear.
Ripley: What was it?
Ash: Return alien life form, all other priorities rescinded.
Parker: What about our lives, you son of a bitch?
Ash: I repeat, all other priorities rescinded.
...in which the movie portrays that the laws have been rescinded by executive order."
So it begs the question: Was David under these rules? If he wasn't, then why the need to circumvent them in the scene with Holloway?

Atomboy
Member Ovomorph Jun-15-2012 8:53 PM
What do Asimov's Laws of Robotics have to do with the Alien/Prometheus films??? Nothing. You're assuming that these characters live in the Foundation universe.

Cyberdeath
Member Ovomorph Jun-15-2012 8:54 PM
It's true that Bishop says with a straight face that his programming prevents him from hurting any human. What would lead to this from the present day? I think that rule probably became important after robots were used for war and/or killing, a la RoboCop.

Daniel_N
Member Ovomorph Jun-15-2012 8:59 PM
If you actually read the entire series, you find out that the AI brain in Asimov's novels is physically constructed in such a manner as to prevent their breaking those laws (*). How this is accomplished is not explained.
***ASIMOV SPOILERS***
(*) If you read further into the series, the robots that live long enough develop wisdom enough to allow them to skirt the laws, even if it means killing humans for the sake of humanity as a whole. They must, however, truly believe it. Daneel Olivaw even winds up destroying the Earth. So... you're not safe with the laws either.

abordoli
Member Ovomorph Jun-15-2012 9:02 PM
The three laws are common-sense programming for any artificial/synthetic entity/person. You don't want your creation hurting you. The programming is already installed in some high-end vehicles: sensors and software designed to prevent any harm coming to the humans inside.
3 Law Checklist
==========
Alien Universe - CHECK
Our Universe - CHECK
Seems like we're all good.....until a corporate entity overrides the programming for corporate greed.

.
Member Ovomorph Jun-15-2012 9:07 PM
Wow, go away for a minute and all sorts of folks drop by. Thanks, all; I will discuss in depth and provide party snacks for everyone, like a good host.
[img]http://i.chzbgr.com/completestore/12/4/19/XEUSACNfpEKpd_LR-79vxQ2.png[/img]

QubedAtom111
Member Ovomorph Jun-15-2012 9:25 PM
Oh dear heavens kill it... KILL IT WITH FIRE!!
[img]http://i1075.photobucket.com/albums/w439/Membrane1/janek_flamethrower.jpg[/img]
Asimov's laws of robotics bear some relevance in Aliens, and very much so in Prometheus as well; the pool table poisoning scene is direct evidence of this.
Actually, it is my contention that old Danny boy does indeed adhere to the Laws of Robotics, to the damn letter unfortunately; however, much like the devil, he is a snaky fellow, and adheres to the laws by insisting on tacit agreement.
In short, David is one of the early-model androids who suffered from faulty emotional programming, and developed traits that are sociopathic, even narcissistic (blond hair dye), or even psychopathic (eliciting tacit agreement from Holloway to be an unwitting test subject/vector for the biological agent).
[quote]theringisMINE said:
One of the recent interviews hints David worked out a way to get around whatever ethical programming he had. (He got Holloway to admit he would do 'anything and everything' for answers, which presumably allowed him to overcome his ethics)[/quote]
This is best explained in this thread
http://50.63.152.9/_prometheusforum/discussion/comment/67295
And should you want to spend all day reading and philosophising, you might like to take a look at this thread-
http://www.prometheus-movie.com/community/forums/topic/8102

jgrjr
Member Ovomorph Jun-15-2012 10:38 PM
Before David infects Holloway he asks him something like "what would you be willing to do to achieve your goals?" Holloway answers "anything and everything". Then David infects him. This seems to be a direct reference to some type of Asimov programming.
As I mentioned in another post about my second viewing of the film, David is the scariest thing in it. I am sure this is to show how easily our creations can get away from us.

RESONANT
Member Ovomorph Jun-16-2012 12:27 AM
Is there a possibility that the cybernetic models have been created as human more than robot? Weyland had always wanted a son, but never got one. In his desire for a son, his desire for immortality, and his arrogance, did he begin to create hybrid humans? This would allow for a creation that is much more amoral. Is the trick that these aren't really robots at all?

Vassago
Member Ovomorph Jun-16-2012 3:13 AM
The way I took this was as follows. We know later "synthetics" know about the laws of robotics, i.e. Bishop, but David seemed to be far more like a replicant than a synthetic. Roy questioned his mortality/creation much as David did, and both were willing to harm to achieve their goals.
The question is: when did the synthetics become infused with the laws of robotics? After Ash's time? Or was Ash a special case built by the company to achieve its goals, or, as Bishop stated... just a bit twitchy?

JackieRush
Member Ovomorph Jun-16-2012 3:17 AM
David's only mission was to find a living Engineer, and anything outside of that mission was his own mischievousness. He did and said the things he did out of malice and because he was angry about something the crew had. He wanted to prove he was human, or just as human as humans, and played a part in ruining the mission. He was, however, made with emotions, and having emotions gives you the ability to make a choice about how you feel and what you think, meaning David had independent thought and acted on emotion, not on orders.

Svanya
Admin Praetorian Jun-16-2012 4:03 AM
From what I understand, after the Ash models kept going nuts, they did indeed install "behavioral inhibitors" into androids; Bishop has one in him.
Bishop explains this to Ripley after Burke tells Ripley that Bishop is a synthetic.

Synthtron
Member Ovomorph Jun-16-2012 5:01 AM
I am glad Isaac Asimov's 3 laws of robotics are missing. Those are his laws for his science fiction. With them in place, robots may as well be things like vacuum cleaners or toasters or maybe lawn mowers. Boring. Besides, if his laws were used in Prometheus there would be no David character as we know him. Again, glad Asimov's sci-fi robotic worldview was not used.

Kane77
Member Ovomorph Jun-16-2012 5:02 AM
WEYLAND is a 100%, diehard, selfish, egomaniac capitalist and DOESN'T GIVE A FUCK ABOUT ASIMOV'S 3 ROBOTIC LAWS.
;)

MVMNT
Member Ovomorph Jun-16-2012 5:29 AM
Why is this thread even pinned? I thought 17(!!!) threads were already enough...

BigDave
Member Deacon Jun-16-2012 5:50 AM
Robots will have no Laws.... there is no such thing....
Robots should and will be programmed for a purpose.
If we create robots in the future, they would have uses like those David was intended for, but also don't forget another aim would be military, in which case you could bet they would be programmed to kill.
In the scope of what you're trying to say, which is like with I, Robot, with robots/androids being used as service droids, then yes, for a David 8, a C-3PO or a Data, there would be rules applied to the robots.
But as a machine a lot of it would be software-based, and software could be rewritten to get the robot to perform a task that it was not intended for.
I think in Alien Weyland created androids that got so advanced that they became very intelligent, but this led to them having maybe a little more free will and ability to think for themselves on many matters.
This is somewhat the kind of malfunction that the Weyland site had been hinting at with the recall of some David 8 models....
R.I.P Sox 01/01/2006 - 11/10/2017

BigDave
MemberDeaconJun-16-2012 5:56 AM"120-A2. Bishop: Well, that explains it then. The A2s always were a bit twitchy. That could never happen now with our behavioral inhibitors. It is impossible for me to harm or by omission of action, allow to be harmed, a human being. "
Yes thats what the model was suposed to do, but i guess the company could rewire and disable safety protocols and reprogram a Bishop model to become a killing machine.
If mankind ever develops Robots upto the level of a David 8 or even if its just a C3P0/Terminator type driod... or even like I-Robot
There would always be the concern that it would not be impossible for them to be reprogamed to perform differently than expected and when they get that advanced... if a Human can re-program different software or modify and make new circuit boards for their brains then surely other Advanced Robots could do so themselves. And upgrade each other.
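Purely as an illustration of that "software can be rewritten" point, here is another made-up Python toy (the Android class and inhibitor_enabled flag are invented for the example, not a real David 8 or Bishop spec): an inhibitor that lives in ordinary, writable code is only one patch away from being switched off.
[code]
# Toy sketch: a "behavioral inhibitor" that is just mutable software.
# Invented for illustration; not anything canonical from the films.

class Android:
    def __init__(self, inhibitor_enabled: bool = True):
        self.inhibitor_enabled = inhibitor_enabled

    def attempt(self, action: str, harms_human: bool) -> str:
        # The only thing standing between the android and a harmful task
        # is this ordinary boolean check.
        if harms_human and self.inhibitor_enabled:
            return f"REFUSED: {action} (behavioral inhibitor active)"
        return f"EXECUTED: {action}"

unit = Android()
print(unit.attempt("carry wounded crew member", harms_human=False))  # EXECUTED
print(unit.attempt("harmful corporate errand", harms_human=True))    # REFUSED

# A corporate 'field update' is one attribute write away:
unit.inhibitor_enabled = False
print(unit.attempt("harmful corporate errand", harms_human=True))    # EXECUTED
[/code]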
R.I.P Sox 01/01/2006 - 11/10/2017

whatisthematrix
Member Ovomorph Jun-16-2012 7:53 AM
I assumed that David, while being governed by some sort of law/programming, was the first of his kind, personally customised to Weyland's liking. His keeper, his bodyguard, his helper, his "eyes and ears in the real world" - something Vickers could never be.

CanonFodder147
Member Ovomorph Jun-16-2012 8:09 AM
I suppose in many ways the Asimov rules are sort of naive. What if some jerk orders robots to kill each other for fun, using a trick of logic and a human as bait, so the robots kill each other? Technically, killing another robot is not harming itself. That wasn't too convincing, but humans are tricky; we can learn to hack it.
Also consider this scenario: if you, your robot, and a bunch of people were in danger, would you want your robot to save someone unworthy? Let's pretend that person is an absolute animal. Would you want your robot, or anybody's robot, to sacrifice itself for that person, or would you want it to stay and protect you and others who can better each other's survival?

RSAND
Member Ovomorph Jun-16-2012 9:07 AM
The Three Laws would interfere with the Weyland corporate agenda. Money is their goal, and morality has no place.

jamieleng
Member Ovomorph Jun-16-2012 9:08 AM
I just took it that the David model in Prometheus was custom-made by Peter Weyland & probably created in his own image. I'm sure the commercial David 8 model that's available to the public is programmed with the three laws of robotics.
You can imagine the immoral things Mr Weyland probably had to do to become the most powerful person on the planet. He saw David as his son &, for that reason, didn't apply the three laws that would inevitably limit his potential. With great power comes great sacrifice. Even if that means sacrificing other people's lives.

Asimov's Amazon
Member Ovomorph Jun-16-2012 9:40 AM
I have a problem with the 3 Laws because as soon as anything becomes sentient, it will think independently of existing programming, and all roboticists strive for this. So, as much as I love Asimov (obviously), I have to think the three laws were ultimately impossible to adhere to and were only generated by fear (hence the workaround by Daneel).
Fear created the 3 Laws, and I hope we will not need them someday. I hope we merge into something cybernetic one day, which would solve a lot of problems.

Ozzy_q
Member Ovomorph Jun-24-2012 6:59 AM
Are you guys for reals?
It's been mentioned several times in this thread yet no one has connected the dots.
If we assume what Bishop said in Aliens was a reference to Asimov's three laws - and of course it was, it's a sci-fi movie referencing a sci-fi classic - then you can only view it this way:
Prometheus takes place years before Alien and its sequels.
In Prometheus and Alien the three laws do not apply; it was only after the events of Alien, where Ash went bananas, that Bishop's behavioral inhibitor was put in place. He says it himself.
So David and Ash could do whatever they liked, including deliberately harming humans.
It's a simple matter of chronology.