1. A robot may not, through action or inaction, cause harm to a human.
2. A robot must obey human commands.*
3. A robot must preserve its own existence.**
*Except when in conflict with Law 1.
**Except when in conflict with either Law 1 or Law 2.
In Atom's world, it is inherent in the very concept of a robot that it cannot, rather than may not, violate these laws. Thus, only with the Omega Factor can a robot have the ability to choose whether or not to obey.
It is not a matter of programming. Programming alone cannot give a robot the ability to violate Asimov's laws. Only with the O-factor can it have any choice.
This is clearly contrary to the real world, where machines simply execute their instructions.
Let us not forget "The Lying Robot". Daddy Walrus reminds the class that Atom has no problem with lying, since robots cannot lie. But did Atom lie when, imitating True's voice, he deceived the blind girl into believing True was okay? Does evil intent become a factor in what is and is not a lie? Atom meant no ill by imitating True's voice; in fact, he meant to do good. Rather than cause a blind girl grief by telling her that True had gotten his ass kicked and needed repair, Atom, confident that Dr. Elefun could fix True, reassured her. Nevertheless, he did deceive her.
