And you thought there was only
“Asimov's 3 Laws of Robotics”?
Oh, child, please read on.

I will start with the big ones and work down to the lesser-known laws. Click an item below to skip ahead.

I was going to close this page with a long list of other laws of science fiction, physics, chemistry, and whatever else. The problem is there are just too damn many of them, and most are actually accepted scientific principles, theories that work under most conditions.

I would like to take this time to point out that even Newton's Laws of Motion, rules that physicists have successfully used for hundreds of years, are actually Newton's observations of how things work most of the time. They tend to fail under extreme acceleration or massive gravity conditions. Good enough to launch men to the Moon and return them to Earth, but not good enough to predict the precise orbital precession of the planet Mercury. Yay science!

So if you have a whole bunch of time on your hands and much surplus brain power then click on the link for Eponymous Laws. Eponymous simply means somebody has put their name to it.

  • Back to Main

    Asimov's 3 Laws of Robotics

    Written by Isaac Asimov in the 1942 short story “Runaround,” though they had been implied in earlier stories. They are numbered in order of importance: the First Law takes precedence over the Second Law, and so forth.
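
The strict precedence can be sketched in code. This is purely my illustration (the action names and flags are made up, not Asimov's): candidate actions are ranked by which laws they violate, and a First Law violation always outweighs anything below it.

```python
# A minimal sketch of strict law precedence. Each "law" is a predicate
# that reports whether an action violates it; actions are compared by
# a tuple of violations so higher laws dominate lower ones.

def violates_first_law(action):
    # First Law: a robot may not injure a human being or,
    # through inaction, allow a human being to come to harm.
    return action.get("harms_human", False)

def violates_second_law(action):
    # Second Law: a robot must obey orders given by humans,
    # except where such orders conflict with the First Law.
    return action.get("disobeys_order", False)

def violates_third_law(action):
    # Third Law: a robot must protect its own existence, as long
    # as that does not conflict with the First or Second Law.
    return action.get("endangers_self", False)

LAWS = [violates_first_law, violates_second_law, violates_third_law]

def law_rank(action):
    # Tuples compare element by element, so a First Law violation
    # sorts worse than violating only the Third Law.
    return tuple(law(action) for law in LAWS)

def choose_action(candidates):
    # Pick the candidate with the fewest, least-important violations.
    return min(candidates, key=law_rank)

actions = [
    {"name": "push human aside", "harms_human": True},
    {"name": "shield human", "endangers_self": True},
    {"name": "do nothing", "harms_human": True, "disobeys_order": True},
]
print(choose_action(actions)["name"])  # -> shield human
```

Note how sacrificing itself (a Third Law violation) beats any option that harms a human, which is exactly the ordering Asimov intended.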

    These seem perfectly rational and reasonable by our modern standards, but they were revolutionary in the 1940s and changed a lot of people's thinking. Even with these simple, practical rules, things still went a bit wanky. I strongly recommend you read Isaac Asimov's books “I, Robot” (forget the movie) and “The Bicentennial Man and Other Stories” (forget that movie also). It seems Asimov took great pleasure in finding loopholes in his own laws.

    If you can believe Wikipedia, these simple rules have pervaded science fiction and have impacted thought on ethics of artificial intelligence.

    In the years since Asimov conceived these simple but effective rules for robot behaviour, others have expanded the list. Asimov himself, in his book “Robots and Empire”, inspired by the dying words of Elijah Baley, added a Zeroth Law to precede the original Three Laws-

    The Zeroth Law may bring up ethical conflicts beyond the logic functions of any machine intelligence, or organic intelligence for that matter. A French translation of Asimov's book “The Caves of Steel” by Jacques Brécard came out just a bit differently. The translation implies-

    To further complicate a simple statement, in the sequels to the Foundation trilogy (non-Asimov), various robot agendas focus either on the first clause of the First Law, “A robot may not injure a human being...”, or on the second clause, “...or, through inaction, allow a human being to come to harm.” These points of view provide very different motivations.

    And as if that was not enough, a small group of robots claims the Zeroth Law implies a higher Minus One Law of Robotics:

    This brings up too many options for me to even get my head around. What if a suitably advanced robot judges itself to be sentient?


    And there are other clever/disturbing additions to these previous laws written by people who read far more than I do.

    This is to stop fembots from taking over the world, and any other petty misunderstandings.

    I cannot tell you how many stories I have read or programs I have watched about robots who didn't know they were robots. Ended badly every time.
    If a robot does not know it is a robot is it bound by the laws of robotics?

    Robots have rights too! Mechanical lives matter!

    This is to show the audience that you are now a merciless, nuclear-powered, homicidal killing machine.

    Specs, Bugs, and Rock & Roll! This is the '60s (the 2260s). Let's all be awesome to each other.


    Here are additional laws, loosely based upon Asimov's Laws of Robotics but restructured for more modern times.

    In today's world, where time is money, materials are expensive, and human beings seem to be a minor cog in any corporate structure, new laws have sprung up to reflect this.
    David Langford, SciFi author, publisher and critic, has suggested a tongue-in-cheek set of laws:

    1. A robot will not harm authorized Government personnel but will terminate intruders with extreme prejudice.
    2. A robot will obey the orders of authorized personnel except where such orders conflict with the Third Law.
    3. A robot will guard its own existence with lethal antipersonnel weaponry, because a robot is bloody expensive.

    For God's sake, nobody let this slip to Microsoft or Google!


    Then there are the 3 Laws of Responsible Robotics, proposed by Robin Murphy and David Woods-

    1. A human may not deploy a robot without the human-robot work system meeting the highest legal and professional standards of safety and ethics.
    2. A robot must respond to humans as appropriate for their roles.
    3. A robot must be endowed with sufficient situated autonomy to protect its own existence, as long as such protection provides smooth transfer of control which does not conflict with the First and Second Laws.

    Ok, a robot will decide what is appropriate for its goals?


    In October 2013, Alan Winfield suggested the following at an EUCog meeting-

    1. Robots are multi-use tools. Robots should not be designed solely or primarily to kill or harm humans, except in the interests of national security.
    2. Humans, not Robots, are responsible agents. Robots should be designed and operated as far as practicable to comply with existing laws, fundamental rights and freedoms, including privacy.
    3. Robots are products. They should be designed using processes which assure their safety and security.
    4. Robots are manufactured artefacts. They should not be designed in a deceptive way to exploit vulnerable users; instead their machine nature should be transparent.
    5. The person with legal responsibility for a robot should be attributed.

    Very practical. I like the part where it says “except in the interests of national security.” Zeroth Law, anyone?
    Also, law number 2 would make robots exempt from any criminal charges (that is, if anyone decided to charge a robot with a crime). “Your Honor, it was just a glitch!”


    In one of James Cameron's many great movies, 'Aliens' (which broke the law of sequels), the android Bishop, who ends up cutting himself during the knife game, states, "It is impossible for me to harm, or by omission of action allow to be harmed, a human being."

    I guess the Bishop android had evolved far beyond the Ash model from the first movie, 'Alien', who, as a severed head, eventually admitted his instructions were 'Return alien life form, all other priorities rescinded.'
    You die; she dies... everybody dies!


    Let's end this section with the final, Hollywood evolution of the laws of robotics. I am sure you can all name the movie (not the remake!)

    1. Serve the Public Trust
    2. Protect the Innocent
    3. Uphold the Law
    4. Classified (later revealed: any attempt to arrest a senior OCP officer results in shutdown)

    Any information in this section that was not conjured up by a red wine fueled all-nighter has been gently lifted from the Wikipedia page on Asimov's Laws of Robotics. Can you actually plagiarise work from a website that is composed entirely of plagiarised works?

    Return to Top


    Clarke's three laws-

    Science fiction writer Sir Arthur C. Clarke (who, by the way, regretted not patenting his concept for the communications satellite in 1945) formulated three adages that are known as "Clarke's three laws". The third is probably the best known.

    There is a proposed 'Fourth Law.'

    This version seems to come from a longer quote by American economist Thomas Sowell.


    The Third Law has inspired many other variations.

    Here are two similar variants that combine the Third Law with Hanlon's razor.

    The logic of the Third Law could be reversed, or inverted, to create a different viewpoint.

    Some information in this section has been um... borrowed from the Wikipedia page on Clarke's three laws.

    Return to Top

    Niven's Laws-

    These laws were penned and published by science fiction author Larry Niven as "how the universe works," as far as he can tell.

    Niven's Law regarding Time Travel -or- The Theory and Practice of Time Travel.

    Fortunately for people like me, Hans Moravec offers an expanded explanation.

    There is a spookier possibility. Suppose it is easy to send messages to the past, but that forward causality also holds (i.e. past events determine the future). In one way of reasoning about it, a message sent to the past will "alter" the entire history following its receipt, including the event that sent it, and thus the message itself. Thus altered, the message will change the past in a different way, and so on, until some "equilibrium" is reached--the simplest being the situation where no message at all is sent. Time travel may thus act to erase itself (an idea Larry Niven fans will recognize as "Niven's Law").

    Niven's Law regarding Clarke's Third Law

    This is a converse of Clarke's third law, and is listed above under variations to Clarke's third law.

    Niven's Laws for Writers

    1. Writers who write for other writers should write letters.
    2. Never be embarrassed or ashamed about anything you choose to write. (Think of this before you send it to a market.)
    3. Stories to end all stories on a given topic, don't.
    4. It is a sin to waste the reader's time.
    5. If you've nothing to say, say it any way you like. Stylistic innovations, contorted story lines or none, exotic or genderless pronouns, internal inconsistencies, the recipe for preparing your lover as a cannibal banquet: feel free. If what you have to say is important and/or difficult to follow, use the simplest language possible. If the reader doesn't get it, then let it not be your fault.
    6. Everybody talks first draft.

    Niven's Laws from Known Space

      • Never throw shit at an armed man.
      • Never stand next to someone who is throwing shit at an armed man.
    1. Never fire a laser at a mirror.
    2. Mother Nature doesn't care if you're having fun.
    3. F x S = k. The product of Freedom and Security is a constant. To gain more freedom of thought and/or action, you must give up some security, and vice versa.
    4. Psi and/or magical powers, if real, are nearly useless.
    5. It is easier to destroy than create.
    6. Any damn fool can predict the past.
    7. History never repeats itself.
    8. Ethics change with technology.
    9. There Ain't No Justice. (often abbreviated to TANJ)
    10. Anarchy is the least stable of social structures. It falls apart at a touch.
    11. There is a time and place for tact. And there are times when tact is entirely misplaced.
    12. The ways of being human are bounded but infinite.
    13. The world's dullest subjects, in order:
      • Somebody else's diet.
      • How to make money for a worthy cause.
      • Special Interest liberation.
    14. The only universal message in science fiction: There exist minds that think as well as you do, but differently.
      Niven's corollary: The gene-tampered turkey you're talking to isn't necessarily one of them.
    15. Fuzzy Pink Niven's Law: Never waste calories.
    16. There is no cause so right that one cannot find a fool following it.
      In variant form in "Fallen Angels": "Niven's Law: No cause is so noble that it won't attract fuggheads."
    17. No technique works if it isn't used.
    18. Not responsible for advice not taken.
    19. Old age is not for sissies.

    Ok, like I have said above, I am too lazy to research all this shit so I just plain stole it from the Wikipedia page Niven's laws.

    Just kidding, I did write the code myself, Yay Me! Cats Rule!

    Return to Top