Welcome to Gaia! :: View User's Journal | Gaia Journals

Food For Thought?
Oftentimes I have thoughts that I deem worthy of sharing with whoever wants to see them. They are dumped here.
Asimov's Robot Laws
Just for reference, they are:
1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
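The three laws form a strict priority order: each later law yields to the ones above it. As a loose illustration of that ordering (the `Action` fields and function names here are my own invention, not anything from Asimov), the decision logic might be sketched like this:

```python
# Toy sketch: rank candidate actions by the Three Laws' strict priority.
# All field and function names are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool = False    # would this action hurt a person?
    allows_harm: bool = False    # would it let harm happen through inaction?
    obeys_order: bool = True     # does it follow a human's order?
    preserves_self: bool = True  # does the robot survive it?

def choose(actions):
    """Pick the best action: First Law beats Second, Second beats Third."""
    # First Law filter: anything that harms, or permits harm, is out entirely.
    legal = [a for a in actions if not a.harms_human and not a.allows_harm]
    if not legal:
        return None
    # Among First-Law-safe actions, prefer obedience, then self-preservation.
    return max(legal, key=lambda a: (a.obeys_order, a.preserves_self))

best = choose([
    Action("ignore person in danger", allows_harm=True),
    Action("rescue person, risking damage", obeys_order=True, preserves_self=False),
    Action("stand by as ordered", obeys_order=True, allows_harm=True),
])
print(best.name)  # → rescue person, risking damage
```

Note how self-preservation only ever acts as a tiebreaker: the rescue wins even though the robot is destroyed, because the alternatives both allow harm and fail the First Law outright.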

Most people probably first met them in the movie I, Robot, though Asimov introduced them much earlier, in his 1942 short story "Runaround." In the film, they are programmed into every manufactured robot to prevent harm to the human race.

But what if those laws were "programmed" into humans?

The First Law: A robot, in this case a human, may not injure a human being, or, through inaction, allow a human being to come to harm. Wouldn't that solve much of the world's apathy? I'm sure there have been several times when you observed harm directed toward another human and did nothing to stop it, or even harmed someone yourself. The harm could be emotional, physical, or indirect, but you still did it.
If a person was "programmed" not to harm another human, or allow another human to come to harm, every single person on the planet would be responsible for, and respectful toward, every other person on the planet. Certainly this would solve a HUGE chunk of society's problems.

The Second Law: A human must obey orders given to them by other humans, except where this would conflict with the First Law.

This one would certainly stir up a lot of controversy, since we've grown accustomed to a lifestyle that doesn't take commands or requests well. The human mind instinctively rebels against orders, probably because we think too highly of ourselves to give in to another; it would be considered "weak." Still, this law would be hard to abuse, because it is subordinate to the First Law: any order given with hurtful intent would conflict with the First Law and could simply be refused.

The Third Law: A human must protect its own existence as long as such protection does not conflict with the First or Second Law.

This one's just intelligent, and it also correlates nicely with the First and Second Laws. A person couldn't simply "skip out" on this law, since letting yourself come to harm would conflict with the First Law. A person would also be more willing to act on a larger scale, say, to save their planet.
Suppose, for example, a meteor was headed toward Earth. No one could resign themselves to doom; that would conflict not only with the Third Law but with the First as well, since the thought would mean allowing a human (in this case, the entire race) to come to harm. And should that meteor destroy us all anyway? Good try, humans. At least we wouldn't have died as apathetic wastes of land space. We'd have died as a strong race that was willing to save itself.





SherranWrap
Community Member