Posted: Thu Jan 12, 2006 9:44 pm
Posted: Thu Jan 12, 2006 9:46 pm
Posted: Thu Jan 12, 2006 9:46 pm
Posted: Thu Jan 12, 2006 9:48 pm
demon_lover69 wrote:
> *sings* MINIMOOSE WILL RULE YOU ALL!!

shampoos_sis wrote:
> what about Mr. buggy?

demon_lover69 wrote:
> Mr. Bugi owns Minimoose!! ^^

ok... Tee-Kay!! ^^ *pets Minimoose*
Posted: Thu Jan 12, 2006 9:53 pm
Posted: Thu Jan 12, 2006 10:07 pm
-dipkittydumbdog- Vice Captain
Posted: Thu Jan 12, 2006 10:25 pm
Yoichiro Tsukasa wrote:
> Hopefully so, but if they were, and the AI grows to conscious levels, wouldn't that go against the fact of it having rights a bit? A constriction of free will? Then, as I mentioned earlier, the problems might start, albeit in a few at first.

shampoos_sis wrote:
> People are, you could say, naturally racist; they will not give a robot any rights until you can prove its humanity. (I am not speaking for myself, just my understanding of humans.) There's another safety device for things like that: a three-year life span. It's guaranteed to "die" in three years.

Yoichiro Tsukasa wrote:
> But adding all these stipulations just takes more and more rights away from a potentially sentient being. Humans get humanity. Robots should get something else to the same level, if they are self-thinking, learning, and socially functional. Doesn't that make them just like us?

shampoos_sis wrote:
> I understand what you're saying, but to the USA at least it's like saying a gorilla should have the same constitutional rights as humans. (I personally hate my government right now, and we seriously should restart; not that it won't happen again, but I don't want a government mainly controlled by reptilian half-breeds from another dimension.) Also, I have a feeling that they will try to make as many laws as they can to in fact reduce the possibility of a robot being considered human.

I believe that if a gorilla indeed became sentient enough to be aware of self and of the fact that it could have more, and had the ability to communicate it and operate functionally in society, then yes, it should be granted rights equal to those of people.
Posted: Fri Jan 13, 2006 8:01 am
my dog ate my sanity wrote:
> He hasn't said anything about AI. It will have the intelligence of a cellphone.

Skiz-Erz wrote:
> He said self-sustaining, which would mean either an AI or a programmed response for every imaginable situation.

I said semi-self-sustaining. All I meant was that it would be able to refuel itself in the event that it is about to run out of power. The plans include a battery that would allow for about an hour of extra time, and in the event that the main power source is about to be depleted, it would be able to refuel the hydrogen fuel cells and, quite literally, give it a full tank of gas.
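The refuelling logic described above could be sketched as a simple control loop: run on the fuel cells, and when they drop below a threshold, switch to the reserve battery while topping the cells back up. This is a minimal illustration only; all thresholds, names, and numbers here are invented, not taken from the poster's actual plans.

```python
RESERVE_MINUTES = 60        # the backup battery buys about an hour (per the post)
LOW_FUEL_THRESHOLD = 0.10   # hypothetical: refuel below 10% charge

class PowerManager:
    """Toy model of a semi-self-sustaining power supply."""

    def __init__(self, fuel_level=1.0, reserve_minutes=RESERVE_MINUTES):
        self.fuel_level = fuel_level          # fraction of a full tank, 0.0 .. 1.0
        self.reserve_minutes = reserve_minutes
        self.on_reserve = False

    def tick(self, drain=0.01):
        """One control-loop step: drain fuel; if low, fall back to the battery and refuel."""
        self.fuel_level = max(0.0, self.fuel_level - drain)
        if self.fuel_level < LOW_FUEL_THRESHOLD:
            self.on_reserve = True            # run on battery while refuelling
            self.refuel()

    def refuel(self):
        # Stand-in for topping up the hydrogen fuel cells: "a full tank of gas".
        self.fuel_level = 1.0
        self.on_reserve = False

pm = PowerManager(fuel_level=0.12)
pm.tick(drain=0.05)   # drops below the threshold, so a refuel is triggered
print(pm.fuel_level)  # → 1.0
```

The point of the sketch is that no AI is needed for this: it is an ordinary threshold check, exactly the "programmed response" end of the spectrum Skiz-Erz describes.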
Posted: Fri Jan 13, 2006 8:10 am
Yoichiro Tsukasa wrote:
> Hopefully so, but if they were, and the AI grows to conscious levels, wouldn't that go against the fact of it having rights a bit? A constriction of free will? Then, as I mentioned earlier, the problems might start, albeit in a few at first.

shampoos_sis wrote:
> People are, you could say, naturally racist; they will not give a robot any rights until you can prove its humanity. (I am not speaking for myself, just my understanding of humans.) There's another safety device for things like that: a three-year life span. It's guaranteed to "die" in three years.

I would have to quite agree with you, because of what I have had to suffer in school because of my ideas.

I would have to disagree with you, because I do not plan on adding an AI, unless I would be able to program it to learn only functions and nothing more. For example, it would learn how to, say, bake bread, instead of how the bread is truly made and how all the ingredients are processed. NOTHING WOULD BE LEARNED THAT WOULD ALLOW IT TO, IN THEORY, CREATE A MIND OF ITS OWN AND REALIZE THAT IT HAS TOTAL FREE WILL. All that I am saying is that if an AI WERE installed, I would only let it function under the condition that it would be able to teach the robot how to do different functions. Nothing would be put on the robot that could harm a human, such as a knife or firearm, except the fact that it would probably have the strength of about four full-grown men.

You mention that it would "die" in three years; please elaborate more on that, because what I see is people just dismantling a robot because it didn't do something that was against its programming.
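The "learn only functions, nothing more" idea above amounts to a whitelist: the robot may acquire task routines from an approved list, but nothing open-ended. Here is one way that restriction could be sketched; the class, skill names, and whitelist are all hypothetical, invented purely to illustrate the idea.

```python
# Hypothetical whitelist of task routines the robot is permitted to acquire.
ALLOWED_SKILLS = {"bake_bread", "sweep_floor", "wash_dishes"}

class RestrictedLearner:
    """Learns how to perform whitelisted tasks, and refuses everything else."""

    def __init__(self):
        self.skills = {}

    def learn(self, name, routine):
        # Reject any skill not on the whitelist, so the robot can never
        # accumulate open-ended knowledge outside its approved functions.
        if name not in ALLOWED_SKILLS:
            raise PermissionError(f"refusing to learn non-whitelisted skill: {name}")
        self.skills[name] = routine

    def perform(self, name):
        return self.skills[name]()

robot = RestrictedLearner()
robot.learn("bake_bread", lambda: "bread baked")
print(robot.perform("bake_bread"))   # → bread baked
```

Under this scheme the robot knows *how* to bake bread but nothing about chemistry or ingredients, which is exactly the distinction the post draws.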
Posted: Fri Jan 13, 2006 8:19 am
Skiz-Erz wrote:
> The realistic reason why no one has made a fully artificially intelligent being is the sheer time and effort that would have to go into it. You would need to program a reaction for every single conceivable circumstance, or else develop some sort of learning computer, which would, again, be very difficult, and you'd have to have it learn things the way you would teach an infant what it learns through life; i.e. subject it to an average human life, as a human. Which, again, would take approximately the length of a human lifetime, and then you get to the whole argument about how it would keep accumulating knowledge, getting to the point of "knowing too much" and being "bad for the human race." ... Also, I don't know why you're complaining about having an I.Q. of 126; 100 is the average human, and each individual point weighs more than you'd think, as a person with an I.Q. of 70 would have a lot of trouble with counting.

I thank you for your compliment, but having a higher I.Q. doesn't necessarily mean that I have it better. Like you said, each I.Q. point weighs more than I think, and the intellect weighs more than YOU think. I do not say this as a put-down; I am just pointing out that you have no idea how much stress I am put under to get PERFECT GRADES, or at least STRAIGHT B's!! And have you ever considered that I happen to know about 15 different flaws with humanity that keep me up all night? One flaw is that, like someone said in either an earlier or a later post, humans are naturally racist. Two, humans have no remorse for others of their kind, or for other species, which is why some species of animals have gone extinct. I don't think I need to go on, because I would just become bored of typing. Also, I would rather have an I.Q. of 70 than 126. It would just be less stressful.
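On the claim above that 100 is the average and each point "weighs more than you'd think": IQ scores are conventionally scaled as a normal distribution with mean 100 and, on most modern tests, a standard deviation of 15 (the SD is an assumption here, not something stated in the thread). A quick calculation shows how unevenly the points weigh in percentile terms:

```python
from statistics import NormalDist

# Conventional IQ scaling: mean 100, SD 15 (SD assumed, not from the thread).
iq = NormalDist(mu=100, sigma=15)

for score in (70, 100, 126):
    percentile = iq.cdf(score) * 100
    print(f"IQ {score}: roughly the {percentile:.1f}th percentile")
```

So 70 sits around the 2nd percentile and 126 around the 96th: the same 26-to-30-point gap covers a very different share of the population on each side of the mean, which is the sense in which individual points "weigh" more at the extremes.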
Posted: Fri Jan 13, 2006 1:38 pm
you_die!!! wrote:
> You mention that it would "die" in three years; please elaborate more on that, because what I see is people just dismantling a robot because it didn't do something that was against its programming.

I do understand that you would not put any AI in your robot, but since other people were talking about it, I thought I would join in. :mrgreen:

Science-fiction novelists have come up with many safety devices to prevent robots from becoming too "smart." Since they can't find a way to stop the AI itself, they gave it a three-year life span so it could not "conquer the world." If I remember right, there would be nanobots that dismantle it from the inside, so it would become dysfunctional enough that it would stop working at all.
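The three-year life span described above is essentially a hard-coded expiry check that the robot cannot bypass. A toy sketch of the idea, with the clock compressed to milliseconds so it can actually run; the class name and mechanism are invented (the post imagines nanobots, modelled here as a monotonic-clock check in the control loop):

```python
import time

class LifespanLimitedRobot:
    """A robot that is 'guaranteed to die' after a fixed life span."""

    LIFESPAN_SECONDS = 3 * 365 * 24 * 3600   # three years, in seconds

    def __init__(self, lifespan=LIFESPAN_SECONDS):
        self.born = time.monotonic()
        self.lifespan = lifespan

    def expired(self):
        return time.monotonic() - self.born >= self.lifespan

    def act(self, task):
        # Every action passes through the expiry check, so once the life span
        # is exceeded the robot refuses to function at all -- the "death".
        if self.expired():
            raise RuntimeError("lifespan exceeded: shutting down")
        return f"performing {task}"

bot = LifespanLimitedRobot(lifespan=0.01)   # a 10 ms life, for the demo
print(bot.act("sweep floor"))               # → performing sweep floor
time.sleep(0.02)
# bot.act("sweep floor") would now raise RuntimeError
```

Note this only works as a safety device if the check itself is tamper-proof, which is presumably why the fiction reaches for nanobots rather than a line of code.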
Posted: Fri Jan 13, 2006 1:48 pm
Yoichiro Tsukasa wrote:
> I believe that if a gorilla indeed became sentient enough to be aware of self and of the fact that it could have more, and had the ability to communicate it and operate functionally in society, then yes, it should be granted rights equal to those of people.

Yes, but sadly one person has neither the power nor the opportunity, unless you happen to be the man behind the Supreme Court or something.
Posted: Fri Jan 13, 2006 2:00 pm
Posted: Fri Jan 13, 2006 7:06 pm
shampoos_sis wrote:
> Yes, but sadly one person has neither the power nor the opportunity, unless you happen to be the man behind the Supreme Court or something.

That is most unfortunate, I know. No one, no matter how right, just, or well-meaning they are, can change anything unless they hold political power of some kind in this country. No offence to anyone, but I am so glad not to be an American citizen.
Posted: Fri Jan 13, 2006 7:10 pm
you_die!!! wrote:
> I would have to disagree with you, because I do not plan on adding an AI, unless I would be able to program it to learn only functions and nothing more. [...] Nothing would be put on the robot that could harm a human, such as a knife or firearm.

So you don't want a true robot at all? Just a slave, or a mechanical "do it for me" mechanism? Isn't the true dream of all who study robotics to make a more lifelike, self-operating mechanical being?