Topic: Perfect...
You would be the one jumping James...
Here it all is... address if you could, concisely. Tell me where this construct fails specifically, and why.

Perception requirements
1.) Perceiver
2.) Stimulus
3.) Perceptual faculty
4.) Survival instinct

Awareness requirements
1.) Perception
2.) Experience
3.) Accessible memory (subconscious)

Self-awareness requirements
1.) Perception
2.) Awareness
3.) Knowledge base (conscious)
4.) Sense of individuality

Human condition
1.) Perception
2.) Self-awareness
3.) Sense of ought
Edited by Abracadabra on Sun 06/08/08 10:50 PM
I am not here to impress you.
But you are! If you're not here to present your view, then why do you bother to type it all in? You're obviously here to present it to someone. Are you only interested in presenting it to 'yes men'? People who won't question it?

Computers do not have the ability, nor can man give computers the ability to perceive...
Baloney. And the reason I say that is because you defined the ability to perceive as the ability to collect and process information. And computers most certainly do that! Now you seem to be trying to use the word 'perceive' to mean 'aware of perceiving'. But that's also where you're trying to go. You can't start with the assumption that you are already where you want to go. Moreover, some computers do indeed have an inherent 'perceiver'.
So now you claim that computers inherently have the ability to perceive?

Absolutely. I've worked on robots that have that ability myself. I worked on industrial robotics and smart bombs. I know what I'm talking about. It's analog technology, no programming required. It's all hard-wired, just like biological animals are.

Instincts are only inherent in living beings. They are not programmable, and neither is perception, nor awareness.
Not true. You can even buy hobby robots that have light sensors on them and will react 'instinctively' to light, or to any other sensory input. In fact, in nanotechnology they very often use these analog techniques.

Computers are not alive, and everything which perceives is.
Not by the definition you gave for 'perceive' - the ability to collect and process information. You didn't say anything about it having to be 'alive'. I think you're just trying to use the word 'perceive' now to simply mean 'aware', and save yourself from having to prove awareness, because you'll already have it by mere proclamation.

Could you address the requirements I mentioned?
Read up on analog computing methods and robotic sensors. Computers can indeed 'perceive' based on the definition that you gave for 'perceive'.
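To make the 'hard-wired instinct' point concrete, here is a toy Python sketch of how a two-sensor light-seeking hobby robot works. This is my own illustration, not code from any actual product, and the function name and normalized sensor values are made up: each light sensor is cross-wired to the opposite wheel, so the robot steers toward the brighter side with no stored program deciding anything.

```python
# Toy sketch of a Braitenberg-style light-seeker: each light sensor is
# 'cross-wired' to the opposite wheel motor, so the robot turns toward
# the brighter side purely by reflex. (Hypothetical illustration;
# sensor readings are normalized to the range 0.0 to 1.0.)

def motor_speeds(left_light: float, right_light: float) -> tuple:
    """Map two light readings directly to two wheel speeds."""
    left_motor = right_light    # brighter on the right speeds up the left wheel...
    right_motor = left_light    # ...so the robot veers right, toward the light
    return (left_motor, right_motor)

# Light is brighter on the right: left wheel spins faster, robot turns right.
print(motor_speeds(0.2, 0.9))   # (0.9, 0.2)
```

In a real hobby robot the 'wiring' is literal analog circuitry (photoresistors feeding motor drivers); the function above just shows the stimulus-to-response mapping with no awareness anywhere in the loop.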
Your assumptions serve no purpose other than to deteriorate the relevance of the conversation at hand.
Unplug the artificial power source, and see just how sentient your robot is... let it plug itself back in. Teach it awareness. Teach it to reason. Teach it a sense of ought. Teach it to come up with a new idea. Address the requirements. Therein lies what you seek from me.
You would be the one jumping James... Here it all is... address if you could, concisely. Tell me where this construct fails specifically, and why.

It will be difficult to say where it 'fails' because I'm not precisely sure what you are trying to 'prove'. But I'll give it a go.

Perception requirements 1.) Perceiver 2.) Stimulus 3.) Perceptual faculty 4.) Survival instinct

A computer can do all of this.

Awareness requirements 1.) Perception 2.) Experience 3.) Accessible memory (subconscious)

What's the difference between 'Perception' and 'Experience'? Does experience mean that you have to have 'prior' perceptions? If that's the case, how could a baby become aware for the very first time? If experience is required for awareness, it could never become aware to have an experience. That's a stumbling block right there. Also, if 'Experience' simply means to have previous 'perceptions', then a computer can certainly have this. In fact, a computer can be set up to innately have previous 'experiences' in its ROM before it's even turned on. And it would certainly have accessible memory. So based on this a computer can become 'aware'. And that may very well be true. I'm just going by your presentation here.

Self-awareness requirements 1.) Perception 2.) Awareness 3.) Knowledge base (conscious) 4.) Sense of individuality

Well, a computer would already have 1 and 2 from above. And we know they can also have a knowledge base. So that brings us to a 'sense' of individuality. What does that mean? How could we know whether a sufficiently complex computer would have a 'sense' of individuality or not? We could certainly write a program to make it 'think' it does. But does that really mean that it is 'aware' of this? Based on your requirements for awareness, I would have to conclude that it would.

Human condition 1.) Perception 2.) Self-awareness 3.) Sense of ought

Well, we've already got 1 and 2 from above. Looks like all we need now is a 'sense' of ought.
If the computer made it this far, I'm sure it would have a sense of ought by now. But what does that even mean? Does a serial killer have a sense of ought?
Unplug the artificial power source, and see just how sentient your robot is... let it plug itself back in.

Do the same thing to a human. Take away its energy source until it passes out from starvation. What's it going to do then? Probably die. I don't see the point here at all.

Teach it awareness. Teach it to reason. Teach it a sense of ought. Teach it to come up with a new idea. Address the requirements. Therein lies what you seek from me.

I don't see why it couldn't be taught all of these things. The real question is whether or not 'it' knows that it's alive. But you can't say, any more than you could say it about another human being. You can't disprove solipsism, and you're going to try to say whether or not an android could actually have a sense of self like a human being? I don't think you have anything here but a bunch of ill-defined words, to be quite honest about it. I'm sorry Michael, but you seem to be the one who is out to claim something. I'm saying we don't know. You're saying we can know, and that your arguments here somehow show that you have logically reasoned it out. I disagree.
Edited by Abracadabra on Sun 06/08/08 11:27 PM
In fact Michael,
I truly believe that given enough funding, where I could have enough teams of engineers working for me, I could indeed produce an Anadigidroid that would meet all of the requirements that you have listed. So if that's all that is required for a sentient lifeform, then my Anadigidroid would be a living sentient being. I would have succeeded in creating 'life' using non-biological materials. But a lot of people would disagree that the thing is indeed a lifeform. They would claim that it's "just a robot". The bottom line is that it would be impossible to know, because if you ask the thing if it is sentient, it would say to you, "Yes, I am alive." The question would then be: does it know what it's talking about?
In fact Michael, I truly believe that given enough funding, where I could have enough teams of engineers working for me, I could indeed produce an Anadigidroid that would meet all of the requirements that you have listed. So if that's all that is required for a sentient lifeform, then my Anadigidroid would be a living sentient being. I would have succeeded in creating 'life' using non-biological materials. But a lot of people would disagree that the thing is indeed a lifeform. They would claim that it's "just a robot". The bottom line is that it would be impossible to know, because if you ask the thing if it is sentient, it would say to you, "Yes, I am alive." The question would then be: does it know what it's talking about?

Would you make me one please?
Would you make me one please?

Do you have billions of dollars to fund the project? If you do, I can get you something up and running in about a year's time. But it will be extremely crude. Functional, but crude. Think of it like a newborn 'infant': quite helpless yet, but with great potential to grow. That's not bad really. I'm basically talking about a 12-month gestation period to give birth to a sentient being. Not bad, huh? Of course, that would depend on the finances, which would mainly go toward the salaries of the multitudes of engineers that would be working under my tender loving supervision. Hey, creating a sentient life form is no small task.
Would you make me one please?

Do you have billions of dollars to fund the project? If you do, I can get you something up and running in about a year's time. But it will be extremely crude. Functional, but crude. Think of it like a newborn 'infant': quite helpless yet, but with great potential to grow. That's not bad really. I'm basically talking about a 12-month gestation period to give birth to a sentient being. Not bad, huh? Of course, that would depend on the finances, which would mainly go toward the salaries of the multitudes of engineers that would be working under my tender loving supervision. Hey, creating a sentient life form is no small task.

You got that right. Of course it's easier if you are a woman and still ovulating.
Edited by Abracadabra on Mon 06/09/08 12:12 AM
How to Build a Sentient Anadigidroid
Perception requirements

1.) Perceiver - This would be the "I AM" project: to construct the perceiver (the brain). This is one team of engineers (a very large team).
2.) Stimulus - This would be the "I FEEL" project: to build the sensors, as well as the body. This is a second team of engineers.
3.) Perceptual faculty - This would be the "INTERFACE" project; it would also include mobility. A third team of engineers.
4.) Survival instinct - This is basically a digital program, but can also include hard-wired analog 'instincts'. This would be a very small team, closely related to the "I AM" project and the "I FEEL" project.

Awareness requirements

1.) Perception <-- Done.
2.) Experience - We're up and running now, teaching it how to walk, run and do gymnastics (gaining experience).
3.) Accessible memory (subconscious) - This is a given, taken care of by the "I AM" project.

Self-awareness requirements

1.) Perception <-- Done.
2.) Awareness <-- Done.
3.) Knowledge base (conscious) - Off to school we go! Physical and mental therapy. More engineers (borderline doctors). (Just more EXPERIENCE.)
4.) Sense of individuality - This will develop by itself following the work of the "I AM" project.

Human condition

1.) Perception <-- Done.
2.) Self-awareness <-- Done.
3.) Sense of ought - It's basically sentient already. Teach it whatever you want. Just PLEASE don't turn it into a Christian! (Not that there's anything wrong with Christians. It's just that it's so unnecessary! Just teach it to be nice and it will be.)

~~~

Yes, Michael, I think you have it pretty well figured out. The only thing you got wrong is the idea that an android cannot be built to meet these requirements. It can be built!
Cute extrapolations...
I don't see the point here at all.
Two initial points... computers do not have survival instinct, which is inherently necessary for perception, and humans do not have an artificial power source. You want proof? Everything which perceives has those requirements. Nothing fulfills all those requirements which does not perceive. Is that not establishing truth based upon logic? Computers do not fulfill the requirements James, if for no other reason than they cannot have that which necessitates perception's existence... survival instinct. Therefore, computers cannot perceive. That is the difference: the source of the need for perception. Computers do not have personal needs James.

I don't see why it couldn't be taught all of these things.
Instinct cannot be taught.

The real question is whether or not 'it' knows that it's alive.
Now you consider a computer to be alive? Being alive requires a physiological construct, as do instinct, perception, awareness, self-awareness, and the human condition. The latter of these you are attempting to correlate with artificial intelligence, while simultaneously expecting me to disprove your assertion.

I'm sorry Michael, but you seem to be the one who is out to claim something.
No need to be sorry James. Of course I am claiming something. I have laid it all out, and I await your logical refutation. This path began with my assertion that awareness is not possible without perception. You claimed that awareness came before perception, which cannot be true, because there are plenty of living creatures which perceive but are not aware of it. Knowing this compelled me to think about the subject more directly. I am not sure exactly why you insist on the artificial intelligence aspect. I find it highly suspect and completely unreasonable for you to suggest that there is no difference between humans and AI, and then expect me to refute your claim, which does not have the necessary elements to begin with.

I'm saying we don't know... You're saying we can know. And that your arguments here somehow show that you have logically reasoned it out. I disagree.
Of course you do James, you claim that computers have a survival instinct, awareness, and self-awareness. Those claims are without merit. No disrespect intended; your knowledge on many things exceeds my own, and I am not at all ashamed to say that.

Perception requirements 1.) Perceiver 2.) Stimulus 3.) Perceptual faculty 4.) Survival instinct

To the above you wrote this... A computer can do all of this.
It most certainly does not have those requirements. You may be able to make a reasonable case for three of the four James, but all of those three completely require the fourth, and that element exists only in a natural biological framework... survival instinct.

What's the difference between 'Perception' and 'Experience'?
Perception is the ability to collect information, and it requires five different elements, one of which is the act of perceiving (experience), which can be separated into two different types, conscious and subconscious. I must admit that the awareness construct should have read conscious experience... it does in my notes... Experience (subconscious) all by itself is also necessary for perception, but it does not require awareness of any sort.

If that's the case, how could a baby become aware for the very first time? If experience is required for awareness, it could never become aware to have an experience. That's a stumbling block right there.
I fail to see the logic here. A baby is born and possesses all of the elements necessary for awareness, the first of which is perception, which, as I just stated, necessarily includes subconscious experience. Becoming aware happens only after all of the necessary elements exist. James, a thing does not have to be aware to have perception.

So based on this a computer can become 'aware'. And that may very well be true. I'm just going by your presentation here.

You did not base the above on anything valid. Your claim fails at the very beginning. Computers do not have survival instinct James, therefore they also cannot have perception, awareness, self-awareness, nor a human condition. You would have to prove that computers have survival instincts in order to continue through the construct.

Does a serial killer have a sense of ought?
Yes, unfortunately so. The sense of ought is what separates humans from most other animals.
Edited by Jeanniebean on Mon 06/09/08 07:05 AM
How to Build a Sentient Anadigidroid Perception requirements 1.) Perceiver This would be the "I AM" project. To construct the perceiver (the brain). This is one team of engineers. (a very large team) 2.) Stimulus This would be the "I FEEL" project. To build the sensors, as well as the body. This is a second team of engineers.

Abra, I imagine that androids of the future will be part regular machine and part biological. Perhaps more study should be done surrounding the pineal gland, which could be the basis for the "I AM" center. If this gland can be constructed biologically from existing DNA (perhaps grown or created), then they could be grown in a lab and inserted into the android. Upon the official "birth" of the android, perhaps a real connection can be made with source through this gland.

Did you see the cute movie "Short Circuit"? I thought that was cute. That robot became alive and sentient after it was struck by lightning. Later, when he realized that "disassemble" meant death for him, he realized he did not want to die. His survival instinct came from the fact that he desired to live. Therefore, feelings and desire are necessary for real survival instinct. An artificial survival instinct could just be built-in protection devices.

JB
Edited by Abracadabra on Mon 06/09/08 08:44 AM
No need to be sorry James. Of course I am claiming something. I have laid it all out, and I await your logical refutation. This path began with my assertion of awareness not being possible without perception.
Logical refutation? All you have is a list of words whose own definitions you seem to refuse to accept. You simply claim that perception must come before awareness, but you don't give any logical explanation of how awareness arises from perception. In fact, you claim that only living things can 'perceive'. Yet you have defined 'perceive' to be nothing more than the ability to collect and process information. You refuse to accept that computers can do this. You refuse to accept that computers can perceive. But you have no logical basis for that refusal, given your definition of 'perceive'.

In all honesty Michael, you seem to be demanding that 'perception' already is 'awareness'. That if you can't be 'aware' of what you 'perceive', then it doesn't count. Your claim that only living animals can 'perceive' is a bogus claim based on the definition that you've given for perceive. You're clearly demanding that the ability to perceive requires more than just the ability to collect and process information. In short, you're claiming that I can't build a robot that can even 'perceive' without being 'aware' of what it is doing. But robots like that already exist. There are plenty of robots in this world that can 'collect and process information'. So by your definition, robots already exist that can 'perceive'. Yet you deny that this can be true.

You're the one who is being illogical and unreasonable here Michael. You define 'perceive' as nothing more than the ability to 'collect and process information', but then you deny that computerized robots can do this. Clearly you are demanding that the ability to 'perceive' is something more than just the ability to collect and process information. Clearly you are demanding that there already be an awareness of the perception. You are starting with the assumption that you already have awareness before you even move to your next step. Otherwise you could not deny that computers can collect and process information, which is your definition of 'perceive'.

I'll just call you out on your first step, because you haven't shown me why computers can't collect and process information, which is what you have claimed to mean by 'perceive'. How can you rule out computers at this early step? They most certainly can collect and process information. Therefore you are cheating: you are demanding that 'perceive' means something more than just to collect and process information, yet you refuse to say what that something is. You just arbitrarily claim that only living things can perceive. That's where you are not being consistent. You refuse to broaden your definition of 'perceive' to include these additional criteria.

This path began with my assertion of awareness not being possible without perception.
No, not at all. You seem to be asserting that 'perception' is only possible for living animals that can be 'aware' of it. You seem to be asserting that awareness must come first, even though you are claiming to assert the opposite. If you stick to your definition that the only thing required for the ability to 'perceive' is the ability to collect and process information, then you must concede that computers can perceive, by your very own definition of the term!
Either you have to concede that computers can indeed 'perceive' by your definition of the term, or you must concede that you are demanding a lot more of perception than just the ability to collect and process information. You're demanding that 'awareness' already exists! That's what you're trying to do when you claim that only living things can 'perceive'. That claim is not in line with your definition of 'perceive'.
Perception is the ability to collect information and it requires five different elements, one of which is the act of perceiving (experience), which can be separated into two different types, conscious and subconscious.
I must admit that the awareness construct should have read conscious experience... it does in my notes... Experience (subconscious) all by itself is also necessary for perception, but it does not require awareness of any sort.

Ok, given this, then you could say that computers can perceive in their 'subconscious': a consciousness that they are 'unaware of'. In fact, I can easily see by this that computers can already be said to have a 'subconscious'. So what does it take, then, to 'become aware'? That is true sentience. Based on your constructs thus far, I give computers the ability to perceive and store their collected information (experiences) in their subconscious memory. They still wouldn't be sentient at this point, because they are not yet 'aware'.
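Since 'perceive' here means only to collect and process information, and 'experience' can sit in memory the machine never inspects, the whole arrangement fits in a few lines of Python. This is a toy of my own making, with invented names like `Perceiver` and `subconscious`, not anyone's real system; it even allows pre-seeded experiences, like the ROM idea mentioned earlier in the thread.

```python
# Toy model of perception without awareness: the machine samples a
# stimulus, does a crude processing step, and files the result in a
# memory store it never examines: its 'subconscious' record.
# Memory can even be seeded before first power-on, like experiences in ROM.

class Perceiver:
    def __init__(self, seeded_experiences=None):
        # Pre-loaded 'innate' experiences, analogous to ROM contents.
        self.subconscious = list(seeded_experiences or [])

    def perceive(self, stimulus: float) -> None:
        """Collect and process one stimulus; store it, unexamined."""
        processed = stimulus > 0.5       # the 'processing' step, kept trivial
        self.subconscious.append((stimulus, processed))

p = Perceiver(seeded_experiences=[(0.9, True)])  # one experience 'in ROM'
p.perceive(0.3)
p.perceive(0.7)
print(len(p.subconscious))   # 3 stored experiences, none of them 'noticed'
```

Nothing in this sketch ever reads `subconscious` back, which is the point: collecting, processing and storing information requires no awareness of any of it.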
I am not sure exactly why you insist on the artificial intelligence aspect. I find it highly suspect and completely unreasonable for you to suggest that there is no difference between humans and AI, and then expect me to refute your claim, which does not have the necessary elements to begin with.
I'm not claiming that there is no difference. I'm ASKING the question "What is the difference?"
Edited by feralcatlady on Mon 06/09/08 09:12 AM
Well, with the hardened heart that you have... this again is no great shock. I would say you fit the profile of what you're saying to a tee... maybe you should use yourself as the sample of this perfect robot. But for me, I will

Believe in one who sets me free
I believe in one who heals me
I believe in one who comforts me
I believe in one who gives me joy
I believe in one who softens my heart
I believe in one who gives me everything
I believe in one who leads me
I believe in one who speaks to me
I believe in one who gives me life
I believe in one who shows me mercy
I believe in one who prays for me
I believe in one who is wonderful
I believe in one who died for me and is the perfect one... now you just try to make a robot to compare... I think not.
Of course you do James, you claim that computers have a survival instinct, awareness, and self-awareness. Those claims are without merit. No disrespect intended, your knowledge on many things exceeds my own, and I am not at all ashamed to say that.
I'm not claiming that computers already have these abilities. I'm merely claiming that a sufficiently complex Anadigidroid can satisfy all of the criteria that you have laid out. And therefore, by your criteria for sentience, that android would have to be sentient. I'm not sure whether I agree with that conclusion or not. I'm just saying that by your criteria for sentience it would have to be sentient.

Jeanniebean (and religious people) would disagree. They both believe that our true nature is ultimately connected to (or provided by) a spiritual source. In my own pantheistic view I could potentially go either way. Jeanniebean believes that to be connected to the higher spiritual source, it must be done via a special physical interface gland. She may very well be right. There may be an actual 'plug' or umbilical cord that connects us to the spirit world. However, I'm willing to believe that it may very well be possible that any sufficiently complex mind will automatically be connected to the spirit world, because all is spirit. Anything that can become complex enough has the innate ability to become aware of itself.

In other words, even if your constructs ultimately turn out to be correct, the pantheistic view of a spirit world could still be intact, thus allowing for the android to indeed become a full-fledged spiritual being via nothing more than its own cognizance. Pantheism says, "God is in the rocks"; therefore, if a rock can be made self-aware, it would be the spirit being aware of itself as a rock. The android is nothing more than an extremely complex rock. Of course, if atheism is true, then we're all nothing but biological androids in the first place.

This whole discussion has only served to reinforce my own personal ideas of pantheism. But I must confess, I've been down this road before. I have considered the possible sentience of manmade androids in the past. My personal conclusion is that they very well may be able to become sentient. In fact, pantheism almost demands it. I'm not so sure about the need for an interface gland that Jeannie speaks of. There are reasons that it may be required, and there are reasons why it may not be required. I'm still up in the air on that one.
Debbie wrote: well with the hardened heart that you have... this again is no great shock.
I have a hardened heart because I refuse to believe a story about a supposedly perfect God who created a universe where he has to be nailed to a cross to save his own creation? Yeah right. Really Debbie, I think that just means that I don't believe that our creator is that stupid. What am I supposed to do? Feel sorry for God because he's such a poor designer?
It has nothing to do with you believing in God or not believing in God... you have a hardened heart because of what your experiences have been. Deny it, Abra, and you're lying to yourself.