Topic: Philosophy Challenge: Define Non-physical
I tried real hard to communicate the meaning of 'non-physical', but the only way to convey it was with my non-physical fingers... and they simply output that communication to a non-physical screen... soooo... in order to receive that communication, I reckon y'all will have to tune into the non-physical internet and download it non-physically.

Right on. I agree with this satire LITERALLY.
---
I think that the computer analogy can provide some insight into the concept of "body".
Correct, an analog may be challenging to build, but it is a finite task.
With human bodies, there are senses that receive input. The eyes and ears are probably the most readily compared to the computer analogy. If we hook up a camera and a microphone to the computer, would that not be equivalent to giving it "sight" and "hearing"? And if we go further and put it into a robotic mechanism that is capable of ambulation, we need to provide a whole complex system for sensing the relationship between the mechanism and the terrain over which it is ambulating. So it seems to me that the whole system of mechanical and electronic input/output devices, which provides the "brain" with an interface to the external world, would be functionally equivalent to a "body".

What exactly was the point? Not being rude, but honestly curious; I love this topic.

Whether or not a computer would be considered to "have a concept" at all is, of course, debatable. But if it were capable of "conceiving" in that sense, it seems to me that the various "extensionals" (legs or wheels for mobility, camera and microphone for input, speaker for output, etc.) would constitute that computer's "sense of body".
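As an illustrative aside (not something from the thread itself): the "body as a system of input/output devices" idea in the post above can be sketched in a few lines of Python. Every name here (Sensor, Actuator, Body) is hypothetical, invented only to mirror the post's wording; it is a toy model of the analogy, not a claim about how any real robot or brain simulation is built.

```python
# A minimal, hypothetical sketch of the "body = system of I/O devices" analogy.
# Nothing here comes from the thread; the class names are invented for illustration.

class Sensor:
    """An input device, e.g. a camera ("sight") or a microphone ("hearing")."""
    def __init__(self, name):
        self.name = name

    def read(self):
        # A real driver would return frames or audio samples; this just returns a label.
        return f"data from {self.name}"


class Actuator:
    """An output device, e.g. wheels or legs for ambulation, or a speaker for output."""
    def __init__(self, name):
        self.name = name

    def act(self, command):
        return f"{self.name} executes: {command}"


class Body:
    """The whole bundle of I/O devices that interfaces a 'brain' to the external world."""
    def __init__(self, sensors, actuators):
        self.sensors = sensors
        self.actuators = actuators

    def perceive(self):
        return {s.name: s.read() for s in self.sensors}

    def move(self, command):
        return [a.act(command) for a in self.actuators]


# The "brain" only ever meets the world through this interface:
body = Body(
    sensors=[Sensor("camera"), Sensor("microphone")],
    actuators=[Actuator("wheels"), Actuator("speaker")],
)
print(body.perceive())   # {'camera': 'data from camera', 'microphone': 'data from microphone'}
print(body.move("roll forward"))
```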
---
The concept of a body is a learned experience.
I suppose a man-made electromechanical 'brain' could grow into such a concept.
---
Edited by JaneStar1 on Tue 12/01/09 09:17 PM
To be exact, Sky, my statement was in response to Jeannie's insistence that
We know we exist because we have a body and we are self aware. We actually identify ourselves with our body. If a simulated computer 'brain' does not have a 'body' what will its self-awareness be like?
... to which I replied that a simulated computer 'brain' does not identify itself with a body (yet), but only with the Input & Output. Back in the mid-80s I visited my uncle's robotics lab in the US: his students constructed a Mobile Autonomous System -- a little camera mounted atop a moving cart, which was equipped with a receiver through which the students could set its parameters... and the cart would autonomously find its way through the terrain to its final destination. They had also built a robotic arm which could perform many tasks -- pick an object up, move it to a certain location, etc.

However, the Canadian project has been adopted for the space shuttle. What I meant to say is that the technology is already in place. Besides, Japan has already produced a dumb toy robot (full size) capable of performing simple tasks... Unfortunately, the current level of the technology doesn't allow for marrying the two together -- computer miniaturization still has a way to go! (though we're on the right track...)
---
... to which I replied that a simulated computer 'brain' does not identify itself with a body (yet), but only with the Input & Output.
I'm sorry Jane, you may have missed the progress of the conversation. I was speculating about the Blue Brain project (http://bluebrain.epfl.ch/) and whether the brain itself might "eventually" result in an emergent consciousness or self-awareness (earlier in this thread). If this happened, it is expected that the brain might be able to create a virtual reality that we could actually step into (like a hologram). My question concerned how the brain consciousness might become aware of self and experience its own created reality by also manifesting a "body" that could live inside of that reality. Sort of like a virtual computer world or holographic computer program.

I am thinking it would be pointless to create a reality if you can't experience it. So just as you create a dream body to experience your dreams, the computer brain would create a body so it could experience its virtual reality. This body would be what it identified itself with.
---
Philosophy is logic in its most basic form, so yes, logic is required.
---
Thought.
The state of awareness and the process of thinking are non-physical. The cause of it and its origin do not define it. Mass may influence energy, but it does not define it. We are universal detectives because of awareness. We can never truly define anything, only to the point of our satisfaction. The answers are as never-ending as the questions, but that is the joy and sorrow of our affliction.
---
Edited by JaneStar1 on Wed 12/02/09 12:38 AM
Jeannie maintains:
My question concerned how the brain consciousness might become aware of self and experience its own created reality by also manifesting a "body" that could live inside of that reality. Sort of like a virtual computer world or holographic computer program.
I comprehend your idea -- a truly conscious computer, i.e. an android, capable of human-like feats (perception, cognition, etc.). Unfortunately, even the Blue Brain -- as advanced as it is -- is still as far away from consciousness as the Earth is from Alpha Centauri! The movie "I, Robot" depicts something that won't be realized for at least another 100-200 years minimum! (Imagine that: Isaac Asimov wrote that book back in 1950 -- what foresight!!!) Unfortunately, for now (and for the foreseeable future), computers will remain just dumb number-crunching mechanisms, incapable of (i.e. not programmed for) performing any independent "thought"! (not to mention being anywhere close to self-awareness!!!) Until the scientists decide to equip the computer with artificial limbs, it won't be capable of conceiving of the idea on its own -- it is not aware of any body parts, nor is it aware of the necessity for any body parts (or a body in general)... Yet.
---
Edited by Jeanniebean on Wed 12/02/09 09:36 AM
Jeannie maintains: My question concerned how the brain consciousness might become aware of self and experience its own created reality by also manifesting a "body" that could live inside of that reality. Sort of like a virtual computer world or holographic computer program.
I comprehend your idea -- a truly conscious computer, i.e. an android, capable of human-like feats (perception, cognition, etc.). Unfortunately, even the Blue Brain -- as advanced as it is -- is still as far away from consciousness as the Earth is from Alpha Centauri! The movie "I, Robot" depicts something that won't be realized for at least another 100-200 years minimum! (Imagine that: Isaac Asimov wrote that book back in 1950 -- what foresight!!!) Unfortunately, for now (and for the foreseeable future), computers will remain just dumb number-crunching mechanisms, incapable of (i.e. not programmed for) performing any independent "thought"! (not to mention being anywhere close to self-awareness!!!) Until the scientists decide to equip the computer with artificial limbs, it won't be capable of conceiving of the idea on its own -- it is not aware of any body parts, nor is it aware of the necessity for any body parts (or a body in general)... Yet.

I know that, Jane. I am not asserting any such thing. I am speculating about what 'might' happen with the Blue Brain project IF the assertion is true that consciousness arises from form. The question that will be answered is: if he succeeds in duplicating a human brain, does the possibility even exist that it might become conscious and self-aware? If it does not, then the idea that life and form arise from consciousness is supported. If it does, then the idea that consciousness arises from form is supported. So why is this hugely important? Because it comes close to proving that consciousness (spirit) exists BEFORE form is manifested. It's evidence in the "chicken or the egg" question. It's evidence of the existence (or not) of disembodied spirit consciousness. This brain may never become conscious. It may never become aware, no matter what they do. Which is evidence that consciousness does not arise from form (or the brain).
---
Philosophy is logic in its most basic form, so yes, logic is required.
I suppose you could say that Marxism requires logic. Not my logic, of course, but somebody's, I guess. White supremacy is a philosophy of sorts. Logical? Eh. Whatever floats your boat, huh?
---
Edited by Monier on Wed 12/02/09 09:38 PM
Philosophy is logic in its most basic form, so yes, logic is required. I suppose you could say that Marxism requires logic. Not my logic, of course, but somebody's, I guess. White supremacy is a philosophy of sorts. Logical? Eh. Whatever floats your boat, huh?

Nobody said that logic was good or evil. The same goes for philosophy. It is not a result, but a process. White supremacy a philosophy? Hardly. Philosophy and logic cannot be supported by fallacies. Making up definitions without exploring the realism of beliefs is folly, not logic. Philosophy to Logic to Critical Thinking: the three above are similar, only more advanced stages of the process.
---
BushidoBilly: If form gives rise to experience, then if this simulated brain had the same neural response to the same data, I would imagine the same experience would be involved.
Unfortunately, form alone isn't sufficient for giving rise to anything... (otherwise the jungle would be filled with human-like creatures!) On the contrary, monkeys and even apes no longer evolve! P.S. On the other hand, what's a "Bigfoot" if not an evolved ape???