An artificial consciousness (AC) system is an artefact capable of achieving verifiable aspects of consciousness.
Consciousness is sometimes defined as self-awareness. Self-awareness is a subjective characteristic which may be difficult to test; other measures may be easier. For example, recent work measuring the consciousness of the fly has determined that it manifests aspects of attention which, at the neurological level, equate to those of a human, and, if attention is deemed a necessary prerequisite for consciousness, the fly is claimed to have a strong case.
It has been argued that one necessary ability of consciousness is the ability to predict external events wherever this is possible for an average human, i.e. to anticipate events in order to be ready to respond to them when they occur, and to act in such a way that the results of one's actions can be anticipated.
Paul Beardsell insists this is patently nonsense, arguing that:
- A human in an entirely strange, confusing situation where anticipation is impossible could still be entirely conscious.
- It is not necessarily only humans which are conscious.
- Being conscious of the past makes sense.
- An entirely introspective consciousness incapable of anticipation seems possible.
- Being without senses, possibly only temporarily, and therefore unable to gather the information needed for anticipation, while still being conscious, also seems entirely possible.
As a field of study, artificial consciousness includes research aiming to create and study such systems in order to understand corresponding natural mechanisms.
Examples of artificial consciousness from literature and movies are:
- Vanamonde in Arthur C. Clarke's The City and the Stars
- Jane in Orson Scott Card's Speaker for the Dead, Xenocide, Children of the Mind, and The Investment Counselor
- HAL 9000 in 2001: A Space Odyssey
- R2-D2 in Star Wars
- C-3PO in Star Wars
Professor Igor Aleksander of Imperial College, London, stated in his book Impossible Minds (IC Press 1996) that the principles for creating a conscious machine already existed but that it would take forty years to train a machine to understand language. This is a controversial statement, given that artificial consciousness is thought by most observers to require strong AI. Some people deny the very possibility of strong AI; whether or not they are correct, certainly no artificial intelligence of this type has yet been created.