An artificial consciousness (AC) system is an artefact capable of achieving verifiable aspects of consciousness.
Consciousness is sometimes defined as self-awareness. Self-awareness is a subjective characteristic and may be difficult to test; other measures may be easier. For example, recent work measuring the consciousness of the fly has found that it manifests aspects of attention which, at the neurological level, equate to those of a human; if attention is deemed a necessary prerequisite for consciousness, the fly can therefore be argued to possess at least that prerequisite.
Some assert that one necessary ability of consciousness is the ability to predict external events wherever an average human can do so, i.e. to anticipate events in order to be ready to respond to them when they occur, and to act so that the results of one's own actions can be anticipated.
What consciousness is remains controversial. The Wikipedia article Consciousness lists and defines these attributes of psychological consciousness:
- spatialization
- analog I
- analog Me
- excerption
- conciliation
- narratization
As a field of study, artificial consciousness includes research aiming to create and study such systems in order to understand corresponding natural mechanisms.
Some deny that artificial consciousness is possible at all, or that it is possible without strong AI, which some in turn hold to be impossible.
Another area of contention is which subset of possible aspects of consciousness must be verifiably present before a device would be deemed conscious. One view is that all aspects of consciousness (whatever they are) must be present before a device can be deemed conscious. An obvious problem with that point of view, which could nevertheless be correct, is that some functioning human beings might then not be judged conscious by the same comprehensive tests.
Professor Igor Aleksander of Imperial College, London, stated controversially in his book Impossible Minds (IC Press, 1996) that the principles for creating a conscious machine already existed, but that it would take forty years to train a machine to understand language.
Examples of artificial consciousness from literature and movies
- Vanamonde in Arthur C. Clarke's The City and the Stars
- Jane in Orson Scott Card's Speaker for the Dead, Xenocide, Children of the Mind and The Investment Counselor
- HAL in 2001: A Space Odyssey
- R2-D2 in Star Wars
- C-3PO in Star Wars