
Artificial consciousness


An artificial consciousness (AC) system is an artefact capable of achieving verifiable aspects of "consciousness".

Consciousness is sometimes defined as self-awareness. Self-awareness is a subjective characteristic and may be difficult to test; other measures may be easier. For example, recent work on measuring consciousness in the fly has found that it manifests aspects of attention which equate, at the neurological level, to those of a human; if attention is deemed a necessary prerequisite for consciousness, the fly can therefore be claimed to satisfy at least that criterion.

Some assert that one necessary ability of a conscious system is the prediction of external events wherever this is possible for an average human, i.e. anticipating events in order to be ready to respond to them when they occur, and acting so that the results of its own actions can be anticipated.
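
As an illustration only, and not part of the original article, the toy sketch below shows one way this anticipation requirement might be operationalised: an agent that builds a trivial statistical model of an observed event stream, predicts the next event, and so can be "ready to respond" before it occurs. All names here (AnticipatoryAgent, observe, predict) are hypothetical.

  # Python sketch of an agent that anticipates events in a stream.
  from collections import defaultdict

  class AnticipatoryAgent:
      """Toy agent that predicts the next event from first-order transition counts."""

      def __init__(self):
          # counts of observed transitions: previous event -> {next event: count}
          self.transitions = defaultdict(lambda: defaultdict(int))
          self.last_event = None

      def observe(self, event):
          """Update the internal model with a newly observed event."""
          if self.last_event is not None:
              self.transitions[self.last_event][event] += 1
          self.last_event = event

      def predict(self):
          """Return the most likely next event, or None if there is no basis yet."""
          options = self.transitions.get(self.last_event)
          if not options:
              return None
          return max(options, key=options.get)

  if __name__ == "__main__":
      agent = AnticipatoryAgent()
      # A simple alternating signal the agent can learn to anticipate.
      for event in ["red", "green", "red", "green", "red", "green"]:
          expected = agent.predict()      # prediction made before the event arrives
          prepared = expected == event    # was the agent ready for it?
          print(f"event={event:6} predicted={str(expected):6} prepared={prepared}")
          agent.observe(event)

On this toy model the agent begins anticipating the alternation after one full cycle; whether such predictive ability has any bearing on consciousness is, as the article notes, contested.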

What consciousness is remains controversial. The Wikipedia article Consciousness lists and defines these attributes of psychological consciousness:

  • spatialization
  • analog I
  • analog Me
  • excerption
  • conciliation
  • narratization

As a field of study, artificial consciousness includes research aiming to create and study such systems in order to understand corresponding natural mechanisms.

Some deny that artificial consciousness is possible at all, or that it is possible without strong AI, which some in turn hold to be impossible.

Another area of contention is which subset of the possible aspects of consciousness must be verifiably present before a device would be deemed conscious. One view is that all aspects of consciousness (whatever they are) must be present before a device passes. An obvious problem with that view, which could nevertheless be correct, is that some functioning human beings might then fail to be judged conscious by the same comprehensive tests.

Professor Igor Aleksander of Imperial College, London, stated controversially in his book Impossible Minds (IC Press 1996) that the principles for creating a conscious machine already existed but that it would take forty years to train a machine to understand language.

Examples of artificial consciousness from literature and movies


External Links