Sister blog of Physicists of the Caribbean in which I babble about non-astronomy stuff, because everyone needs a hobby

Wednesday 8 February 2023

Pinning down consciousness

After some recent reads, I'm tempted to venture a working definition of what I think I mean when I say "consciousness". This is nothing more than what I find useful, mind you.

I've said in the past that I think consciousness is a spectrum, and that a dreaming or sedated person has a consciousness of a sort. But now I'm wondering whether it wouldn't be better to demarcate consciousness as one particular kind within the more general category of mental states. Clearly, dreamers and the drugged have mental states, but they're not conscious in the usual sense. We can be subconsciously aware of our surroundings all the time without actively thinking about them, and in the vernacular we often use "conscious of" to mean "attending to" rather than merely "sensing".

It's helpful to start with the extremes. Let me be bold enough to declare that I think our regular, waking sort of awareness - under the Sun at noon, in bright light, with nothing hidden or concealed beyond our own sensory limits - is the truest sort of consciousness. Look, I know, we could spend all day trying to justify this, but I don't want to do that so I won't.

Under these conditions, I think the following characteristics define our conscious person. Each of these needs to be heavily qualified, but let me just state them first :

  • Inner awareness. They have an inner awareness of themselves and their surroundings which has no direct connection to their external reality. For example, they can create mental imagery, run an inner monologue, and in general conjure up imaginary things that aren't there.
  • Intelligence. They can take information and process it, using some kind of judgement (preferably, but not necessarily, reasoned and self-consistent) to arrive at a conclusion.
  • Agency. That is, they feel they can make the choices put before them and decide for themselves how to act. This can be done with varying levels of intelligence.
  • Sentience. They receive external information from their senses (or internal, in the case of proprioception) which is incorporated into their internal awareness.
  • Self-awareness. They have control over themselves in relation to the external world, and can distinguish their own inner awareness from their perceived external reality.
My reasoning is that we can surely all agree that someone having all of these characteristics is definitively conscious, and someone having none of them is definitely dead, or a rock. Furthermore, each individual characteristic is a spectrum : you could be sentient about one or many things and have varying degrees of sensitivity; you can have agency over some things you do (choosing to go left or right) but not others (liking or disliking a particular taste); you can sometimes muddle up external and internal realities (did I just see that or was it a trick of the light in the corner of my eye ?).

With this definition, not only is consciousness itself allowed to be a spectrum, but you can have qualitatively different sorts of mental states as well. A dreamer is not conscious, but is clearly alive; an intelligent robot can be useful, but isn't necessarily conscious or alive.

Let me further qualify these characteristics to avoid any gross misunderstandings. Note that I'm trying to keep each of these as independent as possible in order to allow for different types of mind and information processing. The reasons for this will become clearer as we go on. 


Inner awareness : I mean a mental state that has no direct connection to reality; essentially, qualia. A conscious person experiences something. For example, colour is not wavelength, touch is not pressure, sound is not frequency, and heat is not temperature. What we experience is how our brain interprets those external events, and the experience of a thing is not at all the same as the thing itself. Memory is another good example. Other things are also totally imaginary* and exist nowhere in reality, e.g. concepts ("show me one atom of justice" and all that). Such imaginings need not be at all sophisticated, however.

* From our senses we only know our mental representations of the external thing and not the true nature of the thing itself - but we do at least know that something external induced an internal representation. 

For me this is the main hallmark of a mind and of thought. The "mind" is the collection of these thoughts, which are all of this same basic type. If you don't have this inner, unphysical awareness, you're a mindless zombie or an abacus. But if you do have it, you still might not be what we mean by a fully conscious being. Awareness is necessary but not sufficient for consciousness; it is, however, both necessary and sufficient for a mind.


Intelligence : This I see as the capacity to solve problems. That is, processing and integrating information, forming a reasoned judgement about how things work, and drawing conclusions appropriately. This need not be a rational process at all - that's much more sophisticated. Pavlovian responses are a good example of basic intelligence. A pure stimulus-response chain (smell food -> go towards food -> eat the food) is not intelligence, but responding to something only indirectly related requires some level of reasoning (hear a bell -> go to where the food is dispensed -> eat the food). You can't do this without some sort of very basic learning.

The difference between instinct and intelligence isn't always clear. Some key behaviours can be purely instinctual but then applied, with genuine intelligence, to different circumstances. And computers, in my view, can count as intelligent, but they're not at all likely to be conscious. In the main, though, they have only the most rudimentary form of intelligence, which further blurs the line with instinct : they can carry out highly sophisticated calculations, but only in a purely mechanical, instinct-like way. They don't form any chains of reasoning by themselves. Chatbots are increasingly able to overcome this, but I hasten to add that intelligence does not automatically imply that any of the other conditions I'm proposing are satisfied.

Intelligence, like some of the other characteristics, surely requires memory. I'm hesitant to include memory as a parameter in its own right because intelligence seems to me more crucial to consciousness : an automaton could regurgitate information, but intelligence requires some form of thought (if not necessarily the conscious kind). Memory is thus included here only implicitly.


Agency : Some capacity for decision-making is essential. It doesn't mean a conscious being must actually be capable of carrying out its decisions, only that it's capable of making them. It must be able to determine or act towards a goal, however crudely. 

Note that I regard this as closely related to, but falling well short of, intentionality. I don't think any artificial intelligence at present comes close to having any sort of intentions whatsoever - none of them have even the merest inkling of any sort of "desire" at all. For that I think you also require this inner awareness, the capacity to form internal imaginary constructs. As it stands, programs can be said to have "agency" only in the very loose, very basic sense that they can make decisions. They can't really act on their own volition because they don't have any. They can have goals, but they can't as yet analyse those goals and alter them, certainly not based on any deeper objectives.

I should note a couple of other points. First, I don't mean "desire" in the emotional sense; I don't think emotion is necessary for consciousness. There isn't a good word for "desire" that isn't also emotive, so we're just going to have to live with that.

Second, I don't propose to define free will here; I personally do think it's a thing, but that's a whole other gargantuan kettle of fish. More to the point is that a conscious being (which must have inner awareness) will tend by default to believe it's acting on its own will. It will probably, I think, have true intentionality, but let's limit this requirement to mere agency to play it safe. This property doesn't require intelligence : it could be limited to just choosing between options or setting a numerical value. The key point is that it makes decisions. Of course, when coupled with intelligence, this becomes very much more powerful.


Sentience : It's an interesting question whether this condition can exist independently of all the others. I lean towards no. A camera receives light, but by itself it can't be said to be "sensing" anything - something must occur as a result of its input for that to be the case. This could be merely changing the inner awareness, or causing a decision to be made or the information to be analysed. But it can't do absolutely nothing at all, or nothing has really been sensed. So while sentience isn't tied to any particular one of the other conditions here, it can't exist without at least one of them. You can't have something which is just sentient.

Should we take this capability to refer only to the external world, or should we allow internal senses as well ? I lean towards the former. Bearing in mind that I've said consciousness is just one particular sort within a more general variety of mind, I'd say a person who is totally disconnected from the external world is still, crucially, thinking, but, like a dreamer, we wouldn't say they are conscious.

A more general point is that a conscious being requires something to think about. A sentient being automatically has this, but sensory perception is not the only route by which this can happen. Computers without external sensors can still have information to process, and I want to allow for the possibility of a conscious (or at least mentally aware) computer, otherwise there's a risk of defining valid options out of existence. So just as memory is implicit for intelligence, let's let "access to information" be implicit for sentience.


Self awareness : Since we allow the minimum condition for sentience to be sensing the external world, self awareness here means being able to distinguish the internal and external realities. In this definition, a conscious being cannot help but have an internal reality - that is indispensable to the whole concept of a mind. But to be conscious they must be able to sense the external reality as well. They must further be able to distinguish the two, otherwise they cannot have any real agency within their environment : they would be living in a dreamworld, or more accurately a sort of augmented reality which blurs the lines between real (external) and imaginary (internal).

Bearing in mind again that dreaming is here defined to be a kind of thinking, and consciousness another sort, it seems essential that conscious creatures must be at least partly able to distinguish the two. And again, since I allow everything to be on a spectrum, it's not necessary that they are fully capable of making this distinction at all times (hardly anyone can do that). It's enough that they can do it at all.



So that's my minimally-conscious entity then. It has an inner life, can sense the external world and distinguish it from its imagination, make judgements about information, and has goals. It needn't do any of these very well, but the more it does them, the more conscious it can be said to be. A dreaming person thinks, but is not conscious; they may think they have agency, but are unable to make decisions with respect to the real world. Sleepwalkers blur this distinction, and that's perfectly fine since this definition allows for different degrees and types of consciousness.

Recall that I've decided that memory and access to information, and possibly emotions, are only implicit requirements for consciousness. A thinking being is probably going to require these in general, but a being which is specifically conscious rather needs the higher-level attributes of intelligence and sentience (and possibly intentionality).


We can explore this further by considering different possible configurations. Taking some inspiration from the "moral matrix" concept, I envisage these characteristics as a series of sliders. Unfortunately there are rather a lot of possible combinations... if we reduce them to the binary on/off conditions, there are 2^5 = 32 possibilities. Allow an intermediary state and that's 3^5 = 243 options.

Well, the binary cases aren't so bad. Let's simplify the criteria so that :

Inner awareness = mindless if none, mindful if present
Intelligence = stupid if none, clever if any
Agency = inert if none, intentful if any
Sentience = blind if none, sentient if any
Self-awareness = dreamlike if none, awake if any

And allowing all options to have memory and access to information in general, as implicitly required.

Using this we can print out all 32 possibilities and explore the various combinations of mental states, just for funzies. Let's group them according to the number of conditions satisfied, and sub-group them according to whether they have inner awareness (a true mind, however limited) or not.

Yes, really, this is how I spend my spare time.
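
Since this is really just a counting exercise, here's a minimal sketch of how the enumeration below might be generated. It's plain Python using itertools, with the trait labels taken straight from the list above; purely illustrative, not how I actually produced the list :

from itertools import product

# Trait labels : (absent, present), in the order inner awareness,
# intelligence, agency, sentience, self-awareness.
TRAITS = [("Mindless",  "Mindful"),
          ("Stupid",    "Clever"),
          ("Inert",     "Intentful"),
          ("Blind",     "Sentient"),
          ("Dreamlike", "Awake")]

# All 2^5 = 32 on/off combinations.
combos = list(product((0, 1), repeat=len(TRAITS)))

# Group by number of conditions met, sub-grouped by inner awareness.
for n in range(len(TRAITS) + 1):
    print(f"\n{n} condition(s) met")
    for mindful in (1, 0):
        for c in combos:
            if sum(c) == n and c[0] == mindful:
                print(" ".join(TRAITS[i][bit] for i, bit in enumerate(c)))

Running it spits out the same 32 combinations, grouped as in the list that follows (though not necessarily in the same order within each group).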


0 conditions met
Mindless Stupid Inert Blind Dreamlike : The trivial case of being dead. Or a rock.


1 condition met
Mindful
Mindful Stupid Inert Blind Dreamlike : Willy Wonka, living in a world of pure imagination.

Mindless
Mindless Clever Inert Blind Dreamlike : Calculators.
Mindless Stupid Intentful Blind Dreamlike : Angry switches.
Mindless Stupid Inert Sentient Dreamlike
Mindless Stupid Inert Blind Awake 


2 conditions met
Mindful
Mindful Clever Inert Blind Dreamlike : Sleepy scientists.
Mindful Stupid Intentful Blind Dreamlike : Brexit voters.
Mindful Stupid Inert Sentient Dreamlike : Drugged Brexit voters.
Mindful Stupid Inert Blind Awake

Mindless
Mindless Clever Intentful Blind Dreamlike : A decent chatbot.
Mindless Clever Inert Sentient Dreamlike : Useful robots.
Mindless Clever Inert Blind Awake
Mindless Stupid Intentful Sentient Dreamlike : Dangerous robots.
Mindless Stupid Intentful Blind Awake
Mindless Stupid Inert Sentient Awake


3 conditions met
Mindful
Mindful Stupid Inert Sentient Awake : Lazy idiots.
Mindful Stupid Intentful Blind Awake
Mindful Stupid Intentful Sentient Dreamlike : Hallucinations.
Mindful Clever Inert Blind Awake
Mindful Clever Inert Sentient Dreamlike : Beginnings of true AI.
Mindful Clever Intentful Blind Dreamlike : Beginnings of truly dangerous AI.

Mindless
Mindless Stupid Intentful Sentient Awake : A really boring but dangerous robot.
Mindless Clever Inert Sentient Awake
Mindless Clever Intentful Blind Awake : A hallucinating robot.
Mindless Clever Intentful Sentient Dreamlike : A different hallucinating robot.


4 conditions met
Mindful
Mindful Stupid Intentful Sentient Awake : About half the population.
Mindful Clever Inert Sentient Awake : Philosophers.
Mindful Clever Intentful Blind Awake 
Mindful Clever Intentful Sentient Dreamlike : Angry philosophers on drugs.

Mindless
Mindless Clever Intentful Sentient Awake : A potentially superb AI.

5 conditions met
Mindful Clever Intentful Sentient Awake
The trivial case of being a fully conscious being, the most highly evolved entity possible, i.e. a radio astronomer.


This all reduces to 23 options, including the two extreme cases : I've not allowed the eight cases of being awake without being sentient, nor the single case of being sentient and nothing else at all, which as I argued above can't exist. Of course this is no more than a rough draft, and we could debate whether excluding the awake-but-blind cases is the best use of terms; clearly, you can know something is out there without really knowing what it is. And obviously the descriptions aren't meant to be taken seriously !
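
For anyone who wants to check that arithmetic, a short filter on the combos list from the earlier sketch does it - again just an illustration of my reading of the exclusions, nothing more :

# Drop combinations that are awake but not sentient (8 cases), plus the
# single case where sentience is the only condition met, as argued above.
allowed = [c for c in combos
           if not (c[4] == 1 and c[3] == 0)   # awake without sentience
           and c != (0, 0, 0, 1, 0)]          # sentience and nothing else
print(len(allowed))   # 32 - 8 - 1 = 23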

And... I'm not really happy that the characteristics don't have the same level of independence. I'm not thrilled with my definition of "agency" either. As I said, a rough draft.

Still, I think it's maybe at least provocative ? There are attempts to build machines to detect consciousness, which is very important for patients in a coma. According to the above, such patients may be truly conscious but unable to act, or they may still have mental states (which is more important for ethical considerations) but not consciousness proper, just like a dreamer.

More philosophically problematic may be people who have aphantasia (no mental imagery) or no inner monologue. My assumption is that they do have some form of inner awareness, just not in the most common forms we're used to. Or maybe they have them, but at a lower level. Or maybe they don't, and they're the equivalent of biological robots. Which would be a bit scary, but at least it would explain Dominic Cummings.
