About consciousness

Consciousness is a nebulous concept with many definitions. There are as many theories about consciousness as there are definitions, and the philosophical and scientific community is still far from consensus. This post represents my current (limited) understanding, which is evolving along with the whole field of consciousness research. It is written in a declarative style for conciseness but should be understood as a set of hypotheses, and in some cases speculations.

Perhaps the least understood, and to my mind the most interesting, aspect of consciousness is that of subjective mental states 1 2. These are mental states that include subjective experiences that can only be had by the organism itself, such as pain, the color red, the sense of a self, and the taste of strawberries. How such subjective experiences emerge is what the philosopher David Chalmers calls the hard problem of consciousness. (The “easy” problems of consciousness are to explain complex behavior, sensory information processing, logical reasoning, etc.)

A mental state, subjective or not, can be seen as information held by an organism about its environment (including the organism’s own body) 3. Studies show correlations between mental states such as pain and neural firing patterns (brain “states”) [1]. It has also recently become possible to reconstruct images seen by a test person from fMRI scans of the test person’s brain [4], which indicates a mapping between sensory information about the world and higher-level mental states.

The active inference framework, a theory about perception and action (see e.g., this earlier post), is firmly based on the assumption that mental states represent information about the organism’s environment, a model of the world [2]. It claims that the organism uses this information both for interpreting sensory input (observations) and for planning actions. The objective function to be minimized in both cases is free energy.
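For orientation, the quantity minimized in perception can be written out. This is the standard textbook form of variational free energy [2], not a result specific to this post: with hidden states s, observations o, a generative model p(o, s), and approximate posterior beliefs q(s),

$$
F = \mathbb{E}_{q(s)}\big[\ln q(s) - \ln p(o, s)\big] = D_{\mathrm{KL}}\big[q(s)\,\|\,p(s \mid o)\big] - \ln p(o)
$$

Since the KL divergence is non-negative, F is an upper bound on the surprise −ln p(o): minimizing free energy simultaneously improves the beliefs q(s) and keeps observations unsurprising.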

With a well-calibrated model of the world, attractive mental states correspond to attractive real-world states. It would be hard to understand how an organism could navigate an ever-changing world and maintain allostasis without access to, and storage of, useful information about the conditions outside its boundary.

I assume that mental states are represented by neural firing patterns in the nervous system, much like a novel can be represented by Unicode tokens in a computer file. This means that I reject strong emergence, which seems to require some secret sauce in addition to the neural firing patterns, a kind of dark matter of consciousness. Some motivation for this is given below.
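The analogy can be made concrete in two lines of Python (a trivial illustration; the sentence is arbitrary):

```python
novel = "It was a dark and stormy night."
code_points = [ord(ch) for ch in novel]  # the same text, viewed as Unicode numbers
print(code_points[:8])                   # [73, 116, 32, 119, 97, 115, 32, 97]
```

Nothing over and above the numbers is needed to recover the text; the claim is that nothing over and above the firing patterns is needed for the mental state.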

Subjectivity

The subjectivity of mental states is elusive and hard to describe fully. Many people I talk to don’t even realize there is something interesting there. The subjective mental states are such a familiar aspect of our cognition that they go mostly unnoticed much like our heartbeat.

These are a few examples of subjective mental states:

  • The sense of a continuous self, separate from the environment.
  • Qualia such as pleasure, pain, and the taste of strawberries.
  • Emotional states such as happiness, anger, and love.

Subjective mental states can be loosely defined as mental states that “feel like something” and where there is a perceived separation between an observer (the “self” 4) and the observed (everything but the “self”). Different types of subjective states may rely on different mechanisms and have different evolutionary histories. There may thus be not one but dozens of “hard problems of consciousness”.

Regular scientific methods are difficult to apply to subjective experiences since there is no reliable way to observe them in other people, only in ourselves; there is no third-person perspective of a subjective experience, only a first-person perspective. In the first-person case the observer and the observed are the same.

What is an explanation?

What could an explanation of the “hard problem” look like?

We can only hope to explain concepts in terms of other concepts and their relationships. At each point in time we have to accept a certain lowest level of explanation and intuition, accompanied by a theory that helps us predict events and experiments. We refine our intuitions and theories over time. When intuition is hard to find, as for the concept of spin in quantum mechanics, we construct analogies or accept that the concept remains unintuitive as long as our theories are consistent. After a while we may confuse our model of something with the thing itself. We may, for instance, think that a red car is actually red, whereas the redness only exists in our model of the car.

There is probably no ultimate explanation of anything and we don’t want to have infinite recursion so we have to stop somewhere and accept the lowest level concepts as “axioms”.

Just because we currently can’t explain the subjective nature of subjective mental states, we shouldn’t be discouraged from trying to explain other aspects of them and from designing experiments to verify or falsify those explanations. We can include subjective mental states in theories such as the active inference framework without understanding their ultimate nature, just like we have included spin in quantum mechanics and can do very accurate calculations without knowing its ultimate nature.

Perhaps we eventually have to accept that “this is what it sounds like when doves cry”, as the old Prince song goes, meaning that subjective experiences, while most likely an attribute of a certain type of information processing, cannot be explained in terms of anything else. They may belong to a unique ontological category.

Emergence of subjective mental states

There are many theories about the mechanisms that generate subjective mental states. Some of the most popular ones are:

  • Materialism: Subjective mental states are products of information processing. This includes neural activity in the brain, computational processes in artificial systems, or any complex system with the right kind of physical or informational structures. The key mechanism is the organization and interaction of components, whether biological or artificial.
  • Panpsychism: Subjective mental states are intrinsic properties of all matter, similar to mass or energy. In this view, consciousness does not “emerge” in the traditional sense but is already present at a fundamental level in the universe. The complexity of consciousness in systems like the human brain results from the combination or integration of these fundamental conscious elements.
  • Property dualism: This is “panpsychism light”. The idea is that elementary particles don’t only have physical properties like charge, mass, and spin; they also have mental properties.
  • Idealistic monism: In this view, conscious states are the primary and fundamental reality, and the physical world is a manifestation or emergent property of conscious states. The mechanism here inverts the materialist perspective: instead of consciousness emerging from matter, matter emerges from consciousness.
  • Neutral monism: This perspective posits an underlying, neutral reality that is neither exclusively mental nor physical. Both conscious states and physical phenomena emerge from this fundamental substrate. The mechanism suggests a more foundational level of reality from which both mind and matter arise as two aspects of the same thing.

None of the theories offers an intuitive explanation for how conscious states emerge. The non-materialistic theories don’t even try to explain subjective experiences in terms of something else; they assume a priori that they belong to an irreducible ontological category. Materialism, on the other hand, tries to explain subjective experiences in terms of information processing but lacks an explanation of how information processing can produce subjective mental states. I find avoiding the problem as much of a non-solution as failing to produce an explanation.

It is difficult to square the four non-materialistic theories above with the apparent dependence of consciousness on information processing. When we are anaesthetized, we don’t have subjective experiences. Subjective experiences can also be altered by drugs acting on the brain and by mental training. If consciousness is some sort of static property like mass, or some even more fundamental quality of reality like space, then why does it concentrate in brains and why does it require information processing?

I, like most scientists, trust Occam’s Razor, which in this case points at materialism. The other theories not only lack an explanation of the nature of subjective mental states but also add speculative concepts to physics as we know it today. And they don’t explain the apparent dependence of subjective mental states on the brain and on information processing.

To put materialism on an equal footing with the other theories, we could posit that certain algorithms, or certain types of information processing, whether computable or not 5, have some hitherto undetected mental properties. That would not be a more outlandish claim than that electrons have mental properties. As is apparent from the rest of this post, I would claim that it is actually less outlandish. If we believe that mental states emanate from information processing, then they can be realized on different types of hardware, biological and silicon-based 6.

Giving subjective mental states the status of a fundamental aspect of the universe might in fact be a case of anthropocentric chauvinism: we humans are conscious, therefore it must be a fundamentally important concept. It might just be that nature found a clever way to create subjective experiences in neural networks to secure our allostasis, and that’s all there is. Somewhere else in the universe, evolution may have taken another path that doesn’t include subjective mental states (see below).

Consciousness is an evolutionary adaptation

Evolution seldom favors useless features that require resources, even if the tail feathers of a peacock are somewhat challenging to explain. It is quite obvious that evolution has shaped our brains, and more specifically such subsystems as vision. We can follow the evolutionary history of the brain by looking at monkeys and other more primitive species.

I posit that the capability of subjective mental states has evolved due to evolutionary pressure in the same way as the human eye or the poisonous fangs of cobras. For a discussion on how high-level “system requirements” can shape a system through evolution see this post.

Perhaps the most important type of subjective mental state is the compelling awareness of a self as separate from the environment, and the urge to keep it that way. This distinction enables an organism to prioritize its own needs, recognize threats to its well-being, and take appropriate actions to protect and care for itself. For more complex organisms, especially humans, this involves not just physical integrity but also psychological and social dimensions.

The evolution of the self might also be tied to the complexities of social living. Recognizing oneself as an individual distinct from others can be crucial for forming complex social relationships, understanding others’ intentions and emotions, and navigating social hierarchies. The sense of self-other distinction also likely plays a role in the capacity for empathy and caring. Understanding one’s own experiences as separate from others’ can lead to recognizing and responding to others’ needs and emotions, which is vital in cooperative societies.

In addition to the existential usefulness of subjective experiences, they also help us on a more mundane level. Anybody who has been in love and had an intimate experience intuits that subjective mental states are both real and instrumental in the procreation of organisms with the capacity for subjective experiences. It also seems uncontroversial that if one has a bone fracture that hurts, one is inclined to avoid stress that brings about the pain, thereby accelerating healing.

The valence of a mental state measures the attractiveness of the state (or of the corresponding observation). Mental states with positive valence identify preferred states in active inference (see this post) and thus help the organism infer the actions that minimize expected free energy, securing allostasis.
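In the discrete-state formulation of active inference, the expected free energy G of a policy π is commonly decomposed into a pragmatic and an epistemic term [2]; the preference distribution C below is where valence enters (standard notation, not specific to this post):

$$
G(\pi) = \underbrace{-\,\mathbb{E}_{q(o \mid \pi)}\big[\ln p(o \mid C)\big]}_{\text{pragmatic value}} \;\underbrace{-\,\mathbb{E}_{q(o \mid \pi)}\Big[D_{\mathrm{KL}}\big[q(s \mid o, \pi)\,\|\,q(s \mid \pi)\big]\Big]}_{\text{epistemic value}}
$$

High-valence outcomes are encoded as high-probability outcomes under p(o | C), so policies expected to realize them have low G and are favored.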

Subjective experiences as encodings of information

The brain seems to integrate several kinds of information in its mental states. A red 1965 Corvette emerges in our brain as one integrated mental state, not as three different mental states. If we are shopping for a collector car, the red 1965 Corvette mental state is also associated with a valence (“desirable”) that may guide our actions.

Likewise, we see a face as one object. If it is the face of a loved one, it also has a strong valence associated with it. Shape, color, valence, the name of the person, and perhaps other associations are integrated into a single mental state.

A third example of an integrated mental state is toothache. We can usually say how severe the ache is and point out which tooth is causing the suffering. This means that the toothache mental state also integrates location information.

The mental states above are examples of subjective mental states since they are associated with subjective experiences. They are also associated with more or less attractiveness (valence) that may guide our actions such as buying a car or calling the dentist.

The subjective feeling and any associated qualia can be seen as attributes of the mental state. An analogy could be the force-feedback mechanism of the control stick in a fighter plane. More force is, for instance, required to pull higher Gs. The force feedback makes some information about the state of the aircraft intuitively accessible to the pilot, who can make decisions based on it in addition to the instruments.

I posit that the subjective attributes of mental states are efficient encodings of information using a unique encoding modality intrinsic to the brain and only accessible by the same brain. The subjective attributes may also function as an “index” for storing and retrieving certain mental states.
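As a toy illustration of the “attributes plus index” idea, here is a minimal Python sketch. All names and structures are hypothetical, chosen only to make the hypothesis concrete:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MentalState:
    """Toy integrated mental state: content plus subjective attributes."""
    content: tuple   # integrated features, e.g. ("red", "1965", "Corvette")
    valence: float   # attractiveness of the state; negative = aversive
    quale: str       # stand-in for the subjective "feel" of the state

# A memory indexed by the subjective attribute, so that states can be
# stored and retrieved by how they felt, not only by what they contained.
memory: dict[str, list[MentalState]] = {}

def store(state: MentalState) -> None:
    memory.setdefault(state.quale, []).append(state)

def recall(quale: str) -> list[MentalState]:
    return memory.get(quale, [])

store(MentalState(("red", "1965", "Corvette"), valence=0.9, quale="desire"))
store(MentalState(("toothache", "upper left molar"), valence=-0.8, quale="pain"))
print(recall("pain"))  # retrieval keyed on the subjective "index"
```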

Philosophical zombies as alternative solutions

Nature doesn’t always come up with the optimal solution to a problem from an engineering point of view. One example is the eye, where the wiring is on the “wrong” side of the retina. It is therefore possible that subjective mental states are not the only solution, or maybe not even the best solution, to the particular set of problems that they solve. They may only be a solution that evolution stumbled upon and that was in some sense easy for evolution to “design”. There could be alternative solutions that are functionally equivalent to subjective mental states but lack subjective experiences.

An observation consistent with such a possibility is that we are constructing ever-cleverer artificial minds performing complex tasks such as walking, navigating city traffic, and having rather intelligent dialogs with humans. Since these systems haven’t been subject to evolutionary pressure to survive, they have most likely not developed subjective experiences. Also, trivially, artificial systems can sense color, smell, sound etc. without translating the sensory inputs to qualia.

A well-known thought experiment in consciousness research is that of a philosophical zombie. A philosophical zombie is defined as a human being minus subjective mental states. The question is: could such a philosophical zombie behave identically to a human being?

We do know that humans can from time to time enter a kind of “zombie mode”. Such states are called flow states. In these states the actor is totally immersed in the task at hand, be it writing software, painting, or negotiating powder snow on a snowboard. This indicates that at least the subjective self isn’t always needed or even useful.

If we accept that subjective mental states are a solution to a problem, and that there might be other solutions that are functionally equivalent to consciousness, the answer has to be yes.

Subjective mental states arise from information processing

There are several arguments for subjective mental states being the products of information processing in the brain 7.

  • When anaesthetized, a person loses the capacity for subjective experiences and cannot, for instance, feel any pain. The person also loses the most basic subjective experience of all, the one representing their own separate existence. Subjective experiences don’t emerge without information processing.
  • Subjective experiences can be altered by drugs acting on the brain and by mental training.
  • Studies show correlations between neural activity and corresponding conscious states [1]. One subjective mental state consistently gives rise to the same (or similar) neural firing pattern (in the same person).
  • Advances in AI and neural networks, while not conscious, demonstrate how complex information processing can give rise to sophisticated, seemingly intelligent behavior. There are even step-wise increases in capabilities when the size of the neural network is increased [3]. This analogy supports the idea that subjective mental states could be the result of complex information processing. At least it means that I don’t want to chase unknown substrates or mechanisms before we have thoroughly investigated the theoretical limits of information processing.

Subjective mental states do have causal power

Some theories of consciousness, in particular emergentism and epiphenomenalism, define mental states as their own ontological category that has a relationship to neural firing patterns but is not identical to them. The difficulty with this position is that it remains undefined where exactly we would find these mental states and what the nature of their relationship with neural firing patterns would be. I believe that any explanation where mental states are (new) categories instead of terms leads to dualism. In these theories it is also difficult to explain how subjective mental states could have causal power, i.e., influence the behavior of the organism. (Read more here.)

In reductive physicalism the question of causal power doesn’t arise at all, since mental states are information processing in the brain, and information processing in the brain obviously has causal power. Also, if subjective mental states didn’t have causal power, they would most likely not have emerged as a result of evolution (see this post).

According to the active inference framework the organism navigates through life minimizing its life-cycle surprise [2]. Surprise is related to how far the current mental state is from the expected or preferred (and therefore in most cases adaptive) mental state.

According to the active inference framework, all actions decided by the brain are based on the current mental state and on the expected free energy (a proxy for future surprise) of future actions, and therefore of future expected mental states [2]. The expected free energy is calculated based on how attractive the expected future states are. This attractiveness is at least partly indicated by the valence of a subjective aspect of the mental state, such as pleasure or pain. Simply put, qualia that “feel good” indicate attractive mental states and are therefore interpreted as preferred states in action inference.
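A minimal sketch of this action-selection step, under the simplifying assumptions that outcomes are discrete and that the epistemic term is ignored (a toy example in the spirit of [2], not an implementation from the book; the preference vector plays the role of valence):

```python
import numpy as np

# p(o | policy): each row is the predicted outcome distribution of one policy.
predicted_outcomes = np.array([
    [0.7, 0.2, 0.1],  # policy 0: mostly leads to outcome 0
    [0.1, 0.1, 0.8],  # policy 1: mostly leads to outcome 2
])

# ln p(o | C): log-preferences over outcomes; outcome 2 "feels good".
log_preferences = np.log(np.array([0.1, 0.1, 0.8]))

# Pragmatic part of expected free energy: G = -E_{p(o|pi)}[ln p(o|C)].
G = -(predicted_outcomes @ log_preferences)

# A softmax over -G turns expected free energies into action probabilities.
policy_probs = np.exp(-G) / np.exp(-G).sum()
print(G, policy_probs)  # policy 1 has the lower G and is favored
```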

Active inference is the mechanism through which subjective mental states manifest their causal power.

Links

[1] Tor D. Wager et al. An fMRI-Based Neurologic Signature of Physical Pain. N Engl J Med 2013; 368:1388–1397.
[2] Thomas Parr, Giovanni Pezzulo, Karl J. Friston. Active Inference: The Free Energy Principle in Mind, Brain, and Behavior. MIT Press, 2022.
[3] GPT-4 System Card. OpenAI. March 23, 2023.
[4] Yu Takagi, Shinji Nishimoto. High-resolution image reconstruction with latent diffusion models from human brain activity. CVPR 2023.
[5] Susan Blackmore. Consciousness: A Very Short Introduction.

Footnotes

  1. These are some other suggested components of consciousness:
    Intentionality: This refers to the ‘aboutness’ of mental states – the quality of thoughts being about something external or internal. It’s a key feature of consciousness, as it involves the ability to hold representations of the world and direct attention.
    Self-awareness: This is the recognition of the self as distinct from the environment and other entities. It involves a degree of self-reflection and awareness of one’s own mental states.
    Unity of experience: Conscious experience is often unified; we experience a seamless integration of sensory inputs and thoughts, creating a coherent sense of reality.
    Temporal consciousness: This aspect involves the perception of time and the continuity of experience. It’s how we link past, present, and future events in a cohesive narrative.
    Agency and control: The feeling of being an agent with the ability to control and initiate actions is a significant aspect of consciousness. It’s closely linked with the concept of free will.
    Access consciousness: Proposed by philosopher Ned Block, this refers to the brain’s capacity to make information available for verbal report, reasoning, and the control of behavior. It contrasts with phenomenal consciousness (akin to qualia), which is more about experiential aspects.
    Attention: Consciousness often involves the ability to focus attention selectively on different aspects of the environment or one’s thought processes.
    Language and symbolic thought: The role of language in shaping and expressing conscious thought is a significant area of study. Some theories suggest that language is a necessary component for many aspects of conscious thought. ↩︎
  2. The term “state” is somewhat misleading as the brain doesn’t process information like a computer, going from one static state to the next. A brain state is instead a dynamic pattern of firing neurons, an “algorithm of algorithms”. ↩︎
  3. This theory is called representationalism. The dual of representationalism is intentionality which refers to the ‘aboutness’ or ‘directedness’ of mental states — the capacity of thoughts, beliefs, desires, and other mental states to be about something. It means that mental states are directed towards objects, concepts, or states of affairs in the world. ↩︎
  4. The self must somehow be represented by a neural firing pattern too. ↩︎
  5. There is an ongoing debate about whether quantum phenomena are inherently random or whether they are predictable given some hitherto unknown mathematical model. Some, e.g., Roger Penrose, submit that if such a mathematical model exists, it cannot be computable in the Turing sense. No non-computable mathematics exists today. ↩︎
  6. This hypothesis is called functionalism and is based on the assumption that subjectivity obeys the currently known laws of physics. ↩︎
  7. All causal chains of events are information processing. ↩︎
