Ten Empirically Testable Properties of Consciousness

Christopher W. Tyler, Ph.D., D.Sc.

[email protected]

City University of London

Northampton Square

London EC1, U.K.

Abstract

Many thinkers have provided analyses of the properties of consciousness, though not generally in the context of framing the putative properties in terms of their specific testability. To avoid the criticisms of untestability leveled against the 19th century phenomenologists, the goal of the present overview is to identify the specific properties of consciousness that provide the foothold of empirical testability in physiological terms. Of course, the properties of consciousness are, of their nature, subjective, but they may gain a measure of objectivity by consensus agreement on their validity. After proposing a working definition of consciousness and related terms, a representative range of proposals as to its properties is reviewed and elaborated into a larger set of testable properties of consciousness. Potential procedures for the specific implementation of relevant tests are then delineated.

Keywords

Consciousness; brain; mind; neural substrate; hard problem; testability; working memory

Quote from Sadia Sadia:

Aesthetic experience can convert the experience of alienation into an experience of transcendent epiphany.

Robert Fludd (1619), 'The triple essence of the visual process', from De Triplici Animae in Corpore Visione. (https://commons.wikimedia.org/wiki/File:RobertFuddBewusstsein17Jh.png. This file has been identified as being free of known restrictions under copyright law, including all related and neighboring rights.)

1. Consciousness

The first goal is to attempt a working definition of what is meant by the term "consciousness". This term is intended to refer to the domain of human thought, or mental operations incorporating knowledge and experience. As such, the terminology used to specify this overall domain can be distinguished into various subdivisions. In this framework, perhaps first articulated by Leibniz, consciousness (or 'apperception') is conceived as the domain of operation of thinking, termed 'working memory', to be distinguished from simple awareness (or 'perception'), which is 'the internal state of … representing external things' (Leibniz, 1714, p. 208) (now known as qualia) without reaching the level of operational intentionality implied in consciousness. Thus, awareness would be the most basic form of consciousness that humans experience. In Gibsonian terms, elaborated consciousness is a state of basic awareness in combination with 'affordances' of what will or may happen under an available set of operations, while simple awareness is the primary registration of data prior to the assignation of the affordances.

This distinction raises the possibility of further distinctions, so to clarify the domain of enquiry we may set out their basic relationships in a hierarchical table (Table I), in which basic awareness is symbolized by A* and elaborated consciousness by C* (where the * represents the quality of being an experiential process, as opposed to a non-experiential physical process). The properties ascribed to each of the four states are listed in Table I; the first, shared by all four, is that of being experiential (as opposed to operating below the level of awareness).

Table I. Forms of Consciousness

                   A*    C*    Self-A*    Self-C*
Experiential       ✓     ✓     ✓          ✓
Representational   ✓     ✓
Operational              ✓                ✓
Reflexive                      ✓          ✓

Each of these terms has been used in multiple different ways in the literature, including as synonyms for each other, but it seems illogical to compress them in such a manner when there is this set of distinctions that need to be made. Thus, the representational property (or what is confusingly termed “intentional” in the philosophical literature) is the sense that the experienced qualia derive from an external object. There are schools of philosophy that consider all qualia to have this property (even in dreams), but one simply has to close one’s eyes to realize that this is not the case. Under the darkness of the lids, we experience all kinds of flowing shapes and colors that are clearly internal to our visual processes rather than deriving from external objects.

The third property is that of operationality, the will (or 'conation'), here treated as attention - the dynamic, operational, self-guiding nature of consciousness captured by the verb 'think' in Descartes' famous aphorism. In this way, thinking is a complex operational process involving the manipulation of conscious entities and the relationships among them. Although much of this thinking may be accompanied by the qualia of visual and auditory imagery, the qualia do not constitute the logic of the stream of thought per se. They may do so in certain states such as reverie, but the processes of asking questions and solving problems that occupy much of operational thought have a logical, inferential flow that constitutes an abstract aspect of consciousness, distinct from the qualia-specific characterization of basic awareness.

The fourth level in Table I is that of reflexivity, which is whether the state incorporates a representation of the self in its experience or operations, as further delineated in the following. This table is by no means exhaustive of all the consciousness terms that have been mooted, but is intended to define the usage of the most common terms for the purpose of the present paper. On this basis, operational consciousness per se is to be distinguished from self-consciousness, which involves not just a functioning process, but one in which the self is incorporated as a component of the process. The corresponding term for non-intentional awareness is self-awareness, which may thus be distinguished as the awareness of the self as a component of the array of qualia, without operational manipulations being involved. An example would be the experience of oneself as a runner when long-distance running (as opposed to the experience of the environment passing as one progresses forward, which would be non-self-awareness). Thus, according to this scheme both awareness and consciousness are experiential, involving qualia of some kind, but awareness is the basic state while consciousness is the intentional process of operating on mental entities.

If the qualia involve some further mental representation of the self that is performing the operations, they are the 'self-' variety of the respective experiences. In this sense, self-awareness and self-consciousness are reflexive (or reflective), involving a double representation of both the object of the simple awareness or operational consciousness and an awareness that the process is taking place. The distinction between the two is that self-awareness is defined here as the process of having a perceptual viewpoint or perspective of the self that is the receiver of the awareness, and of the potential effects of changing this viewpoint. Thus, if I am looking at a red ball, I am aware that if I look away, or attend elsewhere, I will become aware of something else, and that 'I' am the agent of that decision. These are simple operational properties of the essential fact that the awareness does not simply exist, but is a component of an operational receiver or operator that has agency with respect to that awareness.

The simple self-representational aspect of self-awareness may be contrasted with the complex self involved in self-consciousness, which is a far more elaborated process. Here the self is a structured entity constituted of the memories, emotions, aspirations and conceptualizations of the individual involved. Thus, just as consciousness involves the manipulative operations of working memory and the cognitive interpretation of the meaning of each configuration of sensations to the individual, self-consciousness brings to bear these interpretative aspects in relation to the structured array of information that constitutes the historical self. Not just "what does it mean?", but "what does it mean to me?". This flickering red quale means fire in general, but to me it means I am about to lose my house and the life's work stored in it.

This second-level ‘self’ representation itself seems to be of the simple variety, as it is difficult to conceive of ‘working memory’ types of operations taking place within either the awareness or consciousness levels of mental experiences as the objects of such operations. The ‘I’ that is the agent of the attentional focus experiences itself as a simple spotlight, with the complexity in what it focuses on, not in itself. Thus the reflexivity of either awareness or consciousness is considered to be at the simple level of secondary awareness.

2. The Neural Substrate of Consciousness

The preceding analysis does not address the origin of the experiential quality of consciousness, which has been dubbed the "Hard Problem" of conscious experience (Chalmers, 1997). In this sense consciousness may be considered as a process that is emergent from brain activity. As is made clear by Tyler (2015), however, emergence is a property of many levels of physical reality, whereas consciousness is unique in being the form that we, as humans, use as our mode of thinking (i.e., the process that generates the form of behavior that produces the present document, for example). What is it that makes this particular emergent process unique? It is not simply the complexity of its structure. The broiling infernos of the interactive processes that constitute the activity of the sun, for example, are enormously complex and highly structured, but we do not consider them to be conscious. What makes consciousness unique is that it is the only process that we know from the internal perspective of what it is like to be that process (Tyler, 2015).

It is this internal perspective that entails Chalmers’ hard dualism, since we cannot take the internal perspective of anything but our own brain processes. In principle, one could imagine that all neural activity gave rise to consciousness, so that it was not a special emergent property of some subset of neural activity but a general property of neural activity per se. If so, we would have understood the neural basis for consciousness. Nevertheless, consciousness would still be private in the sense that it was the experience arising from the process of neural activity rather than any other process (such as the flow of the ocean currents or the workings of a computer). We can never know whether such non-neural processes might involve some form of consciousness because we cannot know what it is like to be any other process than the process of our brain. Thus it is the privacy or exclusivity of our brain process to our own personal subjectivity that constitutes the real hard problem.

Taken to its logical conclusion, the implication of the hard dualism of the internal perspective of our consciousness is that mind processes are essentially experiential (subjective observables) while brain processes are essentially physiological (objective observables), even though both kinds of observables may derive from the same underlying neural processes. The subjective observables are essentially private, while the nature of the objective observables is that they may equivalently be experienced by indefinitely many other observers, and are in principle public. This dualistic divide between the subjective and objective views of brain processes is fundamental, even though we spend much of our time inferring subjective states in others, and even though the objective observables of others' brains are ultimately filtered through the individual subjectivity of each external observer.

Within the concept of the hard dualism of the internal perspective, there is nevertheless an assumption of the psychophysiological parallelism under which present-day neuroscience operates. All human neuroscience studies that perform analyses linking brain function to observer report or perceptual response are assuming psychophysiological parallelism, and few in the field doubt its validity. Moreover, there have been extensive recent efforts to identify what is known as the neural correlate of consciousness (‘NCC’) in the functioning of the brain at various levels (Koch, 2004; see https://en.wikipedia.org/wiki/Neural_correlates_of_consciousness). This term has its limitations, in that a) it is usually taken to imply that consciousness is a state rather than a process, and b) even if interpreted as a process, it seems to embody an implicit dualism: the parallel existence of a state called ‘consciousness’ and an array of neural functions that might correlate with it, as though they are concurrent in the way that the traffic runs parallel to the structure of the road.

The present position, on the other hand, is that there is a neural substrate of conscious processing (NSCP), in the sense that the neural and the subjective aspects of the conscious process are 'internal' and 'external' aspects of the same unitary process, rather than separate but correlated parallel processes. The prerequisite for the establishment of this unity is a complex of correspondences between the subjective mental properties of consciousness and the objective properties of the emergent neural process sufficient to establish the NSCP.

3. Proposals for the Defining Properties of Consciousness

Many thinkers have provided analyses of the properties of consciousness, though not generally in the context of framing the putative properties in terms of their specific testability. To avoid the criticisms of untestability leveled against the 19th century phenomenologists, the goal of the present overview is to identify the specific properties of consciousness that provide the foothold of empirical testability in physiological terms. Of course, the properties of consciousness are, of their nature, subjective, but they may gain a measure of objectivity by consensus agreement on their validity. Once an appropriate set of properties is identified, such consensus will be sought among the community of interested scientists. The first step in this process is to survey more recent efforts to characterize the properties of consciousness, to see which may help to develop a consensus view on ones that may be testable. These properties will inevitably not be exhaustive, but it is hoped that they will give a good sampling of the range and diversity of views on the issue. They will lead to a final section of the paper that develops a new extended list of testable properties of consciousness and outlines the kinds of tests that may be envisaged for them.

Wikipedia

In the present era, one of the most definitive sources of at least popular consensus is Wikipedia, so we may begin the review with its definition of consciousness: "It has been defined variously in terms of sentience, awareness, subjectivity, the ability to experience or to feel, wakefulness, having a sense of selfhood or soul, the fact that there is something 'that it is like' to 'have' or 'be' it, and the executive control system of the mind, or the state or quality of awareness, or, of being aware of an external object or something within oneself." (https://en.wikipedia.org/wiki/Consciousness)

In general, definitions have value only to the extent that they are non-tautological, more than just a simple rephrasing in terms of synonyms. Applying this criterion to this list of consensual items provided in the Wikipedia article, we find the following:

1. Sentience. Just an appealing synonym for ‘consciousness’.

2. Awareness. A quasi-synonym, but conceptualizable as a rudimentary subclass of consciousness as a whole (as per Section 1), the aspect that corresponds to the internal state of having 'raw feels' or sensations per se.

3. Subjectivity. A redefinition that emphasizes the internality of conscious experience and its fallibility in representation of the external world, rather than the ‘raw feel’ aspect.

4. The ability to experience or to feel. A definition in terms of an abstracted behavioral property (the ‘ability’), but one that embeds the experiential quality of consciousness, implying that it should be rephrased as ‘the process of experiencing or feeling’, although that is at root just another synonym.

5. Wakefulness. A definition in terms of an observable brain state, but one that tends to imply an objective definition of the state to external observers, while still being compatible with it involving subjective sentience. In so doing, it tends to elide the 'hard problem' of the relationship between the two aspects.

6. Having a sense of selfhood or soul. A definition at the elaborated level of 'self-consciousness' as defined in Section 1, which makes clear that this is only one aspect of consciousness, the elaborated self-consciousness that involves the memories, emotions, aspirations and conceptualizations of the individual involved.

7. The fact that there is something "that it is like" to "have" or "be" it. This form of conceptualization is focused on the experiential aspect of individual awareness, stemming from Nagel's (1974) essay "What is it like to be a bat?". It is not a definition per se but an effort to characterize one of its properties. This effort, however, is somewhat incoherent in that it does not attempt to specify what it is 'like' to be a bat, or to have any particular kind of consciousness, only to make the point that it is different from what it is like to be a human (or to be Thomas Nagel, in particular).

8. The executive control system of the mind. This definition is an operational one avoiding mention of any experiential aspects of awareness, but focused on the operational aspects often captured by the term ‘working memory’, considered in Section 6.

9. The state or quality of awareness. Conversely to the previous case, this definition focuses on the experiential aspect of basic awareness, treating it as a static entity as implied by the terms ‘state’ and ‘quality’ (as opposed to dynamic concepts like an ‘activity’ or ‘process’, which seem more appropriate forms to bridge the explanatory gap to brain processes). In specifying the experiential aspect of consciousness (or awareness), it is essentially just another synonym.

10. Being aware of an external object or something within oneself. This definition highlights the representational aspect of awareness, or what is termed in philosophy ‘intentionality’ that differentiates awareness from a state like a bodily posture. One cannot simply be ‘aware’ in the way that one’s body can be ‘standing’: one must be ‘aware of’ something, in a subject/predicate relationship.

In summary, almost all of the 10 attempts at definitions of consciousness compiled by Wikipedia fall short in significant respects. They are either unilluminating synonyms, functional specifications in terms of working memory operations, or expressions of its representational function with respect to aspects of the external world. These functional specifications would apply equally to computer programs designed to perform working memory operations or provide representational functions of the world outside the computer. As such, they are generally agreed not to exhibit consciousness at, say, the 20th century level of the capabilities of such computational functions (although some thinkers expect sufficient complexity to be achieved by computational operations in the 21st century to bridge that gap).

Nicholas Humphrey

A basic set of the properties of consciousness was proposed by Humphrey (1992), with the explanations below quoted verbatim from him:

1. To be conscious is essentially to have sensations. That is, to have affect-laden mental representations of something happening here and now.

2. The subject of consciousness, 'I', is an embodied self. In the absence of bodily sensations, 'I' would not exist. Sentio ergo sum – I feel, therefore I am.

3. All sensations are implicitly located at the spatial boundary between me and not-me, and at the temporal boundary between past and future: that is, in the 'present'.

4. For human beings, most sensations occur in the province of one of the five senses (sight, sound, touch, smell, taste). Hence most human states of consciousness have one or other of these qualities. There are no non-sensory, amodal conscious states.

5. Mental activities other than those involving direct sensations enter consciousness only insofar as they are accompanied by 'reminders' of sensation, such as happens with mental imagery and dreams.

6. This is not less true of conscious thoughts, ideas, beliefs … Conscious thoughts are typically 'heard' as voices in the head – and without this sensory component they would drop away.

7. If and when we claim that another living organism is conscious, we are implying that it too is the subject of sensations (although not necessarily of the kind we are familiar with).

8. If we were to claim that a non-living organism were conscious, the same would have to apply. A mechanical robot for example would not be conscious unless it were specifically designed to have sensation as well as perception (whatever that design involves).

This is a boldly divergent set of properties, which do not follow the classical Cartesian mental tradition but set off on an idiosyncratic path that treats the mind as determinedly corporeal in its contents. In terms of Table I, Humphrey largely reduces the complexity of consciousness to what here is termed 'simple awareness', i.e., the experience of qualia. Moreover, he goes beyond the core qualia of sensory experience of the world to claim that ALL other aspects of consciousness are basic echoes of sensory experience. He even extends this claim to the recipient of sensations, the 'I', asserting that the 'I' would not exist in the absence of bodily sensations.

In this respect Humphrey seems unduly restrictive. It is true that many non-sensory activities, such as the composing or the understanding of this sentence, have sensory accompaniments, such as the sight of the letters on the page and the sound that the words would have if read aloud. However, the consciousness of this sentence has many more aspects than these basic sensory aspects. Beyond its sight and implied sound, it has no taste, smell or feel, but it has layers of abstract meaning that have no sensory component (at least in this writer's mind). Each word has a meaning, and the whole concatenation of them has a primary meaning that develops as the sentence unfolds. Beyond that primary meaning, there is also the concept of the writer composing the sentence in his (my) mind, and of the reader extracting that meaning in his or her mind by the process of reading the sentence. While it is true that expressing these aspects may call to mind the picture of the writer or reader engaging in these activities, those pictures are not themselves the meaning of the composing or extracting activities – those activities are abstract thought processes that have no sensory corollary.

Thus Humphrey's characterization of consciousness as entirely embedded in sensation completely misses the large-scale aspects of conceptualization, thought, understanding, deducing, inferring, generating questions, and the whole domain of mentation above and beyond the scope of the basic processes of sensing, experiencing and feeling that he considers the domain of consciousness. Consideration of his list serves to draw a distinction between the domain of qualia, which is his primary (or sole) concern, and the larger domain of consciousness as a whole.

A key aspect missing from Humphrey's account is the operational aspect of consciousness – the will (or 'conation'), here treated as attention – the dynamic, operational, self-guiding nature of consciousness. The processes of asking questions and solving the problems that occupy much of operational thought have a logical, inferential flow that constitutes an abstract aspect of consciousness falling outside the qualia-specific characterization of his account.

Thomas Metzinger

One interesting set of properties (or “constraints”) of consciousness was compiled by Metzinger (2000), as follows:

1. Globality: Individual phenomenal contents are always bound into a global situational context, consisting of a conscious world model

2. Presentationality: The temporal immediacy of experience as such, that is embedded into a uni-directional flow. Especially the experience of ‘nowness’ such that every moment includes an immanent immediate past and presages a future

3. Transparency: The unavailability to attention of preconscious processing stages

4. Convolved holism: The structural feature of phenomenal states whereby we experience objects at once as being wholes and not merely sets of features, but at the same time as often being wholes constructed of parts

5. Dynamicity: Our conscious life emerges from integrated psychological moments, which are themselves integrated into the flow of subjective time

6. Perspectivalness: The reference of conscious contents to a subjective first-person perspective. In general, conscious mental life possesses a focus and comes from a point of view

Although a worthy attempt, to this author several of these “constraints” seem mutually redundant, and unhelpful as criteria for identifying the psychophysiological basis of consciousness. In general, it is hard to see how they could be empirically testable because most of them are experiential rather than structural properties. Constraint 3 is not a property of consciousness per se but of unconscious mental processing. Constraints 1 and 4 seem largely redundant in the property of globality, while Constraints 2 and 5 both express the experiential integration of psychological moments into the flow of time. Constraint 4 seems self-contradictory, as it is difficult to understand how objects could be perceived simultaneously as both wholes and collections of parts. Sequentially, perhaps. Thus, Metzinger’s Constraints do not provide a firm basis on which to proceed with the identification of the NSCP.

Giulio Tononi

One of the most influential recent approaches to the properties of consciousness (C*) is a set of “axioms” or “self-evident truths about consciousness” proposed by Tononi (2008) and Oizumi, Albantakis & Tononi (2014) in the context of their Integrated Information Theory, which are endorsed by Koch (2012).

1) C* exists

2) C* is compositional (structured): each experience consists of multiple aspects in various combinations

3) C* is informative: each experience differs in its particular way from other possible experiences

4) C* is integrated: each experience is irreducible to non-interdependent components

5) C* is exclusive: each experience excludes all others.

Although these "axioms" represent the established intent of their author(s), to this author they again seem mutually contradictory, confusing and unhelpful as criteria for identifying the psychophysiological basis of consciousness. How does the exclusive property of Axiom 5 differ from the differentiated property of Axiom 3? And if each experience excludes all others (as in Axiom 5), how can it consist of the multiple aspects of Axiom 2? Are not the aspects themselves "experiences"? How do the "aspects" differ from the "components" of Axiom 4? Indeed, the double negative of Axiom 4 renders it largely uninterpretable. If the components to which it is reducible are interdependent, they seem contradictory to the recombinant aspects of Axiom 2. And the exclusive, integrated aspect/component structure of the experiences as specified in these axioms provides little in the way of properties that meet the criteria for physiological testability. They are more in the nature of a phenomenological analysis of the contents of consciousness than of properties that are testable for mind/brain correlation assessments.

Todd Feinberg

A more recent analysis of consciousness (Feinberg & Mallatt, 2016) identifies 6 defining features of complex biological organisms that are also features of consciousness, but are not exclusive to consciousness per se, and therefore serve as exclusion criteria, identifying properties that are necessary but not sufficient for consciousness. They follow these first 6 properties with a list of 8 features of neural organization that are found in organisms that they consider to exhibit consciousness.

First level: General biological features that apply to all living things

1.1. Life, embodiment, and process
1.2. System and self-organization
1.3. Hierarchy, emergence, and constraint
1.4. Teleonomy (goal-directedness) and adaptation

Second level: Reflexes that apply to all animals with nervous systems

2.1. Rates and connectivity
2.2. Basic motor programs

Third level: Special neurobiological features that apply to animals with sensory consciousness

3.1. Elaborated sensory organs
3.2. Complex neural hierarchies
3.3. Neural hierarchies that create unique neural-neural interactions
3.4. Multisensory convergence
3.5. Neural hierarchies that create isomorphic representations and mental images, and affective states
3.6. Unique combination of nested and non-nested hierarchical functions
3.7. Attention
3.8. Memory

Feinberg & Mallatt (2016) are explicit that this third list of 8 properties constitutes the set of objective neurobiological organizational properties of organisms with consciousness, not the defining properties of consciousness per se. These they address in a subsequent figure with the introduction: The [neurobiological organization of a] nested hierarchy of the general and special objective features explains the subjective features of consciousness (namely, the four neuro-ontologically subjective features of consciousness: Referral, Unity, Causation, and Qualia; Feinberg, 2012):

1. Referral. Conscious experiences are about (referred to) the outer world, the body, or affective states, but are not referred to the neurons that produce the experiences.

2. Mental unity. Consciousness is unified and bound into a relatively unified field of awareness, in contrast to the divisible set of individual neurons that create it.

3. Mental causation. How the subjective mind can have causal influence on behavioral actions, the material body, and the outside world.

4. Qualia. Qualities, the subjectively experienced attributes such as colors, pains, sounds, etc.

A commentary on Feinberg's basic list of four attributes of consciousness, which are specifically referenced as overlapping extensively with previous lists of attributes, can be made as follows:

1. Referral. What is termed 'referral' here has a long history in the philosophical term 'intentionality' and, as such, is a long-established property of consciousness. Nevertheless, it is overstated here, as entoptic phenomena experienced with closed eyes, auditory tinnitus, headaches, and so on, are experienced as internal states rather than being referred to external objects or bodily parts. Aches and pains are a particularly interesting example as, although they may be projected to a particular part of the body, they are not referred to that part in the sense of being an attribute of that part. "I feel a pain in my toe" is not like saying "my toe is swollen", with the toe as the referent object.

Pains have the particular property of making a referent cross-linkage between the feeling 'I' and the locus of the pain. Whereas the referent for the stimuli from a typical object is the object out in the world, with the 'I' as the purely mute observer of its existence there, the referent for a pain is very much a direct connection with the pain as hurting 'me', even though it has a distinct location in my toe or my head. The nature of the consciousness of pain is thus not the typical externalized referent (even with respect to body parts) but consciousness of the participation of the observing 'I' as an intrinsic component of the referent complex.

When it comes to affective states, the issue of referral seems even less clear. How can an affective state, such as anger or pleasure, be the referent of conscious experience? It seems much more plausible to think of the emotions as being projected onto an external referent, such as the stone on which the toe was stubbed, or the beautiful painting that evoked the pleasurable response. The affective component is very much the emotional 'coloring' of the external referent, the toe or the painting, rather than a referent in itself.

2. Mental unity. This is one property of consciousness that is common to almost all specifications of its properties. It is the focus of attention, the attentional 'I/eye' focus that is typical of conscious states. There are, however, states of partial attention that seem to stretch the unity concept. One such is the psychophysical task where one is required to fixate the eye steadily at one point while making judgements about other regions of the visual field. Both aspects of the task require local attention, so the attention must somehow be split in order to accomplish the complementary aspects of the task. The issue then is whether this is a split into non-unitary partial regions of attention or whether it is perhaps accomplished by rapid alternation among the requisite sites. Feinberg may have had something like this in mind by including the word 'relatively' in the specification, allowing for some degree of splitting of what is normally a strongly unified percept or experience.

Another form of non-focal attention is the mental zoom by which we can zoom out to survey a larger spread of, or all of, the visual field simultaneously, or zoom in to a very small target such as a bus number. In the zoomed-out state it seems as though we can apprehend the general layout of the entire contents of the visual field, although there is still a sense of focal attention to aspects within it.

3. Mental causation. As specified by Feinberg, mental causation is "How the subjective mind can have causal influence on behavioral actions, the material body, and the outside world." It is certainly true that neural activity in the brain can cause muscular behavior, and thence bodily movement and consequent changes in the outside world. There is, however, something questionable in attributing this causal sequence to the 'subjective mind', which is an experiential process at a slight philosophical remove from the neural activity. There is some initiating process that generates the motivation for a given behavioral outcome, but it is not entirely clear that this is a subjective experience. Often we spontaneously decide on a course of action without an obvious reason for doing so. We are conscious of the decision and its implementation, but not necessarily of the source of the decision. Many such processes seem to be delivered to our consciousness by an unseen hand.

Even the process of generating thoughts has an unconscious aspect. We have all had the experience of starting off a sentence and finding that we don’t have the word that we needed to express the intended thought. We hesitate, reaching for the word on the ‘tip of the tongue’. With luck, it comes without too much delay, a minor glitch in our production of the sentence, or we may have to apologize and try for a substitute. The word was there, and often presents itself at a later time, so it is not a question of the thought being unformed, just of having timely access to the intended components. This experience tells us that our subjective awareness is skimming the surface of a deeper causal process that was generating the sentence, making it unclear whether the subjective consciousness was the causal motivator or whether it is attributable to a deeper unconscious or pre-conscious process that is closer to the objective neural activities of the brain.

4. Qualia. As the subjectively experienced attributes such as colors, pains, sounds, etc., qualia have long been recognized as a defining feature of consciousness. Indeed, many, such as Bertrand Russell and Christof Koch, seem to treat them as the defining feature. In this respect the analysis of consciousness has drifted far from its early conceptual identification with thought by Descartes, who defined the conscious thoughts of his aphorism "I think, therefore I am" as "everything that is within us in such a way that we are immediately aware of it. Thus all the operations of the will, the intellect, the imagination and the senses are thoughts." (Descartes, 1641). It is difficult to define the experienced attributes of the operations of the will and the intellect, other than to say that we are aware that they are the current processes of our conscious mind (although most people are also aware that there are processes going on that we are NOT aware of, that nevertheless deliver their results to our consciousness, implying that there are also mental processes of our unconscious mind). Descartes' definition does not, therefore, operate in reverse; much of what I 'am' consists of everything of which I am not immediately aware, but can bring to mind from memory, or arrives unbidden or even against my preference from some internal store of depressing, frightening, disgusting or anxiety-provoking past experiences. I am many aspects that I am not thinking of.

Thus qualia, in their common usage, are closer to the individual components of awareness that represent our experiences of the external world and do not go nearly far enough as a characterization of the Cartesian experiential domain of consciousness as a whole, specifically including the thought processes of the will, the intellect, and the imagination.

4. Globalist Views of Consciousness

Expanding on Descartes' view, Baars et al. (2013) propose that the role of consciousness is to function as a global workspace that can first integrate and then propagate information from local centers for the coherent control of action. The end result of their analysis is a series of propositions relating to this global workspace conceptualization of consciousness that they regard as "testable" predictions of its validity. Consideration of these propositions, however, suggests that many of them are either untestable in the form proposed or not directly related to consciousness as the experienced process of our mental operations:

Some testable predictions (from Baars et al., 2013; Table 3)

1. The cortico-thalamic (C-T) system supports any-to-many binding and broadcasting of conscious contents. A bound conscious gestalt may emerge from anywhere in the cerebrum, and spread globally to all other regions for ∼100 ms.

> This is not formulated as a testable prediction because the neural correlate of “a bound conscious gestalt” is not known or specified.

2. Receiving networks adapt to novel information from broadcast sources. After widespread receiver adaptation (updating), broadcasts are driven out of consciousness by competing inputs.

> This is not formulated as a testable prediction because it requires the neural correlates of the conscious contents to be known, and to be identifiably in consciousness, before testing whether they are 'driven out' of it.

3. Posterior cortex generates perceptual conscious contents, while explicit feelings of knowing are generated from non-sensory regions (frontal, anterior temporal, parietal). Many cognitive tasks involve both a conscious perceptual and a reportable semantic broadcast, as shown in the extended ERP.

> The electrical signal of the ERP cannot show the involvement of "a conscious perceptual and a reportable semantic broadcast"; it can only correlate with it. Moreover, ERP source localization is not precise enough to establish whether the source is a broadcast process or not.

4. Since nearly all cortico-thalamic links are bidirectional, the cerebrum supports very widespread adaptive resonance (reentrant signaling). Signaling of conscious contents is superimposed on baseline resonant activity in the C-T core. Because of spatiotopic array organization in the cortex and thalamic nuclei, content signaling in the cerebrum is simultaneously spatial and temporal.

> No link is made in this statement between the bidirectionality of cortico-thalamic links and the presence of consciousness or the signaling of “conscious contents”. No testable prediction is proposed.

5. Goal-directed signaling in the C-T core is waking state dependent. Waking, dreaming and slow-wave sleep reflect distinct global modes. However, even slow-wave sleep may support waking-like activity during the UP phase of the slow oscillation.

> The implication of the statement is that all modes of sleep may be accompanied by consciousness, leaving the testable prediction obscure.

6. While many spatiotemporal codes may exist, cross-frequency phase coupling is thought to integrate the full range of C-T rhythms. Because conscious sensory events are integrated within 100 ms periods, 4–12 Hz rhythms may underlie conscious moments.

> There is already a well-established (inverse) relationship between 8-12 Hz (alpha) brain rhythms and consciousness. In fact, rhythms from 13-100 Hz are also associated with reductions in the alpha rhythm, implying a positive relation to consciousness. These are, however, more strongly associated with neurophysiological memory functions than with global workspace operations.

7. Effortful voluntary control involves binding and broadcasting from frontoparietal regions. Mental effort is an FOK (Feeling of Knowing) that is associated with major cognitive styles like persistence and general intelligence.

> This statement is not formulated as a testable prediction.

8. The hippocampal complex supports conscious event perception, as well as serving to encode episodic memory traces in multiple brain regions. Hippocampal lesions often lead to cortical reorganization of conscious sensory functions.

> Reorganization is not the same as elimination. The initial statement implies that hippocampal lesions should eliminate conscious event perception, whereas in fact they only eliminate the memory storage of it.

5. An Expanded Specification of the Properties of Consciousness (C*)

In view of the weakness of the attempts delineated above, we may develop a fuller set of properties of consciousness for which a testable NSCP is plausibly definable, namely, that consciousness (C*) is:

1. Private
2. Unitary
3. Interrogative
4. Extinguishable
5. Iterative
6. Operational
7. Multifaceted
8. Complexly interconnected
9. Autosuppressive
10. Self-referential

These ten properties may be explicated as follows:

1. Privacy: One of the irreducible properties of C* is its privacy. Pace science fiction, as far as we know, there is no way to share our individual C* with anyone else. Verbal and non-verbal means of communication provide effective means of generating the illusion of sharing C*, but (as too many lovers have found to their cost!), this is only a superficial level of apparent sharing, not a direct experience of another's true internal experience. To meet this criterion, the NSCP must be brain-compatible and not allow for direct interbrain communication. In the context of quantal theories of C*, this means that the NSCP must not be based on any non-local quantal effects. (Those who accept the non-locality of some superordinate cosmic C* will, of course, draw the opposite conclusion.)

2. Unity: Under ordinary conditions, C* is experienced as unitary at a given moment. We have one experience at a time, although we may be able to switch rapidly among multiple experiences over short time intervals. The NSCP must, accordingly, occur in a unified neural net of some kind in the brain rather than in multiple distinct brain sites.

3. Interrogacy: Though not widely recognized, a defining property of C* is the ability to generate questions and represent potential answers in an indeterminate superposition of probabilistic states. Other complex systems, such as galaxies, biological cells and the Internet, incorporate extensive recursive interactions and consist of energy processes that undergo development and evolution. Although they can be said to process information, they cannot meaningfully be said to ask questions. It seems to be a unique property of a conscious system to formulate questions, and something that gets switched on in humans at about the age of 1 year. This capability also entails (though perhaps not until a later age) the ability to envisage possible answers.

4. Extinguishability: A primary property of C* is its lack of continuity. It is extinguished every time we fall asleep, are anesthetized or are knocked out by a physical trauma, and is rekindled when we awake (and also, in a somewhat restricted form, when we dream). Although these states are deeply subjective, they can be attested in the form of memory markers of external events (such as our last memory of a radio program before dropping off to sleep or our first memory of a wake-up alarm). Dreaming is well established as being objectively indexed by rapid eye movements while asleep (REM sleep).

5. Iterativity: Another well-established property of C* is its tendency to iterate repeatedly through similar states, both when there are problems to be solved (like anxiously reiterating a worrying scenario) and as a form of pleasure (as in music, which reflects the consciousness of its listeners, and has continual repeats of phrases, themes and whole pieces over a wide range of time scales).

6. Operationality: The operational property is captured by the term "working" in the cognitive science concept of "working memory". In other words, "memory" is the ability to store representations of aspects of the world as stable brain states. The remarkable property of such memories is that we are not conscious of the millions of memories that are held in store most of the time. We only become conscious of them when they are accessed by C* for a brief time. There seem to be two forms of access: one that is well described as "working memory", in which some operation is performed on the stored information, and another, inoperative form of access that is usually included in the functional usage of "working memory" but does not involve any operational changes to the stored information.

7. Multifacetedness: C* by its nature incorporates all varieties of human experience, from logical thought processes and imaginary journey planning through the irreducible qualia of direct sensory and indirect imagery experiences to the array of emotional experiences and primary internal states such as pain and orgasm. Although we still may not be able to envisage what it would mean for the NSCP to exhibit, or possess, such experiences, it is a core requirement of the theory that it would be able to do so. At least in the case of thought or journey planning, the NSCP should be able to exhibit the activation of the specific memory states representing the sequential stages of the specific thought or journey in question.

8. Complex interconnectivity: C* is experienced as complexly interconnected, in the sense that it can proceed along many 'lateral thinking' paths from any one state to many others. This interconnected flexibility is part of its generative or creative power. It is not like a finite-state machine, which typically proceeds sequentially from any one state to a definite following state. C* is capable of exhibiting multiple connectivity from any facet to any other facet of human thoughts and feelings, unconstrained by logic. (Of course, in some cases well-trodden paths of thought do become established such that C* does operate analogously to a finite-state machine, but this is more the exception than the rule.) This property corresponds to the "global workspace" concept of Baars (1983).

9. Autosuppressivity: One of the sources of the variety and creativity of C* is that it tends to exhibit the property of burning out at any one state, suppressing the tendency to return to that state and thus impelling continuous movement to novelty. This is a well-known property of attention across the visual field (inhibition of return), and also a rule of good writing style, to avoid using the same term or phrase repeatedly in a text. Indeed, this is the opposite of the behavior of a classical finite-state machine, which repeatedly follows the same path from any given state (a toy sketch contrasting the two forms of behavior is given after this list). Autosuppressivity is thus a major contributor to the creativity of humans and other organisms, though it may be overridden by iterativity, the tendency to stay in the comfort zone of the same sequential paths of behavior.

10. Self-referentiality: A final property of C* is its ability to represent itself as a component of the conscious field. This property harks back to Russell’s Paradox as a seemingly impossible feat: what is the set of entities that includes itself as a member? But this is a common experience, that we can be (acutely!) aware of ourself as a participant in the field of C*. This property goes beyond the primary quality of referentiality of C*, that it has the inherent quality of referring to some form of object outside itself (or what philosophers misleadingly term ‘intentionality’).
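As a toy illustration of properties 8 and 9 above, the following sketch (in Python, with all state names, salience values and parameters chosen purely for illustration) contrasts a classical finite-state transition rule, in which each state has a single fixed successor, with a salience-driven selector in which any state can follow any other but each visit suppresses the visited state (inhibition of return), impelling the trajectory toward novelty. It is a schematic assumption, not a model drawn from the sources cited above.

```python
# Toy contrast between a finite-state transition rule (the foil of property 8)
# and a salience-driven selector with inhibition of return (property 9).
# All states, salience values, and parameters are illustrative assumptions.
import random

STATES = ["face", "fire", "music", "toe", "breadbox"]

# Finite-state machine: each state has exactly one successor.
FSM_NEXT = {"face": "fire", "fire": "music", "music": "toe",
            "toe": "breadbox", "breadbox": "face"}

def fsm_walk(start, steps):
    """Deterministic walk: the same state always leads to the same next state."""
    state, path = start, [start]
    for _ in range(steps):
        state = FSM_NEXT[state]
        path.append(state)
    return path

def ior_walk(start, steps, decay=0.5):
    """Salience-driven walk with inhibition of return: each visit suppresses
    the visited state's salience, impelling movement toward novelty."""
    salience = {s: 1.0 for s in STATES}
    state, path = start, [start]
    for _ in range(steps):
        salience[state] *= decay                      # autosuppression of the current state
        for s in STATES:                              # suppressed states slowly recover
            salience[s] = min(1.0, salience[s] + 0.05)
        candidates = [s for s in STATES if s != state] # any state can follow any other,
        weights = [salience[s] for s in candidates]    # weighted by residual salience
        state = random.choices(candidates, weights)[0]
        path.append(state)
    return path

if __name__ == "__main__":
    print("FSM:", " -> ".join(fsm_walk("face", 6)))
    print("IoR:", " -> ".join(ior_walk("face", 6)))
```

On repeated runs, the finite-state walk always retraces the same cycle, whereas the inhibition-of-return walk keeps moving on to recently unvisited states, which is the contrast that properties 8 and 9 are intended to capture.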

Specific examples of empirically definable tests for NSCP of these properties of consciousness are as follows:

1. Privacy. The obvious basis of the privacy of individual experience is the separate brain of each individual. While this test is passed, it is not easy to set up the converse case, of co-extensive brains for non-private experience. An aspect that relates to this issue, however, is the correspondence between brain states for comparable experiences across individuals. One form of the converse case is when people judge that they have similar individual experiences in particular situations, so these similar experiences should be expected to have similar NSCPs in terms of the recordable patterns of neural activity.

2. Unity. If a particular set of brain structures is identified as the NSCP, then they should either be structurally unitary (such as the anatomical neural net of the claustrum) or have demonstrably correlated (i.e., unitary) activity across the multiple anatomical structures when C* is reportable. To pass this test, the correlation across structures should account for all of their recordable activation above the noise level of the recording technique.

3. Interrogacy. One aspect of brain function that does not seem to have been investigated is the process of formulating questions, or interrogacy. Coming up with questions is a creative process that all philosophers and scientists engage in professionally, yet it has not been codified as a psychological process or studied in a neuroimaging context. The NSCP should be coextensive with the brain processes involved in interrogacy, once it is studied.

4. Extinguishability. The NSCP must exhibit the same time course of complete extinction as C* itself every time we fall asleep or are anesthetized, and be rekindled when we awake. This could be tested with a button monitor that has to be held down while we are falling asleep or being anesthetized, but will be released by the muscular relaxation that accompanies the onset of the sleep state. This would be most easily tested with continuous scalp EEG recording but could be attempted with fMRI (a minimal sketch of the corresponding analysis is given after this list). Note that decades of sleep research have established the psychophysiological parallelism between reported sleep states and EEG signatures. These show that the deep sleep associated with delta-wave activity (1-3 Hz) typically has little or no reportable conscious experience (Dement & Wolpert, 1958). Thus, the more rapid EEG normally associated with non-sleeping states qualifies as an NCC for C*. This result, however, illustrates the difference between an NCC and an NSCP, since the absence of delta waves does not qualify as a substrate, and the remaining EEG activity does not switch off during delta-wave sleep.

5. Iterativity. Any plausible NSCP measure must exhibit the iterativity of conscious experience over the experiential range of time scales. This was the case for the electrical stimulation of the cortex by Penfield (1958), where the same long-forgotten conscious memory sequence was repeatedly evoked by stimulation at the same site in the temporal lobe (but not any other part of the brain).

6. Operationality. This is a functional property that is readily accessible to empirical techniques such as functional Magnetic Resonance Imaging. Various NSCP brain sites associated with working memory have been identified through fMRI studies, but their specific roles and dynamics in relation to the operational properties of experiential working memory are not well understood.

7. Multifacetedness. The test for this property is that the neural activity proposed as a substrate for C* should be activated for all the multifaceted aspects of experiential consciousness. The testability criterion would be that any measurable neural process identified as the NSCP would be concurrent with one such experience, and vice versa, with no significant misses or false alarms in the coupling instances.

8. Complex interconnectivity. Any one aspect of consciousness, such as awareness of a face, is not a simple state but a multilevel complex of experiential components, from the basic raw feel to the communicative socio-emotional implications. To be explanatory, the NSCP should exhibit a similar variety of interconnectivity.

9. Autosuppressivity. Once again, the attentional autosuppressivity that keeps C* moving on from each identifiable state to the next is a further property that can be identified in candidate mechanisms for the NSCP.

10. Self-referentiality. Computationally, it is not difficult to achieve a program that includes itself as a component in its representation, a common feature of computer games known as an 'avatar' (a minimal sketch of this idea is given after this list). It escapes Russell's Paradox by not being a full representation that actually contains itself, but only a reduced representation of its major features in model form. It is not so clear how the neural implementation of an avatar could be achieved, but this is a further prerequisite of the NSCP. Note that this concept, of self-referentiality being a testable aspect of the NSCP while referentiality is not, is itself paradoxical. But self-referentiality can be tested by identifying a brain process that switches on and off concurrently with the switch between the self 'avatar' and other content, whereas referentiality cannot be tested because it is an unavoidable property of C*, and there is no non-referential form of C* against which to test the 'off' state of a candidate process.
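As a minimal computational illustration of the 'avatar' notion in point 10 above, the following Python sketch (with class and attribute names that are purely illustrative assumptions) shows an agent that maintains a reduced summary of its own state rather than a full copy of itself, and can report on itself through that summary without any Russellian regress.

```python
# Minimal sketch of an agent that carries a reduced self-model ("avatar"):
# the avatar summarizes a few features of the agent rather than containing
# the agent itself, so no Russellian regress arises.
# All class and attribute names are illustrative assumptions.

class Agent:
    def __init__(self, name):
        self.name = name
        self.percepts = []          # full, detailed internal state
        self.avatar = {}            # reduced self-model, updated on demand

    def perceive(self, item):
        self.percepts.append(item)

    def update_avatar(self):
        # The avatar records only summary features of the agent, not the
        # agent object itself; it is a partial, not a complete, self-representation.
        self.avatar = {
            "name": self.name,
            "n_percepts": len(self.percepts),
            "last_percept": self.percepts[-1] if self.percepts else None,
        }

    def report(self):
        self.update_avatar()
        return (f"{self.avatar['name']} is aware of {self.avatar['n_percepts']} items, "
                f"most recently '{self.avatar['last_percept']}'")

if __name__ == "__main__":
    a = Agent("observer")
    a.perceive("red ball")
    a.perceive("flickering red quale")
    print(a.report())   # the agent reporting on a reduced model of itself
```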
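Returning to test 4 (extinguishability) above, the envisaged comparison between the button-release time and the EEG could be quantified by tracking delta-band (1-3 Hz) power in successive windows of the recording. The following sketch, using numpy and scipy, is one hypothetical form such an analysis might take; the sampling rate, window length and threshold are assumptions chosen for illustration, not values drawn from the studies cited above.

```python
# Hypothetical analysis for test 4 (extinguishability): estimate delta-band
# (1-3 Hz) power in successive windows of one EEG channel and flag the windows
# in which delta dominates, for comparison with the button-release time.
# Sampling rate, window length, and threshold are illustrative assumptions.
import numpy as np
from scipy.signal import welch

def delta_ratio(eeg, fs=250, win_s=4.0):
    """Return, per window, the ratio of 1-3 Hz power to total 1-30 Hz power."""
    win = int(win_s * fs)
    ratios = []
    for start in range(0, len(eeg) - win + 1, win):
        f, pxx = welch(eeg[start:start + win], fs=fs, nperseg=win)
        delta = pxx[(f >= 1) & (f <= 3)].sum()
        total = pxx[(f >= 1) & (f <= 30)].sum()
        ratios.append(delta / total if total > 0 else 0.0)
    return np.array(ratios)

if __name__ == "__main__":
    fs = 250
    t = np.arange(0, 60, 1 / fs)                      # one simulated minute
    awake = np.sin(2 * np.pi * 10 * t[: len(t) // 2]) # alpha-like activity
    asleep = np.sin(2 * np.pi * 2 * t[len(t) // 2:])  # delta-like activity
    eeg = np.concatenate([awake, asleep]) + 0.1 * np.random.randn(len(t))
    ratios = delta_ratio(eeg, fs)
    # A window is flagged as "extinguished" when delta dominates the spectrum;
    # the onset of the flag would be compared with the button-release time.
    print(np.where(ratios > 0.5)[0])
```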

6. Functions of Consciousness

A further aspect of consciousness that is rarely considered is its evolutionary function (Bridgeman, 2011; Earl, 2014), as distinct from its neural integrative, adaptive and working memory functions, which are commonly highlighted (e.g., Baars et al., 2013). Indeed, many aspects of neural function are integrative, adaptive and mnemonic without passing the threshold of conscious awareness, such as the procedural memory functions of the cerebellum and basal ganglia. It is evident, therefore, that such neural functions do not require consciousness per se. Bridgeman (2011), for example, argues that consciousness allows organisms to avoid the tyranny of response to the immediate (e.g., Pavlovian) environment, allowing the organism to superpose its goal-directed needs into the situational response. He concludes that such behavior requires the operation of working memory, and that consciousness is therefore a particular form of working memory.

However, although goal-directedness may be a characteristic property of consciousness, it does not seem a sufficient criterion. Virtually all behaving organisms engage in such goal-directed behavior in one form or another, but we would hesitate to ascribe consciousness to all forms of goal-directed behavior (such as cows eating grass, for example). Indeed, goal-directed behavior can be observed even in single-celled micro-organisms such as dinoflagellates, whose hunting behavior is based on the information gleaned from their unitary subcellular eye (Schwab, 2012), as well as in simple animals such as planaria. Thus, behavioral goal-directedness is a property, indeed the essential property, of all behaving organisms, or animals, making it difficult to distinguish the role of goal-directedness in consciousness from its role in behavior as a whole.

An alternative view of the role of consciousness in working memory is that it represents the interface of the memory storage process. There is substantial evidence that we can only remember items from the sensory world that were attended (i.e., that were a focus of conscious awareness). Although unattended items may be processed in some form to allow their characterization as uninteresting targets for attention, through what is known as pre-attentive processing (Treisman, 1985), such items do not reach the site of memory. Only attended items can be remembered. It therefore seems that consciousness may represent the gateway to memory. While not all items that reach consciousness may be remembered, it seems to be the case that all items that are remembered must have reached consciousness.
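This gateway conception can be summarized as a toy processing pipeline: every item in the sensory stream receives pre-attentive characterization, but only attended items pass the gate into the memory store from which later recall is possible. The following Python sketch is a schematic assumption of such a pipeline, not an implementation of Treisman's model or of any specific memory system; its function names and feature labels are hypothetical.

```python
# Toy sketch of the "gateway to memory" claim: all items get pre-attentive
# processing, but only attended items (those reaching consciousness) are
# written to the memory store and can later be recalled. The pipeline is a
# schematic assumption, not an implementation of any published model.

def pre_attentive(item):
    """Coarse characterization available without attention (e.g., salience)."""
    return {"item": item, "salient": "red" in item or "loud" in item}

def run_trial(stream, attended):
    memory_store = []
    for item in stream:
        features = pre_attentive(item)        # every item is processed this far
        if item in attended:                  # only attended items pass the gate
            memory_store.append(features["item"])
    return memory_store

if __name__ == "__main__":
    stream = ["red ball", "grey wall", "loud horn", "phone number"]
    remembered = run_trial(stream, attended={"red ball", "phone number"})
    print(remembered)        # only the attended items can later be recalled
    # "grey wall" and "loud horn" were pre-attentively characterized but,
    # being unattended, never reached the memory store.
```

In such a scheme, 'consciousness' corresponds to the gating step, while 'memory' remains the storage function, in line with the distinction drawn in the following paragraphs.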

Although consciousness is thus a sine qua non for laying down the memory of an item, it is not the memory per se. Indeed, the very concept of memory implies a lack of consciousness, for the act of remembering corresponds precisely to bringing the item back into consciousness from its latent storage condition outside of consciousness. This lack of consciousness is evident for the vast range of items in long-term memory, such as the name of your first-grade teacher, which you may not have brought to consciousness for decades, but it is also true for short-term echoic memory. We have all had the experience of being told a phone number, then engaging in a competing activity, then being able to recall the phone number by accessing the internal auditory echo of that number, which is still available for a few minutes, though outside the immediate consciousness of the current focus of attention.

Equally, consciousness may be distinguished from the more interactive concept of working memory (Baddeley & Hitch, 1974), the earlier form of the global workspace that is currently being championed by Baars and associates (e.g., Baars et al., 2013). There are three aspects to these forms of operation, which form the core operations of the process we call ‘thinking’: the recall of items from memory, the working operations on those items, and the consciousness of them. Consider the quiz question of whether an item is bigger than a breadbox, for an item such as a rugby ball. We have to recall the item from memory, examine it to ascertain its dimensions, do likewise for the standard item of a breadbox, compare the two sets of dimensions (with appropriate rotation to the best-fitting orientation) and decide which is larger. Indeed, we have to decide which form of breadbox is intended, the single-loaf kind that would be too small for the rugby ball or the multi-loaf breadbin that would easily be large enough. Since these operations require the act of recall from memory, followed by operations on the recalled items, it seems a misnomer to call them a form of memory per se, albeit an active one. This was perhaps a strategy to avoid the use of the term ‘consciousness’ in the reductionist milieu of the mid-20th century, but it seems to inflate the storage function of memory to an implausible extent. It is preferable to reserve the term ‘memory’ for the storage function of retaining the information while moving it outside the theatre of consciousness.
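The breadbox quiz can be rendered as a short sketch that keeps the recall step separate from the working operations performed on the recalled items (the dimensions, function names and structure below are hypothetical illustrations, not claims about how the comparison is actually implemented neurally):

```python
# Toy sketch (hypothetical values): separating recall from the working
# operations performed on recalled items in the breadbox quiz.

# Long-term store: latent representations, outside consciousness until recalled.
LONG_TERM_MEMORY = {
    "rugby ball":            (30, 19, 19),   # approximate dimensions in cm (l, w, h)
    "single-loaf breadbox":  (35, 22, 18),
    "multi-loaf breadbin":   (45, 30, 25),
}

def recall(label):
    """Recall step: bring a latent item into the workspace."""
    return LONG_TERM_MEMORY[label]

def fits_inside(item_dims, container_dims):
    """Working operation: compare dimensions, allowing re-orientation (rotation)."""
    return all(i <= c for i, c in zip(sorted(item_dims), sorted(container_dims)))

# The quiz requires recall of both items before any comparison can be made,
# and the decision depends on which standard of 'breadbox' is intended.
ball = recall("rugby ball")
for standard in ("single-loaf breadbox", "multi-loaf breadbin"):
    box = recall(standard)
    print(standard, "->", "fits" if fits_inside(ball, box) else "does not fit")
```

The point of the sketch is simply that the storage (the dictionary) and the operations on its contents (recall and comparison) are distinct functions, which is why labeling the whole process ‘memory’ seems to overextend the term.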

Finally, what light does this analysis shed on the global workspace as the essence of consciousness? In the breadbox quiz, we become conscious of posing the question, of recalling the memory of the rugby ball, of its scale, of recalling the breadbox, of its relative scale, of aligning the two for comparison, and of the decision. But we are not conscious of all these factors at the same time. At least at the beginning of the process, while we are recalling the shape of the rugby ball as distinct from those of other sports balls, we are not conscious of considering the type of breadbox. It is only when all the components have been recalled from memory that we may perhaps be conscious of them all together. So, while the global workspace may be the specific arena of the operations of consciousness, it does not seem to be an accurate characterization of the core function of consciousness per se. Consciousness thus seems better characterized as the focus of operational attention within the global workspace, rather than as the global workspace as a whole.

In summary, the evolutionary function of consciousness may be not so much to introduce goal-directed aspects into the control of behavior as to act as the gatekeeper for memory storage, such that only aspects of the sensory input that pass the criterion for reaching consciousness can be stored in memory, while all other aspects are lost (Penfield, 1958). The stored memories themselves decay over time, so they too may eventually be lost, but many are retained for long periods or a lifetime, especially those that were experienced with heightened consciousness. Thus, while ‘attention’ describes the selective function determining which aspects of the sensory input are the focus of the gatekeeping function, ‘consciousness’ describes the activation level through which the sensory input becomes laid down as a memory trace, to be reinforced or reorganized in memory when recalled through the working memory mechanism.

7. Conclusion

To extend the probing to the neural substrate of conscious processing (NSCP), a full specification of the properties of consciousness as subjectively experienced has been provided in a form that is neuroscientifically testable. These properties were then considered against those of the global workspace conceptualization of consciousness to highlight the differences between it and the current framework of explicit testability, in which consciousness is conceived as the focus of operational attention by which transient sensory input is converted to long-lasting memories.

References

Baars, B.J., Franklin, S., Ramsoy, T.Z. (2013) Global workspace dynamics: cortical "binding and propagation" enables conscious contents. Frontiers in Psychology 4:200. doi: 10.3389/fpsyg.2013.00200.
Baddeley, A.D., Hitch, G.J. (1974) Working memory. In The Psychology of Learning and Motivation: Advances in Research and Theory, ed. G.A. Bower, pp. 47–89. New York: Academic Press.
Bridgeman, B. (2011) Functions of consciousness. Cognitive Neuroscience 2:115-116.
Chalmers, D.J. (1997) The Conscious Mind: In Search of a Fundamental Theory (Philosophy of Mind Series). Oxford: Oxford University Press, USA.

Dement, W., Wolpert, E.A. (1958) The relation of eye movements, body motility, and external stimuli to dream content. Journal of Experimental Psychology 55:543-553.
Descartes, R. (1985) The Philosophical Writings of Descartes. Trans. Cottingham, J., Stoothoff, R., Murdoch, D., Kenny, A., vol. 3. Cambridge: Cambridge University Press. II 113 / AT VII 160.
Earl, B. (2014) The biological function of consciousness. Frontiers in Psychology 5:697. doi: 10.3389/fpsyg.2014.00697.
Feinberg, T.E. (2012) Neuroontology, neurobiological naturalism, and consciousness: a challenge to scientific reduction and a solution. Physics of Life Reviews 9:13-34.
Feinberg, T.E., Mallatt, J. (2016) The nature of primary consciousness. A new synthesis. Consciousness and Cognition 43:113-127.
Fludd, R. (1619) De Triplici Animae in Corpore Visione. Tomus II tractatus I sectio I liber X. (image in public domain, copyright expired)
Humphrey, N. (1992) A History of the Mind. London: Chatto and Windus.
Koch, C. (2004) The Quest for Consciousness: A Neurobiological Approach. Englewood, NJ: Roberts & Company Publishers.
Koch, C. (2012) Consciousness: Confessions of a Romantic Reductionist. Cambridge, MA: MIT Press.
Leibniz, G.W. (1714) Principles of Nature and Grace Based on Reason, §4. In Philosophical Essays, trans. Roger Ariew and Daniel Garber (1989). Indianapolis: Hackett.
Metzinger, T. (2000) The subjectivity of subjective experience: A representationalist analysis of the first-person perspective. In Metzinger, T. (ed.), Neural Correlates of Consciousness: Empirical and Conceptual Questions. Cambridge, MA: MIT Press.
Nagel, T. (1974) What is it like to be a bat? The Philosophical Review 83:435-450.
Oizumi, M., Albantakis, L., Tononi, G. (2014) From the phenomenology to the mechanisms of consciousness: integrated information theory 3.0. PLoS Computational Biology 10(5). doi: 10.1371/journal.pcbi.1003588.
Penfield, W. (1958) Some mechanisms of consciousness discovered during electrical stimulation of the brain. Proceedings of the National Academy of Sciences, U.S.A. 44:51-57.
Schwab, I. (2012) Evolution's Witness: How Eyes Evolved. New York: Oxford University Press.
Tononi, G. (2008) Consciousness as integrated information: a provisional manifesto. The Biological Bulletin 215(3):216-242.
Treisman, A. (1986) Features and objects in visual processing. Scientific American 255:114-125.
Tyler, C.W. (2015) The Emergent Dualism view of quantum physics and consciousness. Cosmos and History 11:97-114.