SWIF Philosophy of Mind, 2 April 2001. http://www.swif.uniba.it/lei/mind/forums/seager.htm

Dispositions and Consciousness
Commentary on Carruthers, Peter. Phenomenal Consciousness. Cambridge: Cambridge University Press, 2000.

William Seager

Department of Philosophy
University of Toronto at Scarborough, Ontario (Canada).

Though my job and my aim are to criticize certain aspects of Peter Carruthers's theory of consciousness as presented in his Phenomenal Consciousness (2000), I want to begin by enthusing about Carruthers's book and the philosophical advance of which it is a part. One of the most important developments in the philosophy of mind over the last fifteen years or so is the rise of the representational theories of consciousness. Of course, some of the core ideas in this tradition can be traced back further than fifteen years! Notions of the essential "reflexivity" of conscious thought can be found in Kant, who perhaps took the idea from Leibniz, who clearly prefigures the representational account of consciousness when he says: "... it is well to make the distinction between perception which is the internal state of the monad representing external things, and apperception, which is consciousness, or the reflective knowledge of this internal state" (1714/1989, p. 208). A much more recent figure is David Armstrong, whose account of consciousness as a kind of "self-monitoring" inner perception in A Materialist Theory of the Mind (1968) at least anticipates the modern representational theories of consciousness. Armstrong, almost echoing Leibniz, wrote: "consciousness is simply a further mental state, a state 'directed' towards the original inner states" (p. 94). Significant developments in the representational account of consciousness that followed Armstrong include David Rosenthal's (1986) "Two Concepts of Consciousness" (Rosenthal has continued to develop and extend his theory in a number of influential papers), William Lycan's (1987) Consciousness (Lycan's theory is further developed in his 1996 book), and Gilbert Harman's "The Intrinsic Quality of Experience" (1990).

In 1995 a most remarkable event occurred: the publication of two books, written independently of each other, which espoused remarkably similar views on consciousness. These were Fred Dretske's Naturalizing the Mind and Michael Tye's Ten Problems of Consciousness. Both books argued that consciousness could be "reduced" to representation (in a certain sense) and that in particular the qualitative character of experience (or what philosophers call qualia) was itself a matter of representation. Their theories maintain that to experience red consciously is to represent a certain part of the environment (or imagined environment) as having a certain property; to experience pain is to represent a part of one's body (perhaps - in phantom limb pain for example - an "imaginary" part) as having a rather disagreeable kind of property. The fact that Tye and Dretske came to so many of the same conclusions once they had set out the basics of their views seems to indicate that the representational theory of consciousness has a lot of content which is constraining its final form, which is good, and rather unusual in philosophical constructions.

The work of Dretske and Tye, as my examples illustrate, made it clear there was a fundamental distinction within representational theories of consciousness: that between what Carruthers calls first order representation (FOR) based and higher order representation (HOR) based theories. Whereas Dretske and Tye opted for FOR theories, Rosenthal and Armstrong (not to mention Leibniz) opted for HOR theories. The debate between proponents of FOR and HOR theories is one of the most interesting and important issues in current philosophy of mind.

Both approaches have many virtues. Abstractly speaking, they promise to open a route towards the naturalization of mind and consciousness (thus perhaps solving the so-called hard problem), although this promise is dependent on the distinctly non-trivial task of providing an adequate naturalistic theory of representation and/or intentionality. Less abstractly, it seems these theories will also be able to integrate smoothly into the mainstream of cognitive science with its long standing emphasis on representation and the processing of representations. In addition to providing a strikingly novel approach to consciousness itself, the representational theories are a wellspring of potential insights into the nature of the mind with interesting, if sometimes contentious, things to say, for example, about introspection and its relation to consciousness, emotional consciousness, animal consciousness, and the role of the "theory-theory" account of our commonsense understanding of mind (here we see another interesting and more concrete integration with science, this time psychology).

Carruthers's book is a marvelous and wide-ranging critical introduction to the problem of consciousness from a representational theory point of view. One of the main reasons it provides such an incisive review of current views about consciousness is that Carruthers also develops and defends a highly original version of the representational theory of consciousness which boasts a remarkable degree of depth and precision. The book is so rich that I will be able to comment on only a few points within it and the fact that my discussion will be critical indicates only that I want to focus on a central theme about which I have some worries.

I think we can outline Carruthers's position, briefly and crudely, as follows. First, Carruthers prefers a HOR to a FOR theory of consciousness. The basic reason for this preference is that the FOR theory lacks the resources to distinguish, in a principled way that springs naturally from the underlying theory, conscious from non-conscious representations. This is a serious problem since, given the kind of cognitive-science-inspired background theories of the mind which most FOR theorists would endorse, there are inevitably going to be huge numbers of representations loose and active within any cognitive system, only a tiny fraction of which, presumably, are conscious. HOR theories avoid this problem by explicitly providing a mechanism by which only some representations become conscious, namely by being "taken up" into a higher order representation. This mechanism is well able to distinguish conscious from non-conscious representations without impairing the cognitive function of the non-conscious representations.

The HOR theories can also be divided up in various ways. One distinction is between higher order experience (HOE) theories and higher order thought (HOT) theories. Carruthers prefers the latter. Roughly speaking, the reason is that our consciousness of the world is of the world rather than of the mind, or, as it is sometimes put, consciousness is transparent; we see right through our experience to the world beyond, so to speak. This is odd if our account of consciousness is based on our having quasi-perceptual experiences of inner states, for we might then expect that these experiences have their own set of perceptible (to inner sense now) phenomenological qualities and, in all honesty, there is just no trace of any such qualities (or else in perception we are completely cut off from the world, never "observing" anything but our own sense data). Higher order thought theories avoid this problem simply because thoughts do not come with any intrinsic phenomenology but they can be about any kind of phenomenological state, or indeed any kind of state, whatsoever.

We need one more distinction to situate Carruthers's view, which is the distinction between actualist and dispositionalist theories. The actualist HOT theories define consciousness in terms of the actual occurrence of an appropriate higher order thought about a lower order state so that only those states are conscious which in fact bring about an occurrent higher order thought about them; the dispositionalist HOT theories require less for a lower order state to be a conscious state, namely that the lower order states be disposed to produce such higher order thoughts. But there is no requirement that they exercise their ability. I believe that the dispositional higher order thought theory of consciousness is original to Carruthers. It might seem odd to analyse the vivid, totally occurrent phenomenon of consciousness in terms of mere dispositions, but there are advantages. For example, the dispositionalist account offers - literally in this instance - a kind of economy in that it reduces the number of actual mental states which the brain has to produce and Carruthers uses that in an argument to the effect that evolution would be unlikely to build a brain that is given to the promiscuous proliferation of higher order thoughts for every aspect of our apparently exceedingly rich and varied conscious experience.

In any case, I don't want to consider the arguments which Carruthers advances against actualist HOT theory but instead raise an objection against two versions of the dispositionalist account, both of which have been advanced by Carruthers. The first is the reflexive account which first appeared in Carruthers's Language, Thought and Consciousness (1996) and from which Carruthers steps back in Phenomenal Consciousness. However, the present retreat is merely tactical so far as I can see, and in fact Carruthers goes so far as to say that the reflexive account, which is described in some detail, may be an accurate description of the mind as it actually is (see pp. 271 ff.). It is just that the full-featured reflexive account is not needed for the "restricted purpose" of the present volume - I have to put that in scare quotes seeing that this restricted purpose is nothing less than that of explaining consciousness. The difference between the old reflexive account and the non-reflexive account actually endorsed by Carruthers now depends upon yet another distinction. Whether or not one is a dispositionalist HOT theorist, one can wonder about the status, with respect to their consciousness, of the higher order thoughts. These thoughts can themselves either possess or not possess the disposition to produce yet higher order thoughts. The reflexive account asserts that the higher order thoughts do possess these dispositions; that is, it makes the higher order thoughts themselves conscious thoughts.

Now, we might pause here to ask whether it is even possible for actualist HOT theories of consciousness to require that the higher order thought, call it T[x], which makes the lower order state x conscious, be itself a conscious mental state. The answer is apparently an obvious "no" since such a requirement would generate a vicious infinite regress of nested conscious states. Not only is it phenomenologically plain that when I am conscious of some mental state, x, I am not also conscious of each of an infinite hierarchy of states T[x], T[T[x]], ..., T[...T[x]...], etc., but there must also be neurologically founded limitations on the number and complexity of thoughts that any of us can actually entertain at one time. But on the other hand, HOT theory cannot rule out the possibility of any particular higher order thought's being conscious since, generally speaking, it is certainly possible to become consciously aware of any particular lower order thought. Actualist HOT theory has no difficulty here. One can become conscious of the higher order thought, T[x], that makes x a conscious state if T[x] should happen to bring about the still higher order thought T[T[x]], but there is no requirement that this thought should occur to one in order for x to be a conscious state. Since it seems evident that we are often conscious without being conscious of being conscious but that sometimes we do enjoy such higher order consciousness, this would appear to be a required feature of any theory of consciousness.

Although the actualist account cannot allow that the consciousness-conferring higher order states are themselves conscious states, it seems clear that the dispositionalist HOT theory can avoid the regress without denying consciousness to the higher order thoughts. So if such reflexivity were indeed a feature of consciousness this would be a significant advantage for a dispositionalist account. I doubt, however, that such reflexivity could be a feature of consciousness just because only dispositional HOT theory can accommodate it. For, I suspect, the dispositional reflexive account itself suffers from a fatal flaw which is closely related to the regress problem which dooms an actualist reflexive theory. The objection is very simple.

Let us consider whether there is a limit, imposed by the finiteness and particularity of our cognitive architecture, on the complexity of nested thoughts we can entertain. It seems as certain that there are thoughts of the form "I am aware that I am aware that I am aware .... that P" which are sufficiently deeply nested as to be in fact entirely incomprehensible, given normal human cognitive capacities, as that there are numbers which are too big for my calculator to multiply. I suspect that this limit is in fact quite low - at least for me - as I have a good deal of difficulty being aware of just a few levels of awareness. This psychological difficulty or weakness, which I think everyone will readily agree is real, in fact shows that higher order thoughts are indeed more complex - require more mental machinery to entertain - than the lower order thoughts they are about.

Suppose, then, and without loss of generality, that the level of nesting at which nested thoughts become unentertainable is n. Then it is easy to show that no thought of level n-1 could be conscious. For if it is impossible to entertain, because of inherent cognitive limitations, a nested thought of level n, it cannot be the case that any thought is apt to cause a nested thought of level n. (Any more than a certain kind of ill-made match could be disposed to light if struck if, because of inherent chemical conditions, no striking could raise its temperature sufficiently to ignite it.) Obviously, if a level n-1 thought cannot cause a level n thought it cannot cause a level n conscious thought, and so, according to the dispositional conscious, or reflexive, HOT theory, the level n-1 thought cannot be conscious. But if no level n-1 thought could be conscious then no thought of level n-2 could be disposed to produce a conscious level n-1 thought (even if it might be disposed to produce a level n-1 thought). Hence, no level n-2 thought could be a conscious thought. This argument by "vicious descent" can clearly be generalized as far as necessary, with the disastrous result that no mental state can be conscious according to the reflexive HOT theory.
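
The shape of this descent can be displayed in a toy model. What follows is a schematic sketch of my own - the cap MAX_LEVEL and the function names are illustrative inventions, not anything drawn from Carruthers's text - which renders the reflexive condition as a recursion over nesting levels:

    # A toy rendering of the reflexive (dispositional-conscious) HOT theory.
    # Assumption, purely for illustration: no thought nested more deeply than
    # MAX_LEVEL can be entertained by the system at all.

    MAX_LEVEL = 5  # the level "n" of the argument; the exact value is irrelevant

    def entertainable(level):
        """A thought of this nesting depth can actually occur in the system."""
        return level < MAX_LEVEL

    def disposed_to_cause_hot(level):
        """A level-k thought can be apt to cause a level-(k+1) thought only if
        a level-(k+1) thought is entertainable at all."""
        return entertainable(level + 1)

    def conscious_reflexive(level):
        """Reflexive account: a thought is conscious iff it is disposed to cause
        a higher order thought about it which is itself a conscious thought."""
        return disposed_to_cause_hot(level) and conscious_reflexive(level + 1)

    def conscious_nonreflexive(level):
        """A toy rendering of the non-reflexive account: being disposed to cause
        a higher order thought suffices, whether or not that thought would
        itself be conscious."""
        return disposed_to_cause_hot(level)

    print([conscious_reflexive(k) for k in range(MAX_LEVEL)])
    # [False, False, False, False, False] - the failure at level n infects every level
    print([conscious_nonreflexive(k) for k in range(MAX_LEVEL)])
    # [True, True, True, True, False] - only level n-1 is barred from consciousness

The recursion simply makes explicit that, on the reflexive account, the impossibility at the top of the hierarchy propagates all the way down, whereas the non-reflexive account loses only the topmost entertainable level (a point to which I return below).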

Biting, instead of dodging, the bullet, it might perhaps be replied that there is no level of nested thought which is impossible to entertain. It is true that the concept of thought, and that of the entertaining of thoughts, admits of no intrinsic, purely abstract, limitation in complexity. But the point here is that there is a natural limitation, imposed by the cognitive architecture implemented by the finite brain, on the complexity of entertainable thoughts. This limitation is based upon natural laws which reveal to us the range of possible dispositions which our neurological machinery can instantiate. The dispositional conscious HOT theory cannot avail itself of the mere abstract structural possibilities of thought, since it depends upon the actual dispositions inherent in the brains or minds we actually possess. The distinction appealed to here between the abstract structure of thought and our actual dispositions to entertain thoughts is of course reminiscent of Chomsky's distinction between competence and performance. While competence - the abstract structural possibilities of language - is not limited by natural constraints, it would be ludicrous to claim that there was an actual human disposition either to produce or to understand sentences with, say, 10^37 nested relative clauses.

I think the only conclusion that can be drawn is that the dispositional conscious HOT theory, or the reflexive theory, cannot be correct (at least for finite, real-world cognitive systems).

But even if this objection is granted, it does not touch the non-reflexive account which Carruthers officially advances in Phenomenal Consciousness. This theory is immune to the vicious descent argument; applied to it, the argument yields only the "theorem" that there is a level of nested thought of which we cannot be conscious (one level below the level at which we can no longer entertain thoughts of that complexity at all). Such cognitive limitations on consciousness seem entirely acceptable, not unexpected, and in fact phenomenologically highly plausible.

Still, there is a somewhat more subtle difficulty which can be raised against the non-reflexive account and which I think is at least an interesting challenge to the dispositionalist approach in general and perhaps threatens to undercut it entirely.

In actualist HOT theory, if we can prevent the higher order thoughts from occurring we can prevent the lower order states from being conscious. Thus, for example, if we had some kind of machine, call it a "neural meddler", that interfered with the causal mechanisms which normally enable the lower order state to produce the higher order thought which makes the former conscious, then we would have a "consciousness inhibitor". Of course, such mechanisms exist on either the actualist or the dispositionalist version of HOT theory. For example, Carruthers imagines (see pp. 228 ff.) that contents are sent from perceptual systems to a special short-term memory buffer which has access under certain conditions to the systems which realize conceptual thinking. This access is limited to the direct, non-inferential production of higher order thoughts about the current contents of this buffer. It is important that the production of the higher order thoughts be non-inferential in order to avoid strong objections based upon coming to know that one is in a lower order state in ways that intuitively do not lead to a state of consciousness (for example, if a reliable source simply informs one that one is in a certain mental state, one will come to think a higher order thought about the original state, but not in a way that seems to entail that the original state is a conscious state). But this issue does not matter for what follows here.

In the case of the dispositional HOT theories, the operation of a neural meddler is slightly less straightforward than in actualist theories. Part of the reason for this has to do with the nature of dispositions. Consider an ordinary match. It has a disposition to light if struck. Does it have this disposition in a vacuum, where it cannot light no matter whether or not it is struck? I'm not sure if there is a definite answer to this question, but I feel sure that if we meddled with the match itself - as opposed to altering the external circumstances - we could destroy the disposition. If, for example, we actually struck and lit the match and then blew it out, we would thereby have a match that definitely lacked the disposition. More subtle methods of "disabling" the disposition are easily imagined - such as coating the match with some kind of irremovable wax. Similarly, if we meddle with someone's brain so that the lower order states are made incapable of causing the appropriate higher order states, we have, as the phrase "incapable of causing" suggests, eliminated the disposition to cause higher order states. In the kind of "boxology" envisioned by Carruthers there are doubtless very complex neural processes mediating the relation between lower order states and the higher order thoughts which make the former conscious states, and they could presumably be interfered with in a multitude of ways (some of which might even be invoked to account for certain of the bizarre deficits of consciousness, such as Capgras syndrome, blindsight, etc.). Thus a neural meddler would eliminate consciousness under the dictates of the dispositional HOT theory, in much the same way that consciousness would be eliminated by the prevention of the occurrence of higher order states under the dictates of the actualist HOT theories. Another way to put this point, in terms that Carruthers favours, is that the neural meddler would prevent the lower order states from being available to consciousness, and without this availability there can be no consciousness of those states (on either the dispositional or actualist theories).

But if this is so, dispositional HOT theories face a serious objection. To advance it I have to enter the realm of pure philosophical thought experiment, but, I think, in a legitimate fashion. Consider, now, a modified neural meddler which blocks the disposition to cause higher order states only for those states and for those time periods when the lower order states would not, in fact, cause a higher order thought. (To continue the analogy with the match, we can imagine a device that somehow only coats with wax matches that are not actually going to be struck.) Such a meddler would be extremely difficult to produce in practice (not that the original meddler is exactly "off the shelf" machinery just yet!) since it requires an ability to know under what conditions a lower order thought will occur but will not actually cause a higher order thought. If we suppose that it is in principle possible to predict the operation of a brain at the neural level, then the information from such predictions could be fed into the meddler so that it would be active only for those lower order states which possess the disposition to bring about higher order thoughts about themselves but which, during the relevant time period, are in fact not going to exercise this disposition and cause the higher order thought. Of course, the practical difficulty of developing such a meddler is irrelevant to the point of principle at issue here.

Now, a curious consequence follows. Let us take two people, the first with one of these modified neural meddlers implanted in his or her brain and the second without, but who otherwise begin in identical neurological states (and in identical environments). Both of these people will have an identical history of higher order thoughts, since the meddler will never prevent a lower order state that actually was going to produce a higher order thought from causing that thought. They will also have identical histories of lower order mental states, for the meddler has no effect on these. Yet they will be markedly different in their states of consciousness, for the unfortunate person with the meddler will lack an entire set of conscious states enjoyed by the other - namely those that as a matter of fact do not produce any higher order thoughts (but - in the unmeddled brain - could have). This is a necessary consequence of the dispositional HOT theory, since it is explicitly designed to allow that states are conscious simply if they are able to produce higher order thoughts, rather than only if they actually do produce those thoughts.

This consequence of dispositional HOT theory is not only curious, it is disturbing and highly implausible. There is absolutely no difference in the behaviour of our two individuals, no difference in their history of mental states and no difference in the neural events and processes which are occurring within them. There is nothing to mark the difference between the two of them except an entirely inert meddler. The meddler never has to actually function to produce this drastic alteration in consciousness. That is, two brains identical in their neural states and their dynamics will differ in consciousness solely because one has an inert piece of machinery within it! No such implausibility follows from actualist HOT theory.
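
A bare-bones formalization may help to fix the structure of the comparison. The sketch below is again my own (the dataclass, its field names and the two tallying functions are illustrative inventions; the "actually_causes_hot" flag simply encodes the in-principle predictability of the brain's operation granted above):

    from dataclasses import dataclass

    @dataclass
    class LowerOrderState:
        name: str
        disposed_to_cause_hot: bool
        actually_causes_hot: bool  # fixed by the (predicted) actual course of events

    def apply_modified_meddler(states):
        # The modified meddler removes the disposition only from states that were
        # never going to exercise it, so the actual history of higher order
        # thoughts - and hence of neural events - is left entirely untouched.
        return [LowerOrderState(s.name,
                                s.disposed_to_cause_hot and s.actually_causes_hot,
                                s.actually_causes_hot)
                for s in states]

    def conscious_actualist(states):
        return [s.name for s in states if s.actually_causes_hot]

    def conscious_dispositional(states):
        return [s.name for s in states if s.disposed_to_cause_hot]

    history = [LowerOrderState("a", True, True),
               LowerOrderState("b", True, False),   # disposed, but never exercised
               LowerOrderState("c", False, False)]

    meddled = apply_modified_meddler(history)

    print(conscious_actualist(history), conscious_actualist(meddled))
    # ['a'] ['a'] - identical verdicts on the actualist theory
    print(conscious_dispositional(history), conscious_dispositional(meddled))
    # ['a', 'b'] ['a'] - state "b" loses its consciousness on the dispositional theory

The actualist tally is the same for the meddled and the unmeddled subject, while the dispositional tally shrinks for the meddled one, even though nothing in the actual run of events differs between them; this is just the asymmetry complained of above.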

In a way, this objection appeals to the feature of consciousness which I mentioned above. Consciousness is a paradigm of something which is happening, something which is, so to speak, totally occurrent. It seems impossible, in the face of the identical nature of the neural processes which are occurring in our two systems, that the existence of consciousness could depend upon purely counterfactual possibilities of differences in the system's behavioural capacities.

Perhaps the implausibility can be underlined if we imagine that the modified meddler is oscillating between being "off" - incapable of functioning - and "on" - capable of functioning, even though it never will function. There will be a corresponding oscillation in consciousness (more conscious states when the meddler is disabled, fewer when it is enabled) which would presumably be very striking but which would in fact seemingly be completely unreportable by the subject, despite being a huge difference in phenomenological experience.

There is a kind of "inverted" version of this objection. Consider a device that increases or boosts the aptitude of a lower order mental state to produce higher order thoughts (call it a boosting meddler - in terms of our match analogy, a boosting meddler might be something that modifies the match so that it will ignite at a lower temperature, thus increasing its aptitude to light when struck). Under actualist HOT theory, such a device would increase the number of conscious states inasmuch as it would increase the number of higher order thoughts that are actually brought about by lower order states. This effect would be apparent under dispositional HOT theory as well. But, as before, dispositional HOT theory permits a more subtle tampering with consciousness. There must be a distinction between those lower order states which do and those which do not possess the disposition to bring about higher order thoughts, and so there must be some neurological basis for this difference. Thus we can imagine implanting a modified boosting meddler, which creates the aptitude to cause higher order states in lower order states which (1) would not otherwise have the disposition to bring about higher order states and which (2) even with the boosting meddler in place will never in fact exercise this disposition and actually cause a higher order thought. Again, this would require remarkable knowledge (and foreknowledge) of how a particular neural system is going to work, but we can grant this knowledge in principle. (The analogous device in the match example would only modify those matches that are in fact not going to be struck. So the modified boosting meddler never makes any difference to which matches actually light or do not light.) So, even though there is absolutely no increase in the number of higher order thoughts, there is a striking increase in consciousness. Perhaps this result is less implausible than the previous one, insofar as in this case the meddler actually does some work, but it remains very implausible.

One reply that might be made on behalf of the dispositional HOT theory is to invent a distinction between dispositions so that only some of them are such as to yield consciousness. This would be a dangerous move, however, since it would open this HOT theory to the same objection raised against FOR theories (such as Dretske's or Tye's), namely, the demand for a principled distinction between those states which are and those which are not conscious. If the dispositional HOT theorist can say: conscious states are those which are properly disposed to produce higher order thoughts about them, then it seems the FOR theorists can with equal justice say that those representations are conscious which are of the proper form and function.

Perhaps there is a simple reply to these lines of objection which reveals a misunderstanding of the theory on my part, but they seem to me to raise a serious foundational worry about the basic structure of the view. Even so, Carruthers's theory remains an intriguing and original addition to the growing range of representational theories of consciousness, which is the most exciting area in consciousness studies at the moment. Carruthers's book is now an essential part of the literature of this area.


References

Armstrong, David (1968). A Materialist Theory of the Mind, London: Routledge and Kegan Paul.

Carruthers, Peter (1996). Language, Thought and Consciousness, Cambridge: Cambridge University Press.

Carruthers, Peter (2000). Phenomenal Consciousness, Cambridge: Cambridge University Press.

Dretske, Fred (1995). Naturalizing the Mind, Cambridge, MA: MIT Press.

Harman, Gilbert (1990). "The Intrinsic Quality of Experience" in J. Tomberlin (ed.) Philosophical Perspectives, vol. 4, pp. 31-52.

Leibniz, G. W. (1714/1989). "Principles of Nature and Grace", in R. Ariew and D. Garber (trans. ed.) G. W. Leibniz: Philosophical Essays, Indianapolis: Hackett.

Lycan, William (1987). Consciousness, Cambridge, MA: MIT Press.

Lycan, William (1996). Consciousness and Experience, Cambridge, MA: MIT Press.

Rosenthal, David (1986). "Two Concepts of Consciousness," Philosophical Studies, 49, pp. 329-59.

Tye, Michael (1995). Ten Problems of Consciousness, Cambridge, MA: MIT Press.

_____________
© 2001 William Seager