Sunday, January 17, 2021

Counting Conscious States

Information is physical, which means there is a limit to the amount of information that can fit in a given volume.  If you try to cram more information into that volume, the matter encoding that information will literally collapse into a black hole.  That limit is called the Bekenstein bound, and it is truly massive.  For instance, the maximum information that could be contained in a volume the size of the human brain is around 10^42 bits, which means that the total number of possible brain states is around 2^(10^42).  The informational capacity of the entire visible universe is around 10^120 bits, corresponding to roughly 2^(10^120) possible states.
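
For the curious, here is a minimal back-of-the-envelope sketch of that bound in Python, assuming a typical brain mass of about 1.5 kg and volume of about 1260 cm^3 (those inputs are my assumptions for illustration, not numbers from this post):

    import math

    # Rough Bekenstein-bound estimate for a brain-sized system.
    # Assumed inputs: mass ~1.5 kg, volume ~1260 cm^3 (typical textbook values).
    hbar = 1.054571817e-34           # reduced Planck constant, J*s
    c = 2.998e8                      # speed of light, m/s

    mass = 1.5                       # kg
    volume = 1.26e-3                 # m^3
    radius = (3 * volume / (4 * math.pi)) ** (1 / 3)   # equivalent sphere radius, ~0.067 m
    energy = mass * c**2             # rest-mass energy, J

    # Bekenstein bound in bits: I <= 2*pi*R*E / (hbar * c * ln 2)
    bits = 2 * math.pi * radius * energy / (hbar * c * math.log(2))
    print(f"{bits:.1e} bits")        # ~2.6e42, i.e. around 10^42 bits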

Why does this matter?  Physicalism (as contrasted with dualism) says that conscious states are produced by physical states; if a first conscious state is distinct from a second conscious state, then they must be produced by different physical states.  All of my papers (and most or all of my blog posts) so far have assumed physicalism is true, in part because anyone who doubts physicalism is usually condescendingly dismissed, ignored, or scoffed at by the scientific community, and in part because I don’t see why the Creator of the already ridiculously complicated universe would have omitted a physical explanation/mechanism for consciousness.  In other words, unless there is a reason to believe that consciousness does not entirely depend on underlying physical state, I see no need, for now, to reject physicalism.  Nevertheless, physicalism would be falsified if one could show that the number of distinct conscious states exceeded the number of physical states, because that would require that a single physical state could produce more than one distinct conscious state.

One avenue for evaluating physicalism, then, is to literally count distinct conscious states.  For example, if one could show that the total number of possible distinct conscious states achievable by a particular person exceeded 2^(10^42), then that would prove that consciousness cannot depend (entirely) on the brain; if one could show that the number of possible distinct conscious states exceeded 2^(10^120), then that would literally falsify physicalism.

A few years ago, Doug Porpora wrote a fascinating paper that attempted to prove that the total number of distinct conscious states is actually infinite.  One of his arguments, for instance, is that if we assume that there are some natural numbers that we cannot think about, then there must be a maximum number (call it Max) that we can think about and a minimum number (call it Min) that we cannot think about.  But if we can think about Max, certainly we can think about Max+1 or Max^2, which means that Max is not the maximum number we can think about and the original assumption (that there are some natural numbers that we cannot think about) is false.  A related argument is that by identifying the minimum number that we cannot think about (and even naming it Min), we are thinking about Min, which means that Min is not in the set of numbers that we cannot think about!  Again, the original assumption is false.  There is more to the argument than this but it gives you the general flavor of its proof-by-contradiction strategy.  One commenter has attempted to refute Porpora’s argument in this paper, and Porpora may be working on a reply. 

This got me thinking again about the importance of counting distinct conscious states, which very few people seem to have attempted.  Of course, if Porpora’s logical argument is correct, then physicalism is false, because even though 2^(10^120) is a ridiculously and incomprehensibly large number, it is still trumped by ∞.  But we should also realize that both of the quantities we are considering are extremes.  Infinity is extreme, of course, but so is the Bekenstein bound.

Let’s take a more realistic approach.  There are something like 100 billion neurons in the human brain.  If each neuron acts like a digital bit, then the total number of distinct brain states is 2^(100 billion).  Of course, neurons are actually complex cells with very complicated connections to each other, and I don’t think any neuroscientist seriously regards them as acting in any way like digital bits.  However, I do think it is interesting to ask whether or not the number of distinct conscious states exceeds 2^(100 billion).  If there were a way to answer that question – by somehow counting conscious states – then it would do a couple of things:

·         Assuming physicalism is true, discovering that the number of distinct conscious states exceeds 2^(100 billion) would confirm that the brain is not a digital computer with neurons acting as digital bits.

·         It would provide a methodology for counting conscious states that may provide further insights about the physical nature of consciousness.

On that note, let me suggest such a method.  First, let me start with the notion of one stimulus “frame,” which is the particular collection of physical stimuli that one might detect through the five senses at any given moment.  Let’s assume that there are N consciously distinct (frames of) stimuli.  What I mean by that is that there are N different combinations of stimuli from the person’s senses that the person would be able to distinguish.  Consider these different sets of stimuli:

·         Watching a sunset while hearing crashing waves while tasting white wine while smelling the salty ocean while feeling sand under one’s feet;

·         Watching a sunset while hearing crashing waves while tasting red wine while smelling the salty ocean while feeling sand under one’s feet;

·         Watching a sunset while hearing seagulls while tasting white wine while smelling the salty ocean while feeling sand under one’s feet.

If we actually took the time to list them, we could certainly produce a very, very long list of consciously distinct stimuli.  Some of them might differ very subtly, such as two stimuli that are identical except for the temperature of the sand differing by one degree, or a slight difference in sound frequency distribution from the seagulls, or a slight but perceptible difference in the cloud distribution above the sunset.

What matters, in enumerating consciously distinct stimuli, is whether a person could distinguish them, not whether he actually does.  If he could distinguish two stimuli, either by consciously noticing the difference or simply having a (slightly) different conscious experience based on the difference, then that difference must be reflected in the underlying physical state.

So how many such distinct stimuli are there?  Lots.  One could certainly distinguish millions of different visual stimuli, many thousands of different sounds and tactile sensations, and at least hundreds of different tastes and smells.  This is a ridiculously conservative claim, of course; there are professional chefs, for example, who can probably differentiate millions of different tastes and smells.  Even on this very conservative basis, there are probably far, far more than 10^18 (around 2^60) distinct stimuli for any given person.  If there were only 10^18 distinct conscious states or experiences, then in principle it would require only about 60 bits to specify any one of them.
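
As a rough sanity check, here is a minimal sketch that treats the five senses as independent so that their counts multiply; the per-sense figures are just the conservative illustrative numbers from the paragraph above:

    import math

    # Conservative, illustrative per-sense counts of distinguishable stimuli
    visual = 10**6    # millions of distinguishable visual stimuli
    sound  = 10**4    # many thousands of distinguishable sounds
    touch  = 10**4    # many thousands of distinguishable tactile sensations
    taste  = 10**2    # hundreds of distinguishable tastes
    smell  = 10**2    # hundreds of distinguishable smells

    # If the senses vary independently, the combinations multiply.
    N = visual * sound * touch * taste * smell
    print(N)                # 10^18 distinct stimulus "frames"
    print(math.log2(N))     # ~59.8, i.e. about 60 bits to specify one frame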

However, history matters.  Conscious experience does not depend just on one’s stimuli in the moment, but also on prior stimuli (as well as prior conscious experience).  To specify a person’s conscious experience, it is not enough to specify his current stimuli, as his experience will also depend on past stimuli.  For example, imagine the different conscious experiences at time t1:

Case A – No significant change from t0 to t1:

t0: Watching a sunset while hearing crashing waves while tasting red wine while smelling the salty ocean while feeling sand under one’s feet.

t1: Watching a sunset while hearing crashing waves while tasting red wine while smelling the salty ocean while feeling sand under one’s feet.

Case B – Significant change from t0 to t1:

t0: Watching a sunset while hearing crashing waves while tasting white wine while smelling the salty ocean while feeling sand under one’s feet.

t1: Watching a sunset while hearing crashing waves while tasting red wine while smelling the salty ocean while feeling sand under one’s feet.

The stimulus at t1 is the same in both cases, but the conscious experience would clearly be different.  In Case A, the person may simply be enjoying the surroundings, while in Case B, he may be confused/surprised that his wine has suddenly changed flavor and color.

What that means is that even if the information necessary to specify the particular stimulus at time t1 is 60 bits, that information is not sufficient to specify the person’s conscious experience at that time.  In other words, history matters, and instead of just counting the number of possible distinct stimuli, we need to consider their order in time.

So, for N consciously distinct stimuli, let’s assume that one’s conscious experience/state at a given time is sensitive to (i.e., depends on) the time-ordering of M of these stimuli.  The total number of possible states is then the number of permutations, N!/(N-M)!, which for N >> M is approximately N^M.
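
Here is a minimal numerical check of that approximation; the values of N and M below are small placeholders rather than the actual numbers discussed above:

    import math

    # Number of time-ordered sequences of M distinct stimuli drawn from N:
    #   N!/(N-M)! = N * (N-1) * ... * (N-M+1)
    # When N >> M, every factor is close to N, so the product is close to N**M.
    N, M = 10**6, 3

    exact = math.perm(N, M)     # N!/(N-M)!
    approx = N**M
    print(exact / approx)       # ~0.999997 -- the approximation is excellent for N >> M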

So in the above example, the number of possible physical states necessary to allow the person to consciously distinguish Case A from Case B is not N, but N^2.  If specifying one of N stimuli requires, say, 60 bits of information, then at least 120 bits are required to specify his conscious state at time t1.  But of course the situation is far worse.  We can imagine a series of ten consecutive stimuli, ending at time t9, which the person would consciously experience in a manner that depended on all ten stimuli and their order.  It makes no difference whether the person actually remembers the particular stimuli or their order of progression.  As long as he has a conscious experience at t9 that is in some (even minuscule) manner dependent on the particular stimuli and their order, then that conscious state is one of at least N^10 states, requiring at least 600 bits to specify.

Now note that his experience at t9 is a unique one of at least N^10 states, just as his experience at later time t19 is a unique one of at least N^10 states, and so forth until time t99.  But if his conscious experience at time t99 is sensitive to the ordering of his conscious experiences at t9, t19, t29, etc., then the conscious state at t99 is one of at least N^100 states, requiring at least 6000 bits to specify.  Once again, this analysis has nothing to do with whether the person remembers any specifics about his prior stimuli or experiences; all that matters is that his conscious experience at t99 depends to some degree on the ordering of experiences at t9, t19, etc., and that his experience at t9 depends to some degree on the ordering of stimuli at t0, t1, etc.
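
Put differently, each additional ordered frame that a conscious state depends on adds roughly another log2(N) ≈ 60 bits.  A minimal sketch of that scaling, using the 60-bit figure assumed above:

    # Bits needed to specify a state that depends on the time-ordering of
    # `frames` stimulus frames, each one of N ~ 2^60 possibilities:
    #   log2(N**frames) = frames * log2(N)
    bits_per_frame = 60
    for frames in (1, 2, 10, 100):
        print(frames, "ordered frames ->", frames * bits_per_frame, "bits")   # 60, 120, 600, 6000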

It’s easy to show, then, that the total number of possible conscious states is on the order of N^T, where T is the total number of individual “frames” of stimulus that one experiences over his life.  How many is that?  Well, 100 years is about 3 billion seconds, and we certainly experience more than one “frame” of stimulus per second.  (Otherwise, TVs would not need a refresh rate of around 30 frames/second.)  So, at 10 frames/second, we might estimate the total number of possible conscious states at about N^(30 billion).  If N is 2^60, then the total number of conscious states is 2^(1.8 trillion), requiring at least 1.8 trillion bits to specify.
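
Here is the same lifetime arithmetic as a minimal sketch, using the round figures above (3 billion seconds, 10 frames per second, 60 bits per distinct frame):

    # Lifetime tally under the rough assumptions above
    seconds_in_lifetime = 3e9         # ~100 years, rounded
    frames_per_second = 10            # assumed conscious "frame" rate
    T = seconds_in_lifetime * frames_per_second    # 3e10 stimulus frames in a lifetime

    bits_per_frame = 60               # log2(N) with N ~ 2^60
    total_bits = bits_per_frame * T   # log2(N**T) = T * log2(N)
    print(f"{total_bits:.1e} bits")                 # 1.8e12, i.e. about 1.8 trillion bits
    print(f"{total_bits / 8 / 1e12:.2f} terabytes") # roughly 0.2 TB, for comparison with brain-capacity estimates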

I find it fascinating how close this is to the number of neurons (100 billion) in the human brain.  For extremely rough back-of-the-envelope calculations like this, an order or two of magnitude is certainly “close.”  The storage capacity of the human brain has been estimated at somewhere in the tens to thousands of terabytes, and the rough estimate above (1.8 trillion bits is roughly 0.2 terabytes) is within a couple of orders of magnitude of the low end of that range.

What this tells me is that this method of counting distinct conscious states is viable and potentially valuable.  By getting better estimates of the number of stimuli that a person can distinguish, for example, we might find that the rough estimate above (≈ 1.8 trillion bits) is far too high or far too low, which could then sharpen our understanding of the brain as a computer, as a digital computer, as a digital computer with neurons acting as bits, or as the independent source of consciousness.  Of course, such an analysis will never get us anywhere near the Bekenstein bound or infinity, as addressed by Porpora’s paper, but I still think we can learn interesting and important things about the physical nature of consciousness by counting distinct conscious states.

Finally, I think the above analysis hints at something fundamental: that consciousness is history-dependent.  This is something I discuss at length in my paper on the Unique History Theorem, but the above arguments suggest a similar conclusion by a very different analysis.  If one’s conscious experience at time t99 depends to some degree on his experience at t98, which in turn depends on his experience at t97, and so on back, then it may not be possible to produce a person de novo in a particular conscious state C1 who has not already experienced the particular sequence of conscious states on which state C1 depends.

In any event, I think it makes sense to seriously consider and estimate the number of potentially distinct conscious states, taking into account a human’s sensitivity to different stimuli and the extent to which the ordering of stimuli affects conscious states.  I think this approach could yield fascinating insights about the brain and the physical nature of consciousness.

6 comments:

  1. What if the human brain's neurons were able to pack billions of bits of information in a more condensed form- like how a zip drive can condense gigabytes into megabytes? (not sure if I'm saying this correctly, but I just wanted to throw that out there). Would that potentially make your argument against physicality more difficult?

    1. Fascinating question. I think you are referring to information compression. When a song is digitized, for example, what happens is that it is chopped up into a bunch of short time segments (what we might call “frames” in a video), and each of those segments is chopped up into tiny frequency ranges, etc. A typical 3-minute song, at standard digital sampling rates, might have an uncompressed file size of maybe 50 MB. Now, the software that produced the digital file could have sampled the original song at a much higher rate, producing an uncompressed size of maybe 500 MB, but nobody on the planet would be able to distinguish these two songs. On the flip side, much of the information in the 50 MB song can be erased without affecting its quality relative to human conscious experience because few (if any) humans would be able to detect or distinguish a difference. That’s why an MP3 song might only be 5 MB yet be virtually indistinguishable from the 50 MB version.

      That’s why my analysis in this blog post repeatedly mentions “distinct” conscious states and “distinct” stimuli. In other words, the notion that a conscious stimulus requires at least 60 bits is already ridiculously conservative. According to this assumption, at 10 distinct “frames” per second, a 3-minute conscious experience (including stimuli from all five senses) could be encoded in 108 kb (or 13.5 kB). But if research shows that it takes a couple of megabytes to encode a distinct 3-minute version of just audio information, then clearly my estimate of 60 bits to encode a distinct conscious stimulus is orders of magnitude too low/conservative. In other words, by only considering "distinct" stimuli and conscious states, the above analysis already addresses information compression.
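
      For concreteness, a minimal sketch of that arithmetic (10 frames per second, 60 bits per distinct stimulus frame, and the ~5 MB MP3 mentioned above as the audio-only comparison):

          duration_s = 180                 # a 3-minute experience
          frames_per_second = 10
          bits_per_frame = 60              # the per-frame estimate from the post

          bits = duration_s * frames_per_second * bits_per_frame
          print(bits, bits / 1000, bits / 8000)   # 108000 bits = 108 kb = 13.5 kB

          mp3_bits = 5 * 8e6               # ~5 MB MP3, audio information only
          print(mp3_bits / bits)           # ~370x larger, so 60 bits/frame is very conservative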

  2. Just curious – you say that the total informational bound that could be contained in a volume the size of the human brain is around 10^42 bits. What’s the source of that number?

    1. I took it from the Wikipedia entry on the Bekenstein bound. It's a pretty straightforward calculation that is related to why a single bit of information increases the area of a black hole's event horizon by roughly one Planck area. But the Bekenstein bound is both theoretical and extreme. It's a bit like saying that "the speed limit on Route 66 is the speed of light."

  3. Hi Andrew! Thanks for the shout out. I definitely agree with you that consciousness is history-dependent. In fact one thing we are conscious of is our history. The problem with counting thoughts is the intensionality of the mental (yes, with an s), which means that in contrast with sentences about physical things, the truth of sentences about mental behavior is not necessarily preserved by the substitution of co-designative terms. That mouthful means something simpler. Although Samuel Clemens and Mark Twain are co-designative terms, meaning they designate the same thing, a person may believe that Mark Twain wrote Huckleberry Finn without believing that Samuel Clemens did. So are those two beliefs or one just described differently? And if two, then is there a third belief that the two are the same? Another issue is belief dispositions. Do you believe that Napoleon had lungs? Well of course you do, though you likely never thought about it before. How many thoughts are like that? You believe things that are not necessarily stored in your brain. I doubt any of us, including me, had a belief stored in our brain about Napoleon's lungs and other stored beliefs about Napoleon's blood, etc. This actually sounds like a whole different argument against the reduction of the mental to the physical. Thanks for the blog. Doug

    1. Doug... you always keep me thinking!! In the above analysis, I focused primarily on consciously distinct stimuli... if it’s possible to distinguish two nearly identical sensory stimuli, then the underlying physical states must be different. One thing your comments clarify for me is that we might conceivably have distinct conscious states/experiences even with the exact same sensory stimuli, and there could be lots of them. Imagine I am experiencing a particular set of stimuli described above (e.g., “watching a sunset while hearing crashing waves while tasting red wine while smelling the salty ocean while feeling sand under my feet”). Could I not also be wondering whether Samuel Clemens is Mark Twain, or whether Napoleon had lungs or fingernails or whatever? And whether or not the number of thoughts I might have is actually infinite (as you argue in your paper), we can both agree that the number is very large. As long as those thoughts are distinct, the physical state that encodes my conscious state at that moment must be adequate not just to distinguish prior stimulus “frames” (which, as I argue above, means distinguishing among at least N^T possible states), but also to distinguish possible conscious frames over and above these stimulus frames. I’ll need to think some more about this but we may be on to something!

