[HN Gopher] Temporal circuit of brain activity supports human co...
       ___________________________________________________________________
        
       Temporal circuit of brain activity supports human consciousness
        
       Author : hhs
       Score  : 246 points
       Date   : 2020-04-09 16:16 UTC (6 hours ago)
        
 (HTM) web link (advances.sciencemag.org)
 (TXT) w3m dump (advances.sciencemag.org)
        
       | naasking wrote:
       | The anti-correlated behaviour of these two networks, and even
        | their default mode vs. attention functions, reminds me of the
       | attention schema theory of consciousness [1].
       | 
       | Specifically, the attention schema theory posits that some
       | constant back and forth signal switching between internal and
       | external models of the world results in the illusion of
       | subjective awareness, in an analogous manner to how task
       | switching provides the illusion of parallelism on single-core
       | CPUs.
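        | 
        | To make the single-core analogy concrete, here's a minimal
        | sketch (my own illustration, not from the theory; the task
        | names are made up): one worker strictly alternates between an
        | internal-model update and an external-model update, and from
        | the outside the two look concurrent.
        | 
        |   # Python sketch: alternation masquerading as parallelism
        |   from itertools import cycle
        | 
        |   def internal_step(state):
        |       state["internal"] += 1   # update the self/internal model
        |       return state
        | 
        |   def external_step(state):
        |       state["external"] += 1   # update the world/external model
        |       return state
        | 
        |   def run(steps=10):
        |       state = {"internal": 0, "external": 0}
        |       # one "core": the tasks never overlap, they only alternate
        |       for task in cycle([internal_step, external_step]):
        |           if steps == 0:
        |               break
        |           state = task(state)
        |           steps -= 1
        |       return state   # both advanced, as if run "in parallel"
        | 
        |   print(run())   # {'internal': 5, 'external': 5}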
       | 
       | [1]
       | https://www.frontiersin.org/articles/10.3389/fpsyg.2015.0050...
        
         | ryeights wrote:
         | I can see how that would lead to an illusion of _continuous_
          | subjective awareness, but I don't think it supports the notion
         | that subjective awareness is entirely illusory. I think
         | therefore I am, the existence of qualia [1], etc.
         | 
         | [1]: https://en.wikipedia.org/wiki/Qualia
        
           | naasking wrote:
           | "I think therefore I am" assumes the conclusion. "This is a
           | thought, therefore thoughts exist" is the valid, non-circular
           | version.
           | 
           | The attention schema theory addresses the specific problem of
           | how we apparently infer first-person subjective facts when no
           | such concept exists in physics, the latter of which consists
           | entirely of third-person objective facts. The answer is that
           | we erroneously conclude that the facts we perceive are first-
           | person, but this perception is a sensory trick, similar to an
           | optical illusion.
           | 
           | The question of qualia is larger than this specific question,
           | but subjectivity was probably an important problem to
           | overcome for a materialist explanation of consciousness.
           | Dennett has long held that what we call "consciousness" is
           | very likely a bunch of distinct phenomena that all get
           | muddled together, and the fact that we have started to pick
            | it apart suggests that he was right.
        
             | hackinthebochs wrote:
             | >"I think therefore I am" assumes the conclusion.
             | 
             | But the thought is self-referential. The thought is about
             | itself thinking, and so the thought instantiates the
             | sufficient case for a subject. And so there is no question-
             | begging.
             | 
             | >The answer is that we erroneously conclude that the facts
             | we perceive are first-person
             | 
             | But as long as these facts are presented or represented as
             | being first-person, the sufficient case for first-person
             | acquaintance has been established. Whether these first-
             | person facts are ultimately grounded in third-person
             | descriptions or phenomena doesn't make them illusions.
        
               | naasking wrote:
               | > But the thought is self-referential. The thought is
                | about _itself thinking_, and so the thought instantiates
               | the sufficient case for a subject.
               | 
               | You just assumed the existence of a subject again. Where
               | is the proof that a thought requires a subject? One that
               | isn't vacuous or doesn't just assume its own conclusion?
               | 
               | > Whether these first-person facts are ultimately
               | grounded in third-person descriptions or phenomena
               | doesn't make them illusions.
               | 
               | It does for the technical purposes of the consciousness
               | debate. The terminology we're using, like "illusion", has
               | a technical meaning for the debate between materialists
               | and antimaterialists, wherein antimaterialists argue that
               | a first-person fact _cannot_ be reduced to third person
               | facts, even in principle.
               | 
               | Obviously even materialists speaking informally would
               | still use first person language and speak normally about
               | their experiences.
        
               | hackinthebochs wrote:
               | >Where is the proof that a thought requires a subject?
               | 
               | I'm not sure what you're asking. If the thought is self-
               | referential, then the subject is inherent in the self-
               | reference of the thought, namely the thought itself. I am
               | not assuming some kind of conscious subjectivity here.
               | Merely that the content of the thought is instantiated in
               | the case of self-reference. If this were not the case,
               | then the thought could not self-reference.
               | 
               | >has a technical meaning for the debate between
               | materialists and antimaterialists
               | 
               | I'm familiar with the usual suspects in this debate (e.g.
               | Dennett, Frankish), and I don't find their usage of
               | illusion particularly "technical". They use it to mean
               | that phenomenal consciousness doesn't exist or isn't
                | real. But it's this very usage that I take issue with.
        
             | ardy42 wrote:
             | > The answer is that we erroneously conclude that the facts
             | we perceive are first-person, but this perception is a
             | sensory trick, similar to an optical illusion.
             | 
             | I'm extremely skeptical of answers that involve labeling
             | difficult challenges to a theory as "illusions."
        
               | naasking wrote:
               | > I'm extremely skeptical of answers that involve
               | labeling difficult challenges to a theory as "illusions."
               | 
               | So calling [1] an optical illusion warrants skepticism
               | because it's attempting to dismiss the challenge of
               | having to explain how water can physically break and
               | magically reconstitute pencils? Don't you see the problem
               | with this sort of argument?
               | 
               | The point is that integrating all of our knowledge leaves
               | no room for first person facts. Additionally, every time
               | we've tried to ascribe some unique or magical property to
               | humans or life (like vitalism), we've been flat out
               | wrong. No doubt there are plenty of challenges left to
               | resolve in neuroscience, and no one is claiming that a
               | materialist account of qualia is unnecessary.
               | 
               | [1] http://media.log-in.ru/i/pencilIn_in_water.jpg
        
               | ardy42 wrote:
               | > So calling [1] an optical illusion warrants skepticism
               | because it's attempting to dismiss the challenge of
               | having to explain how water can physically break and
               | magically reconstitute pencils? Don't you see the problem
               | with this sort of argument?
               | 
               | Yeah, but that's a bit of a straw man.
               | 
               | The kinds of claims-of-illusion that warrant particular
               | skepticism are the ones that deny fundamental
               | observations in defense of some particular (usually
               | sectarian, for lack of a better word) philosophical
               | perspective.
        
             | md224 wrote:
             | > we erroneously conclude that the facts we perceive are
             | first-person
             | 
             | this doesn't make sense... the perceiving itself is what
             | makes something first-person, not the object of perception
        
               | naasking wrote:
               | That's not the technical meaning in the consciousness
               | debate. I invite you to read about Mary's room for an
               | introduction to the distinction I was describing.
        
               | md224 wrote:
               | Oh okay... I thought you were denying the reality of
               | first-person subjective experience as commonly
               | understood, not a narrowly-defined technical term. That
               | seems more reasonable then (though also less
               | interesting).
        
               | naasking wrote:
               | It's still pretty interesting! Mary's room thought
               | experiment is pretty short and simple, but it'll make you
               | think pretty hard.
        
             | david_w wrote:
              | naasking, as I see it, Dennett (in Consciousness Explained)
              | engages in a sleight of hand. He redefines consciousness as
              | "a bunch of distinct phenomena that get muddled together",
             | but that doesn't touch the mystery of qualia, it tries to
             | just deny that there is anything to explain, owing to the
             | fact (Dennett claims) that the problem is mischaracterized
             | from the start.
        
               | naasking wrote:
               | And your mind plays sleight of hand all the time, which
               | Dennett clearly establishes in his work. Or do you
               | actually see the physical blind spot that's a fundamental
               | feature of your eyeball?
               | 
               | So why would you trust your direct perception over the
                | mountains of evidence that clearly demonstrate that we
               | can't trust our perceptions?
        
               | david_w wrote:
               | No literate scientifically minded person disputes your
               | point, but it doesn't address my point. My point is this-
                | qualia as a phenomenon exists. Even if I think a red thing
               | is blue, I am still experiencing some color and the
               | experiencing itself- aside from its accuracy- is what
               | needs explaining.
               | 
                | So experience, aka qualia as a phenomenon unto itself, is
                | in need of explaining, not any particular qualia and not
                | the presence or absence of any correlation between the
                | qualia and objective reality, i.e. the "truthfulness" or
               | "accuracy" of the qualia.
        
               | goatlover wrote:
               | If we can't trust our perceptions, then there is no
               | mountain of evidence to say the mind is playing a trick
               | on us regarding consciousness. That's because the
               | scientific evidence is empirical, which is knowledge
               | based on perception. Dennett's argument risks undermining
               | the foundation for scientific knowledge.
        
         | voxl wrote:
         | Calling it an illusion is not interesting. We define
         | consciousness and subjective experience to match the very
         | experience we understand.
         | 
          | There is literally no way for it to be an illusion; the
          | definition itself precludes it. No matter how consciousness and
         | subjective experience are implemented in the hardware of our
         | brains, it is still a concept that we use to describe the
         | experience, and the experience is real no matter what.
        
           | Trasmatta wrote:
           | Consciousness is basically the only thing we can conclusively
           | say is NOT an illusion, right?
        
         | goatlover wrote:
         | Does this mean the experience of pain, sound, color, etc. are
         | illusions?
        
           | LASR wrote:
           | What does it mean to be an illusion?
        
           | naasking wrote:
           | They are an illusion in precisely the same way your car or
           | your day job are illusions: we don't admit any notion of
            | qualia/cars/day jobs into our ontology of physics, and so
            | qualia must ultimately be explained by appealing only to our
            | ontology.
        
             | hackinthebochs wrote:
             | >the same way your car or your day job are illusions
             | 
             | But what work is calling these things illusions doing for
             | you? That they're not fundamental units of the furniture of
             | the universe doesn't mean they don't exist or play
             | necessary causal or explanatory roles.
        
               | naasking wrote:
               | > But what work is calling these things illusions doing
               | for you?
               | 
               | It serves to help distinguish that which is reducible to
               | more fundamental ontological entities, from that which is
               | irreducible and thus ontologically fundamental.
               | 
               | I agree that these concepts certainly fulfill _useful_
               | causal and explanatory roles. Whether they are
               | "necessary" has some room for debate.
        
           | jimmaswell wrote:
           | You can only really "experience" the model your brain makes
           | of the world, not the world directly. You end up assuming the
           | world exists and there's no Descartes' Demon.
        
             | jkhdigital wrote:
             | Right... we are just colonies of cells which behave in
             | programmed ways to generate predictable responses from
             | other cells in other parts of the colony who have their own
             | jobs to perform in maintaining the homeostasis of the
             | colony. The illusion of self is useful because it ensures
             | that the collection of cells entrusted with executive
             | functions act in the interest of the entire colony by
             | perceiving it as a unified whole.
        
             | someguyorother wrote:
             | Another way to put it: the experience is real, but it might
             | be misleading.
             | 
             | A hallucination is an experience of something that doesn't
             | exist, but the experience itself does.
        
         | codeulike wrote:
          | Fascinating, an actual idea about consciousness that I haven't
         | heard before
        
         | hirundo wrote:
          | So pseudo-BASIC for the consciousness algorithm:
          | 
          |   10 look at world
          |   20 look at my reaction to world
          |   30 goto 10
         | 
         | Which generates consciousness like frames per second generates
         | motion. Or like the colored lines over this black and white
         | photo generate a color image:
         | 
         | https://twitter.com/SteveStuWill/status/1248000332027715584/...
        
           | goatlover wrote:
           | Except without the frames and colored lines. Lines 10 and 20
           | don't provide the experiences. They're just behavior. Somehow
           | all the sensations have to be added in when looking at the
           | world and looking at one's reaction to the world.
        
             | hateful wrote:
             | let experience$ = INKEY$
        
           | AndrewKemendo wrote:
           | Taking this further, I suppose you could consider persistence
           | of vision [1] as analogous to consciousness.
           | 
           | It's an artifact of the limitations of the system.
           | 
           | [1] https://en.wikipedia.org/wiki/Persistence_of_vision
        
             | anticensor wrote:
             | Persistence effect has to exist in one way or another to
             | get real-time self awareness; if it were not an input
             | alternation, it would exist at the reasoning level.
        
       | akozak wrote:
       | > Here, we conservatively use the term "unresponsiveness" instead
       | of "unconsciousness" to allow for the possibility that covert or
       | disconnected consciousness could occur in the absence of
       | behavioral response.
       | 
       | Conservatism is very wise! Given what they say in that quote, I'm
       | very confused why they think it's justified in the intro to
       | suggest they've identified 2 systems responsible for
       | consciousness. Shouldn't they replace every use of the word
       | "consciousness" with "responsiveness"? They're relying on a
        | purely behavioral understanding of consciousness.
       | 
       | Descartes famously thought that consciousness lived in the pineal
        | gland, and similar arguments have tended to generate some
        | well-deserved criticism from philosophers of mind. Pointing at a
       | physical thing and saying it's the source of conscious experience
       | should come with pretty extraordinary evidence.
        
         | hackinthebochs wrote:
         | >They're relying on a purely behavioral understanding of
         | consciousness
         | 
         | They're relying on the fact that consciousness has physical
         | manifestations in behavior. The alternative is
         | epiphenomenalism. While it may be a philosophically interesting
          | position, it's useless scientifically, and so it's fair to assume
         | consciousness has some physical artifacts in a scientific
         | context.
        
           | akozak wrote:
           | I don't think we're forced to choose between behaviorism and
           | epiphenomenalism.
           | 
           | But my point is more internal to the paper. They make claims
           | about the physical basis for consciousness and seem to
           | believe they've generated evidence for it, but they also
           | explicitly say they've only gathered evidence about
           | responsiveness.
           | 
           | EDIT: To be clear I'm objecting to the semantics (which I
           | consider important), not the potential value of the research.
        
       | shireboy wrote:
       | ELI5 "anti-correlated" here? What I envision is the temporal
       | circuit acting like a computer clock, and the other as
       | input/output. But "anti correlated" makes it sound like that's
       | not the case?
        
         | cjhveal wrote:
         | Not ELI5 exactly, but the article does a pretty good job of
         | explaining in the first paragraph:
         | 
         | > The default mode network (DMN) is an internally directed
         | system that correlates with consciousness of self, and the
         | dorsal attention network (DAT) is an externally directed system
         | that correlates with consciousness of the environment... the
         | DMN and DAT appear to be in a reciprocal relationship with each
         | other such that they are not simultaneously active, i.e., they
         | are "anticorrelated."
         | 
         | The "temporal circuit" the paper describes is the neural
         | architecture that facilitates the transitions between these two
         | networks.
        
           | TeMPOraL wrote:
            | Your description parsed back into electronics-land sounds to
            | me like: "temporal circuit" is a clock signal, DMN ticks on the
            | rising edge, DAT ticks on the falling edge.
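            | 
            | A toy rendering of that reading (my own sketch, not the
            | paper's model; the network names are just labels here): a
            | software clock toggles, and only one of the two networks is
            | "active" per edge.
            | 
            |   # Python sketch of the clock analogy (illustrative only)
            |   def run(cycles=4):
            |       clock = 0
            |       for _ in range(2 * cycles):
            |           clock ^= 1   # toggle: 0 -> 1 is a rising edge
            |           active = "DMN" if clock else "DAT"
            |           print(f"clock={clock}  {active} active, the other idle")
            | 
            |   run()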
        
           | abvdasker wrote:
           | Totally fascinating. It makes me wonder what kind of
           | dysfunction would result from both the DMN and DAT being
           | active at the same time, and especially what my subjective
           | experience of that would be if it were happening to me.
        
             | Mithriil wrote:
             | I may have found a partial answer to that, or at least a
             | track to explore. I read some research articles from Robin
              | Carhart-Harris on psilocybin/psilocin ("magic mushrooms")
              | last year. The effect of psilocybin on the Default-Mode
             | Network (DMN) seemed to be a critical part of his research,
             | and so I searched if there was also some observations on
             | the antiphasic nature of the DMN and the Dorsal Attention
             | Network (DAN), and I found something rather interesting
             | [1]:
             | 
             | "The following example may help to illustrate what is meant
             | by competition between conscious states--and the loss of it
             | in primary consciousness. Functional brain imaging has
             | identified distinct brain networks that subserve distinct
             | psychological functions. For example, the DMN is associated
             | with introspective thought and a dorsal frontoparietal
             | attention network (DAN) is associated with visuospatial
             | attention and is a classic example of a "task positive
             | network" (TPN)--i.e., a network of regions that are
             | consistently activated during goal-directed cognition. If
             | the brain was to be sampled during a primary state (such as
             | a psychedelic state) we would predict that the rules that
             | normally apply to normal waking consciousness will become
             | less robust. Indeed, we recently found this to be so when
             | analysing the degree of orthogonality or "anti-correlation"
              | between the DMN and TPN post-psilocybin. _Post-drug there
              | was a significant reduction in the DMN-TPN anticorrelation,
              | consistent with these networks becoming less different or
              | more similar (i.e., a flattening of the attractor
              | landscape)._ The same decrease in DMN-TPN anticorrelation
             | has been found in experienced meditators during rest
             | (Brewer et al., 2011) and meditation (Froeliger et al.,
             | 2012). Moreover, decreased DMN-TPN inverse coupling is
             | especially marked during a particular style of meditation
             | referred to as "non-dual awareness" (Josipovic et al.,
             | 2011). This is interesting because this style of meditation
             | promotes the same collapse of dualities that was identified
             | by Stace (and Freud) as constituting the core of the
             | spiritual experience. The DMN is closely associated with
             | self-reflection, subjectivity and introspection, and task
             | positive networks are associated with the inverse of these
             | things, i.e., focus-on and scrutiny of the external world
             | (Raichle et al., 2001). Thus, it follows that DMN and TPN
             | activity must be competitive or orthogonal in order to
             | avoid confusion over what constitutes self, subject and
             | internal on the one hand, and other, object and external on
             | the other. It is important to highlight that disturbance in
              | one's sense of self, and particularly one's sense of
             | existing apart from one's environment, is a hallmark of the
             | spiritual (Stace, 1961) and psychedelic experience
             | (Carhart-Harris et al., 2012b)."
             | 
             | [1] https://www.frontiersin.org/articles/10.3389/fnhum.2014
             | .0002...
        
             | lifty wrote:
             | I am not sure if this is accurate but intuitively, in
             | strong psychedelic experiences it feels that both the DMN
             | and DAT are active at the same time, which leads to, among
             | many other things, a clearheaded view of mental processes
             | that are hard to observe otherwise. One example would be
             | observing emotions and how they affect your state of mind,
             | while at the same time being totally detached from them.
             | Some studies [1] propose that this happens because of an
             | increase in connectivity between various parts of the
             | brain, which could also be the thing that leads to ego
             | dissolution.
             | 
             | [1] https://www.cell.com/current-
             | biology/fulltext/S0960-9822(16)...
        
         | hoorayimhelping wrote:
         | > _We demonstrate that the transitions between default mode and
         | dorsal attention networks are embedded in this temporal
         | circuit, in which a balanced reciprocal accessibility of brain
         | states is characteristic of consciousness._
         | 
          | I read it as: the temporal circuit manages the anti-
         | correlated relationship between the two networks to produce
         | consciousness. Almost like the temporal circuit is a function
         | whose goal is to return 1 given two anticorrelated inputs that
          | add up to about 1, and the computation to arrive at the solution
          | is what consciousness is. Weird unresponsive stuff happens when
         | the sum of the inputs is above 1.
         | 
         | Consciousness is an emergent side effect of trying to keep two
          | input systems synchronized.
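          | 
          | A toy version of that reading (purely my own sketch; the 0-to-1
          | scaling, the tolerance, and the function name are invented, not
          | from the paper):
          | 
          |   def temporal_circuit(dmn, dat, tol=0.1):
          |       # "balanced reciprocal" inputs: anticorrelated activities
          |       # whose levels sum to roughly 1
          |       if abs((dmn + dat) - 1.0) <= tol:
          |           return 1   # the balanced, "conscious" regime
          |       return 0       # anything else: the unresponsive regime
          | 
          |   print(temporal_circuit(0.7, 0.3))  # 1: balanced inputs
          |   print(temporal_circuit(0.9, 0.8))  # 0: sum well above 1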
        
         | jonnycomputer wrote:
          | In general, activity in the DMN decreases when a person is
         | engaged in a task; the more demanding the task, the more the
         | decrease. The attention and salience networks do the opposite.
         | A classic experiment in this case would be to contrast BOLD
         | signal in an fMRI experiment between easy and hard blocks, or
         | between active periods and rest periods. The general
          | observation about the anti-correlation of the DMN and task-
         | activated networks is a very robust result, seen over and over
         | again.
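          | 
          | As a numerical illustration of what "anticorrelated" means here
          | (my own toy example, not the paper's data): two signals that
          | move in opposite directions have a correlation coefficient near
          | -1.
          | 
          |   import numpy as np
          | 
          |   t = np.linspace(0, 10, 500)
          |   dmn = np.sin(t)                    # stand-in DMN time course
          |   dat = -np.sin(t) + 0.1 * np.random.randn(t.size)  # noisy mirror
          | 
          |   r = np.corrcoef(dmn, dat)[0, 1]
          |   print(f"correlation: {r:.2f}")     # close to -1.0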
        
       | bronlund wrote:
       | That Tiphareth is the natural consequence of Geburah is ancient
       | news. A little bit late to the party there :P
        
       | stcredzero wrote:
       | Language is a Virus. Consciousness is an OODA Loop. (You could
       | write an alternative set of lyrics to the Laurie Anderson song.)
        
       | jkhdigital wrote:
       | Man, I love seeing more research like this. My personal
       | experience, as someone who has dealt with a variety of issues in
       | psychiatrists' offices and rehabilitation rooms, is that a clear
       | scientific understanding of what is actually going on under my
       | skull provides a much firmer basis for any therapeutic approach.
       | I'm really hoping that in my lifetime we will see connections
       | made between the physiological elements of consciousness and the
       | modern psychiatric plagues of depression, anxiety, and addiction
       | that finally produce the targeted, universally effective
       | therapies that are desperately needed.
        
         | trub wrote:
         | psychedelic plant medicine is highly effective
        
         | kohtatsu wrote:
         | I was recently talking to a registered psych nurse, and we got
         | talking about Cognitive Behavioural Therapy.
         | 
          | I believe I've been self-administering a form of it for a couple
          | of years, and I summarized my understanding of CBT as "moving
         | more thinking from the amygdala to the prefrontal cortex", and
         | she confirmed that with "in laymen's terms; yes".
         | 
         | It's not like the fields are completely isolated, I guess is
         | what I'm getting at with that anecdote. It's hard to go from
         | neuroscience to psychology, but that's always being looked at.
         | I reckon most big advancements will start coming when we start
         | understanding the connectome more, but it's not like all
         | advancements will come from there, and it's not like people
         | aren't working right now to bridge neuroscience and psychology.
         | 
         | Also I want to hang out with the laymen she does.
        
           | curo wrote:
           | I'm not qualified to contest this, but I do remember a side
           | blurb from "Principles of Neuroscience" (Kandel, Schwartz,
           | Jessell) that said overactive mPFCs are attributed with
           | autism and below is some more research on it.
           | 
           | I don't think you're saying this overtly but I have seen
           | people from the Thinking Fast & Slow crowd glorify their PFCs
           | as arbiters of cognitive bias while forgetting that healthy
            | social, emotional processing requires integrated functioning
           | between all neural correlates involved, including the
            | amygdala. I would venture to guess CBT is effective because
            | it stops PFCs from being overactive, which is the opposite of
            | the nurse's guess. But as a layman here, I can't say one way
           | or the other.
           | 
           | I remember a decade or two ago, the ACC+vmPFC combo was
           | getting a lot of praise as this balancing force between the
            | dlPFC and the amygdala, with the idea that a strong
            | ACC+vmPFC could be the clue to healthy brains. I think the
            | answer will always be,
           | "hey all these parts are important. Just meditate, exercise,
           | and eat right. And don't believe your thoughts too much
           | (CBT)"
           | 
           | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5192959/
           | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4688328/
        
           | rhcom2 wrote:
           | Do you mind sharing your approach to how you make that shift
           | from "amygdala to the prefrontal cortex"? Is it similar to
           | practicing mindfulness with a focus on the now?
        
             | mettamage wrote:
              | No, the prefrontal cortex is about planning, reasoning and
             | inhibiting emotions (we do it all the time without
              | realizing it), among a lot of other things. The amygdala is
              | involved in reacting to fear, among other emotions.
             | 
             | CBT gives you a toolset to ask yourself questions to
              | understand (a) which perspective you're currently looking at
              | things from and (b) which other perspectives you could use.
             | 
             | The 10 cognitive distortions and recognizing them is a good
             | start. Cognitive distortions happen mostly through
             | emotional processes (e.g. the amygdala but the whole limbic
             | system really).
             | 
             | Mindfulness meditation is an emotional-based approach as it
             | mostly relates (for laymen like me) to scanning the body.
             | Scanning the body gives marked improvements to the insular
             | cortex. It also gives marked improvements to the PFC (the
             | inhibition part, not the planning part).
             | 
              | This is all written far too briefly and my knowledge is a bit
             | stale on it. I used to be _really_ into this a couple of
             | years ago. It was during the time when I studied psychology
             | (I even published a neuroscientific literature review :D).
        
               | Enginerrrd wrote:
               | I love the spirit of your response, but I feel the need
               | to disagree a bit and elaborate about your statement:
               | >Mindfulness meditation is an emotional-based approach as
               | it mostly relates (for laymen like me) to scanning the
               | body
               | 
               | The REAL essence and power of mindfulness is becoming
               | aware of the contents of your attention. For some reason,
               | focusing the attention inwards on bodily somatosensory
               | experiences tends to encourage that, but the two are not
               | the same. Body-scanning is more a technique to help
               | encourage the development of mindfulness rather than the
               | end goal in itself.
               | 
               | The reason this distinction is so important and powerful
               | is that the brain regions which are feeding the contents
               | of your attention are the ones that get reinforced. When
               | you combine mindfulness with practice in _redirecting_
               | your attention, it becomes an insanely powerful tool to
               | fundamentally reshape your reality by restructuring your
               | brain.
        
               | fossuser wrote:
               | There's some irony there given that excessive body-
               | scanning and hyper-vigilance can be common symptoms
               | related to anxiety.
               | 
               | Though the CBT stuff in general and being aware of your
                | attention does seem empirically helpful, I just find that
                | the body-scanning focus as a common starting point may not
                | be the best.
        
               | uoaei wrote:
               | Yes exactly. The "mindfulness" designation is a recursive
               | one, where first you are mindful, then you are mindful of
               | that which is mindful, and all the way down.
        
               | Trasmatta wrote:
               | > Mindfulness meditation is an emotional-based approach
               | as it mostly relates (for laymen like me) to scanning the
               | body
               | 
               | Just to clarify, body scanning is just one type /
               | approach to meditation. Many practices don't utilize it
               | at all, or only do so in conjunction with other
               | techniques.
        
               | mettamage wrote:
               | I appreciate the clarifications. I did write it a bit too
               | hastily. Sorry about that.
        
       | ta1234567890 wrote:
       | Sounds like the temporal circuit is acting like a clock, as in a
       | computer chip's clock. Pretty cool.
       | 
       | > We demonstrate that the transitions between default mode and
       | dorsal attention networks are embedded in this temporal circuit,
       | in which a balanced reciprocal accessibility of brain states is
       | characteristic of consciousness. Conversely, isolation of the
       | default mode and dorsal attention networks from the temporal
       | circuit is associated with unresponsiveness of diverse
       | etiologies. These findings advance the foundational understanding
       | of the functional role of anticorrelated systems in
       | consciousness.
        
       | david_w wrote:
       | The brain is the most complicated structure in the known
       | universe. The probes currently available to science- fMRI and GSR
       | - are both gross measures of cortical electrical activity.
       | They're enough to start to explore apparent structural and
       | (gross) electrical correlation between brain areas and (gross)
       | alterations in "consciousness", in this case unconsciousness
       | invoked via propofol and ketamine. Fair enough.
       | 
       | However, it irritates me when I hear scientists loosely throw the
       | word "consciousness" into these studies and here's why.
       | 
       | In these studies, consciousness is always implicitly defined
       | operationally as the electrical activity in some identified
        | networks- DAT and DMN and fronto-parietal and sensorimotor etc.
       | But the concept of consciousness has another life in philosophy
       | where in works by people like Patricia Churchland and others, it
       | references something more subtle- the mystery of why there should
       | be anything we call experience at all.
       | 
       | Experience itself doesn't seem to be necessary to the working of
       | any machine, including our brains. We don't think our TVs have
       | any experiences despite (being capable of) accurately
       | representing all human visual experiences. The reason we don't
       | think they experience what they're displaying is because we know
       | how they work and we know there's no ghost in the machine. Adding
       | on "experiences" to an explanation of how TVs work is gratuitous
       | and unnecessary.
       | 
       | But that's not the case with humans-just the opposite. Experience
       | is absolutely foundational.
       | 
       | Descartes tried to boil his world down to what he could know with
       | absolute certainty and arrived at his famous "Cogito ergo sum"
       | formulation, but actually, he skipped a step; that step is
       | simply- "There is experience".
       | 
       | Experience is perfectly gratuitous to any explanation of brain
       | activity since all that activity, like an electrical storm, could
       | take place in exactly the same way without it. We (our brains)
       | could be, and most scientists believe are, very complicated, but
       | purely mechanical machines. They could be exactly as they are
       | with no more awareness- not to say feedback loops- than a
       | blender.
       | 
       | But that account leaves the problem of experience or
       | consciousness completely untouched. That would be O.K. except we
       | know we have it.
       | 
       | The mystery of consciousness is not totally defined by questions
       | like of "can I make you unconscious or conscious?" or "can I
       | cause you to have this or that illusory experience by stimulating
       | your brain?". The mystery of consciousness is _why is there
        | anything like experience at all?_
       | 
       | So whenever I read a paper that makes some confident assertion
       | about consciousness, it gets under my skin. It's electrical
       | activity and perhaps human behavior and speech they are actually
       | examining, not consciousness. I hear these papers gratingly
       | assuming the consequent with respect to the biggest mystery there
       | is. They are implicitly saying "this is consciousness, this
       | pattern of electrical activity in the brain and here is what we
       | have discovered about consciousness". That's one perspective, but
       | to philosophers, both academic and non-academic, it's a form of
       | punting on the real question.
       | 
       | Consciousness is to brain science what AGI is to AI. Researchers
       | just love to make assertions and grand predictions.
       | 
       | Actually the correlation between the two is closer than that
       | since strong AI claims that consciousness can be captured in a
        | computer; Kurzweil and his Singularity concept are in this school
       | of thought.
       | 
       | He and people like him claim that not only does experience arise
       | as a direct result of brain activity but any substrate- including
       | general purpose computer platforms- will similarly give rise to
       | the same experiences if only they are programmed in a particular
       | way, specifically, if the computations are functionally
       | equivalent to the brain's computations.
       | 
       | Are badly programmed computers therefore experiencing chaos?
       | Well, why not? Are simpler computers, like a thermostat which
       | "experiences" temperature changes, also somehow dimly conscious?
       | If that seems like a straw man argument to you, you should know
       | Marvin Minsky bought it and so do a lot of other scientists
       | whether they realize it or not.
       | 
        | All of this is just a non-starter to people like me. You don't get
       | to skip a step because it keeps your theory neat or provides you
       | the promise of immortality because you uploaded your "you" to a
       | machine.
       | 
       | Consciousness, understood in this way, is a genuine mystery which
       | for now at least I don't think we have the conceptual tools to
       | even define much less make pronouncements about.
        
         | sofal wrote:
         | I think we're a long way from a good understanding of how
         | consciousness works, but I also think a lot of people are going
         | to subscribe to a sort of consciousness-of-the-gaps idea no
         | matter how much progress is made in understanding the actual
          | mechanisms. Even if we fully understood and could reproduce
         | it, there would be scores of people who would flat out refuse
         | to see the evidence and would simply assert that the ineffable
         | "experience" does not exist within beings for which they don't
         | want to acknowledge it. The very concept of p-zombies
         | illustrates this a priori refusal to admit any possible
         | evidence whatsoever of consciousness. Another person could
         | simply decide that I am in fact a p-zombie and lock themselves
         | in a closed system of thought out of which there is no path to
         | demonstrating that I "experience" anything at all.
         | 
         | I think if you want to put forth a hypothesis that there is
         | some ghostly ineffable part of consciousness called
         | "experience" that cannot ever be touched or measured by
         | scientific means, then you have a self-defeating argument that
         | cannot be supported. You might as well go full solipsism.
         | There's nothing stopping you.
         | 
         | Consciousness is a genuine mystery at this point, but I think
         | some people will still see it that way even if we solve it, and
         | this is clearer to me every time I see people trash any kind of
         | effort or progress made by science in understanding the brain,
         | claiming that it is not in fact progress at all.
        
           | david_w wrote:
           | Just to toss off one more worthwhile idea to you since it
           | seems like you're interested in this topic. p-zombies is not
           | the most challenging scenario strong-AI deniers are likely to
           | face. Brain cell replacement is.
           | 
           | With p-zombies you have two observers outside the system
           | arguing about the system's inner life. With brain cell
           | replacement, you have the subject directly and quite
           | authoritatively experiencing the system in question and
           | reporting back.
           | 
           | It seems many times more likely some of us will live to see
           | this, but you just never know. Newtonian mechanics had it all
           | locked up save for a few details and look what those details
           | held.
           | 
            | Every brain science / cog-sci paper, it seems, has some
            | alternative-amputating conclusions to pronounce about
            | consciousness.
           | 
           | They sort of have to do that because of the funding model
            | they live under. Positive results only! It's not the
           | researcher's fault; I don't fault them. I just adopt a highly
           | skeptical, wait and see, there's-probably-more-to-the-story
           | attitude generally in science, that, and the more concrete
           | counter-arguments I mentioned in my other comments make me a
           | very highly dissident observer of this field.
        
           | axguscbklp wrote:
           | On the other hand, I think that many people are emotionally
           | invested in believing that science must be capable of solving
           | the hard problem of consciousness even though there is no
           | reason to assume that science is.
           | 
           | It is perfectly possible that the hard problem of
           | consciousness is in principle and forever beyond the reach of
           | scientific investigation.
        
           | david_w wrote:
           | "The very concept of p-zombies illustrates this a priori
           | refusal to admit any possible evidence whatsoever of
           | consciousness. Another person could simply decide that I am
           | in fact a p-zombie and lock themselves in a closed system of
           | thought out of which there is no path to demonstrating that I
           | "experience" anything at all."
           | 
           | This is a good point and makes the problem interesting in an
           | additional way. We (I) assume something like p-zombies exist
           | in non-human consciousness, dogs and cats for example. It's
           | like something to be a dog. How far down do we want to go ?
           | Frogs? I'll bite; it's like something to be a frog:
           | 
           | https://www.youtube.com/watch?v=w8IY2eTBqd8
           | 
           | But here's a counter to the p-zombies argument, OK?
           | 
           | The p-zombies argument is usually taken to mean there comes a
           | point where what has been created is so indistinguishable
           | from "real" people, ala Ex Machina, that arguing over it is a
           | form of ideologically motivated perversion.
           | 
           | Let me turn that round and say that the p-zombie argument is
           | (accidentally) making the following strong claim- it is
           | impossible to build a machine which in every way acts human
           | but has no experience.
           | 
           | That's a very very strong claim on this universe. I wouldn't
           | take the bet, because someone's going to do it.
           | 
           | But if someone is going to do it, how can we tell when they
           | have or they haven't? The Turing Test is outdated (as I see
            | it) and anyway already passed for some judges (re: ELIZA).
           | 
           | To me, this circles back to the original problem. We can't
           | distinguish between the high probability that someone can
           | eventually create an actual zombie and "real" experience-
            | having artificial intelligence, and why is that?
           | 
           | The issue is just another form of the basic problem- we don't
           | have the conceptual framework to get our minds around what
           | experience is.
           | 
            | Our basic assumptions may be off. Instead of quarks et al.
            | being the basic building blocks of matter, and matter of
            | brains, and brains of consciousness, some people take
           | experience to be the most basic building block of the
           | universe.
           | 
           | This was my conclusion and I thought it would just brand me
           | as an eccentric so I never pushed it, but now I see it's
           | being kicked around by people with careers.
           | 
           | Another assumption is that experience/consciousness is
           | comprehensible to the level of scientific causality/reality
            | we're aiming at (let's just shorthand it to "ultimate
           | reality"), because there are separate, distinct things in the
           | first place.
           | 
           | But what if separate things is not a fact about ultimate
           | reality? What if they're more like a hardwired perceptual
           | compulsion we can't escape? Then we might very well find
            | truly insoluble mysteries on the foundational tier of our
           | conceptual scaffolding, because none of the "things" we think
           | about are real in the first place. Things which don't exist,
           | don't have to "add up".
           | 
           | So this would mean our minds and ultimate reality are just
           | not made for each other, _even as that reality directly
           | impinges on our personal daily lives in ways we can and do
           | readily experience and talk about_.
           | 
           | It seems like the most far fetched and deflating hypothesis
           | possible, but consider we'd merely be joining the rest of the
            | animal kingdom in this regard.
        
         | codeulike wrote:
         | The thing is, if you're an atheist (and I write from one of the
         | non-USA countries in which being an atheist is entirely
         | unremarkable) then experience (or qualia) and consciousness
         | itself are still very mysterious, but it's hard to avoid the
         | conclusion that it must all be a side effect of processing or
         | information somehow.
         | 
         | Daniel Dennett has some good stuff on this (see Consciousness
          | Explained, etc). It's not that he knows the answers, but his
         | point is that consciousness might not be exactly what we think
         | it is, there are lots of thought-traps around it, so we have to
         | carefully unpick some of our assumptions about it to get
          | anywhere - e.g. what he calls the Cartesian theatre is one very
         | powerful misconception (too long to explain here).
         | 
         | Also I always like to drop this Iain Banks quote in these kinds
         | of discussions (from A Few Notes About The Culture)
         | 
         |  _Certainly there are arguments against the possibility of
         | [strong] Artificial Intelligence, but they tend to boil down to
         | one of three assertions: one, that there is some vital field or
         | other presently intangible influence exclusive to biological
         | life - perhaps even carbon-based biological life - which may
         | eventually fall within the remit of scientific understanding
         | but which cannot be emulated in any other form (all of which is
         | neither impossible nor likely); two, that self-awareness
         | resides in a supernatural soul - presumably linked to a broad-
         | based occult system involving gods or a god, reincarnation or
         | whatever - and which one assumes can never be understood
         | scientifically (equally improbable, though I do write as an
         | atheist); and, three, that matter cannot become self-aware (or
         | more precisely that it cannot support any informational
         | formulation which might be said to be self-aware or taken
         | together with its material substrate exhibit the signs of self-
         | awareness). ...I leave all the more than nominally self-aware
         | readers to spot the logical problem with that argument._
         | 
         | Edit: changed 'cant really avoid the conclusion' to 'its hard
         | to ...'
        
           | axguscbklp wrote:
           | >if you're an atheist [...] you can't really avoid the
           | conclusion that it must all be a side effect of processing or
           | information somehow
           | 
           | Why? I'm an atheist-leaning agnostic, but I think that the
           | hard problem of consciousness might well turn out to be
           | impossible for scientific investigation to tackle.
           | 
           | I cannot think of any valid logic that would show that "there
           | are no gods" implies "consciousness is a side effect of
           | processing or information somehow".
        
             | codeulike wrote:
             | Well yes OK. I guess I'm jumping from 'being an atheist' to
             | 'general distrust of the so called supernatural'.
             | 
             | Do you mean 'impossible for scientific investigation to
              | tackle' because it's just too complex (in the same way we
             | can't predict the weather very accurately) or do you mean
             | more like: because you suspect there is some outside-of-
              | known-physics involvement that we won't ever be able to get
             | a grip on?
        
               | axguscbklp wrote:
               | Not because it is too complex, but because I suspect that
               | there may be something to consciousness that is outside
               | of knowable physics. There is no reason to assume that
               | scientific investigation is in principle capable of
               | getting a grip on all of reality. That does not mean that
               | consciousness is some mystic woo-woo, it just means that
               | scientific investigation may in principle be limited.
               | Consciousness might well turn out to be impossible in
               | principle to tackle using mathematical modeling,
               | reproducible experiment, theories of physical mechanisms,
               | etc. - but that would not mean that consciousness is not
               | real. It does not require scientific inquiry to show that
               | consciousness is real. Subjective experience is
               | immediately obviously real, as subjective experience.
        
               | codeulike wrote:
               | I agree that what you say is possible, but it's also
               | possible that consciousness does lie inside known
               | physics, so I reckon it's worth people investigating that
               | angle, as formidable as it seems.
               | 
               | I've edited my comment above to be a bit less absolutist
        
               | axguscbklp wrote:
               | I wouldn't say it's impossible, although honestly I
               | cannot even begin to imagine how consciousness could lie
               | inside known or even knowable physics. But if people want
               | to try, more power to them. I'm open to my suspicion
               | being wrong.
        
           | david_w wrote:
           | Never take the arguments of a side from their opponent's
           | mouths.
           | 
           | The arguments I offered have nothing to do with any of the
           | three he claims they all boil down to.
           | 
           | If you think I made one of these three, please tell me which
           | one so I can clarify the argument.
           | 
            | Assuming it's a side effect of processing- known as an
            | epiphenomenon- immediately commits you to answering the
            | question- does a badly programmed computer have a form of
            | consciousness? Does a thermostat have a primitive form? Is it
            | specifically impossible to create AI which emulates human
            | thinking to the last detail, but has no consciousness, i.e.
           | really is just an empty machine with zero experience? Is that
           | an impossible task which could not be achieved by anyone by
           | any means?
           | 
           | Suppose I debate with someone who has a computer programmed
           | to be conscious. Here's what I'm going to do. I'm going to
           | very very slightly change the programming so whatever output
           | it's producing which is proving, my opponent claims, the
            | computer is conscious, starts to degrade.
           | 
           | I'm going to do that then ask my opponent- still conscious?
           | I'm going to do this and I'll guess my opponent will say
           | "less so perhaps" , which would be his best reply.
           | 
           | Then I'm going to repeat until I get a "probably not" and
           | then a "no" from him, which by his own hypothesis has to
           | happen.
           | 
           | Then I'm going to diff the conscious program and the
           | unconscious program and ask him if he really thinks those
           | slightly altered lines of code are the difference between
           | consciousness and a humdrum computer.
           | 
           | Because that's where this goes, this idea that a certain type
           | of computation is consciousness.
           | 
           | It also goes to consciousness being granted to a machine like
           | a Turing Tape. You may not think that squishy biological
           | matter should be bequeathed with a "magical" property which
           | hosts consciousness, but tell me, how do you feel about a
            | Turing Tape?
        
         | downerending wrote:
         | All true enough, and I think any honest scientist would say
         | that the most we can hope for is to notice a few patterns in
         | the wallpaper on Plato's Cave. There's no reason to think that
         | any real insight beyond that is possible.
        
       | sebringj wrote:
        | This feels right in terms of how I experience things when I've had
       | bad migraines and notice parts of my capability going away
       | temporarily such as understanding speech or being able to read
       | words or missing visual areas entirely. Things get jumbled or
       | confused, all the while, I am aware of these things happening yet
       | unable to control them. It feels like there are separate parts of
       | me like modules that go offline but the one that is constant is
       | the sense of "me" or the consciousness part. These episodes are
       | few and far between but I am still thankful to have a different
       | perspective of our bio-mechanical nature. It also makes me feel
       | closer to my pets in the sense that awareness or consciousness
       | doesn't correlate with cognitive ability or intellect but that is
       | just my guess.
        
       ___________________________________________________________________
       (page generated 2020-04-09 23:00 UTC)