[HN Gopher] Mind Emulation Foundation
       ___________________________________________________________________
        
       Mind Emulation Foundation
        
       Author : gk1
       Score  : 54 points
       Date   : 2020-09-01 17:53 UTC (5 hours ago)
        
 (HTM) web link (mindemulation.org)
 (TXT) w3m dump (mindemulation.org)
        
       | mcculley wrote:
       | How does having the connectome get you the mind? Don't you need
       | the weights in the neurons? I thought I also read that neurons
       | are not discrete units, that there might be weights within the
       | dendrites/axons.
        
         | dsiroker wrote:
         | The connectome includes the weights.
        
           | mcculley wrote:
           | Do you have a reference somewhere that says that? The
           | Wikipedia entry for connectomics does not imply that and I've
           | not read of anything that can get the weights out of neurons.
        
             | dsiroker wrote:
             | Connectome: How the Brain's Wiring Makes Us Who We Are [1]
             | 
             | [1] https://www.amazon.com/Connectome-How-Brains-Wiring-
             | Makes/dp...
        
       | dsiroker wrote:
       | (I'm the founder & CEO of the Mind Emulation Foundation)
       | 
       | Flattered to see this hit the front page! This is a project I've
        | been passionate about for a while and have been keeping mostly
        | under the radar.
       | 
       | Now that it's public, I'm happy to answer any questions the HN
       | community has.
        
         | mchusma wrote:
          | My feedback: I think if I were to donate, I would like to see a
         | clear roadmap for even getting in the ballpark of doing this.
         | 
         | I have donated to SENS for years, and particularly when they
         | started, they didn't really say exactly "donate to cure aging",
         | they said "these are the important known problems that we need
         | to spend time/money on to make progress here."
         | 
          | I believe there are multiple large known problems in brain
          | emulation. What are they? How will donating to you make
          | progress on them?
        
           | lawlessone wrote:
            | How do you think SENS is doing? Sorry to go off topic.
           | 
            | I like them, and they give me hope, but I am not sure where
           | they are.
        
         | RobertoG wrote:
         | Do you have any affiliation or contact with
         | https://www.brainpreservation.org/ ?
        
         | vicnicius wrote:
         | > This digital emulation could then interact with the rest of
         | the world using a simple text interface or a more sophisticated
         | robotic avatar with digital representations of taste, sight,
         | touch, smell, and sound.
         | 
         | I found this to be the most intriguing bit of the described
         | process. I can digest the (theoretical) idea of reproducing the
         | mind, mostly agreeing with a materialistic perspective of it.
          | What I found myself imagining was the process of adaptation
          | this mind would have to go through to be capable of interacting
          | with the new kinds of inputs and outputs it would have. Imagine
          | you freeze a
         | computer while it was running a VR game with max resolution
         | settings. You then go on and move the machine to a different
         | setup, with a CRT monitor and a keyboard to interact with it.
         | How can any meaningful interaction happen in such a context?
         | Unless you provide a virtual environment to interact with it
          | and allow it to adapt... But then, what would such an
         | environment look like to a copy of a mind? Would you have any
         | insights on that?
        
         | lachlan-sneff wrote:
         | Can you be clearer about what's going on behind the scenes
         | here? Additionally, are you aware of the humanityplus and
         | ##hplusroadmap communities?
        
         | anvandare wrote:
         | Not really a question, more of a remark: both the destructive
         | and non-destructive approaches result in the same thing: a copy
         | of 'you', not actually 'you'. (The same is true when you go to
         | sleep, of course. Whoever wakes up isn't you either)
         | 
         | What you would need is a Ship of Theseus approach - preserving
         | the consciousness stream while neuron after neuron is being
         | replaced by a digital version, slowly, to convince the stream
         | into thinking it's still the same. You can't just take a single
         | scan; you have to keep scanning continuously and keep a running
         | feedback stream between the (decreasing) wetbrain and the
         | (increasing) bitbrain to ensure the illusion of continuity.
         | 
         | (But to be honest, I don't actually believe in consciousness or
         | a Self. Whoever started writing this short comment wasn't 'me',
         | and neither am I, nothing is)
        
           | dsiroker wrote:
           | The gradual replacement approach (as you describe it) is a
           | useful thought experiment that gets people more comfortable
           | with the possibility of mind emulation but the end state is
           | the same for both approaches. One just intuitively _feels_
           | more _you_ than the other, but as you said, it 's likely all
           | an illusion so doing it while you "sleep" is probably enough
           | to make _you_ sufficiently happy to be alive when you wake up
           | non-biologically.
        
         | [deleted]
        
       | PaulHoule wrote:
       | https://en.wikipedia.org/wiki/The_Annals_of_the_Heechee
        
       | gverrilla wrote:
       | is this troll content?
        
       | mcculley wrote:
       | I am always amused at this kind of approach to immortality. While
       | the copy of me that is reborn would appreciate my preparedness,
       | that doesn't make this copy any less unhappy about dying.
        
         | dsiroker wrote:
         | When you wake up in the morning billions of your cells have
         | changed from the night before. Are you any less you?
         | 
         | It is possible one day you will go to sleep biologically and
         | wake up non-biologically. It's just a matter of sufficiently
         | emulating the processes that were present when you were the
         | biological you.
        
           | mcculley wrote:
           | I have meditated enough to be unconvinced that even the me
           | that goes to sleep is the same guy who woke up that morning.
           | 
           | I try to be good to the guy who wakes up with my memories and
           | body the next morning. That makes me no less unhappy about
           | dying.
        
             | dsiroker wrote:
             | That is a refreshing perspective.
        
           | Trasmatta wrote:
           | What if you and the non biological copy both wake up? Which
           | one is "you"? I'm obviously going to care more about the one
            | that I appear to actually be, and won't be happy if my copy
           | decides to kill me.
           | 
           | The idea of a continuous self is probably an illusion, but if
           | your body dies that illusion does too. Your copy just lives
           | its own version of that illusion.
        
       | TedDoesntTalk wrote:
       | > the body will be partitioned such that it could be scanned with
       | an electron microscope
       | 
       | "Partitioning" the brain will destroy some of the nanometer-scale
       | tissue as it is sliced.
        
       | darepublic wrote:
       | Once you become immortal, someone can put you in hell
        
         | WealthVsSurvive wrote:
         | I think there is a fundamental paradox at the root of
         | consciousness and thought: once we succeed in alleviating
         | suffering by discarding the body, we are already in hell. What
         | is the purpose of a mind without a body? To witness? To what
         | ends? There's a reason that the lower brain is at the seat of
         | the throne. Maybe we should focus more on cooperating and less
         | on becoming a purposeless husk. If the string is too tight it
         | breaks, if it is too loose it will not play.
        
           | FeepingCreature wrote:
           | If you can emulate a brain, you can _probably_ emulate a
           | body.
        
           | RobertoG wrote:
           | >>"[..]What is the purpose of a mind without a body?[..]"
           | 
           | What's the purpose of a mind with a body?
        
             | joeberon wrote:
             | To help the body survive
        
         | Trasmatta wrote:
         | Yep, this kind of terrifies me. The human mind has an
         | incredible capacity for suffering. An emulated mind might have
         | many orders of magnitude higher capacity for suffering, on top
         | of being effectively immortal. At least the human mind will die
         | after 80 or 90 years or so.
        
       | ivan_ah wrote:
       | Focussing on synaptic connections seems rather simplistic. For
       | full "emulation" they would probably need to emulate the electric
        | fields and neurotransmitter concentrations; otherwise a just-the-
        | spikes simulation will probably capture only a small percentage
       | of brain dynamics.
        
         | ravi-delia wrote:
         | You're probably right, at least in terms of neurotransmitter
         | concentration, but I doubt that synaptic connections are only a
         | 'small percentage' of brain dynamics.
        
       | keiferski wrote:
       | Recent work in philosophy has called into question the notion
        | that the mind/cognition/identity is entirely independent of the
       | body.
       | 
       | https://plato.stanford.edu/entries/embodied-cognition/
       | 
       | Personally I think Western culture in particular has neglected
       | the physical aspects of existence. The idea that our bodies are
       | simply vessels for our minds seems more the result of cultural
       | neglect than anything.
        
         | joeberon wrote:
         | Recent? Buddha taught all this 2500 years ago
        
         | dsiroker wrote:
          | We are not claiming they are entirely independent; quite the
         | contrary. The mind is an emergent property of the body. Just
         | like music is an emergent property of sound waves.
        
           | keiferski wrote:
           | But if your mind and self is formed by and dependent on your
           | body, how can you transfer it to a bodyless existence without
           | losing that self?
           | 
           | The mind as an emergent property of the body is also not an
           | established philosophical or scientific fact, and is quite
           | dependent on positivism, which has plenty of issues.
           | 
           | It seems to me that at best, you're creating a surface-level
           | copy, but one inherently limited to contemporary scientific
           | knowledge. Not to say that this isn't interesting or useful,
           | but it's certainly not the same 'self.'
        
             | dsiroker wrote:
              | The notion of self is an illusion the mind creates. (There
              | are many benefits to doing so, not least of which is self-
              | preservation, which helps produce progeny and is therefore
              | selected for during natural selection.)
             | 
             | To answer your question directly: one can transfer an
             | emergent property of a system to another system by
              | sufficiently transferring the mechanisms that produce that
             | emergent property in the first place. A good analog would
             | be emulating the hardware of the Nintendo Entertainment
             | System (NES) entirely in software [1].
             | 
             | [1] https://jsnes.org/
        
               | keiferski wrote:
               | The self is far more complex than the simplistic
               | positivist notion of it. And again, this only works if
               | you assume that at the time of the mind creation, your
               | knowledge is complete. That seems fairly ignorant
               | considering the history of science, not to mention the
               | inherent limitations of empirical knowledge.
               | 
               | The NES example is not really a good one because it's a
               | created object and knowledge of it is complete, therefore
               | replicating it is possible.
               | 
               | Even then, assuming all of this didn't matter- I still
               | don't see how the mind maintains itself in a new body.
                | It's not as if a human mind is a static entity; it
               | constantly comes into contact with the world through its
               | embodied form and this reinforces and extends this notion
               | of identity. Assuming you could emulate it on a computer,
               | it would seem logical that the mind would change to adapt
               | to its new body, thus no longer being the same self.
               | 
               | Ultimately any "transfers" will actually just be the
               | creation of new minds, which IMO is more interesting
               | anyway.
        
               | dsiroker wrote:
               | > The NES example is not really a good one because it's a
               | created object and knowledge of it is complete, therefore
               | replicating it is possible.
               | 
               | One could replicate the NES hardware without any
                | knowledge of how it was built by reverse engineering it.
               | 
               | > Ultimately any "transfers" will actually just be the
               | creation of new minds, which IMO is more interesting
               | anyway.
               | 
               | I actually agree but in the same way that you have a "new
               | mind" when you wake up in the morning. The continuity of
               | self is an illusion. "I" would be very happy to one day
               | wake up having been transferred into a non-biological
               | system the same way that "I" would be very happy to wake
               | up tomorrow.
        
               | keiferski wrote:
               | Sure but the mind is reinforced by its constant
               | embodiment. If you woke up in another body, or with no
               | body, then that identity would seem difficult to
               | maintain.
        
               | simion314 wrote:
               | >One could replicate the NES hardware without any
               | knowledge of how it was built by reverse engineering it
               | 
               | Not sure if one could do that if that person is not very
               | familiar with similar projects. Give the NES system to a
               | scientist from 1800 and tell me what they could conclude.
        
         | jesselawson wrote:
         | Thank you for saying this. It's always so strange to me that
         | there are people obsessed with prolonging the inevitable, as if
         | immortality itself could be an end-state.
        
           | FeepingCreature wrote:
           | All of life is prolonging the inevitable. This would seem to
           | be an argument for suicide.
        
       | zzzeek wrote:
       | It's been done:
       | https://en.wikipedia.org/wiki/Brainstorm_(1983_film)
        
       | camdenlock wrote:
       | My hesitation with mind emulation is not so much with the
       | technical side; I think it's fairly clear that we'll get there.
       | 
       | However, the question of responsible stewardship looms large, and
       | is rarely addressed. With whom am I entrusting my mind? How can I
       | be sure that such stewardship won't be transferred to another
       | party at some point? Who's to guarantee that my mind won't be
       | installed into an eternal torment sim?
       | 
       | The stewardship questions have always bothered the hell out of
       | me, and the lack of convincing answers has always led me to avoid
       | buying completely into the preservation of my body for future
       | scanning and uploading into a sim.
        
       | prerok wrote:
       | I'm not convinced.
       | 
       | 1. We still don't know what the memory engrams truly are (
       | https://en.m.wikipedia.org/wiki/Engram_(neuropsychology) ). I
        | once read that, aside from relying on the interconnections of the
        | neurons, memories also rely on specific proteins created during
        | memory formation. They are then vital for memory recollection.
       | 
        | 2. We know that the connections between neurons are important, but
        | we have only recently realized that the support cells (glia) also
        | affect the firing mechanism: they are not only support but a
        | filter as well (
       | https://en.m.wikipedia.org/wiki/Glia )
       | 
       | 3. Inhibitory interneurons provide a way for synchronous firing
       | of neurons to form a learning experience:
       | https://www.nature.com/articles/s41467-019-13170-w
       | 
        | All in all, to replicate brain functionality we would need to
        | fully replicate the chemical composition of the brain down to the
        | lowest level (molecules).
       | 
       | I'm not holding my breath.
        
         | dsiroker wrote:
         | > They are then vital for memory recollection.
         | 
         | Even if specific proteins are needed for memory creation (which
         | is debatable [1]) it doesn't mean you need to model those
         | proteins to retrieve the stored memories from the structures
         | that they created. You can read data from a hard drive without
         | modeling the CPU or memory bus of the computer that stored the
         | data.
         | 
         | > support cells (glia)
         | 
          | Glial cells are on the order of 40-50 microns [2] and can
          | easily be seen with an electron microscope. In fact, they are
          | present in the Lichtman paper [3] linked from the Mind
          | Emulation website.
         | 
         | [1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2745628/ [2]
         | https://psych.athabascau.ca/html/Psych402/Biotutorials/4/par...
         | [3] https://www.cell.com/cell/fulltext/S0092-8674(15)00824-7
        
           | prerok wrote:
           | Glia cells can be seen, sure. But to my knowledge we still
           | don't understand their impact.
           | 
            | Anyway, my point is that even if you are able to recreate the
            | structure, you would still need to replicate the
            | functionality.
            | 
            | A better analogy would be that even if you have the data on
            | the hard drive, you would need a special program that knows
            | how to interpret it. In this case the data also has a time
            | component specifying exactly when the data applies, which is
            | not captured within the structure unless you go to the
            | molecular level.
        
         | adeledeweylopez wrote:
         | Memory engrams are thought to be DNA methylation patterns in a
         | neuron's DNA.
         | 
         | http://learnmem.cshlp.org/content/23/10/587.full
         | 
         | https://www.nature.com/articles/s41467-020-14498-4
        
           | prerok wrote:
           | Wow, have not read this before. Thanks for sharing!
        
       | lamename wrote:
        | I'm not as critical of these goals as some, but to not even
        | mention another dimension of complexity beyond connections
        | (electrophysiological firing patterns) is quite an
        | oversimplification.
        
         | dsiroker wrote:
          | With a sufficiently robust representation of the connectome,
          | including its spatial orientation, one can also emulate the
          | electrophysiology.
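          | 
          | For illustration only, a minimal sketch of that kind of
          | abstraction: a leaky integrate-and-fire network driven by a toy
          | random weight matrix standing in for a measured connectome. All
          | numbers below are made-up assumptions, not measured data, and
          | whether this level of abstraction is sufficient is exactly the
          | open question.
          | 
          |     import numpy as np
          | 
          |     rng = np.random.default_rng(0)
          |     n = 100                              # toy network size (assumption)
          |     w = rng.normal(0.0, 0.4, (n, n))     # stand-in synaptic weights
          |     np.fill_diagonal(w, 0.0)             # no self-connections
          | 
          |     v = np.zeros(n)                      # membrane potentials
          |     tau, v_thresh, v_reset, dt = 20.0, 1.0, 0.0, 1.0
          |     spike_counts = np.zeros(n)
          | 
          |     for t in range(1000):                # ~1 second of simulated time
          |         spikes = v >= v_thresh           # neurons firing this step
          |         spike_counts += spikes
          |         v[spikes] = v_reset              # reset fired neurons
          |         drive = rng.normal(0.05, 0.1, n) # external input (assumption)
          |         # leak toward rest plus synaptic input from this step's spikes
          |         v += dt / tau * (-v) + w @ spikes.astype(float) + drive
          | 
          |     print("mean firing rate (Hz):", spike_counts.mean())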
        
       | new_realist wrote:
       | Looks like a bunch of kids have watched too much Westworld.
        
       | mrkstu wrote:
        | This is so far ahead of the curve of reality that it's easy to
        | dismiss, BUT at the least it's possible that it could lead to
        | some interesting basic research being done. Hopefully that,
        | rather than misleading rich marks and separating them from their
        | cash, is the real goal.
        
         | dsiroker wrote:
         | Thank you for your optimism. The goal is to fund and conduct
         | basic research.
        
       | lawlessone wrote:
        | >a reasonable estimate for the first human brain being mapped
        | would be in 2084.
       | 
       | :(
        
       | drewbug wrote:
       | Any major difference from the Carboncopies Foundation?
        
       | smoyer wrote:
       | Reading this reminds me of a great Cory Doctorow / Charlie Stross
       | book - "The Rapture of the Nerds"
       | (https://craphound.com/category/rotn/)
        
       | VikingCoder wrote:
       | My favorite books about mind emulation:
       | 
       | * "Fall, Or Dodge in Hell" by Neil Stephenson [1]
       | 
       | * "Permutation City" by Greg Egan [2]
       | 
       | [1] https://www.amazon.com/dp/B07PXM4VMD/ref=dp-kindle-
       | redirect?...
       | 
       | [2] https://www.amazon.com/Permutation-City-Greg-Egan-
       | ebook/dp/B...
        
         | greatquux wrote:
         | Yes, "Fall" was really great! It was after reading it that I
         | decided I didn't want to emulate my mind anymore. :)
        
         | Trasmatta wrote:
         | And for games, SOMA is probably the best example.
        
         | NickM wrote:
         | I really loved Permutation City, and loved the beginning of
         | Fall, but man it felt like Fall just turned into a really hard
         | slog after a certain point. I have definitely enjoyed some of
         | Neal Stephenson's other novels too, even slower-paced ones like
         | Anathem, but for some reason Fall just didn't do it for me.
        
           | microtherion wrote:
           | I agree that the middle of the novel was dragging, but after
            | a while the story picked up speed again for me.
           | 
           | Stephenson's early novels suffered from weak endings, IMHO.
           | His newer ones (Fall, Seveneves) suffer from bloated
            | midsections.
        
         | abecedarius wrote:
         | Whole Brain Emulation Roadmap
         | http://www.fhi.ox.ac.uk/Reports/2008-3.pdf
         | 
         | The Age of Em https://www.amazon.com/gp/product/0198754620/
        
       | jarinflation wrote:
        | This could effectively lead to the creation of heaven, an
        | afterlife that would eliminate the notion of death. I never quite
        | understood why the entire human species, once made aware of the
        | non-zero probability of this working, has not diverted most of
        | its global energy, time and resources towards this effort. I can
        | only imagine it is because most people have no capacity to really
        | imagine, or outright refuse to ever imagine, death. In a way I do
        | envy them.
        
         | gallerdude wrote:
          | You're basically arguing a techno-religious Pascal's Wager, so
          | my counter-argument is the same: if the non-zero probability is
          | a 0.1% chance, would you really dedicate your one and only life
          | to something that has a 99.9% chance of not happening, whether
          | it be a real heaven or a techno-heaven?
        
         | jasperry wrote:
         | How can you be sure that if we could eliminate death, we would
         | also be able to make life so great that our mind would find it
         | worthwhile to keep living century after century? In other
         | words, if we enabled people's minds to continue forever, how do
         | we know it would be heaven and not hell?
        
       | dontreact wrote:
        | So the C. elegans connectome was done in 1986, and we still
        | haven't made a fully functional model of the C. elegans brain.
        | I'm not sure that this bottom-up approach (synapses -> model ->
        | behavior) will work better than a top-down approach that has been
        | making a lot of progress in AI (behavior -> model).
        
         | dsiroker wrote:
         | OpenWorm [1] has made a lot of progress toward building a fully
          | functional model, and when it simulates a worm's behavior it is
         | almost indistinguishable from the real thing. Here is a video
         | they produced in 2013 of C. elegans moving:
         | https://www.youtube.com/watch?v=SaovWiZJUWY
         | 
         | [1] http://openworm.org/
        
       | Barrin92 wrote:
       | _" While we are far from understanding how the mind works, most
       | philosophers and scientists agree that your mind is an emergent
       | property of your body. In particular, your body's connectome.
       | Your connectome is the comprehensive network of neural
       | connections in your brain and nervous system. Today your
       | connectome is biological. _"
       | 
       | This is a pretty speculative thesis. It's not at all clear that
       | everything relevant to the mind is found in the connections
       | rather than the particular biochemical processes of the brain.
       | It's a very reductionist view that drastically underestimates the
       | biological complexity of even individual cells. There's a good
        | book, _Wetware: A Computer in Every Living Cell_, by Dennis Bray,
        | going into detail on how much functionality and how many physical
        | processes are at work even in the simplest cells, all of which is
        | routinely ignored by these analogies of the brain to a digital
        | computer.
       | 
        | There is this extreme, and I would argue unscientific, bias
        | towards treating the mind as something that's recreatable in a
        | digital system, probably because it enables this science-fiction
       | speculation and dreams of immortality of people living in the
       | cloud.
        
         | ebg13 wrote:
          | > _It's not at all clear that everything relevant to the mind
         | is found in the connections rather than the particular
         | biochemical processes_
         | 
          | I wouldn't expect anyone to consider the connectome as excluding
          | the processes inside each of the individual neurons that
         | are connected. I consider this to mean just that it's an
         | emergent property of the collection working in concert. After
         | all, everything is just connections all the way down, even deep
         | inside individual cells.
        
           | mkolodny wrote:
           | > everything is just connections all the way down
           | 
           | Is that true? Could the "biochemical processes" also include
           | relationships between cells?
        
             | dsiroker wrote:
              | Correct; in order to emulate a mind we might need to
              | include that as well, so that is in scope.
        
         | __tg__ wrote:
         | I hear the term 'emergent property' bandied about in relation
         | to the mind as if using it somehow explains anything. It says
         | nothing more than mind exists, somehow, yet we have no clue
         | about its nature.
         | 
         | Scientists and philosophers agreeing on something means nothing
         | as they have agreed on utter bunk before. The short of it is
         | that we know little about the mind and have no idea how to even
         | start expanding on the little we know.
        
           | tasty_freeze wrote:
           | > Scientists and philosophers agreeing on something means
           | nothing as they have agreed on utter bunk before.
           | 
           | That is an extremely cynical position to take; one could use
           | it to dismiss anything, even things obviously true. For
           | instance, "I don't believe the Sun is powered by fusion --
           | nobody has ever gone there to take a sample. Sure, they claim
           | to have all sorts of indirect evidence, and there is 99.9999%
           | consensus on it, but scientists have backed things before
           | that were utter bunk."
           | 
           | > It says nothing more than mind exists,
           | 
           | It makes a much stronger claim than that. There are many
           | people who believe that consciousness exists outside the
           | mind, and the brain is a kind of consciousness receiver, akin
           | to a radio receiver, that picks up the signal and relays it
           | to the body. The claim of emergent behavior is an explicit
            | rejection of that mystical explanation, which is compelling
            | to many people.
           | 
           | > The short of it is that we know little about the mind and
           | have no idea how to even start expanding on the little we
           | know.
           | 
           | This is entirely at odds with reality. Is it your position
           | that brain researchers haven't learned anything over the past
           | 10, 20, 30 years? Clearly they have, so obviously they do
           | have ideas about how to expand on that knowledge.
        
         | dsiroker wrote:
         | I've posed this claim to dozens of neuroscientists. If you
         | consider the connectome just the static connections then you
         | might be right. If you include the dynamics of the brain (the
         | biochemical processes) as part of the connectome then most
         | neuroscientists would agree that is sufficient to produce the
         | emergent property of mind. The honest answer is we don't know
          | yet. That said, it's likely not necessary to model every
          | atom's interaction with every other atom, so there must be a
          | level of abstraction sufficient to emulate a mind. Our
          | foundation is trying to identify the minimal level of
          | abstraction necessary to emulate a mind.
        
           | jchrisa wrote:
           | In support of the requirement for high-fidelity (atom-for-
           | atom) modeling is the notion that an evolved computer would
           | converge toward behaviors that supervene on specifics of the
           | host environment. If porting a binary to another CPU
           | architecture is tough, how easy will it be to port a mind to
           | a simulated simple physics? How many edge cases will it have
           | to get right to even run at all? If brains are hacks designed
           | over millions of generations to surf overlapping fitness
           | functions, it makes sense they'd find implementation (real
           | physics) dependent optimizations that compound in ways which
           | fall apart in toy physics. That's not to say we can't add
           | cool peripherals.
           | 
           | ps even with atom-for-atom modeling, how do you know the
           | behavior doesn't depend on relations which are not
           | computable? If physics ranges over the reals, some of those
           | edge cases might be hard to find with a simulator.
        
             | FeepingCreature wrote:
             | On the other hand, the brain is famously warm and wet.
             | There's a limit to how much local state the brain can
             | practically use to compute, given how messy it is.
        
             | dsiroker wrote:
             | > If porting a binary to another CPU architecture is tough,
             | how easy will it be to port a mind to a simulated simple
             | physics?
             | 
             | It will be very difficult, but we shouldn't underestimate
             | what is possible decades from now. As an analogy, consider
             | when the Nintendo Entertainment System (NES) came out in
             | the 1980s. Did anyone ever imagine it could be fully
              | emulated in JavaScript in a browser [1]? Certainly not,
             | since those technologies hadn't been invented yet.
             | 
             | [1] https://jsnes.org/
        
               | indrax wrote:
               | What's JavaScript? What's a browser?
        
             | jjaredsimpson wrote:
             | >If physics ranges over the reals
             | 
             | I thought this was ruled out by
             | https://en.wikipedia.org/wiki/Bekenstein_bound
        
             | cylon13 wrote:
             | I can hit myself in the head and I don't lose my train of
             | thought, instantly lose consciousness, or die. If
             | consciousness relied on the precise positions of individual
             | atoms (as far as that makes sense with moving particles) it
             | would be way more fragile than we've observed it to be. The
             | fact that your brain is resilient to being knocked around a
             | bit is evidence towards the underlying mind being at least
             | slightly higher level than where strong quantum effects
             | live and also fairly redundant.
        
               | im3w1l wrote:
                | I agree, but I think there is a case to be made that
                | there is important state that is separate from just which
                | cells connect to which and how strongly, but is also more
                | coarse-grained than single atoms floating around.
                | 
                | The cytoskeleton may be found to have a role to play. The
                | number and locations of ion pumps. Or epigenetic changes
                | in clusters of brain cells.
        
           | TedDoesntTalk wrote:
           | What if we find that the gut, which has 100 million nerve
           | cells, also plays a part in the emergent property of mind?
           | 
           | https://www.sciencemag.org/news/2018/09/your-gut-directly-
           | co...
           | 
           | "In a petri dish, enteroendocrine cells reached out to vagal
           | neurons and formed synaptic connections with each other. The
           | cells even gushed out glutamate, a neurotransmitter involved
           | in smell and taste, which the vagal neurons picked up on
           | within 100 milliseconds--faster than an eyeblink."
        
             | phreeza wrote:
             | People who have large parts of their gut removed surgically
             | don't lose their mind.
        
             | ggreer wrote:
             | If that were true, then quadriplegics would have cognitive
             | issues, as would those who have their vagus nerve severed.
             | Those people don't suffer from impaired cognition or
             | drastic personality changes, so we can be sure that the
             | nerves in the gut are not important for brain emulation.
             | 
             | Also human brains have an average of 86 billion neurons, so
             | emulating an extra 100 million cells (0.1%) would be
             | trivial in comparison.
        
             | dsiroker wrote:
             | That is one reason why I was very careful to name this the
             | Mind Emulation Foundation and not the Brain Emulation
             | Foundation. I also use the word 'body' instead of 'brain'
             | throughout and define a connectome as: the comprehensive
             | network of neural connections in your brain _and nervous
             | system._
        
         | Causality1 wrote:
         | Indeed. We humans largely create devices that function either
         | through calculation or through physical reaction, relying on
         | the underlying rules of the universe to "do the math" of, say,
         | launching a cannonball and having it follow a consistent arc.
         | The brain combines both at almost every level. It may be
         | fundamentally impossible to emulate a human personality equal
         | to a real one without a physics simulation of a human brain and
         | its chemistry.
         | 
         | A dragonfly brain takes the input from thirty thousand visual
         | receptor cells and uses it to track prey movement using only
         | sixteen neurons. Could we do the same using an equal volume of
         | transistors?
        
           | Glorbutron wrote:
            | No one is saying a neuron is a one-to-one equivalent of a
           | transistor. That behavior does seem like it's possible to
           | emulate with many transistors, however.
        
             | westurner wrote:
             | Was just talking about quantum cognition and memristors (in
             | context to GIT) a few days ago:
             | https://news.ycombinator.com/item?id=24317768
             | 
             | Quantum cognition:
             | https://en.wikipedia.org/wiki/Quantum_cognition
             | 
             | Memristor: https://en.wikipedia.org/wiki/Memristor
             | 
             | It may yet be possible to sufficiently functionally emulate
             | the mind with (orders of magnitude more) transistors.
             | Though, is it necessary to emulate e.g. autonomic
             | functions? Do we consider the immune system to be part of
             | the mind (and gut)?
             | 
             | Perhaps there's something like an amplituhedron - or some
             | happenstance correspondence - that will enable more
             | efficient simulation of quantum systems on classical
             | silicon pending orders of magnitude increases in coherence
             | and also error rate in whichever computation medium.
             | 
             | For abstract formalisms (which do incorporate transistors
             | as a computation medium sufficient for certain tasks), is
             | there a more comprehensive set than Constructor Theory?
             | 
             | Constructor theory:
             | https://en.wikipedia.org/wiki/Constructor_theory
             | 
             | Amplituhedron: https://en.wikipedia.org/wiki/Amplituhedron
             | 
             | What _is_ the universe using our brains to compute? Is
             | abstract reasoning even necessary for this job?
             | 
             | Something worth emulating: Critical reasoning.
             | https://en.wikipedia.org/wiki/Critical_reasoning
        
       | sieste wrote:
        | That plot halfway down the page where they fit a straight line
        | through 2 points and predict being able to map human brains by
        | 2084 made me laugh.
        
         | dsiroker wrote:
          | The y-axis is logarithmic, so it's actually representing
          | exponential improvement, which is a fair upper-bound assumption
          | given that the rate of improvement in the cost to map a human
          | genome was better than exponential.
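          | 
          | For illustration only, the mechanics of that kind of
          | extrapolation: fitting a straight line through two points on a
          | log scale and solving for the year the trend crosses a target.
          | The numbers below are hypothetical placeholders, not the values
          | from the plot, so the output will not match the 2084 figure.
          | 
          |     import math
          | 
          |     # (year, synapses mapped) -- hypothetical placeholder points
          |     x1, y1 = 1986, 7_000    # roughly C. elegans scale
          |     x2, y2 = 2015, 1.7e9    # a later, larger dataset (assumption)
          |     target = 1.5e14         # rough human-brain synapse count
          | 
          |     # slope in log10 space: orders of magnitude gained per year
          |     slope = (math.log10(y2) - math.log10(y1)) / (x2 - x1)
          |     year = x2 + (math.log10(target) - math.log10(y2)) / slope
          |     print(f"log-linear crossing year: {year:.0f}")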
        
           | sieste wrote:
           | I get that. It's just that for an excel-trapolation 2x
           | outside the observed range based on only 2 data points, 2084
           | is a strangely precise estimate. I find it hard to take this
           | seriously.
        
             | dsiroker wrote:
             | Pull requests welcome. :)
        
       ___________________________________________________________________
       (page generated 2020-09-01 23:01 UTC)