[HN Gopher] Human use of high-bandwidth wireless brain-computer ... ___________________________________________________________________ Human use of high-bandwidth wireless brain-computer interface Author : bemmu Score : 227 points Date : 2021-04-04 15:21 UTC (7 hours ago) (HTM) web link (www.brown.edu) (TXT) w3m dump (www.brown.edu) | daenz wrote: | I guarantee that at some point in the future, if we make it far | enough, there will be an overwhelming social argument made that | everyone should get super integrated brain interface devices | implanted, "for the good of everyone." The argument will probably | go something like this: | | >The brain interface device X smooths out volatile emotions, | reducing risk of angry outbursts that result in violence. By not | getting a device installed, you are putting everyone at risk of | your violent outbursts. Employers and businesses have the right | to exclude someone who is at a higher risk of inflicting | violence. | oliv__ wrote: | This unfortunately makes a lot of sense in the current context | in which we live, but I am optimistic enough to believe that | sometime, somewhere in the world, people will join forces to | push back against such use of the technology and in favor of a | "free-er" society of individuals. | | I like to think of the ideas behind the formation of the USA as | a similar spirit. | lenkite wrote: | Yep, get a [Brain Passport] for [Public Safety] or be denied | public services. Actually, this is something that will likely | happen. It will first start with violent criminals and then | gradually make its way into the general public - with | appropriate cherry-picked data and statistics showing its | advantages. | srswtf123 wrote: | I suspect you're right, and I already don't want this | technology to exist, or its creation to be pursued. | | A brain-computer interface will, IMO, most likely be used to | control brains, not computers.
| bserge wrote: | I would be against it, I'd rather live in a forest and hunt | wild rabbits and... | | > smooths out volatile emotions | | I volunteer! | | Seriously, that's how one would buy me. Reduce my emotions? | Maybe remove them? Plug me in, buddy :D | newsbinator wrote: | If anything, the argument to get brain interfaces implanted | would be that it's cruel to children not to implant them. | | It'd be like withholding a vaccine against a genetic flaw, when | the vaccine is cheap and sitting on the shelf ready to use. | joe_the_user wrote: | If such a brain interface allowed X evil actor to actually | control people, you wouldn't have arguments, you'd have a | direct takeover (or several different evil actors dueling). | | But if this interface was simply like a drug or some similar | effect, I doubt there'd be enough of a combination of interests | to get people on board. | gallerdude wrote: | This is huge. From what I've read, a lot of neuroscience is | bottlenecked by having a hard time reading neurons through the | skull. This will remove the bottleneck for whole new types of | brain/mind/consciousness research. | EMM_386 wrote: | This does not read neurons "through the skull". | | It wirelessly transmits the data from probes already in the | brain. The innovation is that they do not have to be physically | tethered to get the data. | | > The unit sits on top of a user's head and connects to an | electrode array within the brain's motor cortex using the same | port used by wired systems. | pier25 wrote: | So what constitutes high/low bandwidth in this context? The | article doesn't mention any specific numbers (eg: 1Mbps). | echelon wrote: | BCI could unlock human immortality. | | BCI is a hard problem, and the risk to reward ratio for current | generation tech is too high except for a few isolated cases: non-invasive, which is low-resolution, and disease remediation, which | is basically a measure of last resort.
Given the poor payoff, the | technology isn't invested in. | | If we can get out of the gravity well / steep energy slope that | prevents us from reaching the pinnacle, we can maybe one day | become capable of performing brain copies and uploads, which | effectively achieves immortality. This would be the most | impactful technology ever developed for humans, should we still | be relevant at that point. There's a huge hill to climb in | getting there, and it's unlikely we'll see it within our | lifetimes, if ever. | | AGI, if developed first, would probably see little need to co-opt | messy and overly-complicated human machines. | | And there's always the chance we destroy ourselves first. | blisterpeanuts wrote: | Well, a brain copy is not exactly the same as immortality; it | just means your memories and an amalgamation of the neural | networks that form your unique personality can be duplicated. | The entity that results would be a separate individual. | scsilver wrote: | I'm not sure I would notice or care. | bradgranath wrote: | High-bandwidth wireless link, for existing human brain | interfaces. | | They miniaturized the receiver and slapped a wifi chip on it. | | Cool. | | They aren't beaming thoughts into brains tho. | tyingq wrote: | Wow, that's pretty surreal. An actual person with 2 high density | connectors on their head. Each one streaming 48mbps of neural | data. Parts of Philip K Dick's stories are almost real. Though I | get that the data coming out is still pretty low-fidelity and | crude. | ascotan wrote: | Next step: add a telnetd server and give it the root password of | 123. | l-lousy wrote: | I can't wait until I forget to charge my brain reader overnight | and can't access my computer. | IncRnd wrote: | You have not received the proper number of ad imprints this | week.
Network functionality will be restored, once your | Facebook BCI chip detects that you have fully met the terms of | service, which you had legally accepted in order to receive | this free implant. Until such time, you will not be able to | access Facebook's BrainNet. We urge your compliance, so you may | once again virtually chat with friends and family and work | remotely with your employer. | TheOtherHobbes wrote: | "You have been disconnected because payment was declined..." | disgruntled101 wrote: | How long until programmers are forced to ditch antiquated methods | of input like hands and keyboards in favor of streaming thoughts | directly to your IDE? Can't wait to be forced by market forces to | adopt such an interface and then promptly get ads streamed back | or get brainhacked | goldenchrome wrote: | All I can say is, I'm glad I'll be dead in a handful of | decades. | blisterpeanuts wrote: | Suppose that while you're still around, a brain extension | enables you to greatly extend your lifespan. Would you agree | to the implant? Totally hypothetical, of course; I myself | would not have a ready answer. But for paralyzed and nerve-damaged people, it seems to me adopting this technology would | be a no-brainer, so to speak. | talmr wrote: | What about other thoughts? Does my employer get access to | those? | disgruntled101 wrote: | Just meditate on your breath or play pazaak in your mind to | distract the mind reading | PicassoCTs wrote: | Only for advertisement and performance reviews. You will have | to change who you are to fit into the company, I'm afraid.. | kjjjjjjjjjjjjjj wrote: | > You will have to change who you are to fit into the | company, I'm afraid.. | | As if you don't have to already? The false consensus | paradox in corporations is overwhelming. | PicassoCTs wrote: | That attitude will have to go, cause that space for that | attitude is needed for some upgrade. I dread this world. One | could glimpse it in the Firefall novels of Peter Watts.
| "Experts" who upgraded themselves into crippled "savants", able | to outperform all baselines, but incapable of feeling their own | fingertips. | | One can already feel that pressure, regarding substance abuse | to stay awake longer and perform better with amphetamines and | be more creative with hallucinogenics. | | Imagine having to sacrifice ever more parts of yourselves, to | stay relevant. What a horrific freak-show we will become.. | soulofmischief wrote: | > Imagine having to sacrifice ever more parts of yourselves, | to stay relevant | | _The Little Gods_ by Jamie Wahls explores this very notion, | in the context of a parent and her child, and how each | respond to expectations of an augmented society. | | http://compellingsciencefiction.com/stories/thelittlegods.ht. | .. | Mathnerd314 wrote: | I think it'll have to wait until a non-surgical BCI gets decent | performance. The study in the article uses implanted electrodes | - forced surgery would be a nightmare. | beefield wrote: | Don't know about you, but the thoughts in my head are such an | unordered and incomprehensible mess that it is really hard for | me to see any benefit of direct streaming. Typing the thoughts | slowly down and rereading and evaluating consequences multiple | times is the only way to get any sense out from my head. And I | like to think of myself as a relatively good thinker... | [deleted] | tibbydudeza wrote: | Neuralink ???. | tlibert wrote: | Next stop: telepathy. | rocmcd wrote: | Call me a cynic, but I don't have a lot of optimism for brain-computer interfaces. I can barely control my own thoughts, let | alone understand how they are made or where they originate. We | would need to make an exponential leap in our understanding of | the brain and our consciousness within it to make this in any way | a viable input method. | xondono wrote: | I was pretty excited when I started learning about BCIs in | college.
Then I realized that it's not that my hands and eyes | are some sort of bandwidth limit, but rather that my brain is | not really able to increase the throughput. How many of you | code at the speed you type? | | While I appreciate that they are game changers to people with | accessibility problems, they're essentially not worth the risk | to anyone else. | iamgopal wrote: | This is what I was thinking. As a creative output device, my | brain thinks way ahead while I'm typing the current code. This | tech will actually slow down the process with respect to output. | What this will thrive at is giving input to the brain, with the | brain generating just enough query. So ultimately the future | superhumans will be the ones who can generate precise queries | much faster. | tachyonbeam wrote: | IMO, the bigger problem, which nobody talks about, is that if | we have brain-computer interfaces, it will be trivial to use | them to control our emotions. Once that happens, it seems to me | we'll basically stop being human. People are going to want to | feel whatever emotion is convenient in that moment. | | Don't enjoy your terrible dead-end job? Now you do. Don't enjoy | your abusive relationship? Now you do. Don't feel comfortable | with societal issues at large? Now you do. Empathy gets in the | way of doing your job? No problem. | goldsteinq wrote: | This is the problem with computer->brain interfaces (which we | don't have), not brain->computer interfaces (like moving | mouse pointers via direct brain->computer connection). | wongarsu wrote: | If you have a good brain->computer interface you can just | use traditional methods in a tight feedback loop to | manipulate the brain. We have more than enough methods to | reward or punish people. Simply reward them for good | thoughts, punish them for bad ones, and I don't see how | their behavior wouldn't change. | nnmg wrote: | I don't know, I think that is a big jump and definitely not | trivial.
| | "Reading" neural activity is much different than "writing", | and modifying the circuits/neural activity precisely enough | to modify emotions. | | These devices are typically cortical surface level electrode | meshes, placed over the motor region of the cortex, while | emotions are thought to come from various deep brain | structures. Not saying it won't happen, but we are much, | much, further from the latter than the former. | tachyonbeam wrote: | I don't know about that. You're right that emotions seem to | come from deeper structures, but these structures are also | more primitive. We're able to modify emotions with | something as simple as amphetamines, so controlling them | with a few well-placed electrodes is maybe not so | difficult. Seems to me that as brain interface technology | starts progressing, we're going to hit an S-curve of | technological progress that will make it advance very | rapidly in one or two decades. | scsilver wrote: | Terminal Man is a fun read. | https://en.m.wikipedia.org/wiki/The_Terminal_Man | nnmg wrote: | It's definitely possible, but I guess what I am saying is | that research in this area hasn't really been explored in | the context of humans. | | In the lab, we use targeted genetic manipulations such as | optogenetics [1] or chemogenetics (see DREADDS [2]) to | achieve precise circuit manipulations that can | (maybe/kinda) change emotional state (see [3] and [4] for | manipulation of fear in mice, sorry may be pay-walled | check sci-hub). But these are impractical in humans at | the moment because they require specific genetic | backgrounds (a CRISPR modified mouse expressing a | specific artificial DNA sequence in certain types of | neurons from birth), viral injections to add other | genetic constructs that interact with the from-birth one, | and implanting lights or adding drugs directly to the | brain where the cells are. 
Precise electrical | manipulation is not really done, even in animal labs, because it | is not precise or controllable for these types | of things. | | Again, I have no doubt that we will get there, maybe in a | few decades too. But the techniques are much further from | human use than the "reading" technology demonstrated | here. | | [1] https://en.wikipedia.org/wiki/Optogenetics [2] https://en.wikipedia.org/wiki/Receptor_activated_solely_by_a... | [3] https://pubmed.ncbi.nlm.nih.gov/28288126/ [4] | https://www.nature.com/articles/npp2015276/ | zajio1am wrote: | > Once that happens, it seems to me we'll basically stop | being human. People are going to want to feel whatever | emotion is convenient in that moment. | | I would say it is the other way. Many animals have emotions. | It is sophisticated abstract thinking that makes us humans. | If one can get full control of the emotional part of their brain, | that would make them truly human. | oliv__ wrote: | You're ready to be hired by whoever will market these | devices. | PartiallyTyped wrote: | If we could actually get control of our emotional part, it | will be used by the military to create superhuman | soldiers, devoid of any empathy and augmented with a rage | mode switch that turns off fear and fills them with | adrenaline. | | Personally, I'd use it to deal with my bipolar. | whichquestion wrote: | On the other side of this, some people require external | emotional regulation because their brains fail to do so for | them and take medication for it. So having this as a | treatment option isn't necessarily something we should avoid | pursuing for cases where medication isn't an option for | whatever reason. | tachyonbeam wrote: | Yes obviously, just like there is a legit case for brain | implants for paraplegic or wheelchair-bound people. | However, it's easy to see how things could easily go way | too far and lead to a world of VR-addiction and | dehumanization.
| | https://cdn-images-1.medium.com/max/1200/1*gQVf0RpFjaYfS7GJJ... | | As always, technology is a tool, and a double-edged sword. | It's just hard to predict how it will change society | sometimes. IMO brain implants are actually way more | dangerous than genetic engineering ever could be. People | creating designer babies with blonde hair and a higher IQ | is nothing compared to the risk of people no longer being | able to feel empathy and sadness in response to problematic | situations. Maybe we'll even stop feeling love, because | it's just too inconvenient. | | Oh and uh, yeah: brain implants could also make it possible | to implement the notion of thoughtcrime. I hope, for your | own sakes, that your political beliefs and opinions are in | line with what the majority has deemed correct. | whichquestion wrote: | We already have segments of the population who have to | deal with various forms of addiction and dehumanization | and this has not stopped us in the development of new | medications and technologies. | | Should we prevent the development of this technology due | to its potential for abuse? Should we develop the | technology for its potential to benefit ourselves? | | Obviously brain implants that can read and write thoughts | come with an extraordinary amount of power and it is both | wonderful and terrifying to imagine the potential | benefits and dangers that it could provide us. | | I think we will do what we have done through history and | someone somewhere will develop the technology if it is | possible eventually regardless of our qualms with its | potential to destroy people. | whowe1 wrote: | The steam engine was in use for generations before | thermodynamics theory was discovered. So in many cases, it is | possible to engineer a technology without fully understanding the | underlying principles that govern its behavior. | soared wrote: | How does 48 megabits per second compare to the tasks a computer | does?
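A rough back-of-the-envelope answer to soared's question, using the 48 Mbit/s per-connector figure from tyingq's comment above. The comparison rates below are ballpark figures supplied here for context, not numbers from the article:

```python
# Back-of-the-envelope: what does 48 Mbit/s of neural data amount to?
# (Comparison rates are rough ballpark figures, not from the article.)

MBIT = 1_000_000  # bits

neural_rate = 48 * MBIT                    # one connector, bits per second
bytes_per_sec = neural_rate / 8            # 6,000,000 bytes/s = 6 MB/s
gb_per_hour = bytes_per_sec * 3600 / 1e9   # ~21.6 GB per hour per connector

# Rough reference points (order of magnitude only):
rates = {
    "1080p video stream (~5 Mbit/s)": 5 * MBIT,
    "USB 2.0 peak (480 Mbit/s)": 480 * MBIT,
    "Gigabit Ethernet (1000 Mbit/s)": 1000 * MBIT,
}

print(f"One connector: {bytes_per_sec / 1e6:.0f} MB/s, ~{gb_per_hour:.1f} GB/hour")
for name, rate in rates.items():
    print(f"  vs {name}: {neural_rate / rate:.2f}x")
```

So each connector produces roughly as much raw data as ten compressed HD video streams, but only a tenth of what a USB 2.0 link can carry; with two connectors, a day of continuous recording is on the order of a terabyte.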
| escape_goat wrote: | From the moment I read "full broadband fidelity," I began looking | at this press release as a product of dark-pattern science | communications rather than an announcement of scientific | progress. The news is that the connection is wireless and high-bandwidth. Low-bandwidth wireless communications have already | succeeded elsewhere. Innovation could have occurred regarding the | interface device in the brain, the broadcast chip in that device, | the physical link layer, the protocol layers above that, the | external receiver, et cetera, but there are no details we can | glean except that the connection is 'virtually' as good as a | physically wired connection. Wherever details are missing, we can | assume neither that they were overlooked by the writer, nor that | they were deliberately left out. We can assume, however, that | they did not contain any information which furthered the author's | purpose. | burlesona wrote: | So back in the 60's, people looked back at the progress over the | previous decades, imagined the future, and thought about space. | We'd have commercial space flight any day now. The most poignant | scene I can think of: in 2001 A Space Odyssey, the character | flies on a Pan-Am spaceship to the moon, and then goes into a | _phone booth to make a phone call._ | | Fast forward and it turns out that we had been near the top of an | s-curve when it came to space tech, but near the bottom of the | s-curve of computers, and few people back then were imagining | (could imagine?) how different the world would be 50 years later | with everyone carrying around internet-connected supercomputers | in their pocket. | | I think we may be in the same situation today, where people | imagine the future and think AI revolution and computational | everything, but are mostly missing that we're at the bottom of a | biotech s-curve that is going to blow "computer" progress out of | the water over the next 50-60 years.
| | My guess is that in 60 years our computer technology will be | largely similar to today, just faster and nicer. But in the same | way that the mature industrial revolution made high-precision | manufacturing possible which made incredible computers possible, | our mature computer technology is now enabling incredible | progress in biotech. And the explosion of biotech will lead to | mind-blowing changes that are difficult to even imagine. | | From this article: no more keyboards / mice? No typing, you can | "think" to write. What about recording your own thoughts and then | playing them back to yourself later? How much further can that | tech go? And there is so much more beyond BCI, we are just | understanding the basic building blocks in many areas, but making | amazing progress. | | I'm excited about it. | meremortals wrote: | The Mood Organ from Do Androids Dream of Electric Sheep | ben_w wrote: | While I agree that we will have mind blowing biotech | improvements in the next 50 to 60 years, I don't believe it's | _physically possible_ for biotech progress to be as mind- | blowing as what happened in computer tech. | | In the last 60 years, computers have gone from $160 | billion/GFLOP to $0.03/GFLOP; transistors are now smaller and | faster than synapses by the same factors that _wolves_ are | smaller and faster than _hills_ , and the sort of computing | tech that was literally SciFi in the 60s -- self-teaching | universal translators, voice interfaces, facial recognition -- | is now fairly standard. | | 60 years of biotech? If the next time I wake is after 60 years | in a cryonics chamber[0] and was told _every_ disease was now | cured, that every organ could be printed on demand, that adult | genetic modification was fine and furries could get whatever | gene-modded body they wanted up to and including elephant-mass | fire-breathing flying dragon, and that full brain uploading and | /or neural laces a-la The Culture were standard, I would | believe it. 
But if they told me biological immortality was | solved (as opposed to mind upload followed by download into a | freshly bioprinted body with a blank brain) I'd doubt the year | was really only 2081 -- not all research can be done in | silicon, some has to be done in-vivo, and immortality would be | one of them. | | [0] this would be very surprising as I've not signed up for it, | but for the sake of example | maxander wrote: | If we have full-on adult genetic modification capable of the, | ah, dramatic example you provide, we've certainly figured out | a way to get around in-vivo test difficulties. For better or | worse, any biomedical advance comes up against that problem | sooner or later. | | Therapies to slow aging have the particular problem that it | could intrinsically take decades to show an effect, sure- but | that's simply reason to be a bit more ambitious and aim for | therapies to _reverse_ aging, which could be tested rapidly | in already-old patients. :) | KaoruAoiShiho wrote: | > But if they told me biological immortality was solved (as | opposed to mind upload followed by download into a freshly | bioprinted body with a blank brain) I'd doubt the year was | really only 2081 -- not all research can be done in silicon, | some has to be done in-vivo, and immortality would be one of | them. | | But why, biological immortality is already here for many | animals like jellyfish. Honestly seems closer to me than | uploading. | ben_w wrote: | Because 60 years isn't enough time to tell if you were | completely correct, or if there was something you missed. | oliv__ wrote: | _" What about recording your own thoughts and then playing them | back to yourself later?"_ | | Oh hell nah, it's enough craziness in real time, I sure as hell | don't need to replay _that_. | IgorPartola wrote: | Imagine if we could connect your brain directly to a computer.
| Imagine if you could do things like instantly and precisely | recall any Wikipedia article, any news story, any mathematical | formula. Imagine if arithmetic goes from a skill you learn to a | thing your brain does with 100% accuracy. | | Now imagine if your need for speech goes away: why bother using | it when you can just "text" from your brain directly to mine | and I instantly know what you said without me having to "read" | anything. Instant communication. Instant connection to anyone. | Instant ads beamed directly into your brain by Google and | Facebook. | | Now take it a step further: your mind is now a part of a | collective globally connected network. The boundary of where | "you" exist and where the rest of the world exists is erased. | You can feel what other people feel. You can see through the | eyes of an Oscar winner, a surgeon, a head of state, a porn | star. Police body cams become police mind cams: what was the | cop thinking when they took any given action? What we currently | have as YouTube celebrities and Instagram influencers become | Mindgram stars. You can see and perceive as them. | | Now take it a step further. Death isn't death. Like the paradox | of rebuilding a ship one plank at a time, your mind stops | existing in your body and occupies a collection of other | bodies. Artificial intelligence mixed in with real intelligence | mixed in with remnant intelligence. We can't imagine what this | feels like but we are marching towards it getting ever closer | every year. | | Now take it a step further. People want to get away from this | hive mind concept. They disconnect. They play games. They make | games where all NPCs are now simulated to the point where they | believe they are real. They are here for the benefit of the | players but even the players can't tell the difference when | they are in the game. | | Now take it a step further. Inside the simulation someone | introduces Hard Seltzer. 
The in-game year is 2021 and a player | just read that some NPC somewhere had just created a | brain/computer interface. He rips off his headset and goes to | unplug the computer because fuck this game, all the DLC clearly | ruined it. | goldsteinq wrote: | > Imagine if we could connect your brain directly to a | computer. Imagine if you could do things like instantly and | precisely recall any Wikipedia article, any news story, any | mathematical formula. Imagine if arithmetic goes from a skill | you learn to a thing your brain does with 100% accuracy. | | You're talking about transferring information FROM computer | TO brain. We have no idea how to do it. | | Transferring information FROM brain TO computer is achievable | with modern tech (and that's what this link shows), but not | vice versa. | scsilver wrote: | We have perfectly good analog inputs, I'd rather we start | with improving those rather than open the digital 6th sense | box. | IgorPartola wrote: | How much can you improve the human ear? More importantly | how much can you improve the speed with which you | perceive with the human ear and actually understand and | retain information from it? You can probably double its | efficiency. But can you make it take in information with | perfect clarity at 10x the rate? 1000x? A direct | interface into the brain could hypothetically bypass the | ear entirely. And there is precedent for this already: we | went from pointing and grunting, to speech, to writing, | to digital writing, to the web. Imagine what it might | have been like for me to convey this message to you if we | lived in a hunter-gatherer society before human speech | was a thing? Now flip that forward: what specialized | tools could we use to speed up communication more? About | the only things we have left are real time translation | devices and an AR capable of augmenting what we are | looking at with relevant labels and articles.
Beyond that | we have no place to improve without inventing a radically | new way to interface, and in nature you either improve or | you die. Nothing stands still and neither will this. | IgorPartola wrote: | Yes that's true. But I think two-way communication is a | goal for a lot of research and, while whole orders of magnitude | more difficult, likely not impossible. | greenwich26 wrote: | Actually, and fortunately, it is impossible. This sort of | brain model was preemptively debunked by Kant in | _Critique of Pure Reason_ and other works. | eggsmediumrare wrote: | I think it's pretty hard to argue that any philosophical | work, regardless of how important or impactful or | insightful it is, can "debunk" anything. | codebolt wrote: | But much more speculative. Just because we can imagine | something doesn't mean science can/will achieve it. | Brain-to-computer communication seems much more | straightforward to achieve from a technical perspective, | and has enough potential to revolutionize aspects of our | lives on its own. | throwaway316943 wrote: | The awesome thing is that our brain is great at learning | how to use and make sense of new inputs. The big one being | literacy: you never think about it, but you learned to | interpret strange little patterns first as sounds that you | hear in your head and then as entire concepts like "cat" or | "happy". The same can be said for spoken language or | mathematics or musical notation. I don't doubt that the | human brain will have little trouble learning that X | pattern of electrical inputs to a group of neurons means | "cat" or the sound of A or even an image of a bird. It | won't come instantly and it won't be identical to the thing | it represents without wiring directly into the visual or | auditory regions, but it will give us a new sense and a new | language. | sigg3 wrote: | Transferring information TO BRAIN is sort of the raison | d'etre of computers.
| | This comment FROM brain TO computer, FROM computer TO | computer, FROM computer TO your brain. | | It's awesome. | lanstin wrote: | At least half the value of learning arithmetic is that it | shapes one's neural network in a way that makes it better | at certain types of thought. Skipping that | learning process presumably skips those physical changes as | well. | all2 wrote: | We could train our "soft" neural networks very | efficiently with a computer interface. Maybe not as fast | as dedicated software neural networks, but the human mind | responds very quickly to feedback loops (sometimes | destructively). | | Which makes me wonder, what will an overtrained brain | look like? What kinds of illnesses are we unleashing on | the world by attaching an interface like that directly to | the brain? | IgorPartola wrote: | My pet theory is that anxiety is an overtrained brain | reaction. | all2 wrote: | In a lot of recovery circles there's an underlying | concept of "getting out of your head", where the | methodology that arises in each circle attempts to get a | person to leave the circular thoughts in their heads and | do/think something else. | | I think this is why psilocybin is so effective for | depression: it induces a state of plasticity in the brain | that gives someone an opportunity to fill in the ruts | they had been mentally pacing in. | bserge wrote: | eXistenZ (1999 film) was a mindfuck when I saw it. | | But yeah, I think we're getting closer and closer to a true | hivemind. It would have to suppress the individual | personality, otherwise a lot of people will likely go insane. | | Of course, it could be that's acceptable losses or they're | cut off from the "advanced civilization" and left to live | somewhere far from the cities. | | That is, of course, if half the planet isn't flooded and | turned into desert by then.
| Elof wrote: | You should check out the Nexus series :) | TheOtherHobbes wrote: | Or watch Forbidden Planet - "Monsters from the Id" | [deleted] | ElFitz wrote: | > Imagine if we could connect your brain directly to a | computer. | | Please, no. I'd just get even more frustrated at how slow the | damn thing is. | IgorPartola wrote: | There is so much about it that I think would be wrong, | difficult, bad, and we as a species can't even imagine what | it would be like. How do you install an ad blocker on an | interface like that? | ElFitz wrote: | Very good question. | | Regarding the ad blocker, one thing I definitely can't | wait for are true AR glasses that could act as a real-life ad blocker. | | Being out in the street doesn't mean I have in any way | agreed to being constantly drowned in, and having my | attention stolen away by, all this bloody noise. | katzgrau wrote: | I'd argue we already connected ourselves to computers, and | we're just using the safest but slowest adapters available | right now. | ElFitz wrote: | We could argue on "safest", but it's definitely the | compromise between speed, ease of use and safety that I'm | most comfortable with. | qayxc wrote: | I'm curious, what would a solution look like that's even | safer than a touch interface (mouse, keyboard, screen) or | voice? | ElFitz wrote: | Well, removing audio and images, leaving only text, would | make it safer. | | I, and probably many others, wouldn't have stumbled upon | some of the things I have. They thankfully are now only | blurry memories to me, even though merely evoking them | still is nauseating. | | It would also dramatically reduce the impact of much of | the bullshit content out there, since words appear to | have much less emotional impact than images (and appear | to be much less appealing), and thus be safer to society | as a whole. | | A text-only interface would also be much less useful and | much more annoying to use.
| all2 wrote: | > since words appear to have much less emotional impact | than images | | To some. My imagination is quite good and as a child I | consumed vast quantities of print media. A lot of it | wasn't appropriate for a child to consume. | jamiek88 wrote: | Same here, as evidenced when one rainy Saturday afternoon | the family were all together after lunch, grandparents | drinking tea, me cross-legged on the floor reading, when I | innocently asked the room "hey, what does cunt mean?" | | Grandma turned puce and dad snatched the book from me: | "wtf are you reading?" | | James Herbert's Rats trilogy. Aged 8. | | Warped ever since. | | Text is plenty. | blisterpeanuts wrote: | The notion of a neurally connected Facebook or Google scares | the heck out of me. Apart from the countless petabytes of | data they would collect, just imagine a world where the | powers that be can actually tap into your thoughts, perhaps | even implant ideas that they think you should have. At a | certain point, we lose our individuality and become subsumed | by the global AI, game over. | | But before that distant dystopian point is reached, I do hope | we develop ways for paralyzed people to regain sensory | control and live normal lives. | ElFitz wrote: | > At a certain point, we lose our individuality and become | subsumed by the global AI, game over. | | In a way, we already have. Each and every one of us is | constantly influenced by and influencing untold numbers of | people, and most beliefs and knowledge are more or less | "standardised". | | Most people follow the school -> (college ->) 9 to 5 -> | retire consensus, and even those who believe themselves to | be outliers actually behave how outliers are expected to | behave, all of us furthering the goals set by others, some | of whom died long ago. | | Actual individuality is quite rare and usually expressed at | a very small scale. 
| sigg3 wrote: | I think your identification of individuality with a | measure of uniqueness is a mistake. | | We're not individuals apart from others; the others are | presupposed. The I is an abstraction in the sense that it | presupposes social terms to understand itself. You need a | reference, like culture, to be correctly understood as | "alternative" (although 'peripheral' in terms of some | specific aspects is more correct). | | If you're not in a community at all, you're not going to | reproduce. | | Actual individuality is merely recognizing the exercised | autonomy of an agent. You are still an individual even | when you behave according to existing mores you did not | create. | | (The extra social esteem bestowed on relative difference | is a cultural trend and a historical phenomenon. It does | not determine our species, only our current conditions | and predicaments.) | ElFitz wrote: | I fail to see how your arguments contradict my words. | | Each ant constantly exercises its autonomy. Would you | nonetheless argue that it has any individuality in the | way the gp intended the word? | hypertele-Xii wrote: | How would individuality be expressed at large scale, | anyhow? Funny thought. | ElFitz wrote: | By figuring out and trying new ways of being instead of | seeking to conform to archetypes. | | By trying out new ways out of repeating situations and | creating new behaviours instead of either repeating the | same habits or trying to adopt someone else's response to | them. | | At an individual's scale, those would be large and have | big impacts. | | Much more so than the colour of my living room's wall, my | type of car or defining myself by wearing either shirts | or T-shirts and trying to impose on everyone else what I | consider to be professional or unprofessional. | badjeans wrote: | What's so dystopian about that? 
| cercatrova wrote: | > He rips off his headset and goes to unplug the computer | because fuck this game, all the DLC clearly ruined it. | | > The most poignant scene I can think of: in 2001 A Space | Odyssey, the character flies on a Pan-Am spaceship to the | moon, and then goes into a phone booth to make a phone call. | | Interesting that you commit the same fallacy as the parent | talks about: you talk about all this complexity in biotech | but then assume that there's going to be a headset with a | computer in order to connect to the simulation, rather than | it being directly implanted into one's brain. | IgorPartola wrote: | I added that for a bit of color :) | MetalGuru wrote: | > Now take it a step further. Inside the simulation someone | introduces Hard Seltzer. The in-game year is 2021 and a | player just read that some NPC somewhere had just created a | brain/computer interface. He rips off his headset and goes to | unplug the computer because fuck this game, all the DLC | clearly ruined it. | | Lmao. Hard seltzer isn't that bad | IgorPartola wrote: | It's proof we are in a simulation. What else but a random | item generator could have come up with it? | PartiallyTyped wrote: | > Now take it a step further. Death isn't death. Like the | paradox of rebuilding a ship one plank at a time, your mind | stops existing in your body and occupies a collection of | other bodies. | | Douglas Hofstadter talks about this in "I am a Strange Loop" | [1], but he argues that our 'soul fragments', as he calls them, | are a representation of ourselves in others. Depending on how | large a fragment of them we hold in our brain, we can perceive | the world as they do, and think as the other person. They get | to experience the world through us, in a sense, given that we | 'allow them to'. | | It is an interesting idea, and helps us reconcile ourselves | to the death of our loved ones. | | [1] https://www.amazon.com/Am-Strange-Loop-Douglas- | Hofstadter/dp... 
| phreack wrote: | This reads like an Asimov story! He really did have some very | well-informed predictions that seem accurate even nowadays. | TeMPOraL wrote: | > _Now take it a step further. People want to get away from | this hive mind concept._ | | Why would they? | | All the preceding paragraphs sound like the Borg collective, but | hey, if it's voluntary, it actually doesn't sound bad. As | long as we can keep adtech away. | bserge wrote: | Yeah... _if_ it's voluntary. | | Working is voluntary. | | Sure, you want to move to a farm far away from all the | madness or you want to sit in a workshop designing robots | all day, but you need money, so you "volunteer" to work | some job you barely tolerate in or near a city for all of | your best years. | | The ads are just constant slaps in the face. | TheOtherHobbes wrote: | We were also at the bottom of the s-curve of social malware and | adtech, which turned out to be the real outcome of commoditised | computing. | | The rhetoric of personal creativity and freedom was flattened | by something far less interesting and more toxic. | | I wouldn't be so keen to rush headlong into a bioware-connected | world until that problem is solved. | frashelaw wrote: | This indeed is also my main fear. All the nefarious, | intrusive methods of advertising will get a hundred times | worse, but now be directly beamed into our brains. Our | hypercapitalist economy will only accelerate and exacerbate | this. | fnord77 wrote: | we have something today that's 3-4x faster than pecking on a | smartphone keyboard: voice to text. | StavrosK wrote: | I extremely doubt that; I type faster with predictive text | than I speak. | throwawayboise wrote: | Entirely depends on the individual. With a virtual on- | screen keyboard, I can rarely type even one word without | error. It's like my fingertips are just too big to hit the | keys accurately. Swipe-keying is somewhat better/faster but | I'm much better with real physical keys. 
Speech-to-text | used to be pretty bad but with my current phone it's better | than typing, for me. The downside is I hate talking to | computers. | StavrosK wrote: | Have you used SwiftKey? I find it corrects 99% of my | errors, to the point where I just press keys in the | vicinity of what I want to type and it comes out correct. | ungamed wrote: | But only 90% accurate; that's 10% not accurate enough. | derefr wrote: | > we had been near the top of an s-curve when it came to space | tech | | Near the top of an s-curve in _getting-to-space_ ("heavy lift") | tech, more like. | | I'd say the field of actual _in-space_ tech (i.e. technology | that takes advantage of low-gravity / low-pressure / low- | oxygen environments) is still pretty nascent. We still treat | space as "Earth, minus some guarantees" (i.e. something we have | to _harden_ for) rather than doing much with the unique | _benefits_ of space. | | It'll probably take having a long-term industrial base | operating _from_ space to see much change there, though. | | Imagine, for example, living on a space station and having | your food cooked using cooking techniques that assume cheaply- | available vacuum and/or anoxic environments. :) | madpata wrote: | > cooking techniques that assume cheaply-available vacuum | | That also assumes that the atmosphere you're venting for that | "cheap vacuum" is cheap as well. | hypertele-Xii wrote: | You don't have to vent. You can just compress it into storage. | Negitivefrags wrote: | So to get your cheap vacuum you first have to make a | vacuum in a chamber by sucking the air out before opening | it to space? You can just skip the open-it-to-space part | and do it on earth! | burlesona wrote: | Yeah I agree, there are "generations" of technology, and I | think that people in the 60's looked at the progress of | transportation tech from 1910 to 1960 and thought "at this | pace we'll all be zipping around the solar system like it's | nothing by the 2000s." 
It was not easy to form an intuition | of why the first generation of space tech was going to hit | physics-imposed limits that would "slow that progress." | | To be fair we still made lots of progress with space tech | after the 60's, and I think via SpaceX and others we are | hopefully now starting a new S-curve unlocked by cheaper | access to LEO. | akhilpotla wrote: | An interesting thing to think about is how people used to | think of getting somewhere. Now we think of things coming | to us. In a way, we did achieve the "zipping around", it's | just that we did it via the internet and wireless | communication. Of course, it is not the same, but it is | similar. | wongarsu wrote: | 60s space tech also hit a lot of limitations of the computers | of the time. SpaceX's Super Heavy is in some ways a | reimagining of the N1's first stage, but with control software | that makes it viable to deal with engine failures in | flight. | | But a big problem is also that the progress in space tech | wasn't organic. There was no economic incentive; it was | driven purely by propaganda, national pride and political | goals. Once that fell away it took half a century for | economic use cases to catch up to a point where private | investment was viable. | PeterisP wrote: | "food cooked using cooking techniques that assume cheaply- | available vacuum" - what would those be? | | Vacuum isn't something that's hard to get if you need it; all | you need is a motor driving a pump, so if any industrial food | process or one of the fancy restaurant chefs had a | good use for it, they would already be using vacuum in | cooking. A kitchen vacuum sealer is <$100 (I'm assuming that | would count as "cheaply available"), though it's not | particularly useful for most other cooking purposes | that come to mind. | blisterpeanuts wrote: | Space tech might be considerably further along today, had we | in the U.S. not limited our R&D after 1969. 
A reusable | shuttle that cost over $1B per flight was an interesting | innovation in 1981, but it actually represented a dead end | for the U.S. rather than the beginning of a new era of | exploration. | musingsole wrote: | Once the moon mission was accomplished, we lacked a clear | target on which to stay focused. Build cool things, but | what for? | | Researchers could concoct all sorts of narratives, but it'd | lost the spark that held the layman's attention and | permitted the spend of political capital. | wongarsu wrote: | From what I gather from the era, the next goals were | pretty clear, even to the general public: "go to Mars" | (or in the Soviet case "go to Venus"), and then go to | Alpha Centauri. | | If the Soviets had won the race to the moon this | might even have happened. But instead the Soviets decided | to focus on space stations, and the US declared | themselves winner and did largely nothing (by rejecting | NASA's Space Transportation System proposal, which was | also about a space station and a way to get there | cheaply). | lumost wrote: | The space shuttle was a bad system for a number of | political reasons. Its per-launch cost was comparable to | developing new heavy lift vehicles or new missions to | the outer planets. | | Effectively, NASA was spending their R&D money on opex and | not getting any political capital back for it. Tragically | the program had been set up with the promise of a space | truck; if NASA admitted they didn't deliver a space truck, | Congress would have been unlikely to fund a new program. | Had the budget been allocated entirely to either | technological development or novel explorations, America's | willingness to fund NASA could have been substantially | different. | hhs wrote: | There's a startup called MindPortal (YC W21) that seems focused | on using tech to review thoughts and control things: | | https://www.ycombinator.com/companies/mindportal | d_silin wrote: | That's actually a good prediction. 
Biotech is an untapped field | for moonshot-scale breakthroughs; even the current mRNA | vaccines are only a small part of what will be possible in 10-15 | years. | jszymborski wrote: | I think this is why the grand-parent is perhaps making an | error in thinking that e.g. space travel or computing are | silos unto themselves. | | Having worked in wet labs and deep learning labs, I think | we've a lot to gain from increasing our ability to simulate | experiments in silico and automate biological processes. | | A lot of the room for improvement has been carved out by | improvements in machine learning. | burlesona wrote: | I agree with you :) | | > But in the same way that the mature industrial revolution | made high-precision manufacturing possible which made | incredible computers possible, our mature computer | technology is now enabling incredible progress in biotech. | jszymborski wrote: | True, I missed that! | BiteCode_dev wrote: | With the track record we have of power and technological | abuses, I'm not sure I'm excited about having a direct | interface to my head. | | In fact, I would be more excited about IRL laws toning down | what some are doing with the current indirect interfaces to my | head, such as fake news, propaganda, advertising and | manipulations of all sorts. | PartiallyTyped wrote: | I agree with you, and I'd say that we are perhaps at the | bottom, or the middle, of the S-curve in software. Despite all | the technological progress in hardware, our software is very | slow and buggy. We end up increasing complexity in the name of | 'productivity' and going up in abstractions, but we end up | lacking fine-grained control and performance in modern systems. | | I hope that we see a paradigm shift back towards writing robust | and performant systems instead of stacking abstractions. 
Sure, | monads and transformers are all fun to use, make code concise | and are very satisfying when they compose well, but what's the | hidden cost, and is it worth it? | | As a user who encounters bugs at a disproportionately high | rate, I'd say no. The cost of ever-increasing abstraction is | not worth the trade-off. | UnpossibleJim wrote: | I think this has more to do with management timeline | expectations and income valuation than software development. | I do understand that they're almost inseparable, as software | has to make money. But timelines need to take into account | the "craft" of software creation and not just the desk hours, | for lack of better terms =/ Short timelines and quick | turnarounds don't leave time for refactoring and quality code | creation. First passes tend to be the final draft, more often | than not. | jolux wrote: | Your experience of using Haskell is that it causes more bugs | than less abstract languages? | PartiallyTyped wrote: | No, on the contrary, my experience with Haskell is that my | code is mostly bug free, but ends up less performant | because you can accidentally create huge thunks in the heap, | and it consumes too much mental stamina. | | However, there exists an intermediate plane of abstraction, | over C and under Haskell, that is absolutely horrendous and | results in all sorts of weird bugs and unpredictable | situations. | jolux wrote: | What about Rust? | tachyonbeam wrote: | I've heard people say that Haskell would be better with | eager rather than lazy evaluation, because of the mental | burden that laziness causes. IMO that doesn't seem like a hard | problem to solve. We can design pure functional languages | with eager evaluation. | zozbot234 wrote: | Haskell would be better if it used polarity and focusing | to make _both_ strict and lazy evaluation first-class | citizens in the language. A strictly-evaluated | counterpart to Haskell is just ML, which we've had since | the 1970s. 
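The space-leak problem PartiallyTyped describes — lazy evaluation silently building up a chain of deferred computations (thunks) in the heap until something finally forces them — can be sketched even outside Haskell. A minimal analogy in Python, where closures stand in for thunks (this is an illustration of the concept, not how GHC actually represents thunks):

```python
def lazy_sum(n):
    # Build a chain of deferred additions, a "thunk" chain:
    # nothing is computed until the returned callable is forced,
    # and the whole chain occupies O(n) heap in the meantime.
    thunk = lambda: 0
    for i in range(1, n + 1):
        prev = thunk
        # Bind prev/i via default-argument-style capture to get a
        # fresh deferred "prev() + i" at each step.
        thunk = (lambda p, x: (lambda: p() + x))(prev, i)
    return thunk  # forcing a very deep chain can blow the stack

def strict_sum(n):
    # Eager evaluation: the accumulator is always a plain int,
    # so memory stays O(1) and there is nothing left to force.
    acc = 0
    for i in range(1, n + 1):
        acc += i
    return acc

print(strict_sum(100))    # 5050
print(lazy_sum(100)())    # 5050, but only after forcing 100 frames
```

Forcing a long chain all at once is the analogue of a Haskell space leak or stack overflow when a deep thunk is finally demanded; keeping the accumulator evaluated at every step is what Haskell's `foldl'` or bang patterns achieve.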
| jose_zap wrote: | You can make any module strict by using the XStrict | language pragma in Haskell. | tachyonbeam wrote: | I've coded in OCaml, which isn't purely immutable, but | rather immutable by default. Because it has mutability, | that removes the focus on immutable data structures, | making it a very different language. | hypertele-Xii wrote: | It makes sense though: the cost of hardware per unit of | performance has nosedived, whereas the cost of human labor has | probably gone up. Why pay devs to write fast code when you can | buy faster computers faster and cheaper? | | I wonder when the trend reverses. It must, at some point, | mustn't it? | PartiallyTyped wrote: | > Why pay devs to write fast code when you can buy faster | computers faster and cheaper? | | Because you can do more in the same time. Speed and | responsiveness are features. The issue is, the general | population has come to accept that bugs are not only | acceptable, but just that, 'bugs' that you can shoo away by | restarting the machine/program. | | > I wonder when the trend reverses. It must, at some point, | mustn't it? | | I hope so. As we have seen with Spectre and now AMD's | equivalent, speculative execution is risky and very complex | to get right. We can't rely on ever-increasing complexity | in CPUs and fabrication processes; at some point quantum | mechanics will bite back. | ChuckMcM wrote: | Pretty much. I'm less excited. Because I was super excited 50 | years ago when we were at the bottom of the S-curve on | computers, and I had no idea they would eventually be | commandeered to rip off my parents. I think of it as a "wider" | view of the impacts vs a "narrow" view of the benefits. | | For me, the nagging question is what happens when biotech has | figured out biological systems to the point that everyone stops | aging/dying[1]. Do those 10 billion people, give or take, | become "the humans" for the rest of time? 
| | [1] Lots of evidence that there is no "reason" for cell | senescence; it's just an evolutionary afterthought (you | succeeded in reproducing, now go die) and like other things can | be "fixed." | pmichaud wrote: | I wish I could find the source, but when I've looked into | this in the past I was reasonably convinced that the | population would go up a bit, but not catastrophically, with | the basic idea being that people die all the time for lots of | reasons, only one of which is "aging" (aging is multiple | things, blah blah). So people's lifespans would be much | longer on average, but not infinitely long. Something more | like 300 or 400 years, with a pretty big standard deviation. | bopbeepboop wrote: | I see it the other way: | | We've had a revolutionary S-curve with computing/artificial | reasoning in inventing transistors -- but we know we're still | at the bottom of two related S-curves: quantum computing (an | exponential speedup for many problems of interest) and | IOT/smart systems, where our automated reasoning is embodied in | something. We know somewhere up those curves lies the ability | to make new kinds of minds. | | I think both of those will prove to be bigger than bio- | science... and moreover, bio-science will require them to a) | do the experiments and b) find uses for the technologies. | | I think human augmentation will turn out to be like | spaceflight: humans are near the top of their S-curve already. | | Instead, I think biology research won't come into its own until | AGI research does and we have an idea of how to make _new_ | biological systems. | | Of course, that might kill us all. Horribly. | oceanghost wrote: | > but near the bottom of the s-curve of computers | | I think the most interesting exemplar of this is Star Trek. As | everyone knows, Star Trek is based on 19th century naval | warfare. The battle scenes are hilarious-- a captain calling | out orders at human speeds to his crew that executes them. 
| | It's been obvious for 40 years that computers would do the | fighting, but in 1966 it wasn't obvious, so the paradigm was | Horatio Hornblower. | ddalex wrote: | Star Trek is not heavy on fighting anyhow, so these battles | are plot devices... how would a computer executing and | finishing a battle even before the humans figure out that | something is happening help with the plot... | | in the same vein, why would strikes on a ship result in | sparks on the bridge... | hyko wrote: | I guess I don't see what is supposed to drive the S curve in | biotechnology over the next 60 years. Advanced information | technology is a necessary but not a sufficient condition for | biotech mastery, and the other conditions are just not in | place. | burlesona wrote: | Just to name one thing, I think CRISPR is likely to be seen | as a technological building block as fundamental as the | transistor. | vkou wrote: | > So back in the 60's, people looked back at the progress over | the previous decades, imagined the future, and thought about | space. We'd have commercial space flight any day now. | | We _do_ have commercial space flight. Commercial space flight | has exploded over the past few decades. The sky is full of | communication satellites, imaging satellites, sensor | satellites, even the occasional vanity satellite. | | Everything worth doing in space, we're doing. | | What we don't have is things that aren't worth doing in space, | like Pan-Am flights to the moon. | noir_lord wrote: | I think the big revolutions are going to be in biology - fast | supercomputers allow us to make advances we couldn't have | made any other way. | | Computers are an enabling technology for basically every other | advancement - in fact it's hard to imagine breakthroughs at | this point that don't involve computers in some way - even | 'just' as a tool for collaboration. | hypertele-Xii wrote: | 'Computers' are more like industrial consciousness, though. 
| What happens when we can grow brain cells on demand (and | support their function)? Imagine upgrading your _self_ like | you upgrade your computer. More brain. Less sleep. More | hands. Armored skin. Photosynthesis. | | Imagine you're a shapeshifter. You can copy any aspect of any | living organism on Earth, integrate tech directly into your | body, and the smartest people are coming up with new useful | things to add. | bserge wrote: | Aaaand it's all banned. Enhancing your performance? Heresy! | | That's assuming such research is allowed in the first | place. Humans are innately repulsed by biology, by | organisms. Imagine an arm with its skin ripped open to the | bone, gushing blood everywhere. Imagine your guts hanging | out from your belly. | | And the ethical/moral rules we built over thousands of | years will not allow the majority to sit by and watch some | atrocious (from their POV) experiments. | emayljames wrote: | Also, taking economics to a new level: once we get past | scarce-resource/pollution-based short-term thinking, and a | volatile economy structured around the profit motive is no | longer viable, supercomputing could easily take up the task | of organising and fairly structuring the economy. | frashelaw wrote: | > From this article: no more keyboards / mice? No typing, you | can "think" to write. What about recording your own thoughts | and then playing them back to yourself later? How much further | can that tech go? And there is so much more beyond BCI, we are | just understanding the basic building blocks in many areas, but | making amazing progress. | | While this itself is certainly an interesting concept, I'm | worried about its consequences when implemented in our | hypercapitalist economy: we'll almost certainly, along with | this incredible interaction technology, have advertising beamed | directly into our consciousness or something similarly | intrusive. 
It's honestly terrifying how much worse intrusive | tracking and advertising would get with this technology. | ChrisMarshallNY wrote: | I agree. | | Your comment made me think of Natalie Wood's last movie: | "Brainstorm." | | https://en.wikipedia.org/wiki/Brainstorm_(1983_film) | jmfldn wrote: | This stuff is no doubt promising, not least for disabled people, | but this Neuralink-type stuff seems terrifying. Anyone who is | excited about having an Internet connection into their brain | needs their head examined. We humans don't exactly have a good | track record of avoiding awful unintended consequences when we | introduce new tech, no matter the benefits. | | It all starts out with an innocent sales pitch, "we're just | connecting people" etc, but whatever we build ends up | reflecting human nature and our social and economic context in | all its myriad ways, good and bad. | | We don't control technology or even have the foggiest idea how | anything we build will pan out. We just make it look like there | was a masterplan after the fact when in reality it was a headless | blunder. | | I'm the sort of person who yearns for computing to be done | sitting on a chair looking at a big screen. I don't even like | mobile internet devices that much in terms of what they've done | to us. Beaming this straight to our brain? I'm out, thanks. | oliv__ wrote: | I've been thinking about this quite a bit: there seems to have | been a shift sometime during the last 50 years where instead of | computers and computing being a tool to be mastered and | controlled by humans, computers have switched the power | dynamic and rendered us humans the tools. | | To me, that clash of "visions" was supremely represented in | Apple's "1984" advert. | | I'd love for computing to get back to that utopian vision of | the bicycle for the mind, but if you look around these days it's | more of a train-wreck of the mind. 
| etblg wrote: | You may like the Adam Curtis documentary series "All Watched | Over by Machines of Loving Grace", which basically posits | that same thought. | jmfldn wrote: | "I'd love for computing to get back to that utopian vision of | the bicycle for the mind but if you look around these days | it's more of a train-wreck of the mind." | | Same here. I'm increasingly not interested in machines or | software that I don't control at least to some extent, that | I'm not free to modify, that aren't about empowerment, | learning and creativity for the user. It's not just based on | a personal desire, although it is partly that. It's the only | way we can stay free. This isn't just some high-minded hacker | ideology; it's literally about liberty. | nightowl_games wrote: | Last night someone put on a history documentary about Oliver | Cromwell's invasions of Ireland. It showed that the spread of | the printing press around that time meant that news of | the Irish rebellion had a further reach inside England. | Oliver Cromwell channeled that nationalism into his own | political gains and caused a wake of destruction through | Ireland. | | It's exactly the same phenomenon as we see now with social | media. | | This is the natural cycle of human innovation in communication. | | I want these interfaces to happen despite the growing pains we | will have. | | Besides, it doesn't matter what we want. They will happen | either way. | | Think about it in terms of larger time scales and the growing | pains of new technology seem like a more worthwhile cost. | | I know it's dismissive to label real human suffering and death | as growing pains, but this stuff is inevitable, the results are | predictable, and on a large enough timescale, the technology | will yield immense fruit. | 0x4d464d48 wrote: | "Anyone who is excited about having an Internet connection into | their brain needs their head examined." 
| | Perhaps you can have this done with a POST request in the | future and kill two birds with one stone. | unchocked wrote: | So the innovation here is not the neural probes (200 neurons, | same state of the art), nor the connection from the neural probes | to outside the skull (physical port, not wireless), but only that | there is a wireless dongle that sits on top of the physical port | and connects to a server somewhere? Yawn? | lallysingh wrote: | > In the current study, two devices used together recorded | neural signals at 48 megabits per second from 200 electrodes | with a battery life of over 36 hours. | | So roughly 10 Gbps, not bad. But this isn't about the raw tech. | | For people doing medical research, it lets them gather data all | day instead of just during appointments. That's the big change. | tyingq wrote: | I didn't read it as 48 Mbps per signal, but rather 48 Mbps per | device. Did I read it wrong? | maddyboo wrote: | I actually read it as 48 Mbit/s for the 2 devices combined, | or 24 Mbit/s per device. | tyingq wrote: | Ah, yeah, thanks.. | | _"In the current study, two devices used together..."_ | o_p wrote: | I doubt that's physically possible; it's like trying to access | your RAM by reading EM fluctuations from the outside of your | computer. Sure, you can see when there's a lot of read/write | activity, but there's no way you can read memory. | tyingq wrote: | It seems like it's closer to measuring which areas of the | "die" are busy. Like "the MMU is doing something hard" or | "this adder circuit in the ALU is idle", etc. | tyingq wrote: | I think, yeah, "Yawn" from a tech perspective. But for average | people, the idea that these folks are hooked up and streaming | from their homes, with minimal visible hardware, is new. It | feels more sci-fi than someone in a research center with a ton | of wires on their head. 
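The per-electrode figures in the exchange above are easy to sanity-check. A back-of-envelope sketch, under two assumptions the article does not state (that the 48 Mbps is the combined rate for both devices, and that samples are 16-bit):

```python
# Sanity-check of the bandwidth numbers quoted above.
# ASSUMPTIONS (not from the article): 48 Mbps is the aggregate
# across both devices, and each sample is 16 bits.
TOTAL_KBPS = 48 * 1000   # 48 Mbps aggregate, in kilobits/s
ELECTRODES = 200
BITS_PER_SAMPLE = 16     # assumed ADC resolution

per_electrode_kbps = TOTAL_KBPS / ELECTRODES                  # 240 kbps each
sample_rate_hz = per_electrode_kbps * 1000 / BITS_PER_SAMPLE  # samples/s/channel

print(per_electrode_kbps)  # 240.0
print(sample_rate_hz)      # 15000.0, i.e. ~15 kHz per channel
```

~15 kHz per channel is in the right ballpark for raw spike-band recording, which supports the "48 Mbps combined" reading over the "per signal" one. It also shows why all-day home recording is the real news: 48 Mbps sustained over a 36-hour charge is roughly 48e6 × 36 × 3600 / 8 ≈ 780 GB of raw data.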
| loliko wrote: | All these great brains working on such commodities when we have | climate change, poverty, authoritarianism and other much more | pressing issues. It seems that we live in an era where it's | easier to make humans talk with machines than to help humans | talk peacefully with each other. | whowe1 wrote: | Neuralink's tech is definitely more advanced, but they haven't | gotten it into humans yet and there are still issues with the | longevity of the threads inside an actual human brain. ___________________________________________________________________ (page generated 2021-04-04 23:00 UTC)