There are no good substitutes for understanding.

Applications of brain-computer interfaces

One of the things I find frustrating about BCIs is that everyone is working hard on them without a good idea of what exactly one would do with them (aside from the most obvious things like ‘robot hand’): It’s very handwavy: “it’ll be a memory prosthetic increasing IQ 20 points!” ‘yeah but how’ ‘uh’. I don’t need a detailed prototype laying out every step, even just a generic description would do. What’s the VisiCalc or visual text editor of BCI? -gwern on reddit

Assuming read-only BCIs:


Health & brain activity interactions:

  • What effects do nutrients have on brain activity? Is there a set of nutrients you can take to balance hormonal or other cognitive deficits, or bottlenecks?

Emotional regulation:

  • continuous visual feedback on your current state of mind (eg red for anger, blue for relaxed)
  • Military application: “tapping out” commanders who are too lost / too tired / too anxious to be making solid decisions (there was a DARPA paper about this)
  • Public speaking: hook up BCIs to 3 randomly selected volunteering audience members. Visualized emotions get fed to the speaker. Dynamic response & crowd control based on instant feedback
  • Lie detector which actually works.
  • Traffic accident reduction: eg compulsory mental state check required to start your vehicle
  • “State of mind navigator”: pre-train the system on a wide range of activities, dial in the desired cognitive state, and the system automatically finds the 3 x 5 minute activities which deterministically put you in that state of mind
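
A minimal sketch of what the “navigator” lookup could be, assuming each activity has a pre-recorded brain-state vector and states are compared by cosine similarity (all names and data here are hypothetical):

```python
import numpy as np

def pick_activities(target_state, activity_states, k=3):
    """Rank pre-trained activities by cosine similarity of their
    recorded brain-state vectors to the desired target state."""
    names = list(activity_states)
    mat = np.array([activity_states[n] for n in names], dtype=float)
    mat /= np.linalg.norm(mat, axis=1, keepdims=True)  # unit-normalize rows
    t = np.asarray(target_state, dtype=float)
    t /= np.linalg.norm(t)
    scores = mat @ t                    # cosine similarity per activity
    order = np.argsort(-scores)         # best match first
    return [names[i] for i in order[:k]]
```

The real problem, of course, is getting stable, comparable state vectors in the first place; the lookup itself is trivial.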

Cognitive improvements:

  • visualize skill-relevant cortex activity across multiple different types of training. Argmax learning: put pressure on the training which generates the most skill-relevant cortex activity
  • graphing temporal fluctuations in your daily routine, and determining the best-performing times for different activities
  • Improving focus / productivity / motivation: mapping out which neural activity is correlated with it, and working backwards along the causal chain; then changing environment / inputs / activities to steer cognition towards it
  • Reducing cognitive biases: automatic detection & alerting on ugh fields, flinching away from a thought, motivated reasoning / rationalization
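
The argmax step could be as simple as averaging measured activity per training regime and picking the winner (a toy sketch; the regime names and the activity measure are hypothetical):

```python
from collections import defaultdict
from statistics import mean

def best_training(sessions):
    """sessions: list of (regime_name, skill_relevant_activity) pairs.
    Average the measured activity per regime and return the argmax."""
    by_regime = defaultdict(list)
    for regime, activity in sessions:
        by_regime[regime].append(activity)
    return max(by_regime, key=lambda r: mean(by_regime[r]))
```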

Information retrieval:

  • “BCI records your global activations as you read that Reddit post about deep learning augmented by EEG/MRI/etc data; a year later while reading HN about something AI, you want to leave a comment & think to yourself ‘what was that thing about using labels from brain imaging’ which produces similar activations and the BCI immediately pulls up 10 hits in a sidebar, and you glance over and realize the top one is the post you were thinking of and you can immediately start rereading it.”
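
A toy version of that retrieval loop, assuming each read gets logged as a fixed-length activation vector and lookups use cosine similarity (class and names are illustrative, not any real API):

```python
import numpy as np

class ActivationIndex:
    """Toy index of (document, activation-vector) pairs; query with a
    current activation vector to get the most similar past reads."""
    def __init__(self):
        self.docs, self.vecs = [], []

    def record(self, doc, activation):
        v = np.asarray(activation, dtype=float)
        self.docs.append(doc)
        self.vecs.append(v / np.linalg.norm(v))  # store unit vectors

    def query(self, activation, k=10):
        q = np.asarray(activation, dtype=float)
        q /= np.linalg.norm(q)
        scores = np.stack(self.vecs) @ q         # cosine similarities
        return [self.docs[i] for i in np.argsort(-scores)[:k]]
```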

Augmenting execution:


  • Same as with information retrieval: you think of a calculation or operation you’d like to execute; the BCI confirms “strength of want” and executes it if above threshold

As Input:

  • Brain activation decoded into an editable text summary
  • Controlling prosthetics, limbs, vehicles, devices, tools, mouse cursors on screens
  • IoT / smart environment / Neuroergonomics: your entire environment being highly responsive to your inputs -eg you’re slightly cold -> brain activity gets picked up -> temperature gets raised



  • Eyes as webcam: a sufficiently high-resolution reading of the visual cortex could turn real-life experiences into a literal livestream (with sensory data as an added bonus). Tagged annotation of the wearer’s mental state can show how events were interpreted by the person (eg Twitch TV with deep empathy)
  • Translation of intent: take the vector representation of the relevant brain states as mapped to intent expressed in words; extract how the same vector would unpack into words for another person -with the aim of eliminating “lost in translation” / cultural differences, even between speakers of the same language
  • Neural search: take the vector representation of brain intent, map it to vector representations of texts across the Internet, and search for the closest match. Like, literally you think of a thing, and it gives you the exact closest-match article you’ve been looking for
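
One hedged way to sketch the brain→text mapping: fit a linear projection on paired (brain, text-embedding) examples, then do nearest-neighbour over a corpus. Everything here -the linear map, the toy corpus- is an assumption for illustration, not a claim about how a real system would work:

```python
import numpy as np

def fit_projection(brain_vecs, text_vecs):
    """Least-squares linear map from brain space into text-embedding
    space, fit on paired (brain, text) training examples."""
    B = np.asarray(brain_vecs, dtype=float)
    T = np.asarray(text_vecs, dtype=float)
    W, *_ = np.linalg.lstsq(B, T, rcond=None)
    return W

def neural_search(brain_vec, W, corpus):
    """Project a brain-intent vector into text space and return the
    closest document by cosine similarity."""
    q = np.asarray(brain_vec, dtype=float) @ W
    q /= np.linalg.norm(q)
    best, best_score = None, -np.inf
    for doc, emb in corpus.items():
        e = np.asarray(emb, dtype=float)
        score = float(q @ (e / np.linalg.norm(e)))
        if score > best_score:
            best, best_score = doc, score
    return best
```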

For pets:


  • Figuring out what the heck is actually up with cats (other than murder :) ); reducing animal suffering by finding the argmax of that little deterministic deep learning neural network
  • Autonomous fighter dogs: instead of head-mounted controls, read eye visuals directly & direct the trained dog along paths using eg laser flicker

BCIs as input for understanding & imitating the brain:


  • put top performers in their skilled positions while recording mental states. Use this as training data for guided deep learning (with end-to-end training done for the task itself)
  • Security and authentication: your brain’s mental state (eg when loaded with a thought) as a security key
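
The authentication idea reduces to template matching: a minimal sketch, assuming an enrolled brain-state vector and a cosine-similarity acceptance threshold (the threshold value is arbitrary):

```python
import numpy as np

def authenticate(enrolled, observed, threshold=0.95):
    """Compare an observed brain-state vector against the enrolled
    template; accept only above a cosine-similarity threshold."""
    a = np.asarray(enrolled, dtype=float)
    b = np.asarray(observed, dtype=float)
    sim = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return sim >= threshold
```

Whether brain states are stable and hard-to-forge enough to serve as keys is exactly the open question.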

Medical:

(ref work: Brain computer interfacing: Applications and challenges)

  • Prevention: eg people in drug rehab -addiction activates a specific brain region; the device looks for that activation, and intervention is focused on the highest-leverage point, just before people relapse
  • Relatedly: finding activities with direct neural feedback which do not activate those parts of the brain & keeping people entertained in rehab [cognitive behavioural therapy 202]
  • Detection & diagnosis: stroke, epilepsy, dyslexia (in very young children), learning disabilities
  • Rehabilitation and restoration: “brain structures associated with stroke injuries could be reorganized and the damaged motor functions could be restored via neuroplasticity” (see paper above)


Creative output:

  • online canvas for the human imagination: use BCIs as a feedback loop for drawing; continuously spam geometrical objects onto a canvas until it converges to the imagined scene
  • take a human during daydreaming, and create short summaries of the daydream on a continuous basis. Artists write (rough drafts of) novels using nothing but their imagination

Neural loop:

  • Hacking: take a human, hook them up to a BCI while they look at a monitor; DRL generates images, with the feedback signal being any of: generating pleasure in the brain, generating pain, activating target brain regions -up to, and possibly including, brainwashing
  • Infinite entertainment wireheading: take a large library of media, look at the user’s excitement level, and play the item predicted to maximize excitement. Vary the excitement level over time to avoid burnout.
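
The selection rule might look like a greedy pick with a recency discount -a toy sketch; the excitement predictor itself is assumed to exist elsewhere, and the penalty factor is arbitrary:

```python
def next_media(predicted_excitement, recent, penalty=0.5):
    """Pick the item with the highest predicted excitement, discounting
    recently played items to avoid burning out on one stimulus.

    predicted_excitement: dict of item -> predicted excitement score
    recent: set of items played recently
    """
    def score(item):
        s = predicted_excitement[item]
        if item in recent:
            s *= penalty  # dampen items the user just saw
        return s
    return max(predicted_excitement, key=score)
```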

Value system elicitation:

  • currently, you don’t have a really good grasp of your own value system; a “man-in-the-middle of BCI & DRL” system might be able to run high-frequency query-elicit cycles, and generate a high-resolution map of your values
  • Beta uploads: a sufficiently high-res BCI is indistinguishable from having a copy of the mind in the computer
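
The query-elicit cycle could be framed as pairwise preference learning, eg Bradley-Terry-style logistic updates on a linear value vector (a sketch under that modelling assumption; the features and choices here are hypothetical, and real values are presumably not linear):

```python
import numpy as np

def elicit_values(option_pairs, choices, dim, lr=0.1, epochs=200):
    """Fit a linear value vector w from pairwise choices: each query
    shows two option feature-vectors; the BCI reads out which one was
    preferred. Logistic (Bradley-Terry) gradient updates."""
    w = np.zeros(dim)
    for _ in range(epochs):
        for (a, b), chose_a in zip(option_pairs, choices):
            a = np.asarray(a, dtype=float)
            b = np.asarray(b, dtype=float)
            d = a - b
            p = 1.0 / (1.0 + np.exp(-w @ d))  # model's P(prefer a over b)
            w += lr * ((1.0 if chose_a else 0.0) - p) * d
    return w
```

The point of the BCI is in the query rate: elicitation like this is slow via conscious self-report, but cheap if preference signals can be read out directly.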

But, like VisiCalc, I suspect a lot of this is going to depend on the actual specifics of what’s going to be available, what capabilities we have, and what the market will be ready for. (It’s hard to nail down high-confidence bets in advance; ie the above attempted to be more of a MECE collection than a “VisiCalc” prediction.)