At MindAffect our mission has been to “open up new dimensions of interaction” by developing technologies that allow users to control computers directly with their brains. So far we have pursued this mission by:

  1. working directly with patients and patient support groups (such as ALS-liga Belgium) to deliver BCI technologies directly to end users,
  2. partnering with groups in sectors interested in adding brain control, such as VR gaming or home automation, to develop product prototypes,
  3. selling complete BCI development kits, via our Kickstarter campaign, to makers and hackers interested in developing brain-controlled projects.

Today we are pleased to announce the next stage in our mission to let more people add brain control to their own projects: the release of the open-source version of our core brain-computer interfacing technology, which you can download now from github.com/mindaffect/pymindaffectBCI.

Out of the box, this software lets you use brain control to write words, play games, control the lights in your house, and much more. More importantly, however, we also provide the software, documentation, examples, and example data to let you easily develop your own new methods of brain-controlled interaction.

Key highlights of the mindaffectBCI are:

  • Batteries Included: we include complete, working, high-performance examples from the start, so you can get to a working typing BCI (assuming you have the hardware) as rapidly as possible. We also include APIs and examples showing how to use the BCI from the most popular languages and development environments: Python, Unity, Swift, and Java. Do you want to add brain control to your game? Then use our Unity plugin.
  • Python-based: the core algorithms are written in the extremely popular Python programming language, so we are cross-platform by default. Do you want to try new machine learning techniques? Then just plug in any scikit-learn-compatible classifier and try it on our public datasets.
  • Designed to be changed: a modular, communicating-processes design makes it easy to adapt, add, or even replace any component to fit the BCI to your own needs. Do you have an unsupported amplifier? Then just add a new amp driver. Do you want to control your TV? Then just plug in a new output module. How about using a 3D LED cube for stimuli? Then just replace the presentation module. Find the noisetagging stimuli too annoying? Then change the stimulus sequence to a P300 or SSVEP design. Or how about going completely non-visual? Then use a Raspberry Pi to drive tactile stimulators.
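To make the output-module idea above concrete, here is a minimal sketch of what a custom output module could look like. The class and method names are hypothetical illustrations, not the actual pymindaffectBCI API; in the real system such a module would subscribe to the decoder's selection messages rather than be called directly:

```python
# Hypothetical sketch of a custom output module: a callback that maps
# BCI selections to actions. Names are illustrative, NOT the real
# pymindaffectBCI API.

class TVOutputModule:
    """Maps predicted selection IDs (as a decoder might emit) to TV commands."""

    def __init__(self):
        # object_id -> action; a real module would send IR / network commands
        self.actions = {
            1: "power_toggle",
            2: "volume_up",
            3: "volume_down",
        }
        self.log = []

    def on_selection(self, object_id: int) -> str:
        """Called when the BCI decoder commits to a selection."""
        action = self.actions.get(object_id, "ignored")
        self.log.append(action)
        return action


tv = TVOutputModule()
print(tv.on_selection(2))   # volume_up
print(tv.on_selection(99))  # ignored (unknown selection)
```

The point of the modular design is that swapping "TV" for "wheelchair" or "music player" only touches this one component; the decoder and presentation modules are unchanged.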

Target Users

While the mindaffectBCI can be used by anyone, it has been developed with particular users in mind:

  • Patient Technical Support Teams: One of the key motivators for the MindAffect team is to make BCIs which improve people’s lives. We help some patients ourselves, but cannot support every possible patient and environment. Instead, we provide the tools so that patient support teams can themselves fit the BCI to their patients’ needs. For this, we provide a basic text-communication application out of the box, with guidance on how to customise it for their users’ needs, for example by integrating the BCI with an existing AT system, or changing the display layout to support other ‘alphabets’.
  • Game Designers: Do you want to add brain control to an existing game? Or make a new game with brain control as a novel interaction modality? You can easily do this, in a cross-platform way, as shown here, using our Unity plugin.
  • Hackers and Makers: Do you want to add brain control to your Raspberry Pi robot, Lego robot, Sphero, or drone? Now you can, either by using a simple control app on your laptop, or (more fun) by adding LEDs or LASERS(!!!) to your robot for direct control. We provide examples for driving LEDs from a Raspberry Pi, and are happy to help you use other hacker boards (such as the micro:bit) or even the LEDs on your drone.
  • Neuroscience Students and Researchers: A BCI is an excellent tool for learning the basic neuroscience of how the brain responds to stimuli. For these users we include tools for on-line, interactive visualization of stimulus-specific responses. Importantly, these visualizations use the same technology as the on-line BCI, which applies machine learning techniques to improve signal quality and to separate individual responses from overlapping brain activity. This gives students a clearer view of the brain response in a short amount of time, allowing interactive learning and experimentation with stimulus parameters and mental strategies.
  • Machine Learning Engineers / Data Scientists: Modern BCIs (including our own) rely heavily on machine learning techniques to process the noisy data gathered from EEG sensors and to cope with the high variability in responses across individuals and environments. MindAffect firmly believes that the key to enabling the new BCI applications we all want is a combination of more sophisticated machine learning algorithms and larger, more diverse datasets on which to train them. The mindaffectBCI helps in both directions. First, it makes it easier to rapidly gather relatively large EEG datasets, using consumer-grade EEG devices and applications built in your preferred development framework, for example using a Raspberry Pi, headphones, an EEG headband, and an OpenBCI Ganglion to measure the brain’s response to different music types. Second, it provides example data (see kaggle-mindaffectbci) and a sklearn-compatible interface for machine learning developers to experiment with different learning algorithms, both in larger off-line dataset analyses and directly in on-line applications.
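As an illustration of the sklearn-compatible workflow described above, the sketch below cross-validates a standard scikit-learn classifier on synthetic, epoched EEG-shaped data. The data are a stand-in invented for this example; the real example datasets live at github.com/mindaffect/pymindaffectBCI and on Kaggle:

```python
# Sketch: plugging a scikit-learn classifier into an off-line BCI analysis.
# The data here are synthetic stand-ins shaped like epoched EEG
# (trials x channels x samples), not real recordings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 200, 8, 64

# Two classes: "attended" trials get a small simulated evoked response.
X = rng.normal(size=(n_trials, n_channels, n_samples))
y = rng.integers(0, 2, size=n_trials)
X[y == 1, 0, :] += 0.5  # add the evoked response on one channel

# Flatten each trial to a feature vector so any sklearn estimator applies.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X.reshape(n_trials, -1), y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Because the interface is plain sklearn, swapping `LogisticRegression` for any other sklearn-compatible estimator is a one-line change, which is exactly the kind of experimentation this bullet describes.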

Get it now!

So, are you interested in neural technologies, but have been unable to try them yourself? Did you try a BCI and find its performance disappointing? Or perhaps you have a great idea for a brain-controlled application which you didn’t know how to make work before? Then try the mindaffectBCI and start opening up new dimensions of interaction with our open-source brain-computer interface software.