MindAffect, an NLC / Donders Institute venture, specializes in applying artificial intelligence and neuroscience to enhance human communication. During our first years of operation, MindAffect achieved its primary goal of creating a groundbreaking BCI product that gives locked-in patients the ability to interact with the outside world by controlling a computer solely with brain signals.
We have since expanded the technology to track, interpret, and transmit a much wider range of neural signals, and we are working with industry leaders to create exciting new applications in next-generation hearing and vision testing, brain health monitoring, and neural communication.
Fast Audiometric Threshold Detection
Our system detects stimulus-evoked potentials in real time, providing a rapid, objective, and highly cost-effective estimation of the entire audiometric curve.
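As an illustration of the idea, thresholds can be read off from evoked-response detections at a set of stimulus levels. The following is a minimal sketch only; the function name, data, and 50%-detection criterion are assumptions for illustration, not MindAffect's actual estimator.

```python
import numpy as np

def estimate_threshold(levels_db, detected, criterion=0.5):
    """Estimate an audiometric threshold from evoked-response detections.

    levels_db: stimulus level (dB HL) of each presentation.
    detected:  1 if an evoked potential was detected for that
               presentation, else 0.
    Returns the lowest level whose detection rate reaches the criterion,
    or None if no level does.
    """
    levels_db = np.asarray(levels_db)
    detected = np.asarray(detected, dtype=float)
    # Detection rate per unique stimulus level.
    rates = {lvl: detected[levels_db == lvl].mean()
             for lvl in np.unique(levels_db)}
    passing = [lvl for lvl, r in sorted(rates.items()) if r >= criterion]
    return passing[0] if passing else None

# Synthetic example: responses become reliably detectable from 30 dB HL.
levels = [10, 10, 20, 20, 30, 30, 40, 40]
hits   = [ 0,  0,  0,  0,  1,  1,  1,  1]
print(estimate_threshold(levels, hits))  # -> 30
```

Repeating this per test frequency yields the points of the audiometric curve.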
Automated Listening Experience Monitoring
With further development, our system can be used for automatic, objective, and continuous assessment of a user's listening experience.
Automatic Optimized Hearing-Aid Fitting
Combined with advanced machine learning techniques, our system could minimize fitting time and maximize customer satisfaction by tailoring a hearing aid to the user's individual needs and preferences.
Visual Evoked Potential Vision Testing
This is a highly effective way to check the nerves of the visual pathway for abnormalities such as neuritis, glaucoma, multiple sclerosis, and optic neuropathy.
Visual Acuity Testing
MindAffect’s algorithms can be used to rapidly map sensitivity over the visual field to stimulus features such as intensity, color, and movement. The speed and simplicity of this test potentially allow more frequent testing and earlier identification of neuro-visual disorders.
Testing of Visual Cognitive Function
Our system can also test higher-level visual cognitive functions, such as face or object recognition. This may be useful for monitoring general brain health or diagnosing issues associated with particular conditions.
Brain Health Check
Testing brain responses
Our algorithms allow for rapid (<10 minutes) testing of both low-level stimulus processing and higher-level cognitive responses to complex visual, auditory, and tactile stimuli.
Brain Health Check
The results of these tests, when compared to a database of normative responses, give an indication of general brain health and any specific processing deficiencies. Such a Brain Health Check has a wide range of potential uses, including healthy ageing, minor brain injury diagnostics (e.g. in sports), and general health monitoring.
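A comparison against normative responses can be sketched as a simple z-score check. The measure names, normative values, and |z| > 2 flagging threshold below are hypothetical examples, not actual clinical norms or MindAffect's scoring method.

```python
import numpy as np

# Hypothetical normative database: (mean, std) of a response measure,
# e.g. evoked-response latency in milliseconds. Illustrative values only.
NORMS = {
    "visual_p100_latency_ms": (100.0, 8.0),
    "auditory_n1_latency_ms": (95.0, 10.0),
}

def brain_health_report(measurements, z_flag=2.0):
    """Compare a user's measurements against normative values.

    Returns {test: (z_score, flagged)}; |z| > z_flag marks a possible
    processing deficiency relative to the normative database.
    """
    report = {}
    for test, value in measurements.items():
        mean, std = NORMS[test]
        z = (value - mean) / std
        report[test] = (round(z, 2), abs(z) > z_flag)
    return report

report = brain_health_report({"visual_p100_latency_ms": 124.0,
                              "auditory_n1_latency_ms": 97.0})
print(report)  # visual latency is 3 SD above the norm -> flagged
```

In practice such norms would be stratified (e.g. by age) and the flagging rule validated clinically.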
Our technology is well suited to the challenges of assessing very young children: it is fast, does not require patient responses, and is highly robust to external noise and individual (developmental) differences in neural responses.
Multi-modal Sensory Testing
Our algorithms allow for rapid, automated testing and assessment in sleeping or awake infants across sensory modalities, such as:
- Visual: Bright-dark sensitivity (eyes-closed) or general visual acuity (eyes-open)
- Auditory: general or frequency-specific loudness sensitivity testing
- Tactile: mapping sensitivity to tactile sensation
Brain Computer Interfaces
Our algorithms allow paralyzed or healthy individuals to communicate and control devices using only brain signals. We have successfully developed and evaluated applications of these algorithms with both paralyzed individuals and healthy users. To further facilitate development in this area, and to help as many patients as possible, we have released these algorithms to the community as open source.
Neural Computer Interfaces (EMG)
As well as working on direct brain control, we develop computer interfaces based on neural signals from peripheral motor neurons. Utilizing the natural signal amplification provided by the human motor system improves the signal-to-noise ratio and allows for fast, intuitive control performance in both patients and healthy users.
Example applications we are developing include easy-to-use facial control for nearly locked-in patients and sub-vocal speech recognition for silent communication in emergency situations.
Using small tactile stimulators, we can automatically track where someone's attention is focused.
Using this information, we can create positive feedback to subtly train the brain to divert attention away from the area of pain or itching, which may lead to improved well-being, lower drug use, and faster recovery.
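One common way to track attention with such stimulators is to give each one its own pseudo-random on/off code and ask which code best explains the brain response. The sketch below is a toy illustration of that idea using synthetic data; real pipelines filter the EEG, epoch it, and model the evoked response before scoring, and this is not a description of MindAffect's released code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each of 3 tactile stimulators vibrates with its own pseudo-random
# on/off code over 200 time steps.
codes = rng.integers(0, 2, size=(3, 200)).astype(float)

def decode_attended(eeg, codes):
    """Return the index of the stimulator whose code correlates best
    with the measured brain signal (a minimal attention-decoding sketch)."""
    scores = [np.corrcoef(eeg, code)[0, 1] for code in codes]
    return int(np.argmax(scores))

# Simulate attention to stimulator 1: the attended stimulator's code
# "leaks" into the EEG as an evoked response, plus background noise.
eeg = 0.5 * codes[1] + rng.normal(0.0, 0.3, size=200)
attended = decode_attended(eeg, codes)
print(attended)  # recovers index 1
```

The decoded index can then drive the positive-feedback loop described above, rewarding shifts of attention away from the painful or itching area.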