Aikatch is a chirplet-transform-based algorithm that pairs with EEG to recreate the images you see directly from your brain wave data. It relies on SSVEP (steady-state visually evoked potentials), a principle in neuroscience where, when you see a light flashing at a specific frequency, the neurons in your brain fire at that same frequency.
The MUSE headband, along with an occipital electrode (over the part of the brain that processes sight), is used as the EEG to collect data from your brain as you stare at a flashing image.
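Since the neurons lock to the flicker frequency, the stimulus can be recovered by checking which candidate frequency dominates the EEG spectrum. The sketch below is a minimal, hypothetical illustration of that SSVEP principle (not the actual Aikatch code): it picks the candidate frequency with the strongest FFT power in a synthetic occipital signal.

```python
import numpy as np

def detect_ssvep_frequency(eeg, fs, candidate_freqs):
    """Return the candidate stimulus frequency with the strongest
    spectral power in the EEG signal (a toy SSVEP detector)."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Power at the FFT bin nearest each candidate flicker frequency.
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(powers))]

# Synthetic "occipital EEG": a 12 Hz flicker response buried in noise.
fs = 256                          # sample rate (Hz), typical for consumer EEG
t = np.arange(0, 4, 1.0 / fs)     # 4 seconds of data
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(len(t))

print(detect_ssvep_frequency(eeg, fs, [8, 10, 12, 15]))  # → 12
```

Real pipelines use more robust methods (e.g. canonical correlation analysis) and multiple electrodes, but the core idea is the same: the flicker frequency stands out in the visual cortex's spectrum.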
The goal is to create a way to share sensory experience with another person. Imagine going to the doctor and them being able to feel your pain, see from your eyes, and hear from your ears to get a better understanding of your condition. It would also be practical in senior homes and education; being able to step into the mind of someone who's trying to learn and understand what they struggle with would be so cool.
Aikatch currently replicates sight only in black and white. I want to take it further and recreate the entire sensory experience using non-invasive methods.
What inspired you (or your team)?
When I was a kid, I was deaf until I was 4 and struggled with learning language until late kindergarten. It was very difficult for me to communicate and express myself, so I would often just take things from other people and frustrate my teachers.
While I was working at the lab, I saw an awesome opportunity to understand human experience on a deep level. I’m driven to recreate human experience and create true empathy by listening directly to the brain with non-invasive EEG.