System Architecture for the Project Prototype

Mind-Controlled Virtual Assistant on a Smartphone Device


Initial Siri Speech Recognition Prototype

Here's a link to my Vimeo-hosted video of my initial prototypes for the speech recognition side of the project:


I made this iPhone app in Xcode and built the interactivity indicator with my graphic design tools. There is currently no public Siri API available to developers, yet I found a way to activate it using public frameworks available within the standard iOS SDK. Instead of activating Siri with the Home button, I use the proximity sensor to start voice recognition by waving my hand over the front-facing camera. This initial prototype demonstrates the speech recognition application for the project. The brainwave controller will later be integrated with this application to provide mind-controlled functionality.
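
For anyone curious how the proximity trigger works, here is a minimal Swift sketch of the idea. The original app predates Swift, so treat this as an illustration rather than the project code; the onWave callback is a placeholder for whatever starts the voice recognition.

import UIKit

final class ProximityTrigger: NSObject {
    // Placeholder callback; in the prototype this would start voice recognition.
    var onWave: (() -> Void)?

    func start() {
        // The sensor is off by default; note that enabling it also blanks
        // the screen whenever something is close to the sensor.
        UIDevice.current.isProximityMonitoringEnabled = true
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(proximityChanged),
            name: UIDevice.proximityStateDidChangeNotification,
            object: nil)
    }

    @objc private func proximityChanged() {
        // proximityState is true while a hand covers the sensor.
        if UIDevice.current.proximityState {
            onWave?()
        }
    }
}

A quick wave over the sensor then takes the place of the Home-button press.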


Mind-Controlled iPhone Siri Initial Prototype Test - 3/10/13

Mind-Controlled iPhone Siri Prototype Test

YouTube link:

Vimeo link:

This is one of my initial tests using a brainwave-reading device (the MindWave Mobile) to control some custom Siri functions on an iPhone. For the first speech recognition segment, I wave my hand over the device to activate Siri. In the next portions, I use mental commands derived from the EEG signals to command the iPhone to open a map, open a menu, and close a menu, with voice feedback from Siri.
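
A note on the voice feedback: Siri's own voice isn't exposed to third-party code, so an app that wants to speak its own confirmations has to synthesize them itself. Here is a minimal Swift sketch using AVSpeechSynthesizer, which shipped in iOS 7, after this prototype, so it's an assumption here rather than what the video uses.

import AVFoundation

// Speaks a short confirmation after a command executes.
// A sketch only: the prototype in the video relies on Siri's own voice.
let synthesizer = AVSpeechSynthesizer()

func speakConfirmation(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

// Example: speakConfirmation("Opening the map")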

The brainwave-analyzing headset monitors a variety of EEG signals, including delta and alpha waves, attention levels, and other patterns. I have linked some of the raw data from the device to the algorithms I would normally use to control Siri through voice recognition, so here the brainwave patterns, rather than spoken commands, are what send the commands to the iPhone.
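
In rough terms, the mapping treats a sustained spike in one of the headset's readings, such as the 0-100 attention value, as a trigger. Here is a simplified Swift sketch of that idea; the EEGCommand type, threshold, and sample count are illustrative placeholders, not the actual project code.

enum EEGCommand {
    case openMap, openMenu, closeMenu
}

final class AttentionClassifier {
    private let threshold = 70          // attention reading, 0-100
    private let requiredSamples = 5     // consecutive readings needed
    private var streak = 0

    // Feed each new attention reading from the headset here.
    // Returns a command when a sustained spike is detected.
    func ingest(attention: Int, pending: EEGCommand) -> EEGCommand? {
        if attention >= threshold {
            streak += 1
            if streak >= requiredSamples {
                streak = 0
                return pending   // sustained focus: fire the command
            }
        } else {
            streak = 0           // focus lapsed: start over
        }
        return nil
    }
}

// Example: if let cmd = classifier.ingest(attention: 82, pending: .openMap) { ... }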


Mind-Controlled Prototype - Using Menus, Playing Music Videos, Making Calls, and Opening Maps

YouTube link:

Vimeo link:

This prototype video shows the mind-controlled virtual assistant using the MindWave Mobile EEG headset and a custom Siri API. The prototype uses no jailbroken devices or Ruby-based proxies to interact with Siri. The user thinks of the intended action, and the activity is performed on the device.

This prototype shows how the user can use mental commands to interact with the virtual assistant: open a menu, select icon items, play videos and music, make phone calls, and open a map.
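
As a rough illustration of the last step, dispatching a recognized command to a device action, here is a Swift sketch using standard URL-scheme handoffs for the call and map actions. The enum and the mapping are mine for illustration; the menu and media actions are in-app UI and omitted.

import UIKit

enum AssistantAction {
    case openMap(query: String)
    case makeCall(number: String)

    // Hands the action off to the built-in Maps or Phone app.
    func perform() {
        let url: URL?
        switch self {
        case .openMap(let query):
            let escaped = query.addingPercentEncoding(
                withAllowedCharacters: .urlQueryAllowed) ?? query
            url = URL(string: "http://maps.apple.com/?q=\(escaped)")
        case .makeCall(let number):
            url = URL(string: "tel://\(number)")
        }
        if let url = url {
            UIApplication.shared.open(url)
        }
    }
}

// Example: AssistantAction.openMap(query: "coffee near me").perform()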


© Duane Cash 2013