Prototype


System Architecture for the Project Prototype

Mind-Controlled Virtual Assistant on a Smartphone Device





Initial Siri Speech Recognition Prototype

Here's a link to my Vimeo-hosted video of my initial prototypes for the speech recognition side of the project:



http://vimeo.com/60359236

I made this iPhone app in Xcode and created the interactivity indicator with my graphic design tools. There is currently no public Siri API available to developers, but I found a way to activate it using public frameworks available within the standard iOS SDK. Instead of activating Siri with the Home button, I use the proximity sensor: waving my hand over the front-facing camera triggers the voice recognition. This initial prototype demonstrates the speech recognition side of the project. The brainwave controller will later be integrated with this application to provide mind-controlled functionality.
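For illustration, here is a minimal sketch of a proximity-sensor trigger written in Swift against the public UIKit API. The actual prototype predates Swift, and the startListening() hook is a hypothetical stand-in for the app's voice-recognition entry point:

import UIKit

final class ProximityTriggerController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // Enable the proximity sensor (public UIKit API) and listen for state changes.
        UIDevice.current.isProximityMonitoringEnabled = true
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(proximityChanged),
            name: UIDevice.proximityStateDidChangeNotification,
            object: nil
        )
    }

    @objc private func proximityChanged() {
        // proximityState is true while something (e.g. a hand) covers the sensor.
        if UIDevice.current.proximityState {
            startListening()
        }
    }

    private func startListening() {
        // Hypothetical hook: the prototype would route this to its voice-recognition layer.
        print("Proximity trigger fired - begin voice recognition")
    }
}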




Mind-Controlled iPhone Siri Initial Prototype Test - 3/10/13



Mind-Controlled iPhone Siri Prototype Test

YouTube link:

https://www.youtube.com/watch?v=-ipZrWxOhA0


Vimeo Link:

https://vimeo.com/61498919


This is one of my initial tests using a brainwave-reading device (the MindWave Mobile) to control some custom Siri functions on an iPhone. For the first speech recognition segment, I wave my hand over the device to activate Siri. In the later segments, I use mental commands derived from the EEG signals to command the iPhone to open a map, open a menu, and close a menu, with voice feedback from Siri.

The brainwave-analyzing headset monitors a variety of EEG signals, including delta and alpha waves, attention levels, and other patterns. I have linked some of the raw data from the device to the algorithms I would normally use to control Siri through voice recognition, so here the brainwave patterns are what send the commands to the iPhone.
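As a rough illustration of how one of those raw measures could be turned into a discrete command, the Swift sketch below thresholds the headset's 0-100 attention value and requires a few consecutive high readings before firing. The class name, threshold, and debounce count are assumptions for illustration, not the prototype's actual algorithm:

import Foundation

// Maps the eSense "attention" level (reported 0-100 by the MindWave Mobile)
// to a single discrete command once focus is sustained.
final class MentalCommandDetector {

    private let threshold = 70          // attention level that counts as "focused"
    private let requiredSamples = 5     // consecutive samples needed before firing
    private var consecutiveHits = 0

    var onCommand: (() -> Void)?        // fired when sustained focus is detected

    // Call this each time the headset reports a new attention value.
    func ingest(attention: Int) {
        if attention >= threshold {
            consecutiveHits += 1
            if consecutiveHits == requiredSamples {
                onCommand?()
            }
        } else {
            consecutiveHits = 0          // focus broken, start counting again
        }
    }
}

// Example wiring (values would come from the headset's data stream):
let detector = MentalCommandDetector()
detector.onCommand = { print("Sustained attention detected - send command") }
[40, 55, 72, 80, 85, 90, 95].forEach { detector.ingest(attention: $0) }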




Mind-Controlled Prototype - Using Menus, Playing Music Videos, Making Calls and Maps


YouTube link:

http://youtu.be/wgGlI2_LaUY

Vimeo video link:

https://vimeo.com/63024569

This prototype video shows the mind-controlled virtual assistant using the MindWave Mobile EEG headset and a custom Siri API. The prototype does not rely on jailbroken devices or Ruby-based proxies to interact with Siri. The user thinks of the intended action, and the activity is performed on the device.

This prototype shows how the user can use mental commands to interact with the virtual assistant: opening a menu, selecting icon items, playing videos and music, making phone calls, and opening a map.
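The Swift sketch below illustrates one way recognized mental commands could be dispatched to device actions using only public iOS APIs (the Apple Maps URL, the tel: scheme, and MPMusicPlayerController). The command names, phone number, and map query are placeholders, not taken from the prototype:

import UIKit
import MediaPlayer

// Labels for a few of the mental commands (names are assumptions).
enum MentalCommand {
    case openMap, makeCall, playMusic
}

// Maps a recognized mental command to a device action via public iOS APIs.
struct CommandDispatcher {

    func perform(_ command: MentalCommand) {
        switch command {
        case .openMap:
            // Hand off to the Maps app via its public URL (placeholder query).
            if let url = URL(string: "http://maps.apple.com/?q=coffee") {
                UIApplication.shared.open(url)
            }
        case .makeCall:
            // Start a phone call via the tel: URL scheme (placeholder number).
            if let url = URL(string: "tel://5551234567") {
                UIApplication.shared.open(url)
            }
        case .playMusic:
            // Play from the device's music library with MPMusicPlayerController.
            let player = MPMusicPlayerController.systemMusicPlayer
            player.setQueue(with: MPMediaQuery.songs())
            player.play()
        }
    }
}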





email: dcash10181@aol.com © Duane Cash 2013