Gestureless Touchscreen Interface

The Problem

The advancement of technology is improving the lives of individuals with SCI, especially those with tetraplegia, but paradoxically it is also leaving them behind.  Touchscreen devices are becoming the primary means of socialization and internet access, yet physical interaction with these devices is limited for this population.  This further isolates them from a society driven by the newest smartphones and tablets.


This project develops the hardware and software necessary to digitally empower physically disabled individuals.  The system will use electromyography (EMG) to non-invasively measure muscle activity above a lesion or injury site and interpret these gestures as cursor movements on a touchscreen.  The system will be optimized for usability by both the individual with tetraplegia and their caregiver (e.g. ease of donning and doffing, simplicity of calibration, intuitive interaction).  It also needs to be modular, so that it can be customized to the capabilities of the user.
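As a rough illustration of the signal path described above, the sketch below shows one common way such a pipeline can be structured: rectify the raw EMG stream, smooth it into an envelope with a moving average, and map the envelope through per-user calibration thresholds to a cursor velocity.  All names, thresholds, and gains here are hypothetical placeholders, not the project's actual implementation.

```python
# Hypothetical sketch of an EMG-to-cursor pipeline: rectify, smooth,
# then threshold the envelope into a 1-D cursor velocity.  In a real
# system, rest_level and max_level would come from a per-user
# calibration step rather than fixed constants.
from collections import deque

class EmgCursorMapper:
    def __init__(self, window=16, rest_level=0.1, max_level=1.0, gain=50.0):
        self.window = deque(maxlen=window)  # recent rectified samples
        self.rest_level = rest_level        # envelope level of a relaxed muscle
        self.max_level = max_level          # envelope level at full contraction
        self.gain = gain                    # cursor speed (px/s) at full contraction

    def envelope(self, sample):
        """Rectify one raw EMG sample and update the moving-average envelope."""
        self.window.append(abs(sample))
        return sum(self.window) / len(self.window)

    def velocity(self, sample):
        """Map the current envelope to a cursor velocity in pixels per second."""
        env = self.envelope(sample)
        if env <= self.rest_level:          # below rest threshold: no motion
            return 0.0
        span = self.max_level - self.rest_level
        activation = min((env - self.rest_level) / span, 1.0)
        return self.gain * activation

mapper = EmgCursorMapper()
# A relaxed muscle (small-amplitude samples) produces no cursor motion;
# sustained contraction (large-amplitude samples) drives the cursor.
for s in [0.02, -0.03, 0.01]:
    print(mapper.velocity(s))   # 0.0 while relaxed
```

Because the thresholds are parameters rather than constants, this structure also reflects the modularity goal: each user's calibration step sets `rest_level`, `max_level`, and `gain` to match their residual muscle control.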

There is currently no standardized benchmark for digital assistive devices.  In the course of this research, we will establish quantitative guidelines for task execution through user studies comparing existing technology to ours.  We are ensuring that this research fulfills user requirements by soliciting user input throughout the project.


This research aims to return digital freedom to individuals who can no longer use technology due to a disability.  Future work includes investigating whether neural signals from the closed-loop DBS project can be used to drive the digital cursor.

Presentations and Applicable Citations

Pratt, K., “Touchless Touchscreens,” NSF ERC Perfect Pitch National Competition, Washington, DC, Oct 28, 2015.

Pratt, K., “Touchless Touchscreens,” NSF ERC Perfect Pitch CSNE Competition, Seattle, WA, Sep 18, 2015.  (First Place)

Pratt, K. and Chizeck, H., “Classifying EMG Signals for Virtual Control,” Women in Robotics II Workshop, RSS, Rome, Italy, Jul 16, 2015.  (Presentation available here.)

H. J. Chizeck, O. Johnson, and J. Herron, “Using Neural Signals to Drive Touch Screen Devices,” US20140210745 A1, 31-Jul-2014.

Affiliated Faculty

Howard Chizeck

Funding Sources