Abstract
We demonstrate a method that allows two users to communicate remotely through their sense of touch by dynamically applying vibrotactile feedback to one user's forearm, driven by either of two input methods. Input on a standard mobile touch-screen device or on a purpose-built touch-sensitive wearable is analyzed in real time and used to control the intensity, location, and motion parameters of the vibrotactile output, synthesizing the stroke on a second user's arm. Our method demonstrates that different input methods can be used to generate similar vibrotactile sensations.
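The kind of mapping the abstract describes — turning a touch location and contact intensity into drive levels for a small array of vibration motors — can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the number of actuators, and the linear-interpolation scheme (a common "phantom sensation" technique that places the perceived vibration between two adjacent motors) are all assumptions made here for illustration.

```python
# Hypothetical sketch: map a normalized 1-D touch position and pressure to
# per-actuator intensities for a row of vibration motors on the forearm.
# Linear interpolation between the two nearest actuators approximates a
# vibration perceived at a point between them.

def actuator_intensities(touch_pos, pressure, n_actuators=4):
    """Return per-actuator drive levels in [0, 1].

    touch_pos: position of the finger along the input strip, 0..1.
    pressure:  contact pressure, 0..1, scales overall intensity.
    n_actuators: number of motors in the row (illustrative default).
    """
    levels = [0.0] * n_actuators
    # Express the touch position in "actuator index" space.
    idx = touch_pos * (n_actuators - 1)
    lo = min(int(idx), n_actuators - 2)
    frac = idx - lo
    # Split energy between the two neighbouring motors so the felt
    # vibration appears to sit between their physical locations.
    levels[lo] = (1.0 - frac) * pressure
    levels[lo + 1] = frac * pressure
    return levels
```

Sampling this function repeatedly as the finger moves across the input surface yields the motion parameter: the interpolated peak travels along the actuator row, which is one plausible way to render a continuous stroke on the receiving arm.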
Original language | English |
---|---|
Title of host publication | UbiComp '16: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct |
Place of Publication | New York |
Publisher | Association for Computing Machinery |
Pages | 273-276 |
ISBN (Print) | 9781450344623 |
DOIs | |
Publication status | Published - 2016 |
Event | The 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing - Heidelberg, Germany Duration: 12 Sept 2016 → 16 Sept 2016 |
Conference
Conference | The 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing |
---|---|
Abbreviated title | UbiComp '16 |
Country/Territory | Germany |
City | Heidelberg |
Period | 12/09/16 → 16/09/16 |