Talk: Enabling Haptic Experiences Anywhere, Anytime

Shan-Yuan Teng: PhD candidate, Computer Science, University of Chicago

Event Details

Date: Thursday, January 30, 2025
Time: 12–1 p.m.
Location:

Description

LIVE STREAM: https://uwmadison.zoom.us/j/96866279421?pwd=vruJIIy0t0D9Ck3YYzhBL8KqR8dOVv.1 

Abstract. Mobile and wearable devices accompany us wherever we go, and they have opened up new ways to assist users in real time by delivering visual or auditory notifications. However, this leaves an open challenge: how can we realize wearable/mobile interfaces capable of providing real-time physical assistance? While researchers have long posited that our devices will soon be capable of delivering realistic touch & force sensations (so-called haptics), when we look around our everyday computer interfaces, haptic assistance remains minimal (e.g., only simple vibrations for notifications). My research is driven by the question: what fundamental restrictions are limiting the advancement of haptics toward mobile and wearable devices, and, more importantly, can we tackle these to enable a world where physical assistance is possible anywhere & anytime?

I argue that these roadblocks stem from the fact that immersive haptic devices were engineered as pieces of infrastructure rather than as personal devices. Existing haptic devices are usually stationary (i.e., they work well in hospitals for surgical training or for immersive experiences in theme parks), yet haptic experiences are absent from everyday contexts. By prioritizing the replication of high-fidelity sensations over mobility, haptic devices ended up (1) getting in the way of the user: wearing the current generation of haptic devices prevents users from engaging in dexterous tasks (e.g., haptic gloves prevent the user from typing on a real keyboard); and (2) being power-hungry: they require large, heavy batteries and often cannot be used on the go. These are not temporary limitations; they stem from flaws in how scientists have conceptualized the role of haptics.

To this end, I re-envision haptics for everyday devices and propose two novel approaches: (1) designing haptic devices that preserve, and even support, dexterity for manual tasks in the real world (e.g., for accessibility, prioritizing real-world sensations over virtual ones); and (2) integrating haptic interactions with the user's body to minimize device size and power consumption (e.g., by harvesting energy from the user's own body).

Through a series of novel devices built from scratch, together with their accompanying technical & user-study validations, I demonstrate a roadmap toward a new generation of haptic devices that will enable haptic experiences anywhere and anytime.

Bio. Shan-Yuan Teng is a PhD candidate in Computer Science at the University of Chicago, advised by Prof. Pedro Lopes. Shan-Yuan’s research aims to advance a new generation of haptic devices (e.g., devices that create a programmable sense of touch, forces, etc.) that exhibit the properties we have come to expect from our mobile & wearable devices, such as extreme mobility and anytime availability. To advance haptics into this new territory, Shan-Yuan engineers custom-made interactive devices that, for instance, allow us to feel touch in Augmented/Mixed Reality (AR/MR/XR) without encumbering our dexterity in the real world, or support manual interactions for blind users. Shan-Yuan has published these works at top Human-Computer Interaction (HCI) conferences, including ACM CHI & UIST, with two Best Paper Awards and five Honorable Mention Awards.

Cost: Free
