
Passive Palpation Rendering for Shared Tasks in VR

Determining best practices for sharing palpation sensations between users conducting a joint palpation exam for telehealth applications

Active touch rapidly communicates properties that are invisible to the eye or difficult to explain verbally. A prime example is the medical palpation exam, where a doctor's manual examination and familiarity with palpation sensations play a crucial role in their ability to diagnose a patient. In virtual interactions such as telehealth visits, however, active hands-on touch may not be possible.


With the goal of restoring tactile information to virtual interactions, we are exploring conditions under which actively induced touch sensations can be interpreted when applied to a passive hand.


We developed several testbeds for studying this paradigm, including a robotic platform that applies palpation-like touch sensations to a passive user by moving a tissue phantom under their stationary hand, and a virtual testbed in which a user observes a VR interaction while feeling concomitant sensations. We are leveraging these platforms to explore perceptual implications and potential compensatory methods for preserving people's ability to passively interpret tactile sensations.
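To make the passive-rendering idea concrete, here is a minimal sketch (not the project's actual software) of how an active examiner's palpation trajectory could be replayed to a passive user: the hand motion relative to the tissue is recorded, then reproduced by moving the phantom under the stationary hand. All names here (PalpationSample, PhantomStage, replay_passively) are hypothetical.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class PalpationSample:
    """One time-stamped sample of the examiner's hand relative to the tissue."""
    t: float        # seconds since the start of the exam
    x_mm: float     # lateral hand position relative to the phantom, mm
    y_mm: float
    force_n: float  # normal force applied by the fingertip, N


class PhantomStage:
    """Stand-in for a 2-axis motorized stage carrying the tissue phantom."""
    def move_to(self, x_mm: float, y_mm: float) -> None:
        print(f"stage -> ({x_mm:+.1f}, {y_mm:+.1f}) mm")


def replay_passively(trajectory: List[PalpationSample], stage: PhantomStage) -> None:
    """Reproduce the recorded relative motion for a stationary passive hand.

    Because the passive user's hand does not move, the stage moves the phantom
    by the negative of the recorded hand displacement, so the hand-tissue
    relative motion matches the original active exam.
    """
    for sample in trajectory:
        stage.move_to(-sample.x_mm, -sample.y_mm)


if __name__ == "__main__":
    # Toy recorded trajectory: the examiner strokes 20 mm laterally.
    recorded = [PalpationSample(t=0.1 * i, x_mm=2.0 * i, y_mm=0.0, force_n=1.5)
                for i in range(11)]
    replay_passively(recorded, PhantomStage())
```

A real setup would also need timing control and some way to render the recorded contact force, but the core inversion of hand and tissue motion is the idea the robotic testbed builds on.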


