Harold Soh
Talk Title: Towards Data-Driven Methods with Tactile Perception
Abstract: In this talk, Prof. Soh will share recent progress from the lab on data-driven generative models that incorporate tactile sensing as an additional input modality. This is a “work-in-progress” talk, featuring mostly unpublished work.
Anh-Van Ho
Talk Title: ProTac: Embracing the Contact by Sensing Softly Across the Surface
Abstract: Soft robots with embedded multimodal sensing offer exciting potential for safe, adaptable, and intuitive human-robot interaction. However, integrating sensing directly into soft robotic skins remains a key challenge, largely due to the mismatch between compliant materials and traditional electronic components. While vision-based tactile sensing has gained traction in recent years, its application to intrinsic multimodal sensing across large, deformable robot structures has been limited. In this talk, Prof. Van Ho will present ProTac, a novel vision-based soft sensing technology that enables both tactile and proximity sensing within a single soft skin. At the heart of ProTac is a functional soft layer capable of dynamically switching between opaque and transparent optical states, allowing the system to alternate between sensing modes as needed. This work opens new avenues for enhancing physical interaction, perception, and motion control in robotics.
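As an illustrative aside (not code from the talk), the toy loop below sketches what alternating between the two sensing modes of a ProTac-style skin might look like: proximity sensing while the layer is transparent, tactile sensing once contact is imminent. Every class, method, and threshold here is invented for the example; the real system's interfaces are specific to the hardware presented in the talk.

```python
# Illustrative sketch only: a toy control loop alternating a ProTac-style
# skin between proximity sensing (transparent layer) and tactile sensing
# (opaque layer). All names and values are hypothetical.
import random
from enum import Enum

class SkinMode(Enum):
    PROXIMITY = "transparent"  # camera looks through the skin at the scene
    TACTILE = "opaque"         # camera observes deformation of the skin

class FakeProTacSkin:
    """Stand-in for a real sensor interface (hypothetical API)."""
    def set_optical_state(self, mode: SkinMode) -> None:
        self.mode = mode

    def estimate_distance(self) -> float:
        return random.uniform(0.0, 0.2)  # metres; placeholder measurement

    def in_contact(self) -> bool:
        return random.random() < 0.3     # placeholder contact detection

def step(skin: FakeProTacSkin, mode: SkinMode, threshold: float = 0.05) -> SkinMode:
    """One sensing step: read in the current mode, decide whether to switch."""
    skin.set_optical_state(mode)
    if mode is SkinMode.PROXIMITY:
        # Switch to tactile just before contact is expected.
        return SkinMode.TACTILE if skin.estimate_distance() < threshold else mode
    # Switch back to proximity once contact is lost.
    return mode if skin.in_contact() else SkinMode.PROXIMITY

if __name__ == "__main__":
    skin, mode = FakeProTacSkin(), SkinMode.PROXIMITY
    for _ in range(10):
        mode = step(skin, mode)
        print(mode)
```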
Matt Mason
Talk Title: Contact versus clutter, vacuum cups versus hands
Abstract: The more cluttered a workspace, the harder it is to find a grasp. But clutter is inevitable, even desirable, both in logistics and in the home. We love our stuff, but we have limited space. Humans use their great dexterity to deal with clutter. Robots, on the other hand, lack human-level dexterity, but perform beautifully using vacuum grippers. What is the future? Might robots develop human-level dexterity? Might vacuum cups become obsolete? No! Vacuum cups are to manipulation as wheels are to locomotion: simple, rugged, inexpensive, elegant.
Talk Title: Data-Efficient Learning from Vision and Touch for Manipulation Tasks
Abstract: A key challenge in robotics is enabling robots to perform task-oriented, robust, and adaptive manipulation in unstructured environments, where objects can vary widely in shape, size, appearance, and placement. To address this, we develop data-efficient methods for robotic grasping and manipulation that integrate vision and touch, complementary sensory modalities essential for perceiving and interacting with the physical world. These approaches aim to learn from minimal supervision or experience, adapt to novel objects and tasks, and make informed decisions under uncertainty. By jointly interpreting visual and tactile cues, robots can infer object properties, reason about task goals, and execute reliable manipulation strategies. This work advances the development of robots that learn efficiently and operate robustly in complex, real-world scenarios.
Faraz Faruqi
Talk Title: Touch in the Loop: Fabricating Tactile Interfaces for Physical Intelligence
Abstract: The physical surfaces that robots encounter and act upon carry rich, contact-level information. Yet, robotic systems often operate on geometry that is visually designed and mechanically neutral, lacking surface features that support tactile inference. What if we could program these surfaces with tactile properties that are designed explicitly to support contact-rich interaction?
In this talk, I introduce TactStyle, a generative fabrication system that creates 3D-printable surfaces with programmable tactile properties, using texture images as input. TactStyle decouples color and geometry stylization to produce surface microgeometry (heightfields) that encode tactile features like roughness, scratchiness, and isotropy. The geometry is generated using a fine-tuned diffusion model trained on texture–heightfield pairs, enabling creators to specify the feel of a surface alongside its look — without requiring specialized sensors or physical capture devices.
Through a psychophysical study, we show that TactStyle outperforms visual-only baselines in replicating human-perceived tactile descriptors. This positions TactStyle as a tool not just for prototyping texture, but for fabricating surfaces that can meaningfully interface with contact-rich algorithms, from grasp planning to haptic exploration. This approach aims to reposition fabrication as a means of embedding tactile priors into physical artifacts — closing the loop between surface design and physical intelligence.
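As an illustrative aside, the abstract's pipeline (texture image, fine-tuned diffusion model, heightfield, printable geometry) suggests a small worked example. The sketch below is not TactStyle's code: it shows only the final step under assumed parameters, converting a heightfield in [0, 1] into a triangulated relief surface of the kind one would send to a 3D printer. The function name, grid spacing, and relief depth are all invented for illustration.

```python
# Illustrative sketch (not TactStyle's actual code): turning a predicted
# heightfield into a displaced surface mesh, the kind of microgeometry a
# texture-to-heightfield model would feed into 3D printing. Scale values
# below are invented for the example.
import numpy as np

def heightfield_to_mesh(height: np.ndarray, cell_mm: float = 0.1,
                        relief_mm: float = 0.5):
    """Map an HxW heightfield in [0, 1] to vertices and triangle faces.

    cell_mm: XY spacing between samples; relief_mm: peak displacement in Z.
    """
    h, w = height.shape
    ys, xs = np.mgrid[0:h, 0:w]
    vertices = np.stack(
        [xs * cell_mm, ys * cell_mm, height * relief_mm], axis=-1
    ).reshape(-1, 3)

    # Two triangles per grid cell, indexing into the flattened vertex array.
    idx = np.arange(h * w).reshape(h, w)
    a, b = idx[:-1, :-1].ravel(), idx[:-1, 1:].ravel()
    c, d = idx[1:, :-1].ravel(), idx[1:, 1:].ravel()
    faces = np.concatenate(
        [np.stack([a, b, c], axis=1), np.stack([b, d, c], axis=1)]
    )
    return vertices, faces

# Toy usage: a synthetic "rough" heightfield standing in for model output.
rng = np.random.default_rng(0)
field = rng.random((64, 64))
verts, faces = heightfield_to_mesh(field)
print(verts.shape, faces.shape)  # (4096, 3) (7938, 3)
```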