RSS17 Workshop on
Tactile Sensing for Manipulation:
Hardware, Modeling, and Learning
15 July 2017 - RSS 2017 - Cambridge, MA, USA
If robots are to perform everyday tasks in the real world, they will need sophisticated tactile sensing. The tactile data must be integrated into multi-sensory representations that support exploration, manipulation, and other tasks.
This workshop asks the following questions:
- What kinds of tactile technologies are currently available, and which are still needed?
- What types of representations are best for capturing and exploiting tactile data?
- How can tactile information be combined with other information to support specific tasks?
- Can learning help to provide suitable representations from high-dimensional sensory data?
This workshop will bring together experts from the fields of tactile sensing, sensor design, manipulation, and machine learning. We expect that the pairing of theoretical and applied knowledge will lead to an interesting exchange of ideas and stimulate an open discussion about the goals and challenges of tactile sensing.
Important dates:
- Submission deadline: 08 June 2017
- Notification: 15 June 2017
- Camera ready: 03 July 2017
- Workshop: 15 July 2017
Invited speakers:
- Edward Adelson - MIT
- Matei Ciocarlie - Columbia University
- Jeremy Fishel - SynTouch Inc.
- Robert Haschke - Bielefeld University
- Robert Howe - Harvard
- Charlie Kemp - Georgia Institute of Technology
- Oliver Kroemer - USC
- Sergey Levine - UC Berkeley
- Alberto Rodriguez - MIT
- Veronica J. Santos - UCLA
Organizers:
- Wenzhen Yuan - MIT
- Tapomayukh Bhattacharjee - Georgia Institute of Technology
- Roberto Calandra - UC Berkeley
- Edward Adelson - MIT