Pretouch is a sense with longer range than touch but shorter range than vision. Using electric field sensing hardware that I designed to fit inside a robot’s fingers, I made several robotic manipulation tasks easier or newly possible.
Human-to-robot and robot-to-human handoff
In this work, the robot is programmed to accept an object from a human, and then hand it to another human.
- First, the e-field pretouch sensors detect the presence of an object that a human is holding up and track it, keeping the robot’s hand centered on the object even as the human moves it around.
- Once the sensors detect that the object has stopped moving, the sensors are again used to servo the positions of the fingers to equal distances from the object’s surface, about a centimeter away. This ensures that when the fingers close, they will make simultaneous contact to avoid displacing the object.
- After grasping the object, the robot waits for the human to let go before trying to move away. The human holding onto the object provides an AC path to ground, which the e-field sensors can detect. When the human lets go, this current path goes away and is easily observed in the sensor readings.
- The robot moves its arm to hand the object to another human. Again, the e-field sensors are used to detect whether the human has accepted the object so that it does not let go prematurely.
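The release-detection step above can be sketched as a simple threshold test: while the human grips the object, their body’s AC path to ground elevates the sensor reading above its free-space baseline, and the reading drops back once they let go. The function names, signal model, and threshold below are all illustrative, not the actual system’s API.

```python
# Sketch of detecting when a human releases a grasped object.
# While the human holds on, their grounding path raises the e-field
# sensor reading above its free-space baseline; releasing the object
# removes that current path and the reading falls back toward baseline.

def human_released(reading, baseline, margin=0.2):
    """True once the reading is back near the free-space baseline,
    i.e. the human's grounding path (their grip) is gone."""
    return abs(reading - baseline) < margin

def wait_for_release(readings, baseline):
    """Scan a stream of sensor samples; return the index of the first
    sample where the human has let go, or None if they never do."""
    for i, r in enumerate(readings):
        if human_released(r, baseline):
            return i
    return None

# Simulated trace: elevated while held, dropping back at sample 4.
trace = [1.8, 1.7, 1.9, 1.6, 1.02, 1.01]
print(wait_for_release(trace, baseline=1.0))  # → 4
```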
Picking up stationary objects
We also demonstrated the use of the e-field pretouch sensors to pick up stationary objects from surfaces when only the approximate position of the object is known. Closed-loop controllers use the sensor inputs to correct the positioning of the arm and fingers, enabling reliable grasps.
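A minimal sketch of such a closed-loop correction, assuming a toy one-dimensional sensor model in which the reading falls off with distance to the object’s surface; the gain and sensor model here are invented for illustration and are not the actual controller.

```python
# Toy 1-D centering loop: balance two opposing finger sensor readings
# by nudging the hand laterally until both fingers are equidistant
# from the object. The sensor model and gain are illustrative only.

def centering_step(left, right, gain=0.5):
    """Proportional correction: output is signed toward the side
    with the weaker (more distant) reading."""
    return gain * (left - right)

def simulate_centering(offset, steps=20):
    """Iterate the correction; the imbalance shrinks as the hand
    centers. Readings fall off with the gap to each surface."""
    for _ in range(steps):
        left = 1.0 / (2.0 + offset)   # gap to left surface grows with offset
        right = 1.0 / (2.0 - offset)  # gap to right surface shrinks with offset
        offset += centering_step(left, right)
    return offset

print(abs(simulate_centering(0.6)) < 0.01)  # converges toward centered: True
```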
This technology has been featured in several high-profile demos, including:
- CeBIT opening ceremony, Hannover, Germany, March 2009
- Intel Developer Forum, San Francisco, CA, September 2008
- Intel Developer Forum, Taipei, Taiwan, October 2008
Grasping various objects with and without e-field pretouch sensing
E-field handoff demo at the CeBIT 2009 opening ceremony, with Intel Chairman Craig Barrett, Governor Schwarzenegger, and Chancellor Merkel
Mayton, B., LeGrand, L., and Smith, J. 2010. An Electric Field Pretouch System for Grasping and Co-Manipulation. IEEE International Conference on Robotics and Automation, 2010.
Using electric field sensing hardware that I designed, I enabled Marvin, Intel Labs Seattle’s mobile manipulation research platform, to plug itself into ordinary, unmodified electrical outlets. The robot only needs to know the approximate location of an outlet on its 2D map; it drives up to it and precisely aligns the prongs with the holes in the socket by sensing the emitted 60 Hz electric field.
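One standard way to measure the strength of a narrowband 60 Hz signal in raw sensor samples is the Goertzel algorithm (a single-bin DFT); the sketch below shows the idea but is not the actual system’s signal chain.

```python
# Estimate the power of a 60 Hz component in a block of sensor samples
# using the Goertzel algorithm. A servo loop could steer the plug toward
# the position that maximizes this power. Illustrative only.
import math

def goertzel_power(samples, sample_rate, target_hz=60.0):
    """Power at the DFT bin nearest target_hz for one sample block."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest integer bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# A pure 60 Hz tone yields far more power in the 60 Hz bin
# than an off-band tone does.
fs = 1200.0
t = [i / fs for i in range(240)]  # 0.2 s of samples
mains = [math.sin(2 * math.pi * 60 * ti) for ti in t]
other = [math.sin(2 * math.pi * 200 * ti) for ti in t]
print(goertzel_power(mains, fs) > 10 * goertzel_power(other, fs))  # True
```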
Mayton, B., LeGrand, L., and Smith, J. 2010. Robot, Feed Thyself: Plugging In to Unmodified Electrical Outlets by Sensing Emitted AC Electric Fields. IEEE International Conference on Robotics and Automation, 2010.
Gizmodo: Intel Robot Finds Wall Socket, Plugs Self In
Seattle Times: Intel Robot’s New Trick
UW Classroom Presenter, developed by Richard Anderson et al. at the University of Washington, is interactive presentation software that runs on tablet PCs. Each student uses his or her own tablet PC, can see written annotations made on the slide by the instructor (called “ink”), and can add his or her own ink to slides, which can be submitted back to the instructor to be reviewed or shared with the class. For my undergraduate capstone project at the University of Washington, I worked with several other students to develop a version of Classroom Presenter that runs on the One Laptop Per Child foundation’s XO laptop.
The software is not just a port but a complete adaptation to make it usable on the XO. The XO is not a tablet, so only simple drawing with the trackpad (or a mouse) is possible. We added text input features to enable students to provide a textual response to a question without needing to write it with a mouse. We use the XO’s built-in facilities for discovering shared activities and connecting to other machines, so that connecting the machines together is simple enough for elementary students to do themselves. We also included features necessary for setups that don’t include a projector: the original UW Classroom Presenter expects that if the teacher wants to share a student’s submission with the class, he or she will use a projector to display it. In our implementation, we enable the teacher to broadcast selected student submissions to the rest of the class, so students may view them on their own screens.
The project culminated in a trial at a local elementary school, where students in small groups shared XO laptops to complete activities about a recent field trip, while the teacher talked about the students’ work and shared their submissions with the rest of the class.
Screenshot of a slide in Classroom Presenter for the XO with student responses
Students using Classroom Presenter on the XO during our classroom trial. Photo: Mark Ahlness / CC BY-NC-ND 2.0
The results of the trial were very positive; we received enthusiastic feedback from both the teacher and his students. The source code we developed has been made available under an open source license.