Shadow Dexterous Hand

With the Shadow Dexterous Hand, we have taken a truly anthropomorphic approach to robot manipulation. With 20 actuated degrees of freedom, position and force sensors, and ultra-sensitive touch sensors on the fingertips, the Hand provides unique capabilities for problems that require the closest approximation of the human hand currently possible.
The Shadow Dexterous Hand uses industry standard interfaces and can be used as a tele-operation tool or mounted on a range of robot arms as part of a robot system.
See the hand in action on our YouTube channel or media collection.


Human Kinematics

Approximating the kinematics of the human hand was our top priority when developing the Dexterous Hand. The Hand has 20 actuated degrees of freedom and a further 4 under-actuated movements, for a total of 24 joints. Each joint's range of movement is the same as, or very close to, that of the human hand, including the thumb and even the flex of the palm for the little finger. A model of the Shadow Hand for use in Blender was developed by Shadow and UPMC and is available for download.

Human Sized


The Shadow Dexterous Hand is a feat of miniaturisation. Within the same envelope as a human hand we have packed highly sensorised fingertips, position sensors for each joint, and a control board on the palm that allows the system to be extended via add-ons. The human-scale form also broadens operational capabilities: the Hand can, for example, wear standard gloves for protection in specific tasks.

High Bandwidth Sensing

With 129 sensors in total, the Hand provides detailed telemetry, which can be exploited to build innovative manipulation control systems or to gain a detailed understanding of the external environment. As well as position sensing for each joint, the Hand includes force sensing for each actuator, tactile sensing on the fingertips, and temperature, motor-current and voltage sensing. All of this data is made available to the user at rates from 100 Hz up to 1 kHz over a high-bandwidth EtherCAT interface. Matching this high data rate is the speed of the Hand itself: it moves from fully open to closed in 0.5 seconds.
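As a rough back-of-envelope illustration of what this telemetry stream implies for a host system, the sketch below estimates the raw sensor payload at the maximum rate. The per-channel sample size is our assumption for illustration, not a figure from the Hand's specification.

```python
# Back-of-envelope payload estimate for the Hand's telemetry stream.
# Assumption (not from the spec): each of the 129 sensor channels is
# a 16-bit value, all sampled at the maximum 1 kHz rate.
SENSORS = 129
BYTES_PER_SAMPLE = 2      # assumed 16-bit channels
RATE_HZ = 1000            # top of the stated 100 Hz - 1 kHz range

payload_bytes_per_s = SENSORS * BYTES_PER_SAMPLE * RATE_HZ
print(payload_bytes_per_s)  # 258000 bytes/s, about 252 KiB/s
```

Even under these assumptions the raw payload is modest; the demanding part of a 1 kHz loop is latency and jitter, which is where EtherCAT's deterministic cycle helps.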

Open Platform

The Hand is fully integrated with ROS, and we make solid models and code available. You can download a virtual model of the Hand and start using it in ROS right now. Control of the Hand, including the position-control algorithms, can be modified by the user in ROS, and the firmware within the Hand itself can be made available for modification.
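One small piece of user-side control logic that typically sits in front of any position command is range clamping. The sketch below shows the idea; the joint names and limit values here are placeholders for illustration, not the Hand's published calibration data or ROS API.

```python
# Illustrative user-side position-control step: clamp a requested
# joint target to that joint's range before commanding the Hand.
# Joint names and limits below are assumed placeholders.
import math

JOINT_LIMITS = {                    # radians, illustrative values
    "FFJ3": (0.0, math.pi / 2),    # a first-finger flexion joint
    "THJ5": (-1.05, 1.05),         # a thumb base-rotation joint
}

def clamped_target(joint, requested):
    """Return the requested angle limited to the joint's range."""
    lo, hi = JOINT_LIMITS[joint]
    return max(lo, min(hi, requested))

print(clamped_target("FFJ3", 2.0))   # clipped to pi/2, ~1.5708
print(clamped_target("THJ5", -5.0))  # clipped to -1.05
```

In a real ROS setup this kind of check would wrap whatever command interface the user's controller exposes.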

Air Muscle, Left Hand & Arm Integration


Certain applications may demand an Air Muscle actuated version of the Hand; this is available as an option, as are left- and right-handed versions. The Hand has been designed to fit onto a variety of commercially available robot arms, and customers have successfully mounted it on arms from KUKA, Denso, Willow Garage, Mitsubishi and Universal Robots. We can customise the Hand as required for each application.

BioTac Tactile Sensor (optional)


For applications requiring highly detailed sensing, Shadow has partnered with SynTouch LLC to offer the revolutionary BioTac tactile sensor in a package designed especially for the Shadow Hand. BioTac sensors can be added to all fingers and allow detailed sensing of force, micro-vibration and temperature gradients. Data from the BioTac sensors is fully integrated and available via the same EtherCAT interface as the other sensors.

Grasp Stabilization and Control (GSC) (optional)

GSC makes it possible to use an advanced manipulator out of the box in most situations. A segmentation algorithm is first applied to an incoming 3D point cloud. Once the user selects which object they want to grasp, a recognition algorithm is run on that cluster to try to identify the object against a database of previously learned objects. If the object is recognised, a list of pre-computed grasps is loaded; if not, a list of possible grasps is computed from a mesh reconstructed from the segmented point cloud. From this list, the GSC framework then selects the best grasp automatically and executes it carefully.
Once the object is held in this initial grasp, the GSC algorithm proper kicks in: using the Hand's tactile capabilities, it refines the estimated pose of the object, computes the stability of the current grasp and, via a finger-gaiting mechanism, autonomously moves the fingers to a new, more stable grasp. The algorithms are robust enough to work with every level of tactile sensing, from the most basic sensors to the most advanced.
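The grasp-selection decision flow described above can be sketched as follows. Every helper here is a toy stand-in with a made-up name; real segmentation, recognition and grasp planning are substantial components, and nothing below reflects Shadow's actual GSC API.

```python
# Control-flow sketch of the GSC grasp-selection pipeline.
# All helpers are illustrative stubs, not Shadow's implementation.

def segment(point_cloud):
    """Toy segmentation: treat the whole cloud as one cluster."""
    return [point_cloud]

def recognise(cluster, database):
    """Toy recognition: match a label carried in the cluster."""
    label = cluster.get("label")
    return label if label in database else None

def reconstruct_mesh(cluster):
    return cluster                      # stand-in for surface reconstruction

def compute_grasps(mesh):
    return [("overhead", 0.4)]          # generic fallback grasp, with a score

def plan_grasp(point_cloud, database):
    clusters = segment(point_cloud)
    target = clusters[0]                # in GSC the user selects this cluster
    obj = recognise(target, database)
    if obj is not None:
        grasps = database[obj]          # pre-computed grasps for a known object
    else:                               # unknown object: plan from a mesh
        grasps = compute_grasps(reconstruct_mesh(target))
    return max(grasps, key=lambda g: g[1])  # pick the best-scoring grasp

db = {"mug": [("handle", 0.9), ("rim", 0.6)]}
print(plan_grasp({"label": "mug"}, db))     # ('handle', 0.9) - known object
print(plan_grasp({"label": "block"}, db))   # ('overhead', 0.4) - fallback path
```

The tactile refinement stage that follows (pose estimation, stability scoring, finger gaiting) would then run as a closed loop on top of whichever grasp this selection step returns.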