The unifying theme of my research programme is the invention of enabling technologies that can help others. Over the past several years, my work has focused on lifelike robotic and prosthetic hands, bio-inspired tactile sensing, human-recognizable robotic gestures, and robotics for autism therapy for children. In what follows, I present the research that my team and I have undertaken and describe how we approached a few technological and humanitarian problems from a different perspective. I conclude with the future directions of this research.

Patient-specific and lifelike prosthesis: Any loss or absence of limbs can cause devastating physical, psychosocial, and economic damage to an individual. It is an experience linked with grief, depression, anxiety, loss of self-esteem, and social isolation [1-3]. Whether the traumatic loss of a limb or finger is due to war or to an industrial, domestic, or vehicular accident, amputation leaves the individual with a long-lasting emotional scar from the disfigurement.

Fig. 1  Design and fabrication of an accurate replica of a patient’s hand. (a) Human subject’s hand. (b) CT scan image and (c) the corresponding 3D reconstruction of the human hand structure. (d) 3D model of the two-part mould and the bones used to fabricate the synthetic hand. (e) Closed model of the mould. The structure is placed vertically and the silicone rubber is poured from the large hole shown. (f) Rapid prototype model of the mould and the bones. (g) The complete artificial replica of the human subject’s hand obtained after curing of silicone rubber.

Many have worked on artificial equivalents of missing hands and limbs [4-6]. These devices have to be naturally controlled by the user's thoughts, and sensations have to be fed back to the user. In addition, the prosthetic hand should pass undetected as an artificial device during social interactions. This requires that the hand accurately reproduce the appearance of the missing hand or limb and that its skin have humanlike softness and warmth when touched. Recent technologies that partially address this problem are too expensive and complex, especially for the demands of developing countries.


We invented a method to create patient-specific artificial hands or fingers without even seeing the amputee for measurements and fitting during the fabrication process [7, 8]. We used computed tomography (CT) data from the non-affected hand to accurately recreate the geometry of the missing hand in digital format (Fig. 1). Using these data, the reconstruction can be done remotely from the amputee and the prosthesis can be delivered to their doorstep.
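The key geometric step, reflecting the intact hand's CT-derived surface across the body's sagittal plane to obtain a model of the missing hand, can be sketched as follows. This is a simplified illustration under assumed conventions (mirror plane at x = 0, coordinates in mm), not the actual reconstruction pipeline, which also involves mesh processing and mould generation:

```python
def mirror_points(points, plane_x=0.0):
    """Reflect 3D surface points across the sagittal plane x = plane_x.

    A left hand reconstructed from CT becomes a model of the missing
    right hand (or vice versa). Points are (x, y, z) tuples in mm.
    """
    return [(2.0 * plane_x - x, y, z) for (x, y, z) in points]

# Illustrative landmark points on the intact hand's surface
intact = [(30.0, 12.5, 4.0), (45.2, 10.1, 6.3), (52.7, 8.9, 5.5)]
mirrored = mirror_points(intact)
```

Reflecting twice recovers the original geometry, which makes the operation easy to sanity-check in practice.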

Furthermore, we have experimented with synthetic materials that have softness comparable to that of a human hand [9-11]. We also invented an embedded temperature control system that makes an artificial skin feel warm to the touch, at a temperature that reflects the surrounding environment, just like real human skin [7, 12]. In our touch experiments on the arms of blindfolded subjects, a touch from the soft, warm artificial hand could not be distinguished from a touch from a human hand. In addition, we have conducted human-to-human handshake experiments to measure the motions, forces, and finger joint angles involved (Fig. 2). These data can be used to design more compliant robotic arm and hand systems.
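The idea of an ambient-referenced skin warmth can be sketched as a simple feedback loop. The setpoint rule, gains, and the first-order thermal model below are all illustrative assumptions for exposition, not the parameters of the patented system:

```python
def skin_setpoint(ambient_c):
    """Hypothetical setpoint rule: warmer than ambient, like real skin.

    Real skin surface temperature varies with the environment; here we
    assume a fixed offset above ambient, clamped to a safe range.
    """
    return min(max(ambient_c + 8.0, 28.0), 36.0)

def proportional_heater(temp_c, setpoint_c, gain=2.0):
    """Heater power (0..1) proportional to the temperature error."""
    return min(max(gain * (setpoint_c - temp_c), 0.0), 1.0)

# Simulate the embedded loop with a crude first-order thermal model:
# heating is proportional to power, losses to the ambient difference.
ambient, temp = 22.0, 22.0
for _ in range(200):
    power = proportional_heater(temp, skin_setpoint(ambient))
    temp += 0.1 * power - 0.01 * (temp - ambient)
```

With these assumed constants the simulated skin settles just below the 30 °C setpoint for a 22 °C room, showing the small steady-state error that is characteristic of purely proportional control.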

Fig. 2  Handshake biomechanics.

The experimenter wore motion, force, and joint angle sensors to characterize the human-to-human handshake behaviour. The data are crucial for the design of more socially acceptable movements of prosthetic/robotic limbs.

Bio-Inspired Tactile Sensing: An artificial hand contacts the external environment first through its artificial skin and then through the embedded tactile sensors [13]. Inspired by neurophysiological studies, my collaborators and I have worked to address the five basic requirements of tactile sensing [14, 15]. The tactile sensing system has to detect: (1) contact between the finger and the object; (2) contact between the object and the environment; (3) slippage; (4) local shape; and (5) global shape.


We found that requirement 1 can be addressed with simple on-off contact sensors, ideally with a contact threshold as low as 5 mN, similar to the sensory threshold of the human fingertip [14]. Requirements 2 and 3 are best resolved with force sensors that can detect the x, y, and z force components [14]. For requirement 4, we investigated artificial fingerprint ridges with embedded tri-axial MEMS sensors for detecting local shapes. By embedding force microsensors beneath ridged artificial skin that resembles fingerprints, we were able to discriminate edged, flat, and curved surfaces from one another [16, 17]. We patented that solution as well [7]. For requirement 5, global shape detection, our most recent finding shows that shapes such as cones, cylinders, spheres, and boxes can be discriminated from one another using sensors that detect the joint angles [18]. This is an alternative to the approach of others, whereby shapes are recognized through the contact events at the fingers and the palm surface. All of this fundamental knowledge can be integrated into a robotic hand to perform grasping, manipulation, and haptic exploration tasks.
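The decision logic for requirements 1 and 3 can be sketched in a few lines. The 5 mN threshold comes from the fingertip sensory threshold cited above; the friction coefficient is an assumed illustrative value, since the real figure depends on the skin material and the grasped object:

```python
import math

CONTACT_THRESHOLD_N = 0.005  # ~5 mN, comparable to the human fingertip threshold

def in_contact(fz):
    """Requirement 1: on-off contact detection from the normal force (N)."""
    return fz >= CONTACT_THRESHOLD_N

def slipping(fx, fy, fz, mu=0.6):
    """Requirement 3: flag incipient slip when the tangential-to-normal
    force ratio approaches the friction coefficient mu (assumed value)."""
    if not in_contact(fz):
        return False
    tangential = math.hypot(fx, fy)  # magnitude of the in-plane force
    return tangential > mu * fz

# Example tri-axial readings (N): a stable grasp vs. an object starting to slide
stable = slipping(0.1, 0.0, 1.0)   # ratio 0.1 < mu, no slip flagged
sliding = slipping(0.8, 0.0, 1.0)  # ratio 0.8 > mu, slip flagged
```

In a grasp controller, a slip flag of this kind would typically trigger an increase in grip force before the object is lost.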


Human-Recognizable Robotic Gestures: “As the tongue speaketh to the ear, so the gesture speaketh to the eye,” Sir Francis Bacon once said [19]. We explored whether human beings can understand gestures produced by robots [20]. We designed an experiment to determine whether pointing gestures can improve spatial cognition when the gesture is made by a telepresence robot. If so, humans could derive the meaning conveyed by telerobotic gestures when processing spatial information. We conducted two experiments over Skype in our study. Participants were presented with a robotic interface that had arms, which were teleoperated by an experimenter. The robot could point to virtual locations that represented certain entities. We found that pointing gestures by telepresence robots can increase the recall of objects and their locations by 15% compared with instructions given only verbally. Keeping in mind the confusion and fatigue among ground crews at disaster sites (e.g. the Fukushima nuclear plant), this study can be a precursor to the way instructions are given in a future where robots are deployed alongside humans in danger zones.

Furthermore, many researchers in robotics have worked on how to control robots by instructing them with gestures [21-26]. In short, the robot has to understand the bodily gestures that the human is making and make sense of them. We took a different approach: we investigated which gestures are easily understood by the human observer [27]. We programmed a humanoid robot to perform 15 typical gestures (Fig. 3). Likewise, a human was asked to perform the same 15 gestures (not shown here for brevity). Videos were taken, and we asked participants which gestures they could recognize in them. It turned out that the majority of the 122 experimental subjects could easily understand the following gestures: head nods, clapping, walking, flying, hugging, and expressing anger with the gesture known as arms akimbo.


Communication is a two-way street. In other words, not only should robots understand messages from their human conversation partners; humans must also understand the messages, including non-verbal ones, that robots communicate. The development cycle time for robot programming and testing can be reduced if roboticists know at the outset which basic robotic gestures humans can understand. When programmed into robots, these gestures can lead to human-robot interactions that are natural, appropriate, and engaging.

Social Robots in Autism Therapy for Children: Autism Spectrum Disorder is a complex developmental disability that causes problems with social interaction, communication, and imagination. Symptoms usually start before the age of three and can cause delays or problems in many different skills that develop from infancy to adulthood. In 2012, the Centers for Disease Control and Prevention in Atlanta, USA released a troubling estimate: 1 in 88 children in the USA may have autism [28]. In South Korea, the estimate was 1 in 38 [29]. Currently, there is no documented cure for autism, nor is there one single treatment for autism spectrum disorders.

Fig. 3  Various gestures by a robot. Nodding, clapping, walking, flying, hugging and expressing anger can be easily recognized by human observers.

However, there is an increasing trend of children preferring to interact with computers and robots rather than with other humans [30]. As a baseline study, I have documented how children as young as 11 months old interact with social robots. The videos that I have gathered provide a rich resource on how children with or without autism interact with robots [31]. In particular, issues of safety, familiarity, and acceptance will have to be addressed by robot designers.


My students and I have investigated the use of physiological sensors, such as heartbeat and galvanic skin response sensors, to detect variations in the internal state of research subjects [32]. We induced sadness by having them watch a movie. We found that the signals from both sensors increased as the subjects anticipated the highly disturbing parts of the movie (i.e. an accident). Using similar sensors, it would be worth investigating whether children with autism show physiological signals that could warn caregivers that a meltdown is imminent.
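One simple way such a warning could be derived is to flag samples where the signal rises well above a running baseline. The window length and multiplier below are illustrative assumptions, not clinically validated values:

```python
def arousal_alerts(gsr, window=5, k=1.5):
    """Flag sample indices where galvanic skin response rises well above
    a running baseline (the mean of the previous `window` samples).

    `window` and the multiplier `k` are illustrative, not validated values.
    """
    alerts = []
    for i in range(window, len(gsr)):
        baseline = sum(gsr[i - window:i]) / window
        if gsr[i] > k * baseline:
            alerts.append(i)
    return alerts

# Simulated GSR trace (microsiemens): calm at first, then a sharp rise
trace = [2.0, 2.1, 2.0, 2.2, 2.1, 2.0, 2.1, 4.5, 5.0, 5.2]
onset = arousal_alerts(trace)
```

In a deployed system, a run of consecutive alerts like this, possibly combined with the heartbeat signal, could prompt the caregiver to intervene early.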

The Next 5 Years: Looking forward to the next five years, I envision that my research programme will take significant steps closer to addressing one of humanity’s toughest problems: “bringing back the sense of touch to those who have lost it”. Figure 4 gives a glimpse of that future. From my own work and from that of other researchers around the world, we can see that the basic technological building blocks for improved prosthetic/robotic grasping, manipulation, and exploration are already in place. More effort in fusing technologies from sensing, actuation, mechanisms, materials, and controls will be needed. I think the technological challenges in prosthetics are the toughest, as the device has to be interfaced with a human being. However, many solutions from my work can now be applied to humanoid robotics, robotics for care (e.g. robotic nannies), robotic surgery, remote manipulation for the oil and gas industry, subsea exploration, and other areas.

In addition to the intellectual property and scholarly publications that have been created, some of the works described above received international media attention from outlets such as the British Broadcasting Corporation (BBC), MIT Technology Review, New Scientist, Popular Science, and Discovery Channel News.

Fig. 4 Conceptual depiction of a next-generation prosthetic arm-hand system. Training of a grasping and drinking task (left). Replicating the motions from the training data using adaptive control schemes with an intelligent prosthesis (right).


[1]     C. M. Parkes, "Psycho-social transitions: comparison between reactions to loss of a limb and loss of a spouse," British Journal of Psychiatry, vol. 127, pp. 204-210, 1975.

[2]     D. Desmond and M. MacLachlan, "Psychosocial Issues in the Field of Prosthetics and Orthotics," Prosthetics and Orthotics, vol. 14, pp. 19-22, 2002.

[3]     J. Sabolich and S. Sabolich, You're Not Alone, Stories of 38 People who Conquered the Challenges of a Lifetime. Oklahoma: Scott Sabolich Prosthetic and Research, 2005.

[4]     T. A. Kuiken, L. A. Miller, R. D. Lipschutz, B. A. Lock, K. Stubblefield, P. D. Marasco, et al., "Targeted reinnervation for enhanced prosthetic arm function in a woman with a proximal amputation: a case study," Lancet, vol. 369, pp. 371-380, 2007.

[5]     M. C. Carrozza, G. Cappiello, S. Micera, B. B. Edin, L. Beccai, and C. Cipriani, "Design of a cybernetic hand for perception and action," Biological Cybernetics, vol. 95, pp. 629-644, 2006.

[6]     K. D. O'Shaughnessy, G. A. Dumanian, R. D. Lipschutz, L. A. Miller, K. Stubblefield, and T. A. Kuiken, "Targeted reinnervation to improve prosthesis control in transhumeral amputees: A report of three cases," Journal of Bone and Joint Surgery - Series A, vol. 90, pp. 393-400, 2008.

[7]     J. J. Cabibihan, S. S. Ge, S. Salehi, R. Jegadeesan, and H. Abdul Hakkim, "Apparatuses, systems, and methods for prosthetic replacement manufacturing, temperature regulation, and tactile sense duplication," PCT/SG2011/000255 Patent, 2011.

[8]     J. J. Cabibihan, "Patient-specific prosthetic fingers by remote collaboration-A case study," PLoS ONE, vol. 6, 2011.

[9]     J. J. Cabibihan, R. Pradipta, and S. S. Ge, "Prosthetic finger phalanges with lifelike skin compliance for low-force social touching interactions," Journal of NeuroEngineering and Rehabilitation, vol. 8, p. 16, 2011.

[10]   J. J. Cabibihan, S. Pattofatto, M. Jomaa, A. Benallal, and M. C. Carrozza, "Towards Humanlike Social Touch for Sociable Robotics and Prosthetics: Comparisons on the Compliance, Conformance and Hysteresis of Synthetic and Human Fingertip Skins," International Journal of Social Robotics vol. 1, pp. 29-40, 2009.

[11]   J. J. Cabibihan, "Design of Prosthetic Skins with Humanlike Softness," in Intl Conference on Biomedical Engineering, Singapore, 2008.

[12]   J. J. Cabibihan, R. Jegadeesan, S. Salehi, and S. S. Ge, "Synthetic Skins with Humanlike Warmth," in Social Robotics. vol. 6414, S. S. Ge, H. Li, J. J. Cabibihan, and Y. K. Tan, Eds., ed: Springer Berlin / Heidelberg, 2010, pp. 362-371.

[13]   J. J. Cabibihan, "Skin materials selection for prosthetic and humanoid robotic fingertips," Ph.D. Thesis, Advanced Robotics Technologies and Systems Lab, Scuola Superiore Sant'Anna, Pisa, 2007.

[14]   B. B. Edin, L. Ascari, L. Beccai, S. Roccella, J. J. Cabibihan, and M. C. Carrozza, "Bio-inspired sensorization of a biomechatronic robot hand for the grasp-and-lift task," Brain Research Bulletin, vol. 75, pp. 785-795, 2008.

[15]   B. B. Edin, L. Beccai, L. Ascari, S. Roccella, J. J. Cabibihan, and M. C. Carrozza, "A bio-inspired approach for the design and characterization of a tactile sensory system for a cybernetic prosthetic hand," in Proceedings of the IEEE International Conference on Robotics and Automation, 2006.

[16]   S. Salehi, J.-J. Cabibihan, and S. S. Ge, "Artificial Skin Ridges Enhance Local Tactile Shape Discrimination," Sensors, vol. 11, pp. 8626-8642, 2011.

[17]   J. J. Cabibihan, O. Htun Lin, and S. Salehi, "Effect of artificial skin ridges on embedded tactile sensors," in Haptics Symposium (HAPTICS), 2012 IEEE, 2012, pp. 439-442.

[18]   J.-J. Cabibihan, A. Anand, J. Matthew, S. P. R. Krishna, S. Paul, B. Ramesh, et al. (2013). Glove-based object shape discrimination.

[19]   F. Bacon, The advancement of learning, Book 2. London: Oxford University Press, 1891.

[20]   J.-J. Cabibihan, W.-C. So, S. Saj, and Z. Zhang, "Telerobotic Pointing Gestures Shape Human Spatial Cognition," International Journal of Social Robotics, vol. 4, pp. 263-272, 2012.

[21]   E. Sato, T. Yamaguchi, and F. Harashima, "Natural interface using pointing behavior for human-robot gestural interaction," IEEE Transactions on Industrial Electronics, vol. 54, pp. 1105-1112, 2007.

[22]   C. L. Sidner, C. Lee, C. D. Kidd, N. Lesh, and C. Rich, "Explorations in engagement for humans and robots," Artificial Intelligence, vol. 166, pp. 140-164, 2005.

[23]   C. Breazeal and B. Scassellati, "Robots that imitate humans," Trends in Cognitive Sciences, vol. 6, pp. 481-487, 2002.

[24]   R. M. Voyles, J. D. Morrow, and P. K. Khosla, "Towards gesture-based programming: Shape from motion primordial learning of sensorimotor primitives," Robotics and Autonomous Systems, vol. 22, pp. 361-375, 1997.

[25]   M. Hersch, F. Guenter, S. Calinon, and A. Billard, "Dynamical system modulation for robot learning via kinesthetic demonstrations," IEEE Transactions on Robotics, vol. 24, pp. 1463-1467, 2008.

[26]   S. Calinon, F. D'Halluin, E. L. Sauser, D. G. Caldwell, and A. G. Billard. (June 2010) Learning and reproduction of gestures by imitation. IEEE Robotics and Automation Magazine. 44-54.

[27]   J. J. Cabibihan, S. Wing-Chee, and S. Pramanik, "Human-Recognizable Robotic Gestures," IEEE Transactions on Autonomous Mental Development, vol. 4, pp. 305-314, 2012.

[28]   J. Baio, "Prevalence of Autism Spectrum Disorders: Autism and Developmental Disabilities Monitoring Network, 14 Sites, United States, 2008. Morbidity and Mortality Weekly Report. Surveillance Summaries. Volume 61, Number 3," Centers for Disease Control and Prevention, 2012.

[29]   Y. S. Kim, B. L. Leventhal, Y. J. Koh, E. Fombonne, E. Laska, E. C. Lim, et al., "Prevalence of autism spectrum disorders in a total population sample," Am J Psychiatry, vol. 168, pp. 904-12, Sep 2011.

[30]   J.-J. Cabibihan, H. Javed, M. H. Ang, and S. M. Aljunied, "Why robots? A survey on the roles and benefits of social robots in autism therapy for children," 2013.

[31]   T. D. Hoa and J.-J. Cabibihan, "Cute and soft: baby steps in designing robots for children with autism," presented at the Proceedings of the Workshop at SIGGRAPH Asia, Singapore, Singapore, 2012.

[32]   J.-J. Cabibihan, L. Zheng, and C. Cher, "Affective Tele-touch," in Social Robotics. vol. 7621, S. Ge, O. Khatib, J.-J. Cabibihan, R. Simmons, and M.-A. Williams, Eds., ed: Springer Berlin Heidelberg, 2012, pp. 348-356.

Dr. John-John Cabibihan
Associate Professor
Mechanical and Industrial Engineering
Qatar University


(974) 4403 4368
