Humanoid and cognitive robotics
We are a group doing research in humanoid, cognitive developmental, neuro-, and collaborative robotics. Robots with artificial electronic skins are one of our specialties.
In our research, we employ the so-called synthetic methodology, or “understanding by building”, with two main goals:
- Understanding cognition and its development. In particular, we are interested in the “body in the brain”: how do babies learn to represent their bodies and the space around them (peripersonal space), and what are the underlying brain mechanisms? We build embodied computational models on humanoid robots to uncover how these representations operate.
- Making robots safe and natural around humans. Taking inspiration from humans, we make robots exploit multimodal information (mostly vision and touch) to share space with people. We are interested in both physical and social human-robot interaction.
For more details about our research, see the Research tab.
- 2020-9-4 Our work in social HRI has been presented at IEEE RO-MAN – a biomimetic hand (with Azumi Ueno) [pdf] and our Nao protecting its social space (with Hagen Lehmann) [pdf-arxiv] [slides with voice] [youtube-video].
- 2020-9-1 A new manuscript accepted to IEEE ICDL-Epirob and a new addition to the lab youtube channel: Gama, F.; Shcherban, M.; Rolf, M. & Hoffmann, M. (2020), Active exploration for body model learning through self-touch on a humanoid robot with artificial skin. [arxiv] [youtube-video]
- 2020-7-22 The Frontiers Research Topic Body Representations, Peripersonal Space, and the Self: Humans, Animals, Robots which Matej Hoffmann guest-edited is closed with 19 articles; the Editorial and e-book (download from RT page) are published.
- 2020-5-4 We are on Twitter. You can follow us here: https://twitter.com/humanoidsCTU.
- 2020-2-5 Matej Hoffmann and Zdenek Straka presented our work at the Human Brain Project summit.
- 2020-1-29 Interview with Matej Hoffmann in “Control Engineering Czech Republic” [in Czech].
- 2020-1-28 Matej Hoffmann and Petr Svarny were invited speakers at “Robots 2020” – Trends in Automation…
- 2019-11-12 Matej Hoffmann was awarded the “GAČR EXPRO” project for excellence in fundamental research from the Czech Science Foundation.
- 2018-11-19 Z. Straka and J. Stepanovsky visited the HBP Summit – see the report about this and our work in the NRP Platform.
- 2018-10-29: We are partners in a new European research project (CHIST-ERA call) Interactive Perception-Action-Learning for Modelling Objects (IPALM) coordinated by Imperial College London (2019-2022).
- 2018-10-26: The Learning Body Models: Humans, Brains, and Robots seminar at Lorentz center has been a great success!
- 2018-10-19: Our new Nao robot with artificial skin is up and running! See the video and “autonomous touch video”.
- 2018-06-06: An article for the general public about our research and its role as an HBP Partnering Project @ agenciasinc.es
- 2018-04-26: Matej Hoffmann was a member of the IIT Co-Aware team winning the KUKA Innovation Award 2018: Real-World Interaction Challenge. See the video.
- 2018-01 Our project has become a Partnering Project of the Human Brain Project.
- 2018 – Matej Hoffmann is the lead Topic Editor for this Frontiers Research Topic: Open for submissions!
- 2018-01: Work in collaboration with IIT Genoa and Yale University on “Compact real-time avoidance on a humanoid robot for human-robot interaction” has been accepted to the HRI ’18 Conference held in Chicago. [pdf @ arxiv] [youtube video]
- 2017-11-09: Matej Hoffmann will give a talk to the public at the Science and Technology week on 10.11.2017. (talk online – in Czech)
- 2017-09-14: Zdenek Straka and Matej Hoffmann were awarded ENNS Best Paper Award at 26th International Conference on Artificial Neural Networks (ICANN17) for their paper Learning a Peripersonal Space Representation as a Visual-Tactile Prediction Task.
- 2017-07-20: Our robot homunculus was featured in the Italian magazine Focus and on the iCub YouTube channel (extended version in our channel). The article is available here: [early access – IEEE Xplore] [postprint-pdf].
- 2017-04-18: A 4-page cover story about humanoid robotics in Respekt 16/2017 (a major Czech weekly; a larger section is available for free at ihned.cz). Matej Hoffmann and Zdenek Straka were interviewed and photographed with robots. A video, plus coverage of the rescue robots, appeared in the online edition of the story.
- Matej Hoffmann (Assistant Professor, coordinator) [Google Scholar profile]
- Tomas Svoboda (Associate Professor) [Google Scholar profile]
- Karla Stepanova (Postdoc) [Google Scholar profile]
- Zdenek Straka (PhD Student) [Google Scholar profile]
- Petr Svarny (PhD Student) [Google Scholar profile]
- Filipe Gama (PhD Student)
- Shubhan Patni (PhD Student)
- Christian Mangione (PhD Student)
More information can also be found here.
Models of body representations
How do babies learn about their bodies? Newborns probably do not have a holistic perception of their body; instead, they start by picking up correlations in the streams of individual sensory modalities (in particular visual, tactile, and proprioceptive). The structure in these streams allows them to learn the first models of their bodies. The mechanisms behind these processes are largely unclear. In collaboration with developmental and cognitive psychologists, we want to shed more light on this topic by developing robotic models.
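As a purely illustrative toy sketch (not one of our published models; all sizes, rates, and the noise level are arbitrary), the kind of cross-modal structure described above can be picked up by a simple Hebbian rule that strengthens associations between co-occurring tactile and visual events:

```python
import numpy as np

rng = np.random.default_rng(1)
n_taxels, n_visual = 8, 8
W = np.zeros((n_taxels, n_visual))  # learned tactile-to-visual association

# toy sensory stream: touching taxel i tends to co-occur
# with a visual event in region i (plus visual noise)
for _ in range(500):
    i = rng.integers(n_taxels)
    tactile = np.zeros(n_taxels); tactile[i] = 1.0
    visual = np.zeros(n_visual); visual[i] = 1.0
    visual += 0.1 * rng.random(n_visual)       # background visual noise
    W += 0.01 * np.outer(tactile, visual)      # Hebbian update on co-activation

# after learning, each taxel is most strongly associated with "its" visual region
print(np.argmax(W, axis=1))  # → [0 1 2 3 4 5 6 7]
```

The point of the sketch is only that repeated co-occurrence, with no labels or supervision, is enough to recover a spatial correspondence between two modalities.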
Safe physical human-robot interaction
Robots are leaving the factory, entering domains that are far less structured, and starting to share living spaces with humans. As a consequence, they need to dynamically adapt to unpredictable interactions with people and guarantee safety at every moment. “Body awareness” acquired through artificial skin can be used not only to improve reactions to collisions: when coupled with vision, it can be extended to the space immediately surrounding the body (so-called peripersonal space), facilitating collision avoidance and contact anticipation and eventually leading to safer and more natural interaction of the robot with objects, including humans.
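As a toy illustration (not our actual controller; the margin and slope values below are arbitrary), a peripersonal-space activation around a skin taxel can be used to modulate the commanded velocity before any contact occurs:

```python
import numpy as np

def ppspace_activation(d, margin=0.4, slope=10.0):
    """Activation of a protective zone around a skin taxel:
    close to 1 when an obstacle is at the skin (d ~ 0, in metres),
    decaying towards 0 beyond `margin`."""
    return 1.0 / (1.0 + np.exp(slope * (d - margin / 2)))

def scaled_velocity(v_nominal, d_obstacle):
    """Slow the robot down as an obstacle enters peripersonal space."""
    a = ppspace_activation(d_obstacle)
    return (1.0 - a) * v_nominal

print(scaled_velocity(0.5, 1.00))  # obstacle far away: nearly full speed
print(scaled_velocity(0.5, 0.05))  # obstacle near the skin: strongly slowed
```

A sigmoid is used here simply so that the response is smooth and saturates; any monotonically decreasing activation over distance would serve the same illustrative purpose.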
Automatic robot self-calibration
Standard robot calibration procedures require prior knowledge of a number of quantities from the robot’s environment. These conditions have to be present for recalibration to be performed. This has motivated alternative solutions to the self-calibration problem that are more “self-contained” and can be performed automatically by the robot. These typically rely on self-observation of specific points on the robot using the robot’s own camera(s). The advent of robotic skin technologies opens up the possibility of completely new approaches. In particular, the kinematic chain can be closed and the necessary redundant information obtained through self-touch, broadening the sample collection from end-effector to whole body surface. Furthermore, the possibility of truly multimodal calibration – using visual, proprioceptive, tactile, and inertial information – is open.
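A minimal sketch of the self-touch idea for a planar 2-link arm (hypothetical numbers; real calibration involves full 3D kinematic chains and many more parameters): the fingertip position is linear in the link lengths, so pairs of joint angles (from proprioception) and contact positions (from the skin) give an ordinary least-squares problem:

```python
import numpy as np

rng = np.random.default_rng(0)
l_true = np.array([0.5, 0.4])  # "unknown" link lengths to be recovered

# self-touch data: joint angles from proprioception
q = rng.uniform(-np.pi, np.pi, size=(20, 2))

def design(q):
    # planar 2-link forward kinematics is linear in the link lengths:
    # x = l1*cos(q1) + l2*cos(q1+q2),  y = l1*sin(q1) + l2*sin(q1+q2)
    rows = []
    for q1, q2 in q:
        rows.append([np.cos(q1), np.cos(q1 + q2)])
        rows.append([np.sin(q1), np.sin(q1 + q2)])
    return np.array(rows)

A = design(q)
# simulated contact positions reported by the skin (with small noise)
p = A @ l_true + rng.normal(0.0, 1e-3, size=A.shape[0])

l_est, *_ = np.linalg.lstsq(A, p, rcond=None)
print(l_est)  # close to [0.5, 0.4]
```

With a real robot the parameters enter the kinematics nonlinearly, so a nonlinear least-squares solver is used instead, but the structure of the problem — redundant self-touch measurements constraining the kinematic parameters — is the same.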
- Whole-body awareness for safe and natural interaction: from brains to collaborative robots. “GAČR EXPRO” project for excellence in fundamental research from the Czech Science Foundation (2020-2024)
- Interactive Perception-Action-Learning for Modelling Objects (IPALM). European project (Horizon 2020, FET, ERA-NET Cofund, CHIST-ERA) coordinated by Imperial College London (2019-2022).
- Robot self-calibration and safe physical human-robot interaction inspired by body representations in primate brains. Czech Science Foundation. (2017-2019)
- We are also a Partnering Project of the Human Brain Project.
Photo credits: Duilio Farina and Laura Taverna, Italian Institute of Technology.
iCub (coming soon)
Humanoid robot from the Italian Institute of Technology with the body proportions of a 4-year-old child, 53 degrees of freedom, whole-body sensitive skin, and much more… See http://icub.org/ or the iCub YouTube channel.
Nao humanoid robots
Two Nao humanoid robots from SoftBank Robotics (formerly Aldebaran). The robots are used for developmental robotics as well as human-robot interaction research.
- Nao blue. Version Evolution (V5). Specially equipped with artificial sensitive skin (like the iCub) on torso, face, and hands.
- Nao red. Version 3+.
Collaborative manipulator arms
KUKA LBR iiwa + Barrett Hand
A 7 DoF collaborative robot (KUKA LBR iiwa R800) with joint torque sensing in every axis and a 3-finger Barrett Hand (BH8-282) with 96 tactile sensors and joint torque sensing in every fingertip.
UR10e + Airskin + OnRobot gripper / QB Soft Hand
A 6 DoF Universal Robots UR10e manipulator covered with Airskin collision sensor, including the gripper (OnRobot RG6).
Kinova Gen3 arm + Robotiq gripper
An ultralightweight 7 DoF arm with an RGB-D camera embedded at the wrist and a Robotiq 2F gripper.
An informal list of Open and ongoing projects is available on the webpages of Matej Hoffmann. You may have a look also at the Past projects, some of which have received awards or resulted in publications.
Other topics than those listed below can be defined upon request – simply drop by KN-E211 or send an email to Matej Hoffmann or another group member.