Humanoids

Humanoid and cognitive robotics


We are a research group working in the areas of humanoid, cognitive developmental, neuro-, and collaborative robotics. Robots with artificial electronic skins are one of our specialties. The group is coordinated by Assoc. Prof. Matej Hoffmann.

In our research, we employ the so-called synthetic methodology, or “understanding by building”, with two main goals:

  1. Understanding cognition and its development. In particular, we’re interested in the “body in the brain”: how do babies learn to represent their bodies and the space around them (peripersonal space), and what are the underlying mechanisms in the brain? We build embodied computational models on humanoid robots to uncover how these representations operate.
  2. Making robots safe and natural around humans. Taking inspiration from humans, we make robots exploit multimodal information (mostly vision and touch) to share space with people. We’re interested in physical and social human-robot interaction.

For more details about our research, see the Research tab.

Our full affiliation is Vision for Robotics and Autonomous Systems, Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague.

Humanoids group

Frontiers Research Topic: Body Representations, Peripersonal Space, and the Self: Humans, Animals, Robots

Coordinator

Matej Hoffmann (Associate Professor)
Google Scholar profile
Postdocs

Valentin Marcel
Google Scholar profile
Sergiu T. Popescu
Google Scholar profile
Giulia D’Angelo
Google Scholar profile
Haofeng Chen
Google Scholar profile
Zdenek Straka
Google Scholar profile
PhD students

Filipe Gama
Google Scholar profile
Shubhan Patni
Google Scholar profile
Jason Khoury
Google Scholar profile
Jakub Rozlivek
Google Scholar profile
Lukas Rustler
Google Scholar profile
Lab technician

Bedrich Himmel

Alumni

Karla Stepanova (Postdoc)
Google Scholar profile
Petr Svarny (PhD student – graduated 2023)
Google Scholar profile
For our publications, please see the Google Scholar profiles of individual group members. For demos, see our YouTube channel. We are also active on X (Twitter).
More information can also be found here.

Models of body representations

How do babies learn about their bodies? Newborns probably do not have a holistic perception of their body; instead, they start by picking up correlations in the streams of individual sensory modalities (in particular visual, tactile, and proprioceptive). The structure in these streams allows them to learn the first models of their bodies. The mechanisms behind these processes are largely unclear. In collaboration with developmental and cognitive psychologists, we want to shed more light on this topic by developing robotic models. See HERE for more details.
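
As a minimal illustration of the idea that correlated sensory streams alone can bootstrap a body model (a toy sketch, not one of our models), the snippet below lets a simulated planar arm babble random postures and learns a proprioceptive-to-visual forward model from the co-occurring proprioceptive and visual signals; the link lengths, noise level, and feature size are arbitrary choices made for the example.

```python
# Toy illustration (not the group's actual model): an agent "babbles" random
# joint angles, observes where its hand appears in the visual field, and learns
# a proprioceptive-to-visual forward model from the correlated streams alone.
import numpy as np

rng = np.random.default_rng(0)
L1, L2 = 0.3, 0.25                      # link lengths of a planar 2-DoF "arm" (made-up values)

def forward_kinematics(q):
    """Hand position in the 'visual' frame for joint angles q = (q1, q2)."""
    x = L1 * np.cos(q[:, 0]) + L2 * np.cos(q[:, 0] + q[:, 1])
    y = L1 * np.sin(q[:, 0]) + L2 * np.sin(q[:, 0] + q[:, 1])
    return np.stack([x, y], axis=1)

# Motor babbling: random joint configurations and the resulting (noisy) visual observations.
q = rng.uniform(-np.pi / 2, np.pi / 2, size=(2000, 2))
vision = forward_kinematics(q) + 0.005 * rng.standard_normal((2000, 2))

# Learn the correlation with ridge regression on random Fourier features of proprioception.
W = rng.standard_normal((2, 200))
feats = lambda q: np.concatenate([np.sin(q @ W), np.cos(q @ W)], axis=1)
Phi = feats(q)
weights = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(Phi.shape[1]), Phi.T @ vision)

# The learned "body model" now predicts where the hand will appear for a new posture.
q_test = rng.uniform(-np.pi / 2, np.pi / 2, size=(200, 2))
err = np.linalg.norm(feats(q_test) @ weights - forward_kinematics(q_test), axis=1)
print(f"mean visual prediction error: {err.mean() * 100:.2f} cm")
```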

Collaborative robots and artificial touch

We target two application areas where robots with sensitive skins and multimodal awareness provide a key enabling technology:

  • safe and intelligent physical man-machine interaction and collaborative robotics (DETAILS)
  • automatic self-calibration (DETAILS)
Robots are leaving the factory, entering domains that are far less structured, and starting to share living spaces with humans. As a consequence, they need to dynamically adapt to unpredictable interactions with people and guarantee safety at every moment. “Body awareness” acquired through artificial skin can be used not only to improve reactions to collisions but also, when coupled with vision, to cover the space around the body (so-called peripersonal space), facilitating collision avoidance and contact anticipation and eventually leading to safer and more natural interaction of the robot with objects, including humans.
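
A minimal sketch of how a peripersonal-space representation can modulate motion (an illustration under simplified assumptions, not our actual controller): distances from points on the robot’s body to an approaching obstacle are mapped to activations, and the commanded velocity is scaled down as the activation grows. The margin and saturation distances and the helper names are made up for the example.

```python
# Sketch (not the group's implementation): each skin point keeps an activation
# that rises as an obstacle (e.g., a human hand detected visually) approaches,
# and the commanded joint velocity is scaled down by the strongest activation.
import numpy as np

def pps_activation(distances, margin=0.4, saturation=0.05):
    """Map obstacle distances [m] to activations in [0, 1]; 1 = imminent contact.
    `margin` and `saturation` are illustrative values, not calibrated ones."""
    a = (margin - np.asarray(distances)) / (margin - saturation)
    return np.clip(a, 0.0, 1.0)

def modulate_velocity(qdot_desired, skin_to_obstacle_distances):
    """Scale the desired joint velocity by how strongly peripersonal space is activated."""
    activation = pps_activation(skin_to_obstacle_distances).max()
    return (1.0 - activation) * np.asarray(qdot_desired)

# Example: an obstacle approaches one part of the robot's body.
qdot = np.array([0.5, -0.3, 0.2])                  # rad/s, desired joint velocities
print(modulate_velocity(qdot, [0.80, 0.60, 0.35])) # mild slowdown
print(modulate_velocity(qdot, [0.30, 0.10, 0.06])) # near stop before contact
```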

Standard robot calibration procedures require prior knowledge of a number of quantities from the robot’s environment, and these conditions have to be present whenever recalibration is performed. This has motivated alternative solutions to the self-calibration problem that are more “self-contained” and can be carried out automatically by the robot. These typically rely on self-observation of specific points on the robot using the robot’s own camera(s). The advent of robotic skin technologies opens up the possibility of completely new approaches. In particular, the kinematic chain can be closed and the necessary redundant information obtained through self-touch, broadening the sample collection from the end-effector to the whole body surface. Furthermore, truly multimodal calibration – using visual, proprioceptive, tactile, and inertial information – becomes possible.
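
The closed kinematic chain idea can be illustrated with a small toy example (a sketch only, not our actual calibration pipeline): when one arm touches the other, both chains must reach the same contact point, so unknown joint-angle offsets can be recovered by least squares over many self-touch configurations. The planar geometry, offset values, and solver below are assumptions made purely for illustration.

```python
# Toy self-touch calibration sketch: two planar 2-DoF arms touch fingertips;
# the mismatch between their forward kinematics at the contact point is driven
# to zero by estimating the unknown joint-angle offsets. All values are made up.
import numpy as np
from scipy.optimize import least_squares

L = np.array([0.3, 0.25])                  # link lengths shared by both planar arms
BASE = {"A": np.array([-0.2, 0.0]), "B": np.array([0.2, 0.0])}

def fk(q, base):
    """Planar forward kinematics: end-effector position for joint angles q."""
    a1, a12 = q[0], q[0] + q[1]
    return base + np.array([L[0] * np.cos(a1) + L[1] * np.cos(a12),
                            L[0] * np.sin(a1) + L[1] * np.sin(a12)])

def ik(p, base):
    """Analytic planar IK (one elbow configuration); None if unreachable."""
    r = p - base
    c2 = (r @ r - L @ L) / (2 * L[0] * L[1])
    if abs(c2) > 1.0:
        return None
    q2 = np.arccos(c2)
    q1 = np.arctan2(r[1], r[0]) - np.arctan2(L[1] * np.sin(q2), L[0] + L[1] * np.cos(q2))
    return np.array([q1, q2])

# Simulate self-touch data: encoders report true angles minus unknown offsets.
rng = np.random.default_rng(1)
true_offsets = np.array([0.05, -0.03, 0.02, 0.04])        # [A1, A2, B1, B2] in rad
events = []
while len(events) < 50:
    qa = rng.uniform(-np.pi / 2, np.pi / 2, 2)
    qb = ik(fk(qa, BASE["A"]), BASE["B"])                  # arm B touches arm A's fingertip
    if qb is not None:
        events.append(np.concatenate([qa, qb]) - true_offsets)   # encoder readings

def residuals(offsets):
    """Mismatch between the two chains at the contact point, over all self-touch events."""
    return np.concatenate([fk(e[:2] + offsets[:2], BASE["A"])
                           - fk(e[2:] + offsets[2:], BASE["B"]) for e in events])

est = least_squares(residuals, x0=np.zeros(4)).x
print("estimated offsets [rad]:", np.round(est, 4))        # should be close to true_offsets
```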

Humanoid robots

Photo credits: Duilio Farina and Laura Taverna, Italian Institute of Technology.

iCub

A humanoid robot from the Italian Institute of Technology with the body proportions of a 4-year-old child, 53 degrees of freedom, whole-body sensitive skin, and much more… See http://icub.org/ or the iCub YouTube channel.

Pepper

Pepper is a humanoid robot from SoftBank Robotics (formerly Aldebaran). The robot is meant for human-robot interaction research and social robotics. You can also meet this robot in various places such as Prague Airport.

Nao

Two Nao humanoid robots from SoftBank Robotics (formerly Aldebaran). The robots are used for developmental robotics as well as human-robot interaction research.

  • Nao blue. Version Evolution (V5). Specially equipped with artificial sensitive skin (like the iCub) on the torso, face, and hands.
  • Nao red. Version 3+.

Collaborative manipulator arms

KUKA LBR iiwa + Barrett Hand

A 7 DoF collaborative robot (KUKA LBR iiwa R800) with joint torque sensing in every axis and a 3-finger Barrett Hand (BH8-282) with 96 tactile sensors and joint torque sensing in every fingertip.

UR10e + Airskin + OnRobot gripper / QB Soft Hand

A 6 DoF Universal Robots UR10e manipulator covered with the Airskin collision sensor, including the gripper (OnRobot RG6).

Kinova Gen3 arm + Robotiq gripper

An ultralightweight 7 DoF arm with an RGB-D camera embedded at the wrist and a Robotiq 2F gripper.

We offer interesting topics for student theses and projects as well as paid internships.

An informal list of Open and ongoing projects is available on the webpages of Matej Hoffmann. You may also have a look at the Past projects, some of which have received awards or resulted in publications.

Topics other than those listed below can be defined upon request – simply drop by at KN-E211 or write an email to Matej Hoffmann or other group members.

Responsible person: Matěj Hoffmann