Feed aggregator

In robotic-assisted partial nephrectomy, surgeons remove part of a kidney, often due to the presence of a mass. A drop-in ultrasound probe paired with a surgical robot is deployed to execute multiple swipes over the kidney surface to localise the mass and define the margins of resection. This sub-task is challenging and must be performed by a highly skilled surgeon. Automating it may reduce the surgeon's cognitive load and improve patient outcomes. The eventual goal of this work is to autonomously move the ultrasound probe over the surface of the kidney, taking advantage of the Pneumatically Attachable Flexible (PAF) rail system, a soft robotic device used for organ scanning and repositioning. First, we integrate a shape-sensing optical fibre into the PAF rail system to evaluate the curvature of target organs in robotic-assisted laparoscopic surgery. Then, we investigate the impact of the PAF rail's material stiffness on curvature-sensing accuracy, given that soft targets are present in the surgical field. We found overall curvature-sensing accuracy to be between 1.44% and 7.27% over the range of curvatures present in adult kidneys. Finally, we use shape sensing to plan the trajectory of the da Vinci surgical robot paired with a drop-in ultrasound probe and autonomously generate an ultrasound scan of a kidney phantom.
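
As a rough illustration of how a curvature-sensing error in that range can be interpreted, the sketch below converts a surface radius to curvature and reports the relative error; the radii and helper names are hypothetical, not values or code from the paper.

```python
# Minimal sketch (hypothetical values, not from the paper): express a kidney
# surface radius as curvature and report the relative sensing error.
def curvature_from_radius(radius_mm: float) -> float:
    return 1.0 / radius_mm          # curvature kappa in 1/mm

def percent_error(measured_kappa: float, true_kappa: float) -> float:
    return abs(measured_kappa - true_kappa) / true_kappa * 100.0

# e.g. a surface of 40 mm true radius sensed as 42 mm
true_k = curvature_from_radius(40.0)
meas_k = curvature_from_radius(42.0)
print(f"{percent_error(meas_k, true_k):.2f}% curvature error")   # ~4.76%
```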

For effective human-robot collaboration, it is crucial for robots to understand requests from users by perceiving three-dimensional space and to ask reasonable follow-up questions when there are ambiguities. In comprehending users' object descriptions, existing studies have addressed this challenge only for limited object categories that can be detected or localized with existing object detection and localization modules. Further, they have mostly comprehended object descriptions using flat RGB images, without considering the depth dimension. In the wild, however, it is impossible to limit the object categories that can be encountered during an interaction, and 3D perception that includes depth information is fundamental to successful task completion. To understand described objects and resolve ambiguities in the wild, we suggest, for the first time, a method leveraging explainability. Our method focuses on the active areas of an RGB scene to find the described objects without the previous constraints on object categories and natural language instructions. We further improve our method to identify the described objects by considering the depth dimension. We evaluate our method on varied real-world images and observe that the regions it suggests can help resolve ambiguities. When we compare our method with a state-of-the-art baseline, it performs better in scenes with ambiguous objects that cannot be recognized by existing object detectors. We also show that using depth features significantly improves performance in scenes where depth data are critical to disambiguate the objects, and across our evaluation dataset, which contains objects that can be specified with and without the depth dimension.
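
For readers unfamiliar with explainability-driven localization, the toy sketch below shows the general idea of extracting an "active area" from input gradients of a scoring network over an RGB-D image; the `ScoreNet` model, the 4-channel input, and the threshold are placeholders of ours, not the method or architecture from the paper.

```python
import torch
import torch.nn as nn

# Toy sketch of gradient-based "active area" extraction on an RGB-D input.
# ScoreNet is a stand-in for any network scoring how well a scene matches an
# object description; it is NOT the paper's model, and the threshold is ad hoc.
class ScoreNet(nn.Module):
    def __init__(self, in_channels: int = 4):        # RGB plus depth as a 4th channel
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(8, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

rgbd = torch.rand(1, 4, 64, 64, requires_grad=True)   # toy RGB-D image
ScoreNet()(rgbd).sum().backward()                      # gradients flag influential pixels
saliency = rgbd.grad.abs().mean(dim=1)[0]              # collapse channels -> H x W map
mask = saliency > saliency.mean() + saliency.std()     # crude "active area" threshold
ys, xs = torch.nonzero(mask, as_tuple=True)
if len(xs) > 0:                                        # bounding box of the active region
    print("active region:", xs.min().item(), ys.min().item(), xs.max().item(), ys.max().item())
```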

This paper proposes an adaptive robust Jacobian-based controller for task-space position-tracking control of robotic manipulators. The structure of the controller is built on a traditional Proportional-Integral-Derivative (PID) framework. An additional neural control signal is then synthesized under a non-linear learning law to compensate for internal and external disturbances in the robot dynamics. To provide strong robustness for such a controller, a new gain-learning feature is integrated to automatically adjust the PID gains for various working conditions. Stability of the closed-loop system is guaranteed by Lyapunov constraints. The effectiveness of the proposed controller is verified by extensive simulation results.
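
A minimal sketch of the general structure described above (task-space PID plus an additive compensation signal, mapped to joint torques through the Jacobian transpose) is given below; the gain-adaptation rule, class name, and numbers are illustrative assumptions, not the paper's learning law.

```python
import numpy as np

# Minimal sketch of a task-space PID with an additive compensation signal and a
# naive gain-adaptation step, mapped to joint torques via the Jacobian transpose.
# The update rule and all numbers are illustrative, not the paper's controller.
class AdaptiveTaskSpacePID:
    def __init__(self, kp, ki, kd, dt, gain_lr=0.0):
        self.kp = np.asarray(kp, dtype=float)
        self.ki = np.asarray(ki, dtype=float)
        self.kd = np.asarray(kd, dtype=float)
        self.dt = dt
        self.gain_lr = gain_lr            # learning rate for the (toy) gain adaptation
        self.int_e = None

    def control(self, x, x_des, xdot, xdot_des, jacobian, u_comp=0.0):
        e = x_des - x                     # task-space position error
        edot = xdot_des - xdot            # task-space velocity error
        if self.int_e is None:
            self.int_e = np.zeros_like(e)
        self.int_e = self.int_e + e * self.dt
        # toy gradient-style gain adaptation (illustrative only)
        self.kp = self.kp + self.gain_lr * e * e * self.dt
        f_task = self.kp * e + self.ki * self.int_e + self.kd * edot + u_comp
        return jacobian.T @ f_task        # joint torques tau = J^T f

# toy usage: 2-D task space, arbitrary Jacobian, zero compensation signal
ctrl = AdaptiveTaskSpacePID(kp=[80.0, 80.0], ki=[5.0, 5.0], kd=[12.0, 12.0], dt=0.001, gain_lr=0.1)
J = np.array([[-0.3, -0.1], [0.5, 0.2]])
tau = ctrl.control(x=np.array([0.40, 0.10]), x_des=np.array([0.45, 0.12]),
                   xdot=np.zeros(2), xdot_des=np.zeros(2), jacobian=J)
print(tau)
```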

Often in swarm robotics, an assumption is made that all robots in the swarm behave the same and will have a similar (if not the same) error model. However, in reality, this is not the case, and this lack of uniformity in the error model, and other operations, can lead to various emergent behaviors. This paper considers the impact of the error model and compares robots in a swarm that operate using the same error model (uniform error) against each robot in the swarm having a different error model (thus introducing error diversity). Experiments are presented in the context of a foraging task. Simulation and physical experimental results show the importance of the error model and diversity in achieving the expected swarm behavior.
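
The contrast between a uniform and a diverse error model can be illustrated with a toy foraging proxy like the one below; the noise values, success criterion, and robot count are invented for illustration and are not the paper's experimental setup.

```python
import random

# Illustrative toy comparison (not the paper's experiment): foraging success
# when all robots share one error model versus each robot having its own.
def run_trial(noise_stds, n_items=100):
    collected = 0
    for _ in range(n_items):
        sigma = random.choice(noise_stds)           # each retrieval handled by a random robot
        heading_error = abs(random.gauss(0.0, sigma))
        if heading_error < 0.5:                     # crude success criterion (radians)
            collected += 1
    return collected

random.seed(0)
uniform = [0.2] * 10                                # uniform error: identical noise for all robots
diverse = [random.uniform(0.05, 0.35) for _ in range(10)]   # error diversity: per-robot noise
print("uniform error:", run_trial(uniform), "items; diverse error:", run_trial(diverse), "items")
```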

Stroke is a major global issue, affecting millions every year. When a stroke occurs, survivors are often left with physical disabilities or difficulties, frequently marked by abnormal gait. Post-stroke gait normally presents as one of or a combination of unilaterally shortened step length, decreased dorsiflexion during swing phase, and decreased walking speed. These factors lead to an increased chance of falling and an overall decrease in quality of life due to a reduced ability to locomote quickly and safely under one’s own power. Many current rehabilitation techniques fail to show lasting results that suggest the potential for producing permanent changes. As technology has advanced, robot-assisted rehabilitation appears to have a distinct advantage, as the precision and repeatability of such an intervention are not matched by conventional human-administered therapy. The possible role in gait rehabilitation of the Variable Stiffness Treadmill (VST), a unique, robotic treadmill, is further investigated in this paper. The VST is a split-belt treadmill that can reduce the vertical stiffness of one of the belts, while the other belt remains rigid. In this work, we show that the repeated unilateral stiffness perturbations created by this device elicit an aftereffect of increased step length that is seen for over 575 gait cycles with healthy subjects after a single 10-min intervention. These long aftereffects are currently unmatched in the literature according to our knowledge. This step length increase is accompanied by kinematics and muscle activity aftereffects that help explain functional changes and have their own independent value when considering the characteristics of post-stroke gait. These results suggest that repeated unilateral stiffness perturbations could possibly be a useful form of post-stroke gait rehabilitation.

In the current industrial context, the importance of assessing and improving workers' health conditions is widely recognised. Both physical and psycho-social factors contribute to jeopardising the underlying comfort and well-being, boosting the occurrence of diseases and injuries, and affecting their quality of life. Human-robot interaction and collaboration frameworks stand out among the possible solutions to prevent and mitigate workplace risk factors. The increasingly advanced control strategies and planning schemes featured by collaborative robots have the potential to foster fruitful and efficient coordination during the execution of hybrid tasks, by meeting their human counterparts' needs and limits. To this end, a thorough and comprehensive evaluation of an individual's ergonomics, i.e. direct effect of workload on the human psycho-physical status, must be taken into account. In this review article, we provide an overview of the existing ergonomics assessment tools as well as the available monitoring technologies to drive and adapt a collaborative robot's behaviour. Preliminary attempts of ergonomic human-robot collaboration frameworks are presented next, discussing state-of-the-art limitations and challenges. Future trends and promising themes are finally highlighted, aiming to promote safety, health, and equality in worldwide workplaces.



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2023: 29 May–2 June 2023, LONDON
RoboCup 2023: 4–10 July 2023, BORDEAUX, FRANCE
RSS 2023: 10–14 July 2023, DAEGU, KOREA
IEEE RO-MAN 2023: 28–31 August 2023, BUSAN, KOREA

Enjoy today’s videos!

Following the great success of the miniature humanoid robot DARwIn-OP we developed, RoMeLa is proud to introduce the next-generation humanoid robot for research and education, BRUCE (Bipedal Robot Unit with Compliance Enhanced). BRUCE is an open-platform humanoid robot that utilizes BEAR proprioceptive actuators, enabling it to have stunning dynamic performance capabilities never before seen in this class of robots. Originally developed at RoMeLa in a joint effort with Westwood Robotics, BRUCE will be made open source to the robotics community and will also be made available via Westwood Robotics.

BRUCE has a total of 16 DoF, is 70 cm in height, and weighs only 4.8 kg. With a 3,000-mAh lithium battery, it can last for about 20 minutes of continuous dynamic motions. Besides its excellent dynamic performance, BRUCE is very robust and user-friendly, with great compatibility and expandability. BRUCE makes humanoid robotics research efficient, safe, and fun.

[ Westwood Robotics ]

This video shows evoBOT, a dynamically stable and autonomous transport robot.

[ Fraunhofer IML ]

ASL Team wishes you all the best for 2023 :-)

[ ASL ]

Holidays are a magical time. But if you feel like our robot dog Marvin, the magic needs to catch up and find you. Keep your eyes and heart open for possibilities – jolliness is closer than you realize!

[ Accenture Baltics ]

In this Christmas clip, the robots of a swarm transport Christmas decorations and cooperate to carry the decorated tree. Each robot has enough strength to carry the decorations itself; however, no robot can carry the tree on its own. The solution: they carry the tree by working together!

[ Demiurge ]

Thanks David!

Our VoloDrone team clearly got the holiday feels in snowy Germany while sling load testing cargo – definitely a new way of disposing of a Christmas tree before the New Year.

[ Volocopter ]

What if we race three commercially available quadruped robots for a bit of fun...? Out of the box configuration, ‘full sticks forward’ on the remotes on flat ground. Hope you enjoy the results ;-)

[ CSIRO Data61 ]

Happy Holidays From Veo!

[ Veo ]

In ETH Zurich’s Soft Robotics Lab, a white robot hand reaches for a beer can, lifts it up and moves it to a glass at the other end of the table. There, the hand carefully tilts the can to the right and pours the sparkling, gold-coloured liquid into the glass without spilling it. Cheers!

[ SRL ]

Bingo (aka Santa) found herself a new sleigh! All of us at CSIRO’s Data61 Robotics and Autonomous Systems Group wish everyone a Merry Christmas and Happy Holidays!

[ CSIRO Data61 ]

From 2020, a horse-inspired walking robot.

[ Ishikawa Minami Lab ]

Landing an unmanned aerial vehicle (UAV) on top of an unmanned surface vehicle (USV) in harsh open waters is a challenging problem, owing to forces that can damage the UAV due to a severe roll and/or pitch angle of the USV during touchdown. To tackle this, we propose a novel model predictive control (MPC) approach enabling a UAV to land autonomously on a USV in these harsh conditions.

[ MRS CTU ]

GITAI has a fancy new office in Los Angeles that they’re filling with space robots.

[ GITAI ]

This Maryland Robotics Center seminar is from CMU’s Vickie Webster-Wood: “It’s Alive! Bioinspired and biohybrid approaches towards life-like and living robots.”

In this talk, I will share efforts from my group in our two primary research thrusts: Bioinspired robotics, and biohybrid robotics. By using neuromechanical models and bioinspired robots as tools for basic research we are developing new models of how animals achieve multifunctional, adaptable behaviors. Building on our understanding of animal systems and living tissues, our research in biohybrid robotics is enabling new approaches toward the creation of autonomous biodegradable living robots. Such robotic systems have future applications in medicine, search and rescue, and environmental monitoring of sensitive environments (e.g., coral reefs).

[ UMD ]



Even simple robotic grippers can perform complex tasks—so long as they're smart about using their environment as a handy aide. This, at least, is the finding of new research from Carnegie Mellon University's Robotics Institute.

In robotics, simple grippers are typically assigned straightforward tasks such as picking up objects and placing them somewhere. However, by making use of their surroundings, such as pushing an item against a table or wall, simple grippers can perform skillful maneuvers usually thought achievable only by more complex, fragile and expensive, multi-fingered artificial hands.

However, previous research on this strategy, known as “extrinsic dexterity,” often made assumptions about the way in which grippers would grasp items. This in turn required specific gripper designs or robot motions.

“Simple grippers are underrated.”
—Wenxuan Zhou, Carnegie Mellon University

In the new study, scientists used AI to overcome these limitations, applying extrinsic dexterity to more general settings and successfully grasping items of various sizes, weights, shapes, and surfaces.

“This research may open up new possibilities in manipulation with a simple gripper,” says study lead author Wenxuan Zhou at Carnegie Mellon University. “Potential applications include warehouse robots or housekeeping robots that help people to organize their home.”

The researchers employed reinforcement learning to train a neural network. They had the AI system attempt random actions to grasp an object, rewarding the series of actions that led to success. The system thus ultimately adopted the most successful patterns of behavior. It learned, in so many words. After first training their system in a physics simulator, they then tested it on a simple robot with a pincer-like gripper.
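
For readers unfamiliar with this training recipe, the toy sketch below shows the same reward-driven trial-and-error idea in its simplest tabular form; the two-step "push to the wall, then grasp" environment and all values are our own stand-ins, not the team's simulator or algorithm.

```python
import random

# Toy tabular reinforcement learning: try actions, reward sequences that end in
# a successful grasp, and reinforce them. Purely illustrative.
ACTIONS = ["push_to_wall", "lever_up", "close_gripper"]

def step(state, action):
    # succeed only if the object is first pushed to the wall, then grasped
    if state == "free" and action == "push_to_wall":
        return "at_wall", 0.0, False
    if state == "at_wall" and action == "close_gripper":
        return "done", 1.0, True
    return state, 0.0, False

Q = {(s, a): 0.0 for s in ("free", "at_wall") for a in ACTIONS}
random.seed(0)
for _ in range(2000):
    state, done = "free", False
    for _ in range(4):                               # short episodes
        if random.random() < 0.2:                    # explore with random actions
            action = random.choice(ACTIONS)
        else:                                        # otherwise exploit what worked
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        nxt, reward, done = step(state, action)
        best_next = 0.0 if done else max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += 0.1 * (reward + 0.9 * best_next - Q[(state, action)])
        state = nxt
        if done:
            break

print(max(ACTIONS, key=lambda a: Q[("free", a)]))    # learned first move: push_to_wall
```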

The scientists had the robot attempt to grab items confined within an open bin that were initially oriented in ways that meant the robot could not pick them up. For example, the robot might be given an object that was too wide for its gripper to grasp. The AI needed to figure out a way to push the item against the wall of the bin so the robot could then grab it from its side.

“Initially, we thought the robot might try to do something like scooping underneath the object, as humans do,” Zhou says. “However, the algorithm gave us an unexpected answer.” After nudging an item against the wall, the robot pushed its top finger against the side of the object to lever it up, “and then let the object drop on the bottom finger to grasp it.”

In experiments, Zhou and her colleagues tested their system on items such as cardboard boxes, plastic bottles, a toy purse, and a container of Cool Whip, which varied in weight, shape, and slipperiness. They found the simple gripper could grasp these items with a 78 percent success rate.

“Simple grippers are underrated,” Zhou says. “Robots should exploit extrinsic dexterity for more skillful manipulation.”

In the future, the group hopes to generalize its findings to “a wider range of objects and scenarios,” Zhou says. “We are also interested in exploring more complex tasks with a simple gripper with extrinsic dexterity.”

The scientists detailed their findings 18 December at the Conference on Robot Learning in Auckland, New Zealand.



2022 was a huge year for robotics. Yes, I might say this every year, and yes, every year I might also say that each year is more significant than any other. But seriously: This year trumped them all. After a tough pandemic (which, let’s be clear, is still not over), conferences and events have started to come back, research has resumed, and robots have continued to make their way into the world. It really has been a great year.

And on a personal note, we’d like to thank you, all of you, for reading (and hopefully enjoying) our work. We’d be remiss if we didn’t also thank those of you who provide awesome stuff for us to write about. So, please enjoy this quick look back at some of our most popular and most impactful stories of 2022. Here’s wishing for more and better in 2023!

The Bionic-Hand Arms Race

Robotic technology can be a powerful force for good, but using robots to make the world a better place has to be done respectfully. This is especially true when what you’re working on has a direct physical impact on a user, as is the case with bionic limbs. Britt Young has a more personal perspective on this than most, and in this article, she wove together history, technology, and her own experience to explore bionic limb design. With over 100,000 views, this was our most popular robotics story of 2022.

For Better or Worse, Tesla Bot Is Exactly What We Expected

After Elon Musk announced Tesla’s development of a new humanoid robot, we were left wondering whether the car company would be able to somehow deliver something magical. We found out this year that the answer is a resounding “Not really.” There was nothing wrong with Tesla Bot, but it was immediately obvious that Tesla had not managed to do anything groundbreaking with it, either. While there is certainly potential for the future, at this point it’s just another humanoid robot with a long and difficult development path ahead of it.

Autonomous Drones Challenge Human Champions in First “Fair” Race

Usually, the kinds of things that humans are really good at and the kinds of things that robots are really good at don’t overlap all that much. So, it’s always impressive when robots get anywhere close to human performance in activities that play to our strengths. This year, autonomous drones from the University of Zurich managed for the first time to defeat the best human pilots in the world in a “fair” drone race, where both humans and robots relied entirely on their onboard brains and visual perception.

How Robots Can Help Us Act and Feel Younger

Gill Pratt has a unique perspective on the robotics world, having gone from academia to DARPA program manager to his current role as CEO of the Toyota Research Institute (TRI). His leadership position at TRI means that he can visualize how to make robots that best help humanity, and then actually work toward putting that vision into practice—commercially and at scale. His current focus is assistive robots that help us live fuller, happier lives as we age.

DARPA’s RACER Program Sends High-Speed Autonomous Vehicles Off-Road

Getting autonomous vehicles to drive themselves is not easy, but the fact that they work even as well as they do is arguably due to the influence of DARPA’s 2005 Grand Challenge. That’s why it’s so exciting to hear about DARPA’s newest autonomous vehicle challenge, aimed at putting fully autonomous vehicles out into the wilderness to fend for themselves completely off-road.

Boston Dynamics AI Institute Targets Basic Research

Boston Dynamics is arguably best known for developing amazing robots with questionable practicality. As the company seeks to change that by exploring commercial applications for its existing platforms, founder Marc Raibert has decided to keep focusing on basic research by starting a completely new institute with the backing of Hyundai.

Alphabet’s Intrinsic Acquires Majority of Open Robotics

The Open Source Robotics Foundation (OSRF) spun out of Willow Garage 10 years ago. This year’s acquisition of most of the Open Robotics team by Alphabet’s Intrinsic represents a milestone for the Robot Operating System (ROS). The fact that it’s even possible for Open Robotics to move on like this is a testament to just how robust the ROS community is. The Open Robotics folks will still be contributing to ROS, with a much smaller OSRF supporting the community directly. But it’s hard to say goodbye to what OSRF used to be.

The 11 Commandments of Hugging Robots

Hugging robots is super important to me, and it should be important to you, too! And to everyone, everywhere! While, personally, I’m perfectly happy to hug just about any robot, very few of them can hug back—at least in part because the act of hugging is a complex human interaction task that requires either experience being a human or a lot of research for a robot. Much of that research has now been done, giving robots some data-driven guidelines about how to give really good hugs.


Labrador Addresses Critical Need With Deceptively Simple Home Robot

It’s not often that we see a new autonomous home robot with a compelling use case. But this year, Labrador Systems introduced Retriever, a semi-autonomous mobile table that can transport objects for folks with mobility challenges. If Retriever doesn’t sound like a big deal, that’s probably because you have no use for a robot like this; but it has the potential to make a huge impact on people who need it.

Even as It Retires, ASIMO Still Manages to Impress

ASIMO has been setting the standard for humanoid robots for literally a decade. Honda’s tiny humanoid was walking, running, and jumping back in 2011 (!)—and that was just the most recent version. ASIMO has been under development since the mid-1980s, which is some seriously ancient history as far as humanoid robots go. Honda decided to retire the little white robot this year, but ASIMO’s legacy lives on in Honda’s humanoid robot program. We’ll miss you, buddy.




Video Friday is your weekly selection of awesome robotics videos (special holiday edition!) collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2023: 29 May–2 June 2023, LONDON
RoboCup 2023: 4–10 July 2023, BORDEAUX, FRANCE
RSS 2023: 10–14 July 2023, DAEGU, KOREA
IEEE RO-MAN 2023: 28–31 August 2023, BUSAN, KOREA

Enjoy today’s videos!

We hope you have an uplifting holiday season! Spot was teleoperated by professional operators; don’t try this at home.

[ Boston Dynamics ]

This year, our robot Husky was very busy working for the European Space Agency (ESA). But will he have to spend Christmas alone, apart from his robot friends at the FZI – alone on the moon? His friends want to change that! So, they train very hard to reunite with Husky! Will they succeed?

[ FZI ]

Thanks, Arne!

We heard Santa is starting to automate at the North Pole and loads the sledge with robots now. Enjoy our little Christmas movie!

[ Leverage Robotics ]

Thanks, Roman!

A self-healing soft robotic finger developed by VUB-imec Brubotics and FYSC signals “MERRY XMAS” to the world in Morse code.

[ Brubotics ]

Thanks, Bram!

After the research team made some gingerbread houses, we wanted to see how Nadia would do walking over them. Happy Holidays everyone!

[ IHMC Robotics ]

In this festive robotic Christmas sketch, a group of highly advanced robots come together to celebrate the holiday season. The “Berliner Hochschule für Technik” wishes a merry Christmas and a happy new year!

[ BHT ]

Thanks, Hannes!

Our GoFa cobot had a fantastic year and is ready for new challenges in the new year, but right now, it’s time for some celebrations with some delicious cobot-made cookies.

[ ABB ]

Helping with the office tree, from Sanctuary AI.

Flavor text from the video description: “Decorated Christmas trees originated during the 16th-century in Germany. Protestant reformer Martin Luther is known for being among the first major historical figures to add candles to an evergreen tree. It is unclear whether this was, even then, considered to be a good idea.”

[ Sanctuary ]

Merry Christmas from qbrobotics!

[ qbrobotics ]

Christmas, delivered by robots!

[ Naver Labs ]

Bernadett dressed Ecowalker in Xmas lights. Enjoy the holidays!

[ Max Planck ]

Warmest greetings this holiday season and best wishes for a happy New Year from Kawasaki Robotics.

[ Kawasaki Robotics ]

Robotnik wishes you a Merry Christmas 2022.

[ Robotnik ]

CYBATHLON wishes you all a happy festive season and a happy new year 2023!

[ Cybathlon ]

Here’s what LiDAR-based SLAM in a snow gust looks like. Enjoy the weather out there!

[ NORLAB ]

We present advances in the development of proactive control for online individual user adaptation in a welfare robot guidance scenario. The proposed control approach can drive a mobile robot to autonomously navigate relevant indoor environments. All in all, this study spans a wide range of research, from robot control technology development to technology validation in a relevant environment and a system prototype demonstration in an operational environment (i.e., an elderly care center).

[ Paper ]

Thanks, Poramate!

“Every day in a research job :)”

[ Chengxu Zhou ]

Robots like Digit are purpose-built to do tasks in environments made for humans. We aren’t trying to just mimic the look of people or make a humanoid robot. Every design and engineering decision is looked at through a function-first lens. To easily walk into warehouses and work alongside people, and to do the kinds of dynamic reaching, carrying, and walking that we do, Digit has some similar characteristics. Our co-founder and chief technology officer, Jonathan Hurst, discusses the difference between humanoid and human-centric robotics.

[ Agility Robotics ]

This year, the KUKA Innovation Award is all about medicine and health. After all, new technologies are playing an increasingly important role in healthcare and will be virtually indispensable in the future. Researchers, developers and young entrepreneurs from all over the world submitted their concepts for the “Robotics in Healthcare Challenge”. An international jury of experts evaluated the concepts and selected our five finalists.

[ Kuka ]

In the summer of 2003, two NASA rovers began their journeys to Mars at a time when the Red Planet and Earth were the nearest they had been to each other in 60,000 years. To capitalize on this alignment, the rovers had been built at breakneck speed by teams at NASA’s Jet Propulsion Laboratory. The mission came amid further pressures, from mounting international competition to increasing public scrutiny following the loss of the space shuttle Columbia and its crew of seven. NASA was in great need of a success.

“Landing on Mars” is the story of Opportunity and Spirit surviving a massive solar flare during cruise, the now well-known “six minutes of terror,” and what came close to being a mission-ending software error for the first rover once it was on the ground.

[ JPL ]



Speech-to-text engines are in high demand for many applications and represent an essential enabler of human–robot interaction. Still, some languages suffer from a lack of labeled speech data, especially the Arabic dialects and other low-resource languages. Self-supervised training and self-training with noisy student training have proven to be among the most promising solutions. This article proposes an end-to-end, transformer-based model with a framework for low-resource languages. In addition, the framework incorporates customized audio-to-text processing algorithms to achieve a highly efficient Jordanian Arabic dialect speech-to-text system. The proposed framework enables ingesting data from many sources and makes ground truth from external sources possible by speeding up the manual annotation process. The framework allows training with noisy student training and self-supervised learning to utilize unlabeled data in both the pre- and post-training stages, and it incorporates multiple types of data augmentation. The proposed self-training approach outperforms the fine-tuned Wav2Vec model by 5% in terms of word-error-rate reduction. The outcome of this work provides the research community with a Jordanian-spoken dataset along with an end-to-end approach for dealing with low-resource languages, by utilizing the power of pretraining, post-training, and injecting noisy labeled and augmented data with minimal human intervention. It enables the development of new applications in the Arabic speech-to-text area, such as question-answering systems and intelligent control systems, and it will add human-like perception and hearing sensors to intelligent robots.
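
Since the headline result is quoted as a word-error-rate (WER) reduction, here is a minimal sketch of how WER is typically computed as a word-level edit distance; the example strings are hypothetical and unrelated to the Jordanian Arabic dataset.

```python
# Minimal word-error-rate (WER) sketch: edit distance over words via dynamic
# programming, normalized by the reference length. Illustrative only.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits to turn the first i reference words into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution / match
    return dp[-1][-1] / max(len(ref), 1)

print(wer("send the robot to the kitchen", "send robot to the kitchens"))  # 2/6 ≈ 0.33
```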

The fifth industrial revolution and the accompanying influences of digitalization are presenting enterprises with significant challenges. Regardless of the trend, however, humans will remain a central resource in future factories and will continue to be required to perform manual tasks. Against the backdrop of societal and demographic changes and skills shortages, future-oriented support technologies such as exoskeletons represent a promising opportunity to support workers. Accordingly, the increasing interconnection of human operators, devices, and the environment, especially in human-centered work processes, requires improved human-machine interaction and the further development of support systems into smart devices. To meet these requirements and establish exoskeletons as a future-proof technology, this article presents a framework for the future-oriented qualification of exoskeletons, which reveals potential in terms of user-individual and context-dependent adaptivity of support systems. In this context, a framework has been developed that allows different support situations to be classified based on elementary functions. Using these support-function dependencies and characteristics, it becomes possible to describe adaptive system behavior for human-centered support systems such as exoskeletons. For practical illustration, an exemplary active exoskeleton is used to show how user-individual and context-specific support characteristics can enable a purposeful, needs-based application for users and contribute to the design of future workplaces.
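
To make the idea of classifying support situations by elementary functions and adapting assistance to the user and context more concrete, here is a small hypothetical sketch; the function categories, weights, and thresholds are invented for illustration and are not part of the article's framework.

```python
from dataclasses import dataclass

# Hypothetical illustration: classify a support situation by its elementary
# function and adapt the assistance level to load and user state. All
# categories and numbers are invented, not taken from the article.
@dataclass
class SupportSituation:
    elementary_function: str   # e.g. "lifting", "overhead_work", "carrying"
    load_kg: float
    user_fatigue: float        # 0 (fresh) .. 1 (exhausted)

def support_level(s: SupportSituation) -> float:
    base = {"lifting": 0.6, "overhead_work": 0.8, "carrying": 0.4}.get(s.elementary_function, 0.2)
    context = min(s.load_kg / 20.0, 1.0)        # heavier loads -> more assistance
    user = 0.5 + 0.5 * s.user_fatigue           # fatigued users -> more assistance
    return min(base * context * user + 0.1, 1.0)  # normalized assistance command

print(support_level(SupportSituation("overhead_work", load_kg=12.0, user_fatigue=0.7)))
```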



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2023: 29 May–2 June 2023, LONDON
RoboCup 2023: 4–10 July 2023, BORDEAUX, FRANCE
RSS 2023: 10–14 July 2023, DAEGU, KOREA
IEEE RO-MAN 2023: 28–31 August 2023, BUSAN, KOREA

Enjoy today’s videos!

Well, now humans aren’t necessary for launching drones, landing drones, charging drones, or flying drones. Thanks, Skydio!

[ Skydio ]

Do not underestimate the pleasure of hearing little metal feet climbing up metal walls.

[ Science Robotics ]

The latest in the Zoox testing series, this video showcases how Zoox tests the maneuverability capabilities of its robotaxi, which are critical for operation in dense urban environments. Four-wheel steering, bidirectional design, and active suspension are some of the features integrated into the Zoox robotaxi to ensure every ride is a smooth ride.

[ Zoox ]

Thanks, Whitney!

The Ligō device is a novel 3D bioprinting platform that supports the functional healing of skin tissue after acute skin injuries such as extensive burns. It is developed in Australia by an interdisciplinary team at Sydney-based startup Inventia Life Science. The Ligō robot prints tiny droplets containing the patient’s skin cells and optimized biomaterials into the wound directly in the operating room, combining the Kuka LBR Med and Inventia’s patented 3D bioprinting technology. In this way, tissue-guided regeneration is stimulated, allowing the body to heal itself and restore healthy skin that improves the quality of life for skin-injury survivors.

[ Inventia ]

In the first quarter of 2022, our group demoed ANYmal and Spot carrying out automated inspection at Chevron’s blending plant in Ghent, Belgium.

[ ORI ]

I have to think that for teleoperation, this is much harder than it looks.

[ Sanctuary AI ]

Meet the software development engineers from Amazon’s Global Ops Robotics who are working together to deliver innovations that will shape the future of Amazon operations.

[ Amazon ]

This video highlights the impact of Covariant’s AI-powered Robotic Putwall, at Capacity, a third-party logistics company serving some of the world’s largest e-commerce brands. Affectionately named Waldo, the autonomous put wall has been fulfilling thousands of customer orders at over 500 picks per hour, with less than 0.1 percent of them needing human intervention.

[ Covariant ]

What does Moxie do? Best to just ask Moxie.

[ Embodied ]

I’m not sure what this is, but I’ll be watching!

[ Fraunhofer ]

It still kind of blows my mind that you can just go and buy yourself a robot dog.

[ Trossen ]

Here are a series of talks from the Can We Build Baymax? workshop, focusing on education and open source for humanoid robots.

[ CWBB ]

This University of Pennsylvania GRASP on Robotics talk is from Harold Soh at the National University of Singapore: “Towards Trustworthy Robots That Interact With People.”

What will it take to develop robots that work with us in real-world tasks? In this talk, we’ll discuss some of our work across the autonomy stack of a robot as we make progress towards an answer. We’ll begin with multimodal sensing and perception, and then move on to modeling humans with little data. We’ll end with the primary insights gained in our journey and a discussion of challenges in deriving robots that we trust to operate in social environments.

[ UPenn ]


