
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IEEE SSRR 2023: 13–15 November 2023, FUKUSHIMA, JAPAN
Humanoids 2023: 12–14 December 2023, AUSTIN, TEX.
Cybathlon Challenges: 02 February 2024, ZURICH, SWITZERLAND
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE

Enjoy today’s videos!

An overview of ongoing work by Hello Robot, UIUC, UW, and Robots for Humanity to empower Henry Evans’ independence through the use of the mobile manipulator Stretch.

And of course, you can read more about this project in this month’s issue of Spectrum magazine.

[ Hello Robot ]

At KIMLAB, we have a unique way of carving Halloween pumpkins! Our MOMO (Mobile Object Manipulation Operator) is equipped with PAPRAS arms featuring prosthetic hands, allowing it to use human tools.

[ KIMLAB ]

This new haptic system from CMU seems actually amazing, although watching the haptic arrays pulse is wigging me out a little bit for some reason.

[ Fluid Reality Group ]

We are excited to introduce you to the Dingo 1.5, the next generation of our popular Dingo platform! With enhanced hardware and software updates, the Dingo 1.5 is ready to tackle even more challenging tasks with ease.

[ Clearpath ]

A little bit of a jump scare here from ANYbotics.

[ ANYbotics ]

Happy haunting from Boston Dynamics!

[ Boston Dynamics ]

I’m guessing this is some sort of testing setup but it’s low-key terrifying.

[ Flexiv ]

KUKA has teamed up with Augsburger Puppenkiste to build a mobile show cell in which two robots do the work of the puppeteers.

[ KUKA ]

In this video, we showcase the capabilities of the Advanced Grasping premium software package. We demonstrate how TIAGo picks up and places objects, how its gripper adapts to different shapes, and the robot’s perception and manipulation capabilities.

[ PAL Robotics ]

HEBI Robotics produces a platform for robot development. Our long-term vision is to make it easy and practical for any worker, technician, farmer, etc. to create robots as needed. Today, the platform is used by researchers around the world, and HEBI uses it to solve challenging automation tasks related to inspections and maintenance.

[ HEBI Robotics ]

Folded robotics is a rapidly growing field that is revolutionizing how we think about robots. Taking inspiration from the ancient art of origami results in thinner, lighter, more flexible autonomous robots.

[ NSF ]

“Can I have a pet T-Rex?” is a short interdisciplinary portrait documentary featuring paleontologist and Kod*lab postdoc Aja Mia Carter and Kod*lab robotics researchers, postdoc Wei-Hsi Chen and Ph.D. student J. Diego Caporale. Dr. Chen applies the art of origami to make a hopping robot, while Mr. Caporale adds a degree of freedom to the spine of a quadruped robot to interrogate ideas about twisting and locomotion. An expert in the evolution of tetrapod spines from 380 million years ago, Dr. Carter is still motivated by her childhood dream of a pet T-Rex, but how can these robotics researchers get her closer to her vision?

[ Kodlab ]

Collaborative robots (in short: cobots) have the potential to assist workers with physically or cognitively demanding tasks. However, it is crucial to recognize that such assistance can have both positive and negative effects on job quality. A key aspect of human-robot collaboration is the interdependence between human and robotic tasks. This interdependence influences the autonomy of the operator and can impact the work pace, potentially leading to a situation where the human’s work pace becomes reliant on that of the robot. Given that autonomy and work pace are essential determinants of job quality, design decisions concerning these factors can greatly influence the overall success of a robot implementation. The impact of autonomy and work pace was systematically examined through an experimental study conducted in an industrial assembly task. 20 participants engaged in collaborative work with a robot under three conditions: human lead (HL), fast-paced robot lead (FRL), and slow-paced robot lead (SRL). Perceived workload was used as a proxy for job quality. The perceived workload associated with each condition was assessed with the NASA Task Load Index (TLX). Specifically, the study aimed to evaluate the role of human autonomy by comparing the perceived workload between HL and FRL conditions, as well as the influence of robot pace by comparing SRL and FRL conditions. The findings revealed a significant correlation between a higher level of human autonomy and a lower perceived workload. Furthermore, a decrease in robot pace was observed to result in a reduction of two specific factors measuring perceived workload, namely cognitive and temporal demand. These results suggest that interventions aimed at increasing human autonomy and appropriately adjusting the robot’s work pace can serve as effective measures for optimizing the perceived workload in collaborative scenarios.
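The within-subject comparison the abstract describes (the same 20 participants rated under each condition) can be sketched with a hand-computed paired t-test. This is a minimal illustration, not the authors' analysis: the variable names and all numbers below are invented for the sketch.

```python
import numpy as np

# Hypothetical NASA TLX scores for the HL vs. FRL comparison described
# above: one score per participant per condition (numbers are invented).
rng = np.random.default_rng(0)
n = 20                                    # participants, as in the study
tlx_hl = rng.normal(40, 8, n)             # human-lead condition
tlx_frl = tlx_hl + rng.normal(10, 5, n)   # fast-paced robot lead, higher load

# Paired t statistic: mean of per-participant differences over its
# standard error. A large positive t indicates higher workload under FRL.
diff = tlx_frl - tlx_hl
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))
print(f"mean TLX difference: {diff.mean():.1f}, paired t = {t_stat:.2f}")
```

A paired design is what makes n = 20 workable here: each participant serves as their own control, so between-person variability in baseline workload cancels out of the differences.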

The term “world model” (WM) has surfaced several times in robotics, for instance, in the context of mobile manipulation, navigation and mapping, and deep reinforcement learning. Despite its frequent use, the term does not appear to have a concise definition that is consistently used across domains and research fields. In this review article, we bootstrap a terminology for WMs, describe important design dimensions found in robotic WMs, and use them to analyze the literature on WMs in robotics, which spans four decades. Throughout, we motivate the need for WMs by using principles from software engineering, including “Design for use,” “Do not repeat yourself,” and “Low coupling, high cohesion.” Concrete design guidelines are proposed for the future development and implementation of WMs. Finally, we highlight similarities and differences between the use of the term “world model” in robotic mobile manipulation and deep reinforcement learning.

Objectives: Hyolaryngeal movement during swallowing is essential to airway protection and bolus clearance. Although palpation is widely used to evaluate hyolaryngeal motion, insufficient accuracy has been reported. The Bando Stretchable Strain Sensor for Swallowing (B4S™) was developed to capture hyolaryngeal elevation and display it as waveforms. This study compared laryngeal movement time detected by the B4S™ with laryngeal movement time measured by videofluoroscopy (VF).

Methods: Participants were 20 patients without swallowing difficulty (10 men, 10 women; age 30.6 ± 7.1 years). The B4S™ was attached to the anterior neck, and two saliva swallows were recorded to determine the relative and absolute reliability of laryngeal elevation time as measured on VF images and as measured by the B4S™.

Results: The intra-class correlation coefficient of the VF and B4S™ times was very high [ICC (1.1) = 0.980]. A Bland–Altman plot showed a strong positive correlation with a 95% confidence interval of 0.00–3.01 for the mean VF time and mean B4S™ time, with a fixed error detected in the positive direction but with no proportional error detected. Thus, the VF and B4S™ time measurements showed high consistency.

Conclusion: The strong relative and absolute reliability suggest that the B4S™ can accurately detect the duration of superior-inferior laryngeal motion during swallowing. Further study is needed to develop a method for measuring the distance of laryngeal elevation. It is also necessary to investigate the usefulness of this device for evaluation and treatment in clinical settings.
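The two agreement statistics reported above, ICC(1,1) and a Bland–Altman analysis, are standard formulas that can be computed directly. Below is a minimal sketch under the usual textbook definitions (one-way random-effects ICC for a single measurement; limits of agreement as bias ± 1.96 SD of the paired differences); the function names are mine, and no data from the study is used.

```python
import numpy as np

def icc_1_1(ratings):
    """One-way random-effects ICC for single measurements.

    ratings: (n_subjects, k_measurements) array, e.g. one column per
    method (VF time, B4S time) for each swallow.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    subj_means = ratings.mean(axis=1)
    ms_between = k * np.sum((subj_means - grand) ** 2) / (n - 1)
    ms_within = np.sum((ratings - subj_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements."""
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

An ICC(1,1) near 1.0, as the study reports (0.980), means almost all variance lies between swallows rather than between the two measurement methods; the Bland–Altman bias then shows whether one method reads systematically longer, which is the "fixed error in the positive direction" the abstract mentions.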