IEEE Spectrum Robotics


Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

Humanoids 2023: 12–14 December 2023, AUSTIN, TEX.
Cybathlon Challenges: 2 February 2024, ZURICH, SWITZERLAND
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN

Enjoy today’s videos!

This magnetically actuated soft robot is perhaps barely a robot by most definitions, but I can’t stop watching it flop around.

In this work, Ahmad Rafsanjani, Ahmet F. Demirörs, and co‐workers from SDU (DK) and ETH (CH) introduce kirigami into a soft magnetic sheet to achieve bidirectional crawling under rotating magnetic fields. Experimentally characterized crawling and deformation profiles, combined with numerical simulations, reveal programmable motion through changes in cut shape, magnet orientation, and translational motion. This work offers a simple approach toward untethered soft robots.

[ Paper ] via [ SDU ]

Thanks, Ahmad!

Winner of the earliest holiday video is the LARSEN team at Inria!

[ Inria ]

Thanks, Serena!

Even though this is just a rendering, I really appreciate Apptronik being like, “we’re into the humanoid thing, but sometimes you just don’t need legs.”

[ Apptronik ]

We’re not allowed to discuss unmentionables here at IEEE Spectrum, so I can only tell you that Digit has started working in a warehouse handling, uh, things.

[ Agility ]

Unitree’s sub-$90,000 H1 humanoid suffers some abuse in a non-PR video.

[ Impress ]

Unlike me, ANYmal can perform 24/7 in all weather.

[ ANYbotics ]

Most of the world will need to turn on subtitles for this, but it’s cool to see how industrial robots can be used to make art.

[ Kuka ]

I was only 12 when this episode of Scientific American Frontiers aired, but I totally remember Alan Alda meeting Flakey!

And here’s the segment; it’s pretty great.

[ SRI ]

Agility CEO Damion Shelton talks about the hierarchy of robot control and draws similarities to the process of riding a horse.

[ Agility ]

Seeking to instill students with real-life workforce skills through hands-on learning, teachers at Central High School in Louisville, Ky., incorporated Spot into their curriculum. For students at CHS, a magnet school for Jefferson County Public Schools district, getting experience with an industrial robot has sparked a passion for engineering and robotics, kickstarted advancement into university engineering programs, and built lifelong career skills. See how students learn to operate Spot, program new behaviors for the robot, and inspire their peers with the school’s “emotional support robot” and unofficial mascot.

[ Boston Dynamics ]

This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

Thanks to eons of evolution, vines have the ability to seek out light sources, growing in the direction that will optimize their chances of absorbing sunlight and thriving. Now, researchers have succeeded in creating a vine-inspired crawling bot that can achieve similar feats, seeking out and moving towards light and heat sources. It’s described in a study published last month in IEEE Robotics and Automation Letters.

Shivani Deglurkar, a Ph.D. candidate in the department of Mechanical and Aerospace Engineering at the University of California, San Diego, helped co-design these automated “vines.” Because of its light- and heat-seeking abilities, the system doesn’t require a complex centralized controller. Instead, the “vines” automatically move towards a desired target. “[Also], if some of the vines or roots are damaged or removed, the others remain fully functional,” she notes.

While the tech is still in its infancy, Deglurkar says she envisions it helping in different applications related to solar tracking, or perhaps even in detecting and fighting smoldering fires.

It uses a novel actuator that contracts in the presence of light, causing it to gravitate towards the source. Shivani Deglurkar et al.

To help the device automatically gravitate towards heat and light, Deglurkar’s team developed a novel actuator. It uses a photo absorber in low-boiling-point fluid, which is contained in many small, individual pouches along the sides of the vine’s body. They called this novel actuator a Photothermal Phase-change Series Actuator (PPSA).

When exposed to light, the PPSAs absorb light, heat up, inflate with vapor, and contract. As the vine body is pressurized, it elongates by unfurling material from inside its tip. “At the same time, the PPSAs on the side exposed to light contract, shortening that portion of the robot, and steering it toward the [light or heat] source,” explains Deglurkar.

Her team then tested the system, placing it at different distances from an infrared light source, and confirmed that it will gravitate towards the source at short distances. Its ability to do so depends on the light intensity, whereby stronger light sources allow the device to bend more towards the heat source.

Full turning of the vine by the PPSAs takes about 90 seconds. Strikingly, the device was even able to navigate around obstacles thanks to its inherent need to seek out light and heat sources.

Charles Xiao, a Ph.D. candidate in the department of Mechanical Engineering at the University of California, Santa Barbara, helped co-design the vine. He says he was surprised to see its responsiveness in even very low lighting. “Sunlight is about 1,000 W/m², and our robot has been shown to work at a fraction of solar intensity,” he explains, noting that a lot of comparable systems require illumination greater than that of one sun.

Xiao says that the main strength of the automated vine is its simplicity and low cost to make. But more work is needed before it can hit the market—or make its debut fighting fires. “It is slow to respond to light and heat signals and not yet designed for high-temperature applications,” explains Xiao.

Future prototypes would therefore need better performance at high temperatures and the ability to sense fires in order to be deployed in a real-world environment. Moving forward, Deglurkar says her team’s next steps include designing the actuators to be more selective to the wavelengths emitted by a fire and developing actuators with a faster response time.

Every minute counts when someone suffers a cardiac arrest. New research suggests that drones equipped with equipment to automatically restart someone’s heart could help get life-saving care to people much faster.

If your heart stops beating outside of a hospital, your chance of survival is typically less than 10 percent. One thing that can boost the prospect of pulling through is an automated external defibrillator (AED)—a device that can automatically diagnose dangerous heart rhythms and deliver an electric shock to get the heart pumping properly again.

AEDs are designed to be easy to use and provide step-by-step voice instructions, making it possible for untrained bystanders to deliver treatment before an ambulance arrives. But even though AEDs are often installed in public spaces such as shopping malls and airports, the majority of cardiac arrests outside of hospitals actually occur in homes.

A team of Swedish researchers decided to use drones to deliver AEDs directly to patients. Over the course of an 11-month trial in the suburbs of Gothenburg, the team showed they could get the devices to the scene of a medical emergency before an ambulance 67 percent of the time. Generally the AED arrived more than three minutes earlier, giving bystanders time to attach the device before paramedics reached the patient. In one case, this saved a patient’s life.

“The results are really promising because we show that it’s possible to beat the ambulance services by several minutes in a majority of cases,” says Andreas Claesson, an associate professor at the Karolinska Institute in Solna who led the research. “If you look at cardiac arrest, each minute that passes without treatment survival decreases by about 10 percent. So a time benefit of three minutes, as in this study, could potentially increase survival.”
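Claesson’s rule of thumb implies a compounding penalty for every untreated minute. As a back-of-envelope illustration (my own arithmetic, not a figure from the study), here is what a three-minute head start is worth if survival odds drop by roughly 10 percent per minute:

```python
def survival_retained(minutes_untreated: int, decay_per_minute: float = 0.10) -> float:
    """Fraction of the initial survival chance left when each untreated
    minute reduces it by about 10 percent (Claesson's rule of thumb)."""
    return (1.0 - decay_per_minute) ** minutes_untreated

# Three extra untreated minutes: 0.9 ** 3 = 0.729, i.e. roughly 27 percent
# of the patient's survival odds are lost while waiting for the ambulance.
print(f"{1 - survival_retained(3):.1%}")  # → 27.1%
```

The compounding model is a simplification; the real survival curve for cardiac arrest is steeper in the first minutes, which only strengthens the case for shaving minutes off AED delivery.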

The project was a collaboration with Gothenburg-based drone operator Everdrone and covered 194.3 square kilometers of semi-urban areas around the city, with a total population of roughly 200,000. Throughout the study period, the company operated five DJI drones that could be dispatched from hangars at five different locations around the city. The drones could autonomously fly to the scene of an emergency under the watch of a single safety supervisor. Each drone carried an AED in a basket that could be winched down from an altitude of 30 meters.

When the local emergency response center received a call about a suspected cardiac arrest or ongoing CPR, one of the drones was dispatched immediately. Once the drone reached the location, it lowered the AED to the ground. If the emergency dispatcher deemed it appropriate and safe, the person who had called in the cardiac arrest was directed to retrieve the device.


Drones weren’t dispatched for every emergency call, because they weren’t allowed to operate in rain and strong winds, in no-fly zones, or when calls came from high-rise buildings. But in a paper in the December edition of The Lancet Digital Health, the research team reported that of the 55 cases where both a drone and an ambulance reached the scene of the emergency, the drone got there first 37 times, with a median lead time of 3 minutes and 14 seconds.

Only 18 of those emergency calls actually turned out to be cardiac arrests, but in six of those cases the caller managed to apply the AED. In two cases the device recommended applying a shock, with one of the patients surviving thanks to the intervention. The number of cases is too few to make any claims about the clinical effectiveness of the approach, says Claesson, but he says the results clearly show that drones are an effective way to improve emergency response times.

“Three minutes is quite substantial,” says Timothy Chan, a professor of mechanical and industrial engineering at the University of Toronto, who has investigated the effectiveness of drone-delivered AEDs. “Given that in most parts of the world emergency response times are fairly static over time, it would be a huge win if we could achieve and sustain a big reduction like this in widespread practice.”

The approach won’t work everywhere, admits Claesson. In rural areas, the technology would likely lead to even bigger reductions in response time, but lower population density means the cases would be too few to justify the investment. And in big cities, ambulance response times are already relatively rapid and high rise buildings would make drone operation challenging.

But in the kind of semi-urban areas where the trial was conducted, Claesson thinks the technology is very promising. Each drone system costs roughly US $125,000 a year to run and can cover an area with roughly 30,000 to 40,000 inhabitants, which he says is already fairly cost-effective. But what will make the idea even more compelling is when the drones are able to respond to a wider range of emergencies.

That could involve delivering medical supplies for other time-sensitive medical emergencies like drug overdoses, allergic reactions or severe bleeding, he says. Drones equipped with cameras could also rapidly relay video of car accidents or fires to dispatchers, enabling them to tailor the emergency response based on the nature and severity of the incident.

The biggest challenge when it comes to delivering medical support such as AEDs by drone, says Claesson, is the reliance on untrained bystanders. “It’s a really stressful event for them,” he says. “Most often it’s a relative and most often they don’t know CPR and they might not know how an AED works.”

One promising future direction could be to combine drone-delivered AEDs with existing smartphone apps that are used to quickly alert volunteers trained in first aid to nearby medical emergencies. “In Sweden, in 40 percent of cases they arrive before an ambulance,” says Claesson. “We could just send a push notification to the app saying a drone will deliver an AED in two minutes, make your way to the site.”

The tricked-out version of the ANYmal quadruped, as customized by Zürich-based Swiss-Mile, just keeps getting better and better. Starting with a commercial quadruped, adding powered wheels made the robot fast and efficient, while still allowing it to handle curbs and stairs. A few years ago, the robot learned how to stand up, which is an efficient way of moving and made the robot much more pleasant to hug, but more importantly, it unlocked the potential for the robot to start doing manipulation with its wheel-hand-leg-arms.

Doing any sort of practical manipulation with ANYmal is complicated, because its limbs were designed to be legs, not arms. But at the Robotic Systems Lab at ETH Zurich, they’ve managed to teach this robot to use its limbs to open doors, and even to grasp a package off of a table and toss it into a box.

When it makes a mistake in the real world, the robot has already learned the skills to recover.

The ETHZ researchers got the robot to reliably perform these complex behaviors using a kind of reinforcement learning called “curiosity-driven” learning. In simulation, the robot is given a goal that it needs to achieve—in this case, the robot is rewarded for achieving the goal of passing through a doorway, or for getting a package into a box. These are very high-level goals (also called “sparse rewards”), and the robot doesn’t get any encouragement along the way. Instead, it has to figure out how to complete the entire task from scratch.

The next step is to endow the robot with a sense of contact-based surprise.

Given an impractical amount of simulation time, the robot would likely figure out how to do these tasks on its own. But to give it a useful starting point, the researchers introduced the concept of curiosity, which encourages the robot to play with goal-related objects. “In the context of this work, ‘curiosity’ refers to a natural desire or motivation for our robot to explore and learn about its environment,” says author Marko Bjelonic, “allowing it to discover solutions for tasks without needing engineers to explicitly specify what to do.” For the door-opening task, the robot is instructed to be curious about the position of the door handle, while for the package-grasping task, the robot is told to be curious about the motion and location of the package. Leveraging this curiosity to find ways of playing around and changing those parameters helps the robot achieve its goals, without the researchers having to provide any other kind of input.
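In reward terms, the setup described above amounts to a sparse task reward plus a curiosity bonus paid whenever the goal-related object’s state changes. The snippet below is my own simplified illustration of that structure, with made-up weights and state features; it is not the ETH Zurich implementation:

```python
import numpy as np

def total_reward(obj_state, prev_obj_state, task_done,
                 curiosity_weight=0.1, task_reward=10.0):
    """Sparse task reward plus a curiosity bonus for perturbing the object."""
    # Sparse reward: paid only when the high-level goal is achieved
    # (door passed through, package in the box).
    r_task = task_reward if task_done else 0.0
    # Curiosity bonus: rewards any change in the tracked object state
    # (e.g. door-handle pose), encouraging the agent to "play with" it.
    r_curiosity = curiosity_weight * np.linalg.norm(obj_state - prev_obj_state)
    return r_task + r_curiosity

# Example: the door handle moved slightly, but the door isn't open yet,
# so the agent collects only a small curiosity bonus.
prev = np.array([0.0, 0.0, 1.0])   # hypothetical handle pose features
curr = np.array([0.0, 0.1, 1.0])
print(total_reward(curr, prev, task_done=False))  # small positive value
```

The key point is that the curiosity term supplies a learning signal long before the sparse reward is ever triggered, which is what lets the agent discover diverse strategies rather than being shaped toward one hand-engineered trajectory.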

The behaviors that the robot comes up with through this process are reliable, and they’re also diverse, which is one of the benefits of using sparse rewards. “The learning process is sensitive to small changes in the training environment,” explains Bjelonic. “This sensitivity allows the agent to explore various solutions and trajectories, potentially leading to more innovative task completion in complex, dynamic scenarios.” For example, with the door opening task, the robot discovered how to open it with either one of its end-effectors, or both at the same time, which makes it better at actually completing the task in the real world. The package manipulation is even more interesting, because the robot sometimes dropped the package in training, but it autonomously learned how to pick it up again. So, when it makes a mistake in the real world, the robot has already learned the skills to recover.

There’s still a bit of research-y cheating going on here, since the robot is relying on the visual code-based AprilTags system to tell it where relevant things (like door handles) are in the real world. But that’s a fairly minor shortcut, since direct detection of things like doors and packages is a fairly well understood problem. Bjelonic says that the next step is to endow the robot with a sense of contact-based surprise, in order to encourage exploration, which is a little bit gentler than what we see here.

Remember, too, that while this is definitely a research paper, Swiss-Mile is a company that wants to get this robot out into the world doing useful stuff. So, unlike most pure research that we cover, there’s a slightly better chance here for this ANYmal to wheel-hand-leg-arm its way into some practical application.


This is such an excellent use for autonomous robots: difficult, precise work that benefits from having access to lots of data. Push a button, stand back, and let the robot completely reshape your landscape.

[ Gravis Robotics ]

Universal Robots introduced the UR30 at IREX in Tokyo. It can lift 30 kilograms, not the 63.5 kg that it says on the tire; that’s the weight of the UR30 itself.

Available for preorder now.

[ Universal Robots ]

IREX is taking place in Japan right now, and here’s a demo of Kaleido, a humanoid robot from Kawasaki.

[ Kawasaki ] via [ YouTube ]

The Unitree H1 is a full-size humanoid for under US $90,000 (!).

[ Unitree ]

This is extremely impressive but freaks me out a little to watch, and I’m not entirely sure why.


If you look in the background of this video, there’s a person wearing an exoskeleton controlling the robot in the foreground. This is an ideal system for imitation learning, and the robot is then able to perform a similar task autonomously.

[ GitHub ]

Thanks, Kento!

The video shows highlights from the RoboCup 2023 Humanoid AdultSize competition in Bordeaux, France. The winning team, NimbRo, is based in the Autonomous Intelligent Systems lab at the University of Bonn, Germany.

[ NimbRo ]

This video describes an approach to generating complex, multicontact motion trajectories using user guidance provided through virtual reality. User input is useful for reducing the search space through defined key frames. We show these results on the humanoid robot Valkyrie, from NASA Johnson Space Center, in both simulation and on hardware.

[ Paper ] via [ IHMC ]

For the foreseeable future, this is likely going to be necessary for most robots doing semi-structured tasks like trailer unloading: human-in-the-loop (or on-the-loop) supervision.

Of course, one human can supervise many robots at once, so as long as most of the robots are autonomous most of the time, it’s all good.

[ Contoro ]

The Danish medical technology start-up ROPCA ApS has launched its first medical product, the arthritis robot “ARTHUR”, which is already being used in the first hospitals. It is based on the lightweight robot LBR Med and supports the early diagnosis of rheumatoid arthritis using robot-assisted ultrasound. This ultrasound robot enables autonomous examination and can thus counteract the shortage of specialists in medicine. This enables earlier treatment, which is essential for a good therapeutic outcome.


Since 2020, KIMLAB has dedicated efforts to craft an affordable humanoid robot tailored for educational needs, boasting vital features like an ROS-enabled processor and multimodal sensory capabilities. By incorporating a commercially available product, we seamlessly integrated an SBC (Orange PI Lite 2), a camera, and an IMU to create a cost-effective humanoid robot, priced at less than $700 in total.


As the newest product launched by WEILAN, the sixth-generation AlphaDog, named BabyAlpha, is defined as a new family member of the artificial intelligence era. Designed for domestic scenarios, it was born for the purpose of providing joyful companionship. Not only does it possess autonomous emotions and a distinct personality, but it also excels in various skills such as singing and dancing, FaceTime calling, English communication, and sports.

[ Weilan ] via [ ModernExpress ]


Do you find yourself wondering why the world needs bipedal humanoid robots? Allow IHMC and Boardwalk Robotics to answer that question with this video.

[ IHMC ]

Thanks, Robert!

As NASA’s Ingenuity Helicopter made its 59th flight on Mars–achieving its second highest altitude while taking pictures of this flight–the Perseverance Mars rover was watching. See two perspectives of this 142-second flight that reached an altitude of 20 meters (66 feet). This flight took place on 16 Sept. 2023. In this side-by-side video, you’ll see the perspective from Perseverance on the left, which was captured by the rover’s Mastcam-Z imager from about 55 m (180 ft.) away. On the right, you’ll see the perspective from Ingenuity, which was taken by its downward-pointing Navigation Camera (Navcam). During Flight 59, Ingenuity hovered at different altitudes to check Martian wind patterns. The highest altitude achieved in this flight was 20 m. At the time, that was a record for the helicopter.

[ JPL ]

Cassie Blue showcases its ability to navigate a moving walkway, a common yet challenging scenario in human environments. Cassie Blue can walk on to and off of a 1.2 meter-per-second moving treadmill and reject disturbances caused by a tugging gantry and sub-optimal approach angle caused by operator error. The key to Cassie Blue’s success is a new controller featuring a novel combination of virtual constraint-based control and a model predictive controller applied on the often-neglected ankle motor. This technology paves the way for robots to adapt and function in dynamic, real-world settings.

[ Paper ] via [ Michigan Robotics ]

Thanks, Wami!

In this study, we propose a parallel wire-driven leg structure, which has one DoF of linear motion and two DoFs of rotation and is controlled by six wires, as a structure that can achieve both continuous jumping and high jumping. The proposed structure can simultaneously achieve high controllability on each DoF, long acceleration distance and high power required for jumping. In order to verify the jumping performance of the parallel wire-driven leg structure, we have developed a parallel wire-driven monopedal robot, RAMIEL. RAMIEL is equipped with quasi-direct drive, high power wire winding mechanisms and a lightweight leg, and can achieve a maximum jumping height of 1.6 m and a maximum of seven continuous jumps.


Thanks, Temma!

PAL Robotics’ Kangaroo demonstrates classic “zero-moment point” or ZMP walking, with only one or two engineers tagging along, and neither of them looks all that nervous.

Eventually, PAL Robotics says that the robot will be able to “perform agile maneuvers like running, jumping, and withstanding impacts.”

[ PAL Robotics ]

Thanks, Lorna!

SLOT is a small soft-bodied crawling robot with electromagnetic legs and passive body adaptation. The robot, driven by neural central pattern generator (CPG)-based control, can successfully crawl on a variety of metal terrains, including a flat surface, step, slope, confined space, and an inner (concave surface) and outer (convex surface) pipe in both horizontal and vertical directions. It can be also steered to navigate through a cluttered environment with obstacles. This small soft robot has the potential to be employed as a robotic system for inner and outer pipe inspection and confined space exploration in the oil and gas industry.


Thanks, Poramate!

It isn’t easy for a robot to find its way out of a maze. Picture these machines trying to traverse a kid’s playroom to reach the kitchen, with miscellaneous toys scattered across the floor and furniture blocking some potential paths. This messy labyrinth requires the robot to calculate the most optimal journey to its destination, without crashing into any obstacles. What is the bot to do? MIT CSAIL researchers’ “Graphs of Convex Sets (GCS) Trajectory Optimization” algorithm presents a scalable, collision-free motion planning system for these robotic navigational needs.


As the field of human-robot collaboration continues to grow and autonomous general-purpose service robots become more prevalent, robots need to obtain situational awareness and handle tasks with a limited field of view and workspace. Addressing these challenges, KIMLAB and Prof. Yong Jae Lee at the University of Wisconsin-Madison utilize the game of chess as a testbed, employing a general-purpose robotic arm.


Humanoid robots have the potential to become general-purpose robots augmenting the human workforce in industry. However, they must match the agility and versatility of humans. In this paper, we perform experimental investigations on the dynamic walking capabilities of a series-parallel hybrid humanoid named RH5. We demonstrate that it is possible to walk at speeds of up to 0.43 m/s with a position-controlled robot without full state feedback, which makes it one of the fastest walking humanoids of similar size and actuation modalities.

[ DFKI ]

Avocado drone. That is all.

[ Paper ]

Autonomous robots must navigate reliably in unknown environments even under compromised exteroceptive perception, or perception failures. Such failures often occur when harsh environments lead to degraded sensing, or when the perception algorithm misinterprets the scene due to limited generalization. In this paper, we model perception failures as invisible obstacles and pits, and train a reinforcement learning (RL) based local navigation policy to guide our legged robot.

[ Resilient Navigation ]

X20 Long Range Remote Hazard Detection Test. We operated the robot dog remotely from a straight-line distance of one kilometer, and it successfully tested the density of gases. The purpose of the test is to provide a solution for firefighters: using the robot to detect harmful gases before putting themselves in danger.

[ Deep Robotics ]

This CMU RI Seminar is by Robert Ambrose from Texas A&M, on “Robots at the Johnson Space Center and Future Plans.”

The seminar will review a series of robotic systems built at the Johnson Space Center over the last 20 years. These will include wearable robots (exoskeletons, powered gloves and jetpacks), manipulation systems (ISS cranes down to human scale) and lunar mobility systems (human surface mobility and robotic rovers). As all robotics presentations should, this will include some fun videos.

[ CMU RI ]

Most people probably think of robots as cold and calculating, but for Morgan Pope they can be a tool for generating emotions.

As a research scientist at Disney Research in Glendale, Calif., Pope designs robots for the entertainment giant’s theme parks. But working as an Imagineer, as Disney’s researchers are known, requires both in-depth knowledge of the latest technologies and an instinctive sense of “magic.”

Morgan Pope

Disney Research, Glendale, Calif.

Bachelor’s degree in engineering, Harvard; master’s and Ph.D. degrees in mechanical engineering, Stanford

“We have a very different mission compared to conventional roboticists,” he says. “We’re trying to use electromagnetism to create emotions.”

Robots have a long history at Disney. Since 1965, an animatronic of U.S. president Abraham Lincoln has been a fixture at Disneyland in Anaheim, Calif. But until recently, most of the robots on display have been firmly bolted to the floor, Pope says, and that has limited the stories they can tell.

Pope takes advantage of recent breakthroughs in robotics to create robots that can jump, flip, and tumble. He helped build the mechanical superhero Spider-Man, a stunt-double animatronic, or stuntronic, that makes death-defying leaps off buildings and over the heads of audiences at Disneyland’s Avengers Campus. Today Pope is busy designing a rollerblading cartoon character whose clumsiness is designed to tug at your heartstrings.

“We have all these amazing characters that do highly dynamic, engaging, fun things,” he says. “If we can bring these characters to life in ways that currently aren’t possible, that can give people powerful emotional experiences.”

A specialty in robot mobility

Growing up, Pope was a bookworm. He loved science fiction and popular science magazines and gravitated toward topics like astronomy and quantum mechanics. In college, he discovered his passion for building things. He enrolled in engineering at Harvard, and during the summer before his senior year, he secured an internship at the university’s Microrobotics Laboratory.

That experience stuck with him, and after graduating in 2011 Pope decided to pursue a master’s degree in mechanical engineering at Stanford. He earned his master’s in 2013 and then continued at Stanford, earning a Ph.D. in the same field in 2016. At the university’s Biomimetics and Dexterous Manipulation Laboratory, he specialized in robot mobility. He led the design of the Stanford Climbing and Aerial Maneuvering Platform (SCAMP), a small robot that could fly, land on walls, and then climb them.

He had nearly finished his Ph.D. when he met with a friend who had worked at Disney Research in Pittsburgh. When Pope heard about the Imagineers and what they do, it immediately struck him as a great way to apply his skills. Entertainment applications for robotics sounded like a lot of fun, he says, and it was also a relatively unexplored field and therefore ripe for new innovations. That same year, Pope secured a job as a postdoctoral research associate at Disney.

“If we can bring these characters to life in ways that currently aren’t possible, that can give people powerful emotional experiences.”

Three years later he became a full-time research scientist there, which took some adjustment. As an academic researcher, he spent a lot of time scrounging around for funding, Pope says, and when grants came through, the projects could take years to complete. “The output was also primarily intellectual—you had to prove the basic idea worked, write a research paper, and move on.”

Grant writing is less of a concern for a Disney Imagineer, Pope says, but there is more pressure to deliver results quickly. Also, the kinds of problems Imagineers must solve are different from those of most roboticists. The robots are deployed in amusement parks, often in close proximity to guests, so they are held to much higher safety standards than is usual for most robots. There’s also the pressure to ensure that the robots perform reliably and predictably for multiple shows a day. And, while conventional robotics is typically focused on completing a specific task, Pope says his goal is to bring characters to life. That means concentrating on the way the robots look, move, and behave as well as the specific actions they take.

“It’s not what it does, it’s how it does it,” he explains. “It has to do it in a way that makes you feel like this is a real character, a real, live being.”

Bringing Spider-Man to life

Lifelike action was crucial for the first project that Pope worked on at Disney. The goal was to create a robotic stunt double capable of performing complex aerial acrobatics for the Amazing Spider-Man show at Disneyland, which launched in 2021. The show features human performers, but one of the stunts involves Spider-Man backflipping 20 meters into the air, which is too dangerous for even the most skilled acrobat.

To convince the audience they were really watching Spider-Man, the researchers had to create a seamless transition between the acrobat and the robot, Pope says. His role was to work out the complex physics that would generate various somersaulting stunts while the robot was in midair. “It was super rewarding to play around with one of the greatest superhero characters of all time,” he says.

Morgan Pope shows off Disney’s new Indestructible robot, which can rollerblade, somersault, and perform other feats. Walt Disney Imagineering

A robot on rollerblades

Projects aren’t always so clear-cut, he admits, and they involve a lot of experimentation. In the early phases, small teams knock out quick and simple prototypes until they hit on something that works.

“You build something and then step back and think, ‘What about this is making me feel something, what about it is connecting with me?’” Pope says.

The project he’s currently working on involves a lot of this kind of exploration. For example, his team wanted to create robots that run, but the researchers quickly realized that the machines would fall down a lot. So they built a robot that could tolerate a tumble and get up again. In the end, they found that watching the robot pick itself up was what generated the most compelling emotional response.

“You relate to the robot struggling, because we’ve all been flat on our backs and had to get up,” he observes.

The team eventually scrapped the running concept and instead put its robot on a pair of Rollerblades. Many people know the awkwardness of trying to skate for the first time, and that makes the robot’s clumsiness all the more relatable. When the researchers debuted a prototype at this year’s South by Southwest in Austin, Texas, the audience’s warm reaction made it clear that they’d made an immediate emotional connection, Pope recalls.

A job for a generalist

But building robots for Disney is about more than just intuition and emotional intelligence. It also requires skills in electronics, mechanical design, and programming.

“You need to understand how different systems work, so if you need to dive into any of them, you can go deep and also pull them all together,” Pope says.

That’s why his team is always on the lookout for generalists. One of the two most important tips he gives to students, he says, is to familiarize themselves with as many disciplines as possible.

His other suggestion is to build something. It’s the best way to figure out the kind of engineering that excites you the most, he adds. And learning to create stuff just for the joy of it is the surest path to a great career.

“Try to build things that make you happy,” Pope says. “Chase the things that bring you joy. Chase the things that are delightful.”

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

Humanoids 2023: 12–14 December 2023, AUSTIN, TEXAS
Cybathlon Challenges: 2 February 2024, ZURICH, SWITZERLAND
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE

Enjoy today’s videos!

Fourier Intelligence has just announced the mass production of their GR-1 humanoid, and they’ve got at least a dozen of them.

[ Fourier Intelligence ]

Thanks, Ni Tao!

This collaborative work between researchers from the University of Southern Denmark and VISTEC introduces a biomorphic soft robotic skin for a hexapod robot platform, featuring a central pattern generator–based neural controller for generating respiratory-like motions on the skin. The design enables visuo-haptic nonverbal communication between humans and robots and improves the robot’s aesthetics by enhancing its biomorphic qualities.
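The paper’s controller isn’t detailed here, but the general idea of a central pattern generator can be illustrated with a minimal sketch: a free-running phase oscillator whose smoothed output could drive a periodic, breathing-like inflation of a soft skin. The function name, frequency, and waveform below are generic textbook assumptions, not the authors’ design.

```python
import math

def cpg_wave(n_steps, dt=0.01, freq=0.25, amp=1.0):
    """Minimal central pattern generator: a single phase oscillator
    producing a smooth, rhythmic 0..amp signal that could modulate a
    respiratory-like skin motion. Illustrative sketch only."""
    phase, out = 0.0, []
    for _ in range(n_steps):
        phase += 2 * math.pi * freq * dt           # constant-rate phase advance
        out.append(amp * (1 - math.cos(phase)) / 2)  # smooth "inhale/exhale" envelope
    return out

signal = cpg_wave(400)  # one 0.25 Hz breathing cycle sampled over 4 seconds
```

Real CPG controllers typically couple several such oscillators with feedback terms; a single oscillator is enough to show where the rhythm comes from.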

[ Paper ]

Thanks, Mads!

According to data from 2010, around 1.8 million people in the United States can’t eat on their own. Yet training a robot to feed people presents an array of challenges for researchers. A team led by researchers at the University of Washington created a set of 11 actions a robotic arm can make to pick up nearly any food that can be speared with a fork. In tests with this set of actions, the robot picked up the foods more than 80 percent of the time, which is the user-specified benchmark for in-home use. The small set of actions allows the system to learn to pick up new foods during one meal.

[ UW ]

Thanks, Stefan!

If you watch enough robot videos, you get to know when a robot is being pushed in a way that’s easy to recover from, and when it’s actually being challenged. The end of this video shows IHMC’s Nadia getting pushed sideways against its planted foot, which necessitates a crossover step recovery.

[ Paper ] via [ IHMC ]

Thanks, Robert!

Ayato Kanada, an assistant professor at Kyushu University, wants to build woodpecker-inspired Doc Ock tentacles. And when you’re a professor, you can just do that.

Also, woodpeckers are weird.

[ Ayato Kanada ]

Thanks, Ayato!

Explore Tevel’s joint robotic fruit-harvesting pilot program with Kubota in this video, filmed during the 2023 apple harvest season in the Mazzoni Group’s orchards in Ferrara, Italy. Watch as our autonomous fruit-picking systems operate with precision, skillfully harvesting various apples in the idyllic Italian orchards.

[ Tevel ]

Understanding what’s an obstacle and what’s only obstacle-ish has always been tricky for robots, but Spot is making some progress here.


We tried to play Street Fighter 6 by teleoperating Reachy! Well, it didn’t go as planned, as Antoine won. But it was a pretty epic fight!

[ Pollen Robotics ]

The key assets of a data center are the servers. While most of them are active in the server room, idle and new assets are stored in the IT warehouse. Focusing mainly on this IT warehouse, SeRo automates the inbound and outbound management of the data center’s assets.

[ Naver Labs ]

Humans can be so mean.

[ Flexiv ]

Interesting HRI with the flashing light on Spot here.

[ Boston Dynamics ]

Flying in circles with a big tank of gas really seems like a better job for a robot pilot than for a human one.

[ Boeing ]

On 2 November 2023, at an event hosted by the Swiss Association of Aeronautical Sciences at ETH, Professor Davide Scaramuzza presented a comprehensive overview of our latest advancements in autonomous drone technology aimed at achieving human-level performance.


A skeletal robotic hand with working ligaments and tendons can now be 3D-printed in one run. The creepy accomplishment was made possible by a new approach to additive manufacturing that can print both rigid and elastic materials at the same time in high resolution.

The new work is the result of a collaboration between researchers at ETH Zurich in Switzerland and a Massachusetts Institute of Technology spin-out called Inkbit, based in Medford, Mass. The group has devised a new 3D inkjet-printing technique capable of using a wider range of materials than previous devices.

In a new paper in Nature, the group has shown for the first time that the technology can be used to print complex moving devices made of multiple materials in a single print job. These include a bio-inspired robotic hand, a six-legged robot with a grabber, and a pump modeled on the heart.

“What was really exciting for us is that this technology, for the first time, allowed us to print complete functional systems that work right off the print bed,” says Thomas Buchner, a Ph.D. student at ETH Zurich and first author of the paper describing the work.

The new technique operates on principles similar to those of the kind of inkjet printer you might find in an office. Instead of colored inks, though, the printer sprays out resins that harden when exposed to ultraviolet (UV) light, and rather than just printing a single sheet, it builds up 3D objects layer by layer. It’s also capable of printing at extremely high resolution, with voxels—the 3D equivalent of pixels—just a few micrometers across.

3D Printed Robot Hand Has Working Tendons

3D inkjet printers aren’t new, but the palette of materials they can use has typically been limited. That’s because each layer inevitably has imperfections, and the standard approach to dealing with this has been to scrape them off or roll them flat. This means that soft or slow-curing materials cannot be used as they will get smeared or squashed.

Inkbit has been working on a workaround to this problem for a number of years. The company has built a printer featuring a platform that moves up and down beneath multiple inkjet units, a UV-curing unit, and a scanning unit. After a layer has been deposited and cured, the scanner creates a depth map of the print surface, which is then compared against the 3D model to work out how to adjust the rate of deposition from the inkjet units to even out any irregularities. Areas that received too much resin on the previous layer receive less on the next, and vice versa.
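In outline, that feedback step is a per-voxel proportional correction: deposit less resin where the scan shows the surface is already too high, and more where it’s low. The sketch below is only a guess at the general scheme, not Inkbit’s actual algorithm; the function name, gain, and units are invented for illustration.

```python
def corrected_dose(nominal, target_height, measured_height, gain=0.8):
    """Closed-loop layer correction: areas that got too much resin on the
    previous layer get less on the next, and vice versa. Illustrative
    proportional scheme with hypothetical units (micrometers)."""
    error = measured_height - target_height      # positive: over-deposited
    dose = nominal - gain * error                # proportional correction
    return min(max(dose, 0.0), 2.0 * nominal)    # a jet can't print negative resin

# toy depth map: one voxel printed 2 µm too high gets a reduced next dose
target = [0.0, 0.0, 0.0]
measured = [0.0, 2.0, 0.0]
next_doses = [corrected_dose(5.0, t, m) for t, m in zip(target, measured)]
# next_doses == [5.0, 3.4, 5.0]
```

The key point is that the correction is contactless: the scanner, not a scraper or roller, closes the loop, which is what lets slow-curing, soft materials survive the process.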

This means the printer doesn’t require any contact with the materials once they’ve been deposited, says Robert Katzschmann, a robotics professor at ETH Zurich who led the research. “That leads to all kinds of benefits, because now you can use chemistries that take longer to polymerize, that take longer to harden out, and that opens up a whole new space of much more useful materials.”

“We can actually now create a structure or a robot in one shot. It might require maybe adding a motor here or there, but the actual complexity of the structure is all there.”
—Robert Katzschmann, ETH Zurich

Previously, Inkbit had been using a scanning approach that could capture images of areas only 2 centimeters across at a time. This process had to be repeated multiple times before all the images were stitched together and analyzed, which significantly slowed down fabrication times. The new technique uses a much faster laser scanning system—the device can now print 660 times as fast as before. In addition, the team has now demonstrated that they can print with elastic polymers called thiol-enes. These materials cure slowly, but they’re much springier and more durable than acrylates, the rubberlike materials that are normally used in commercial 3D inkjet printers.

To demonstrate the potential of the new 3D printing process, the researchers printed a robotic hand. The device features rigid bones modeled on MRI scans of human hands and elastic tendons that can be connected to servos to curl the fingers in toward the palm. Each fingertip also features a thin membrane with a small cavity behind, which is connected to a long tube printed into the structure of the finger. When the finger touches something, the cavity is compressed, causing the pressure inside the tube to rise. This is picked up by a pressure sensor at the end of the tube, and this signal is used to tell the fingers to stop curling once a certain pressure has been reached.
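The stop-at-threshold behavior described above amounts to a simple control loop: curl the finger in small increments until the tube pressure crosses a set point. This sketch illustrates that idea with invented function names, units, and threshold values; it is not the actual control code from the paper.

```python
def curl_finger(read_pressure, set_servo, stop_pressure=1.2, step=2.0, max_angle=90.0):
    """Curl a finger until the fingertip pressure sensor reports contact,
    then stop, mirroring the behavior described above. Names, units
    (bar, degrees), and threshold are illustrative assumptions."""
    angle = 0.0
    while angle < max_angle and read_pressure() < stop_pressure:
        angle += step          # command a small additional curl
        set_servo(angle)       # the servo pulls the printed tendon
    return angle               # angle at which contact stopped the curl

# quick simulation: an object is "touched" once the finger curls past 40 degrees
state = {"angle": 0.0}
def fake_servo(a): state["angle"] = a
def fake_pressure(): return 1.5 if state["angle"] >= 40 else 1.0
contact_angle = curl_finger(fake_pressure, fake_servo)   # stops at 40.0
```

Because the pressure sensing is built into the printed structure, the only external parts this loop assumes are the servo and the off-board pressure sensor, matching the article’s description.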

The researchers used the hand to grip a variety of objects, including a pen and a water bottle, and to touch its thumb to each of its fingertips. Critically, all of the functional parts of the robotic hand, apart from the servos and the pressure sensors, were produced in a single printing job. “What we see as novel about our work is that we can actually now create a structure or a robot in one shot,” says Katzschmann. “It might require maybe adding a motor here or there, but the actual complexity of the structure is all there.”

The researchers also created a pneumatically powered six-legged robot with a gripper that was able to walk back and forth and pick up a box of Tic-Tacs, and a pump modeled on the human heart, featuring one-way valves and internal pressure sensors, that was capable of pumping 2.3 liters of fluid a minute.

Future work will look to further expand the number of materials that the printer can use, says Katzschmann. They are restricted to materials that can be cured using UV light and that aren’t too viscous to work in an inkjet printer. But these could include things like hard epoxies, hydrogels suitable for tissue engineering, or even conductive polymers that could make it possible to print electronic circuits into devices.

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IEEE SSRR 2023: 13–15 November 2023, FUKUSHIMA, JAPAN
Humanoids 2023: 12–14 December 2023, AUSTIN, TEXAS
Cybathlon Challenges: 2 February 2024, ZURICH, SWITZERLAND
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE

Enjoy today’s videos!

Unitree B2: beyond the limit. Maximum speed of 6 m/s, sustained load of 40 kg, and sustained walking endurance of 5 h. The comprehensive performance is two to three times that of existing quadruped robots worldwide! Adaptable to all terrains, large load, long-lasting endurance, and super athletic performance! Evolve, evolve, and evolve again!

[ Unitree ]

This shape-changing robot just got a lot smaller. In a new study, engineers at the University of Colorado Boulder debuted mCLARI, a 2-centimeter-long modular robot that can passively change its shape to squeeze through narrow gaps in multiple directions. It weighs less than a gram but can support over three times its body weight as an additional payload.

[ CU Boulder ]

Researchers at CMU used fossil evidence to engineer a soft robotic replica of pleurocystitids, marine organisms that existed nearly 450 million years ago and are believed to be among the first echinoderms capable of movement using a muscular stem.

[ CMU ]

Stretch has moved over a million customer boxes in under a year, improving predictability and preventing injuries. But how did we get there? Discover how we put our expertise in robotics research to use designing, testing, and deploying a warehouse robot. Starting from the technological building blocks of Atlas, Stretch has the mobility, power, and intelligence to automate the industry’s toughest challenges.

[ Boston Dynamics ]

What do the robots do on Halloween after everyone leaves? Join the Ingenuity Labs robots on their trick-or-treating adventure. Happy Halloween!

[ Queen’s University ]

Thanks, Josh!

FreeLander is a versatile, modular legged-robot hardware platform with adaptive bio-inspired neural control. The robot platform can be used to construct different bio-inspired legged robots. Each module of the platform consists of two legs designed to function as a two-legged robot, which is able to walk on a metal pipe using electromagnetic feet. Multiple modules can be combined to obtain six-legged and eight-legged robots to walk on difficult terrains, such as rough terrain, slopes, random stepfield, gravel, grass, and even in-pipe.


Thanks, Poramate!

Energy Robotics hopes you had a Happy Halloween!

[ Energy Robotics ]

This work presents a camera model for refractive media such as water and its application in underwater visual-inertial odometry. The model is self-calibrating in real-time and is free of known correspondences or calibration targets.

[ ARL ]

Humans naturally exploit haptic feedback during contact-rich tasks like loading a dishwasher or stocking a bookshelf. Current robotic systems focus on avoiding unexpected contact, often relying on strategically placed environment sensors. In this paper we train a contact-exploiting manipulation policy in simulation for the contact-rich household task of loading plates into a slotted holder, which transfers without any fine-tuning to the real robot.

[ Paper ]

Thanks, Samarth!

Presented herewith is another PAPRAS (Plug-And-Play Robotic Arm System) add-on system engineered to augment the functionalities of the quadrupedal robot, Boston Dynamics Spot. The system adeptly integrates two PAPRAS units onto the Spot, drawing inspiration from the mythological creature Orthrus—a two-headed dog in Greek mythology.


Marwa Eldiwiny is a Ph.D. student and Early Stage Researcher (ESR) at the Vrije Universiteit Brussel whose current research focus is on modelling and simulating self-healing soft materials for industrial applications. Her Master’s thesis was “UAV anti-stealth technology for safe operation.” She worked as a research engineer at Inria Lille, a research scholar at the Tartu Institute of Technology, and a lecturer with the Mechatronics and Industrial Robotics Programme at Minia University, Egypt. Eldiwiny hosts the IEEE RAS Soft Robotics Podcast, where researchers from both academia and industry discuss recent developments in the soft robotics research field.


3 labs. Different robotic solutions of the future. Meet CSAIL’s machine friends.


This UPenn GRASP SFI Seminar is by E Farrell Helbling at Cornell, on Autonomy for Insect Scale Robots.

Countless science fiction works have set our expectations for small, mobile, autonomous robots for use in a broad range of applications. The ability to move through highly dynamic and complex environments can expand capabilities in search and rescue operations and safety inspection tasks. These robots can also form a diverse collective to provide more flexibility than a multifunctional robot. I will present my work on the analysis of control and power requirements for this vehicle, as well as results on the integration of onboard sensors. I also will discuss recent results that culminate nearly two decades of effort to create a power autonomous insect-scale vehicle. Lastly, I will outline how this design strategy can be readily applied to other micro and bioinspired autonomous robots.

[ UPenn ]

Although robots are already in warehouses, shuffling small items between bins for shipping or storage, they have yet to take over the job of lugging big, heavy things. And that’s just where they could be of the most use, because lugging is hard for people to do.

Several companies are working on the problem, and there’s likely to be plenty of room for all of them, because the opportunity is enormous. There are a lot of trailers out there that need to be unloaded. Arguably the most interesting approach comes from Dextrous Robotics, which has a robot that moves boxes around with a giant pair of chopsticks.

We first wrote about Dextrous Robotics in 2021, when they were working on a proof of concept using Franka Panda robotic arms. Since then, the concept has been proved successfully, and Dextrous has scaled up to a much larger robot that can handle hundreds of heavy boxes per hour with its chopstick manipulators.

“The chopstick type of approach is very robust,” Dextrous CEO Evan Drumwright tells us. “We can carry heavy payloads and small items with very precise manipulation. Independently posable chopsticks permit grasping a nearly limitless variety of objects with a straightforward mechanical design. It’s a real simplification of the grasping problem.”

The video above shows the robot moving about 150 boxes per hour in a scenario that simulates unloading a packed trailer, but the system is capable of operating much faster. The demonstration was done without any path optimization. In an uncluttered environment, Dextrous has been able to operate the system at 900 boxes per hour, about twice as fast as the 300 to 500 boxes per hour that a person can handle.

Of course, the heavier the box, the harder it is for a person to maintain that pace. And once a box gets heavier than about 20 kilograms, it takes two people to move it. At that point, labor becomes far less efficient. On paper, the hardware of Dextrous’s robot is capable of handling 40 kg boxes at an acceleration of up to 3 gs, and up to 65 kg at a lower acceleration. That would equate to 2,000 boxes per hour. True, this is just a theoretical maximum, but it’s what Dextrous is working toward.

If the only problem was to move heavy boxes quickly, robots would have solved it long ago. However, before you can move the box you first have to pick it up, and that complicates matters. Other robotics companies use suction to pick things up. Dextrous alone favors giant chopsticks.

Suction does have the advantage of being somewhat easier to handle on the perception and planning side: Find a flat surface, stick to it, and there you go. That approach assumes you can find a flat surface, but the well-ordered stacks of boxes seen in most demo videos aren’t necessarily what you’ll get in a warehouse. Suction has other problems: It typically has a payload limit of 20 kg or so, it doesn’t work very well with odd-size boxes, and it has trouble operating in temperatures below 10 °C. Suction systems also pull in a lot of dirt, which can cause mechanical problems.

A suction system typically attaches to just one surface, and that limits how fast it can move without losing its grip or tearing open a box. The Dextrous chopsticks can support a box on two sides. But making full use of this capability adds difficulty to the perception and planning side.

“Just getting to this point has been hardcore,” Drumwright says. “We’ve had to get to a level of precision in the perception system and the manipulation to be able to understand what we’re picking with high confidence. Our initial engineering hurdle has been very, very high.”

Manipulating rigid objects with rigid manipulators like chopsticks has taken Dextrous several years to perfect. “Figuring out how to get a robot to perceive and understand its environment, figure out the best item to pick, and then manipulating that item and doing all that in a reasonable length of time—that is really, really hard,” Drumwright tells us. “I’m not going to say we’ve solved that 100 percent, but it’s working very well. We still have plenty of stuff left to do, but the proof of concept of actually getting a robot that does contact-based manipulation to pick variably sized objects out of an unconstrained environment in a reasonable time period? We’ve solved that.”

Here’s another video showing a sustained box-handling sequence; if you watch carefully, you’ll notice all kinds of precise little motions as the robot uses its manipulators to slightly reposition boxes to give it the best grasp:

All of those motions make the robot look almost like it’s being teleoperated, but Drumwright assures me that it’s completely autonomous. It turns out that teleoperation doesn’t work very well in this context. “We looked at doing teleop, and we actually could not do it. We found that our controllers are so precise that we could not actually make the system behave better through teleop than it did autonomously.” As to how the robot decides to do what it does, “I can’t tell you exactly where these behaviors came from,” Drumwright says. “Let’s just call it AI. But these are all autonomous manipulation behaviors, and the robot is able to utilize this diverse set of skills to figure out how to pick every single box.”

You may have noticed that the boxes in the videos are pretty beat up. That’s because the robot has been practicing with those boxes for months, but Dextrous is mindful of the fact that care is necessary, says Drumwright. “One of the things that we were worried about from the very beginning was, how do we do this in a gentle way? But our newest version of the robot has the sensitivity to be very gentle with the boxes.”

I asked Drumwright what would be the most difficult object for his robot to pick up. I suggested a bowling ball (heavy, slippery, spherical). “Challenging, but by no means impossible,” was his response, citing research from Siddhartha Srinivasa at the University of Washington showing that a robot with chopsticks can learn to do dynamic fine manipulation of spherical objects. Dextrous isn’t above cheating slightly, though, by adding a thin coating of hard rubber to the chopsticks’ end effectors to add just a tiny bit of compliance—not enough to mess with planning or control, but enough to make grasping some tricky objects a little easier.

By a year ago, Dextrous had shown that it could move boxes at high speeds under limited scenarios. For the past year, it has been making sure that the system can handle the full range of scenarios that it’s likely to encounter in warehouses. Up next is combining those two things—cranking the speed back up while still working reliably and autonomously.

“On the manipulation side, the system is fully autonomous,” Drumwright says. “We currently have humans involved in driving the robot into the container and then joysticking it forward once it’s picked all that it can reach, but we’re making that fully autonomous, too.” And the robot has so far been quite reliable, requiring little more than lubrication.

According to Drumwright, the biggest challenge on the business side at this point is simply manufacturing enough robots, since the company builds the hardware in-house. The remaining question is how long it will take to make the transition from experiment to product. The company is starting a few commercial pilots, and Drumwright says the thing that’s slowing them down the most is building enough robots to keep up with demand.

“We’ve solved all of the hardest technical problems,” he says. “And now, it’s the business part.”

When IEEE Spectrum editors are putting together an issue of the magazine, a story on the website, or an episode of a podcast, we try to facilitate dialogue about technologies, their development, and their implications for society and the planet. We feature expert voices to articulate technical challenges and describe the engineering solutions they’ve devised to meet them.

So when Senior Editor Evan Ackerman cooked up a concept for a robotics podcast, he leaned hard into that idea. Ackerman, the world’s premier robotics journalist, talks with roboticists every day, and recording those conversations to turn those interviews into a podcast is usually a relatively straightforward process. But Ackerman wanted to try something a little bit different: bringing two roboticists together and just getting out of the way.

“The way the Chatbot podcast works is that we invite a couple of robotics experts to talk with each other about a topic they have in common,” Ackerman explains. “They come up with the questions, not us, which results in the kinds of robotics conversations you won’t hear anywhere else—uniquely informative but also surprising and fun.”

Each episode focuses on a general topic the roboticists have in common, but once they get to chatting, the guests are free to ask each other about whatever interests them. Ackerman is there to make sure they don’t wander too far into the weeds, because we want everyone to be able to enjoy these conversations. “But otherwise, I’ll mostly just be listening,” Ackerman says, “because I’ll be as excited as you are to see how each episode unfolds.”

We think this unique format gives the listener the inside scoop on aspects of robotics that only the roboticists themselves could get each other to reveal. Our first few episodes are already live. They include Skydio CEO Adam Bry and the University of Zurich professor Davide Scaramuzza talking about autonomous drones, Labrador Systems CEO Mike Dooley and iRobot chief technology officer Chris Jones on the challenges domestic robots face in unpredictable dwellings, and choreographer Monica Thomas and Amy LaViers of the Robotics, Automation, and Dance (RAD) Lab discussing how to make Boston Dynamics’ robot dance.

We have plenty more Chatbot episodes in the works, so please subscribe on whatever podcast service you like, listen and read the transcript on our website, or watch the video versions on the Spectrum YouTube channel. While you’re at it, subscribe to our other biweekly podcast, Fixing the Future, where we talk with experts and Spectrum editors about sustainable solutions to climate change and other topics of interest. And we’d love to hear what you think about our podcasts: what you like, what you don’t like, and especially who you’d like to hear on future episodes.

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IEEE SSRR 2023: 13–15 November 2023, FUKUSHIMA, JAPAN
Humanoids 2023: 12–14 December 2023, AUSTIN, TEXAS
Cybathlon Challenges: 2 February 2024, ZURICH, SWITZERLAND
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE

Enjoy today’s videos!

An overview of ongoing work by Hello Robot, UIUC, UW, and Robots for Humanity to empower Henry Evans’ independence through the use of the mobile manipulator Stretch.

And of course, you can read more about this project in this month’s issue of Spectrum magazine.

[ Hello Robot ]

At KIMLAB, we have a unique way of carving Halloween pumpkins! Our MOMO (Mobile Object Manipulation Operator) is equipped with PAPRAS arms featuring prosthetic hands, allowing it to use human tools.


This new haptic system from CMU seems actually amazing, although watching the haptic arrays pulse is wigging me out a little bit for some reason.

[ Fluid Reality Group ]

We are excited to introduce you to the Dingo 1.5, the next generation of our popular Dingo platform! With enhanced hardware and software updates, the Dingo 1.5 is ready to tackle even more challenging tasks with ease.

[ Clearpath ]

A little bit of a jump scare here from ANYbotics.

[ ANYbotics ]

Happy haunting from Boston Dynamics!

[ Boston Dynamics ]

I’m guessing this is some sort of testing setup but it’s low-key terrifying.

[ Flexiv ]

KUKA has teamed up with Augsburger Puppenkiste to build a mobile show cell in which two robots do the work of the puppeteers.

[ KUKA ]

In this video, we showcase the Advanced Grasping premium software package’s capabilities. We demonstrate how TIAGo collects objects and places them, how the gripper adapts to different shapes, and the TIAGo robot’s perception and manipulation capabilities.

[ PAL Robotics ]

HEBI Robotics produces a platform for robot development. Our long-term vision is to make it easy and practical for any worker, technician, farmer, etc. to create robots as needed. Today the platform is used by researchers around the world, and HEBI is using it to solve challenging automation tasks related to inspections and maintenance.

[ HEBI Robotics ]

Folded robots are a rapidly growing field that is revolutionizing how we think about robotics. Taking inspiration from the ancient art of origami results in thinner, lighter, more flexible autonomous robots.

[ NSF ]

“Can I have a pet T-Rex?” is a short interdisciplinary portrait documentary featuring paleontologist and Kod*lab postdoc Aja Mia Carter and Kod*lab robotics researchers: postdoc Wei-Hsi Chen and Ph.D. student J. Diego Caporale. Dr. Chen applies the art of origami to make a hopping robot, while Mr. Caporale adds a degree of freedom to the spine of a quadruped robot to interrogate ideas about twisting and locomotion. An expert in the evolution of tetrapod spines from 380 million years ago, Dr. Carter is still motivated by her childhood dream of a pet T-Rex, but how can these robotics researchers get her closer to her vision?

[ Kodlab ]

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IEEE SSRR 2023: 13–15 November 2023, FUKUSHIMA, JAPAN
Humanoids 2023: 12–14 December 2023, AUSTIN, TEXAS
Cybathlon Challenges: 2 February 2024, ZURICH

Enjoy today’s videos!

The process of getting Spot to talk with a personality is very cool, but this is also something that should be done very carefully: Spot is a tool, and although it may sound like it thinks and feels, it absolutely doesn’t. Just something to keep in mind as more Spots (and other robots) make it out into the wild.

[ Boston Dynamics ]

Shhh. Be vewy, vewy quiet.

[ Paper ]

This video presents the remarkable capabilities of the TALOS robot as it demonstrates agile and robust walking using Model Predictive Control (MPC) references sent to a Whole-Body Inverse Dynamics (WBID) controller developed in collaboration with Dynamograde.

[ PAL Robotics ]

Dr. Hooman Samani from the Creative Robotics Lab at the University of the Arts London writes, “The idea is to show how robots can be beyond traditional use and involve more people in robotics such as artists as we do at our university. So we made this video to show how a co-bot can be used as a DJ and people and robots dance together to the robot DJ in a robot dance party!”

[ London CCI ]

Future robots should perform multiple and varied tasks, instead of simple pick-and-place operations. In this video, Dino Robotics demonstrates the functionalities of their software solution: it cooks a steak! Bon appétit!

[ Dino Robotics ]

This video presents a novel perching and tilting aerial robot for precise and versatile power-tool work on vertical walls. The system was developed as part of the AITHON ETH Zürich Bachelor student focus project and presented at IEEE IROS 2023. It combines a compact, integrated perching-drone design with the ability to handle a concrete drill’s heavy payload and reaction forces.

[ Paper ]

This is what very high precision, very useful robotics looks like.

[ Dusty ]

I never thought I’d write this sentence, but here is some video of a failing robotic mudskipper sex doll.

[ Nature ]

Good aim on this drone considering that its landing pad is speeding along at 20 knots.

[ AeroVironment ]

From the people responsible for the giant Gundam in Japan comes this very big and very slow rideable quadruped thing.

[ Robotstart ]

RoboCup 2024 will be in Eindhoven in July!

[ RoboCup ]

A brief look into the 2023 IEEE RAS Summer School on Multi-Robot Systems, which took place in July 2023 in Prague.

[ CTU ]

Lava caves on Mars and particularly on the moon are not only interesting for exo-geologists and other space scientists, but they also could be used as storage rooms or even habitats for future human settlements. The question is how to access and explore these huge cavities under the lunar surface without risking the lives of astronauts. This is where robots, or rather teams of robots, come into play.

[ DFKI ]

The rise of recent Foundation models (and applications, e.g., ChatGPT) offers an exciting glimpse into the capabilities of large deep networks trained on Internet-scale data. In this talk, I will briefly discuss some of the lessons we’ve learned while scaling real robot data collection, how we’ve been thinking about Foundation models, and how we might bootstrap off of them (and modularity) to make our robots useful sooner.

[ UPenn ]

Amundsen–Scott South Pole Station is a permanent scientific research base located at what is arguably the most isolated place on Earth. During the austral summer, the station is home to about 150 scientists and support staff, but during the austral winter, that number shrinks to just 40 or so, and those people are completely isolated from the rest of the world from mid-February until late October. For eight months, the station has to survive on its own, without deliveries of food, fuel, spare parts, or anything else. Only in the most serious of medical emergencies will a plane attempt to reach the station in the winter.

While the station’s humans rotate seasonally, there are in fact four full-time residents: the South Pole Roombas. First, there was Bert, a Roomba 652, who arrived at the station in 2018 and was for a time the loneliest robot in the world. Since the station has two floors, Bert was joined by Ernie, a Roomba 690, in 2019. A second pair of Roombas, Sam and Frodo, followed soon after.

These Roombas are at the South Pole to do what Roombas do: help keep the floors clean. But for the people who call the South Pole home for months on end, it turns out that these little robots have been able to provide some much-needed distraction in a place where things stay more or less the same all of the time, and where pets, plants, and even dirt are explicitly outlawed by the Antarctic Treaty in the name of ecological preservation.

For the last year, an anonymous IT engineer has been blogging about his experiences working first at McMurdo Station (on the Antarctic coast south of New Zealand), and later at Amundsen–Scott South Pole Station, where he’s currently spending the winter as part of the station’s support staff. His blog includes mundane yet fascinating accounts of what day-to-day life is like at the South Pole, including how showering works (four minutes per person per week), where the electricity comes from (generators fueled by a huge amount of aviation fuel hauled overland from the coast), and the fate of the last egg for five months (over medium with salt and pepper).

The engineer also devoted an entire post to signage at the South Pole, at the very end of which was this picture, which raised some questions for me:

Ernie, a Roomba living at the South Pole.

Ernie, it turns out, has had a dramatic and occasionally harrowing life at the South Pole station. After Ernie arrived in 2019 to clean one floor of the station, lore began to develop that Ernie and its partner Bert (tasked with cleaning the floor above) were “star-crossed lovers, forever separated by the impenetrable barrier of the staircase.” That quote comes from Amy Lowitz, a member of the South Pole Telescope team, who overwintered at the pole in 2016 and has spent many summers there. “I think I made that joke every year when a new group of people comes to the pole for the summer,” Lowitz tells IEEE Spectrum. “There’s only so many things to talk about, so eventually the Roombas come up in conversation.” Happily for Ernie, Lowitz says that it’s now on the same floor as Bert, with the new Roombas Sam and Frodo teaming up on the floor below.

But Ernie’s presumed joy at finally being united with Bert was not to last—in January of 2020, Ernie went missing. The Twitter account of the South Pole Telescope posted photos pleading for Ernie’s return, and a small memorial appeared at Ernie’s docking station.


Soon, things took a more sinister (amusingly sinister) turn. Kyle Ferguson is a South Pole Telescope team member who was at the station in the summer of 2020 when Ernie went missing, and has vivid memories of the drama that ensued:

I believe it started with just one poster that went up outside of the galley, with a picture of two people calling themselves the Cookie Monsters posing in balaclavas and standing on a staircase holding Ernie. It said something like, ‘if you ever want to see Ernie alive again, leave a tray of chocolate chip cookies in such and such location and we will return him safely.’ So that was the initial ransom.


As tends to happen in a community like this, things sort of took off from there—everybody ran with it in their own direction. So, on that wall outside of the galley, there evolved a narrative where people were trying to mount rescue missions, and there were sign up sheets for that. And there were people saying, ‘we won’t negotiate with you until you provide proof of life.’

Down the hallway, there was another narrative where people had assumed the worst: that the kidnappers had ended poor Ernie’s life prematurely. So the memorial that had sprung up for Ernie next to one of the water fountains grew. There were fake flowers and Tootsie rolls, and some people put some trash there, just in homage—trash that Ernie would never be able to sweep up. I even ended up writing a parody of the song ‘5,000 Candles in the Wind’ from Parks and Recreation for Ernie, and singing it at an open mic night.


But Ernie did come back. Those of us who believed that he had perished (I was one of those) were in the wrong. Someone claimed that the cookies had been delivered, and that the kidnappers should give Ernie back, and then there was a poster that went up that said Ernie was found abandoned underneath one of the staircases. He was rescued and revived by the Cookie Monsters. So, the kidnappers sort of got credit for saving him in the end.

Ferguson suspects that Ernie’s “IT WAS SO COLD” sticker was acquired after the robot’s brief trip outside with the kidnappers. Summer temperatures at the South Pole average around -28°C, substantially below the operating temperature of a Roomba, although when we spoke to Ferguson for this article during the South Pole winter, it was closer to -80°C outside the station, including wind chill.

The harsh weather and isolation may help explain why Ernie and his Roomba brethren get so much attention from the station residents. “There’s more to do at the South Pole than people think,” Amy Lowitz tells us, “but you’re still pretty much within a half mile radius of the main station, all of the time. So people get a little bored and a little stir crazy, and we look for new and strange ways to entertain ourselves. The ransom notes were just some goofy hijinks from some bored people at the South Pole.”

Lowitz also remembers a party where either Bert or Ernie was drafted as a DJ, with a Bluetooth speaker and some fancy lighting. “We had it running around up on a table so that people wouldn’t trip over it,” she recalls. And as recently as this winter, says Kyle Ferguson, a befurred Roomba could be seen on station: “Someone put up a silly ‘lost cat’ poster earlier in the winter, with a picture not even of a cat but of like a raccoon or something. And then someone else took that and decided to run with it, so they had this fake raccoon fur that they put to the top of one of the Roombas and sent it out to wander the hallways.”

Sam, the “station cat.” Kyle Ferguson

Covering a Roomba with fur may be getting the robot a little closer to what people at the South Pole are actually missing, suggests Lowitz: “my guess is that at least some Polies [i.e. South Pole residents] are into the Roombas because we’re not allowed to have pets at the South Pole, and when there are these little Roombas running around, it’s sort of close. People do odd things at that altitude [the pressure altitude at the south pole is nearly 3500 meters], and when they miss home… a Roomba is just like a cute little thing to personify and pay attention to.”

Ferguson agrees. “We all miss our pets down here. Sometimes we joke about trying to smuggle down a puppy or a kitten even though it’s a huge violation of the Antarctic Treaty. One of the things that I think gives the Roombas some of their charm is how they keep running into walls. If I was to ascribe a personality to them, it would be kind of dumb and aloof, which evokes some of those pet memories—maybe like the time that your dog ate something it shouldn’t have.”

A recent picture of Ernie, who is currently living underneath a popcorn machine. Kyle Ferguson

Sadly, we’ve heard that the South Pole Roombas are not at their Roomb-iest right now. They’re not as young as they used to be, and getting spare parts (like new batteries) is only possible during the austral summer and requires a lead time of six months. We’ll be checking in on Bert, Ernie, Sam, and Frodo towards the end of the year, once the Amundsen–Scott South Pole Station reopens for the austral summer. But for now, please enjoy the lyrics to Kyle Ferguson’s Ernie-themed parody of “5,000 Candles in the Wind,” from Parks and Recreation.

Up in Roomba Heaven, here’s the thing;

You trade your wheels for angel’s wings,

And once we’ve all said goodbye,

You stop running into walls and you learn to fly.

Bye-bye, Roomba Ernie.

You were taken from us too early.

Bye-bye, Roomba Ernie.

You’re 5,000 candles in the wind.

Though we all miss you everyday,

We know you’re up there cleaning heaven’s waste.

Here’s the part that hurts the most:

Humans cannot recharge a ghost.

Bye-bye, Roomba Ernie.

You were taken from us too early.

Bye-bye, Roomba Ernie.

You’re 5,000 candles in the wind.


Bye-bye, Roomba Ernie.

You were taken from us too early.

Bye-bye, Roomba Ernie.

You’re 5,000 candles in the wind.

Maybe some day you’ll clean these halls again.

And I know I’ll always miss my Roomb-iest friend.

Spread your wings and fly.

Special thanks to the National Science Foundation and the Polies we spoke to for this article. And if you’d like even more South Pole winter shenanigans, there’s an Antarctic Film Festival open to all of the research stations in Antarctica. Kyle Ferguson stars in John Wiff, an action movie that was written, filmed, and produced in just 48 hours, and you can watch it here (mildly NSFW for a truly astonishing amount of Nerf gun violence).

This sponsored article is brought to you by BESydney.

In the dynamic landscape of Australian technology, market advancements are often attributed to consumer-focused products like Canva and Afterpay. Capturing headlines and attention with their renowned success stories, these, along with other global companies like Atlassian, Facebook, and Apple, have become the face of the tech industry.

The accomplishments of these companies are remarkable. They generate immense wealth for stakeholders and employees and boast a staggering market value. But this high-profile side of the industry is just the tip of the iceberg. Deep tech – characterised by breakthrough scientific innovations – is where hidden impacts take place. Beneath the surface of these tech giants lies a thriving industry dedicated to researching and developing solutions that address large-scale problems, with a profound effect on society.

The power of deep tech

The tech industry in Australia is a powerhouse, employing one in 16 Australians and ranking as the country’s third-largest industry. In 2021, it accounted for 8.5 percent of the GDP, an undeniably significant contribution to the nation’s economy.

For nearly two decades, Sydney has also nurtured a thriving community of resilient problem solvers, quietly pushing the boundaries of scientific discovery. While consumer-focused tech giants often steal the spotlight, it is imperative to recognize the profound impact of deep tech solutions that operate behind the scenes.

From eco-friendly fabric manufacturing and hydrogen storage to molecular diagnostics and sustainable alternatives to plastics, Sydney’s brightest minds are tackling some of the world’s most pressing challenges.

The transformation of deep tech startups

Navigating the deep tech landscape is no small feat. These enterprises offer long-term solutions to pressing global challenges – a benefit that cannot be ignored – but deep tech innovations require significant time for research and development, often incubating for years before reaching the market.

They demand substantial investment and unwavering focus. Finding the right path to commercialization is paramount. Thankfully, incubators are emerging as champions in successfully transforming deep tech startups into thriving businesses.

“Sydney’s DNA demands a deep-rooted vision, an unwavering belief in problem-solving, and the determination to persevere despite challenges.” —Sally-Ann Williams, Cicada Innovations

Cicada Innovations is Australia’s oldest and largest deep tech incubator. It knows better than anyone the extent to which Australia’s deep tech evolution hinges on the power of startups. With over 365 resident companies incubated, over $1.7 billion raised, over $1.4 billion in exits, and over 900 patents filed, these dynamic ventures are already spearheading groundbreaking advancements.

These ventures are creating intelligent robots and pioneering scaled drone delivery to minimize the environmental impact of transportation. They’re slashing the cost of cancer drugs, offering hope for prolonged lifespans and alleviated suffering. And they’re crafting innovative farming tools to enhance agricultural yields and contribute to global food security.

A thriving hub for deep tech innovation

With its vibrant ecosystem, Sydney emerges as an ideal hub for unveiling and further developing deep tech innovations. The Australian spirit, shaped by resilience and problem-solving, thrives in this city. Sally-Ann Williams, chief executive of Cicada Innovations, affirms that “Sydney’s DNA demands a deep-rooted vision, an unwavering belief in problem-solving, and the determination to persevere despite challenges.”

The city offers a supportive community, facilitating connections and access to the talent necessary for entrepreneurs to pursue their dreams. It’s this unique blend of ingredients that fuels the growth of deep tech companies, propelling them toward success.

Discover deep tech at Tech Central

Deep tech is just one facet of what’s happening at Tech Central. While we shed light on these industry accomplishments and celebrated breakthroughs, it’s crucial to support and foster the growth of a wider industry: one that thrives on resilience, problem-solving, and visionary entrepreneurship.

Sydney – with its unique blend of community, talent, and resources – stands at the forefront of this transformative revolution, ready to propel tech innovation for the benefit of all.

For more information on Sydney’s Tech Industry and hosting your next conference in Sydney, visit

A Closer Look at Deep Tech Innovators

To truly grasp the essence of deep tech, we must explore the stories of individuals and companies that are driving change. Here are a few examples of how deep tech is flourishing at Tech Central:

Xefco: A sustainable textile revolution

Xefco is a groundbreaking new materials company revolutionizing fabric manufacturing. Its innovative process reduces water usage by up to 90% and eliminates the need for dyes and harsh chemicals. Traditionally, textile mills worldwide have polluted rivers and harmed local communities – Xefco aims to transform the textile industry, benefitting both the environment and economically disadvantaged communities worldwide.

Rux: Empowering the hydrogen economy

Another trailblazing company in Sydney’s deep tech ecosystem, Rux Energy is tackling the challenge of hydrogen storage. Hydrogen presents immense potential in the energy transition movement, but efficient and scalable storage solutions are essential for its widespread adoption. Rux is developing new materials and technologies to store hydrogen more effectively, paving the way for a cleaner and more sustainable future.

SpeeDX: Revolutionising molecular diagnostics

Amidst the global pandemic, SpeeDX, a Sydney-based company, emerged as a key player in molecular diagnostic testing and antimicrobial resistance. SpeeDX aims to address the rising concern of antibiotic overuse by providing personalized recommendations for effective treatment. This groundbreaking technology has far-reaching implications, reducing unnecessary antibiotic usage, minimizing the risk of antimicrobial resistance, and safeguarding public health on a global scale.

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IEEE SSRR 2023: 13–15 November 2023, FUKUSHIMA, JAPAN
Humanoids 2023: 12–14 December 2023, AUSTIN, TEX.
Cybathlon Challenges: 02 February 2024, ZURICH, SWITZERLAND

Enjoy today’s videos!

Digit, our human-centric robot, can now self-right and stand back up after it falls. This is footage from our testing lab, where we intentionally disable the perception systems that would normally avoid/adjust to obstacles preventing Digit from falling. For the purposes of this test, we force Digit to fall in a controlled environment to demonstrate our new self-righting and recovering ability.

[ Agility ]

With our multipick functionality, Stretch is unlocking the next level of automated unloading. Stretch can now move multiple boxes with a single swing of the arm. In typical shipping containers filled with thousands of boxes, the robot is hitting significantly higher rates of productivity.

[ Boston Dynamics ]

The moral of this video is to always give your robots a gentle pat on the sensors when they do a good job at a challenging task.

[ ANYbotics ]

Since their mass production in the early 2000s, vacuum robots have emerged as highly successful commercial products in the field of home automation. At KIMLAB, we have implemented a mobile manipulator based on a vacuum robot and an add-on mechanism by employing our PAPRAS (Plug-And-Play Robotic Arm System).

[ Paper ] via [ KIMLAB ]

Happy 100 Ikeadroneversary to Verity!

[ Verity ]

If you’re wondering what kind of black magic is making this work, the answer is the best kind of black magic: magnets.

[ Paper ] via [ Freeform Robotics ]

Honda is exploring how our all-electric prototype Honda Autonomous Work Vehicle (AWV) could address the challenges of labor shortages, safety and security, and emissions reductions to bring new value to airfield operations. The Honda AWV is designed to boost workforce productivity and support repetitive tasks that allow companies to focus their workforce on value-added activities. First introduced as a concept at CES 2018, the Honda AWV is now advancing toward commercialization.

[ Honda ]

First prototype of a bike tire treated internally with a self-healing polymer. The result is a puncture-proof inflated tire that does not need any liquid sealant. The tire is a normal bike tire with an inner tube.

[ BruBotics ]

The U.S. Navy is working on four-legged friends for sailors, and the ship’s cat is very upset.


The SMART Innovative Training Network is a joint venture between academia and industry, providing scientific and personal development of young researchers in the multidisciplinary fields of soft robotics and smart materials. SMART will realize the technologically and scientifically ambitious breakthroughs to exploit smart, stimuli-responsive material systems for actuation, sensing, and self-healing capabilities for intelligent soft devices.


Now that Rockwell Automation’s acquisition of Clearpath Robotics and OTTO Motors is complete (at something like US $600 million, according to one source), it’s more important than ever to get at least some understanding of what the future holds for those iconic yellow-and-black research robots. And it’s not just about their robots, either: Clearpath Robotics was one of the original champions of the Robot Operating System (ROS), and the company has provided a massive amount of support to the ROS community over the past decade.

At the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023) in Detroit earlier this month, we spoke with Clearpath Robotics cofounder Ryan Gariepy to get a better sense of where things are headed for Clearpath Robotics.

Now that you are part of Rockwell, what’s staying the same?

Ryan Gariepy: Both Clearpath Robotics and OTTO Motors are still very much in existence. We’re still operating with our own road maps, and Rockwell Automation has a desire to keep these brands around. We plan to keep the iconic Clearpath colors. Basically, we’re going to continue business as usual. As much as I appreciate people’s concern, we do intend to continue building this for the long-term.

“We’re now in a world where one of the largest industrial automation companies has decided that robotics is a strategic interest. We think there will be a lot of things that the robotics research community will be excited about.”
—Ryan Gariepy, Clearpath Robotics

What’s going to be different?

Gariepy: We anticipate being able to take larger risks, with more of a long-term view on some of our products and services. Rockwell also has established global scale in sales, deployment, support, supply chain, everything. It’ll really allow us to focus much more on what we’re good at, rather than having to choose between product development and operations.

Rockwell currently does a lot of stuff which is peripheral to the robotics community. They’re a global leader in motion control, in sensing, in safety—these are things that could be of great interest. I think any long-time researcher will remember the days when sensor manufacturers didn’t even support using their sensors on robots, and you had to reverse-engineer those protocols yourself. But we’re now in a world where one of the largest industrial automation companies has decided that robotics is a strategic interest. We think there will be a lot of things that the robotics research community will be excited about.

What about long-term support for existing Clearpath research robots?

Gariepy: If anything, a company like Rockwell gives us more stability rather than less stability. They’re used to supporting their products for far longer than us—the oldest Huskies are coming up on 12 or 13 years old. Rockwell has products that have been on the market for 20 years that they’re still supporting, so they very much respect that. I know that for a lot of researchers, it seems like Clearpath Robotics has been around forever, but we’ve only been around for 14 years. Rockwell has been around for 120 years.

What about TurtleBot?

Gariepy: TurtleBot 5 would be a future road map discussion, and that’s more in the hands of Open Robotics than Clearpath Robotics. We do love the TurtleBot, we’re building as many TurtleBots as we possibly can, and we have a long-term agreement with Open Robotics to continue the TurtleBot partnership. That agreement continues.

How does Rockwell feel about ROS?

Gariepy: Rockwell wants to work more with ROS, and has definitely been excited by the leadership that we have with the ROS community. There are a lot of things that we’ve been talking about on how to build on this, but I can’t really get into any details. Honestly, this is because there are so many good ideas we have, that even with this larger company, I don’t have the people to pull everything off right now.

Again, it wasn’t that many years ago when you couldn’t get an API for a manipulator arm so that you could even use it, much less have the manufacturer of that arm support ROS themselves. Things have changed substantially, and now you have a company like Rockwell becoming very excited about the potential in the ROS community.

Clearpath Robotics has of course only ever been one part of the ROS community—an important part, certainly, but the continued success of ROS has (we hope) grown beyond what might be going on at any one company. It’s a little worrisome that several other important parts of the ROS community, including Fetch Robotics and Open Robotics, have also been acquired relatively recently. So with all this in mind, we’ll be at ROSCon in New Orleans later this week to try to get a better sense of how the community feels about the future of ROS.

When Figure announced earlier this year that it was working on a general purpose humanoid robot, our excitement was tempered somewhat by the fact that the company didn’t have much to show besides renders of the robot that it hoped to eventually build. Figure had a slick looking vision, but without anything to back it up (besides a world-class robotics team, of course), it was unclear how fast they’d be able to progress.

As it turns out, they’ve progressed pretty darn fast, and today Figure is unveiling its Figure 01 robot, which has gone from nothing at all to dynamic walking in under a year.

A couple of things to note about the video, once you tear your eyes away from that shiny metal skin and the enormous battery backpack: first, the robot is walking dynamically without a tether and there are no nervous-looking roboticists within easy grabbing distance. Impressive! Dynamic walking means that there are points during the robot’s gait cycle where abruptly stopping would cause the robot to fall over, since it’s depending on momentum to keep itself in motion. It’s the kind of walking that humans do, and is significantly more difficult than a more traditional “robotic” walk, in which a robot makes sure that its center of mass is always safely positioned above a solid ground contact. Dynamic walking is also where those gentle arm swings come from—they’re helping keep the robot’s motion smooth and balanced, again in a human-like way.
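That stop-and-you-fall property has a compact textbook formalization in the Linear Inverted Pendulum Model (LIPM) and its “capture point”: the spot where the robot must place its next foot in order to come to rest. The sketch below is purely illustrative, with assumed constants and states, and is not taken from Figure’s controller:

```python
import math

G = 9.81   # gravity, m/s^2
Z0 = 0.9   # assumed constant center-of-mass height, m (illustrative)

def capture_point(x, v):
    """Where the foot must be placed for the CoM to come to rest
    (instantaneous capture point of the Linear Inverted Pendulum Model)."""
    return x + v * math.sqrt(Z0 / G)

def lipm_step(x, v, p, dt=0.001):
    """One Euler step of the LIPM: the CoM accelerates away from the
    current foot position p, so pausing mid-stride is unstable."""
    a = (G / Z0) * (x - p)
    return x + v * dt, v + a * dt

# Assumed mid-stride state: CoM 5 cm past the support foot, moving at 0.4 m/s.
x, v, foot = 0.05, 0.4, 0.0
print(f"capture point: {capture_point(x, v):.3f} m")  # ~0.17 m, well ahead of the foot

# If the robot freezes on its current foot instead of stepping there,
# the CoM diverges -- the dynamic-walking failure mode described above.
for _ in range(500):  # simulate 0.5 s of not taking the next step
    x, v = lipm_step(x, v, foot)
print(f"after 0.5 s without stepping: CoM at {x:.2f} m, speed {v:.2f} m/s")
```

A dynamic walker keeps its capture point reachable by the next footstep; a statically stable walker instead keeps its center of mass over the support polygon at all times, which is why it can pause mid-gait without falling.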

The second thing that stands out is how skinny (and shiny!) this robot is, especially if you can look past the cable routing. Figure had initially shown renderings of a robot with the form factor of a slim human, but there’s usually a pretty big difference between an initial fancy render and real hardware that shows up months later. It now looks like Figure actually has a shot at keeping to that slim design, which has multiple benefits—there are human-robot interaction considerations, where a smaller form factor is likely to be less intimidating, but more importantly, the mass you save by slimming down as much as possible leads to a robot that’s more efficient, cheaper, and safer.

Obviously, there’s a lot more going on here than Figure could squeeze into its press release, so for more details, we spoke with Jenna Reher, a senior robotics/AI engineer at Figure, and Jerry Pratt, Figure’s CTO.

What was the process like for you to teach this robot how to walk? How difficult was it to do that in a year?

Jenna Reher: We’ve been really focused on making sure that we’re validating a lot of the hardware as it’s built. With the robot that’s shown in the video, earlier this year it was just the pelvis bolted onto a test fixture. Then we added the spine joints and all the joints connected to that pelvis, and literally built the robot out from that pelvis. We added the legs and had those swinging in the air, and then built up the torso on top of that. At each of those stages, we were making sure to have a good process for validating that those low-level pieces of this overall system were really well tuned in. I think that once you get to something as complex as a whole humanoid, all that validation really saves you a lot of time on the other end, since you have a lot more confidence in the lower-level systems as you start working on higher-level behaviors like locomotion.

We also have a lot of people at the company that have experience on prior legged robotic platforms, so there’s a well of knowledge that we can draw from there. And there’s a large pool of literature that’s been published by people in the locomotion community that roboticists can now pull from. With our locomotion controller, it’s not like we’re trying to reinvent stable locomotion, so being able to implement things that we know already work is a big help.

Jerry Pratt: The walking algorithm we’re using has a lot of similarities to the ones that were developed during the DARPA Robotics Challenge. We’re doing a lot of machine learning on the perception side, but we’re not really doing any machine learning for control right now. For the walking algorithm, it’s pretty much robotics controllers 101.

And Jenna mentioned the step-by-step hardware bring-up. While that’s happening, we’re doing a lot of development on the controller in simulation to get to the point where the robot is walking in simulation pretty well, which means that we have a good chance of the controller working on the real robot once it comes online. I think as a company, we’ve done a good job coordinating all the pieces, and a lot of that has come from having people with the experience of having done this several times before.

More broadly, eight years after the DARPA Robotics Challenge, how hard is it to get a human-sized bipedal robot to walk?

Pratt: Theoretically, we understand walking pretty well now. There are a lot of different simple models, different toolboxes that are available, and about a dozen different approaches that you can take. A lot of it depends on having good hardware—it can be really difficult if you don’t have that. But for okay-ish walking on flat ground, it’s become easier and easier now with all of the prior work that’s been done.

There are still a lot of challenges for walking naturally, though. We really want to get to the point where our robot looks like a human when it walks. There are some robots that have gotten close, but none that I would say have passed the Turing test for walking, where if you looked at a silhouette of it, you’d think it was a human. Although there’s not a good business case for doing that, except that it should be more efficient.

Reher: Walking is becoming more and more understood, and also accessible to roboticists if you have the hardware for it. But there are still a lot of challenges in walking while doing something useful at the same time—interacting with your environment while moving, manipulating things while moving—those are still challenging problems.

What are some important things to look for when you see a bipedal robot walk to get a sense of how capable it might be?

Reher: I think we as humans have pretty good intuition for judging how well something is locomoting—we’re kind of hardwired to do it. So if you see buzzing oscillations, or a very stiff upper body, those may be indications that a robot’s low-level controls are not quite there. A lot of success in bipedal walking comes down to making sure that a very complex systems engineering architecture is all playing nice together.

Pratt: There have been some recent efforts to come up with performance metrics for walking. Some are kind of obvious, like walking speed. Some are harder to measure, like robustness to disturbances, because it matters what phase of gait the robot is at when it gets pushed—if you push it at just the right time, it’s much harder for it to recover. But I think the person pushing the robot test is a pretty good one. While we haven’t done pushes yet, we probably will in an upcoming video.

How important is it for your robot to be able to fall safely, and at what point do you start designing for that?

Pratt: I think it’s critical to fall safely, to survive a fall, and be able to get back up. People fall—not very often, but they do—and they get back up. And there will be times in almost any application where the robot falls for one reason or another and we’re going to have to just accept that. I often tell people working on humanoids to build in a fall behavior. If the robot’s not falling, make it fall! Because if you’re trying to make the robot so that it can never fall, it’s just too hard of a problem, and it’s going to fall anyway, and then it’ll be dangerous.

I think falling can be done safely. As long as computers are still in control of the hardware, you can do very graceful, judo-style falls. You should be able to detect where people are if you are falling, and fall away from them. So, I think we can make these robots relatively safe. The hardest part of falling, I think, is protecting your hands so they don’t break as you’re falling. But it’s definitely not an insurmountable problem.

Industrial design is a focus of Figure. [Figure]

You have a very slim and shiny robot. Did the design require any engineering compromises?

Pratt: It’s actually a really healthy collaboration. We’re trying to fit inside a medium-size female body shape, and so the industrial design team will make these really sleek looking robot silhouettes and say, “okay mechanical team, everything needs to fit in there.” And the mechanical team will be like, “we can’t fit that motor in, we need a couple more millimeters.” It’s kind of fun watching the discussions, and sometimes there will be arguments and stuff, but it almost always leads to a better design. Even if it’s simply because it causes us to look at the problem a couple of extra times.

Reher: From my perspective, the kind of interaction with the mechanical engineers that led to the robot that we have now has been very beneficial for the controls side. We have a sleeker design with lower inertia legs, which means that we’re not trying to move a lot of mass around. That ends up helping us down the line for designing control algorithms that we can execute on the hardware.

Pratt: That’s right. And keeping the legs slim allows you to do things like crossover steps—you get more range of motion because you don’t have parts of the robot bumping into each other. Self-collisions are something that you always have to worry about with a robot, so having fewer protruding cables and bumps is pretty important.

Your CEO posted a picture of some compact custom actuators that your robot is using. Do you feel like your actuator design (or something else) gives your robot some kind of secret sauce that will help it be successful?

Figure’s custom actuator (left) vs. off-the-shelf actuator (right) with equal torque. [Figure]

Pratt: At this point, it’s mostly really amazing engineering and software development and super talented people. About half of our team have worked on humanoids before, and half have worked in some related field—things like making batteries for cars, making electric motors for cars, and software and management systems for electric airplanes. There are a few things we’ve learned along the way that we hadn’t learned before. Maybe they’re not super secret things that other people don’t know, but there’s a handful of tricks that we’ve picked up from bashing our heads against some problem over and over. But there’s not a lot of new technology going into the robot, let’s put it that way.

Are there opportunities in the humanoid robot space for someone to develop a new technology that would significantly change the industry?

Pratt: Whatever it takes to open up new application areas, and to do it relatively quickly. We’re interested in things like using large language models to plan general-purpose tasks, but they’re not quite there yet. A lot of the examples that you see are at the research-y stage, where they might work until you change up what’s going on—they’re not robust. But if someone cracks that open, that’s a huge advantage.

And then hand designs. If somebody can come up with a super robust, high degree-of-freedom hand with force sensing and tactile sensing on it, that would be huge too.

The robot is designed to fit inside a medium-size female body shape. [Figure]

This is a lot of progress from Figure in a very short time, but they’re certainly not alone in their goal of developing a commercial bipedal robot, and relative to other companies who’ve had operational hardware for longer, Figure may have some catching up to do. Or they may not—until we start seeing robots doing practical tasks outside of carefully controlled environments, it’s hard to know for sure.

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ROSCon 2023: 18–20 October 2023, NEW ORLEANS
IEEE SSRR 2023: 13–15 November 2023, FUKUSHIMA, JAPAN
Humanoids 2023: 12–14 December 2023, AUSTIN, TEXAS
Cybathlon Challenges: 02 February 2024, ZURICH

Enjoy today’s videos!

Let’s not concern ourselves with whether this beautiful monstrosity of a Strandbeest is technically a robot or not and instead just enjoy watching it move.

Since the beginning of this summer I have been trying to connect several running units (Ordissen) in succession. Animaris Rex is a herd of beach animals whose specimens hold on to each other as a defense against storms. As individuals they would simply blow over, but as a group the chance of surviving a storm is greater. It is 18 meters long (5 meters longer than the largest Tyrannosaurus rex found).

[ Strandbeest ]

It’s Slightly Less Big and Significantly Bluer Hero 6!

[ Paper ]

A low-cost robot does extreme parkour, including high jumps onto obstacles 2x its height, long jumps across gaps 2x its length, handstands on stairs, and runs across tilted ramps. We show how a single neural net policy operating directly from a camera image, trained in simulation with large-scale RL, can overcome imprecise sensing and actuation to output highly precise control behavior end-to-end, and generalize to novel obstacle courses with different physical properties.

[ CMU ]

Human waiters might actually have something to worry about here.

[ LSRL ]

While traditional control methods require multiple sensory feedback for the stable and fast locomotion of quadruped robots, our recent work presents a modular neural control architecture that can encode robot dynamics for stable and robust gait generation without sensory feedback. The architecture, integrating a central pattern generator network, a premotor shaping network, and a motor-memory hypernetwork, enables a quadruped robot to walk at different walking frequencies on different terrains, including grass and uneven stone pavement.

[ Paper ] via [ NUAA ]

Thanks, Poramate!

Visual control enables quadrotors to adaptively navigate using real-time sensory data, bridging perception with action. Yet, challenges persist, including generalization across scenarios, maintaining reliability, and ensuring real-time responsiveness. This paper introduces a perception framework grounded in foundation models for universal object detection and tracking, moving beyond specific training categories.

[ ARPL ]

As always, performing a live robot demo is no small feat, but KIMLAB members embraced the challenge! MOMO (Mobile Object Manipulation Operator) stole the Demo Expo at IROS 2023 by charming everyone as it handed out candies and swept the floor with a broom. We also unveiled our armor-controlled robotic backpack.

[ KIMLAB ]

The X30 quadruped robot is a flagship product designed to meet core industry needs in multiple fields, including inspection of power stations, factories, and pipeline corridors, as well as emergency rescue, fire detection, scientific research, and more.

[ DeepRobotics ]

Robots operating in close proximity to humans rely heavily on human trust to successfully complete their tasks. But what are the real outcomes when this trust is violated? Self-defense law provides a framework for analyzing tangible failure scenarios that can inform the design of robots and their algorithms.

[ Robomechanics Lab ]