IEEE Spectrum Automation



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH
Humanoids 2024: 22–24 November 2024, NANCY, FRANCE

Enjoy today’s videos!

Not even ladders can keep you safe from quadruped robots anymore.

[ ETH Zürich Robot Systems Lab ]

Introducing Azi (right), the new desktop robot from Engineered Arts Ltd. Azi and Ameca are having a little chat, demonstrating their wide range of expressive capabilities. Engineered Arts desktop robots feature 32 actuators, 27 for facial control alone and 5 for the neck. They offer conversational AI, including GPT-4o support, which makes them great robotic companions.

[ Engineered Arts ]

Quadruped robots that individual researchers can build by themselves are crucial for expanding the scope of research due to their high scalability and customizability. In this study, we develop MEVIUS, a metal quadruped robot that can be constructed and assembled using only materials ordered through e-commerce. We have considered the minimum set of components required for a quadruped robot, employing only metal machining, sheet-metal welding, and off-the-shelf components.

[ MEVIUS from JSK Robotics Laboratory ]

Thanks Kento!

Avian perching maneuvers are one of the most frequent and agile flight scenarios, where highly optimized flight trajectories, produced by rapid wing and tail morphing that generate high angular rates and accelerations, reduce kinetic energy at impact. Here, we use optimal control methods on an avian-inspired drone with morphing wing and tail to test a recent hypothesis derived from perching maneuver experiments of Harris’ hawks that birds minimize the distance flown at high angles of attack to dissipate kinetic energy before impact.

[ EPFL Laboratory of Intelligent Systems ]

The earliest signs of bearing failures are inaudible to you, but not to Spot. Introducing acoustic vibration sensing: automate ultrasonic inspections of rotating equipment to keep your factory humming.

The only thing I want to know is whether Spot is programmed to actually do that cute little tilt when using its acoustic sensors.

[ Boston Dynamics ]

Hear from Jonathan Hurst, our co-founder and Chief Robot Officer, why legs are ideally suited for Digit’s work.

[ Agility Robotics ]

I don’t think “IP67” really does this justice.

[ ANYbotics ]

This paper presents a teleoperation system with floating robotic arms that traverse parallel cables to perform long-distance manipulation. The system benefits from the cable-based infrastructure, which is easy to set up and cost-effective, with an expandable workspace range.

[ EPFL ]

It seems to be just renderings for now, but here’s the next version of Fourier’s humanoid.

[ Fourier ]

Happy Oktoberfest from Dino Robotics!

[ Dino Robotics ]

This paper introduces a learning-based low-level controller for quadcopters, which adaptively controls quadcopters with significant variations in mass, size, and actuator capabilities. Our approach leverages a combination of imitation learning and reinforcement learning, creating a fast-adapting and general control framework for quadcopters that eliminates the need for precise model estimation or manual tuning.

[ HiPeR Lab ]

Parkour poses a significant challenge for legged robots, requiring navigation through complex environments with agility and precision based on limited sensory inputs. In this work, we introduce a novel method for training end-to-end visual policies, from depth pixels to robot control commands, to achieve agile and safe quadruped locomotion.

[ SoloParkour ]



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH
Humanoids 2024: 22–24 November 2024, NANCY, FRANCE

Enjoy today’s videos!

The interaction between humans and machines is gaining increasing importance due to the advancing degree of automation. This video showcases the development of robotic systems capable of recognizing and responding to human wishes.

By Jana Jost, Sebastian Hoose, Nils Gramse, Benedikt Pschera, and Jan Emmerich from Fraunhofer IML

[ Fraunhofer IML ]

Humans are capable of continuously manipulating a wide variety of deformable objects into complex shapes, owing largely to our ability to reason about material properties as well as our ability to reason in the presence of geometric occlusion in the object’s state. To study the robotic systems and algorithms capable of deforming volumetric objects, we introduce a novel robotics task of continuously deforming clay on a pottery wheel, and we present a baseline approach for tackling such a task by learning from demonstration.

By Adam Hung, Uksang Yoo, Jonathan Francis, Jean Oh, and Jeffrey Ichnowski from CMU Robotics Institute

[ Carnegie Mellon University Robotics Institute ]

Suction-based robotic grippers are common in industrial applications due to their simplicity and robustness, but [they] struggle with geometric complexity. Grippers that can handle varied surfaces as easily as traditional suction grippers would be more effective. Here we show how a fractal structure allows suction-based grippers to increase conformability and expand approach angle range.

By Patrick O’Brien, Jakub F. Kowalewski, Chad C. Kessens, and Jeffrey Ian Lipton from Northeastern University Transformative Robotics Lab

[ Northeastern University ]

We introduce a newly developed robotic musician designed to play an acoustic guitar in a rich and expressive manner. Unlike previous robotic guitarists, our Expressive Robotic Guitarist (ERG) is designed to play a commercial acoustic guitar while controlling a wide dynamic range, millisecond-level note generation, and a variety of playing techniques such as strumming, picking, overtones, and hammer-ons.

By Ning Yang, Amit Rogel, and Gil Weinberg from Georgia Tech

[ Georgia Tech ]

The iCub project was initiated in 2004 by Giorgio Metta, Giulio Sandini, and David Vernon to create a robotic platform for embodied cognition research. The main goals of the project were to design a humanoid robot, named iCub, to create a community by leveraging open-source licensing, and to implement several basic elements of artificial cognition and developmental robotics. More than 50 iCubs have been built and used worldwide for various research projects.

[ Istituto Italiano di Tecnologia ]

In our video, we present SCALER-B, a multi-modal versatile climbing robot that is a quadruped robot capable of standing up, bipedal locomotion, bipedal climbing, and pullups with two finger grippers.

By Yusuke Tanaka, Alexander Schperberg, Alvin Zhu, and Dennis Hong from UCLA

[ Robotics Mechanical Laboratory at UCLA ]

This video explores Waseda University’s innovative journey in developing wind instrument-playing robots, from automated performance to interactive musical engagement. Through demonstrations of technical advancements and collaborative performances, the video illustrates how Waseda University is pushing the boundaries of robotics, blending technology and artistry to create interactive robotic musicians.

By Jia-Yeu Lin and Atsuo Takanishi from Waseda University

[ Waseda University ]

This video presents a brief history of robot painting projects with the intention of educating viewers about the specific, core robotics challenges that people developing robot painters face. We focus on four robotics challenges: controls, the simulation-to-reality gap, generative intelligence, and human-robot interaction. We show how various projects tackle these challenges with quotes from experts in the field.

By Peter Schaldenbrand, Gerry Chen, Vihaan Misra, Lorie Chen, Ken Goldberg, and Jean Oh from CMU

[ Carnegie Mellon University ]

The wheeled humanoid neoDavid is one of the most complex humanoid robots worldwide. All finger joints can be controlled individually, giving the system exceptional dexterity. neoDavid's Variable Stiffness Actuators (VSAs) enable very high performance in tasks involving fast collisions, highly energetic vibrations, or explosive motions, such as hammering, using power tools (e.g., a drill hammer), or throwing a ball.

[ DLR Institute of Robotics and Mechatronics ]

This talk introduces LG Electronics' journey to commercialize robot navigation technology in areas such as homes, public spaces, and factories, and discusses the technical challenges that remain. Guided by its "Zero Labor Home" vision, LG expects its next smart-home agent robot to bring the next innovation to our lives through advances in spatial AI, i.e., the combination of robot navigation and AI technology.

By Hyoung-Rock Kim, DongKi Noh and Seung-Min Baek from LG

[ LG ]

HILARE stands for: Heuristiques Intégrées aux Logiciels et aux Automatismes dans un Robot Evolutif. The HILARE project started at the end of 1977 at LAAS (at that time, the Laboratoire d’Automatique et d’Analyse des Systèmes) under the leadership of Georges Giralt. The video features the HILARE robot and delivers explanations.

By Aurelie Clodic, Raja Chatila, Marc Vaisset, Matthieu Herrb, Stephy Le Foll, Jerome Lamy, and Simon Lacroix from LAAS/CNRS (Note that the video narration is in French with English subtitles.)

[ LAAS/CNRS ]

Humanoid legged locomotion is versatile, but typically used for reaching nearby targets. Employing a personal transporter (PT) designed for humans, such as a Segway, offers an alternative for humanoids navigating the real world, enabling them to switch from walking to wheeled locomotion for covering larger distances, similar to humans. In this work, we develop control strategies that allow humanoids to operate PTs while maintaining balance.

By Vidyasagar Rajendran, William Thibault, Francisco Javier Andrade Chavez, and Katja Mombaur from University of Waterloo

[ University of Waterloo ]

Motion planning, particularly in tight settings, is a key problem in robotics and manufacturing. One infamous example of a difficult, tight motion planning problem is the Alpha Puzzle. We present a first real-world demonstration of an Alpha Puzzle solution with a Universal Robots UR5e, using a solution path generated from our previous work.

By Dror Livnat, Yuval Lavi, Michael M. Bilevich, Tomer Buber, and Dan Halperin from Tel Aviv University

[ Tel Aviv University ]

Interaction between humans and their environment has been a key factor in the evolution and the expansion of intelligent species. Here we present methods to design and build an artificial environment through interactive robotic surfaces.

By Fabio Zuliani, Neil Chennoufi, Alihan Bakir, Francesco Bruno, and Jamie Paik from EPFL

[ EPFL Reconfigurable Robotics Lab ]

At the intersection of swarm robotics and architecture, we created the Swarm Garden, a novel responsive system to be deployed on façades. The Swarm Garden is an adaptive shading system made of a swarm of robotic modules that respond to humans and the environment while creating beautiful spaces. In this video, we showcase 35 robotic modules that we designed and built for The Swarm Garden.

By Merihan Alhafnawi, Lucia Stein-Montalvo, Jad Bendarkawi, Yenet Tafesse, Vicky Chow, Sigrid Adriaenssens, and Radhika Nagpal from Princeton University

[ Princeton University ]

My team at the University of Southern Denmark has been pioneering the field of self-recharging drones since 2017. These drones are equipped with a robust perception and navigation system, enabling them to identify powerlines and approach them for landing. A unique feature of our drones is their self-recharging capability. They accomplish this by landing on powerlines and utilizing a passively actuated gripping mechanism to secure themselves to the powerline cable.

By Emad Ebeid from University of Southern Denmark

[ University of Southern Denmark (SDU) ]

This paper explores the design and implementation of Furnituroids, shape-changing mobile furniture robots that embrace ambiguity to offer multiple and dynamic affordances for both individual and social behaviors.

By Yasuto Nakanishi from Keio University

[ Keio University ]



When we think of grasping robots, we think of manipulators of some sort on the ends of arms of some sort. Because of course we do—that’s how (most of us) are built, and that’s the mindset with which we have consequently optimized the world around us. But one of the great things about robots is that they don’t have to be constrained by our constraints, and at ICRA@40 in Rotterdam this week, we saw a new Thing: a robotic hand that can detach from its arm and then crawl around to grasp objects that would be otherwise out of reach, designed by roboticists from EPFL in Switzerland.

Fundamentally, robot hands and crawling robots share a lot of similarities, including a body along with some wiggly bits that stick out and do stuff. But most robotic hands are designed to grasp rather than crawl, and as far as I’m aware, no robotic hands have been designed to do both of those things at the same time. Since both capabilities are important, you don’t necessarily want to stick with a traditional grasping-focused hand design. The researchers employed a genetic algorithm and simulation to test a bunch of different configurations in order to optimize for the ability to hold things and to move.
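The paper doesn't spell out the details of that search, but a genetic algorithm over hand designs can be sketched roughly as follows. The genome encoding and the fitness terms here are illustrative assumptions (the real fitness would come from grasping and crawling trials in simulation), not the EPFL team's actual parameters:

```python
import random

random.seed(0)  # deterministic for this sketch

def fitness(genome):
    """Stand-in for simulation: reward a balance of grasp and crawl ability.

    The 'ideal' values 0.6 and 0.4 are made up for illustration; in the
    real pipeline each candidate hand would be scored by simulating both
    grasping and crawling.
    """
    grasp_score = 1.0 - abs(genome[0] - 0.6)  # pretend finger length matters
    crawl_score = 1.0 - abs(genome[1] - 0.4)  # pretend joint range matters
    return grasp_score + crawl_score

def evolve(pop_size=30, generations=50, mutation=0.1):
    # Genomes are just two normalized design parameters here.
    pop = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # keep the fitter half
        children = [[g + random.gauss(0, mutation) for g in p]
                    for p in parents]           # mutated offspring
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the parents survive unchanged each generation, the best design never gets worse; mutation supplies the exploration that lets the population drift toward configurations that score well on both objectives at once.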

You’ll notice that the fingers bend backwards as well as forwards, which effectively doubles the ways in which the hand (or, “Handcrawler”) can grasp objects. And it’s a little bit hard to tell from the video, but the Handcrawler attaches to the wrist using magnets for alignment along with a screw that extends to lock the hand into place.

“Although you see it in scary movies, I think we’re the first to introduce this idea to robotics.” —Xiao Gao, EPFL

The whole system is controlled manually in the video, but lead author Xiao Gao tells us that they already have an autonomous version (with external localization) working in the lab. In fact, they’ve managed to run an entire grasping sequence autonomously, with the Handcrawler detaching from the arm, crawling to a location the arm can’t reach, picking up an object, and then returning and reattaching itself to the arm again.

Beyond Manual Dexterity: Designing a Multi-fingered Robotic Hand for Grasping and Crawling, by Xiao Gao, Kunpeng Yao, Kai Junge, Josie Hughes, and Aude Billard from EPFL and MIT, was presented at ICRA@40 this week in Rotterdam.


This is a sponsored article brought to you by Khalifa University of Science and Technology.

A total of eight intense competitions to inspire creativity and innovation along with 13 forums dedicated to diverse segments of robotics and artificial intelligence will be part of the 36th edition of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024) in Abu Dhabi.

These competitions at the Middle East and North Africa (MENA) region’s first-ever global conference and exhibition from 14-18 October 2024 at the Abu Dhabi National Exhibition Center (ADNEC) will highlight some of the key aspects of robotics. These include physical or athletic intelligence of robots, remote robot navigation, robot manipulation, underwater robotics, perception and sensing as well as challenges for wildlife preservation.

This edition of IROS is one of the largest of its kind globally, thanks to active participation at all levels: 5,740 authors, 16 keynote speakers, 46 workshops, 11 tutorials, 28 exhibitors, and 12 startups. The forums at IROS will explore the rapidly evolving role of robotics in many industry sectors as well as in policy-making and regulatory areas. Leading corporations and industry professionals from across the globe are gathering for IROS 2024, which is themed “Robotics for Sustainable Development.”

“The intense robotics competitions will inspire creativity, while the products on display as well as keynotes will pave the way for more community-relevant solutions.” —Jorge Dias, IROS 2024 General Chair

Dr. Jorge Dias, IROS 2024 General Chair, said: “Such a large gathering of scientists, researchers, industry leaders and government stakeholders in Abu Dhabi for IROS 2024 also demonstrates the role of UAE in pioneering new technologies and in providing an international platform for knowledge exchange and sharing of expertise. The intense robotics competitions will inspire creativity, while the products on display as well as keynotes will pave the way for more community-relevant solutions.”

The competitions are:

In addition to these competitions, the Falcon Monitoring Challenge (FMC) will focus on advancing the field of wildlife tracking and conservation through the development of sophisticated, noninvasive monitoring systems.

Khalifa University

IROS 2024 will also include three keynote talks on ‘Robotic Competitions’ that will be moderated by Professor Lakmal Seneviratne, Director, Center for Autonomous Robotic Systems (KU-CARS), Khalifa University. The keynotes will be delivered by Professor Pedro Lima, Institute for Systems and Robotics, Instituto Superior Técnico, University of Lisbon, Portugal; Dr. Timothy Chung, General Manager, Autonomy and Robotics, Microsoft, US; and Dr. Ubbo Visser, President of the RoboCup Federation, Director of Graduate Studies, and Associate Professor of Computer Science, University of Miami, US.

The forums at IROS 2024 will include:

Other forums include:

One of the largest and most important robotics research conferences in the world, IROS 2024 provides a platform for the international robotics community to exchange knowledge and ideas about the latest advances in intelligent robots and smart machines. A total of 3,344 paper submissions representing 60 countries have been received from researchers and scientists across the world. China tops the list with more than 1,000 papers, followed by the US with 777, Germany with 302, Japan with 253, and the UK and South Korea with 173 each. The UAE leads the Arab region with 68 papers.

One of the largest and most important robotics research conferences in the world, IROS 2024 provides a platform for the international robotics community to exchange knowledge and ideas.

For eight consecutive years since 2017, Abu Dhabi has remained first on the world’s safest cities list, according to online database Numbeo, which assessed 329 global cities for the 2024 listing. This reflects the emirate’s ongoing efforts to ensure a good quality of life for citizens and residents. With a multicultural community, Abu Dhabi is home to people from more than 200 nationalities, and draws a large number of tourists to some of the top art galleries in the city such as Louvre Abu Dhabi and the Guggenheim Abu Dhabi, as well as other destinations such as Ferrari World Abu Dhabi and Warner Bros. World™ Abu Dhabi.

Because of its listing as one of the safest cities, Abu Dhabi continues to host several international conferences and exhibitions. Abu Dhabi is set to host the UNCTAD World Investment Forum, the 13th World Trade Organization (WTO) Ministerial Conference (MC13), the 12th World Environment Education Congress in 2024, and the IUCN World Conservation Congress in 2025.

IROS 2024 is sponsored by IEEE Robotics and Automation Society, Abu Dhabi Convention and Exhibition Bureau, the Robotics Society of Japan (RSJ), the Society of Instrument and Control Engineers (SICE), the New Technology Foundation, and the IEEE Industrial Electronics Society (IES).

More information at https://iros2024-abudhabi.org/



Where’s your flying car? I’m sorry to say that I have no idea. But here’s something that is somewhat similar, in that it flies, transports things, and has “car” in the name: it’s a flying cart, called the Palletrone (pallet+drone), designed for human-robot interaction-based aerial cargo transportation.

The way this thing works is fairly straightforward. The Palletrone will try to keep its roll and pitch at zero, to make sure that there’s a flat and stable platform for your preciouses, even if you don’t load those preciouses onto the drone evenly. Once loaded up, the drone relies on you to tell it where to go and what to do, using its IMU to respond to the slightest touch and translating those forces into control over the Palletrone’s horizontal, vertical, and yaw trajectories. This is particularly tricky to do, because the system has to be able to differentiate between the force exerted by cargo, and the force exerted by a human, since if the IMU senses a force moving the drone downward, it could be either. But professor Seung Jae Lee tells us that they developed “a simple but effective method to distinguish between them.”
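The "respond to the slightest touch" behavior is the kind of thing an admittance controller does: sensed force drives a virtual mass-damper whose velocity becomes the trajectory command. The gains below are assumptions for the sketch, not values from the Palletrone paper, and the cargo/human force separation is taken as already done:

```python
import numpy as np

def admittance_step(v, f_human, dt=0.01, mass=2.0, damping=4.0):
    """One integration step of a virtual mass-damper: M*dv/dt + D*v = f.

    `f_human` is the force attributed to the operator's push (the paper's
    cargo-vs-human discrimination is assumed to have happened upstream).
    The returned velocity is what the flight controller would track.
    """
    dv = (f_human - damping * v) / mass
    return v + dv * dt

v = np.zeros(3)                    # commanded x, y, z velocity
push = np.array([1.0, 0.0, 0.0])   # a steady 1 N push forward
for _ in range(1000):              # 10 s of simulated pushing
    v = admittance_step(v, push)
# v settles toward the steady state f/D = 0.25 m/s forward
```

The damping term is what makes it feel like a shopping cart: let go (force drops to zero) and the commanded velocity decays back to hover instead of coasting away.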

Since the drone has to do all of this sensing and movement without pitching or rolling (since that would dump its cargo directly onto the floor) it’s equipped with internal propeller arms that can be rotated to vector thrust in any direction. We were curious about how having a bunch of unpredictable stuff sitting right above those rotors might affect the performance of the drone. But Seung Jae Lee says that the drone’s porous side structures allow for sufficient airflow and that even when the entire top of the drone is covered, thrust is only decreased by about 5 percent.

The current incarnation of the Palletrone is not particularly smart, and you need to remain in control of it, although if you let it go it will do its best to remain stationary (until it runs out of batteries). The researchers describe the experience of using this thing as “akin to maneuvering a shopping cart,” although I would guess that it’s somewhat noisier. In the video, the Palletrone is loaded down with just under 3 kilograms of cargo, which is respectable enough for testing. The drone is obviously not powerful enough to haul your typical grocery bag up the stairs to your apartment. But, it’s a couple of steps in the right direction, at least.

We also asked Seung Jae Lee about how he envisions the Palletrone being used, besides as just a logistics platform for either commercial or industrial use. “By attaching a camera to the platform, it could serve as a flying tripod or even act as a dolly, allowing for flexible camera movements and angles,” he says. “This would be particularly useful in environments where specialized filming equipment is difficult to procure.”

And for those of you about to comment something along the lines of, “this can’t possibly have enough battery life to be real-world useful,” they’re already working to solve that, with a docking system that allows one Palletrone to change the battery of another in-flight:

One Palletrone swaps out the battery of a second Palletrone. [Image: Seoul Tech]

“The Palletrone Cart: Human-Robot Interaction-Based Aerial Cargo Transportation,” by Geonwoo Park, Hyungeun Park, Wooyong Park, Dongjae Lee, Murim Kim, and Seung Jae Lee from Seoul National University of Science and Technology in Korea, is published in IEEE Robotics and Automation Letters.



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

Zipline has (finally) posted some real live footage of its new Platform 2 drone, and while it’s just as weird looking as before, it seems to actually work really well.

[ Zipline ]

I appreciate Disney Research’s insistence on always eventually asking, “okay, but can we get this to work on a real robot in the real world?”

[ Paper from ETH Zurich and Disney Research [PDF] ]

In this video, we showcase our humanoid robot, Nadia, being remotely controlled for boxing training using a simple VR motion capture setup. A remote user takes charge of Nadia’s movements, demonstrating the power of our advanced teleoperation system. Watch as Nadia performs precise boxing moves, highlighting the potential for humanoid robots in dynamic, real-world tasks.

[ IHMC ]

Guide dogs are expensive to train and maintain—if available at all. Because of these limiting factors, relatively few blind people use them. Computer science assistant professor Donghyun Kim and Ph.D. candidate Hochul Hwang are hoping to change that with the help of UMass database analyst Gail Gunn and her guide dog, Brawny.

[ University of Massachusetts, Amherst ]

Thanks Julia!

The current paradigm for motion planning generates solutions from scratch for every new problem, which consumes significant amounts of time and computational resources. Our approach builds a large number of complex scenes in simulation, collects expert data from a motion planner, then distills it into a reactive generalist policy. We then combine this with lightweight optimization to obtain a safe path for real world deployment.

[ Neural MP ]

A nice mix of NAO and AI for embodied teaching.

[ Aldebaran ]

When retail and logistics giant Otto Group set out to strengthen its operational efficiency and safety, it turned to robotics and automation. The Otto Group has become the first company in Europe to deploy the mobile case handling robot Stretch, which unloads floor-loaded trailers and containers.

[ Boston Dynamics ]

From groceries to last-minute treats, Wing is here to make sure deliveries arrive quickly and safely. Our latest aircraft design features a larger, more standardized box and a higher payload capacity, improvements that came directly from customer and partner feedback.

[ Wing ]

It’s the jacket that gets me.

[ Devanthro ]

In this video, we introduce Rotograb, a robotic hand that merges the dexterity of human hands with the strength and efficiency of industrial grippers. Rotograb features a new rotating thumb mechanism, allowing for precision in-hand manipulation and power grasps while being adaptable. The robotic hand was developed by students during “Real World Robotics”, a master course by the Soft Robotics Lab at ETH Zurich.

[ ETH Zurich ]

A small scene where Rémi, our distinguished professor, is teaching chess to the person remotely operating Reachy! The grippers allow for easy and precise handling of chess pieces, even the small ones! The robot shown in this video is the Beta version of Reachy 2, our new robot coming very soon!

[ Pollen ]

Enhancing the adaptability and versatility of unmanned micro aerial vehicles (MAVs) is crucial for expanding their application range. In this article, we present a bimodal reconfigurable robot capable of operating in both regular quadcopter flight mode and a unique revolving flight mode, which allows independent control of the vehicle’s position and roll-pitch attitude.

[ City University Hong Kong ]

The Parallel Continuum Manipulator (PACOMA) is an advanced robotic system designed to replace traditional robotic arms in space missions, such as exploration, in-orbit servicing, and docking. Its design emphasizes robustness against misalignments and impacts, high precision and payload capacity, and sufficient mechanical damping for stable, controlled movements.

[ DFKI Robotics Innovation Center ]

Even the FPV pros from Team BlackSheep do, very occasionally, crash.

[ Team BlackSheep ]

This is a one-hour uninterrupted video of a robot cleaning bathrooms in real time. I’m not sure if it’s practical, but I am sure that it’s impressive, honestly.

[ Somatic ]



Four decades after the first IEEE International Conference on Robotics and Automation (ICRA) in Atlanta, robotics is bigger than ever. Next week in Rotterdam is the IEEE ICRA@40 conference, “a celebration of 40 years of pioneering research and technological advancements in robotics and automation.” There’s an ICRA every year, of course. Arguably the largest robotics research conference in the world, the 2024 edition was held in Yokohama, Japan back in May.

ICRA@40 is not just a second ICRA conference in 2024. Next week’s conference is a single track that promises “a journey through the evolution of robotics and automation,” through four days of short keynotes from prominent roboticists from across the entire field. You can see for yourself: the speaker list is nuts. There are also debates and panels tackling big ideas, like: “What progress has been made in different areas of robotics and automation over the past decades, and what key challenges remain?” Personally, I’d say “lots” and “most of them,” but that’s probably why I’m not going to be up on stage.

There will also be interactive research presentations, live demos, an expo, and more—the conference schedule is online now, and the abstracts are online as well. I’ll be there to cover it all, but if you can make it in person, it’ll be worth it.

Forty years ago is a long time, but it’s not that long, so just for fun, I had a look at the proceedings of ICRA 1984, which are available on IEEE Xplore, if you’re curious. Here’s an excerpt of the foreword from the organizers, which included folks from International Business Machines and Bell Labs:

The proceedings of the first IEEE Computer Society International Conference on Robotics contains papers covering practically all aspects of robotics. The response to our call for papers has been overwhelming, and the number of papers submitted by authors outside the United States indicates the strong international interest in robotics.
The Conference program includes papers on: computer vision; touch and other local sensing; manipulator kinematics, dynamics, control and simulation; robot programming languages, operating systems, representation, planning, man-machine interfaces; multiple and mobile robot systems.
The technical level of the Conference is high with papers being presented by leading researchers in robotics. We believe that this conference, the first of a series to be sponsored by the IEEE, will provide a forum for the dissemination of fundamental research results in this fast developing field.

Technically, this was “ICR,” not “ICRA,” and it was put on by the IEEE Computer Society’s Technical Committee on Robotics, since there was no IEEE Robotics and Automation Society at that time; RAS didn’t get off the ground until 1987.

1984 ICR(A) had two tracks, and featured about 75 papers presented over three days. Looking through the proceedings, you’ll find lots of familiar names: Harry Asada, Ruzena Bajcsy, Ken Salisbury, Paolo Dario, Matt Mason, Toshio Fukuda, Ron Fearing, and Marc Raibert. Many of these folks will be at ICRA@40, so if you see them, be sure to thank them for helping to start it all, because 40 years of robotics is definitely something to celebrate.


The software used to control a robot is normally highly adapted to its specific physical set up. But now researchers have created a single general-purpose robotic control policy that can operate robotic arms, wheeled robots, quadrupeds, and even drones.

One of the biggest challenges when it comes to applying machine learning to robotics is the paucity of data. While computer vision and natural language processing can piggyback off the vast quantities of image and text data found on the Internet, collecting robot data is costly and time-consuming.

To get around this, there have been growing efforts to pool data collected by different groups on different kinds of robots, including the Open X-Embodiment and DROID datasets. The hope is that training on diverse robotics data will lead to “positive transfer,” which refers to when skills learned from training on one task help to boost performance on another.

The problem is that robots often have very different embodiments—a term used to describe their physical layout and suite of sensors and actuators—so the data they collect can vary significantly. For instance, a robotic arm might be static, have a complex arrangement of joints and fingers, and collect video from a camera on its wrist. In contrast, a quadruped robot is regularly on the move and relies on force feedback from its legs to maneuver. The kinds of tasks and actions these machines are trained to carry out are also diverse: The arm may pick and place objects, while the quadruped must skillfully navigate its surroundings.

That makes training a single AI model on these large collections of data challenging, says Homer Walke, a Ph.D. student at the University of California, Berkeley. So far, researchers have either focused on data from a narrower selection of similar robots, or manually tweaked the data to make observations from different robots more similar. But in research to be presented at the Conference on Robot Learning (CoRL) in Munich in November, Walke and his colleagues unveiled a new model called CrossFormer that can train on data from a diverse set of robots and control them just as well as specialized control policies.

“We want to be able to train on all of this data to get the most capable robot,” says Walke. “The main advance in this paper is working out what kind of architecture works the best for accommodating all these varying inputs and outputs.”

How to control diverse robots with the same AI model

The team used the same model architecture that powers large language models, known as a transformer. In many ways, the challenge the researchers were trying to solve is not dissimilar to that facing a chatbot, says Walke. In language modeling, the AI has to pick out similar patterns in sentences with different lengths and word orders. Robot data can also be arranged in a sequence much like a written sentence, but depending on the particular embodiment, observations and actions vary in length and order too.

“Words might appear in different locations in a sentence, but they still mean the same thing,” says Walke. “In our task, an observation image might appear in different locations in the sequence, but it’s still fundamentally an image and we still want to treat it like an image.”
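CrossFormer's actual tokenization scheme isn't detailed here, but the idea of flattening heterogeneous robot data into one token sequence can be sketched roughly like this (all names and data structures below are illustrative, not from the paper):

```python
# Illustrative sketch (not CrossFormer's code): serialize observations
# from robots with different embodiments into one token sequence that a
# transformer could consume. Each element is tagged with its type, so an
# image token "means the same thing" wherever it lands in the sequence.

def tokenize_episode(observations):
    """Flatten a robot's observations into (type, payload) tokens."""
    tokens = []
    for obs in observations:
        if obs["kind"] == "image":
            tokens.append(("IMG", obs["data"]))       # camera frame
        elif obs["kind"] == "joints":
            # variable-length proprioception: one token per joint
            for angle in obs["data"]:
                tokens.append(("JOINT", angle))
        elif obs["kind"] == "force":
            tokens.append(("FORCE", obs["data"]))     # leg force feedback
    return tokens

# A wrist-camera arm and a quadruped produce sequences of different
# lengths and orders, but they share the same token vocabulary.
arm = [{"kind": "image", "data": "wrist_cam"},
       {"kind": "joints", "data": [0.1, -0.4, 0.7, 0.0, 1.2, -0.2, 0.3]}]
quad = [{"kind": "joints", "data": [0.2, 0.5, -0.1]},
        {"kind": "force", "data": 12.5},
        {"kind": "image", "data": "head_cam"}]

print(len(tokenize_episode(arm)))   # 8 tokens: 1 image + 7 joints
print(len(tokenize_episode(quad)))  # 5 tokens: 3 joints + 1 force + 1 image
```

The point of the type tags is exactly Walke's analogy: the model can treat an image as an image regardless of where it appears in the sequence.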

UC Berkeley/Carnegie Mellon University

Most machine learning approaches work through a sequence one element at a time, but transformers can process the entire stream of data at once. This allows them to analyze the relationship between different elements and makes them better at handling sequences that are not standardized, much like the diverse data found in large robotics datasets.

Walke and his colleagues aren’t the first to train transformers on large-scale robotics data. But previous approaches have either trained solely on data from robotic arms with broadly similar embodiments or manually converted input data to a common format to make it easier to process. In contrast, CrossFormer can process images from cameras positioned above a robot, at head height, or on a robotic arm’s wrist, as well as joint position data from both quadrupeds and robotic arms, without any tweaks.

The result is a single control policy that can operate single robotic arms, pairs of robotic arms, quadrupeds, and wheeled robots on tasks as varied as picking and placing objects, cutting sushi, and avoiding obstacles. Crucially, it matched the performance of specialized models tailored for each robot and outperformed previous approaches trained on diverse robotic data. The team even tested whether the model could control an embodiment not included in the dataset—a small quadcopter. While they simplified things by making the drone fly at a fixed altitude, CrossFormer still outperformed the previous best method.

“That was definitely pretty cool,” says Ria Doshi, an undergraduate student at Berkeley. “I think that as we scale up our policy to be able to train on even larger sets of diverse data, it’ll become easier to see this kind of zero-shot transfer onto robots that have been completely unseen in the training.”

The limitations of one AI model for all robots

The team admits there’s still work to do, however. The model is too big for any of the robots’ embedded chips and instead has to be run from a server. Even then, processing times are only just fast enough to support real-time operation, and Walke admits that could break down if they scale up the model. “When you pack so much data into a model, it has to be very big, and that means running it for real-time control becomes difficult.”

One potential workaround would be to use an approach called distillation, says Oier Mees, a postdoctoral researcher at Berkeley and part of the CrossFormer team. This essentially involves training a smaller model to mimic the larger model, and, if successful, can result in similar performance for a much smaller computational budget.
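Distillation itself is a general technique, and a minimal sketch of it (unrelated to the CrossFormer codebase; the toy teacher and student here are my own illustration) looks like this:

```python
# Minimal distillation sketch: a small "student" is trained to mimic a
# larger frozen "teacher" by regressing on the teacher's outputs rather
# than on ground-truth labels. All names here are illustrative.

def teacher(x):
    # Stand-in for a large pretrained policy (frozen during distillation).
    return 2.0 * x + 1.0

# Student: a tiny linear model y = w*x + b, trained with plain SGD
# to match the teacher's outputs on unlabeled inputs.
w, b, lr = 0.0, 0.0, 0.05
data = [i / 10.0 for i in range(-20, 21)]

for epoch in range(500):
    for x in data:
        y_teacher = teacher(x)          # soft target from the teacher
        y_student = w * x + b
        err = y_student - y_teacher
        w -= lr * err * x               # gradient of 0.5*err**2 w.r.t. w
        b -= lr * err                   # gradient w.r.t. b

print(round(w, 3), round(b, 3))  # converges toward the teacher: 2.0 1.0
```

In practice the student would be a smaller neural network and the targets would be the big model's action outputs, but the training loop has the same shape.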

More important than the computing problem, though, is that the team failed to see any positive transfer in their experiments: CrossFormer simply matched the performance of previous approaches rather than exceeding it. Walke thinks progress in computer vision and natural language processing suggests that training on more data could be the key.

Others say it might not be that simple. Jeannette Bohg, a professor of robotics at Stanford University, says the ability to train on such a diverse dataset is a significant contribution. But she wonders whether part of the reason why the researchers didn’t see positive transfer is their insistence on not aligning the input data. Previous research that trained on robots with similar observation and action data has shown evidence of such cross-overs. “By getting rid of this alignment, they may have also gotten rid of this significant positive transfer that we’ve seen in other work,” Bohg says.

It’s also not clear if the approach will boost performance on tasks specific to particular embodiments or robotic applications, says Ram Ramamoorthy, a robotics professor at the University of Edinburgh. The work is a promising step towards helping robots capture concepts common to most robots, like “avoid this obstacle,” he says. But it may be less useful for tackling control problems specific to a particular robot, such as how to knead dough or navigate a forest, which are often the hardest to solve.



This is a sponsored article brought to you by Khalifa University of Science and Technology.

Abu Dhabi-based Khalifa University of Science and Technology in the United Arab Emirates (UAE) will be hosting the 36th edition of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024) to highlight the Middle East and North Africa (MENA) region’s rapidly advancing capabilities in robotics and intelligent transport systems.


Themed “Robotics for Sustainable Development,” IROS 2024 will be held from 14–18 October 2024 at the Abu Dhabi National Exhibition Center (ADNEC) in the UAE’s capital city. It will offer a platform for universities and research institutions to display their research and innovation activities and initiatives in robotics, gathering researchers, academics, leading corporate majors, and industry professionals from around the globe.

A total of 13 forums, nine global-level competitions and challenges covering various aspects of robotics and AI, an IROS Expo, as well as an exclusive Career Fair will also be part of IROS 2024. The challenges and competitions will focus on physical or athletic intelligence of robots, remote robot navigation, robot manipulation, underwater robotics, as well as perception and sensing.

Delegates for the event will represent sectors including manufacturing, healthcare, logistics, agriculture, defense, security, and mining, with 60 percent of the talent pool having over six years of experience in robotics. Major components of the conference will be the poster sessions, keynotes, panel discussions by researchers and scientists, and networking events.

Khalifa University will be hosting IROS 2024 to highlight the Middle East and North Africa (MENA) region’s rapidly advancing capabilities in the robotics and intelligent transport systems.Khalifa University

Abu Dhabi ranks first out of 329 global cities in online database Numbeo’s 2024 list of the world’s safest cities, a title it has held for eight consecutive years since 2017, reflecting the emirate’s ongoing efforts to ensure a good quality of life for citizens and residents.

With a multicultural community, Abu Dhabi is home to people from more than 200 nationalities and draws a large number of tourists to some of the top art galleries in the city such as Louvre Abu Dhabi and the Guggenheim Abu Dhabi, as well as other destinations such as Ferrari World Abu Dhabi and Warner Bros. World Abu Dhabi.

The UAE and Abu Dhabi have increasingly become a center for creative skillsets, human capital and advanced technologies, attracting several international and regional events such as the global COP28 UAE climate summit, in which more than 160 countries participated.

Abu Dhabi city itself has hosted a number of association conventions such as the 34th International Nursing Research Congress and is set to host the UNCTAD World Investment Forum, the 13th World Trade Organization (WTO) Ministerial Conference (MC13), the 12th World Environment Education Congress in 2024, and the IUCN World Conservation Congress in 2025.

Khalifa University’s Center for Robotics and Autonomous Systems (KU-CARS) includes a vibrant multidisciplinary environment for conducting robotics and autonomous vehicle-related research and innovation.Khalifa University

Dr. Jorge Dias, IROS 2024 General Chair, said: “Khalifa University is delighted to bring the Intelligent Robots and Systems 2024 to Abu Dhabi in the UAE and highlight the innovations in line with the theme Robotics for Sustainable Development. As the region’s rapidly advancing capabilities in robotics and intelligent transport systems gain momentum, this event serves as a platform to incubate ideas, exchange knowledge, foster collaboration, and showcase our research and innovation activities. By hosting IROS 2024, Khalifa University aims to reaffirm the UAE’s status as a global innovation hub and destination for all industry stakeholders to collaborate on cutting-edge research and explore opportunities for growth within the UAE’s innovation ecosystem.”

“This event serves as a platform to incubate ideas, exchange knowledge, foster collaboration, and showcase our research and innovation activities” —Dr. Jorge Dias, IROS 2024 General Chair

Dr. Dias added: “The organizing committee of IROS 2024 has received over 4,000 submissions representing 60 countries, with China leading with 1,029 papers, followed by the U.S. (777), Germany (302), and Japan (253), as well as the U.K. and South Korea (173 each). The UAE, with a total of 68 papers, leads the Arab region.”

Driving innovation at Khalifa University is the Center for Robotics and Autonomous Systems (KU-CARS) with around 50 researchers and state-of-the-art laboratory facilities, including a vibrant multidisciplinary environment for conducting robotics and autonomous vehicle-related research and innovation.

IROS 2024 is sponsored by IEEE Robotics and Automation Society, Abu Dhabi Convention and Exhibition Bureau, the Robotics Society of Japan (RSJ), the Society of Instrument and Control Engineers (SICE), the New Technology Foundation, and the IEEE Industrial Electronics Society (IES).

More information at https://iros2024-abudhabi.org/



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

Researchers at the Max Planck Institute for Intelligent Systems and ETH Zurich have developed a robotic leg with artificial muscles. Inspired by living creatures, it jumps across different terrains in an agile and energy-efficient manner.

[ Nature ] via [ MPI ]

Thanks, Toshi!

ETH Zurich researchers have now developed a fast robotic printing process for earth-based materials that does not require cement. In what is known as “impact printing,” a robot shoots material from above, gradually building a wall. On impact, the parts bond together, and very minimal additives are required.

[ ETH Zurich ]

How could you not be excited to see this happen for real?

[ arXiv paper ]

Can we all agree that sanding, grinding, deburring, and polishing tasks are really best done by robots, for the most part?

[ Cohesive Robotics ]

Thanks, David!

Using doors is a longstanding challenge in robotics and is of significant practical interest in giving robots greater access to human-centric spaces. The task is challenging due to the need for online adaptation to varying door properties and precise control in manipulating the door panel and navigating through the confined doorway. To address this, we propose a learning-based controller for a legged manipulator to open and traverse through doors.

[ arXiv paper ]

Isaac is the first robot assistant that’s built for the home. And we’re shipping it in fall of 2025.

Fall of 2025 is a long enough time from now that I’m not even going to speculate about it.

[ Weave Robotics ]

By patterning liquid metal paste onto a soft sheet of silicone or acrylic foam tape, we developed stretchable versions of conventional rigid circuits (like Arduinos). Our soft circuits can be stretched to over 300% strain (over 4x their length) and are integrated into active soft robots.

[ Science Robotics ] via [ Yale ]

NASA’s Curiosity rover is exploring a scientifically exciting area on Mars, but communicating with the mission team on Earth has recently been a challenge due to both the current season and the surrounding terrain. In this Mars Report, Curiosity engineer Reidar Larsen takes you inside the uplink room where the team talks to the rover.

[ NASA ]

I love this and want to burn it with fire.

[ Carpentopod ]

Very often, people ask us what Reachy 2 is capable of, which is why we’re showing you the manipulation possibilities (through teleoperation) of our technology. The robot shown in this video is the Beta version of Reachy 2, our new robot coming very soon!

[ Pollen Robotics ]

The Scalable Autonomous Robots (ScalAR) Lab is an interdisciplinary lab focused on fundamental research problems in robotics that lie at the intersection of robotics, nonlinear dynamical systems theory, and uncertainty.

[ ScalAR Lab ]

Astorino is a 6-axis educational robot created for practical and affordable teaching of robotics in schools and beyond. It has been created with 3D printing, so it allows for experimentation and the possible addition of parts. With its design and programming, it replicates the actions of #KawasakiRobotics industrial robots, giving students the necessary skills for future work.

[ Astorino ]

I guess fish-fillet-shaping robots need to exist because otherwise customers will freak out if all their fish fillets are not identical, or something?

[ Flexiv ]

Watch the second episode of the ExoMars Rosalind Franklin rover mission—Europe’s ambitious exploration journey to search for past and present signs of life on Mars. The rover will dig, collect, and investigate the chemical composition of material collected by a drill. Rosalind Franklin will be the first rover to reach a depth of up to two meters below the surface, acquiring samples that have been protected from surface radiation and extreme temperatures.

[ ESA ]



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

The National Science Foundation Human AugmentatioN via Dexterity Engineering Research Center (HAND ERC) was announced in August 2024. Funded for up to 10 years and $52 million, the HAND ERC is led by Northwestern University, with core members Texas A&M, Florida A&M, Carnegie Mellon, and MIT, and support from Wisconsin-Madison, Syracuse, and an innovation ecosystem consisting of companies, national labs, and civic and advocacy organizations. HAND will develop versatile, easy-to-use dexterous robot end-effectors (hands).

[ HAND ]

The Environmental Robotics Lab at ETH Zurich, in partnership with Wilderness International (and some help from DJI and Audi), is using drones to sample DNA from the tops of trees in the Peruvian rainforest. Somehow, the treetops are where 60 to 90 percent of biodiversity is found, and these drones can help researchers determine what the heck is going on up there.

[ ERL ]

Thanks, Steffen!

1X introduces NEO Beta, “the pre-production build of our home humanoid.”

“Our priority is safety,” said Bernt Børnich, CEO at 1X. “Safety is the cornerstone that allows us to confidently introduce NEO Beta into homes, where it will gather essential feedback and demonstrate its capabilities in real-world settings. This year, we are deploying a limited number of NEO units in selected homes for research and development purposes. Doing so means we are taking another step toward achieving our mission.”

[ 1X ]

We love MangDang’s fun and affordable approach to robotics with Mini Pupper. The next generation of the little legged robot has just launched on Kickstarter, featuring new and updated robots that make it easy to explore embodied AI.

The Kickstarter is already fully funded after just a day or two, but there are still plenty of robots up for grabs.

[ Kickstarter ]

Quadrupeds in space can use their legs to reorient themselves. Or, if you throw one off a roof, it can learn to land on its feet.

To be presented at CoRL 2024.

[ ARL ]

HEBI Robotics, which apparently was once headquartered inside a Pittsburgh public bus, has imbued a table with actuators and a mind of its own.

[ HEBI Robotics ]

Carcinization is a concept in evolutionary biology where a crustacean that isn’t a crab eventually becomes a crab. So why not do the same thing with robots? Crab robots solve all problems!

[ KAIST ]

Waymo is smart, but also humans are really, really dumb sometimes.

[ Waymo ]

The Robotics Department of the University of Michigan created an interactive community art project. The group that led the creation believed that while roboticists typically take on critical and impactful problems in transportation, medicine, mobility, logistics, and manufacturing, there are many opportunities to find play and amusement. The final piece is a grid of art boxes, produced by different members of our robotics community, which offer an eight-inch square view into their own work with robotics.

[ Michigan Robotics ]

I appreciate that UBTECH’s humanoid is doing an actual job, but why would you use a humanoid for this?

[ UBTECH ]

I’m sure most actuators go through some form of lifecycle testing. But if you really want to test an electric motor, put it into a BattleBot and see what happens.

[ Hardcore Robotics ]

Yes, but have you tried fighting a BattleBot?

[ AgileX ]

In this video, we present collaborative aerial grasping and transportation using multiple quadrotors with cable-suspended payloads. Grasping using a suspended gripper requires accurate tracking of the electromagnet to ensure a successful grasp while switching between different slack and taut modes. In this work, we grasp the payload using a hybrid control approach that switches between quadrotor position control and payload position control based on cable slackness. Finally, we use two quadrotors with suspended electromagnet systems to collaboratively grasp and pick up a larger payload for transportation.

[ Hybrid Robotics ]

I had not realized that the floretizing of broccoli was so violent.

[ Oxipital ]

While the RoboCup was held over a month ago, we still wanted to make a small summary of our results, the most memorable moments, and of course a homage to everyone who is involved with the B-Human team. The team members, the sponsors, and the fans at home. Thank you so much for making B-Human the team it is!

[ B-Human ]


At ICRA 2024, Spectrum editor Evan Ackerman sat down with Unitree founder and CEO Xingxing Wang and Tony Yang, VP of Business Development, to talk about the company’s newest humanoid, the G1 model.

Smaller, more flexible, and elegant, the G1 robot is designed for general use in service and industry, and is one of the cheapest—if not the cheapest—humanoid around.



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

Imbuing robots with “human-level performance” in anything is an enormous challenge, but it’s worth it when you see a robot with the skill to interact with a human on a (nearly) human level. Google DeepMind has managed to achieve amateur human-level competence at table tennis, which is much harder than it looks, even for humans. Pannag Sanketi, a tech-lead manager in the robotics team at DeepMind, shared some interesting insights about performing the research. But first, video!

Some behind-the-scenes details from Pannag:

  • The robot had not seen any participants before. So we knew we had a cool agent, but we had no idea how it was going to fare in a full match with real humans. To witness it outmaneuver even some of the most advanced players was such a delightful moment for the team!
  • All the participants had a lot of fun playing against the robot, irrespective of who won the match. And all of them wanted to play more. Some of them said it will be great to have the robot as a playing partner. From the videos, you can even see how much fun the user study hosts sitting there (who are not authors on the paper) are having watching the games!
  • Barney, who is a professional coach, was an advisor on the project and our chief evaluator of the robot’s skills, which he assessed the way he evaluates his students. He was also surprised by how the robot was always able to learn from the last few weeks’ sessions.
  • We invested a lot in remote and automated 24x7 operations. So not the setup in this video, but there are other cells that we can run 24x7 with a ball thrower.
  • We even tried robot-vs-robot, i.e. 2 robots playing against each other! :) The line between collaboration and competition becomes very interesting when they try to learn by playing with each other.

[ DeepMind ]

Thanks, Heni!

Yoink.

[ MIT ]

Considering how their stability and recovery is often tested, teaching robot dogs to be shy of humans is an excellent idea.

[ Deep Robotics ]

Yes, quadruped robots need tow truck hooks.

[ Paper ]

Earthworm-inspired robots require novel actuators, and Ayato Kanada at Kyushu University has come up with a neat one.

[ Paper ]

Thanks, Ayato!

Meet the AstroAnt! This miniaturized swarm robot can ride atop a lunar rover and collect data related to its health, including surface temperatures and damage from micrometeoroid impacts. In the summer of 2024, with support from our collaborator Castrol, the Media Lab’s Space Exploration Initiative tested AstroAnt in the Canary Islands, where the volcanic landscape resembles the lunar surface.

[ MIT ]

Kengoro has a new forearm that mimics the human radioulnar joint, giving it an even more natural badminton swing.

[ JSK Lab ]

Thanks, Kento!

Gromit’s concern that Wallace is becoming too dependent on his inventions proves justified, when Wallace invents a “smart” gnome that seems to develop a mind of its own. When it emerges that a vengeful figure from the past might be masterminding things, it falls to Gromit to battle sinister forces and save his master… or Wallace may never be able to invent again!

[ Wallace and Gromit ]

ASTORINO is a modern 6-axis robot based on 3D printing technology. Programmable in AS-language, it facilitates the preparation of classes with ready-made teaching materials, is easy both to use and to repair, and gives the opportunity to learn and make mistakes without fear of breaking it.

[ Kawasaki ]

Engineers at NASA’s Jet Propulsion Laboratory are testing a prototype of IceNode, a robot designed to access one of the most difficult-to-reach places on Earth. The team envisions a fleet of these autonomous robots deploying into unmapped underwater cavities beneath Antarctic ice shelves. There, they’d measure how fast the ice is melting — data that’s crucial to helping scientists accurately project how much global sea levels will rise.

[ IceNode ]

Los Alamos National Laboratory, in a consortium with four other national laboratories, is leading the charge in developing best practices for locating orphaned wells. These abandoned wells can leak methane gas into the atmosphere and possibly leak liquid into the groundwater.

[ LANL ]

Looks like Fourier has been working on something new, although this is still at the point of “looks like” rather than something real.

[ Fourier ]

Bio-Inspired Robot Hands: Altus Dexterity is a collaboration between researchers and professionals from Carnegie Mellon University, UPMC, the University of Illinois and the University of Houston.

[ Altus Dexterity ]

PiPER is a lightweight robotic arm with six integrated joint motors for smooth, precise control. Weighing just 4.2 kg, it easily handles a 1.5 kg payload and is made from durable yet lightweight materials for versatile use across various environments. Available for just $2,499 USD.

[ AgileX ]

At 104 years old, Lilabel has seen over a century of automotive transformation, from sharing a single car with her family in the 1920s to experiencing her first ride in a robotaxi.

[ Zoox ]

Traditionally, blind juggling robots use plates that are slightly concave to help them with ball control, but it’s also possible to make a blind juggler the hard way. Which, honestly, is much more impressive.

[ Jugglebot ]



In the 1960s and 1970s, NASA spent a lot of time thinking about whether toroidal (donut-shaped) fuel tanks were the way to go with its spacecraft. Toroidal tanks have a bunch of potential advantages over conventional spherical fuel tanks. For example, you can fit nearly 40% more volume within a toroidal tank than if you were using multiple spherical tanks within the same space. And perhaps most interestingly, you can shove stuff (like the back of an engine) through the middle of a toroidal tank, which could lead to some substantial efficiency gains if the tanks could also handle structural loads.
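As a rough, back-of-the-envelope illustration of that volume advantage (my own simplified geometry, not NASA’s analysis): compare a torus of centerline radius R and tube radius r against the most non-overlapping spheres of the same radius r whose centers sit on that same centerline circle.

```python
import math

# Rough, illustrative check of the toroidal-vs-spherical volume claim
# (my own estimate, not NASA's): a torus with centerline radius R and
# tube radius r, versus the maximum number of non-overlapping spheres
# of radius r whose centers sit on that centerline circle.

def torus_volume(R, r):
    return 2 * math.pi**2 * R * r**2

def spheres_on_ring(R, r):
    n = int(math.pi / math.asin(r / R))     # max spheres around the ring
    return n, n * (4 / 3) * math.pi * r**3

R, r = 3.0, 1.0
n, v_spheres = spheres_on_ring(R, r)
extra = torus_volume(R, r) / v_spheres - 1
print(n, f"{extra:.0%}")   # 9 spheres; the torus holds ~57% more volume
```

This simplified packing gives roughly 50 to 57 percent more volume across tank proportions, so the article’s “nearly 40 percent” figure, which presumably accounts for tank walls and realistic packaging, is in the same ballpark.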

Because of their relatively complex shape, toroidal tanks are much more difficult to make than spherical tanks. Even though these tanks can perform better, NASA simply doesn’t have the expertise to manufacture them anymore, since each one has to be hand-built by highly skilled humans. But a company called Machina Labs thinks that they can do this with robots instead. And their vision is to completely change how we make things out of metal.

The fundamental problem that Machina Labs is trying to solve is that building metal parts at scale is slow and inflexible. Large metal parts need their own custom dies, which are very expensive one-offs that are about as inflexible as it’s possible to get, and then entire factories are built around those parts. It’s a huge investment, which means it doesn’t matter if you find some new geometry or technique or material or market, because you have to justify that enormous up-front cost by making as much of the original thing as you possibly can, stifling the potential for rapid and flexible innovation.

On the other end of the spectrum you have the also very slow and expensive process of making metal parts one at a time by hand. A few hundred years ago, this was the only way of making metal parts: skilled metalworkers using hand tools for months to make things like armor and weapons. The nice thing about an expert metalworker is that they can use their skills and experience to make anything at all, which is where Machina Labs’ vision comes from, explains CEO Edward Mehr who co-founded Machina Labs after spending time at SpaceX followed by leading the 3D printing team at Relativity Space.

“Craftsmen can pick up different tools and apply them creatively to metal to do all kinds of different things. One day they can pick up a hammer and form a shield out of a sheet of metal,” says Mehr. “Next, they pick up the same hammer, and create a sword out of a metal rod. They’re very flexible.”

The technique that a human metalworker uses to shape metal is called forging, which preserves the grain flow of the metal as it’s worked. Parts made by casting, stamping, or milling (all ways of automating metal part production) are simply not as strong or as durable as forged parts, which can be an important differentiator for (say) things that have to go into space. But more on that in a bit.

The problem with human metalworkers is that the throughput is bad—humans are slow, and highly skilled humans in particular don’t scale well. For Mehr and Machina Labs, this is where the robots come in.

“We want to automate and scale using a platform called the ‘robotic craftsman.’ Our core enablers are robots that give us the kinematics of a human craftsman, and artificial intelligence that gives us control over the process,” Mehr says. “The concept is that we can do any process that a human craftsman can do, and actually some that humans can’t do because we can apply more force with better accuracy.”

This flexibility that robot metalworkers offer also enables the crafting of bespoke parts that would be impractical to make in any other way. These include toroidal (donut-shaped) fuel tanks that NASA has had its eye on for the last half century or so.

Machina Labs’ CEO Edward Mehr (on right) stands behind a 15-foot toroidal fuel tank.Machina Labs

“The main challenge of these tanks is that the geometry is complex,” Mehr says. “Sixty years ago, NASA was bump-forming them with very skilled craftspeople, but a lot of them aren’t around anymore.” Mehr explains that the only other way to get that geometry is with dies, but for NASA, getting a die made for a fuel tank that’s necessarily been customized for one single spacecraft would be pretty much impossible to justify. “So one of the main reasons we’re not using toroidal tanks is because it’s just hard to make them.”

Machina Labs is now making toroidal tanks for NASA. For the moment, the robots are just doing the shaping, which is the tough part. Humans then weld the pieces together. But there’s no reason why the robots couldn’t do the entire process end-to-end and even more efficiently. Currently, they’re doing it the “human” way based on existing plans from NASA. “In the future,” Mehr tells us, “we can actually form these tanks in one or two pieces. That’s the next area that we’re exploring with NASA—how can we do things differently now that we don’t need to design around human ergonomics?”

Machina Labs’ ‘robotic craftsmen’ work in pairs to shape sheet metal, with one robot on each side of the sheet. The robots align their tools slightly offset from each other with the metal between them such that as the robots move across the sheet, it bends between the tools. Machina Labs

The video above shows Machina’s robots working on a tank that’s 4.572 m (15 feet) in diameter, likely destined for the Moon. “The main application is for lunar landers,” says Mehr. “The toroidal tanks bring the center of gravity of the vehicle lower than what you would have with spherical or pill-shaped tanks.”
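The two-sided forming setup described above comes down to simple geometry: for each contact point on the sheet, the supporting tool must mirror the forming tool across the local surface, offset by the sheet thickness. As a rough illustration only (this is not Machina’s planner, and the function name and parameters are hypothetical), the pose of the second tool can be sketched like this:

```python
import numpy as np

def support_tool_position(contact_point, surface_normal,
                          sheet_thickness, tool_radius_a, tool_radius_b):
    """Toy sketch of two-sided incremental forming geometry: place the
    supporting tool on the far side of the sheet, mirrored across the
    contact point along the local surface normal."""
    n = np.asarray(surface_normal, dtype=float)
    n = n / np.linalg.norm(n)  # ensure a unit normal
    p = np.asarray(contact_point, dtype=float)
    # The forming tool's center sits one tool radius above the sheet;
    # the support tool sits below, pushed out by the sheet thickness.
    forming_center = p + tool_radius_a * n
    support_center = p - (sheet_thickness + tool_radius_b) * n
    return forming_center, support_center
```

As the robots traverse a toolpath, recomputing this pair of poses at every waypoint keeps the metal pinched between the tools so it bends locally rather than buckling.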

Training these robots to work metal like this is done primarily through physics-based simulations that Machina developed in-house (existing software being too slow), followed by human-guided iterations based on the resulting real-world data. The way that metal moves under pressure can be simulated fairly well, and although there is still a sim-to-real gap (simulating how the robot’s tool adheres to the surface of the material is particularly tricky), the robots are collecting so much empirical data that Machina is making substantial progress toward full autonomy, and even finding ways to improve the process.

An example of the kind of complex metal parts that Machina’s robots are able to make. Machina Labs

Ultimately, Machina wants to use robots to produce all kinds of metal parts. On the commercial side, they’re exploring things like car body panels, offering the option to change how your car looks in geometry rather than just color. The requirement for a couple of beefy robots to make this work means that roboforming is unlikely to become as pervasive as 3D printing, but the broader concept is the same: making physical objects a software problem rather than a hardware problem to enable customization at scale.



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

I think it’s time for us all to admit that some of the most interesting bipedal and humanoid research is being done by Disney.

[ Research Paper from ETH Zurich and Disney Research ]

Over the past few months, the Unitree G1 robot has been upgraded into a mass-production version, with stronger performance, a refined appearance, and a design better suited to mass-production requirements.

[ Unitree ]

This robot is from Kinisi Robotics, which was founded by Brennand Pierce, who also founded Bear Robotics. You can’t really tell from this video, but check out the website because the reach this robot has is bonkers.

Kinisi Robotics is on a mission to democratize access to advanced robotics with our latest innovation—a low-cost, dual-arm robot designed for warehouses, factories, and supermarkets. What sets our robot apart is its integration of LLM technology, enabling it to learn from demonstrations and perform complex tasks with minimal setup. Leveraging Brennand’s extensive experience in scaling robotic solutions, we’re able to produce this robot for under $20k, making it a game-changer in the industry.

[ Kinisi Robotics ]

Thanks Bren!

Finally, something that Atlas does that I am also physically capable of doing. In theory.

Okay, never mind. I don’t have those hips.

[ Boston Dynamics ]

Researchers in the Department of Mechanical Engineering at Carnegie Mellon University have created the first legged robot of its size to run, turn, push loads, and climb miniature stairs.

They say it can “run,” but I’m skeptical that there’s a flight phase unless someone sneezes nearby.

[ Carnegie Mellon University ]

The lights are cool and all, but it’s the pulsing soft skin that’s squigging me out.

[ Paper, Robotics Reports Vol.2 ]

Roofing is a difficult and dangerous enough job that it would be great if robots could take it over. It’ll be a challenge though.

[ Renovate Robotics ] via [ TechCrunch ]

Kento Kawaharazuka from JSK Robotics Laboratory at the University of Tokyo wrote in to share this paper, just accepted at RA-L, which (among other things) shows a robot using its flexible hands to identify objects through random finger motion.

[ Paper accepted by IEEE Robotics and Automation Letters ]

Thanks Kento!

It’s one thing to make robots that are reliable, and it’s another to make robots that are reliable and repairable by the end user. I don’t think iRobot gets enough credit for this.

[ iRobot ]

I like competitions where they say, “just relax and forget about the competition and show us what you can do.”

[ MBZIRC Maritime Grand Challenge ]

I kid you not, this used to be my job.

[ RoboHike ]



Boardwalk Robotics is announcing its entry into the increasingly crowded commercial humanoid(ish) space with Alex, a “workforce transformation” humanoid upper torso designed to work in manufacturing, logistics, and maintenance.

Before we get into Alex, let me take just a minute here to straighten out how Boardwalk Robotics is related to IHMC, the Institute for Human and Machine Cognition in Pensacola, Florida. IHMC is, I think it’s fair to say, somewhat legendary when it comes to bipedal robotics—its DARPA Robotics Challenge team took second place in the final event (using a Boston Dynamics DRC Atlas), and when NASA needed someone to teach the agency’s Valkyrie humanoid to walk better, they sent it to IHMC.

Boardwalk, which was founded in 2017, has been a commercial partner with IHMC when it comes to the actual building of robots. The most visible example of this to date has been IHMC’s Nadia humanoid, a research platform which Boardwalk collaborated on and built. There’s obviously a lot of crossover between IHMC and Boardwalk in terms of institutional knowledge and experience, but Alex is a commercial robot developed entirely in-house by Boardwalk.

“We’ve used Nadia to learn a lot in the realm of dynamic locomotion research, and we’re taking all that and sticking it into a manipulation platform that’s ready for commercial work,” says Brandon Shrewsbury, Boardwalk Robotics’ CTO. “With Alex, we’re focusing on the manipulation side first, getting that well established. And then picking the mobility to match the task.”

The first thing you’ll notice about Alex is that it doesn’t have legs, at least for now. Boardwalk’s theory is that for a humanoid to be practical and cost effective in the near term, legs aren’t necessary, and that there are many tasks that offer a good return on investment where a stationary pedestal or a glorified autonomous mobile robotic base would be totally fine.

“There are going to be some problem sets that require legs, but there are many problem sets that don’t,” says Robert Griffin, a technical advisor at Boardwalk. “And there aren’t very many problem sets that don’t require halfway decent manipulation capabilities. So if we can design the manipulation well from the beginning, then we won’t have to depend on legs for making a robot that’s functionally useful.”

It certainly helps that Boardwalk isn’t at all worried about developing legs: “Every time we bring up a new humanoid, it’s something like twice as fast as the previous time,” Griffin says. This will be the eighth humanoid that IHMC has been involved in bringing up—I’d tell you more about all eight of those humanoids, but some of them are so secret that even I don’t know anything about them. Legs are definitely on the roadmap, but they’re not done yet, and IHMC will have a hand in their development to speed things along: It turns out that already having access to a functional (top of the line, really) locomotion stack is a big head start.

Alex’s actuators are all designed in-house, and the next version will feature new grippers that allow for quicker tool changes. Boardwalk Robotics

While the humanoid space is wide open right now and competition isn’t really an issue, looking ahead, Boardwalk sees safety as one of its primary differentiators since it’s not starting out with legs, says Shrewsbury. “For a full humanoid, there’s no way to make that completely safe. If it falls, it’s going to faceplant.” By keeping Alex on a stable base, it can work closer to humans and potentially move its arms much faster while also preserving a dynamic safety zone.

Alex is available for researchers to purchase immediately. Boardwalk Robotics

Despite its upbringing in research, Alex is not intended to be a research robot. You can buy it for research purposes, if you want, but Boardwalk will be selling Alex as a commercial robot. At the moment, Boardwalk is conducting pilot programs with Alex where they’re working in partnership with select customers, with the eventual goal of transitioning to a service model. The first few sectors that Boardwalk is targeting include logistics (because of course) and food processing, although as Boardwalk CEO Michael Morin notes, one of the very first pilots is (appropriately enough) in aviation.

Morin, who helped to commercialize Barrett Technology’s WAM Arm before spending some time at Vicarious Surgical as that company went public, joined Boardwalk to help turn good engineering into a good product, which is arguably the hardest part of making useful robots (besides all the other hardest parts). “A lot of these companies are just learning about humanoids for the first time,” says Morin. “That makes the customer journey longer. But we’re putting in the effort to educate them on how this could be implemented in their world.”

If you want an Alex of your very own, Boardwalk is currently selecting commercial partners for a few more pilots. And for researchers, the robot is available right now.



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

The title of this video is “Silly Robot Dog Jump” and that’s probably more than you need to know.

[ Deep Robotics ]

It’ll be great when robots are reliably autonomous, but until they get there, collaborative capabilities are a must.

[ Robust AI ]

I am so INCREDIBLY EXCITED for this.

[ IIT Instituto Italiano di Tecnologia ]

In this three-minute, one-take video, the LimX Dynamics CL-1 takes on the challenge of continuously loading heavy objects among shelves in a simulated warehouse, showcasing the advantages of the general-purpose form factor of humanoid robots.

[ LimX Dynamics ]

Birds, bats and many insects can tuck their wings against their bodies when at rest and deploy them to power flight. Whereas birds and bats use well-developed pectoral and wing muscles, how insects control their wing deployment and retraction remains unclear because this varies among insect species. Here we demonstrate that rhinoceros beetles can effortlessly deploy their hindwings without necessitating muscular activity. We validated the hypothesis using a flapping microrobot that passively deployed its wings for stable, controlled flight and retracted them neatly upon landing, demonstrating a simple, yet effective, approach to the design of insect-like flying micromachines.

[ Nature ]

Agility Robotics’ CTO, Pras Velagapudi, talks about data collection, and specifically about the different kinds we collect from our real-world robot deployments and generally what that data is used for.

[ Agility Robotics ]

Robots that try really hard but are bad at things are utterly charming.

[ University of Tokyo JSK Lab ]

The DARPA Triage Challenge unsurprisingly has a bunch of robots in it.

[ DARPA ]

The Cobalt security robot has been around for a while, but I have to say, the design really holds up—it’s a good looking robot.

[ Cobalt AI ]

All robots that enter elevators should be programmed to gently sway back and forth to the elevator music. Even if there’s no elevator music.

[ Somatic ]

ABB Robotics and the Texas Children’s Hospital have developed a groundbreaking lab automation solution using ABB’s YuMi® cobot to transfer fruit flies (Drosophila melanogaster) used in the study for developing new drugs for neurological conditions such as Alzheimer’s, Huntington’s and Parkinson’s.

[ ABB ]

Extend Robotics is building embodied AI that enables highly flexible automation for real-world physical tasks. The system features an intuitive immersive interface for teleoperation, supervision, and training AI models.

[ Extend Robotics ]

The recorded livestream of RSS 2024 is now online, in case you missed anything.

[ RSS 2024 ]



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

At ICRA 2024, in Tokyo last May, we sat down with the director of Shadow Robot, Rich Walker, to talk about the journey toward developing its newest model. Designed for reinforcement learning, the hand is extremely rugged, has three fingers that act like thumbs, and has fingertips that are highly sensitive to touch.

[ IEEE Spectrum ]

Food Angel is a food delivery robot to help with the problems of food insecurity and homelessness. Utilizing autonomous wheeled robots for this application may seem to be a good approach, especially with a number of successful commercial robotic delivery services. However, besides technical considerations such as range, payload, operation time, autonomy, etc., there are a number of important aspects that still need to be investigated, such as how the general public and the receiving end may feel about using robots for such applications, or human-robot interaction issues such as how to communicate the intent of the robot to the homeless.

[ RoMeLa ]

The UKRI FLF team RoboHike of UCL Computer Science of the Robot Perception and Learning lab with Forestry England demonstrate the ANYmal robot to help preserve the cultural heritage of an historic mine in the Forest of Dean, Gloucestershire, UK.

This clip is from a reboot of the British TV show “Time Team.” If you’re not already a fan of “Time Team,” let me just say that it is one of the greatest retro reality TV shows ever made, where actual archaeologists wander around the United Kingdom and dig stuff up. If they can find anything. Which they often can’t. And also it has Tony Robinson (from “Blackadder”), who runs everywhere for some reason. Go to Time Team Classics on YouTube for 70+ archived episodes.

[ UCL RPL ]

UBTECH’s humanoid robot Walker S Lite has been working in Zeekr’s intelligent factory for 21 consecutive days, completing handling tasks at the loading workstation and assisting employees with logistics work.

[ UBTECH ]

Current visual navigation systems often treat the environment as static, lacking the ability to adaptively interact with obstacles. This limitation leads to navigation failure when encountering unavoidable obstructions. In response, we introduce IN-Sight, a novel approach to self-supervised path planning, enabling more effective navigation strategies through interaction with obstacles.

[ ETH Zurich paper / IROS 2024 ]

When working on autonomous cars, sometimes it’s best to start small.

[ University of Pennsylvania ]

MIT MechE researchers introduce an approach called SimPLE (Simulation to Pick Localize and placE), a method of precise kitting, or pick and place, in which a robot learns to pick, regrasp, and place objects using the object’s computer-aided design (CAD) model, and all without any prior experience or encounters with the specific objects.

[ MIT ]

Staff, students (and quadruped robots!) from UCL Computer Science wish the Great Britain athletes the best of luck this summer in the Olympic Games & Paralympics.

[ UCL Robotics Institute ]

Walking in tall grass can be hard for robots, because they can’t see the ground that they’re actually stepping on. Here’s a technique to solve that, published in Robotics and Automation Letters last year.

[ ETH Zurich Robotic Systems Lab ]

There is no such thing as excess batter on a corn dog, and there is also no such thing as a defective donut. And apparently, making Kool-Aid drink pouches is harder than it looks.

[ Oxipital AI ]

Unitree has open-sourced its software to teleoperate humanoids in VR for training-data collection.

[ Unitree / GitHub ]

Nothing more satisfying than seeing point-cloud segments wiggle themselves into place, and CSIRO’s Wildcat SLAM does this better than anyone.

[ IEEE Transactions on Robotics ]

A lecture by Mentee Robotics CEO Lior Wolf, on Mentee’s AI approach.

[ Mentee Robotics ]



Today, Figure is introducing the newest, slimmest, shiniest, and least creatively named next generation of its humanoid robot: Figure 02. According to the press release, Figure 02 is the result of “a ground-up hardware and software redesign” and is “the highest performing humanoid robot,” which may even be true for some arbitrary value of “performing.” Also notable is that Figure has been actively testing robots with BMW at a manufacturing plant in Spartanburg, S.C., where the new humanoid has been performing “data collection and use case training.”

The rest of the press release is pretty much, “Hey, check out our new robot!” And you’ll get all of the content in the release by watching the videos. What you won’t get from the videos is any additional info about the robot. But we sent along some questions to Figure about these videos, and have a few answers from Michael Rose, director of controls, and Vadim Chernyak, director of hardware.

First, the trailer:

How many parts does Figure 02 have, and is this all of them?

Figure: A couple hundred unique parts and a couple thousand parts total. No, this is not all of them.

Does Figure 02 make little Figure logos with every step?

Figure: If the surface is soft enough, yes.

Swappable legs! Was that hard to do, or easier to do because you only have to make one leg?

Figure: We chose to make swappable legs to help with manufacturing.

Is the battery pack swappable too?

Figure: Our battery is swappable, but it is not a quick swap procedure.

What’s that squishy-looking stuff on the back of Figure 02’s knees and in its elbow joints?

Figure: These are soft stops which limit the range of motion in a controlled way and prevent robot pinch points.

Where’d you hide that thumb motor?

Figure: The thumb is now fully contained in the hand.

Tell me about the “skin” on the neck!

Figure: The skin is a soft fabric which is able to keep a clean seamless look even as the robot moves its head.

And here’s the reveal video:

When Figure 02’s head turns, its body turns too, and its arms move. Is that necessary, or aesthetic?

Figure: Aesthetic.

The upper torso and shoulders seem very narrow compared to other humanoids. Why is that?

Figure: We find it essential to package the robot to be of similar proportions to a human. This allows us to complete our target use cases and fit into our environment more easily.

What can you tell me about Figure 02’s walking gait?

Figure: The robot is using a model predictive controller to determine footstep locations and forces required to maintain balance and follow the desired robot trajectory.
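Figure didn’t share details of its controller, but the core idea behind this style of walking can be illustrated with the classic linear-inverted-pendulum “capture point”: given the center of mass state, there is a spot on the ground where placing the next footstep would bring the robot to rest. This toy sketch is not Figure’s MPC (which also optimizes contact forces over a horizon); the function name is our own:

```python
import math

def capture_point_step(com_pos, com_vel, com_height, g=9.81):
    """Toy one-dimensional footstep target: place the foot at the
    instantaneous capture point of a linear inverted pendulum, the
    point at which stepping would bring the center of mass to rest."""
    omega = math.sqrt(g / com_height)  # natural frequency of the pendulum
    return com_pos + com_vel / omega
```

A full model predictive controller effectively solves a version of this at every control tick, choosing footstep locations and forces over a short horizon so the robot balances while tracking its desired trajectory.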

How much runtime do you get from 2.25 kilowatt-hours doing the kinds of tasks that we see in the video?

Figure: We are targeting a 5-hour run time for our product.


Slick, but also a little sinister? Figure

This thing looks slick. I’d say that it’s maybe a little too far on the sinister side for a robot intended to work around humans, but the industrial design is badass and the packaging is excellent, with the vast majority of the wiring now integrated within the robot’s skins and flexible materials covering joints that are typically left bare. Figure, if you remember, raised a US $675 million Series B that valued the company at $2.6 billion, and somehow the look of this robot seems appropriate to that.

I do still have some questions about Figure 02, such as where the interesting foot design came from and whether a 16-degree-of-freedom hand is really worth it in the near term. It’s also worth mentioning that Figure seems to have a fair number of Figure 02 robots running around—at least five units at its California headquarters, plus potentially a couple more at the BMW Spartanburg manufacturing facility.

I also want to highlight this boilerplate at the end of the release: “our humanoid is designed to perform human-like tasks within the workforce and in the home.” We are very, very far away from a humanoid robot in the home, but I appreciate that it’s still an explicit goal that Figure is trying to achieve. Because I want one.



Rodney Brooks is the Panasonic Professor of Robotics (emeritus) at MIT, where he was director of the AI Lab and then CSAIL. He has been cofounder of iRobot, Rethink Robotics, and Robust AI, where he is currently CTO. This article is shared with permission from his blog.

Here are some of the things I’ve learned about robotics after working in the field for almost five decades. In honor of Isaac Asimov and Arthur C. Clarke, my two boyhood go-to science fiction writers, I’m calling them my three laws of robotics.

  1. The visual appearance of a robot makes a promise about what it can do and how smart it is. It needs to deliver or slightly overdeliver on that promise or it will not be accepted.
  2. When robots and people coexist in the same spaces, the robots must not take away from people’s agency, particularly when the robots are failing, as inevitably they will at times.
  3. Technologies for robots need 10+ years of steady improvement beyond lab demos of the target tasks to mature to low cost and to have their limitations characterized well enough that they can deliver 99.9 percent of the time. Every 10 more years gets another 9 in reliability.

Below I explain each of these laws in more detail. In a related post, I lay out my three laws of artificial intelligence.

Note that these laws are written from the point of view of making robots work in the real world, where people pay for them, and where people want return on their investment. This is very different from demonstrating robots or robot technologies in the laboratory.

In the lab there is a phalanx of graduate students eager to demonstrate their latest idea, on which they have worked very hard, to show its plausibility. Their interest is in showing that a technique or technology that they have developed is plausible and promising. They will do everything in their power to nurse the robot through the demonstration to make that point, and they will eagerly explain everything about what they have developed and what could come next.

In the real world there is just the customer, or the employee or relative of the customer. The robot has to work with no external intervention from the people who designed and built it. It needs to be a good experience for the people around it or there will not be more sales to those, and perhaps other, customers.

So these laws are not about what might, or could, be done. They are about real robots deployed in the real world. The laws are not about research demonstrations. They are about robots in everyday life.

The Promise Given By Appearance

My various companies have produced all sorts of robots and sold them at scale. A lot of thought goes into the visual appearance of the robot when it is designed, as that tells the buyer or user what to expect from it.

The iRobot Roomba was carefully designed to meld looks with function. iStock

The Roomba, from iRobot, looks like a flat disk. It cleans floors. The disk shape was so that it could turn in place without hitting anything it wasn’t already hitting. The low profile of the disk was so that it could get under the toe kicks in kitchens and clean the floor that is overhung just a little by kitchen cabinets. It does not look like it can go up and down stairs or even a single step up or step down in a house and it cannot. It has a handle, which makes it look like it can be picked up by a person, and it can be. Unlike fictional Rosey the Robot it does not look like it could clean windows, and it cannot. It cleans floors, and that is it.

The Packbot, the remotely operable military robot, also from iRobot, looks very different indeed. It has tracked wheels, like a miniature tank, and that appearance promises anyone who looks at it that it can go over rough terrain, and is not going to be stopped by steps or rocks or drops in terrain. When the Fukushima disaster happened, in 2011, Packbots were able to operate in the reactor buildings that had been smashed and wrecked by the tsunami, open door handles under remote control, drive up rubble-covered staircases, and get their cameras pointed at analog pressure and temperature gauges so that workers trying to safely secure the nuclear plant had some data about what was happening in highly radioactive areas of the plant.

An iRobot PackBot picks up a demonstration object at the Joint Robotics Repair Detachment at Victory Base Complex in Baghdad. Alamy

The point of this first law of robotics is to warn against making a robot appear more than it actually is. Perhaps that will get funding for your company, leading investors to believe that in time the robot will be able to do all the things its physical appearance suggests it might be able to do. But it is going to disappoint customers when it cannot do the sorts of things that something with that physical appearance looks like it can do. Glamming up a robot risks overpromising what the robot as a product can actually do. That risks disappointing customers. And disappointed customers are not going to be advocates for your product/robot, nor be repeat buyers.

Preserving People’s Agency

The worst thing a robot can do for its acceptance in the workplace is to make people’s jobs or lives harder by not letting them do what they need to do.

Robots that work in hospitals taking dirty sheets or dishes from a patient floor to where they are to be cleaned are meant to make the lives of the nurses easier. But often they do exactly the opposite. If the robots are not aware of what is happening and do not get out of the way when there is an emergency they will probably end up blocking some lifesaving work by the nurses—e.g., pushing a gurney with a critically ill patient on it to where they need to be for immediate treatment. That does not endear such a robot to the hospital staff. It has interfered with their main job function, a function of which the staff is proud, and what motivates them to do such work.

A lesser, but still unacceptable, behavior of robots in hospitals is to have them wait centrally in front of elevator doors, blocking people. It makes it harder for people to do something they need to do all the time in that environment—enter and exit elevators.

Those of us who live in San Francisco or Austin, Texas, have had firsthand views of robots annoying people daily for the last few years. The robots in question have been autonomous vehicles, driving around the city with no human occupant. I see these robots every single time I leave my house, whether on foot or by car.

Some of the vehicles were notorious for blocking intersections, and there was absolutely nothing that other drivers, pedestrians, or police could do. We just had to wait until some remote operator hidden deep inside the company that deployed them decided to pay attention to the stuck vehicle and get it out of people’s way. Worse, they would wander into the scene of a fire where there were fire trucks and firefighters and actual buildings on fire, get confused, and just stop, sometimes on top of the fire hoses.

There was no way for the firefighters to move the vehicles, nor communicate with them. This is in contrast to an automobile driven by a human driver. Firefighters can use their normal social interactions to communicate with a driver, and use their privileged position in society as frontline responders to apply social pressure on a human driver to cooperate with them. Not so with the autonomous vehicles.

The autonomous vehicles took agency from people going about their regular business on the streets, but worse took away agency from firefighters whose role is to protect other humans. Deployed robots that do not respect people and what they need to do will not get respect from people and the robots will end up undeployed.

Robust Robots That Work Every Time

Making robots that work reliably in the real world is hard. In fact, making anything that works physically in the real world, and is reliable, is very hard.

For a customer to be happy with a robot it must appear to work every time it tries a task, otherwise it will frustrate the user to the point that they will question whether it makes their life better or not.

But what does appear mean here? It means that the user can assume that it is going to work, as their default understanding of what will happen in the world.

The tricky part is that robots interact with the real physical world.

Software programs interact with a well-understood abstracted machine, so they tend not to fail in a manner where the instructions in them do not get executed in a consistent way by the hardware on which they are running. Those same programs may also interact with the physical world, be it a human being, a network connection, or an input device like a mouse. It is then that the programs might fail, as the instructions in them are based on assumptions about the real world that are not met.

Robots are subject to forces in the real world, subject to the exact position of objects relative to them, and subject to interacting with humans who are very variable in their behavior. There are no teams of graduate students or junior engineers eager to make the robot succeed on the 8,354th attempt to do the same thing that has worked so many times before. Getting software that adequately adapts to the uncertain changes in the world in that particular instance and that particular instant of time is where the real challenge arises in robotics.

Great-looking videos are just not the same thing as working for a customer every time. Most of what we see in the news about robots is lab demonstrations. There is no data on how general the solution is, nor how many takes it took to get the video that is shown. Even worse, sometimes the videos are teleoperated or sped up many times over.

I have rarely seen a new technology that is less than ten years out from a lab demo make it into a deployed robot. It takes time to see how well the method works, and to characterize it well enough that it is unlikely to fail in a deployed robot that is working by itself in the real world. Even then there will be failures, and it takes many more years of shaking out the problem areas and building it into the robot product in a defensive way so that the failure does not happen again.
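One back-of-envelope way to read the third law (this is our gloss, not a formula Brooks gives) is that roughly ten years of hardening beyond the lab demo buys the three nines of 99.9 percent reliability, and each further decade adds another nine:

```python
def expected_reliability(years_of_maturation):
    """Illustrative reading of the third law: ~10 years of post-demo
    hardening yields 99.9% reliability (three nines), and every
    additional decade adds one more nine."""
    if years_of_maturation < 10:
        return None  # the law makes no promise before a decade of maturation
    nines = 3 + (years_of_maturation - 10) / 10.0
    return 1.0 - 10.0 ** (-nines)
```

On this reading, a twenty-year-old technology would be expected to work 99.99 percent of the time—which is why so few lab demos survive contact with paying customers.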

Most robots require kill buttons or e-stops on them so that a human can shut them down. If a customer ever feels the need to hit that button, then the people who have built and sold the robot have failed. They have not made it operate well enough that the robot never gets into a state where things go that wrong.
