Feed aggregator



Where’s your flying car? I’m sorry to say that I have no idea. But here’s something that is somewhat similar, in that it flies, transports things, and has “car” in the name: it’s a flying cart, called the Palletrone (pallet+drone), designed for human-robot interaction-based aerial cargo transportation.

The way this thing works is fairly straightforward. The Palletrone will try to keep its roll and pitch at zero, to make sure that there’s a flat and stable platform for your preciouses, even if you don’t load those preciouses onto the drone evenly. Once loaded up, the drone relies on you to tell it where to go and what to do, using its IMU to respond to the slightest touch and translate those forces into control over the Palletrone’s horizontal, vertical, and yaw trajectories. This is particularly tricky to do, because the system has to be able to differentiate between the force exerted by cargo and the force exerted by a human: if the IMU senses a force moving the drone downward, it could be either. But professor Seung Jae Lee tells us that they developed “a simple but effective method to distinguish between them.”
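The paper’s actual control law isn’t reproduced here, but force-to-motion interfaces like this are commonly built on admittance control, where a virtual mass and damper turn a sensed push into a velocity command. Here’s a minimal sketch under that assumption; the gains and force values are invented for illustration:

```python
import numpy as np

def admittance_step(v, f_ext, dt, mass=2.0, damping=4.0):
    """One step of a first-order admittance law: M*dv/dt + D*v = f_ext.

    A sensed external push f_ext (N, 3-vector) is turned into a velocity
    command v (m/s), so the platform "gives way" like a shopping cart.
    mass and damping are illustrative virtual parameters, not the
    Palletrone's actual gains.
    """
    v = np.asarray(v, dtype=float)
    f_ext = np.asarray(f_ext, dtype=float)
    dv = (f_ext - damping * v) / mass
    return v + dv * dt

# A steady 2 N push along +x: velocity settles toward f/D = 0.5 m/s,
# and decays back to zero once the hand lets go.
v = np.zeros(3)
for _ in range(2000):
    v = admittance_step(v, np.array([2.0, 0.0, 0.0]), dt=0.005)
```

The virtual damping is what makes the cart feel "heavy but compliant": a larger `damping` means the same push produces a slower drift, which is the knob you’d tune for a comfortable shopping-cart feel.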

Because the drone has to do all of this sensing and movement without pitching or rolling (which would dump its cargo directly onto the floor), it’s equipped with internal propeller arms that can be rotated to vector thrust in any direction. We were curious about how having a bunch of unpredictable stuff sitting right above those rotors might affect the performance of the drone, but Seung Jae Lee says that the drone’s porous side structures allow for sufficient airflow, and that even when the entire top of the drone is covered, thrust is only decreased by about 5 percent.
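To make the geometry concrete: if the body must stay level, the rotor arms have to tilt so that the total thrust vector points along whatever force the controller wants. A toy decomposition of a desired body-frame force into tilt, azimuth, and magnitude (this is not the Palletrone’s actual thrust allocation, just the underlying geometry):

```python
import numpy as np

def vector_thrust(force):
    """Decompose a desired body-frame force into (tilt, azimuth, magnitude).

    For a platform that must hold zero roll/pitch, lateral motion can't
    come from tilting the whole body; instead, internal rotor arms tilt
    so the summed thrust points along `force`.
    """
    f = np.asarray(force, dtype=float)
    magnitude = np.linalg.norm(f)
    tilt = np.arctan2(np.hypot(f[0], f[1]), f[2])   # angle away from vertical
    azimuth = np.arctan2(f[1], f[0])                # heading of lateral component
    return tilt, azimuth, magnitude

# Hover plus a gentle 1 N forward push against 20 N of lift:
# thrust stays mostly vertical, with only a small arm tilt.
tilt, az, mag = vector_thrust([1.0, 0.0, 20.0])
```

Pure hover (`[0, 0, 20]`) gives zero tilt; the tilt angle grows only as the commanded lateral force grows, which is why the cargo platform can stay flat while the cart scoots sideways.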

The current incarnation of the Palletrone is not particularly smart, and you need to remain in control of it, although if you let it go it will do its best to remain stationary (until it runs out of battery). The researchers describe the experience of using this thing as “akin to maneuvering a shopping cart,” although I would guess that it’s somewhat noisier. In the video, the Palletrone is loaded down with just under 3 kilograms of cargo, which is respectable enough for testing, though the drone is obviously not powerful enough to haul your typical grocery bag up the stairs to your apartment. But it’s a couple of steps in the right direction, at least.

We also asked Seung Jae Lee about how he envisions the Palletrone being used, besides as just a logistics platform for either commercial or industrial use. “By attaching a camera to the platform, it could serve as a flying tripod or even act as a dolly, allowing for flexible camera movements and angles,” he says. “This would be particularly useful in environments where specialized filming equipment is difficult to procure.”

And for those of you about to comment something along the lines of, “this can’t possibly have enough battery life to be real-world useful,” they’re already working to solve that, with a docking system that allows one Palletrone to change the battery of another in-flight:

One Palletrone swaps out the battery of a second Palletrone. [Seoul Tech]

“The Palletrone Cart: Human-Robot Interaction-Based Aerial Cargo Transportation,” by Geonwoo Park, Hyungeun Park, Wooyong Park, Dongjae Lee, Murim Kim, and Seung Jae Lee from Seoul National University of Science and Technology in Korea, is published in IEEE Robotics and Automation Letters.





Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, Rotterdam, Netherlands
IROS 2024: 14–18 October 2024, Abu Dhabi, UAE
ICSR 2024: 23–26 October 2024, Odense, Denmark
Cybathlon 2024: 25–27 October 2024, Zurich

Enjoy today’s videos!

Zipline has (finally) posted some real live footage of its new Platform 2 drone, and while it’s just as weird looking as before, it seems to actually work really well.

[ Zipline ]

I appreciate Disney Research’s insistence on always eventually asking, “okay, but can we get this to work on a real robot in the real world?”

[ Paper from ETH Zurich and Disney Research [PDF] ]

In this video, we showcase our humanoid robot, Nadia, being remotely controlled for boxing training using a simple VR motion capture setup. A remote user takes charge of Nadia’s movements, demonstrating the power of our advanced teleoperation system. Watch as Nadia performs precise boxing moves, highlighting the potential for humanoid robots in dynamic, real-world tasks.

[ IHMC ]

Guide dogs are expensive to train and maintain—if available at all. Because of these limiting factors, relatively few blind people use them. Computer science assistant professor Donghyun Kim and Ph.D. candidate Hochul Hwang are hoping to change that with the help of UMass database analyst Gail Gunn and her guide dog, Brawny.

[ University of Massachusetts, Amherst ]

Thanks Julia!

The current paradigm for motion planning generates solutions from scratch for every new problem, which consumes significant amounts of time and computational resources. Our approach builds a large number of complex scenes in simulation, collects expert data from a motion planner, then distills it into a reactive generalist policy. We then combine this with lightweight optimization to obtain a safe path for real world deployment.

[ Neural MP ]

A nice mix of NAO and AI for embodied teaching.

[ Aldebaran ]

When retail and logistics giant Otto Group set out to strengthen its operational efficiency and safety, it turned to robotics and automation. The Otto Group has become the first company in Europe to deploy the mobile case handling robot Stretch, which unloads floor-loaded trailers and containers.

[ Boston Dynamics ]

From groceries to last-minute treats, Wing is here to make sure deliveries arrive quickly and safely. Our latest aircraft design features a larger, more standardized box and can carry a higher payload, changes that came directly from customer and partner feedback.

[ Wing ]

It’s the jacket that gets me.

[ Devanthro ]

In this video, we introduce Rotograb, a robotic hand that merges the dexterity of human hands with the strength and efficiency of industrial grippers. Rotograb features a new rotating thumb mechanism, allowing for precision in-hand manipulation and power grasps while being adaptable. The robotic hand was developed by students during “Real World Robotics”, a master course by the Soft Robotics Lab at ETH Zurich.

[ ETH Zurich ]

A small scene where Rémi, our distinguished professor, is teaching chess to the person remotely operating Reachy! The grippers allow for easy and precise handling of chess pieces, even the small ones! The robot shown in this video is the Beta version of Reachy 2, our new robot coming very soon!

[ Pollen ]

Enhancing the adaptability and versatility of unmanned micro aerial vehicles (MAVs) is crucial for expanding their application range. In this article, we present a bimodal reconfigurable robot capable of operating in both regular quadcopter flight mode and a unique revolving flight mode, which allows independent control of the vehicle’s position and roll-pitch attitude.

[ City University Hong Kong ]

The Parallel Continuum Manipulator (PACOMA) is an advanced robotic system designed to replace traditional robotic arms in space missions, such as exploration, in-orbit servicing, and docking. Its design emphasizes robustness against misalignments and impacts, high precision and payload capacity, and sufficient mechanical damping for stable, controlled movements.

[ DFKI Robotics Innovation Center ]

Even the FPV pros from Team BlackSheep do, very occasionally, crash.

[ Team BlackSheep ]

This is a one-hour uninterrupted video of a robot cleaning bathrooms in real time. I’m not sure if it’s practical, but I am sure that it’s impressive, honestly.

[ Somatic ]





Four decades after the first IEEE International Conference on Robotics and Automation (ICRA) in Atlanta, robotics is bigger than ever. Next week in Rotterdam is the IEEE ICRA@40 conference, “a celebration of 40 years of pioneering research and technological advancements in robotics and automation.” There’s an ICRA every year, of course. Arguably the largest robotics research conference in the world, the 2024 edition was held in Yokohama, Japan back in May.

ICRA@40 is not just a second ICRA conference in 2024. Next week’s conference is a single track that promises “a journey through the evolution of robotics and automation,” through four days of short keynotes from prominent roboticists from across the entire field. You can see for yourself: the speaker list is nuts. There are also debates and panels tackling big ideas, like: “What progress has been made in different areas of robotics and automation over the past decades, and what key challenges remain?” Personally, I’d say “lots” and “most of them,” but that’s probably why I’m not going to be up on stage.

There will also be interactive research presentations, live demos, an expo, and more—the conference schedule is online now, and the abstracts are online as well. I’ll be there to cover it all, but if you can make it in person, it’ll be worth it.

Forty years ago is a long time, but it’s not that long, so just for fun, I had a look at the proceedings of ICRA 1984, which are available on IEEE Xplore if you’re curious. Here’s an excerpt of the foreword from the organizers, which included folks from International Business Machines and Bell Labs:

The proceedings of the first IEEE Computer Society International Conference on Robotics contains papers covering practically all aspects of robotics. The response to our call for papers has been overwhelming, and the number of papers submitted by authors outside the United States indicates the strong international interest in robotics.
The Conference program includes papers on: computer vision; touch and other local sensing; manipulator kinematics, dynamics, control and simulation; robot programming languages, operating systems, representation, planning, man-machine interfaces; multiple and mobile robot systems.
The technical level of the Conference is high with papers being presented by leading researchers in robotics. We believe that this conference, the first of a series to be sponsored by the IEEE, will provide a forum for the dissemination of fundamental research results in this fast developing field.

Technically, this was “ICR,” not “ICRA,” and it was put on by the IEEE Computer Society’s Technical Committee on Robotics, since there was no IEEE Robotics and Automation Society at that time; RAS didn’t get off the ground until 1987.

1984 ICR(A) had two tracks, and featured about 75 papers presented over three days. Looking through the proceedings, you’ll find lots of familiar names: Harry Asada, Ruzena Bajcsy, Ken Salisbury, Paolo Dario, Matt Mason, Toshio Fukuda, Ron Fearing, and Marc Raibert. Many of these folks will be at ICRA@40, so if you see them, make sure and thank them for helping to start it all, because 40 years of robotics is definitely something to celebrate.



The software used to control a robot is normally highly adapted to its specific physical setup. But now researchers have created a single general-purpose robotic control policy that can operate robotic arms, wheeled robots, quadrupeds, and even drones.

One of the biggest challenges when it comes to applying machine learning to robotics is the paucity of data. While computer vision and natural language processing can piggyback off the vast quantities of image and text data found on the Internet, collecting robot data is costly and time-consuming.

To get around this, there have been growing efforts to pool data collected by different groups on different kinds of robots, including the Open X-Embodiment and DROID datasets. The hope is that training on diverse robotics data will lead to “positive transfer,” which refers to when skills learned from training on one task help to boost performance on another.

The problem is that robots often have very different embodiments—a term used to describe their physical layout and suite of sensors and actuators—so the data they collect can vary significantly. For instance, a robotic arm might be static, have a complex arrangement of joints and fingers, and collect video from a camera on its wrist. In contrast, a quadruped robot is regularly on the move and relies on force feedback from its legs to maneuver. The kinds of tasks and actions these machines are trained to carry out are also diverse: The arm may pick and place objects, while the quadruped needs keen navigation.

That makes training a single AI model on these large collections of data challenging, says Homer Walke, a Ph.D. student at the University of California, Berkeley. So far, most attempts have either focused on data from a narrower selection of similar robots, or relied on researchers manually tweaking the data to make observations from different robots more similar. But in research to be presented at the Conference on Robot Learning (CoRL) in Munich in November, Walke and his colleagues unveiled a new model called CrossFormer that can train on data from a diverse set of robots and control them just as well as specialized control policies.

“We want to be able to train on all of this data to get the most capable robot,” says Walke. “The main advance in this paper is working out what kind of architecture works the best for accommodating all these varying inputs and outputs.”

How to control diverse robots with the same AI model

The team used the same model architecture that powers large language models, known as a transformer. In many ways, the challenge the researchers were trying to solve is not dissimilar to the one facing a chatbot, says Walke. In language modeling, the AI has to pick out similar patterns in sentences with different lengths and word orders. Robot data can also be arranged in a sequence much like a written sentence, but depending on the particular embodiment, observations and actions vary in length and order too.

“Words might appear in different locations in a sentence, but they still mean the same thing,” says Walke. “In our task, an observation image might appear in different locations in the sequence, but it’s still fundamentally an image and we still want to treat it like an image.”

UC Berkeley/Carnegie Mellon University

Most machine learning approaches work through a sequence one element at a time, but transformers can process the entire stream of data at once. This allows them to analyze the relationship between different elements and makes them better at handling sequences that are not standardized, much like the diverse data found in large robotics datasets.
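As a toy illustration of that whole-sequence property (and emphatically not CrossFormer’s actual architecture, which the article doesn’t detail), here is a single head of scaled dot-product attention over a padded, masked token sequence; the token values, dimensions, and mask are all made up:

```python
import numpy as np

def attention(tokens, mask):
    """Single-head scaled dot-product attention (toy, no learned weights).

    tokens: (n, d) array -- a sequence of embedded observations/actions.
    mask:   (n,) bool    -- True for real tokens, False for padding.
    Every real token attends to every other real token in one pass, which
    is what lets a transformer ingest sequences whose length and ordering
    differ across robot embodiments.
    """
    d = tokens.shape[1]
    scores = tokens @ tokens.T / np.sqrt(d)          # (n, n) pairwise affinities
    scores[:, ~mask] = -np.inf                       # padded positions get zero weight
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)    # softmax over real tokens
    return weights @ tokens                          # each output mixes all real tokens

# Two "embodiments" with different sequence lengths go through one call:
rng = np.random.default_rng(0)
arm  = rng.normal(size=(5, 8))                  # e.g. wrist-image + joint tokens
quad = rng.normal(size=(3, 8))                  # e.g. leg-state tokens, padded to 5
padded = np.vstack([quad, np.zeros((2, 8))])
out = attention(padded, mask=np.array([True, True, True, False, False]))
```

The mask is doing the work the researchers would otherwise do by hand: shorter embodiment sequences are simply padded out, and the attention weights for the padding are forced to zero, so nothing about the layout has to be normalized in advance.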

Walke and his colleagues aren’t the first to train transformers on large-scale robotics data. But previous approaches have either trained solely on data from robotic arms with broadly similar embodiments, or manually converted input data to a common format to make it easier to process. In contrast, CrossFormer can process images from cameras positioned above a robot, at head height, or on a robotic arm’s wrist, as well as joint position data from both quadrupeds and robotic arms, without any tweaks.

The result is a single control policy that can operate single robotic arms, pairs of robotic arms, quadrupeds, and wheeled robots on tasks as varied as picking and placing objects, cutting sushi, and obstacle avoidance. Crucially, it matched the performance of specialized models tailored for each robot and outperformed previous approaches trained on diverse robotic data. The team even tested whether the model could control an embodiment not included in the dataset—a small quadcopter. While they simplified things by making the drone fly at a fixed altitude, CrossFormer still outperformed the previous best method.

“That was definitely pretty cool,” says Ria Doshi, an undergraduate student at Berkeley. “I think that as we scale up our policy to be able to train on even larger sets of diverse data, it’ll become easier to see this kind of zero shot transfer onto robots that have been completely unseen in the training.”

The limitations of one AI model for all robots

The team admits there’s still work to do, however. The model is too big for any of the robots’ embedded chips and instead has to be run from a server. Even then, processing times are only just fast enough to support real-time operation, and Walke admits that could break down if they scale up the model. “When you pack so much data into a model it has to be very big and that means running it for real-time control becomes difficult.”

One potential workaround would be to use an approach called distillation, says Oier Mees, a postdoctoral researcher at Berkeley and part of the CrossFormer team. This essentially involves training a smaller model to mimic the larger model, and if successful can result in similar performance for a much smaller computational budget.
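Mees mentions distillation only as a possibility, but the idea is easy to show generically (this is not CrossFormer code; the teacher, student, and data below are all synthetic): fit a small student model to the outputs of a larger teacher, rather than to the original labels.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Teacher": a bigger (here, nonlinear) model we can query but not deploy.
W1, W2 = rng.normal(size=(32, 4)), rng.normal(size=(1, 32))
teacher = lambda X: np.tanh(X @ W1.T) @ W2.T

# Distillation: train a tiny linear student on the teacher's *outputs*
# (soft targets) instead of on ground-truth labels.
X = rng.normal(size=(512, 4))
y_teacher = teacher(X)
w, *_ = np.linalg.lstsq(X, y_teacher, rcond=None)   # least-squares fit
student = lambda X: X @ w                           # 4 parameters vs. 160

# The student won't match the teacher exactly, but it should beat a
# trivial predictor while being far cheaper to run.
X_test = rng.normal(size=(128, 4))
err = np.mean((student(X_test) - teacher(X_test)) ** 2)
```

The trade-off Mees describes lives in that residual `err`: a real distilled policy accepts some fidelity loss in exchange for a model small and fast enough for real-time control.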

More important than the computing resource problem, though, is that the team failed to see any positive transfer in their experiments: CrossFormer simply matched the performance of previous approaches rather than exceeding it. Walke thinks progress in computer vision and natural language processing suggests that training on more data could be the key.

Others say it might not be that simple. Jeannette Bohg, a professor of robotics at Stanford University, says the ability to train on such a diverse dataset is a significant contribution. But she wonders whether part of the reason why the researchers didn’t see positive transfer is their insistence on not aligning the input data. Previous research that trained on robots with similar observation and action data has shown evidence of such cross-overs. “By getting rid of this alignment, they may have also gotten rid of this significant positive transfer that we’ve seen in other work,” Bohg says.

It’s also not clear if the approach will boost performance on tasks specific to particular embodiments or robotic applications, says Ram Ramamoorthy, a robotics professor at the University of Edinburgh. The work is a promising step towards helping robots capture concepts common to most robots, like “avoid this obstacle,” he says. But it may be less useful for tackling control problems specific to a particular robot, such as how to knead dough or navigate a forest, which are often the hardest to solve.



The software used to control a robot is normally highly adapted to its specific physical set up. But now researchers have created a single general-purpose robotic control policy that can operate robotic arms, wheeled robots, quadrupeds, and even drones.

One of the biggest challenges when it comes to applying machine learning to robotics is the paucity of data. While computer vision and natural language processing can piggyback off the vast quantities of image and text data found on the Internet, collecting robot data is costly and time-consuming.

To get around this, there have been growing efforts to pool data collected by different groups on different kinds of robots, including the Open X-Embodiment and DROID datasets. The hope is that training on diverse robotics data will lead to “positive transfer,” which refers to when skills learned from training on one task help to boost performance on another.

The problem is that robots often have very different embodiments—a term used to describe their physical layout and suite of sensors and actuators—so the data they collect can vary significantly. For instance, a robotic arm might be static, have a complex arrangement of joints and fingers, and collect video from a camera on its wrist. In contrast, a quadruped robot is regularly on the move and relies on force feedback from its legs to maneuver. The kinds of tasks and actions these machines are trained to carry out are also diverse: The arm may pick and place objects, while the quadruped needs keen navigation.

That makes training a single AI model on these large collections of data challenging, says Homer Walke, a Ph.D. student at the University of California, Berkeley. So far, most attempts have either focused on data from a narrower selection of similar robots or researchers have manually tweaked data to make observations from different robots more similar. But in research to be presented at the Conference on Robot Learning (CoRL) in Munich in November, they unveiled a new model called CrossFormer that can train on data from a diverse set of robots and control them just as well as specialized control policies.

“We want to be able to train on all of this data to get the most capable robot,” says Walke. “The main advance in this paper is working out what kind of architecture works the best for accommodating all these varying inputs and outputs.”

How to control diverse robots with the same AI model

The team used the same model architecture that powers large language models, known as a transformer. In many ways, the challenge the researchers were trying to solve is not dissimilar to the one facing a chatbot, says Walke. In language modeling, the AI has to pick out similar patterns in sentences with different lengths and word orders. Robot data can also be arranged in a sequence much like a written sentence, but depending on the particular embodiment, observations and actions vary in length and order too.

“Words might appear in different locations in a sentence, but they still mean the same thing,” says Walke. “In our task, an observation image might appear in different locations in the sequence, but it’s still fundamentally an image and we still want to treat it like an image.”


Most machine learning approaches work through a sequence one element at a time, but transformers can process the entire stream of data at once. This allows them to analyze the relationship between different elements and makes them better at handling sequences that are not standardized, much like the diverse data found in large robotics datasets.
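The key idea above—that an image token should be treated as an image wherever it lands in the sequence—can be sketched in a few lines of Python. This is a hypothetical illustration of modality-tagged sequences, not the CrossFormer code; the structures and names are assumptions.

```python
# Illustrative sketch: flatten observations from different embodiments into
# one modality-tagged sequence a transformer could consume. All names and
# dimensions here are hypothetical stand-ins.

def to_sequence(observations):
    """Turn a robot's observations into (modality, vector) tokens.

    Tagging each token with its modality lets a shared model treat an image
    as an image regardless of where it appears in the sequence.
    """
    sequence = []
    for obs in observations:
        if obs["type"] == "image":
            sequence.append(("image", obs["data"]))    # camera frame embedding
        elif obs["type"] == "joints":
            sequence.append(("proprio", obs["data"]))  # joint-position reading
    return sequence

# A wrist-camera arm and a quadruped yield tokens in different orders and
# shapes, but each token keeps its modality tag.
arm_obs = [
    {"type": "image", "data": [0.1] * 4},   # wrist camera frame (stub vector)
    {"type": "joints", "data": [0.0] * 7},  # 7 arm joint angles
]
quadruped_obs = [
    {"type": "joints", "data": [0.0] * 12}, # 12 leg joint angles
    {"type": "image", "data": [0.2] * 4},   # head camera frame (stub vector)
]

arm_seq = to_sequence(arm_obs)
quad_seq = to_sequence(quadruped_obs)
print(arm_seq[0][0], quad_seq[0][0])
```

The two sequences differ in length of vectors and token order, yet a transformer processing them jointly can still learn that every "image" token deserves the same treatment.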

Walke and his colleagues aren’t the first to train transformers on large-scale robotics data. But previous approaches have either trained solely on data from robotic arms with broadly similar embodiments or manually converted input data to a common format to make it easier to process. In contrast, CrossFormer can process images from cameras positioned above a robot, at head height, or on a robotic arm’s wrist, as well as joint position data from both quadrupeds and robotic arms, without any tweaks.

The result is a single control policy that can operate single robotic arms, pairs of robotic arms, quadrupeds, and wheeled robots on tasks as varied as picking and placing objects, cutting sushi, and obstacle avoidance. Crucially, it matched the performance of specialized models tailored for each robot and outperformed previous approaches trained on diverse robotic data. The team even tested whether the model could control an embodiment not included in the dataset—a small quadcopter. While they simplified things by making the drone fly at a fixed altitude, CrossFormer still outperformed the previous best method.

“That was definitely pretty cool,” says Ria Doshi, an undergraduate student at Berkeley. “I think that as we scale up our policy to be able to train on even larger sets of diverse data, it’ll become easier to see this kind of zero shot transfer onto robots that have been completely unseen in the training.”

The limitations of one AI model for all robots

The team admits there’s still work to do, however. The model is too big for any of the robots’ embedded chips and instead has to be run from a server. Even then, processing times are only just fast enough to support real-time operation, and Walke admits that could break down if they scale up the model. “When you pack so much data into a model it has to be very big and that means running it for real-time control becomes difficult.”

One potential workaround would be to use an approach called distillation, says Oier Mees, a postdoctoral researcher at Berkeley and part of the CrossFormer team. This essentially involves training a smaller model to mimic the larger one; if successful, it can deliver similar performance at a much smaller computational budget.
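As a toy illustration of distillation (my own sketch, not the team’s method), a tiny “student” can be fit by gradient descent to reproduce a “teacher” policy’s outputs. The teacher function, scales, and learning rate here are all hypothetical.

```python
# Distillation sketch: fit a small student to imitate a larger teacher.

def teacher(obs):
    # Stand-in for an expensive, high-capacity policy mapping
    # an observation to an action.
    return 2.0 * obs + 0.5

# Collect (observation, teacher action) pairs, as distillation would.
data = [(k / 100.0, teacher(k / 100.0)) for k in range(100)]

# Student: y = w * x + b, trained with gradient descent on mean squared error
# against the teacher's outputs rather than any ground-truth labels.
w, b = 0.0, 0.0
lr = 0.1
for _ in range(2000):
    grad_w = grad_b = 0.0
    for x, y in data:
        err = (w * x + b) - y
        grad_w += 2 * err * x / len(data)
        grad_b += 2 * err / len(data)
    w -= lr * grad_w
    b -= lr * grad_b

# The two-parameter student ends up reproducing the teacher's behavior.
print(round(w, 2), round(b, 2))
```

In practice the student is a smaller neural network and the teacher a model like CrossFormer, but the training signal is the same: match the big model’s outputs on a shared set of inputs.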

More important than the computing-resource problem, though, is that the team failed to see any positive transfer in their experiments: CrossFormer simply matched previous performance rather than exceeding it. Walke thinks progress in computer vision and natural language processing suggests that training on more data could be the key.

Others say it might not be that simple. Jeannette Bohg, a professor of robotics at Stanford University, says the ability to train on such a diverse dataset is a significant contribution. But she wonders whether part of the reason why the researchers didn’t see positive transfer is their insistence on not aligning the input data. Previous research that trained on robots with similar observation and action data has shown evidence of such cross-overs. “By getting rid of this alignment, they may have also gotten rid of this significant positive transfer that we’ve seen in other work,” Bohg says.

It’s also not clear if the approach will boost performance on tasks specific to particular embodiments or robotic applications, says Ram Ramamoorthy, a robotics professor at Edinburgh University. The work is a promising step towards helping robots capture concepts common to most robots, like “avoid this obstacle,” he says. But it may be less useful for tackling control problems specific to a particular robot, such as how to knead dough or navigate a forest, which are often the hardest to solve.



This is a sponsored article brought to you by Khalifa University of Science and Technology.

Abu Dhabi-based Khalifa University of Science and Technology in the United Arab Emirates (UAE) will be hosting the 36th edition of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024) to highlight the Middle East and North Africa (MENA) region’s rapidly advancing capabilities in robotics and intelligent transport systems.


Themed “Robotics for Sustainable Development,” IROS 2024 will be held from 14–18 October 2024 at the Abu Dhabi National Exhibition Center (ADNEC) in the UAE’s capital city. It will offer a platform for universities and research institutions to showcase their robotics research and innovation, gathering researchers, academics, industry leaders, and professionals from around the globe.

A total of 13 forums, nine global-level competitions and challenges covering various aspects of robotics and AI, an IROS Expo, as well as an exclusive Career Fair will also be part of IROS 2024. The challenges and competitions will focus on physical or athletic intelligence of robots, remote robot navigation, robot manipulation, underwater robotics, as well as perception and sensing.

Delegates for the event will represent sectors including manufacturing, healthcare, logistics, agriculture, defense, security, and mining, with 60 percent of the talent pool having over six years of experience in robotics. Major components of the conference will be the poster sessions, keynotes, panel discussions by researchers and scientists, and networking events.

Khalifa University will be hosting IROS 2024 to highlight the Middle East and North Africa (MENA) region’s rapidly advancing capabilities in robotics and intelligent transport systems. Khalifa University

Abu Dhabi ranks first out of 329 global cities on the online database Numbeo’s 2024 list of the world’s safest cities, a title it has held for eight consecutive years since 2017, reflecting the emirate’s ongoing efforts to ensure a good quality of life for citizens and residents.

With a multicultural community, Abu Dhabi is home to people from more than 200 nationalities and draws a large number of tourists to some of the top art galleries in the city such as Louvre Abu Dhabi and the Guggenheim Abu Dhabi, as well as other destinations such as Ferrari World Abu Dhabi and Warner Bros. World Abu Dhabi.

The UAE and Abu Dhabi have increasingly become a center for creative skillsets, human capital and advanced technologies, attracting several international and regional events such as the global COP28 UAE climate summit, in which more than 160 countries participated.

Abu Dhabi city itself has hosted a number of association conventions such as the 34th International Nursing Research Congress and is set to host the UNCTAD World Investment Forum, the 13th World Trade Organization (WTO) Ministerial Conference (MC13), the 12th World Environment Education Congress in 2024, and the IUCN World Conservation Congress in 2025.

Khalifa University’s Center for Robotics and Autonomous Systems (KU-CARS) includes a vibrant multidisciplinary environment for conducting robotics and autonomous vehicle-related research and innovation. Khalifa University

Dr. Jorge Dias, IROS 2024 General Chair, said: “Khalifa University is delighted to bring the Intelligent Robots and Systems 2024 to Abu Dhabi in the UAE and highlight the innovations in line with the theme Robotics for Sustainable Development. As the region’s rapidly advancing capabilities in robotics and intelligent transport systems gain momentum, this event serves as a platform to incubate ideas, exchange knowledge, foster collaboration, and showcase our research and innovation activities. By hosting IROS 2024, Khalifa University aims to reaffirm the UAE’s status as a global innovation hub and destination for all industry stakeholders to collaborate on cutting-edge research and explore opportunities for growth within the UAE’s innovation ecosystem.”

“This event serves as a platform to incubate ideas, exchange knowledge, foster collaboration, and showcase our research and innovation activities” —Dr. Jorge Dias, IROS 2024 General Chair

Dr. Dias added: “The organizing committee of IROS 2024 has received over 4000 submissions representing 60 countries, with China leading with 1,029 papers, followed by the U.S. (777), Germany (302), and Japan (253), as well as the U.K. and South Korea (173 each). The UAE with a total of 68 papers comes atop the Arab region.”

Driving innovation at Khalifa University is the Center for Robotics and Autonomous Systems (KU-CARS) with around 50 researchers and state-of-the-art laboratory facilities, including a vibrant multidisciplinary environment for conducting robotics and autonomous vehicle-related research and innovation.

IROS 2024 is sponsored by IEEE Robotics and Automation Society, Abu Dhabi Convention and Exhibition Bureau, the Robotics Society of Japan (RSJ), the Society of Instrument and Control Engineers (SICE), the New Technology Foundation, and the IEEE Industrial Electronics Society (IES).

More information at https://iros2024-abudhabi.org/






Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

Researchers at the Max Planck Institute for Intelligent Systems and ETH Zurich have developed a robotic leg with artificial muscles. Inspired by living creatures, it jumps across different terrains in an agile and energy-efficient manner.

[ Nature ] via [ MPI ]

Thanks, Toshi!

ETH Zurich researchers have now developed a fast robotic printing process for earth-based materials that does not require cement. In what is known as “impact printing,” a robot shoots material from above, gradually building a wall. On impact, the parts bond together, and very minimal additives are required.

[ ETH Zurich ]

How could you not be excited to see this happen for real?

[ arXiv paper ]

Can we all agree that sanding, grinding, deburring, and polishing tasks are really best done by robots, for the most part?

[ Cohesive Robotics ]

Thanks, David!

Using doors is a longstanding challenge in robotics and is of significant practical interest in giving robots greater access to human-centric spaces. The task is challenging due to the need for online adaptation to varying door properties and precise control in manipulating the door panel and navigating through the confined doorway. To address this, we propose a learning-based controller for a legged manipulator to open and traverse through doors.

[ arXiv paper ]

Isaac is the first robot assistant that’s built for the home. And we’re shipping it in fall of 2025.

Fall of 2025 is a long enough time from now that I’m not even going to speculate about it.

[ Weave Robotics ]

By patterning liquid metal paste onto a soft sheet of silicone or acrylic foam tape, we developed stretchable versions of conventional rigid circuits (like Arduinos). Our soft circuits can be stretched to over 300% strain (over 4x their length) and are integrated into active soft robots.

[ Science Robotics ] via [ Yale ]

NASA’s Curiosity rover is exploring a scientifically exciting area on Mars, but communicating with the mission team on Earth has recently been a challenge due to both the current season and the surrounding terrain. In this Mars Report, Curiosity engineer Reidar Larsen takes you inside the uplink room where the team talks to the rover.

[ NASA ]

I love this and want to burn it with fire.

[ Carpentopod ]

Very often, people ask us what Reachy 2 is capable of, which is why we’re showing you the manipulation possibilities (through teleoperation) of our technology. The robot shown in this video is the Beta version of Reachy 2, our new robot coming very soon!

[ Pollen Robotics ]

The Scalable Autonomous Robots (ScalAR) Lab is an interdisciplinary lab focused on fundamental research problems in robotics that lie at the intersection of robotics, nonlinear dynamical systems theory, and uncertainty.

[ ScalAR Lab ]

Astorino is a 6-axis educational robot created for practical and affordable teaching of robotics in schools and beyond. It has been created with 3D printing, so it allows for experimentation and the possible addition of parts. With its design and programming, it replicates the actions of #KawasakiRobotics industrial robots, giving students the necessary skills for future work.

[ Astorino ]

I guess fish-fillet-shaping robots need to exist because otherwise customers will freak out if all their fish fillets are not identical, or something?

[ Flexiv ]

Watch the second episode of the ExoMars Rosalind Franklin rover mission—Europe’s ambitious exploration journey to search for past and present signs of life on Mars. The rover will dig, collect, and investigate the chemical composition of material collected by a drill. Rosalind Franklin will be the first rover to reach a depth of up to two meters below the surface, acquiring samples that have been protected from surface radiation and extreme temperatures.

[ ESA ]






Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

The National Science Foundation Human AugmentatioN via Dexterity Engineering Research Center (HAND ERC) was announced in August 2024. Funded for up to 10 years and $52 million, the HAND ERC is led by Northwestern University, with core members Texas A&M, Florida A&M, Carnegie Mellon, and MIT, and support from Wisconsin-Madison, Syracuse, and an innovation ecosystem consisting of companies, national labs, and civic and advocacy organizations. HAND will develop versatile, easy-to-use dexterous robot end-effectors (hands).

[ HAND ]

The Environmental Robotics Lab at ETH Zurich, in partnership with Wilderness International (and some help from DJI and Audi), is using drones to sample DNA from the tops of trees in the Peruvian rainforest. Somehow, the treetops are where 60 to 90 percent of biodiversity is found, and these drones can help researchers determine what the heck is going on up there.

[ ERL ]

Thanks, Steffen!

1X introduces NEO Beta, “the pre-production build of our home humanoid.”

“Our priority is safety,” said Bernt Børnich, CEO at 1X. “Safety is the cornerstone that allows us to confidently introduce NEO Beta into homes, where it will gather essential feedback and demonstrate its capabilities in real-world settings. This year, we are deploying a limited number of NEO units in selected homes for research and development purposes. Doing so means we are taking another step toward achieving our mission.”

[ 1X ]

We love MangDang’s fun and affordable approach to robotics with Mini Pupper. The next generation of the little legged robot has just launched on Kickstarter, featuring new and updated robots that make it easy to explore embodied AI.

The Kickstarter is already fully funded after just a day or two, but there are still plenty of robots up for grabs.

[ Kickstarter ]

Quadrupeds in space can use their legs to reorient themselves. Or, if you throw one off a roof, it can learn to land on its feet.

To be presented at CoRL 2024.

[ ARL ]

HEBI Robotics, which apparently was once headquartered inside a Pittsburgh public bus, has imbued a table with actuators and a mind of its own.

[ HEBI Robotics ]

Carcinization is a concept in evolutionary biology where a crustacean that isn’t a crab eventually becomes a crab. So why not do the same thing with robots? Crab robots solve all problems!

[ KAIST ]

Waymo is smart, but also humans are really, really dumb sometimes.

[ Waymo ]

The Robotics Department of the University of Michigan created an interactive community art project. The group that led the creation believed that while roboticists typically take on critical and impactful problems in transportation, medicine, mobility, logistics, and manufacturing, there are many opportunities to find play and amusement. The final piece is a grid of art boxes, produced by different members of our robotics community, which offer an eight-inch square view into their own work with robotics.

[ Michigan Robotics ]

I appreciate that UBTECH’s humanoid is doing an actual job, but why would you use a humanoid for this?

[ UBTECH ]

I’m sure most actuators go through some form of lifecycle testing. But if you really want to test an electric motor, put it into a BattleBot and see what happens.

[ Hardcore Robotics ]

Yes, but have you tried fighting a BattleBot?

[ AgileX ]

In this video, we present collaborative aerial grasping and transportation using multiple quadrotors with cable-suspended payloads. Grasping using a suspended gripper requires accurate tracking of the electromagnet to ensure a successful grasp while switching between different slack and taut modes. In this work, we grasp the payload using a hybrid control approach that switches between quadrotor position control and payload position control based on cable slackness. Finally, we use two quadrotors with suspended electromagnet systems to collaboratively grasp and pick up a larger payload for transportation.

[ Hybrid Robotics ]

I had not realized that the floretizing of broccoli was so violent.

[ Oxipital ]

While the RoboCup was held over a month ago, we still wanted to make a small summary of our results, the most memorable moments, and of course a homage to everyone who is involved with the B-Human team. The team members, the sponsors, and the fans at home. Thank you so much for making B-Human the team it is!

[ B-Human ]



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDSIROS 2024: 14–18 October 2024, ABU DHABI, UAEICSR 2024: 23–26 October 2024, ODENSE, DENMARKCybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

The National Science Foundation Human AugmentatioN via Dexterity Engineering Research Center (HAND ERC) was announced in August 2024. Funded for up to 10 years and $52 million, the HAND ERC is led by Northwestern University, with core members Texas A&M, Florida A&M, Carnegie Mellon, and MIT, and support from Wisconsin-Madison, Syracuse, and an innovation ecosystem consisting of companies, national labs, and civic and advocacy organizations. HAND will develop versatile, easy-to-use dexterous robot end-effectors (hands).

[ HAND ]

The Environmental Robotics Lab at ETH Zurich, in partnership with Wilderness International (and some help from DJI and Audi), is using drones to sample DNA from the tops of trees in the Peruvian rainforest. Somehow, the treetops are where 60 to 90 percent of biodiversity is found, and these drones can help researchers determine what the heck is going on up there.

[ ERL ]

Thanks, Steffen!

1X introduces NEO Beta, “the pre-production build of our home humanoid.”

“Our priority is safety,” said Bernt Børnich, CEO at 1X. “Safety is the cornerstone that allows us to confidently introduce NEO Beta into homes, where it will gather essential feedback and demonstrate its capabilities in real-world settings. This year, we are deploying a limited number of NEO units in selected homes for research and development purposes. Doing so means we are taking another step toward achieving our mission.”

[ 1X ]

We love MangDang’s fun and affordable approach to robotics with Mini Pupper. The next generation of the little legged robot has just launched on Kickstarter, featuring new and updated robots that make it easy to explore embodied AI.

The Kickstarter is already fully funded after just a day or two, but there are still plenty of robots up for grabs.

[ Kickstarter ]

Quadrupeds in space can use their legs to reorient themselves. Or, if you throw one off a roof, it can learn to land on its feet.

To be presented at CoRL 2024.

[ ARL ]

HEBI Robotics, which apparently was once headquartered inside a Pittsburgh public bus, has imbued a table with actuators and a mind of its own.

[ HEBI Robotics ]

Carcinization is a concept in evolutionary biology where a crustacean that isn’t a crab eventually becomes a crab. So why not do the same thing with robots? Crab robots solve all problems!

[ KAIST ]

Waymo is smart, but also humans are really, really dumb sometimes.

[ Waymo ]

The Robotics Department of the University of Michigan created an interactive community art project. The group that led the creation believed that while roboticists typically take on critical and impactful problems in transportation, medicine, mobility, logistics, and manufacturing, there are many opportunities to find play and amusement. The final piece is a grid of art boxes, produced by different members of our robotics community, which offer an eight-inch square view into their own work with robotics.

[ Michigan Robotics ]

I appreciate that UBTECH’s humanoid is doing an actual job, but why would you use a humanoid for this?

[ UBTECH ]

I’m sure most actuators go through some form of lifecycle testing. But if you really want to test an electric motor, put it into a BattleBot and see what happens.

[ Hardcore Robotics ]

Yes, but have you tried fighting a BattleBot?

[ AgileX ]

In this video, we present collaborative aerial grasping and transportation using multiple quadrotors with cable-suspended payloads. Grasping with a suspended gripper requires accurate tracking of the electromagnet to ensure a successful grasp while switching between slack and taut cable modes. In this work, we grasp the payload using a hybrid control approach that switches between quadrotor position control and payload position control based on cable slackness. Finally, we use two quadrotors with suspended electromagnet systems to collaboratively grasp and pick up a larger payload for transportation.

[ Hybrid Robotics ]
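The slack/taut mode switching described above can be sketched as a simple supervisory controller. This is an illustrative reconstruction, not the authors’ code: the cable length, the hysteresis margin, and the idea of hovering directly above the payload while the cable is slack are all assumptions.

```python
# Hypothetical sketch of hybrid switching between quadrotor position control
# (cable slack) and payload position control (cable taut). Parameters below
# are illustrative, not from the paper.

CABLE_LENGTH = 1.0   # meters (assumed)
SLACK_MARGIN = 0.02  # hysteresis band to avoid chattering at the switch

def cable_taut(quad_pos, payload_pos, currently_taut):
    """Classify cable state from quad/payload separation, with hysteresis."""
    dist = sum((q - p) ** 2 for q, p in zip(quad_pos, payload_pos)) ** 0.5
    if currently_taut:
        # stay in taut mode until the cable is clearly slack again
        return dist > CABLE_LENGTH - SLACK_MARGIN
    return dist >= CABLE_LENGTH

def select_control_target(quad_pos, payload_pos, payload_goal, currently_taut):
    """Return (mode, setpoint): track the payload when the cable is taut,
    otherwise position the quadrotor directly above the payload."""
    if cable_taut(quad_pos, payload_pos, currently_taut):
        return "payload_position", payload_goal
    hover_above = (payload_pos[0], payload_pos[1], payload_pos[2] + CABLE_LENGTH)
    return "quadrotor_position", hover_above
```

The hysteresis band keeps the controller from rapidly toggling modes when the separation hovers right at the cable length.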

I had not realized that the floretizing of broccoli was so violent.

[ Oxipital ]

While RoboCup was held over a month ago, we still wanted to make a small summary of our results, the most memorable moments, and of course an homage to everyone involved with the B-Human team: the team members, the sponsors, and the fans at home. Thank you so much for making B-Human the team it is!

[ B-Human ]


At ICRA 2024, Spectrum editor Evan Ackerman sat down with Unitree founder and CEO Xingxing Wang and Tony Yang, VP of Business Development, to talk about the company’s newest humanoid, the G1 model.

Smaller, more flexible, and elegant, the G1 robot is designed for general use in service and industry, and is one of the cheapest—if not the cheapest—humanoid around.




Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
IROS 2024: 14–18 October 2024, ABU DHABI, UAE
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

Imbuing robots with “human-level performance” in anything is an enormous challenge, but it’s worth it when you see a robot with the skill to interact with a human on a (nearly) human level. Google DeepMind has managed to achieve amateur human-level competence at table tennis, which is much harder than it looks, even for humans. Pannag Sanketi, a tech-lead manager in the robotics team at DeepMind, shared some interesting insights about performing the research. But first, video!

Some behind the scenes detail from Pannag:

  • The robot had not seen any participants before. So we knew we had a cool agent, but we had no idea how it was going to fare in a full match with real humans. To witness it outmaneuver even some of the most advanced players was such a delightful moment for the team!
  • All the participants had a lot of fun playing against the robot, irrespective of who won the match. And all of them wanted to play more. Some of them said it would be great to have the robot as a playing partner. From the videos, you can even see how much fun the user study hosts sitting there (who are not authors on the paper) are having watching the games!
  • Barney, who is a professional coach, was an advisor on the project and our chief evaluator, assessing the robot’s skills the way he evaluates his students. He was also surprised by how the robot was always able to learn from the last few weeks’ sessions.
  • We invested a lot in remote and automated 24x7 operations. So not the setup in this video, but there are other cells that we can run 24x7 with a ball thrower.
  • We even tried robot-vs-robot, i.e., two robots playing against each other! :) The line between collaboration and competition becomes very interesting when they try to learn by playing with each other.

[ DeepMind ]

Thanks, Heni!

Yoink.

[ MIT ]

Considering how their stability and recovery are often tested, teaching robot dogs to be shy of humans is an excellent idea.

[ Deep Robotics ]

Yes, quadruped robots need tow truck hooks.

[ Paper ]

Earthworm-inspired robots require novel actuators, and Ayato Kanada at Kyushu University has come up with a neat one.

[ Paper ]

Thanks, Ayato!

Meet the AstroAnt! This miniaturized swarm robot can ride atop a lunar rover and collect data related to its health, including surface temperatures and damage from micrometeoroid impacts. In the summer of 2024, with support from our collaborator Castrol, the Media Lab’s Space Exploration Initiative tested AstroAnt in the Canary Islands, where the volcanic landscape resembles the lunar surface.

[ MIT ]

Kengoro has a new forearm that mimics the human radioulnar joint, giving it an even more natural badminton swing.

[ JSK Lab ]

Thanks, Kento!

Gromit’s concern that Wallace is becoming too dependent on his inventions proves justified, when Wallace invents a “smart” gnome that seems to develop a mind of its own. When it emerges that a vengeful figure from the past might be masterminding things, it falls to Gromit to battle sinister forces and save his master… or Wallace may never be able to invent again!

[ Wallace and Gromit ]

ASTORINO is a modern 6-axis robot based on 3D printing technology. Programmable in AS-language, it facilitates the preparation of classes with ready-made teaching materials, is easy both to use and to repair, and gives the opportunity to learn and make mistakes without fear of breaking it.

[ Kawasaki ]

Engineers at NASA’s Jet Propulsion Laboratory are testing a prototype of IceNode, a robot designed to access one of the most difficult-to-reach places on Earth. The team envisions a fleet of these autonomous robots deploying into unmapped underwater cavities beneath Antarctic ice shelves. There, they’d measure how fast the ice is melting — data that’s crucial to helping scientists accurately project how much global sea levels will rise.

[ IceNode ]

Los Alamos National Laboratory, in a consortium with four other National Laboratories, is leading the charge in finding the best practices to find orphaned wells. These abandoned wells can leak methane gas into the atmosphere and possibly leak liquid into the ground water.

[ LANL ]

Looks like Fourier has been working on something new, although this is still at the point of “looks like” rather than something real.

[ Fourier ]

Bio-Inspired Robot Hands: Altus Dexterity is a collaboration between researchers and professionals from Carnegie Mellon University, UPMC, the University of Illinois and the University of Houston.

[ Altus Dexterity ]

PiPER is a lightweight robotic arm with six integrated joint motors for smooth, precise control. Weighing just 4.2 kg, it easily handles a 1.5 kg payload and is made from durable yet lightweight materials for versatile use across various environments. Available for just $2,499.

[ AgileX ]

At 104 years old, Lilabel has seen over a century of automotive transformation, from sharing a single car with her family in the 1920s to experiencing her first ride in a robotaxi.

[ Zoox ]

Traditionally, blind juggling robots use plates that are slightly concave to help them with ball control, but it’s also possible to make a blind juggler the hard way. Which, honestly, is much more impressive.

[ Jugglebot ]



In the 1960s and 1970s, NASA spent a lot of time thinking about whether toroidal (donut-shaped) fuel tanks were the way to go with its spacecraft. Toroidal tanks have a bunch of potential advantages over conventional spherical fuel tanks. For example, you can fit nearly 40% more volume within a toroidal tank than if you were using multiple spherical tanks within the same space. And perhaps most interestingly, you can shove stuff (like the back of an engine) through the middle of a toroidal tank, which could lead to some substantial efficiency gains if the tanks could also handle structural loads.
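As a rough sanity check on that volume claim, you can compare the volume of a torus against a ring of same-radius spheres packed around the same centerline. This is a toy comparison under an assumed packing, not NASA’s analysis; the exact percentage depends on how the available envelope is defined, and this idealization comes out somewhat higher than the cited 40 percent.

```python
import math

# Toy comparison (illustrative assumptions, not NASA's analysis): a torus with
# tube radius r and centerline radius R, versus the largest ring of
# non-overlapping spheres of radius r centered on the same circle.

def torus_volume(R, r):
    return 2 * math.pi**2 * R * r**2          # V = 2*pi^2 * R * r^2

def sphere_ring_volume(R, r):
    # maximum number of mutually tangent spheres fitting around the ring
    n = math.floor(math.pi / math.asin(r / R))
    return n * (4 / 3) * math.pi * r**3

R, r = 2.0, 0.5
gain = torus_volume(R, r) / sphere_ring_volume(R, r) - 1
print(f"torus holds {gain:.0%} more propellant than the ring of spheres")
```

Under this packing the torus comes out roughly 50 percent ahead regardless of R and r; the more conservative 40 percent figure presumably reflects real tank-spacing and plumbing constraints.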

Because of their relatively complex shape, toroidal tanks are much more difficult to make than spherical tanks. Even though these tanks can perform better, NASA simply doesn’t have the expertise to manufacture them anymore, since each one has to be hand-built by highly skilled humans. But a company called Machina Labs thinks that they can do this with robots instead. And their vision is to completely change how we make things out of metal.

The fundamental problem that Machina Labs is trying to solve is that building metal parts at scale is slow and inflexible. Large metal parts need their own custom dies, which are very expensive one-offs that are about as inflexible as manufacturing gets, and then entire factories are built around these parts. It’s a huge investment, which means that it doesn’t matter if you find some new geometry or technique or material or market: you have to justify that enormous up-front cost by making as much of the original thing as you possibly can, stifling the potential for rapid and flexible innovation.

On the other end of the spectrum you have the also very slow and expensive process of making metal parts one at a time by hand. A few hundred years ago, this was the only way of making metal parts: skilled metalworkers using hand tools for months to make things like armor and weapons. The nice thing about an expert metalworker is that they can use their skills and experience to make anything at all, which is where Machina Labs’ vision comes from, explains CEO Edward Mehr who co-founded Machina Labs after spending time at SpaceX followed by leading the 3D printing team at Relativity Space.

“Craftsmen can pick up different tools and apply them creatively to metal to do all kinds of different things. One day they can pick up a hammer and form a shield out of a sheet of metal,” says Mehr. “Next, they pick up the same hammer, and create a sword out of a metal rod. They’re very flexible.”

The technique that a human metalworker uses to shape metal is called forging, which preserves the grain flow of the metal as it’s worked. Cast, stamped, or milled parts (made with processes that automate metal part production) are simply not as strong or as durable as forged parts, which can be an important differentiator for (say) things that have to go into space. But more on that in a bit.

The problem with human metalworkers is that the throughput is bad—humans are slow, and highly skilled humans in particular don’t scale well. For Mehr and Machina Labs, this is where the robots come in.

“We want to automate and scale using a platform called the ‘robotic craftsman.’ Our core enablers are robots that give us the kinematics of a human craftsman, and artificial intelligence that gives us control over the process,” Mehr says. “The concept is that we can do any process that a human craftsman can do, and actually some that humans can’t do because we can apply more force with better accuracy.”

This flexibility that robot metalworkers offer also enables the crafting of bespoke parts that would be impractical to make in any other way. These include toroidal (donut-shaped) fuel tanks that NASA has had its eye on for the last half century or so.

Machina Labs CEO Edward Mehr (right) stands behind a 15-foot toroidal fuel tank. Machina Labs

“The main challenge of these tanks is that the geometry is complex,” Mehr says. “Sixty years ago, NASA was bump-forming them with very skilled craftspeople, but a lot of them aren’t around anymore.” Mehr explains that the only other way to get that geometry is with dies, but for NASA, getting a die made for a fuel tank that’s necessarily been customized for one single spacecraft would be pretty much impossible to justify. “So one of the main reasons we’re not using toroidal tanks is because it’s just hard to make them.”

Machina Labs is now making toroidal tanks for NASA. For the moment, the robots are just doing the shaping, which is the tough part. Humans then weld the pieces together. But there’s no reason why the robots couldn’t do the entire process end-to-end and even more efficiently. Currently, they’re doing it the “human” way based on existing plans from NASA. “In the future,” Mehr tells us, “we can actually form these tanks in one or two pieces. That’s the next area that we’re exploring with NASA—how can we do things differently now that we don’t need to design around human ergonomics?”

Machina Labs’ ‘robotic craftsmen’ work in pairs to shape sheet metal, with one robot on each side of the sheet. The robots align their tools slightly offset from each other with the metal between them such that as the robots move across the sheet, it bends between the tools. Machina Labs
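The paired-tool geometry can be sketched as follows. This is a hypothetical illustration of the offset-tool idea, not Machina Labs’ software; the sheet thickness, offset distance, and trailing-support arrangement are all assumptions.

```python
# Hypothetical sketch of the paired-tool setup described above: for each point
# on a forming path over the sheet (z = 0 plane), place one tool on top of the
# sheet and the supporting tool underneath, offset along the path so the metal
# bends between them. Dimensions are illustrative.

def paired_tool_poses(path_xy, sheet_thickness=0.002, lateral_offset=0.01):
    """Yield (top_xyz, bottom_xyz) tool-tip targets along a 2D path.
    The bottom tool trails the top tool by `lateral_offset` along the path."""
    poses = []
    for i, (x, y) in enumerate(path_xy):
        # direction of travel, used to offset the bottom tool along the path
        nx, ny = path_xy[min(i + 1, len(path_xy) - 1)]
        dx, dy = nx - x, ny - y
        norm = (dx * dx + dy * dy) ** 0.5 or 1.0  # avoid divide-by-zero at path end
        ox, oy = lateral_offset * dx / norm, lateral_offset * dy / norm
        top = (x, y, +sheet_thickness / 2)
        bottom = (x - ox, y - oy, -sheet_thickness / 2)
        poses.append((top, bottom))
    return poses
```

The key idea is that neither robot presses directly against the other; the offset between the two tool tips is what lets the sheet yield locally as the pair sweeps across it.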

The video above shows Machina’s robots working on a tank that’s 4.572 m (15 feet) in diameter, likely destined for the Moon. “The main application is for lunar landers,” says Mehr. “The toroidal tanks bring the center of gravity of the vehicle lower than what you would have with spherical or pill-shaped tanks.”

Training these robots to work metal like this is done primarily through physics-based simulations that Machina developed in house (existing software being too slow), followed by human-guided iterations based on the resulting real-world data. The way that metal moves under pressure can be simulated pretty well, and although there’s certainly still a sim-to-real gap (simulating how the robot’s tool adheres to the surface of the material is particularly tricky), the robots are collecting so much empirical data that Machina is making substantial progress towards full autonomy, and even finding ways to improve the process.

An example of the kind of complex metal parts that Machina’s robots are able to make. Machina Labs

Ultimately, Machina wants to use robots to produce all kinds of metal parts. On the commercial side, they’re exploring things like car body panels, offering the option to change how your car looks in geometry rather than just color. The requirement for a couple of beefy robots to make this work means that roboforming is unlikely to become as pervasive as 3D printing, but the broader concept is the same: making physical objects a software problem rather than a hardware problem to enable customization at scale.


