Feed aggregator

Sensory feedback is essential for controlling soft robotic systems and for enabling their deployment in a variety of tasks. Proprioception refers to sensing the robot’s own state and is crucial for deploying soft robotic systems outside of laboratory environments, i.e., where no external sensing, such as a motion capture system, is available. A vision-based sensing approach for a soft robotic arm made from fabric is presented, leveraging the high-resolution sensory feedback provided by cameras. No mechanical interaction between the sensor and the soft structure is required, so the compliance of the soft system is preserved. The integration of a camera into an inflatable, fabric-based bellow actuator is discussed. Three actuators, each featuring an integrated camera, are used to control the spherical robotic arm and simultaneously provide sensory feedback on its two rotational degrees of freedom. A convolutional neural network architecture predicts the two angles describing the robot’s orientation from the camera images. Ground truth data is provided by a motion capture system during the training phase of the supervised learning approach and during its subsequent evaluation. The camera-based approach provides orientation estimates in real time with an accuracy of about one degree. The reliability of the sensing approach is demonstrated by using the sensory feedback to control the orientation of the robotic arm in closed loop.
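
To make the sensing pipeline concrete, here is a minimal sketch of the kind of CNN regressor the abstract describes, assuming PyTorch; the layer sizes, input resolution, loss, and training step are illustrative placeholders, not the authors’ architecture:

```python
# Hypothetical stand-in for the paper's CNN: maps a camera image to the two
# orientation angles, trained against motion-capture ground truth.
import torch
import torch.nn as nn

class OrientationNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 2)  # the two angles describing orientation

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = OrientationNet()
loss_fn = nn.MSELoss()  # regression against motion-capture angles
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on a dummy batch of 64x64 RGB frames.
images = torch.randn(8, 3, 64, 64)
angles_gt = torch.randn(8, 2)  # ground truth from the motion capture system
loss = loss_fn(model(images), angles_gt)
opt.zero_grad(); loss.backward(); opt.step()
```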

Researchers continue to devise creative ways to explore the extent to which people perceive robots as social agents, as opposed to objects. One such approach involves asking participants to inflict ‘harm’ on a robot. Researchers are interested in the length of time between the experimenter issuing the instruction and the participant complying, and propose that relatively long periods of hesitation might reflect empathy for the robot, and perhaps even attribution of human-like qualities, such as agency and sentience. In a recent experiment, we adapted the so-called ‘hesitance to hit’ paradigm, in which participants were instructed to hit a humanoid robot on the head with a mallet. After standing up to do so (signaling intent to hit the robot), participants were stopped, and then took part in a semi-structured interview to probe their thoughts and feelings during the period of hesitation. Thematic analysis of the responses indicates that hesitation not only reflects perceived socialness, but also other factors including (but not limited to) concerns about cost, mallet disbelief, processing of the task instruction, and the influence of authority. The open-ended, free responses participants provided also offer rich insights into individual differences with regard to anthropomorphism, perceived power imbalances, and feelings of connection toward the robot. In addition to aiding understanding of this measurement technique and related topics regarding socialness attribution to robots, we argue that greater use of open questions can lead to exciting new research questions and interdisciplinary collaborations in the domain of social robotics.

Many robot exploration algorithms used to explore office, home, or outdoor environments rely on the concept of frontier cells. Frontier cells define the border between known and unknown space. Frontier-based exploration is the process of repeatedly detecting frontiers and moving towards them until there are no more frontiers and therefore no more unknown regions. The faster frontier cells can be detected, the more efficient exploration becomes. This paper proposes several algorithms for detecting frontiers. The first, called Naïve Active Area (NaïveAA) frontier detection, achieves frontier detection in constant time by evaluating only the cells in the active area defined by each scan. The second, Expanding-Wavefront Frontier Detection (EWFD), uses frontiers from the previous timestep as a starting point for searching for frontiers in newly discovered space. The third, Frontier-Tracing Frontier Detection (FTFD), uses the frontiers from the previous timestep together with the endpoints of the scan to determine the frontiers at the current timestep. The proposed algorithms are compared to state-of-the-art approaches such as Naïve, WFD, and WFD-INC. NaïveAA is shown to operate in constant time and is therefore suitable as a basic benchmark for frontier detection algorithms. EWFD and FTFD are found to be significantly faster than the other algorithms.
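
For reference, the shared frontier-cell test is simple to state in code. The sketch below implements the naive full-grid scan that these algorithms improve upon by restricting the search region (to the active area for NaïveAA, or to previous frontiers and scan endpoints for EWFD and FTFD); the grid encoding is an assumption for illustration:

```python
# Naive frontier detection on an occupancy grid.
# Assumed encoding: 0 = free, 1 = occupied, -1 = unknown.
import numpy as np

FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def frontier_cells(grid):
    """Return the set of (row, col) free cells bordering unknown space."""
    frontiers = set()
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            # 4-connected neighbors: any unknown neighbor makes this a frontier.
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == UNKNOWN:
                    frontiers.add((r, c))
                    break
    return frontiers

grid = np.full((5, 5), UNKNOWN)
grid[1:4, 1:4] = FREE           # a scanned pocket of free space
print(frontier_cells(grid))     # the free cells on the border with unknown
```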

Flexible endoscopy involves the insertion of a long, narrow, flexible tube into the body for diagnostic and therapeutic procedures. In the gastrointestinal (GI) tract, flexible endoscopy plays a major role in cancer screening, surveillance, and treatment programs. As a result of gas insufflation during the procedure, both upper and lower GI endoscopy procedures have been classified as aerosol generating by the guidelines issued by the respective societies during the COVID-19 pandemic, although no quantifiable data on aerosol generation currently exists. Due to the risk of COVID-19 transmission to healthcare workers, most societies halted non-emergency and diagnostic procedures during the lockdown. The long-term implications of this stoppage in cancer diagnoses and treatment are predicted to include a large increase in preventable deaths. Robotics may play a major role in this field by allowing healthcare operators to control the flexible endoscope from a safe distance, paving a path toward protecting healthcare workers by minimizing the risk of virus transmission without reducing diagnostic and therapeutic capacities. This review focuses on the needs and challenges associated with the design of robotic flexible endoscopes for use during a pandemic. The authors propose that a few minor changes to existing platforms, or considerations for platforms in development, could lead to significant benefits for use during infection control scenarios.

Over the last half decade or so, the commercialization of autonomous robots that can operate outside of structured environments has dramatically increased. But this relatively new transition of robotic technologies from research projects to commercial products comes with its share of challenges, many of which relate to the rapidly increasing visibility that these robots have in society.

Whether it's because of their appearance of agency, or because of their history in popular culture, robots frequently inspire people’s imagination. Sometimes this is a good thing, like when it leads to innovative new use cases. And sometimes this is a bad thing, like when it leads to use cases that could be classified as irresponsible or unethical. Can the people selling robots do anything about the latter? And even if they can, should they?

Roboticists understand that robots, fundamentally, are tools. We build them, we program them, and even the autonomous ones are just following the instructions that we’ve coded into them. However, that same appearance of agency that makes robots so compelling means that it may not be clear to people without much experience with or exposure to real robots that a robot itself isn’t inherently good or bad—rather, as a tool, a robot is a reflection of its designers and users.

This can put robotics companies into a difficult position. When they sell a robot to someone, that person can, hypothetically, use the robot in any way they want. Of course, this is the case with every tool, but it’s the autonomous aspect that makes robots unique. I would argue that autonomy brings with it an implied association between a robot and its maker, or in this case, the company that develops and sells it. I’m not saying that this association is necessarily a reasonable one, but I think that it exists, even if that robot has been sold to someone else who has assumed full control over everything it does.

“All of our buyers, without exception, must agree that Spot will not be used to harm or intimidate people or animals, as a weapon or configured to hold a weapon” —Robert Playter, Boston Dynamics

Robotics companies are certainly aware of this, because many of them are very careful about who they sell their robots to, and very explicit about what they want their robots to be doing. But once a robot is out in the wild, as it were, how far should that responsibility extend? And realistically, how far can it extend? Should robotics companies be held accountable for what their robots do in the world, or should we accept that once a robot is sold to someone else, responsibility is transferred as well? And what can be done if a robot is being used in an irresponsible or unethical way that could have a negative impact on the robotics community?

For perspective on this, we contacted folks from three different robotics companies, each of which has experience selling distinctive mobile robots to commercial end users. We asked them the same five questions about the responsibility that robotics companies have regarding the robots that they sell, and here’s what they had to say:

Do you have any restrictions on what people can do with your robots? If so, what are they, and if not, why not?

Péter Fankhauser, CEO, ANYbotics:

We closely work together with our customers to make sure that our solution provides the right approach for their problem. Thereby, the target use case is clear from the beginning and we do not work with customers interested in using our robot ANYmal outside the intended target applications. Specifically, we strictly exclude any military or weaponized uses and since the foundation of ANYbotics it is close to our heart to make human work easier, safer, and more enjoyable.

Robert Playter, CEO, Boston Dynamics:

Yes, we have restrictions on what people can do with our robots, which are outlined in our Terms and Conditions of Sale. All of our buyers, without exception, must agree that Spot will not be used to harm or intimidate people or animals, as a weapon or configured to hold a weapon. Spot, just like any product, must be used in compliance with the law. 

Ryan Gariepy, CTO, Clearpath Robotics:

We do have strict restrictions and KYC processes which are based primarily on Canadian export control regulations. They depend on the type of equipment sold as well as where it is going. More generally, we also will not sell or support a robot if we know that it will create an uncontrolled safety hazard or if we have reason to believe that the buyer is unqualified to use the product. And, as always, we do not support using our products for the development of fully autonomous weapons systems.

More broadly, if you sell someone a robot, why should they be restricted in what they can do with it?

Péter Fankhauser, ANYbotics: We see the robot less as a simple object but more as an artificial workforce. This implies to us that the usage is closely coupled with the transfer of the robot and both the customer and the provider agree what the robot is expected to do. This approach is supported by what we hear from our customers with an increasing interest to pay for the robots as a service or per use.

Robert Playter, Boston Dynamics: We’re offering a product for sale. We’re going to do the best we can to stop bad actors from using our technology for harm, but we don’t have the control to regulate every use. That said, we believe that our business will be best served if our technology is used for peaceful purposes—to work alongside people as trusted assistants and remove them from harm’s way. We do not want to see our technology used to cause harm or promote violence. Our restrictions are similar to those of other manufacturers or technology companies that take steps to reduce or eliminate the violent or unlawful use of their products. 

Ryan Gariepy, Clearpath Robotics: Assuming the organization doing the restricting is a private organization and the robot and its software is sold vs. leased or “managed,” there aren't strong legal reasons to restrict use. That being said, the manufacturer likewise has no obligation to continue supporting that specific robot or customer going forward. However, given that we are only at the very edge of how robots will reshape a great deal of society, it is in the best interest for the manufacturer and user to be honest with each other about their respective goals. Right now, you're not only investing in the initial purchase and relationship, you're investing in the promise of how you can help each other succeed in the future.

“If a robot is being used in a way that is irresponsible due to safety: intervene! If it’s unethical: speak up!” —Péter Fankhauser, ANYbotics

What can you realistically do to make sure that people who buy your robots use them in the ways that you intend?

Péter Fankhauser, ANYbotics: We maintain a close collaboration with our customers to ensure their success with our solution. So for us, we have refrained from technical solutions to block unintended use.

Robert Playter, Boston Dynamics: We vet our customers to make sure that their desired applications are things that Spot can support, and are in alignment with our Terms and Conditions of Sale. We’ve turned away customers whose applications aren’t a good match with our technology. If customers misuse our technology, we’re clear in our Terms of Sale that their violations may void our warranty and prevent their robots from being updated, serviced, repaired, or replaced. We may also repossess robots that are not purchased, but leased. Finally, we will refuse future sales to customers that violate our Terms of Sale.

Ryan Gariepy, Clearpath Robotics: We typically work with our clients ahead of the purchase to make sure their expectations match reality, in particular on aspects like safety, supervisory requirements, and usability. It's far worse to sell a robot that'll sit on a shelf or, worse, cause harm, than to not sell a robot at all, so we prefer to reduce the risk of this situation in advance of receiving an order or shipping a robot.

How do you evaluate the merit of edge cases, for example if someone wants to use your robot in research or art that may push the boundaries of what you personally think is responsible or ethical?

Péter Fankhauser, ANYbotics: It’s about the dialog, understanding, and figuring out alternatives that work for all involved parties and the earlier you can have this dialog the better.

Robert Playter, Boston Dynamics: There’s a clear line between exploring robots in research and art, and using the robot for violent or illegal purposes. 

Ryan Gariepy, Clearpath Robotics: We have sold thousands of robots to hundreds of clients, and I do not recall the last situation that was not covered by a combination of export control and a general evaluation of the client's goals and expectations. I'm sure this will change as robots continue to drop in price and increase in flexibility and usability.

“You're not only investing in the initial purchase and relationship, you're investing in the promise of how you can help each other succeed in the future.” —Ryan Gariepy, Clearpath Robotics

What should roboticists do if we see a robot being used in a way that we feel is unethical or irresponsible?

Péter Fankhauser, ANYbotics: If it’s irresponsible due to safety: intervene! If it’s unethical: speak up!

Robert Playter, Boston Dynamics: We want robots to be beneficial for humanity, which includes the notion of not causing harm. As an industry, we think robots will achieve long-term commercial viability only if people see robots as helpful, beneficial tools without worrying if they’re going to cause harm.

Ryan Gariepy, Clearpath Robotics: On a one-off basis, they should speak to a combination of the user, the supplier or suppliers, the media, and, if safety is an immediate concern, regulatory or government agencies. If the situation in question risks becoming commonplace and is not being taken seriously, they should speak up more generally in appropriate forums—conferences, industry groups, standards bodies, and the like.

As more and more robots with different capabilities become commercially available, these issues are likely to come up more frequently. The three companies we talked to certainly don’t represent every viewpoint, and we did reach out to other companies who declined to comment. But I would think (I would hope?) that everyone in the robotics community can agree that robots should be used in a way that makes people’s lives better. What “better” means in the context of art and research and even robots in the military may not always be easy to define, and inevitably there’ll be disagreement as to what is ethical and responsible, and what isn’t.

We’ll keep on talking about it, though, and do our best to help the robotics community to continue growing and evolving in a positive way. Let us know what you think in the comments.

When a natural disaster strikes, first responders must move quickly to search for survivors. To support the search-and-rescue efforts, one group of innovators in Europe has succeeded in harnessing the power of drones, AI, and smartphones, all in one novel combination.

Their idea is to use a single drone as a moving cellular base station, which can do large sweeps over disaster areas and locate survivors using signals from their phones. AI helps the drone methodically survey the area and even estimate the trajectory of survivors who are moving.

The team built its platform, called Search-And-Rescue DrOne based solution (SARDO), using off-the-shelf hardware and tested it in field experiments and simulations. They describe the results in a study published 13 January in IEEE Transactions on Mobile Computing.

“We built SARDO to provide first responders with an all-in-one victims localization system capable of working in the aftermath of a disaster without existing network infrastructure support,” explains Antonio Albanese, a Research Associate at NEC Laboratories Europe GmbH, which is headquartered in Heidelberg, Germany.

The point is that a natural disaster may knock out cell towers along with other infrastructure. SARDO, which is equipped with a lightweight cellular base station, is a mobile solution that can be deployed regardless of what infrastructure remains after a natural disaster.

To detect and map out the locations of victims, SARDO performs time-of-flight measurements (using the timing of signals emitted by the users’ phones to estimate distance). 

A machine learning algorithm is then applied to the time-of-flight measurements to calculate the positions of victims. The algorithm compensates for when signals are blocked by rubble.
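
The article does not spell out the estimator, so as a hedged stand-in, the sketch below recovers a 2D position from time-of-flight ranges by classic nonlinear least squares (multilateration); the drone waypoints and noise-free measurements are assumptions for illustration:

```python
# Multilateration from time-of-flight ranges measured at drone waypoints.
import numpy as np
from scipy.optimize import least_squares

C = 3e8  # propagation speed (m/s); range = time_of_flight * C

def locate(waypoints, tof_seconds):
    ranges = np.asarray(tof_seconds) * C
    def residuals(p):
        # Mismatch between predicted and measured distances at each waypoint.
        return np.linalg.norm(waypoints - p, axis=1) - ranges
    return least_squares(residuals, x0=waypoints.mean(axis=0)).x

waypoints = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
true_pos = np.array([20.0, 30.0])
tof = np.linalg.norm(waypoints - true_pos, axis=1) / C  # noise-free example
print(locate(waypoints, tof))  # recovers approximately [20, 30]
```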

If a victim is on the move in the wake of a disaster, a second machine learning algorithm, tasked with estimating the person’s trajectory based on their current movement, kicks in—potentially helping first responders locate the person sooner.   
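
As a simple illustration of the idea (not the study’s algorithm), a constant-velocity fit over recent position fixes can extrapolate where a moving person is headed:

```python
# Constant-velocity extrapolation from recent position estimates.
import numpy as np

def predict_position(fixes, times, t_future):
    """Least-squares constant-velocity fit over recent (x, y) fixes."""
    A = np.vstack([np.ones_like(times), times]).T
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(fixes), rcond=None)
    origin, velocity = coeffs[0], coeffs[1]
    return origin + velocity * t_future

times = np.array([0.0, 30.0, 60.0])                # seconds
fixes = [[10.0, 5.0], [13.0, 9.0], [16.0, 13.0]]   # past position estimates (m)
print(predict_position(fixes, times, 120.0))       # projected position at t = 120 s
```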

After sweeping an area, the drone is programmed to automatically maneuver closer to the position of a suspected victim to retrieve more accurate distance measurements. If too many errors are interfering with the drone’s ability to locate victims, it’s programmed to enlarge the scanning area.

In their study, Albanese and his colleagues tested SARDO in several field experiments without rubble, and used simulations to test the approach in a scenario where rubble interfered with some signals. In the field experiments, the drone was able to pinpoint the location of missing people to within a few tens of meters, requiring approximately three minutes to locate each victim within a field of roughly 200 square meters. As would be expected, SARDO was less accurate when rubble was present or when the drone was flying at higher speeds or altitudes.

Albanese notes that a limitation of SARDO, as is the case with all drone-based approaches, is the battery life of the drone. But, he says, the energy consumption of the NEC team’s design remains relatively low.

The group is consulting the laboratory’s business experts on the possibility of commercializing this tech.  Says Albanese: “There is interest, especially from the public safety divisions, but still no final decision has been taken.”

In the meantime, SARDO may undergo further advances. “We plan to extend SARDO to emergency indoor localization so [it is] capable of working in any emergency scenario where buildings might not be accessible [to human rescuers],” says Albanese.

The purpose of this work is to optimize the rigid or compliant behavior of a new type of parallel-actuated robot architecture developed for exoskeleton robot applications. This is done in an effort to provide those who utilize the architecture with the means to maximize, minimize, or simply adjust its stiffness property so as to optimize it for particular tasks, such as augmented lifting or impact absorption. This research even provides the means to produce non-homogeneous stiffness properties for applications that may require non-homogeneous dynamic behavior. In this work, the new architecture is demonstrated in the form of a shoulder exoskeleton. An analytical stiffness model for the shoulder exoskeleton is created and validated experimentally. The model is then used, along with a method of bounded nonlinear multi-objective optimization, to configure the parallel substructures for desired rigidity, compliance, or non-homogeneous stiffness behavior. The stiffness model and its optimization can be applied beyond the shoulder to any embodiment of the new parallel architecture, including hip, wrist, and ankle robot applications. To exemplify this, we present the rigidity optimization for a theoretical hip exoskeleton.
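
The analytical stiffness model is not reproduced in this abstract, so the sketch below only illustrates the shape of the bounded optimization step, with a hypothetical stand-in model k_model and a scalarized objective; the real model, bounds, and objective weights come from the paper:

```python
# Bounded optimization of a (hypothetical) stiffness model.
import numpy as np
from scipy.optimize import minimize

def k_model(x):
    # Hypothetical stand-in mapping two configuration parameters to a
    # stiffness value. Replace with the paper's validated analytical model.
    return 40.0 * x[0] + 15.0 * x[1] ** 2

def objective(x, k_target, w=1.0):
    # Scalarized multi-objective: track the target stiffness, penalize effort.
    return w * (k_model(x) - k_target) ** 2 + 0.01 * np.sum(x ** 2)

bounds = [(0.0, 10.0), (0.0, 5.0)]  # physical limits on the configuration
result = minimize(objective, x0=np.array([1.0, 1.0]),
                  args=(300.0,), bounds=bounds, method="L-BFGS-B")
print(result.x, k_model(result.x))  # configuration approaching the target
```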

At a press conference this afternoon, NASA released a new video showing, in real-time and full color, the entire descent and landing of the Perseverance Mars rover. The video begins with the deployment of the parachute, and ends with the Skycrane cutting the rover free and flying away. It’s the most mind-blowing three minutes of video I have ever seen. 

Image: NASA/JPL. The cameras that recorded video during the Mars 2020 rover’s landing on Mars.

Some very quick context: during landing, multiple cameras were recording the event, and this video is a combination of these. No audio was recorded, so you’re hearing a feed from JPL mission control.

Here’s the video:

We’ll have a lot more on the Perseverance rover, but for now, we’re just going to let this video sink in.

[ Mars 2020 ]

Inspecting old mines is a dangerous business. For humans, mines can be lethal: prone to rockfalls and filled with noxious gases. Robots can go where humans might suffocate, but even robots can only do so much when mines are inaccessible from the surface.

Now, researchers in the UK, led by Headlight AI, have developed a drone that could cast a light in the darkness. Named Prometheus, this drone can enter a mine through a borehole not much larger than a football, before unfurling its arms and flying around the void. Once down there, it can use its payload of scanning equipment to map mines where neither humans nor robots can presently go. This, the researchers hope, could make mine inspection quicker and easier. The team behind Prometheus published its design in November in the journal Robotics.

Mine inspection might seem like a peculiarly specific task to fret about, but old mines can collapse, causing the ground to sink and damaging nearby buildings. It’s a far-reaching threat: the geotechnical engineering firm Geoinvestigate, based in Northeast England, estimates that around 8 percent of all buildings in the UK are at risk from any of the thousands of abandoned coal mines near the country’s surface. It’s also a threat to transport, such as road and rail. Indeed, Prometheus is backed by Network Rail, which operates Britain’s railway infrastructure.

Such grave dangers mean that old mines need periodic check-ups. To enter depths that are forbidden to traditional wheeled robots—such as those featured in the DARPA SubT Challenge—inspectors today drill boreholes down into the mine and lower scanners into the darkness.

But that can be an arduous and often fruitless process. Inspecting the entirety of a mine can take multiple boreholes, and that still might not be enough to chart a complete picture. Mines are jagged, labyrinthine places, and much of the void might lie out of sight. Furthermore, many old mines aren’t well-mapped, so it’s hard to tell where best to enter them.

Prometheus can fly around some of those challenges. Inspectors can lower Prometheus, tethered to a docking apparatus, down a single borehole. Once inside the mine, the drone can undock and fly around, using LIDAR scanners—common in mine inspection today—to generate a 3D map of the unknown void. Prometheus can fly through the mine autonomously, using infrared data to plot out its own course.

Other drones exist that can fly underground, but they’re either too small to carry a relatively heavy payload of scanning equipment, or too large to easily fit down a borehole. What makes Prometheus unique is its ability to fold its arms, allowing it to squeeze down spaces its counterparts cannot.

It’s that ability to fold and enter a borehole that makes Prometheus remarkable, says Jason Gross, a professor of mechanical and aerospace engineering at West Virginia University. Gross calls Prometheus “an exciting idea,” but he does note that it has a relatively short flight window and few abilities beyond scanning.

The researchers have conducted a number of successful test flights, both in a basement and in an old mine near Shrewsbury, England. Not only was Prometheus able to map out its space, but it was also able to plot its own course in an unknown area.

The researchers’ next steps, according to Puneet Chhabra, co-founder of Headlight AI, will be to test Prometheus’s ability to unfold in an actual mine. Following that, researchers plan to conduct full-scale test flights by the end of 2021.

The current pandemic has highlighted the need for rapid construction of structures to treat patients and to ensure manufacturing of health care products such as vaccines. Achieving this requires rapid transportation of construction materials from the staging area to the point of deposition. In the future, this could be achieved through automated construction sites that make use of robots. Toward this goal, this paper presents the design and construction of a cable-driven parallel manipulator (CDPM) that balances a highly unstable load, a ball-plate system. The system consists of eight cables attached to the end-effector plate; the cables can be extended or retracted to actuate movement of the plate. The hardware for the system was designed and built utilizing modern manufacturing processes. A camera system using image recognition was designed to identify the ball’s pose on the plate. The hardware was used to inform the development of a control system consisting of a reinforcement-learning-trained neural network controller that outputs the desired platform response. A nested PID controller for each motor attached to each cable was used to realize the desired response. For the neural network controller, three different model structures were compared to assess the impact of varying model complexity. Less complex structures resulted in a slower, less flexible response, while more complex structures output a high-frequency oscillation of the actuation signal, resulting in an unresponsive system. It was concluded that the system shows promise for future development, with the potential to improve on the state of the art.
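
For readers unfamiliar with the nested structure, here is a minimal sketch of the per-motor PID loop the abstract describes; the gains, setpoints, and the mapping from the network’s desired platform response to per-cable setpoints are illustrative assumptions:

```python
# One PID loop per cable motor; the RL-trained network (not shown) supplies
# the desired platform response, converted to per-cable length setpoints.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

pids = [PID(kp=2.0, ki=0.1, kd=0.05) for _ in range(8)]  # eight cables
# Drive each motor toward its cable-length setpoint (values illustrative).
commands = [pid.step(setpoint=0.10, measured=0.08, dt=0.01) for pid in pids]
print(commands)
```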

This paper proposes an underactuated gripper mechanism that grasps and pulls in different types of objects. Both movements are generated by a single actuator, whereas conventional grippers use two independent actuators. To demonstrate this principle, we have developed two kinds of grippers with different driving systems: one driven by a DC motor with planetary gear reducers, and another driven by pneumatic actuators with branch tubes acting as a differential. The pulling-in motion is achieved by a belt-driven finger surface in the former and by a linear slider with an air cylinder in the latter. The motor-driven gripper with planetary gear reducers can pull up the object after grasping it. However, the object tends to fall during placement because the reversed movement opens the fingers before pushing the object out. In addition, the closing and picking-up speeds of the fingers are slow due to the high gear reduction. To address these drawbacks, a new pneumatic gripper combining three kinds of valves (a speed control valve, a relief valve, and non-return valves) is proposed. The proposed pneumatic gripper is superior in that it can pull up the object after grasping it and open the fingers only after pushing the object out. In the present paper, a design methodology for the different underactuated grippers that can not only grasp but also pull up objects is discussed. Then, to examine the performance of the grippers, experiments were conducted using various objects with different rigidities, shapes, sizes, and masses, representative of objects potentially encountered in real applications.

Soft robots are inherently safe, highly resilient, and potentially very cheap, making them promising for a wide array of applications. But development on them has been a bit slow relative to other areas of robotics, at least partially because soft robots can’t directly benefit from the massive increase in computing power and sensor and actuator availability that we’ve seen over the last few decades. Instead, roboticists have had to get creative to find ways of achieving the functionality of conventional robotics components using soft materials and compatible power sources.

In the current issue of Science Robotics, researchers from UC San Diego demonstrate a soft walking robot with four legs that moves with a turtle-like gait controlled by a pneumatic circuit system made from tubes and valves. This air-powered nervous system can actuate multiple degrees of freedom in sequence from a single source of pressurized air, offering a huge reduction in complexity and bringing a very basic form of decision making onto the robot itself.

Generally, when people talk about soft robots, the robots are only mostly soft. There are some components that are very difficult to make soft, including pressure sources and the necessary electronics to direct that pressure between different soft actuators in a way that can be used for propulsion. What’s really cool about this robot is that researchers have managed to take a pressure source (either a single tether or an onboard CO2 cartridge) and direct it to four different legs, each with three different air chambers, using an oscillating three-valve circuit made entirely of soft materials.

Photo: UCSD. The pneumatic circuit that powers and controls the soft quadruped.

The inspiration for this can be found in biology—natural organisms, including quadrupeds, use nervous system components called central pattern generators (CPGs) to prompt repetitive motions with limbs that are used for walking, flying, and swimming. This is obviously more complicated in some organisms than in others, and is typically mediated by sensory feedback, but the underlying structure of a CPG is basically just a repeating circuit that drives muscles in sequence to produce a stable, continuous gait. In this case, we’ve got pneumatic muscles being driven in opposing pairs, resulting in a diagonal couplet gait, where diagonally opposed limbs rotate forwards and backwards at the same time.

Diagram: Science Robotics

(J) Pneumatic logic circuit for rhythmic leg motion. A constant positive pressure source (P+) applied to three inverter components causes a high-pressure state to propagate around the circuit, with a delay at each inverter. While the input to one inverter is high, the attached actuator (i.e., A1, A2, or A3) is inflated. This sequence of high-pressure states causes each pair of legs of the robot to rotate in a direction determined by the pneumatic connections. (K) By reversing the sequence of activation of the pneumatic oscillator circuit, the attached actuators inflate in a new sequence (A1, A3, and A2), causing (L) the legs of the robot to rotate in reverse. (M) Schematic bottom view of the robot with the directions of leg motions indicated for forward walking.

Diagram: Science Robotics

Each of the valves acts as an inverter by switching the normally closed half (top) to open and the normally open half (bottom) to closed.

The circuit itself is made up of three bistable pneumatic valves connected by tubing; the tubing acts as a delay by resisting the flow of gas through it, and the delay can be tuned by altering the tube’s length and inner diameter. Within the circuit, the movement of the pressurized gas acts as both a source of energy and as a signal, since wherever the pressure is in the circuit is where the legs are moving. The simplest circuit uses only three valves and can keep the robot walking in a single direction, but more valves add more complex leg control options. For example, the researchers were able to use seven valves to tune the phase offset of the gait, and even just one additional valve (albeit of a slightly more complex design) could enable reversal of the system, causing the robot to walk backwards in response to input from a soft sensor. And with another complex valve, a manual (tethered) controller could be used for omnidirectional movement.
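
To make the oscillator’s logic concrete, here is a small software analogy (a sketch under stated assumptions, not the authors’ pneumatic model): each stage inverts the delayed output of the previous stage, so after a brief transient the three outputs settle into a repeating cycle whose phase offsets drive the leg actuators in sequence:

```python
# Three-inverter ring oscillator with per-stage delay lines.
from collections import deque

D = 3  # delay steps per stage; in hardware, set by tube length and diameter

out = [1, 0, 0]                         # current valve outputs
lines = [deque([v] * D) for v in out]   # tube delay lines between stages

for t in range(10 * D):
    arrived = [line.popleft() for line in lines]  # signals exiting each tube
    for i in range(3):
        lines[i].append(out[i])                   # current outputs enter tubes
    # Valve i inverts the delayed output of valve i-1 (index -1 wraps around).
    out = [1 - arrived[i - 1] for i in range(3)]
    print(t, out)  # a repeating 6-state cycle rotates around the ring
```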

This work has some similarities to the rover that JPL is developing to explore Venus—that rover isn’t a soft robot, of course, but it operates under similar constraints in that it can’t rely on conventional electronic systems for autonomous navigation or control. It turns out that there are plenty of clever ways to use mechanical (or in this case, pneumatic) intelligence to make robots with relatively complex autonomous behaviors, meaning that in the future, soft (or soft-ish) robots could find valuable roles in situations where using a non-compliant system is not a good option.

For more on why we should be so excited about soft robots and just how soft a soft robot needs to be, we spoke with Michael Tolley, who runs the Bioinspired Robotics and Design Lab at UCSD, and Dylan Drotman, the paper’s first author.

IEEE Spectrum: What can soft robots do for us that more rigid robotic designs can’t?

Michael Tolley: At the very highest level, one of the fundamental assumptions of robotics is that you have rigid bodies connected at joints, and all your motion happens at these joints. That's a really nice approach because it makes the math easy, frankly, and it simplifies control. But when you look around us in nature, even though animals do have bones and joints, the way we interact with the world is much more complicated than that simple story. I’m interested in where we can take advantage of material properties in robotics. If you look at robots that have to operate in very unknown environments, I think you can build in some of the intelligence for how to deal with those environments into the body of the robot itself. And that’s the category this work really falls under—it's about navigating the world.

Dylan Drotman: Walking through confined spaces is a good example. With the rigid legged robot, you would have to completely change the way that the legs move to walk through a confined space, while if you have flexible legs, like the robot in our paper, you can use relatively simple control strategies to squeeze through an area you wouldn’t be able to get through with a rigid system. 

How smart can a soft robot get?

Drotman: Right now we have a sensor on the front that's connected through a fluidic transmission to a bistable valve that causes the robot to reverse. We could add other sensors around the robot to allow it to change direction whenever it runs into an obstacle to effectively make an electronics-free version of a Roomba.

Tolley: Stepping back a little bit from that, one could make an argument that we’re using basic memory elements to generate very basic signals. There’s nothing in principle that would stop someone from making a pneumatic computer—it’s just very complicated to make something that complex. I think you could build on this and do more intelligent decision making, but using this specific design and the components we’re using, it’s likely to be things that are more direct responses to the environment. 

How well would robots like these scale down?

Drotman: At the moment we’re manufacturing these components by hand, so the idea would be to make something more like a printed circuit board instead, and looking at how the channel sizes and the valve design would affect the actuation properties. We’ll also be coming up with new circuits, and different designs for the circuits themselves.

Tolley: Down to centimeter or millimeter scale, I don’t think you’d have fundamental fluid flow problems. I think you’re going to be limited more by system design constraints. You’ll have to be able to locomote while carrying around your pressure source, and possibly some other components that are also still rigid. When you start to talk about really small scales, though, it's not as clear to me that you really need an intrinsically soft robot. If you think about insects, their structural geometry can make them behave like they’re soft, but they’re not intrinsically soft.

Should we be thinking about soft robots and compliant robots in the same way, or are they fundamentally different?

Tolley: There’s certainly a connection between the two. You could have a compliant robot that behaves in a very similar way to an intrinsically soft robot, or a robot made of intrinsically soft materials. At that point, it comes down to design and manufacturing and practical limitations on what you can make. I think when you get down to small scales, the two sort of get connected. 

There was some interesting work several years ago on using explosions to power soft robots. Is that still a thing?

Tolley: One of the opportunities with soft robots is that with material compliance, you have the potential to store energy. I think there’s exciting potential there for rapid motion with a soft body. Combustion is one way of doing that with power coming from a chemical source all at once, but you could also use a relatively weak muscle that over time stores up energy in a soft body and then releases it. 

Is it realistic to expect complete softness from soft robots, or will they likely always have rigid components because they have to store or generate and move pressurized gas somehow?

Tolley: If you look in nature, you do have soft pumps like the heart, but although it’s soft, it’s still relatively stiff. Like, if you grab a heart, it’s not totally squishy. I haven’t done it, but I’d imagine. If you have a container that you’re pressurizing, it has to be stiff enough to not just blow up like a balloon. Certainly pneumatics or hydraulics are not the only way to go for soft actuators; there has been some really nice work on smart muscles and smart materials like hydraulically amplified self-healing electrostatic (HASEL) actuators. They seem promising, but all of these actuators have challenges. We’ve chosen to stick with pressurized pneumatics in the near term; longer term, I think you’ll start to see more of these smart material actuators become more practical.

Personally, I don’t have any problem with soft robots having some rigid components. Most animals on land have some rigid components, but they can still take advantage of being soft, so it’s probably going to be a combination. But I do also like the vision of making an entirely soft, squishy thing.

Soft robots are inherently safe, highly resilient, and potentially very cheap, making them promising for a wide array of applications. But development on them has been a bit slow relative to other areas of robotics, at least partially because soft robots can’t directly benefit from the massive increase in computing power and sensor and actuator availability that we’ve seen over the last few decades. Instead, roboticists have had to get creative to find ways of achieving the functionality of conventional robotics components using soft materials and compatible power sources.

In the current issue of Science Robotics, researchers from UC San Diego demonstrate a soft walking robot with four legs that moves with a turtle-like gait controlled by a pneumatic circuit system made from tubes and valves. This air-powered nervous system can actuate multiple degrees of freedom in sequence from a single source of pressurized air, offering a huge reduction in complexity and bringing a very basic form of decision making onto the robot itself.

Generally, when people talk about soft robots, the robots are only mostly soft. There are some components that are very difficult to make soft, including pressure sources and the necessary electronics to direct that pressure between different soft actuators in a way that can be used for propulsion. What’s really cool about this robot is that researchers have managed to take a pressure source (either a single tether or an onboard CO2 cartridge) and direct it to four different legs, each with three different air chambers, using an oscillating three valve circuit made entirely of soft materials. 

Photo: UCSD. The pneumatic circuit that powers and controls the soft quadruped.

The inspiration for this can be found in biology—natural organisms, including quadrupeds, use nervous system components called central pattern generators (CPGs) to prompt repetitive motions with limbs that are used for walking, flying, and swimming. This is obviously more complicated in some organisms than in others, and is typically mediated by sensory feedback, but the underlying structure of a CPG is basically just a repeating circuit that drives muscles in sequence to produce a stable, continuous gait. In this case, we’ve got pneumatic muscles being driven in opposing pairs, resulting in a diagonal couplet gait, where diagonally opposed limbs rotate forwards and backwards at the same time.
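To make that gait pattern concrete, here is a minimal sketch of a diagonal couplet gait expressed as two anti-phase oscillators (purely illustrative; the robot in the paper generates this pattern pneumatically rather than in software, and the leg names, period, and amplitude below are assumptions):

```python
import math

# Hypothetical diagonal couplet gait: diagonally opposed legs share a phase,
# and the two diagonal pairs run half a cycle apart.
LEG_PHASE = {
    "front_left": 0.0, "rear_right": 0.0,   # diagonal pair 1 (in phase)
    "front_right": 0.5, "rear_left": 0.5,   # diagonal pair 2 (half cycle later)
}

def leg_angle(leg, t, period_s=1.0, amplitude_deg=20.0):
    """Commanded rotation of one leg at time t (seconds)."""
    phase = 2.0 * math.pi * (t / period_s + LEG_PHASE[leg])
    return amplitude_deg * math.sin(phase)

# Sample one gait cycle: the two diagonal pairs swing in opposition.
for t in (0.0, 0.25, 0.5, 0.75):
    print(t, {leg: round(leg_angle(leg, t), 1) for leg in LEG_PHASE})
```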

Diagram: Science Robotics

(J) Pneumatic logic circuit for rhythmic leg motion. A constant positive pressure source (P+) applied to three inverter components causes a high-pressure state to propagate around the circuit, with a delay at each inverter. While the input to one inverter is high, the attached actuator (i.e., A1, A2, or A3) is inflated. This sequence of high-pressure states causes each pair of legs of the robot to rotate in a direction determined by the pneumatic connections. (K) By reversing the sequence of activation of the pneumatic oscillator circuit, the attached actuators inflate in a new sequence (A1, A3, and A2), causing (L) the legs of the robot to rotate in reverse. (M) Schematic bottom view of the robot with the directions of leg motions indicated for forward walking.

Diagram: Science Robotics

Each of the valves acts as an inverter by switching the normally closed half (top) to open and the normally open half (bottom) to closed.

The circuit itself is made up of three bistable pneumatic valves connected by tubing that acts as a delay: the tubing resists the gas flowing through it, and that resistance can be tuned by altering the tube’s length and inner diameter. Within the circuit, the movement of the pressurized gas acts as both a source of energy and as a signal, since wherever the pressure is in the circuit is where the legs are moving. The simplest circuit uses only three valves and can keep the robot walking in a single direction, but more valves add more complex leg control options. For example, the researchers were able to use seven valves to tune the phase offset of the gait, and even just one additional valve (albeit of a slightly more complex design) could enable reversal of the system, causing the robot to walk backwards in response to input from a soft sensor. And with another complex valve, a manual (tethered) controller could be used for omnidirectional movement.
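To build intuition for how three inverters plus tube delays produce a repeating sequence, here is a toy discrete-time simulation (a sketch under assumptions, not the authors’ model: real valves and tubes have continuous dynamics, and the delay value below is made up). Each valve outputs the logical NOT of its delayed input, and actuator Ai inflates while the input to valve i is high:

```python
from collections import deque

def simulate_pneumatic_ring(steps=20, delay=3):
    """Toy model of a three-inverter pneumatic ring oscillator.

    Each tube delays the upstream valve's output by `delay` time steps,
    standing in for the resistance set by tube length and inner diameter.
    """
    outputs = [1, 0, 0]  # seed one high-pressure state to start the cycle
    # tubes[i] carries the delayed output of the upstream valve (i - 1)
    tubes = [deque([outputs[(i - 1) % 3]] * delay) for i in range(3)]
    for t in range(steps):
        inputs = [tube[0] for tube in tubes]
        inflated = [f"A{i + 1}" for i in range(3) if inputs[i]]
        print(f"t={t:2d}  inputs={inputs}  inflated={inflated}")
        snapshot = outputs[:]  # read old outputs before updating them
        for i in range(3):
            tubes[i].popleft()
            tubes[i].append(snapshot[(i - 1) % 3])
        outputs = [1 - x for x in inputs]  # each valve inverts its input

simulate_pneumatic_ring()
```

Running it shows a high-pressure state chasing itself around the ring, so the actuators inflate in a repeating, overlapping sequence. Increasing `delay` (a longer or narrower tube) slows the whole cycle, and rerouting which tube feeds which valve reverses the order, consistent with the reversed sequence described in the figure caption above.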

This work has some similarities to the rover that JPL is developing to explore Venus—that rover isn’t a soft robot, of course, but it operates under similar constraints in that it can’t rely on conventional electronic systems for autonomous navigation or control. It turns out that there are plenty of clever ways to use mechanical (or in this case, pneumatic) intelligence to make robots with relatively complex autonomous behaviors, meaning that in the future, soft (or soft-ish) robots could find valuable roles in situations where using a non-compliant system is not a good option.

For more on why we should be so excited about soft robots and just how soft a soft robot needs to be, we spoke with Michael Tolley, who runs the Bioinspired Robotics and Design Lab at UCSD, and Dylan Drotman, the paper’s first author.

IEEE Spectrum: What can soft robots do for us that more rigid robotic designs can’t?

Michael Tolley: At the very highest level, one of the fundamental assumptions of robotics is that you have rigid bodies connected at joints, and all your motion happens at these joints. That's a really nice approach because it makes the math easy, frankly, and it simplifies control. But when you look around us in nature, even though animals do have bones and joints, the way we interact with the world is much more complicated than that simple story. I’m interested in where we can take advantage of material properties in robotics. If you look at robots that have to operate in very unknown environments, I think you can build in some of the intelligence for how to deal with those environments into the body of the robot itself. And that’s the category this work really falls under—it's about navigating the world.

Dylan Drotman: Walking through confined spaces is a good example. With a rigid-legged robot, you would have to completely change the way the legs move to walk through a confined space, while if you have flexible legs, like the robot in our paper, you can use relatively simple control strategies to squeeze through an area you wouldn’t be able to get through with a rigid system.

How smart can a soft robot get?

Drotman: Right now we have a sensor on the front that's connected through a fluidic transmission to a bistable valve that causes the robot to reverse. We could add other sensors around the robot to allow it to change direction whenever it runs into an obstacle to effectively make an electronics-free version of a Roomba.

Tolley: Stepping back a little bit from that, one could make an argument that we’re using basic memory elements to generate very basic signals. There’s nothing in principle that would stop someone from making a pneumatic computer—it’s just very complicated to make something that complex. I think you could build on this and do more intelligent decision making, but using this specific design and the components we’re using, it’s likely to be things that are more direct responses to the environment. 

How well would robots like these scale down?

Drotman: At the moment we’re manufacturing these components by hand, so the idea would be to make something more like a printed circuit board instead, and to look at how the channel sizes and the valve design affect the actuation properties. We’ll also be coming up with new circuits, and different designs for the circuits themselves.

Tolley: Down to centimeter or millimeter scale, I don’t think you’d have fundamental fluid flow problems. I think you’re going to be limited more by system design constraints. You’ll have to be able to locomote while carrying around your pressure source, and possibly some other components that are also still rigid. When you start to talk about really small scales, though, it's not as clear to me that you really need an intrinsically soft robot. If you think about insects, their structural geometry can make them behave like they’re soft, but they’re not intrinsically soft.

Should we be thinking about soft robots and compliant robots in the same way, or are they fundamentally different?

Tolley: There’s certainly a connection between the two. You could have a compliant robot that behaves in a very similar way to an intrinsically soft robot, or a robot made of intrinsically soft materials. At that point, it comes down to design and manufacturing and practical limitations on what you can make. I think when you get down to small scales, the two sort of get connected. 

There was some interesting work several years ago on using explosions to power soft robots. Is that still a thing?

Tolley: One of the opportunities with soft robots is that with material compliance, you have the potential to store energy. I think there’s exciting potential there for rapid motion with a soft body. Combustion is one way of doing that with power coming from a chemical source all at once, but you could also use a relatively weak muscle that over time stores up energy in a soft body and then releases it. 

Is it realistic to expect complete softness from soft robots, or will they likely always have rigid components because they have to store or generate and move pressurized gas somehow?

Tolley: If you look in nature, you do have soft pumps like the heart, but although it’s soft, it’s still relatively stiff. Like, if you grab a heart, it’s not totally squishy. I haven’t done it, but I’d imagine. If you have a container that you’re pressurizing, it has to be stiff enough to not just blow up like a balloon. Certainly pneumatics or hydraulics are not the only way to go for soft actuators; there has been some really nice work on smart muscles and smart materials like hydraulically amplified self-healing electrostatic (HASEL) actuators. They seem promising, but all of these actuators have challenges. We’ve chosen to stick with pressurized pneumatics in the near term; longer term, I think you’ll start to see more of these smart material actuators become more practical.

Personally, I don’t have any problem with soft robots having some rigid components. Most animals on land have some rigid components, but they can still take advantage of being soft, so it’s probably going to be a combination. But I do also like the vision of making an entirely soft, squishy thing.

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

HRI 2021 – March 8-11, 2021 – [Online Conference]
RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China

Let us know if you have suggestions for next week, and enjoy today's videos.

Hmm, did anything interesting happen in robotics yesterday?

Obviously, we're going to have tons more on the Mars Rover and Mars Helicopter over the next days, weeks, months, years, and (if JPL's track record has anything to say about it) decades. Meantime, here's what's going to happen over the next day or two:

[ Mars 2020 ]

PLEN hopes you had a happy Valentine's Day!

[ PLEN ]

Unitree dressed up a whole bunch of Laikago quadrupeds to take part in the 2021 Spring Festival Gala in China.

[ Unitree ]

Thanks Xingxing!

Marine iguanas compete for the best nesting sites on the Galapagos Islands. Meanwhile, RoboSpy Iguana gets involved in a snot-sneezing competition after the marine iguanas return from the sea.

[ Spy in the Wild ]

Tails, it turns out, are useful for almost everything.

[ DART Lab ]

Produced in partnership with MD-TEC, this video demonstrates the use of teleoperated robotic arms and a virtual reality interface to perform closed suction for self-ventilating tracheostomy patients during the COVID-19 outbreak. Closed suction is recommended to minimise the aerosol generated during this procedure, and this robotic method avoids staff exposure to the virus, further protecting NHS workers.

[ Extend Robotics ]

Fotokite is a safe, practical way to do local surveillance with a drone.

I just wish they still had a consumer version :(

[ Fotokite ]

How to confuse fish.

[ Harvard ]

Army researchers recently expanded their research area for robotics to a site just north of Baltimore. Earlier this year, Army researchers performed the first fully autonomous tests onsite using an unmanned ground vehicle test bed platform, which serves as the standard baseline configuration for multiple programmatic efforts within the laboratory. As a means of transitioning from simulation-based testing, the primary purpose of this test event was to capture relevant data in a live, operationally relevant environment.

[ Army ]

Flexiv's new RIZON 10 robot hopes you had a happy Valentine's Day!

[ Flexiv ]

Thanks Yunfan!

The inchworm-inspired crawling robot iCrawl is a 5-DOF robot with two legs, each equipped with an electromagnetic foot for crawling on metal pipe surfaces. A passive foot-cap underneath each electromagnetic foot makes it a versatile pipe crawler. The robot can crawl on metal pipes of various curvatures, both horizontally and vertically, and can be used as a new robotic solution to assist close inspection of the outside of pipelines, thus minimizing downtime in the oil and gas industry.

[ Paper ]

Thanks Poramate!

A short film about Robot Wars from Blender Magazine in 1995.

[ YouTube ]

While modern cameras provide machines with a very well-developed sense of vision, robots still lack such a comprehensive solution for their sense of touch. The talk will present examples of why the sense of touch can prove crucial for a wide range of robotic applications, and a tech demo will introduce a novel sensing technology targeting the next generation of soft robotic skins. The prototype of the tactile sensor developed at ETH Zurich exploits the advances in camera technology to reconstruct the forces applied to a soft membrane. This technology has the potential to revolutionize robotic manipulation, human-robot interaction, and prosthetics.
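As a rough illustration of the general idea behind such camera-based tactile sensors (a hypothetical sketch, not the ETH Zurich pipeline: the function names, marker-tracking setup, and linear calibration below are all assumptions), one can track markers on the inside of the soft membrane and map their displacements to a contact force:

```python
import numpy as np

def estimate_force(markers_now, markers_rest, calibration):
    """Map stacked 2D marker displacements to a force vector (Fx, Fy, Fz).

    markers_now, markers_rest: (N, 2) marker positions in pixels.
    calibration: (3, 2N) matrix fitted beforehand against a force sensor.
    """
    displacement = (markers_now - markers_rest).ravel()  # shape (2N,)
    return calibration @ displacement                    # shape (3,)

# Stand-in data: 16 membrane markers nudged by a simulated contact.
rng = np.random.default_rng(0)
rest = rng.uniform(0, 640, size=(16, 2))          # marker grid at rest
pressed = rest + rng.normal(0, 2.0, size=rest.shape)
calib = rng.normal(0, 0.01, size=(3, rest.size))  # placeholder calibration
print("estimated force:", estimate_force(pressed, rest, calib))
```

In practice the mapping is usually nonlinear and learned (for example, with a small neural network), but the structure is the same: camera image in, marker or dense displacement field extracted, force estimate out.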

[ ETHZ ]

Thanks Markus!

Quadrupedal robotics has reached a level of performance and maturity that enables some of the most advanced real-world applications with autonomous mobile robots. Driven by excellent research in academia and industry all around the world, a growing number of platforms with different skills target different applications and markets. We have invited a selection of experts with long-standing experience in this vibrant research area.

[ IFRR ]

Thanks Fan!

Since January 2020, more than 300 different robots in over 40 countries have been used to cope with some aspect of the impact of the coronavirus pandemic on society. The majority of these robots have been used to support clinical care and public safety, allowing responders to work safely and to handle the surge in infections. This panel will discuss how robots have been successfully used and what is needed, both in terms of fundamental research and policy, for robotics to be prepared for future emergencies.

[ IFRR ]

At Skydio, we ship autonomous robots that are flown at scale in complex, unknown environments every day. We’ve invested six years of R&D into handling extreme visual scenarios not typically considered by academia or encountered by cars, ground robots, or AR applications. Drones are commonly in scenes with few or no semantic priors on the environment and must deftly navigate thin objects, extreme lighting, camera artifacts, motion blur, textureless surfaces, vibrations, dirt, smudges, and fog. These challenges are daunting for classical vision because photometric signals are simply inconsistent, and yet there is no ground truth for direct supervision of deep networks. We’ll take a detailed look at these issues and how we’ve tackled them to push the state of the art in visual-inertial navigation, obstacle avoidance, and rapid trajectory planning. We will also cover the new capabilities built on top of our core navigation engine to autonomously map complex scenes and capture all surfaces, by performing real-time 3D reconstruction across multiple flights.

[ UPenn ]

We report on a series of workshops with musicians and robotics engineers aimed at studying how human and machine improvisation can be explored through interdisciplinary design research. In the first workshop, we posed two leading questions to participants. First, what can AI and robotics learn from how improvisers think about time, space, actions, and decisions? Second, how can improvisation and musical instruments be enhanced by AI and robotics? The workshop included sessions led by the musicians, which provided an overview of the theory and practice of musical improvisation. In other sessions, AI and robotics researchers introduced AI principles to the musicians. Two smaller follow-up workshops, composed solely of engineering and information science students, provided an opportunity to elaborate on the principles covered in the first workshop. The workshops revealed parallels and discrepancies in the conceptualization of improvisation between musicians and engineers. These thematic differences could inform considerations for future designers of improvising robots.
