Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland
DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA
HRI 2020 – March 23-26, 2020 – Cambridge, U.K.
ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores
ICRA 2020 – May 31-June 4, 2020 – Paris, France

Let us know if you have suggestions for next week, and enjoy today’s videos.

The Real-World Deployment of Legged Robots Workshop is back at ICRA 2020!

We’ll be there!

[ Workshop ]

Thanks Marko!

This video shows some cool musical experiments with Pepper. They should definitely release this karaoke feature to Peppers everywhere—with “Rage Against the Machine” songs included, of course. NSFW warning: There is some swearing by both robot and humans, so headphones recommended if you’re at work.

It all started when, on a whim, David and another team member fed a karaoke file into Pepper’s text-to-speech with a quick Python script, playing some music in parallel from their PC. The effect was a bit strange, but there was something so fun (and funny) to it. I think they were going for a virtual performance from Pepper or something, but someone noted that it sounds like he’s struggling like someone doing karaoke. And from there it grew into doing duets with Pepper.

This thing might seem ridiculous, and it is. But believe me, it’s genuinely fun. It was going all night in a meeting room at the office winter party.
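The quick-script setup described above is easy to approximate. Here’s a minimal sketch of the timing side in Python: it parses LRC-style karaoke timestamps and hands each lyric line to a text-to-speech callable at the right moment. The LRC format and the `say`/`sleep` callables are assumptions on my part; on a real Pepper the callable would wrap something like NAOqi’s `ALTextToSpeech.say`, with the music playing in parallel from the PC.

```python
import re

# Parse LRC-style karaoke lines like "[01:23.45] lyric text" into
# (seconds, lyric) pairs, sorted by timestamp.
LRC_LINE = re.compile(r"\[(\d+):(\d+(?:\.\d+)?)\](.*)")

def parse_lrc(text):
    cues = []
    for line in text.splitlines():
        m = LRC_LINE.match(line.strip())
        if not m:
            continue
        minutes, seconds, lyric = m.groups()
        cues.append((int(minutes) * 60 + float(seconds), lyric.strip()))
    return sorted(cues)

def feed_to_tts(cues, say, sleep, start=0.0):
    # Walk the cues in time order, sleeping until each timestamp
    # and handing the lyric to the robot's text-to-speech callable.
    now = start
    for t, lyric in cues:
        if t > now:
            sleep(t - now)
            now = t
        say(lyric)
```

In practice `say` would be the robot’s TTS proxy and `sleep` would be `time.sleep`; injecting them as callables keeps the sketch testable.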

[ Taylor Veltrop ]

And now, this.

In “Scary Beauty,” a performance conceived and directed by Tokyo-based musician Keiichiro Shibuya, a humanoid robot called Alter 3 not only conducts a human orchestra but also sings along with it. 

Unlike the previous two Alters, Alter 3 has improved sensory and expressive capabilities that bring it closer to humans, including cameras in both eyes, the ability to vocalize from its mouth, and expressiveness around the mouth for singing. Its actuation output has also been increased compared to Alter 2, improving the immediacy of its body expression and enabling more dynamic movement. Another evolution of Alter 3 is its portability: anyone can disassemble it, reassemble it, and transport it by air.

[ Scary Beauty ] via [ RobotStart ]

Carnegie Mellon University’s Henny Admoni studies human behavior in order to program robots to better anticipate people’s needs. Admoni’s research focuses on using assistive robots to address different impairments and aid people in living more fulfilling lives.

[ HARP Lab ]

Olympia was produced as part of a two-year project exploring the growth of social and humanoid robotics in the UK and beyond. Olympia was shot on location at Bristol Robotics Labs, one of the largest of its kind in Britain.

Humanoid robotics - one of the most complex and often provocative areas of artificial intelligence - forms the central subject of this short film. At what point are we willing to believe that we might form a real bond with a machine?

[ Olympia ] via [ Bristol Robotics Lab ]

In this work, we explore user preferences for different modes of autonomy for robot-assisted feeding given perceived error risks and also analyze the effect of input modalities on technology acceptance.

[ Personal Robotics Lab ]

This video presents work on a multi-agent system of aerial robots that form mid-air structures by docking, using position-based visual servoing. For the demonstration, the commercially available DJI Tello drone was modified for the task and commanded using the DJI Tello Python SDK.
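The core of a position-based visual-servoing loop like this can be reduced to a proportional controller: detect the docking target in the camera image, compute its pixel offset from the image center, and convert that into clamped velocity commands. A minimal sketch follows; the gain, limits, and command convention are my assumptions, and on a Tello the output would feed something like `send_rc_control` in the djitellopy SDK rather than being returned.

```python
def servo_command(target_px, image_size, gain=0.2, limit=50):
    """One visual-servoing step: map the pixel offset of a detected
    docking target from the image center into a clamped
    (left_right, up_down) velocity command for the drone."""
    cx, cy = image_size[0] / 2, image_size[1] / 2
    ex, ey = target_px[0] - cx, target_px[1] - cy  # pixel error
    clamp = lambda v: max(-limit, min(limit, int(gain * v)))
    # Image y grows downward, so a positive ey means the target is
    # below center and the drone should descend (negative up_down).
    return clamp(ex), clamp(-ey)
```

Running this at the camera frame rate drives the pixel error toward zero, which is what brings the docking faces into alignment.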

[ YouTube ]

The video presents DLR CLASH (Compliant Low-Cost Antagonistic Servo Hand), developed within the EU project SOMA (grant number H2020-ICT-645599), and shows the hand’s resilience tests and its capability to grasp objects under different motor and sensor failures.

[ DLR ]

Squishy Robotics is celebrating our birthday! Here is a short montage of the places we’ve been and the things we’ve done over the last three years.

[ Squishy Robotics ]

The 2020 DJI RoboMaster Challenge takes place in Shenzhen in early August 2020.

[ RoboMaster ]

With support from the National Science Foundation, electrical engineer Yan Wan and a team at the University of Texas at Arlington are developing a new generation of "networked" unmanned aerial vehicles (UAVs) to bring long distance, broadband communications capability to first responders in the field.

[ NSF ]

Drones and UAVs are vulnerable to hackers that might try to take control of the craft or access data stored on-board. Researchers at the University of Michigan are part of a team building a suite of software to keep drones secure.

The suite is called Trusted and Resilient Mission Operations (TRMO). The U-M team, led by Wes Weimer, professor of electrical engineering and computer science, is focused on integrating the different applications into a holistic system that can prevent and combat attacks in real time.

[ UMich ]

A mobile robot that revs up industrial production: SOTO enables efficient automated line feeding, for example in the automotive industry. The supply-chain robot SOTO brings materials to the assembly line just in time and completely autonomously.

[ Magazino ]

MIT’s Lex Fridman gets us caught up with the state of the art in deep learning.

[ MIT ]

Just in case you couldn’t make it out to Australia in 2018, here are a couple of the keynotes from ICRA in Brisbane.

[ ICRA 2018 ]

Automation of logistic tasks, such as object picking and placing, is currently one of the most active areas of research in robotics. Handling delicate objects, such as fruits and vegetables, both in warehouses and in plantations, is a big challenge due to the delicacy and precision required for the task. This paper presents the CLASH hand, a Compliant Low-Cost Antagonistic Servo Hand, whose kinematics was specifically designed for handling groceries. The main feature of the hand is its variable stiffness, which allows it to withstand collisions with the environment and also to adapt the passive stiffness to the object weight while relying on a modular design using off-the-shelf low-cost components. Due to the implementation of differentially coupled flexors, the hand can be actuated like an underactuated hand but can also be driven with different stiffness levels to planned grasp poses, i.e., it can serve for both model-based grasp planning and for underactuated or model-free grasping. The hand also includes self-checking and logging processes, which enable more robust performance during grasping actions. This paper presents key aspects of the hand design, examines the robustness of the hand in impact tests, and uses a standardized fruit benchmarking test to verify the behavior of the hand when different actuator and sensor failures occur and are compensated for autonomously by the hand.

In the immediate aftermath following a large-scale release of radioactive material into the environment, it is necessary to determine the spatial distribution of radioactivity quickly. At present, this is conducted by utilizing manned aircraft equipped with large-volume radiation detection systems. Whilst these are capable of mapping large areas quickly, they suffer from a low spatial resolution due to the operating altitude of the aircraft. They are also expensive to deploy and their manned nature means that the operators are still at risk of exposure to potentially harmful ionizing radiation. Previous studies have identified the feasibility of utilizing unmanned aerial systems (UASs) in monitoring radiation in post-disaster environments. However, the majority of these systems suffer from a limited range or are too heavy to be easily integrated into regulatory restrictions that exist on the deployment of UASs worldwide. This study presents a new radiation mapping UAS based on a lightweight (8 kg) fixed-wing unmanned aircraft and tests its suitability to mapping post-disaster radiation in the Chornobyl Exclusion Zone (CEZ). The system is capable of continuous flight for more than 1 h and can resolve small scale changes in dose-rate in high resolution (sub-20 m). It is envisaged that with some minor development, these systems could be utilized to map large areas of hazardous land without exposing a single operator to a harmful dose of ionizing radiation.
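Once the detector output is geotagged, the mapping step itself is straightforward: bin the dose-rate readings into grid cells at the target resolution and average within each cell. A minimal sketch, where the (x_m, y_m, dose) data format is my assumption and the 20 m cell size mirrors the stated sub-20 m resolution:

```python
from collections import defaultdict

def grid_dose_map(readings, cell=20.0):
    """Bin GPS-tagged dose-rate readings (x_m, y_m, dose) into square
    cells of side `cell` meters and average the dose rate per cell."""
    acc = defaultdict(list)
    for x, y, dose in readings:
        acc[(int(x // cell), int(y // cell))].append(dose)
    return {k: sum(v) / len(v) for k, v in acc.items()}
```

A real pipeline would also correct each reading for altitude and detector response before gridding, but the binning structure is the same.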

Birds have been doing their flying thing with flexible and feathery wings for about a hundred million years, give or take. And about a hundred years ago, give or take, humans decided that, although birds may be the flying experts, we’re just going to go off in our own direction with mostly rigid wings and propellers and stuff, because it’s easier or whatever. The few attempts at making artificial feathers that we’ve seen in the past have been sufficient for a few specific purposes but haven’t really come close to emulating the capabilities that real feathers bestow on the wings of birds. So a century later, we’re still doing the rigid wings with discrete flappy bits, while birds (one has to assume) continue to judge us for our poor choices.

In a paper published today in Science Robotics, researchers at Stanford University have presented some new work on understanding exactly how birds maintain control by morphing the shape of their wings. They put together a flying robot called PigeonBot with a pair of “biohybrid morphing wings” to test out new control principles, and instead of trying to develop some kind of fancy new artificial feather system, they did something that makes a lot more sense: They cheated, by just using real feathers instead. 

The reason why robots are an important part of this research (which otherwise seems like it would be avian biology) is because there’s no good way to use a real bird as a test platform. As far as I know, you can’t exactly ask a pigeon to try and turn just using some specific wing muscles, but you can definitely program a biohybrid robot to do that. However, most of the other bioinspired flying robots that we’ve seen have been some flavor of ornithopter (rigid flapping wings), or they’ve used stretchy membrane wings, like bats.

Image: Lentink Lab/Stanford University
By examining real feathers, the Stanford researchers discovered that adjacent feathers stick to each other to resist sliding in one direction only using micron-scale features that researchers describe as “directional Velcro.”

Feathers aren’t just complicated to manufacture: you also have to find some way of replicating and managing all of the complex feather-on-feather interactions that govern wing morphing in real birds. For example, by examining real feathers, the researchers discovered that adjacent feathers stick to each other to resist sliding in one direction only, using micron-scale features that the researchers describe as “directional Velcro,” something “new to science and technology.” Real feathers can slide to allow the wing to morph, but past a certain point, the directional Velcro engages to keep gaps from developing in the wing surface. There are additional practical advantages, too: “they are softer, lighter, more robust, and easier to get back into shape after a crash by simply preening ruffled feathers between one’s fingers.”

With the real feathers elastically connected to a pair of robotic bird wings with wrist and finger joints that can be actuated individually, PigeonBot relies on its biohybrid systems for maneuvering, while thrust and a bit of additional stabilizing control comes from a propeller and a conventional tail. The researchers found that PigeonBot’s roll could be controlled with just the movement of the finger joint on the wing, and that this technique is inherently much more stable than the aileron roll used by conventional aircraft, as corresponding author David Lentink, head of Stanford's Bio-Inspired Research & Design (BIRD) Lab, describes:

The other cool thing we found is that the morphing wing asymmetry results automatically in a steady roll angle. In contrast aircraft aileron left-right asymmetry results in a roll rate, which the pilot or autopilot then has to stop to achieve a steady roll angle. Controlling a banked turn via roll angle is much simpler than via roll rate. We think it may enable birds to fly more stably in turbulence, because wing asymmetry corresponds to an equilibrium angle that the wings automatically converge to. If you are flying in turbulence and have to control the robot or airplane attitude via roll rate in response to many stochastic perturbations, roll angle has to be actively adjusted continuously without any helpful passive dynamics of the wing. Although this finding requires more research and testing, it shows how aerospace engineers can find inspiration to think outside of the box by studying how birds fly. 
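Lentink’s distinction between roll angle and roll rate can be made concrete with a toy first-order model: wing asymmetry acts like a restoring torque toward an equilibrium bank angle, while an aileron deflection commands a roll rate that keeps growing the bank angle until it is actively zeroed. The dynamics and constants below are illustrative assumptions of mine, not the paper’s model:

```python
def simulate(deriv, phi0=0.0, dt=0.01, steps=2000):
    # Forward-Euler integration of a scalar roll-angle model.
    phi = phi0
    for _ in range(steps):
        phi += deriv(phi) * dt
    return phi

# Morphing-wing model: asymmetry sets an equilibrium bank angle
# phi_eq (degrees) that the wing passively converges to.
morphing = lambda phi, phi_eq=20.0, k=1.0: k * (phi_eq - phi)

# Aileron model: deflection commands a roll *rate* (deg/s), so the
# bank angle grows without bound until the deflection is removed.
aileron = lambda phi, p=5.0: p
```

The morphing model settles at 20 degrees on its own; the aileron model just keeps rolling, which is why the pilot or autopilot has to close the loop actively.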

The researchers suggest that the directional Velcro technology is one of the more important results of this study, and while they’re not pursuing any of the numerous potential applications, they’ve “decided to not patent this finding to help proliferate our discovery to the benefit of society at large” in the hopes that anyone who makes a huge pile of money off of it will (among other things) invest in bird conservation in gratitude.

Image: Lentink Lab/Stanford University
With the real feathers elastically connected to a pair of robotic bird wings with wrist and finger joints that can be actuated individually, PigeonBot relies on its biohybrid systems for maneuvering.

As for PigeonBot itself, Lentink says he’d like to add a biohybrid morphing tail, as well as legs with grasping feet, and additional actuators for wing folding and twisting and flapping. And maybe make it fly autonomously, too. Sounds good to me—that kind of robot would be great at data transfer.

[ Science Robotics ]

Self-folding technologies have been studied by many researchers for applications in various engineering fields. Most self-folding methods that rely on the physical properties of materials require complex preparation and usually take time to complete. To solve these problems, we focus on the elasticity of a material and propose a model for forming a 3D structure using this property. Our proposed model achieves high-speed, high-precision self-folding with a simple structure, by attaching rigid frames to a stretchable elastomer. The self-folded structure is used to create a self-assembled actuator by exploiting a dielectric elastomer actuator (DEA). We develop the self-assembled actuator, driven by an applied voltage, by attaching stretchable electrodes to both sides of the elastomer. We conduct several experiments to investigate the basic characteristics of the actuator. Based on the experimental results, we also propose an application of the self-assembled actuator as a gripper. The gripper has three joints with an angle of 120°, and successfully grabs objects by switching the voltage.

At CES 2017, I got my mind blown by a giant mystery box from a company called AxonVR that was able to generate astonishingly convincing tactile sensations of things like tiny palm-sized animals running around on my palm in VR. An update in late 2017 traded the giant mystery box (and the ability to reproduce heat and cold) for a wearable glove with high resolution microfluidic haptics embedded inside of it. By itself, the HaptX system is constrained to virtual reality, but when combined with a pair of Universal Robots UR10 arms, Shadow dexterous robotic hands, and SynTouch tactile sensors, you end up with a system that can reproduce physical reality instead.

The demo at CES is pretty much the same thing that you may have seen video of Jeff Bezos trying at Amazon’s re:MARS conference. The heart of the system is the pair of haptic gloves, which are equipped with spatial position and orientation sensors as well as finger-location sensors. The movements that you make with your hands and fingers are mapped to the Shadow hands, while the UR10 arms try to match the relative position of the hands in space to your own. Going the other way, there’s a more or less 1-to-1 mapping between what the robot hands feel, and the forces that are transmitted into the fingers of the gloves.
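The two directions of that mapping can be sketched as a pair of small functions: a relative-position map from the operator’s hand into the robot’s workspace, and a clamped force reflection from the robot fingertips back into the glove. Everything here (origins, gains, limits) is an illustrative assumption of mine, not HaptX’s or Shadow’s actual API:

```python
def map_pose(hand_pos, operator_origin, robot_origin, scale=1.0):
    """Map the operator's tracked hand position into the robot arm's
    workspace: the arms follow the *relative* motion of the hands."""
    return tuple(r + scale * (h - o)
                 for h, o, r in zip(hand_pos, operator_origin, robot_origin))

def feedback_force(sensed, gain=1.0, max_force=10.0):
    """Roughly 1-to-1 force reflection from a robot fingertip back into
    the glove, clamped to a safe maximum for the operator."""
    return max(-max_force, min(max_force, gain * sensed))
```

The clamp on the feedback path is the kind of safety limit that keeps a hard collision on the robot side from becoming a painful jolt on the operator side.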

It’s not a perfect system quite yet—sensors get occluded or otherwise out of whack on occasion, and you have to use a foot pedal as a sort of pause button on the robots while you reposition your limbs in a way that’s easier for the system to interpret. And the feel of the force transmission takes some getting used to. I want to say that it could be more finely calibrated, but much of that feeling is likely on my end, since we’re told that the system gets much easier to control with practice.

Photo: Andrew Mitrak/HaptX
Evan uses the telerobotic hands to operate a mockup of a control panel used to shut down a nuclear reactor. He had to turn a valve, flip some switches, twist a knob, and then push a button. Meltdown averted!

Even as a brand new user, it was surprising how capable my remote controlled hands were. I had no trouble grabbing plastic cups and transferring a ball between them, although I had to take care not to accidentally crush the cups (which would trap the ball inside). At first, it was easy to consider the force feedback as more of a gimmick, but once I started to relax and pay attention to it, it provided useful information that made me more effective at the task I was working on.

After playing around with things a bit more (and perhaps proving myself not to be totally incompetent), I was given the second most challenging scenario—a simple mockup of a control panel used to shut down a nuclear reactor. I had to turn a valve, flip some switches, twist a knob, and then push a button, all of which required a variety of different grasps, motions, and forces. It was a bit fiddly, but I got it all, and what I found most impressive was that I was able to manipulate things even when I couldn’t see them—in this case, because one of the arms was blocking my view. I’m not sure that would have been possible without the integrated haptic system.

Image: Converge Robotics Group
The three companies involved in the project (Shadow Robot Company, HaptX, and Tangible Research) have formed the Converge Robotics Group to commercialize the system.

The news from CES is that the three companies involved in this project (Shadow Robot Company, HaptX, and Tangible Research) have formed a sort of consortium-thing called Converge Robotics Group. Basically, the idea is to create a framework under which the tactile telerobot can be further developed and sold, because otherwise, it’s not at all clear who you’d even throw money at if you wanted to buy one. 

Speaking of buying one, this system is “now available for purchase by early access customers.” As for what it might cost, well… It’ll be a lot. There isn’t a specific number attached to the system yet, but with two UR10 arms and a pair of Shadow hands, we’re looking at low six figures just in that portion of the hardware. Add in the HaptX gloves and whatever margin you need to keep your engineers fed, and it’s safe to say that this isn’t going to end up in your living room in the near future, no matter how cool that would be.

[ Converge Robotics Group ]

At CES 2017, I got my mind blown by a giant mystery box from a company called AxonVR that was able to generate astonishingly convincing tactile sensations of things like tiny palm-sized animals running around on my palm in VR. An update in late 2017 traded the giant mystery box (and the ability to reproduce heat and cold) for a wearable glove with high resolution microfluidic haptics embedded inside of it. By itself, the HaptX system is constrained to virtual reality, but when combined with a pair of Universal Robotics UR10 arms, Shadow dexterous robotic hands, and SynTouch tactile sensors, you end up with a system that can reproduce physical reality instead.

The demo at CES is pretty much the same thing that you may have seen video of Jeff Bezos trying at Amazon’s re:MARS conference. The heart of the system are the haptic gloves, which are equipped with spatial position and orientation sensors as well as finger location sensors. The movements that you make with your hands and fingers are mapped to the Shadow hands, while the UR10 arms try to match the relative position of the hands in space to your own. Going the other way, there’s a more or less 1-to-1 mapping between what the robot hands feel, and the forces that are transmitted into the fingers of the gloves.

It’s not a perfect system quite yet—sensors get occluded or otherwise out of whack on occasion, and you have to use a foot pedal as a sort of pause button on the robots while you reposition your limbs in a way that’s easier for the system to interpret. And the feel of the force transmission takes some getting used to. I want to say that it could be more finely calibrated, but much of that feeling is likely on my end, since we’re told that the system gets much easier to control with practice.

Photo: Andrew Mitrak/HaptX
Evan uses the telerobotic hands to operate a mockup of a control panel used to shut down a nuclear reactor. He had to turn a valve, flip some switches, twist a knob, and then push a button. Meltdown averted!

Even as a brand-new user, I was surprised by how capable my remote-controlled hands were. I had no trouble grabbing plastic cups and transferring a ball between them, although I had to take care not to accidentally crush the cups (which would trap the ball inside). At first, it was easy to dismiss the force feedback as a gimmick, but once I started to relax and pay attention to it, it provided useful information that made me more effective at the task I was working on.

After playing around with things a bit more (and perhaps proving myself not to be totally incompetent), I was given the second most challenging scenario—a simple mockup of a control panel used to shut down a nuclear reactor. I had to turn a valve, flip some switches, twist a knob, and then push a button, all of which required a variety of different grasps, motions, and forces. It was a bit fiddly, but I got it all, and what I found most impressive was that I was able to manipulate things even when I couldn’t see them—in this case, because one of the arms was blocking my view. I’m not sure that would have been possible without the integrated haptic system.

Image: Converge Robotics Group
The three companies involved in the project (Shadow Robot Company, HaptX, and Tangible Research) have formed the Converge Robotics Group to commercialize the system.

The news from CES is that the three companies involved in this project (Shadow Robot Company, HaptX, and Tangible Research) have formed a sort of consortium-thing called Converge Robotics Group. Basically, the idea is to create a framework under which the tactile telerobot can be further developed and sold, because otherwise, it’s not at all clear who you’d even throw money at if you wanted to buy one. 

Speaking of buying one, this system is “now available for purchase by early access customers.” As for what it might cost, well… It’ll be a lot. There isn’t a specific number attached to the system yet, but with two UR10 arms and a pair of Shadow hands, we’re looking at low six figures for that portion of the hardware alone. Add in the HaptX gloves and whatever margin you need to keep your engineers fed, and it’s safe to say that this isn’t going to end up in your living room in the near future, no matter how cool that would be.

[ Converge Robotics Group ]

Engineers at Purdue University and at Georgia Tech have constructed the first devices from a new kind of two-dimensional material that combines memory-retaining properties and semiconductor properties. The engineers used a newly discovered ferroelectric semiconductor, alpha indium selenide, in two applications: as the basis of a type of transistor that stores memory as the amount of amplification it produces; and in a two-terminal device that could act as a component in future brain-inspired computers. The latter device was unveiled last month at the IEEE International Electron Devices Meeting in San Francisco.

Ferroelectric materials become polarized in an electric field and retain that polarization even after the field has been removed. Ferroelectric RAM cells in commercial memory chips use the former ability to store data in a capacitor-like structure. Recently, researchers have been trying to coax more tricks from these ferroelectric materials by bringing them into the transistor structure itself or by building other types of devices from them.

In particular, they’ve been embedding ferroelectric materials into a transistor’s gate dielectric, the thin layer that separates the electrode responsible for turning the transistor on and off from the channel through which current flows. Researchers have also been seeking a ferroelectric equivalent of memristors, or resistive RAM: two-terminal devices that store data as resistance. Such devices, called ferroelectric tunnel junctions, are particularly attractive because they could be made into a very dense memory configuration called a cross-bar array. Many researchers working on neuromorphic and low-power AI chips use memristors to act as the neural synapses in their networks. But so far, ferroelectric tunnel junction memories have been a problem.

“It’s very difficult to do,” says IEEE Fellow Peide Ye, who led the research at Purdue University. Because traditional ferroelectric materials are insulators, when the device is scaled down, there’s too little current passing through, explains Ye. When researchers try to solve that problem by making the ferroelectric layer very thin, the layer loses its ferroelectric properties.  

Instead, Ye’s group sought to solve the conductance problem by using a new ferroelectric material—alpha indium selenide—that acts as a semiconductor instead of an insulator. Under the influence of an electric field, the molecule undergoes a structural change that holds the polarization. Even better, the material is ferroelectric even as a single-molecule layer that is only about a nanometer thick. “This material is very unique,” says Ye.

Ye’s group made both transistors and memristor-like devices using the semiconductor. The memristor-like device, which they called a ferroelectric-semiconductor junction (FSJ), is just the semiconductor sandwiched between two conductors. This simple configuration could be formed into a dense cross-bar array and potentially shrunk down so that each device is only about 10 nanometers across, says Ye.
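The appeal of a dense cross-bar array for the neuromorphic applications mentioned above is that stored device conductances compute a vector-matrix product in a single analog step: applying row voltages yields column currents that sum the products via Ohm’s and Kirchhoff’s laws. A minimal numerical sketch of that idea (the values are illustrative, not measured FSJ data):

```python
# Ideal cross-bar array read-out: with conductance G_ij stored at each
# row-column crossing, row voltages V_i produce column currents
# I_j = sum_i V_i * G_ij, i.e., an analog vector-matrix multiply.
# Illustrative values only, not measured device data.

def crossbar_currents(voltages, conductances):
    """Column read-out currents of an ideal (lossless) cross-bar array."""
    n_cols = len(conductances[0])
    return [sum(v * row[j] for v, row in zip(voltages, conductances))
            for j in range(n_cols)]

V = [1.0, 0.5]            # input voltages on the rows (volts)
G = [[0.1, 0.2],          # conductance at each crossing (siemens)
     [0.3, 0.4]]
I = crossbar_currents(V, G)   # one "synaptic" weighted sum per column
```

This is exactly the multiply-accumulate operation at the heart of a neural network layer, which is why a two-terminal device that both conducts and remembers is so attractive as an artificial synapse.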

Proving the ability to scale the device down is the next goal for the research, along with characterizing how quickly the devices can switch, explains Ye. Further on, his team will look at applications for the FSJ in neuromorphic chips, where researchers have been trying a variety of new devices in the search for the perfect artificial neural synapse.

This paper describes a new method that enables a service robot to understand spoken commands robustly, using off-the-shelf automatic speech recognition (ASR) systems and an encoder-decoder neural network with noise injection. In service robotics, the understanding of spoken commands is commonly modeled as a mapping from speech signals to a sequence of commands that can be understood and performed by a robot. In a conventional approach, speech signals are recognized, and semantic parsing is applied to infer the command sequence from the utterance. However, if errors occur during speech recognition, a conventional semantic parsing method cannot be appropriately applied, because most natural language processing methods do not account for such errors. We propose the use of encoder-decoder neural networks, e.g., sequence-to-sequence models, with noise injection: noise is injected into phoneme sequences during the training phase of the encoder-decoder-based semantic parsing system. We demonstrate that noise injection can mitigate the negative effects of speech recognition errors on the understanding of robot-directed speech commands, i.e., increase the performance of semantic parsing. We implemented the method and evaluated it using commands given during a general purpose service robot (GPSR) task, such as those used in RoboCup@Home, a standard competition for testing service robots. The experimental results show that the proposed method, sequence-to-sequence with noise injection (Seq2Seq-NI), outperforms the baseline methods. In addition, Seq2Seq-NI enables a robot to understand a spoken command even when the output of an off-the-shelf ASR system contains recognition errors. Moreover, we describe an experiment conducted to evaluate the influence of the injected noise and provide a discussion of the results.
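The core training trick described in the abstract, corrupting phoneme sequences so the parser learns to tolerate ASR-style errors, can be sketched as follows. The noise model and rates here are assumptions for illustration, not the paper’s actual Seq2Seq-NI implementation:

```python
import random

# Toy sketch of phoneme-level noise injection: randomly substitute,
# delete, and insert phonemes in the encoder input during training.
# Error types and rates are hypothetical, not the paper's values.

def inject_noise(phonemes, vocab, p_sub=0.05, p_del=0.03, p_ins=0.03,
                 rng=random):
    noisy = []
    for ph in phonemes:
        r = rng.random()
        if r < p_del:
            continue                          # deletion error
        elif r < p_del + p_sub:
            noisy.append(rng.choice(vocab))   # substitution error
        else:
            noisy.append(ph)                  # kept unchanged
        if rng.random() < p_ins:
            noisy.append(rng.choice(vocab))   # insertion error
    return noisy

# Training pairs become (inject_noise(phonemes, vocab), commands); the
# clean command sequence remains the decoder target.
clean = ["t", "er", "n", "l", "eh", "f", "t"]     # e.g., "turn left"
noisy = inject_noise(clean, vocab=["t", "n", "eh", "aa"])
```

Because the decoder target stays clean while the encoder input is corrupted, the network is pushed to map error-laden phoneme sequences to the intended command, which is the robustness the paper reports.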

Percutaneous nephrolithotomy (PCNL) is the standard surgical procedure used to remove large kidney stones. PCNL procedures have a steep learning curve; a physician needs to complete between 36 and 60 procedures to achieve clinical proficiency. The Marion Surgical K181 is a virtual reality surgical simulator that emulates PCNL procedures without compromising the well-being of patients. The simulator uses a VR headset to place the user in a realistic and immersive operating theater, and haptic force-feedback robots to render physical interactions between surgical tools and the virtual patient. The simulator has two modules covering two different aspects of the PCNL kidney stone removal procedure: a kidney access module, in which the user must insert a needle into the patient’s kidney, and a kidney stone removal module, in which the user removes the individual stones from the organ. In this paper, we present user trials to validate the face and construct validity of the simulator. The results, based on data gathered independently from four groups of users, indicate that Marion’s surgical simulator is a useful tool for teaching and practicing PCNL procedures. The kidney stone removal module demonstrated construct validity by distinguishing the skill levels of different users based on their tool paths. We plan to continue evaluating the simulator with a larger sample of users to reinforce our findings.

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA HRI 2020 – March 23-26, 2020 – Cambridge, U.K. ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores ICRA 2020 – May 31-June 4, 2020 – Paris, France

Let us know if you have suggestions for next week, and enjoy today’s videos.

Apparently the whole “little home robot with a cute personality will seamlessly improve your life” thing is still going on, at least as far as Samsung is concerned.

Predictably, there’s no information on how much Ballie costs, when it might be available, what’s inside of it, and whether it can survive a shaggy carpet.

[ Samsung ]

In case you had the good sense to be somewhere besides CES this week, here’s the full demo of Agility Robotics’ Digit at the Ford booth.

Classy!

Because of the nightmarish Wi-Fi environment in the convention center, Digit is steered manually, but the box interaction is autonomous.

[ Agility Robotics ]

Stefano Mintchev from EPFL and his startup Foldaway Haptics are the folks behind the 33 individually actuated “bionic flaps” on the new Mercedes-Benz Vision AVTR concept car that was at CES this week.

The underlying technology, which is based on origami structures, can be used in a variety of other applications, like this robotic billboard:

[ Foldaway Haptics ] via [ Mercedes ]

Thanks Stefano!

The Sarcos Guardian XO alpha version is looking way more polished than the pre-alpha prototype that we saw late last year.

And Sarcos tells us that it’s now even more efficient, although despite my begging, they won’t tell us exactly how they’ve managed that.

[ Sarcos ]

It is our belief that in 5 years’ time, not one day will go by without most of us interacting with a robot. Reachy is the only humanoid service robot that is open source and can manipulate objects. He mimics human expressions and body language, with a cute free-moving head and antennas as well as bio-inspired arms. Reachy is the optimum platform to create real-world interactive & service applications right away.

[ Pollen Robotics ]

Ritsumeikan Humanoid System Laboratory is working on a promising hydraulic humanoid:

[ Ritsumeikan HSL ]

With the steep rise of automation and robotics across industries, the requirements for robotic grippers are becoming increasingly demanding. Using acoustic levitation forces, No-Touch Robotics develops damage- and contamination-free contactless robotic grippers for handling highly fragile objects. Such grippers can be beneficially applied in micro-assembly and the semiconductor industry, resulting in increased production yield, reduced waste, and high production quality by completely eliminating damage inflicted during handling.

You can also experience the magic by building your own acoustic levitator.

[ ETHZ ]

Preview of the Unitree A1. Maximum torque of each joint: 35.5 Nm. Weight (with battery): 12 kg. Price: less than $10k.

Under $10k? I’m going to start saving up!

[ Unitree ]

A team from the Micro Aerial Vehicle Lab (MAVLab) of TU Delft has won the 2019 Artificial Intelligence Robotic Racing (AIRR) Circuit, with a final breathtaking victory in the World Championship Race held in Austin, Texas, last December. The team takes home the $1 million grand prize, sponsored by Lockheed Martin, for creating the fastest and most reliable self-piloting aircraft this season.

[ MAVLab ]

After 10 years and 57 robots, hinamitetu brings you a few more.

[ Hinamitetu ]

Vision 60 legged robot managing unstructured terrain without vision or force sensors in its legs.

[ Ghost Robotics ]

In 2019, GRVC had one of its best years yet, with the latest developments of the GRIFFIN ERC Advanced Grant, the kick-off meeting of the H2020 AERIAL-CORE project, and other projects.

[ GRVC ]

The official wrap-up of ABU Robocon 2019, held in Ulaanbaatar, Mongolia.

[ RoboCon 2019 ]

Roboy had a busy 2019:

[ Roboy ]

Very interesting talk from IHMC’s Jerry Pratt, at the Workshop on Teleoperation of Humanoid Robots at Humanoids 2019.

[ Workshop ]

Minimally Invasive Surgery (MIS) imposes a trade-off between non-invasive access and surgical capability. Treatment of early gastric cancers over 20 mm in diameter can be achieved by performing Endoscopic Submucosal Dissection (ESD) with a flexible endoscope; however, this procedure is technically challenging, suffers from extended operation times, and requires extensive training. To facilitate the ESD procedure, we have created a deployable cable-driven robot that increases the surgical capabilities of the flexible endoscope while attempting to minimize the impact on the access it offers. Using a low-profile inflatable support structure in the shape of a hollow hexagonal prism, our robot can fold around the flexible endoscope and, when the target site has been reached, achieve a 73.16% increase in volume and increase its radial stiffness. A sheath around the variable-stiffness structure delivers a series of force transmission cables that connect to two independent tubular end-effectors, through which standard flexible endoscopic instruments can pass and be anchored. Using a simple control scheme based on the length of each cable, the pose of the two instruments can be controlled by haptic controllers in each hand of the user. The forces exerted by a single instrument were measured, with a maximum magnitude of 8.29 N observed along a single axis. The working channels and tip control of the flexible endoscope remain in use in conjunction with our robot, and a procedure imitating the demands of ESD was successfully carried out by a novice user. Not only does this robot facilitate difficult surgical techniques, but it can also be easily customized and rapidly produced at low cost due to a programmatic design approach.

In recent years, artificial intelligence (AI)/machine learning (ML; a subset of AI) have become increasingly important to the biomedical research community. These technologies, coupled to big data and cheminformatics, have tremendous potential to improve the design of novel therapeutics and to provide safe and effective drugs to patients. A National Center for Advancing Translational Sciences (NCATS) program called A Specialized Platform for Innovative Research Exploration (ASPIRE) leverages advances in AI/ML, automated synthetic chemistry, and high-throughput biology, and seeks to enable translation and drug development by catalyzing exploration of biologically active chemical space. Here we discuss the opportunities and challenges surrounding the application of AI/ML to the exploration of novel biologically relevant chemical space as part of ASPIRE.

Soft wearable robots could provide support for lower and upper limbs, increase weight-lifting ability, decrease the energy required for walking and running, and even provide haptic feedback. However, to date most wearable robots are based on electromagnetic motors or fluidic actuators: the former are rigid and bulky, while the latter require external pumps or compressors, greatly limiting integration and portability. Here we describe a new class of electrically driven soft fluidic muscles combining thin, fiber-like McKibben actuators with fully Stretchable Pumps. These pumps rely on electrohydrodynamics, a solid-state pumping mechanism that directly accelerates liquid molecules by means of an electric field. Requiring no moving parts, these pumps are silent and can be bent and stretched while operating. Each electrically driven fluidic muscle consists of one Stretchable Pump and one thin McKibben actuator, resulting in a slender soft device weighing 2 g. We characterized the response of these devices, obtaining a blocked force of 0.84 N and a maximum stroke of 4 mm. Future work will focus on decreasing the response time and increasing the energy efficiency. Modular and straightforward to integrate into textiles, these electrically driven fluidic muscles will enable soft smart clothing with multi-functional capabilities for human assistance and augmentation.

Acknowledging the benefits of active learning and the importance of collaboration skills, the higher education system has started to transform toward the utilization of group activities in lecture hall culture. In this study, a novel interaction was introduced, wherein a social robot facilitated a small collaborative group activity of students in higher education. Thirty-six students completed a 3 h activity that covered the main content of a course in Human Computer Interaction. In this within-subject study, the students worked in groups of four on three activities, moving between three conditions: instructor facilitation, with pen and paper used for the activity; tablet facilitation, with tablets also used for the activity; and robot facilitation, with tablets used for the activity. The robot facilitated the activity by introducing the different tasks, ensuring proper time management, and encouraging discussion among the students. This study examined the effects of facilitation type on attitudes toward the activity facilitation, the group activity, and the robot, using quantitative and qualitative measures. Overall, students perceived the robot positively, as friendly and responsive, even though the robot did not directly respond to the students' verbal communications. While most survey items did not convey significant differences between the robot, tablet, or instructor, we found significant correlations between perceptions of the robot and attitudes toward the activity facilitation and the group activity. Qualitative data revealed the drawbacks and benefits of the robot, as well as its perceived advantages over a human facilitator, such as better time management, objectivity, and efficiency.
These results suggest that the robot's complementary characteristics enable a higher quality learning environment, that corresponds with students' requirements and that a Robot Supportive Collaborative Learning (RSCL) is a promising novel paradigm for higher education.

Dielectric elastomers (DEs) are highly compliant electrostatic transducers that can be operated as actuators, by converting an applied high voltage into motion, and as sensors, since capacitive changes can be related to displacement information. Due to their large achievable deformation (on the order of 100%) and high flexibility, DEs appear highly suitable for the design of soft robotic systems. An important requirement for robotic systems is the possibility of generating multi-degree-of-freedom (MDOF) actuation. With DE technology, controllable motion along several directions can be made possible by combining different membrane actuators in protagonist-antagonist configurations, as well as by designing electrode patterns that allow independent activation of different sections of a single membrane. However, although several concepts for DE soft robots have been presented in the recent literature, to date there is still a lack of systematic studies targeted at optimizing the design of the system. To properly understand how different parameters influence the complex motion of DE soft robots, this paper presents an experimental study of how geometry scaling affects the performance of a specific MDOF actuator configuration. The system under investigation consists of two cone DE membranes rigidly connected along the outer diameter, and pre-compressed out-of-plane against each other via a rigid spacer. The electrodes of both membranes are partitioned into four sections that can be activated separately, thus allowing the desired MDOF actuation. Different prototypes are assembled and tested to study the influence of the inner radius as well as the length of the rigid spacer on the achievable motion range. For the first experimental study presented here, we focus our analysis on a single actuation variable, i.e., the rotation of the rigid spacer about a fixed axis.
A physics-based model is then developed and validated based on the collected experimental measurements. A model-based investigation is subsequently performed, with the aim of studying the influence of the regarded parameters on the rotation angle. Finally, based on the results of the performed study, a model-based optimization of the prototype geometry is performed.


We’ve been writing about robots here at IEEE Spectrum for a long, long time. Erico started covering robotics for Spectrum in 2007, about the same time that Evan started BotJunkie.com. We joined forces in 2011, and have published thousands of articles since then, chronicling as many aspects of the field as we could. Autonomous cars, humanoids, legged robots, drones, robots in space—the last decade in robotics has been incredible.

To kick off 2020, we’re taking a look back at our most popular posts of the last 10 years. In order, listed below are the 100 articles with the highest total page views, providing a cross-section of what was the most interesting in robotics from 2010 until now.

Also, sometime in the next several weeks, we plan to post a selection of our favorite stories, focusing on what we think were the biggest developments in robotics of the past decade (including a few things that, surprisingly, did not make the list below). If you have suggestions of important robot stories we should include, let us know. Thank you for reading!

#1 How Google’s Self-Driving Car Works

Google engineers explain the technology behind their self-driving car and show videos of road tests

By Erico Guizzo
Posted 18 Oct 2011

#2 This Robot Can Do More Push-Ups Because It Sweats

A robot that uses artificial sweat can cool its motors without bulky radiators

By Evan Ackerman
Posted 13 Oct 2016

#3 Meet Geminoid F, a Smiling Female Android

Geminoid F displays facial expressions more naturally than previous androids

By Erico Guizzo
Posted 3 Apr 2010

#4 Latest Geminoid Is Incredibly Realistic

Geminoid DK is a realistic android nearly indistinguishable from a real human

By Evan Ackerman
Posted 5 Mar 2011

#5 The Next Generation of Boston Dynamics’ ATLAS Robot Is Quiet, Robust, and Tether Free

The latest ATLAS is by far the most advanced humanoid robot in existence

By Evan Ackerman & Erico Guizzo
Posted 23 Feb 2016

#6 The Uncanny Valley: The Original Essay by Masahiro Mori

“The Uncanny Valley” by Masahiro Mori is an influential essay in robotics. This is the first English translation authorized by Mori.

By Masahiro Mori
Posted 12 Jun 2012

#7 NASA JSC Unveils Valkyrie DRC Robot

NASA’s DARPA Robotics Challenge entry is much more than Robonaut with legs: it’s a completely new humanoid robot

By Evan Ackerman
Posted 10 Dec 2013

#8 Origami Robot Folds Itself Up, Does Cool Stuff, Dissolves Into Nothing

Tiny self-folding magnetically actuated robot creates itself when you want it, disappears when you don’t

By Evan Ackerman
Posted 28 May 2015

#9 Robots Bring Couple Together, Engagement Ensues

Yes, you really can find love at an IEEE conference

By Evan Ackerman & Erico Guizzo
Posted 31 Mar 2014

#10 Facebook AI Director Yann LeCun on His Quest to Unleash Deep Learning and Make Machines Smarter

The Deep Learning expert explains how convolutional nets work, why Facebook needs AI, what he dislikes about the Singularity, and more

By Lee Gomes
Posted 18 Feb 2015

#11 This Is the Most Amazing Biomimetic Anthropomorphic Robot Hand We’ve Ever Seen

Luke Skywalker, your new robotic hand is ready

By Evan Ackerman
Posted 17 Feb 2016

#12 Dutch Police Training Eagles to Take Down Drones

Attack eagles are training to become part of the Dutch National Police anti-drone arsenal

By Evan Ackerman
Posted 1 Feb 2016

#13 You (YOU!) Can Take Stanford’s ‘Intro to AI’ Course Next Quarter, For Free

Sebastian Thrun and Peter Norvig are offering Stanford’s "Introduction to Artificial Intelligence" course online, for free, grades and all

By Evan Ackerman
Posted 4 Aug 2011

#14 Robot Hand Beats You at Rock, Paper, Scissors 100% Of The Time

Watch this high-speed robot hand cheat at rock, paper, scissors

By Evan Ackerman
Posted 26 Jun 2012

#15 You’ve Never Seen a Robot Drive System Like This Before

Using just a single spinning hemisphere mounted on a gimbal, this robot demonstrates some incredible agility

By Evan Ackerman
Posted 7 Jul 2011

#16 Fukushima Robot Operator Writes Tell-All Blog

An anonymous worker at Japan’s Fukushima Dai-ichi nuclear power plant has written dozens of blog posts describing his experience as a lead robot operator at the crippled facility

By Erico Guizzo
Posted 23 Aug 2011

#17 Should Quadrotors All Look Like This?

Researchers say we’ve been designing quadrotors the wrong way

By Evan Ackerman
Posted 13 Nov 2013

#18 Boston Dynamics’ PETMAN Humanoid Robot Walks and Does Push-Ups

Boston Dynamics releases stunning video showing off its most advanced humanoid robot

By Erico Guizzo
Posted 31 Oct 2011

#19 Boston Dynamics’ Spot Robot Dog Goes on Sale

Here’s everything we know about Boston Dynamics’ first commercial robot

By Erico Guizzo
Posted 24 Sep 2019

#20 Agility Robotics Introduces Cassie, a Dynamic and Talented Robot Delivery Ostrich

One day, robots like these will be scampering up your steps to drop off packages

By Evan Ackerman
Posted 9 Feb 2017

#21 Superfast Scanner Lets You Digitize Book By Flipping Pages

Tokyo University researchers develop scanner that can capture 200 pages in one minute

By Erico Guizzo
Posted 17 Mar 2010

#22 A Robot That Balances on a Ball

Masaaki Kumagai has built wheeled robots, crawling robots, and legged robots. Now he’s built a robot that rides on a ball

By Erico Guizzo
Posted 29 Apr 2010

#23 Top 10 Robotic Kinect Hacks

Microsoft’s Kinect 3D motion detector has been hacked into lots of awesome robots, and here are our 10 favorites

By Evan Ackerman
Posted 7 Mar 2011

#24 Latest AlphaDog Robot Prototypes Get Less Noisy, More Brainy

New video shows Boston Dynamics and DARPA putting AlphaDog through its paces

By Evan Ackerman
Posted 11 Sep 2012

#25 How South Korea’s DRC-HUBO Robot Won the DARPA Robotics Challenge

This transformer robot took first place because it was fast, adaptable, and didn’t fall down

By Erico Guizzo & Evan Ackerman
Posted 9 Jun 2015

#26 U.S. Army Considers Replacing Thousands of Soldiers With Robots

The U.S. Army could slash personnel numbers and toss in more robots instead

By Evan Ackerman
Posted 22 Jan 2014

#27 Google Acquires Seven Robot Companies, Wants Big Role in Robotics

The company is funding a major new robotics group and acquiring a bunch of robot startups

By Evan Ackerman & Erico Guizzo
Posted 4 Dec 2013

#28 Who Is SCHAFT, the Robot Company Bought by Google and Winner of the DRC?

Here’s everything we know about this secretive robotics startup

By Erico Guizzo & Evan Ackerman
Posted 6 Feb 2014

#29 Ground-Effect Robot Could Be Key To Future High-Speed Trains

Trains that levitate on cushions of air could be the future of fast and efficient travel, if this robot can figure out how to keep them stable

By Evan Ackerman
Posted 10 May 2011

#30 Hobby Robot Rides a Bike the Old-Fashioned Way

I don’t know where this little robot got its awesome bicycle, but it sure knows how to ride

By Evan Ackerman
Posted 24 Oct 2011

#31 SRI Demonstrates Abacus, the First New Rotary Transmission Design in 50 Years

Finally a gear system that could replace costly harmonic drives

By Evan Ackerman
Posted 19 Oct 2016

#32 Robotic Micro-Scallops Can Swim Through Your Eyeballs

Tiny robots modeled on scallops are able to swim through all the fluids in your body

By Evan Ackerman
Posted 4 Nov 2014

#33 Boston Dynamics Officially Unveils Its Wheel-Leg Robot: "Best of Both Worlds"

Handle is a humanoid robot on wheels, and it’s amazing

By Erico Guizzo & Evan Ackerman
Posted 27 Feb 2017

#34 iRobot Brings Visual Mapping and Navigation to the Roomba 980

The new robot vacuum uses VSLAM to navigate and clean larger spaces in satisfyingly straight lines

By Evan Ackerman & Erico Guizzo
Posted 16 Sep 2015

#35 When Will We Have Robots To Help With Household Chores?

Google, Microsoft, and Apple are investing in robots. Does that mean home robots are on the way?

By Satyandra K. Gupta
Posted 2 Jan 2014

#36 Robots Playing Ping Pong: What’s Real, and What’s Not?

Kuka’s robot vs. human ping pong match looks to be more hype than reality

By Evan Ackerman
Posted 12 Mar 2014

#37 BigDog Throws Cinder Blocks with Huge Robotic Face-Arm

I don’t know why BigDog needs a fifth limb to throw cinder blocks, but it’s incredibly awesome

By Evan Ackerman
Posted 28 Feb 2013

#38 Children Beating Up Robot Inspires New Escape Maneuver System

Japanese researchers show that children can act like horrible little brats towards robots

By Kate Darling
Posted 6 Aug 2015

#39 Boston Dynamics’ AlphaDog Quadruped Robot Prototype on Video

Boston Dynamics has just released some absolutely incredible video of their huge new quadruped robot, AlphaDog

By Evan Ackerman
Posted 30 Sep 2011

#40 Building a Super Robust Robot Hand

Researchers have built an anthropomorphic robot hand that can endure even strikes from a hammer without breaking into pieces

By Erico Guizzo
Posted 25 Jan 2011

#41 Who’s Afraid of the Uncanny Valley?

To design the androids of the future, we shouldn’t fear exploring the depths of the uncanny valley

By Erico Guizzo
Posted 2 Apr 2010

#42 Why AlphaGo Is Not AI

Google DeepMind’s artificial intelligence AlphaGo is a big advance but it will not get us to strong AI

By Jean-Christophe Baillie
Posted 17 Mar 2016

#43 Freaky Boneless Robot Walks on Soft Legs

This soft, inflatable, and totally creepy robot from Harvard can get up and walk on four squishy legs

By Evan Ackerman
Posted 29 Nov 2011

#44 Sweep Is a $250 LIDAR With Range of 40 Meters That Works Outdoors

Finally an affordable LIDAR for robots and drones

By Evan Ackerman
Posted 6 Apr 2016

#45 How Google Wants to Solve Robotic Grasping by Letting Robots Learn for Themselves

800,000 grasps is just the beginning for Google’s large-scale robotic grasping project

By Evan Ackerman
Posted 28 Mar 2016

#46 Whoa: Boston Dynamics Announces New WildCat Quadruped Robot

A new robot from Boston Dynamics can run outdoors, untethered, at up to 25 km/h

By Evan Ackerman
Posted 3 Oct 2013

#47 SCHAFT Unveils Awesome New Bipedal Robot at Japan Conference

SCHAFT demos a new bipedal robot designed to "help society"

By Evan Ackerman & Erico Guizzo
Posted 8 Apr 2016

#48 Riding Honda’s U3-X Unicycle of the Future

It only has one wheel, but Honda’s futuristic personal mobility device is no pedal-pusher

By Anne-Marie Corley
Posted 12 Apr 2010

#49 Lingodroid Robots Invent Their Own Spoken Language

These little robots make up their own words to tell each other where they are and where they want to go

By Evan Ackerman
Posted 17 May 2011

#50 Disney Robot With Air-Water Actuators Shows Off "Very Fluid" Motions

Meet Jimmy, a robot puppet powered by fluid actuators

By Erico Guizzo
Posted 1 Sep 2016

#51 Kilobots Are Cheap Enough to Swarm in the Thousands

What can you do with a $14 robot? Not much. What can you do with a thousand $14 robots? World domination

By Evan Ackerman
Posted 16 Jun 2011

#52 Honda Robotics Unveils Next-Generation ASIMO Robot

We heard some rumors that Honda was working on something big, and here it is: a brand new ASIMO

By Evan Ackerman
Posted 7 Nov 2011

#53 Cybernetic Third Arm Makes Drummers Even More Annoying

It keeps proper time and comes with an off switch, making this robotic third arm infinitely better than a human drummer

By Evan Ackerman
Posted 18 Feb 2016

#54 Chatbot Tries to Talk to Itself, Things Get Weird

"I am not a robot. I am a unicorn."

By Evan Ackerman
Posted 29 Aug 2011

#55 Dean Kamen’s "Luke Arm" Prosthesis Receives FDA Approval

This advanced bionic arm for amputees has been approved for commercialization

By Erico Guizzo
Posted 13 May 2014

#56 Meet the Amazing Robots That Will Compete in the DARPA Robotics Challenge

Over the next two years, robotics will be revolutionized, and here’s how it’s going to happen

By Evan Ackerman
Posted 24 Oct 2012

#57 ReWalk Robotics’ New Exoskeleton Lets Paraplegic Stroll the Streets of NYC

Yesterday, a paralyzed man strapped on a pair of robotic legs and stepped out of a hotel door in midtown Manhattan

By Eliza Strickland
Posted 15 Jul 2015

#58 Drone Uses AI and 11,500 Crashes to Learn How to Fly

Crashing into objects has taught this drone to fly autonomously, by learning what not to do

By Evan Ackerman
Posted 10 May 2017

#59 Lego Announces Mindstorms EV3, a More ’Hackable’ Robotics Kit

Lego’s latest Mindstorms kit has a new IR sensor, runs on Linux, and is compatible with Android and iOS apps

By Erico Guizzo & Stephen Cass
Posted 7 Jan 2013

#60 Boston Dynamics’ Marc Raibert on Next-Gen ATLAS: "A Huge Amount of Work"

The founder of Boston Dynamics describes how his team built one of the most advanced humanoids ever

By Erico Guizzo & Evan Ackerman
Posted 24 Feb 2016

#61 AR Drone That Infects Other Drones With Virus Wins DroneGames

Other projects included a leashed auto-tweeting drone, and code to control a swarm of drones all at once

By Evan Ackerman
Posted 6 Dec 2012

#62 DARPA Robotics Challenge: A Compilation of Robots Falling Down

Gravity is a bad thing for robots

By Erico Guizzo & Evan Ackerman
Posted 6 Jun 2015

#63 Bosch’s Giant Robot Can Punch Weeds to Death

A modular agricultural robot from Bosch startup Deepfield Robotics deals with weeds the old-fashioned way: violently

By Evan Ackerman
Posted 12 Nov 2015

#64 How to Make a Humanoid Robot Dance

Japanese roboticists demonstrate a female android singing and dancing along with a troupe of human performers

By Erico Guizzo
Posted 2 Nov 2010

#65 What Technologies Enabled Drones to Proliferate?

Five years ago few people had even heard of quadcopters. Now they seem to be everywhere

By Markus Waibel
Posted 19 Feb 2010

#66 Video Friday: Professor Ishiguro’s New Robot Child, and More

Your weekly selection of awesome robot videos

By Evan Ackerman, Erico Guizzo & Fan Shi
Posted 3 Aug 2018

#67 Drone Provides Colorado Flooding Assistance Until FEMA Freaks Out

Drones can provide near real-time maps in weather that grounds other aircraft, but FEMA has banned them

By Evan Ackerman
Posted 16 Sep 2013

#68 A Thousand Kilobots Self-Assemble Into Complex Shapes

This is probably the most robots that have ever been in the same place at the same time, ever

By Evan Ackerman
Posted 14 Aug 2014

#69 Boston Dynamics’ SpotMini Is All Electric, Agile, and Has a Capable Face-Arm

A fun-sized version of Spot is the most domesticated Boston Dynamics robot we’ve seen

By Evan Ackerman
Posted 23 Jun 2016

#70 Kenshiro Robot Gets New Muscles and Bones

This humanoid is trying to mimic the human body down to muscles and bones

By Angelica Lim
Posted 10 Dec 2012

#71 Roomba Inventor Joe Jones on His New Weed-Killing Robot, and What’s So Hard About Consumer Robotics

The inventor of the Roomba tells us about his new solar-powered, weed-destroying robot

By Evan Ackerman
Posted 6 Jul 2017

#72 George Devol: A Life Devoted to Invention, and Robots

George Devol’s most famous invention—the first programmable industrial robot—started a revolution in manufacturing that continues to this day

By Bob Malone
Posted 26 Sep 2011

#73 World Robot Population Reaches 8.6 Million

Here’s an estimate of the number of industrial and service robots worldwide

By Erico Guizzo
Posted 14 Apr 2010

#74 U.S. Senator Calls Robot Projects Wasteful. Robots Call Senator Wasteful

U.S. Senator Tom Coburn criticizes the NSF for squandering "millions of dollars on wasteful projects," including three that involve robots

By Erico Guizzo
Posted 14 Jun 2011

#75 Inception Drive: A Compact, Infinitely Variable Transmission for Robotics

A novel nested-pulley configuration forms the heart of a transmission that could make robots safer and more energy efficient

By Evan Ackerman & Celia Gorman
Posted 20 Sep 2017

#76 iRobot Demonstrates New Weaponized Robot

iRobot has released video showing a Warrior robot deploying an anti-personnel obstacle breaching system

By John Palmisano
Posted 30 May 2010

#77 Robotics Trends for 2012

Nearly a quarter of the year is already behind us, but we thought we’d spend some time looking at the months ahead and make some predictions about what’s going to be big in robotics

By Erico Guizzo & Travis Deyle
Posted 20 Mar 2012

#78 DRC Finals: CMU’s CHIMP Gets Up After Fall, Shows How Awesome Robots Can Be

The most amazing run we saw all day came from CHIMP, which was the only robot to fall and get up again

By Evan Ackerman & Erico Guizzo
Posted 5 Jun 2015

#79 Lethal Microdrones, Dystopian Futures, and the Autonomous Weapons Debate

The future of weaponized robots requires a reasoned discussion, not scary videos

By Evan Ackerman
Posted 15 Nov 2017

#80 Every Kid Needs One of These DIY Robotics Kits

For just $200, this kit from a CMU spinoff company is a great way for total beginners to get started building robots

By Evan Ackerman
Posted 11 Jul 2012

#81 Beautiful Fluid Actuators from Disney Research Make Soft, Safe Robot Arms

Routing forces through air and water allows for displaced motors and safe, high-performance arms

By Evan Ackerman
Posted 9 Oct 2014

#82 Boston Dynamics Sand Flea Robot Demonstrates Astonishing Jumping Skills

Watch a brand new video of Boston Dynamics’ Sand Flea robot jumping 10 meters into the air

By Evan Ackerman
Posted 28 Mar 2012

#83 Eyeborg: Man Replaces False Eye With Bionic Camera

Canadian filmmaker Rob Spence has replaced his false eye with a bionic camera eye. He showed us his latest prototype

By Tim Hornyak
Posted 11 Jun 2010

#84 We Should Not Ban ‘Killer Robots,’ and Here’s Why

What we really need is a way of making autonomous armed robots ethical, because we’re not going to be able to prevent them from existing

By Evan Ackerman
Posted 28 Jul 2015

#85 Yale’s Robot Hand Copies How Your Fingers Work to Improve Object Manipulation

These robotic fingers can turn friction on and off to make it easier to manipulate objects with one hand

By Evan Ackerman
Posted 12 Sep 2018

#86 France Developing Advanced Humanoid Robot Romeo

Nao, the small French humanoid robot, is getting a big brother

By Erico Guizzo
Posted 13 Dec 2010

#87 DARPA Wants to Give Soldiers Robot Surrogates, Avatar Style

Soldiers controlling bipedal robot surrogates on the battlefield? It’s not science fiction, it’s DARPA’s 2012 budget

By Evan Ackerman
Posted 17 Feb 2012

#88 Whoa: Quadrotors Play Catch With Inverted Pendulum

Watch these quadrotors balance a stick on its end, and then toss it back and forth

By Evan Ackerman
Posted 21 Feb 2013

#89 Why We Should Build Humanlike Robots

Humans are brilliant, beautiful, compassionate, and capable of love. Why shouldn’t we aspire to make robots humanlike in these ways?

By David Hanson
Posted 1 Apr 2011

#90 DARPA Robotics Challenge Finals: Know Your Robots

All 25 robots in a single handy poster-size image

By Erico Guizzo & Evan Ackerman
Posted 3 Jun 2015

#91 Here’s That Extra Pair of Robot Arms You’ve Always Wanted

MIT researchers develop wearable robotic arms that can give you an extra hand (or two)

By Evan Ackerman
Posted 2 Jun 2014

#92 Rat Robot Beats on Live Rats to Make Them Depressed

A robotic rat can be used to depress live rats to make them suitable for human drug trials

By Evan Ackerman
Posted 13 Feb 2013

#93 MIT Cheetah Robot Bounds Off Tether, Outdoors

The newest version of MIT’s Cheetah is fast, it’s quiet, and it jumps

By Evan Ackerman
Posted 15 Sep 2014

#94 Bizarre Soft Robots Evolve to Run

These simulated robots may be wacky looking, but they’ve evolved on their own to be fast and efficient

By Evan Ackerman
Posted 11 Apr 2013

#95 Robot Car Intersections Are Terrifyingly Efficient

In the future, robots will blow through intersections without stopping, and you won’t be able to handle it

By Evan Ackerman
Posted 13 Mar 2012

#96 iRobot’s New Roomba 800 Series Has Better Vacuuming With Less Maintenance

A redesigned cleaning system makes the new vacuum way better at dealing with hair (and everything else)

By Evan Ackerman
Posted 12 Nov 2013

#97 Sawyer: Rethink Robotics Unveils New Robot

It’s smaller, faster, stronger, and more precise: meet Sawyer, Rethink Robotics’ new manufacturing robot

By Evan Ackerman & Erico Guizzo
Posted 19 Mar 2015

#98 Cynthia Breazeal Unveils Jibo, a Social Robot for the Home

The famed MIT roboticist is launching a crowdfunding campaign to bring social robots to consumers

By Erico Guizzo
Posted 16 Jul 2014

#99 These Robots Will Teach Kids Programming Skills

Startup Play-i says its robots can make computer programming fun and accessible

By Erico Guizzo
Posted 30 Oct 2013

#100 Watch a Swarm of Flying Robots Build a 6-Meter Brick Tower

This is what happens when a bunch of roboticists and architects get together in an art gallery

By Erico Guizzo
Posted 2 Dec 2011
