NASA has decided that humans are going back to the Moon. That’s great! Before that actually happens, a whole bunch of other things have to happen, and excitingly, many of those things involve robots. As a sort of first-ish step, NASA is developing a new lunar rover called VIPER (Volatiles Investigating Polar Exploration Rover). VIPER’s job is to noodle around the permanently shaded craters at the Moon’s south pole looking for water ice, which can (eventually) be harvested and turned into breathable air and rocket fuel.

An engineering model of the Volatiles Investigating Polar Exploration Rover, or VIPER, is tested in the Simulated Lunar Operations Laboratory at NASA’s Glenn Research Center in Cleveland, Ohio. About the size of a golf cart, VIPER is a mobile robot that will roam around the Moon’s south pole looking for water ice and, for the first time ever, actually sample it at the same pole where the first woman and next man will land in 2024 under the Artemis program.

In the video, the VIPER engineering model is enjoying playtime in simulated lunar regolith (not stuff that you want to be breathing, hence the fancy hats) to help characterize the traction of the wheels on different slopes, and to help figure out how much power will be necessary. The final rover might look a bit more like this:

VIPER is more than somewhat similar to an earlier rover that NASA was working on called Resource Prospector, which was cancelled back in 2018. Resource Prospector was also scheduled to go to the Moon’s south pole to look for water ice, and VIPER will be carrying some of the instruments originally developed for Resource Prospector. If it seems a little weird that NASA cancelled Resource Prospector only to almost immediately start work on VIPER, well, yeah—the primary difference between the two rovers seems to be that VIPER is intended to spend several months operating, while Resource Prospector’s lifespan was only a couple of weeks.

The other big difference between VIPER and Resource Prospector is that NASA has been gradually shifting away from developing all of its own hardware in-house, and VIPER is no exception. One of the primary science instruments, a drilling system called TRIDENT (The Regolith and Ice Drill for Exploring New Terrain, obviously), comes from Honeybee Robotics, which has contributed a variety of tools that have been used to poke and prod at the surface of Mars on the Mars Exploration Rovers, Phoenix, and Curiosity. There’s nothing wrong with this, except that for VIPER, it looks like NASA wants a commercial delivery system as well.

Last week, Space News reported that NASA is postponing procurement of a commercially-developed lander that would deliver VIPER to the lunar surface, meaning that not only does VIPER not have a ride to the Moon right now, but that it’s not very clear when it’ll actually happen—as recently as last November, the plan was to have a lander selected by early February, for a landing in late 2022. From the sound of things, the problem is that VIPER is a relatively chunky payload (weighing about 350 kg), meaning that only a few companies have the kind of hardware that would be required to get it safely to the lunar surface, and NASA has a limited budget that also has to cover a bunch of other stuff at the same time.

This delay is unfortunate, because VIPER plays an important part in NASA’s overall lunar strategy. Finding water ice on the Moon is the first step towards the in-situ resource utilization (ISRU) robots necessary to practically sustain a long-term lunar mission, and after that, it’ll take a bunch more work to actually deploy a system to harvest ice and turn it into usable hydrogen and oxygen with enough reliability and volume to make a difference. We have the technology—we’ve just got to get it there, and get it working. 

[ VIPER ]

Suction is a useful tool in many robotic applications, as long as those applications are grasping objects that are suction-friendly—that is, objects that are impermeable and generally smooth-ish and flat-ish. If you can’t form a seal on a surface, your suction gripper is going to have a bad time, which is why you don’t often see suction systems working outside of an environment that’s at least semi-constrained. Warehouses? Yes. Kitchens? Maybe. The outdoors? Almost certainly not.

In general, getting robotic grippers (and robots themselves) to adhere to smooth surfaces and rough surfaces requires completely different technology. But researchers from Zhejiang University in China have come up with a new kind of suction gripper that can very efficiently handle surfaces like widely-spaced tile and even rough concrete, by augmenting the sealing system with a spinning vortex of water.

Image: Zhejiang University To climb, the robot uses a micro-vacuum pump coupled to a rapidly rotating fan and a water source. Centripetal force causes the spinning water to form a ring around the outside of the vacuum chamber. Because water can get into all those surface irregularities that doom traditional vacuum grippers, the seal is much stronger.

The paper is a little bit dense, but from what I can make out, what’s going on is that you’ve got a traditional suction gripper with a vacuum pump, modified with a water injection system and a fan. The fan has nothing to do with creating or maintaining a vacuum—its job is to get the water spinning at up to 90 rotations per second. Centripetal force causes the spinning water to form a ring around the outside of the vacuum chamber, which keeps the water from being sucked out through the vacuum pump while also maintaining a liquid seal between the vacuum chamber and the surface. Because water can get into all of those annoying little nooks and crannies that can mean doom for traditional vacuum grippers, the seal is much better, resulting in far higher performance, especially on surfaces with high roughness.
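To get a feel for why the spinning water stays in the chamber rather than being sucked into the pump, here is a back-of-the-envelope check. The 90 rotations per second comes from the paper; the chamber radius is my own guess for illustration.

```python
import math

# Back-of-the-envelope check: how strongly is the spinning water ring
# pressed outward? The spin rate is reported; the radius is an assumption.

spin_rate_hz = 90.0    # rotations per second (from the paper)
radius_m = 0.03        # assumed chamber radius: 3 cm (illustrative guess)

omega = 2 * math.pi * spin_rate_hz   # angular velocity, rad/s
accel = omega ** 2 * radius_m        # centripetal acceleration, m/s^2
g_force = accel / 9.81               # same thing in multiples of gravity

print(f"omega = {omega:.0f} rad/s")
print(f"centripetal acceleration = {accel:.0f} m/s^2 (about {g_force:.0f} g)")
```

At roughly a thousand g, the outward push on the water dwarfs both gravity and the pump’s suction, which is consistent with the ring staying put in the chamber while it seals against the surface.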

Photo: Zhejiang University One of the potential applications for the water-vortex suction robot is as a “Spider-Man” wall-climbing device.

For example, a single suction unit weighing 0.8 kg was able to generate a suction force of over 245 N on a rough surface using less than 400 W, while a traditional suction unit of the same footprint would need several thousand watts (and weigh dozens of kilograms) to generate a comparable amount of suction, since the rough surface would cause a significant amount of leakage (although not a loss of suction). At very high power, the efficiency does decrease a bit—the “Spider-Man” system weighs 3 kg per unit, with a suction force of 2000 N using 650 W.
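Since suction force is just pressure differential times sealed area, we can also sanity-check what kind of vacuum that 245 N figure implies. The footprint radius below is an assumption for illustration; only the force comes from the reported results.

```python
import math

# Rough sanity check: suction force = pressure differential x sealed area.
# The 245 N force is reported; the footprint radius is an assumption.

force_n = 245.0     # reported suction force on a rough surface
radius_m = 0.05     # assumed footprint radius: 5 cm (illustrative guess)

area_m2 = math.pi * radius_m ** 2   # sealed area
delta_p = force_n / area_m2         # implied pressure differential, Pa

print(f"sealed area = {area_m2 * 1e4:.1f} cm^2")
print(f"implied vacuum = {delta_p / 1000:.0f} kPa "
      f"(about {100 * delta_p / 101325:.0f}% of atmospheric)")
```

A vacuum on the order of 30 kPa is modest, and that seems to be the point: the water seal lets a small pump hold that differential even on rough concrete, where a dry seal would leak it away.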

And as for the downsides? Er, well, it does kind of leak all over the place, especially when disengaging. The “Spider-Man” version leaks over 2 liters per minute. It’s only water, but still. And since it leaks, it needs to be provided with a constant water supply, which limits its versatility. The researchers are working on ways of significantly reducing water consumption to make the system more independent, but personally, I feel like the splooshyness is part of the appeal.

“Vacuum suction unit based on the zero pressure difference method,” by Kaige Shi and Xin Li from Zhejiang University in China, is published in Physics of Fluids.

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland
DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA
HRI 2020 – March 23-26, 2020 – Cambridge, U.K.
ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores
ICRA 2020 – May 31-June 4, 2020 – Paris, France

Let us know if you have suggestions for next week, and enjoy today’s videos.

The Real-World Deployment of Legged Robots Workshop is back at ICRA 2020!

We’ll be there!

[ Workshop ]

Thanks Marko!

This video shows some cool musical experiments with Pepper. They should definitely release this karaoke feature to Peppers everywhere—with “Rage Against the Machine” songs included, of course. NSFW warning: There is some swearing by both robot and humans, so headphones recommended if you’re at work.

It all started when, on a whim, David and another team member fed a karaoke file into Pepper’s text-to-speech with a quick Python script, playing some music in parallel from their PC. The effect was a bit strange, but there was something so fun (and funny) to it. I think they were going for a virtual performance from Pepper or something, but someone noted that it sounded like he was struggling, like someone doing karaoke. And from there it grew into doing duets with Pepper.
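The article doesn’t include their script, but the basic trick (parse timestamped lyrics, then feed each line to the robot’s text-to-speech while the music plays separately) is easy to sketch. Here is a minimal, hypothetical version that parses LRC-style karaoke files; the `speak` callback stands in for whatever text-to-speech call Pepper actually exposes:

```python
import re

# Hypothetical sketch of the karaoke trick described above: parse LRC-style
# "[mm:ss.xx]lyric" lines into a schedule, then speak each line on cue.
# speak() stands in for the robot's real text-to-speech call.

LRC_LINE = re.compile(r"\[(\d+):(\d+(?:\.\d+)?)\](.*)")

def parse_lrc(text):
    """Return a sorted list of (seconds, lyric) pairs from LRC-formatted text."""
    schedule = []
    for line in text.splitlines():
        m = LRC_LINE.match(line.strip())
        if m:
            minutes, seconds, lyric = m.groups()
            schedule.append((int(minutes) * 60 + float(seconds), lyric.strip()))
    return sorted(schedule)

def run_karaoke(schedule, speak, clock):
    """Call speak(lyric) as each timestamp comes due. clock() returns seconds
    since the music started, which keeps the speech in sync with the audio."""
    for t, lyric in schedule:
        while clock() < t:
            pass                # busy-wait; a real script would sleep
        speak(lyric)

song = "[00:01.0]Hello world\n[00:03.5]Sung by a robot"
print(parse_lrc(song))
```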

This thing might seem ridiculous, and it is. But believe me, it’s genuinely fun. It was going all night in a meeting room at the office winter party.

[ Taylor Veltrop ]

And now, this.

In “Scary Beauty,” a performance conceived and directed by Tokyo-based musician Keiichiro Shibuya, a humanoid robot called Alter 3 not only conducts a human orchestra but also sings along with it. 

Unlike the previous two Alters, Alter 3 has improved sensory and expressive capabilities that are closer to a human’s, such as cameras in both eyes, the ability to sing from its mouth, and expressiveness around the mouth for singing. In addition, its actuator output has been increased compared to Alter 2, improving the immediacy of its body expression and enabling more dynamic movement. Alter 3 is also more portable: anyone can disassemble it, transport it by air, and reassemble it.

[ Scary Beauty ] via [ RobotStart ]

Carnegie Mellon University’s Henny Admoni studies human behavior in order to program robots to better anticipate people’s needs. Admoni’s research focuses on using assistive robots to address different impairments and aid people in living more fulfilling lives.

[ HARP Lab ]

Olympia was produced as part of a two-year project exploring the growth of social and humanoid robotics in the UK and beyond. Olympia was shot on location at Bristol Robotics Labs, one of the largest of its kind in Britain.

Humanoid robotics - one of the most complex and often provocative areas of artificial intelligence - forms the central subject of this short film. At what point are we willing to believe that we might form a real bond with a machine?

[ Olympia ] via [ Bristol Robotics Lab ]

In this work, we explore user preferences for different modes of autonomy for robot-assisted feeding given perceived error risks and also analyze the effect of input modalities on technology acceptance.

[ Personal Robotics Lab ]

This video presents work on a multi-agent system of aerial robots that form mid-air structures by docking, using position-based visual servoing of the aerial robots. For the demonstration, the commercially available DJI Tello drone has been modified for the task and commanded using the DJI Tello Python SDK.

[ YouTube ]

This video presents DLR CLASH (Compliant Low-Cost Antagonistic Servo Hand), developed within the EU project SOMA (grant number H2020-ICT-645599), and shows the hand’s resilience tests and its capability to grasp objects under different motor and sensor failures.

[ DLR ]

Squishy Robotics is celebrating our birthday! Here is a short montage of the places we’ve been and the things we’ve done over the last three years.

[ Squishy Robotics ]

The 2020 DJI RoboMaster Challenge takes place in Shenzhen in early August 2020.

[ RoboMaster ]

With support from the National Science Foundation, electrical engineer Yan Wan and a team at the University of Texas at Arlington are developing a new generation of "networked" unmanned aerial vehicles (UAVs) to bring long distance, broadband communications capability to first responders in the field.

[ NSF ]

Drones and UAVs are vulnerable to hackers that might try to take control of the craft or access data stored on-board. Researchers at the University of Michigan are part of a team building a suite of software to keep drones secure.

The suite is called Trusted and Resilient Mission Operations (TRMO). The U-M team, led by Wes Weimer, professor of electrical engineering and computer science, is focused on integrating the different applications into a holistic system that can prevent and combat attacks in real time.

[ UMich ]

A mobile robot that revs up industrial production: SOTO enables efficient automated line feeding, for example in the automotive industry. The supply chain robot SOTO brings materials to the assembly line, just in time and completely autonomously.

[ Magazino ]

MIT’s Lex Fridman gets us caught up with the state of the art in deep learning.

[ MIT ]

Just in case you couldn’t make it out to Australia in 2018, here are a couple of the keynotes from ICRA in Brisbane.

[ ICRA 2018 ]

Birds have been doing their flying thing with flexible and feathery wings for about a hundred million years, give or take. And about a hundred years ago, give or take, humans decided that, although birds may be the flying experts, we’re just going to go off in our own direction with mostly rigid wings and propellers and stuff, because it’s easier or whatever. The few attempts at making artificial feathers that we’ve seen in the past have been sufficient for a few specific purposes but haven’t really come close to emulating the capabilities that real feathers bestow on the wings of birds. So a century later, we’re still doing the rigid wings with discrete flappy bits, while birds (one has to assume) continue to judge us for our poor choices.

In a paper published today in Science Robotics, researchers at Stanford University have presented some new work on understanding exactly how birds maintain control by morphing the shape of their wings. They put together a flying robot called PigeonBot with a pair of “biohybrid morphing wings” to test out new control principles, and instead of trying to develop some kind of fancy new artificial feather system, they did something that makes a lot more sense: They cheated, by just using real feathers instead. 

The reason why robots are an important part of this research (which otherwise seems like it would be avian biology) is that there’s no good way to use a real bird as a test platform. As far as I know, you can’t exactly ask a pigeon to try to turn using just some specific wing muscles, but you can definitely program a biohybrid robot to do that. However, most of the other bioinspired flying robots that we’ve seen have been some flavor of ornithopter (rigid flapping wings), or they’ve used stretchy membrane wings, like bats.

Image: Lentink Lab/Stanford University By examining real feathers, the Stanford researchers discovered that adjacent feathers stick to each other to resist sliding in one direction only using micron-scale features that researchers describe as “directional Velcro.”

Artificial feathers aren’t just complicated to manufacture; you also have to find some way of replicating and managing all of the complex feather-on-feather interactions that govern wing morphing in real birds. For example, by examining real feathers, the researchers discovered that adjacent feathers stick to each other to resist sliding in one direction only, using micron-scale features that the researchers describe as “directional Velcro,” something “new to science and technology.” Real feathers can slide to allow the wing to morph, but past a certain point, the directional Velcro engages to keep gaps from developing in the wing surface. There are additional practical advantages, too: “they are softer, lighter, more robust, and easier to get back into shape after a crash by simply preening ruffled feathers between one’s fingers.”

With the real feathers elastically connected to a pair of robotic bird wings with wrist and finger joints that can be actuated individually, PigeonBot relies on its biohybrid systems for maneuvering, while thrust and a bit of additional stabilizing control comes from a propeller and a conventional tail. The researchers found that PigeonBot’s roll could be controlled with just the movement of the finger joint on the wing, and that this technique is inherently much more stable than the aileron roll used by conventional aircraft, as corresponding author David Lentink, head of Stanford's Bio-Inspired Research & Design (BIRD) Lab, describes:

The other cool thing we found is that the morphing wing asymmetry results automatically in a steady roll angle. In contrast, aircraft aileron left-right asymmetry results in a roll rate, which the pilot or autopilot then has to stop to achieve a steady roll angle. Controlling a banked turn via roll angle is much simpler than via roll rate. We think it may enable birds to fly more stably in turbulence, because wing asymmetry corresponds to an equilibrium angle that the wings automatically converge to. If you are flying in turbulence and have to control the robot or airplane attitude via roll rate in response to many stochastic perturbations, roll angle has to be actively adjusted continuously without any helpful passive dynamics of the wing. Although this finding requires more research and testing, it shows how aerospace engineers can find inspiration to think outside of the box by studying how birds fly.
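Lentink’s angle-versus-rate point can be illustrated with a toy simulation. This is my own simplified first-order model, not anything from the paper: wing asymmetry acts like a restoring term that pulls the roll angle toward an equilibrium, while an aileron commands a roll rate that keeps integrating until it is actively cancelled.

```python
# Toy roll-dynamics comparison (a simplified illustrative model, not from
# the paper). Morphing-wing asymmetry sets an equilibrium roll angle;
# aileron deflection commands a roll rate. All constants are invented.

def simulate_morphing(phi0, phi_eq, k=4.0, dt=0.01, steps=500):
    """Asymmetry acts like a restoring term: roll converges to phi_eq."""
    phi = phi0
    for _ in range(steps):
        phi += -k * (phi - phi_eq) * dt
    return phi

def simulate_aileron(phi0, rate_cmd, dt=0.01, steps=500):
    """Aileron commands a roll *rate*: the angle integrates without bound
    until the pilot or autopilot actively zeroes the command."""
    phi = phi0
    for _ in range(steps):
        phi += rate_cmd * dt
    return phi

print(simulate_morphing(0.0, 0.35))  # settles at the 0.35 rad equilibrium
print(simulate_aileron(0.0, 0.35))   # still rolling: reaches 1.75 rad
```

Under a perturbation, the morphing model simply drifts back to its equilibrium angle, while the rate-commanded model needs an active controller to stop the roll, which is the stability benefit Lentink describes.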

The researchers suggest that the directional Velcro technology is one of the more important results of this study, and while they’re not pursuing any of the numerous potential applications, they’ve “decided to not patent this finding to help proliferate our discovery to the benefit of society at large” in the hopes that anyone who makes a huge pile of money off of it will (among other things) invest in bird conservation in gratitude.

Image: Lentink Lab/Stanford University With the real feathers elastically connected to a pair of robotic bird wings with wrist and finger joints that can be actuated individually, PigeonBot relies on its biohybrid systems for maneuvering.

As for PigeonBot itself, Lentink says he’d like to add a biohybrid morphing tail, as well as legs with grasping feet, and additional actuators for wing folding and twisting and flapping. And maybe make it fly autonomously, too. Sounds good to me—that kind of robot would be great at data transfer.

[ Science Robotics ]

At CES 2017, I got my mind blown by a giant mystery box from a company called AxonVR that was able to generate astonishingly convincing tactile sensations, like tiny animals running around on my palm in VR. An update in late 2017 traded the giant mystery box (and the ability to reproduce heat and cold) for a wearable glove with high-resolution microfluidic haptics embedded inside of it. By itself, the HaptX system is constrained to virtual reality, but when combined with a pair of Universal Robots UR10 arms, Shadow dexterous robotic hands, and SynTouch tactile sensors, you end up with a system that can reproduce physical reality instead.

The demo at CES is pretty much the same thing that you may have seen video of Jeff Bezos trying at Amazon’s re:MARS conference. The heart of the system is the pair of haptic gloves, which are equipped with spatial position and orientation sensors as well as finger location sensors. The movements that you make with your hands and fingers are mapped to the Shadow hands, while the UR10 arms try to match the relative position of the hands in space to your own. Going the other way, there’s a more or less 1-to-1 mapping between what the robot hands feel and the forces that are transmitted into the fingers of the gloves.
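A minimal sketch of that kind of bilateral mapping might look like the following. All of the joint names, limits, and gains here are invented for illustration; the real HaptX/Shadow system is far more involved (calibration, filtering, safety limits, and so on).

```python
# Hypothetical sketch of a bilateral telemanipulation mapping: glove joint
# angles are clamped into the robot hand's range, and fingertip forces are
# scaled back into glove feedback. All names and numbers are invented.

JOINT_LIMITS = {                 # robot-hand joint ranges in radians (assumed)
    "index_flex": (0.0, 1.6),
    "thumb_flex": (0.0, 1.2),
}
FORCE_GAIN = 0.8                 # robot-side newtons -> glove feedback command

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def glove_to_robot(glove_angles):
    """Map measured glove joint angles onto the robot hand, respecting limits."""
    return {joint: clamp(glove_angles.get(joint, 0.0), lo, hi)
            for joint, (lo, hi) in JOINT_LIMITS.items()}

def robot_to_glove(fingertip_forces):
    """Scale forces sensed at the robot fingertips into glove actuator commands."""
    return {finger: FORCE_GAIN * newtons
            for finger, newtons in fingertip_forces.items()}

print(glove_to_robot({"index_flex": 2.0, "thumb_flex": 0.5}))  # index clamped to 1.6
print(robot_to_glove({"index": 3.0}))
```

The clamping is one reason a foot-pedal “pause” is handy in practice: when your hand wanders outside the robot’s reachable range, you need a way to reposition without dragging the robot along.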

It’s not a perfect system quite yet—sensors get occluded or otherwise out of whack on occasion, and you have to use a foot pedal as a sort of pause button on the robots while you reposition your limbs in a way that’s easier for the system to interpret. And the feel of the force transmission takes some getting used to. I want to say that it could be more finely calibrated, but much of that feeling is likely on my end, since we’re told that the system gets much easier to control with practice.

Photo: Andrew Mitrak/HaptX Evan uses the telerobotic hands to operate a mockup of a control panel used to shut down a nuclear reactor. He had to turn a valve, flip some switches, twist a knob, and then push a button. Meltdown averted!

Even as a brand-new user, I was surprised by how capable my remote-controlled hands were. I had no trouble grabbing plastic cups and transferring a ball between them, although I had to take care not to accidentally crush the cups (which would trap the ball inside). At first, it was easy to consider the force feedback as more of a gimmick, but once I started to relax and pay attention to it, it provided useful information that made me more effective at the task I was working on.

After playing around with things a bit more (and perhaps proving myself not to be totally incompetent), I was given the second most challenging scenario—a simple mockup of a control panel used to shut down a nuclear reactor. I had to turn a valve, flip some switches, twist a knob, and then push a button, all of which required a variety of different grasps, motions, and forces. It was a bit fiddly, but I got it all, and what I found most impressive was that I was able to manipulate things even when I couldn’t see them—in this case, because one of the arms was blocking my view. I’m not sure that would have been possible without the integrated haptic system.

Image: Converge Robotics Group The three companies involved in the project (Shadow Robot Company, HaptX, and Tangible Research) have formed the Converge Robotics Group to commercialize the system.

The news from CES is that the three companies involved in this project (Shadow Robot Company, HaptX, and Tangible Research) have formed a sort of consortium-thing called Converge Robotics Group. Basically, the idea is to create a framework under which the tactile telerobot can be further developed and sold, because otherwise, it’s not at all clear who you’d even throw money at if you wanted to buy one. 

Speaking of buying one, this system is “now available for purchase by early access customers.” As for what it might cost, well… It’ll be a lot. There isn’t a specific number attached to the system yet, but with two UR10 arms and a pair of Shadow hands, we’re looking at low six figures just in that portion of the hardware. Add in the HaptX gloves and whatever margin you need to keep your engineers fed, and it’s safe to say that this isn’t going to end up in your living room in the near future, no matter how cool that would be.

[ Converge Robotics Group ]

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland
DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA
HRI 2020 – March 23-26, 2020 – Cambridge, U.K.
ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores
ICRA 2020 – May 31-June 4, 2020 – Paris, France

Let us know if you have suggestions for next week, and enjoy today’s videos.

Apparently the whole “little home robot with a cute personality will seamlessly improve your life” thing is still going on, at least as far as Samsung is concerned.

Predictably, there’s no information on how much Ballie costs, when it might be available, what’s inside of it, and whether it can survive a shaggy carpet.

[ Samsung ]

In case you had the good sense to be somewhere besides CES this week, here’s the full demo of Agility Robotics’ Digit at the Ford booth.

Classy!

Because of the nightmarish Wi-Fi environment in the convention center, Digit is steered manually, but the box interaction is autonomous.

[ Agility Robotics ]

Stefano Mintchev from EPFL and his startup Foldaway Haptics are the folks behind the 33 individual actuated "bionic flaps" on the new Mercedes-Benz Vision AVTR concept car that was at CES this week.

The underlying technology, which is based on origami structures, can be used in a variety of other applications, like this robotic billboard:

[ Foldaway Haptics ] via [ Mercedes ]

Thanks Stefano!

The Sarcos Guardian XO alpha version is looking way more polished than the pre-alpha prototype that we saw late last year.

And Sarcos tells us that it’s now even more efficient, although despite my begging, they won’t tell us exactly how they’ve managed that.

[ Sarcos ]

It is our belief that in 5 years’ time, not one day will go by without most of us interacting with a robot. Reachy is the only humanoid service robot that is open source and can manipulate objects. He mimics human expressions and body language, with a cute free-moving head and antennas as well as bio-inspired arms. Reachy is the optimum platform to create real-world interactive & service applications right away.

[ Pollen Robotics ]

Ritsumeikan Humanoid System Laboratory is working on a promising hydraulic humanoid:

[ Ritsumeikan HSL ]

With the steep rise of automation and robotics across industries, the requirements for robotic grippers become increasingly demanding. By using acoustic levitation forces, No-Touch Robotics develops damage- and contamination-free contactless robotic grippers for handling highly fragile objects. Such grippers can beneficially be applied in the fields of micro-assembly and the semiconductor industry, resulting in increased production yield, reduced waste, and high production quality by completely eliminating damage inflicted during handling.

You can also experience the magic by building your own acoustic levitator.

[ ETHZ ]

Preview of the Unitree A1. Maximum torque of each joint: 35.5 Nm. Weight (with battery): 12 kg. Price: less than $10k.

Under $10k? I’m going to start saving up!

[ Unitree ]

A team from the Micro Aerial Vehicle Lab (MAVLab) of TU Delft has won the 2019 Artificial Intelligence Robotic Racing (AIRR) Circuit, with a final breathtaking victory in the World Championship Race held in Austin, Texas, last December. The team takes home the $1 million grand prize, sponsored by Lockheed Martin, for creating the fastest and most reliable self-piloting aircraft this season.

[ MAVLab ]

After 10 years and 57 robots, hinamitetu brings you a few more.

[ Hinamitetu ]

Vision 60 legged robot managing unstructured terrain without vision or force sensors in its legs.

[ Ghost Robotics ]

In 2019, GRVC had one of its best years yet, with the latest developments of the GRIFFIN ERC Advanced Grant, the kick-off meeting of the H2020 AERIAL-CORE project, and other projects.

[ GRVC ]

The Official Wrap-Up of ABU ROBOCON 2019 Ulaanbaatar, Mongolia.

[ RoboCon 2019 ]

Roboy had a busy 2019:

[ Roboy ]

Very interesting talk from IHMC’s Jerry Pratt, at the Workshop on Teleoperation of Humanoid Robots at Humanoids 2019.

[ Workshop ]

We’ve been writing about robots here at IEEE Spectrum for a long, long time. Erico started covering robotics for Spectrum in 2007, about the same time that Evan started BotJunkie.com. We joined forces in 2011, and have published thousands of articles since then, chronicling as many aspects of the field as we could. Autonomous cars, humanoids, legged robots, drones, robots in space—the last decade in robotics has been incredible.

To kick off 2020, we’re taking a look back at our most popular posts of the last 10 years. In order, listed below are the 100 articles with the highest total page views, providing a cross-section of what was the most interesting in robotics from 2010 until now.

Also, sometime in the next several weeks, we plan to post a selection of our favorite stories, focusing on what we think were the biggest developments in robotics of the past decade (including a few things that, surprisingly, did not make the list below). If you have suggestions of important robot stories we should include, let us know. Thank you for reading!

#1 How Google’s Self-Driving Car Works

Google engineers explain the technology behind their self-driving car and show videos of road tests

By Erico Guizzo
Posted 18 Oct 2011

#2 This Robot Can Do More Push-Ups Because It Sweats

A robot that uses artificial sweat can cool its motors without bulky radiators

By Evan Ackerman
Posted 13 Oct 2016

#3 Meet Geminoid F, a Smiling Female Android

Geminoid F displays facial expressions more naturally than previous androids

By Erico Guizzo
Posted 3 Apr 2010

#4 Latest Geminoid Is Incredibly Realistic

Geminoid DK is a realistic android nearly indistinguishable from a real human

By Evan Ackerman
Posted 5 Mar 2011

#5 The Next Generation of Boston Dynamics’ ATLAS Robot Is Quiet, Robust, and Tether Free

The latest ATLAS is by far the most advanced humanoid robot in existence

By Evan Ackerman & Erico Guizzo
Posted 23 Feb 2016

#6 The Uncanny Valley: The Original Essay by Masahiro Mori

“The Uncanny Valley” by Masahiro Mori is an influential essay in robotics. This is the first English translation authorized by Mori.

By Masahiro Mori
Posted 12 Jun 2012

#7 NASA JSC Unveils Valkyrie DRC Robot

NASA’s DARPA Robotics Challenge entry is much more than Robonaut with legs: it’s a completely new humanoid robot

By Evan Ackerman
Posted 10 Dec 2013

#8 Origami Robot Folds Itself Up, Does Cool Stuff, Dissolves Into Nothing

Tiny self-folding magnetically actuated robot creates itself when you want it, disappears when you don’t

By Evan Ackerman
Posted 28 May 2015

#9 Robots Bring Couple Together, Engagement Ensues

Yes, you really can find love at an IEEE conference

By Evan Ackerman & Erico Guizzo
Posted 31 Mar 2014

#10 Facebook AI Director Yann LeCun on His Quest to Unleash Deep Learning and Make Machines Smarter

The Deep Learning expert explains how convolutional nets work, why Facebook needs AI, what he dislikes about the Singularity, and more

By Lee Gomes
Posted 18 Feb 2015

#11 This Is the Most Amazing Biomimetic Anthropomorphic Robot Hand We’ve Ever Seen

Luke Skywalker, your new robotic hand is ready

By Evan Ackerman
Posted 17 Feb 2016

#12 Dutch Police Training Eagles to Take Down Drones

Attack eagles are training to become part of the Dutch National Police anti-drone arsenal

By Evan Ackerman
Posted 1 Feb 2016

#13 You (YOU!) Can Take Stanford’s ’Intro to AI’ Course Next Quarter, For Free

Sebastian Thrun and Peter Norvig are offering Stanford’s "Introduction to Artificial Intelligence" course online, for free, grades and all

By Evan Ackerman
Posted 4 Aug 2011

#14 Robot Hand Beats You at Rock, Paper, Scissors 100% Of The Time

Watch this high-speed robot hand cheat at rock, paper, scissors

By Evan Ackerman
Posted 26 Jun 2012

#15 You’ve Never Seen a Robot Drive System Like This Before

Using just a single spinning hemisphere mounted on a gimbal, this robot demonstrates some incredible agility

By Evan Ackerman
Posted 7 Jul 2011

#16 Fukushima Robot Operator Writes Tell-All Blog

An anonymous worker at Japan’s Fukushima Dai-ichi nuclear power plant has written dozens of blog posts describing his experience as a lead robot operator at the crippled facility

By Erico Guizzo
Posted 23 Aug 2011

#17 Should Quadrotors All Look Like This?

Researchers say we’ve been designing quadrotors the wrong way

By Evan Ackerman
Posted 13 Nov 2013

#18 Boston Dynamics’ PETMAN Humanoid Robot Walks and Does Push-Ups

Boston Dynamics releases stunning video showing off its most advanced humanoid robot

By Erico Guizzo
Posted 31 Oct 2011

#19 Boston Dynamics’ Spot Robot Dog Goes on Sale

Here’s everything we know about Boston Dynamics’ first commercial robot

By Erico Guizzo
Posted 24 Sep 2019

#20 Agility Robotics Introduces Cassie, a Dynamic and Talented Robot Delivery Ostrich

One day, robots like these will be scampering up your steps to drop off packages

By Evan Ackerman
Posted 9 Feb 2017

#21 Superfast Scanner Lets You Digitize Book By Flipping Pages

Tokyo University researchers develop scanner that can capture 200 pages in one minute

By Erico Guizzo
Posted 17 Mar 2010

#22 A Robot That Balances on a Ball

Masaaki Kumagai has built wheeled robots, crawling robots, and legged robots. Now he’s built a robot that rides on a ball

By Erico Guizzo
Posted 29 Apr 2010

#23 Top 10 Robotic Kinect Hacks

Microsoft’s Kinect 3D motion detector has been hacked into lots of awesome robots, and here are our 10 favorites

By Evan Ackerman
Posted 7 Mar 2011

#24 Latest AlphaDog Robot Prototypes Get Less Noisy, More Brainy

New video shows Boston Dynamics and DARPA putting AlphaDog through its paces

By Evan Ackerman
Posted 11 Sep 2012

#25 How South Korea’s DRC-HUBO Robot Won the DARPA Robotics Challenge

This transformer robot took first place because it was fast, adaptable, and didn’t fall down

By Erico Guizzo & Evan Ackerman
Posted 9 Jun 2015

#26 U.S. Army Considers Replacing Thousands of Soldiers With Robots

The U.S. Army could slash personnel numbers and toss in more robots instead

By Evan Ackerman
Posted 22 Jan 2014

#27 Google Acquires Seven Robot Companies, Wants Big Role in Robotics

The company is funding a major new robotics group and acquiring a bunch of robot startups

By Evan Ackerman & Erico Guizzo
Posted 4 Dec 2013

#28 Who Is SCHAFT, the Robot Company Bought by Google and Winner of the DRC?

Here’s everything we know about this secretive robotics startup

By Erico Guizzo & Evan Ackerman
Posted 6 Feb 2014

#29 Ground-Effect Robot Could Be Key To Future High-Speed Trains

Trains that levitate on cushions of air could be the future of fast and efficient travel, if this robot can figure out how to keep them stable

By Evan Ackerman
Posted 10 May 2011

#30 Hobby Robot Rides a Bike the Old-Fashioned Way

I don’t know where this little robot got its awesome bicycle, but it sure knows how to ride

By Evan Ackerman
Posted 24 Oct 2011

#31 SRI Demonstrates Abacus, the First New Rotary Transmission Design in 50 Years

Finally a gear system that could replace costly harmonic drives

By Evan Ackerman
Posted 19 Oct 2016

#32 Robotic Micro-Scallops Can Swim Through Your Eyeballs

Tiny robots modeled on scallops are able to swim through all the fluids in your body

By Evan Ackerman
Posted 4 Nov 2014

#33 Boston Dynamics Officially Unveils Its Wheel-Leg Robot: "Best of Both Worlds"

Handle is a humanoid robot on wheels, and it’s amazing

By Erico Guizzo & Evan Ackerman
Posted 27 Feb 2017

#34 iRobot Brings Visual Mapping and Navigation to the Roomba 980

The new robot vacuum uses VSLAM to navigate and clean larger spaces in satisfyingly straight lines

By Evan Ackerman & Erico Guizzo
Posted 16 Sep 2015

#35 When Will We Have Robots To Help With Household Chores?

Google, Microsoft, and Apple are investing in robots. Does that mean home robots are on the way?

By Satyandra K. Gupta
Posted 2 Jan 2014

#36 Robots Playing Ping Pong: What’s Real, and What’s Not?

Kuka’s robot vs. human ping pong match looks to be more hype than reality

By Evan Ackerman
Posted 12 Mar 2014

#37 BigDog Throws Cinder Blocks with Huge Robotic Face-Arm

I don’t know why BigDog needs a fifth limb to throw cinder blocks, but it’s incredibly awesome

By Evan Ackerman
Posted 28 Feb 2013

#38 Children Beating Up Robot Inspires New Escape Maneuver System

Japanese researchers show that children can act like horrible little brats towards robots

By Kate Darling
Posted 6 Aug 2015

#39 Boston Dynamics’ AlphaDog Quadruped Robot Prototype on Video

Boston Dynamics has just released some absolutely incredible video of their huge new quadruped robot, AlphaDog

By Evan Ackerman
Posted 30 Sep 2011

#40 Building a Super Robust Robot Hand

Researchers have built an anthropomorphic robot hand that can endure even strikes from a hammer without breaking into pieces

By Erico Guizzo
Posted 25 Jan 2011

#41 Who’s Afraid of the Uncanny Valley?

To design the androids of the future, we shouldn’t fear exploring the depths of the uncanny valley

By Erico Guizzo
Posted 2 Apr 2010

#42 Why AlphaGo Is Not AI

Google DeepMind’s artificial intelligence AlphaGo is a big advance but it will not get us to strong AI

By Jean-Christophe Baillie
Posted 17 Mar 2016

#43 Freaky Boneless Robot Walks on Soft Legs

This soft, inflatable, and totally creepy robot from Harvard can get up and walk on four squishy legs

By Evan Ackerman
Posted 29 Nov 2011

#44 Sweep Is a $250 LIDAR With Range of 40 Meters That Works Outdoors

Finally an affordable LIDAR for robots and drones

By Evan Ackerman
Posted 6 Apr 2016

#45 How Google Wants to Solve Robotic Grasping by Letting Robots Learn for Themselves

800,000 grasps is just the beginning for Google’s large-scale robotic grasping project

By Evan Ackerman
Posted 28 Mar 2016

#46 Whoa: Boston Dynamics Announces New WildCat Quadruped Robot

A new robot from Boston Dynamics can run outdoors, untethered, at up to 25 km/h

By Evan Ackerman
Posted 3 Oct 2013

#47 SCHAFT Unveils Awesome New Bipedal Robot at Japan Conference

SCHAFT demos a new bipedal robot designed to "help society"

By Evan Ackerman & Erico Guizzo
Posted 8 Apr 2016

#48 Riding Honda’s U3-X Unicycle of the Future

It only has one wheel, but Honda’s futuristic personal mobility device is no pedal-pusher

By Anne-Marie Corley
Posted 12 Apr 2010

#49 Lingodroid Robots Invent Their Own Spoken Language

These little robots make up their own words to tell each other where they are and where they want to go

By Evan Ackerman
Posted 17 May 2011

#50 Disney Robot With Air-Water Actuators Shows Off "Very Fluid" Motions

Meet Jimmy, a robot puppet powered by fluid actuators

By Erico Guizzo
Posted 1 Sep 2016

#51 Kilobots Are Cheap Enough to Swarm in the Thousands

What can you do with a $14 robot? Not much. What can you do with a thousand $14 robots? World domination

By Evan Ackerman
Posted 16 Jun 2011

#52 Honda Robotics Unveils Next-Generation ASIMO Robot

We heard some rumors that Honda was working on something big, and here it is: a brand new ASIMO

By Evan Ackerman
Posted 7 Nov 2011

#53 Cybernetic Third Arm Makes Drummers Even More Annoying

It keeps proper time and comes with an off switch, making this robotic third arm infinitely better than a human drummer

By Evan Ackerman
Posted 18 Feb 2016

#54 Chatbot Tries to Talk to Itself, Things Get Weird

"I am not a robot. I am a unicorn."

By Evan Ackerman
Posted 29 Aug 2011

#55 Dean Kamen’s "Luke Arm" Prosthesis Receives FDA Approval

This advanced bionic arm for amputees has been approved for commercialization

By Erico Guizzo
Posted 13 May 2014

#56 Meet the Amazing Robots That Will Compete in the DARPA Robotics Challenge

Over the next two years, robotics will be revolutionized, and here’s how it’s going to happen

By Evan Ackerman
Posted 24 Oct 2012

#57 ReWalk Robotics’s New Exoskeleton Lets Paraplegic Stroll the Streets of NYC

Yesterday, a paralyzed man strapped on a pair of robotic legs and stepped out a hotel door in midtown Manhattan

By Eliza Strickland
Posted 15 Jul 2015

#58 Drone Uses AI and 11,500 Crashes to Learn How to Fly

Crashing into objects has taught this drone to fly autonomously, by learning what not to do

By Evan Ackerman
Posted 10 May 2017

#59 Lego Announces Mindstorms EV3, a More ‘Hackable’ Robotics Kit

Lego’s latest Mindstorms kit has a new IR sensor, runs on Linux, and is compatible with Android and iOS apps

By Erico Guizzo & Stephen Cass
Posted 7 Jan 2013

#60 Boston Dynamics’ Marc Raibert on Next-Gen ATLAS: "A Huge Amount of Work"

The founder of Boston Dynamics describes how his team built one of the most advanced humanoids ever

By Erico Guizzo & Evan Ackerman
Posted 24 Feb 2016

#61 AR Drone That Infects Other Drones With Virus Wins DroneGames

Other projects included a leashed auto-tweeting drone, and code to control a swarm of drones all at once

By Evan Ackerman
Posted 6 Dec 2012

#62 DARPA Robotics Challenge: A Compilation of Robots Falling Down

Gravity is a bad thing for robots

By Erico Guizzo & Evan Ackerman
Posted 6 Jun 2015

#63 Bosch’s Giant Robot Can Punch Weeds to Death

A modular agricultural robot from Bosch startup Deepfield Robotics deals with weeds the old fashioned way: violently

By Evan Ackerman
Posted 12 Nov 2015

#64 How to Make a Humanoid Robot Dance

Japanese roboticists demonstrate a female android singing and dancing along with a troupe of human performers

By Erico Guizzo
Posted 2 Nov 2010

#65 What Technologies Enabled Drones to Proliferate?

Five years ago few people had even heard of quadcopters. Now they seem to be everywhere

By Markus Waibel
Posted 19 Feb 2010

#66 Video Friday: Professor Ishiguro’s New Robot Child, and More

Your weekly selection of awesome robot videos

By Evan Ackerman, Erico Guizzo & Fan Shi
Posted 3 Aug 2018

#67 Drone Provides Colorado Flooding Assistance Until FEMA Freaks Out

Drones can provide near real-time maps in weather that grounds other aircraft, but FEMA has banned them

By Evan Ackerman
Posted 16 Sep 2013

#68 A Thousand Kilobots Self-Assemble Into Complex Shapes

This is probably the most robots that have ever been in the same place at the same time, ever

By Evan Ackerman
Posted 14 Aug 2014

#69 Boston Dynamics’ SpotMini Is All Electric, Agile, and Has a Capable Face-Arm

A fun-sized version of Spot is the most domesticated Boston Dynamics robot we’ve seen

By Evan Ackerman
Posted 23 Jun 2016

#70 Kenshiro Robot Gets New Muscles and Bones

This humanoid is trying to mimic the human body down to muscles and bones

By Angelica Lim
Posted 10 Dec 2012

#71 Roomba Inventor Joe Jones on His New Weed-Killing Robot, and What’s So Hard About Consumer Robotics

The inventor of the Roomba tells us about his new solar-powered, weed-destroying robot

By Evan Ackerman
Posted 6 Jul 2017

#72 George Devol: A Life Devoted to Invention, and Robots

George Devol’s most famous invention—the first programmable industrial robot—started a revolution in manufacturing that continues to this day

By Bob Malone
Posted 26 Sep 2011

#73 World Robot Population Reaches 8.6 Million

Here’s an estimate of the number of industrial and service robots worldwide

By Erico Guizzo
Posted 14 Apr 2010

#74 U.S. Senator Calls Robot Projects Wasteful. Robots Call Senator Wasteful

U.S. Senator Tom Coburn criticizes the NSF for squandering "millions of dollars on wasteful projects," including three that involve robots

By Erico Guizzo
Posted 14 Jun 2011

#75 Inception Drive: A Compact, Infinitely Variable Transmission for Robotics

A novel nested-pulley configuration forms the heart of a transmission that could make robots safer and more energy efficient

By Evan Ackerman & Celia Gorman
Posted 20 Sep 2017

#76 iRobot Demonstrates New Weaponized Robot

iRobot has released video showing a Warrior robot deploying an anti-personnel obstacle breaching system

By John Palmisano
Posted 30 May 2010

#77 Robotics Trends for 2012

Nearly a quarter of the year is already behind us, but we thought we’d spend some time looking at the months ahead and make some predictions about what’s going to be big in robotics

By Erico Guizzo & Travis Deyle
Posted 20 Mar 2012

#78 DRC Finals: CMU’s CHIMP Gets Up After Fall, Shows How Awesome Robots Can Be

The most amazing run we saw all day came from CHIMP, which was the only robot to fall and get up again

By Evan Ackerman & Erico Guizzo
Posted 5 Jun 2015

#79 Lethal Microdrones, Dystopian Futures, and the Autonomous Weapons Debate

The future of weaponized robots requires a reasoned discussion, not scary videos

By Evan Ackerman
Posted 15 Nov 2017

#80 Every Kid Needs One of These DIY Robotics Kits

For just $200, this kit from a CMU spinoff company is a great way for total beginners to get started building robots

By Evan Ackerman
Posted 11 Jul 2012

#81 Beautiful Fluid Actuators from Disney Research Make Soft, Safe Robot Arms

Routing forces through air and water allows for displaced motors and safe, high-performance arms

By Evan Ackerman
Posted 9 Oct 2014

#82 Boston Dynamics Sand Flea Robot Demonstrates Astonishing Jumping Skills

Watch a brand new video of Boston Dynamics’ Sand Flea robot jumping 10 meters into the air

By Evan Ackerman
Posted 28 Mar 2012

#83 Eyeborg: Man Replaces False Eye With Bionic Camera

Canadian filmmaker Rob Spence has replaced his false eye with a bionic camera eye. He showed us his latest prototype

By Tim Hornyak
Posted 11 Jun 2010

#84 We Should Not Ban ‘Killer Robots,’ and Here’s Why

What we really need is a way of making autonomous armed robots ethical, because we’re not going to be able to prevent them from existing

By Evan Ackerman
Posted 28 Jul 2015

#85 Yale’s Robot Hand Copies How Your Fingers Work to Improve Object Manipulation

These robotic fingers can turn friction on and off to make it easier to manipulate objects with one hand

By Evan Ackerman
Posted 12 Sep 2018

#86 France Developing Advanced Humanoid Robot Romeo

Nao, the small French humanoid robot, is getting a big brother

By Erico Guizzo
Posted 13 Dec 2010

#87 DARPA Wants to Give Soldiers Robot Surrogates, Avatar Style

Soldiers controlling bipedal robot surrogates on the battlefield? It’s not science fiction, it’s DARPA’s 2012 budget

By Evan Ackerman
Posted 17 Feb 2012

#88 Whoa: Quadrotors Play Catch With Inverted Pendulum

Watch these quadrotors balance a stick on its end, and then toss it back and forth

By Evan Ackerman
Posted 21 Feb 2013

#89 Why We Should Build Humanlike Robots

Humans are brilliant, beautiful, compassionate, and capable of love. Why shouldn’t we aspire to make robots humanlike in these ways?

By David Hanson
Posted 1 Apr 2011

#90 DARPA Robotics Challenge Finals: Know Your Robots

All 25 robots in a single handy poster-size image

By Erico Guizzo & Evan Ackerman
Posted 3 Jun 2015

#91 Here’s That Extra Pair of Robot Arms You’ve Always Wanted

MIT researchers develop wearable robotic arms that can give you an extra hand (or two)

By Evan Ackerman
Posted 2 Jun 2014

#92 Rat Robot Beats on Live Rats to Make Them Depressed

A robotic rat can be used to depress live rats to make them suitable for human drug trials

By Evan Ackerman
Posted 13 Feb 2013

#93 MIT Cheetah Robot Bounds Off Tether, Outdoors

The newest version of MIT’s Cheetah is fast, it’s quiet, and it jumps

By Evan Ackerman
Posted 15 Sep 2014

#94 Bizarre Soft Robots Evolve to Run

These simulated robots may be wacky looking, but they’ve evolved on their own to be fast and efficient

By Evan Ackerman
Posted 11 Apr 2013

#95 Robot Car Intersections Are Terrifyingly Efficient

In the future, robots will blow through intersections without stopping, and you won’t be able to handle it

By Evan Ackerman
Posted 13 Mar 2012

#96 iRobot’s New Roomba 800 Series Has Better Vacuuming With Less Maintenance

A redesigned cleaning system makes the new vacuum way better at dealing with hair (and everything else)

By Evan Ackerman
Posted 12 Nov 2013

#97 Sawyer: Rethink Robotics Unveils New Robot

It’s smaller, faster, stronger, and more precise: meet Sawyer, Rethink Robotics’ new manufacturing robot

By Evan Ackerman & Erico Guizzo
Posted 19 Mar 2015

#98 Cynthia Breazeal Unveils Jibo, a Social Robot for the Home

The famed MIT roboticist is launching a crowdfunding campaign to bring social robots to consumers

By Erico Guizzo
Posted 16 Jul 2014

#99 These Robots Will Teach Kids Programming Skills

Startup Play-i says its robots can make computer programming fun and accessible

By Erico Guizzo
Posted 30 Oct 2013

#100 Watch a Swarm of Flying Robots Build a 6-Meter Brick Tower

This is what happens when a bunch of roboticists and architects get together in an art gallery

By Erico Guizzo
Posted 2 Dec 2011

This robot is Hiro-chan. It’s made by Vstone, a Japanese robotics company known for producing a variety of totally normal educational and hobby robotics kits and parts. Hiro-chan is not what we would call totally normal, since it very obviously does not have a face. Vstone calls Hiro-chan a “healing communication device,” and while the whole faceless aspect is definitely weird, there is a reason for it, which unsurprisingly involves Hiroshi Ishiguro and his ATR Lab.

Hiro-chan’s entire existence seems to be based around transitioning from sad to happy in response to hugs. If left alone, Hiro-chan’s mood will gradually worsen and it’ll start crying. If you pick it up and hug it, an accelerometer will sense the motion, and Hiro-chan’s mood will improve until it starts to laugh. This is the extent of the interaction, but you’ll be glad to know that the robot has access to over 100 utterance variations collected from an actual baby (or babies) to make sure that mood changes are fluid and seamless. 
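Vstone hasn’t published the algorithm behind this behavior, but the loop described above — mood decaying when the robot is left alone, recovering when the accelerometer senses hugging motion — can be sketched in a few lines. Everything here (class name, thresholds, rates, sound-bank names) is invented for illustration, not Vstone’s actual implementation:

```python
import random

class MoodBot:
    """Toy sketch of a hug-driven mood loop like Hiro-chan's."""

    def __init__(self):
        self.mood = 0.0  # -1.0 = crying, +1.0 = laughing

    def update(self, accel_magnitude, dt):
        """Advance the mood by one time step of dt seconds.

        accel_magnitude is the deviation from 1 g reported by the
        accelerometer, used as a crude proxy for being picked up and rocked.
        """
        if accel_magnitude > 0.2:   # motion detected: being hugged
            self.mood = min(1.0, self.mood + 0.5 * dt)
        else:                       # left alone: mood slowly worsens
            self.mood = max(-1.0, self.mood - 0.1 * dt)
        return self.utterance()

    def utterance(self):
        # Drawing from banks of recorded baby sounds (Hiro-chan reportedly
        # has over 100 variations) keeps the transitions sounding smooth.
        if self.mood > 0.5:
            return random.choice(["laugh_01", "laugh_02", "giggle_03"])
        elif self.mood < -0.5:
            return random.choice(["cry_01", "cry_02", "whimper_03"])
        return random.choice(["coo_01", "babble_02"])

bot = MoodBot()
for _ in range(10):                     # ten seconds ignored...
    bot.update(accel_magnitude=0.0, dt=1.0)
assert bot.mood < -0.5                  # ...and it's crying
for _ in range(10):                     # ten seconds of hugging...
    bot.update(accel_magnitude=0.5, dt=1.0)
assert bot.mood > 0.5                   # ...and it's laughing again
```

The asymmetric rates (slow decay, fast recovery) are a guess, but they match the interaction the product description implies: neglect builds up gradually, while a hug produces a quick, rewarding turnaround.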

According to Japanese blog RobotStart, the target demographic for Hiro-chan is seniors, although it’s simple enough in operation that pretty much anyone could likely pick one up and figure out what they’re supposed to do with it. The end goal is the “healing effect” (a sense of accomplishment, I guess?) that you’d get from making the robot feel better.

Photo: Vstone

At 5,500 JPY (about US $50), Vstone expects that Hiro-chan could be helpful with seniors in nursing homes.

So why doesn’t the robot have a face? Since the functionality of the robot depends on you getting it to go from sad to happy, Vstone says that giving the robot a face (and therefore a fixed expression) would make that transition much less convincing and emotionally fulfilling—the robot would have the “wrong” expression half the time. Instead, the user can listen to Hiro-chan’s audio cues and imagine a face. Or not. Either way, the Uncanny Valley effect is avoided (as long as you can get over the complete lack of a face, which I personally couldn’t), and the cost of the robot stays low, since there’s no need for actuators or a display.

Photo: Hiroshi Ishiguro/Osaka University/ATR

The Telenoid robot developed by Hiroshi Ishiguro’s group at ATR in Japan.

The concept that a user can imagine or project features and emotions onto a robot, as long as it presents a blank enough slate, came from Hiroshi Ishiguro’s Telenoid, followed by Elfoid and Hugvie. While Telenoid and Elfoid did have faces, those faces were designed to look neither young nor old, and neither male nor female. When you communicate with another human through Telenoid or Elfoid, the neutral look of the robot makes it easier to imagine that it looks something like whoever’s on the other end. Or that’s the idea, anyway. Hiro-chan itself was developed in cooperation with Hidenobu Sumioka, who leads the Presence Media Research Group at the Hiroshi Ishiguro Laboratory at ATR.

Vstone says the lack of a face is expected to enhance user attachment to the robot, and that testing during product development “showed that designs without faces were as popular as designs with faces.” Vstone suggests that users can further enhance attachment by making clothing for the robot, and the company will provide patterns on its website when Hiro-chan is released. Otherwise, there’s really not much to the robot: It runs on AA batteries and has an on-off switch and, mercifully, a volume control, although the FAQ suggests that the robot may sometimes laugh even when it’s all by itself in a different room, which is not creepy at all.

Photo: Vstone

Vstone says the lack of a face is expected to enhance user attachment to the robot.

At 5,500 JPY (about US $50), Vstone expects that Hiro-chan could be helpful with seniors in nursing homes, relating this anecdote: 

In tests at nursing homes that cooperated in Hiro-chan’s development, even residents who did not respond to facility staff spontaneously started crying when Hiro-chan cried, and were seen smiling when Hiro-chan laughed. By introducing Hiro-chan, you can expect not only healing for the user, but also a reduction in the workload of facility staff.

Sounds like a great idea, but I still don’t want one.

[ Vstone ]

When Anki abruptly shut down in April of last year, things looked bleak for Vector, Cozmo, and the little Overdrive racing cars. Usually, abrupt shutdowns don’t end well, with assets and intellectual property getting liquidated and effectively disappearing forever. Despite some vague promises (more like hopes, really) from Anki at the time that its cloud-dependent robots would continue to operate, it was pretty clear that Anki’s robots wouldn’t have much of a future—at best, they’d continue to work only as long as there was money to support the cloud servers that gave them their spark of life.

A few weeks ago, The Robot Report reported that Anki’s intellectual property (patents, trademarks, and data) was acquired by Digital Dream Labs, an education tech startup based in Pittsburgh. Over the weekend, a new post on the Vector Kickstarter page (the campaign happened in 2018) from Digital Dream Labs CEO Jacob Hanchar announced that not only will Vector’s cloud servers keep running indefinitely, but that the next few months will see a new Kickstarter to add new features and future-proofing to Vectors everywhere.

Here’s the announcement from Hanchar:

I wanted to let you know that we have purchased Anki's assets and intend to restore the entire platform and continue to develop the robot we all know and love, Vector!

The most important part of this update is to let you know we have taken over the cloud servers and are going to maintain them going forward.  Therefore, if you were concerned about Vector 'dying' one day, you no longer have to worry!  

The next portion of this update is to let you know what we have planned next and we will be announcing a KickStarter under Digital Dream Labs in the next month or two.  While we are still brainstorming we are thinking the Kickstarter will focus on two features we have seen as major needs in the Vector community:

1)  We will develop an "Escape Pod".  This will, safely, expose settings and allow the user to move and set endpoints, and by doing so, remove the need for the cloud server.  In other words, if you're concerned Anki's demise could also happen to us, this is your guarantee that no matter what happens, you'll always get to play with Vector!

2)  We will develop a "Dev Vector".  Many users have asked us for open source and the ability to do more with their Vector even to the point of hosting him on their own servers.  With this feature, developers will be able to customize their robot through a bootloader we will develop.  With the robot unlocked, technologists and hobbyists across the globe will finally be able to hack, with safe guards in place, away at Vector for the ultimate AI and machine learning experience!

As a bonus, we will see about putting together an SDK so users can play with Vector's audio stream and system, which we have discovered is a major feature you guys love about this little guy!

This is just the beginning and subject to change, but because you have shown such loyalty and got this project off the ground in the first place, I felt it was necessary to communicate these developments as soon as possible! 
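The “Escape Pod” idea in the announcement — letting users repoint the robot’s cloud endpoints at a server they control, so the robot survives even if the company doesn’t — boils down to a configuration override. The sketch below illustrates the concept only; the keys, URLs, and defaults are all hypothetical, not Anki’s or Digital Dream Labs’ actual scheme:

```python
# Illustrative "escape pod" endpoint override for a cloud-dependent robot.
# Every key, URL, and default here is invented for the sake of example.

DEFAULT_ENDPOINTS = {
    "voice": "https://cloud.example.com/v1/voice",
    "updates": "https://cloud.example.com/v1/ota",
}

def resolve_endpoints(user_overrides=None):
    """Merge user-supplied endpoints over the factory defaults.

    If the vendor's cloud ever disappears, pointing these at a
    self-hosted server keeps the robot functional.
    """
    endpoints = dict(DEFAULT_ENDPOINTS)  # copy, don't mutate defaults
    if user_overrides:
        endpoints.update(user_overrides)
    return endpoints

# A user running their own server on the local network:
local = resolve_endpoints({"voice": "http://192.168.1.50:8080/voice"})
assert local["voice"].startswith("http://192.168.1.50")
assert local["updates"] == DEFAULT_ENDPOINTS["updates"]  # untouched
```

The real work, of course, is reimplementing the server side of whatever protocol the robot speaks — the override mechanism is the easy part.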

There are a few more details in the comments on this post—Hanchar notes that they didn’t get any of Anki’s physical inventory, meaning that at least for now, you won’t be able to buy any robots from them. However, Hanchar told The Robot Report that they’ve been talking with ex-Anki employees and manufacturers about getting new robots, with a goal of having the whole family (Vector, Cozmo, and Overdrive) available for the 2020 holidays. 

Photo: Anki

Anki’s Cozmo robot.

Despite the announcement on the Vector Kickstarter page, it sounds like Cozmo will be the initial focus, because Cozmo works best with Digital Dream Labs’ existing educational products. The future of Vector, presumably, will depend on how well the forthcoming Kickstarter does. In its FAQ about the Anki acquisition, Digital Dream Labs says that they “will need to examine the business model surrounding Vector before we can relaunch that product,” and speaking with The Robot Report, Hanchar suggested that “monthly subscription packages” in a few different tiers might be the way to make sure that Vector stays profitable. 

It’s probably too early to get super excited about this, but it’s definitely far better news than we were expecting, and Anki’s robots now seem like they could potentially have a future. Hanchar even mentioned something about a “Vector 2.0,” whatever that means. In the short term, I think most folks would be pretty happy with a Vector 1.0 with support, some new features, and no expiration date, and that could be exactly what we’re getting. 

[ Anki Vector ]

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland
DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA
HRI 2020 – March 23-26, 2020 – Cambridge, U.K.
ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores
ICRA 2020 – May 31-June 4, 2020 – Paris, France

Let us know if you have suggestions for next week, and enjoy today’s videos.

IIT’s new HyQReal quadruped robot was released in May 2019. This highlight video shows previously unpublished footage of how we prepared the robot to pull a 3.3-ton airplane. It also shows the robot walking over unstructured terrain and at public events in October 2019, including a face-to-face with a dog.

[ IIT ]

Thanks Claudio!

Agility Robotics has had a very busy 2019, and all 10 minutes of this video is worth watching.

Also: double Digits.

[ Agility Robotics ]

Happy (belated) holidays from Franka Emika!

[ Franka Emika ]

Thanks Anna!

Happy (belated) holidays from the GRASP lab!

[ GRASP Lab ]

Happy (belated) holidays from the Autonomous Robots Lab at the University of Nevada!

[ ARL ]

Happy (belated) holidays from the Georgia Tech Systems Research Lab!

[ GA Tech ]

Thanks Qiuyang!

NASA’s Jet Propulsion Laboratory has attached the Mars 2020 Helicopter to the belly of the Mars 2020 rover.

[ JPL ]

This isn’t a Roomba, mind you—are we at the point where “Roomba” is like “Xerox” or “Velcro,” representing a category rather than a brand?—but it does have a flying robot vacuum in it.

[ YouTube ] via [ Gizmodo ]

We’ve said it before, and it’s still true: Every quadrotor should have failsafe software like this.

[ Verity ]

KUKA robots are on duty at one of the largest tea factories in the world located in Rize, Turkey.

[ Kuka ]

This year, make sure and take your robot for more walks.

[ Sphero ]

Dorabot’s recycling robot can identify, pick, and sort recyclable items such as plastic bottles, glass bottles, paper, cartons, and aluminum cans. It uses deep learning-based computer vision and dynamic planning to select items on a moving conveyor belt, along with customized, erosion-resistant grippers for picking irregularly shaped items, making for a cost-effective integrated solution.

[ Dorabot ]

This cute little boat takes hyperlapse pictures autonomously, while more or less not sinking.

[ rctestflight ] via [ PetaPixel ]

Roboy’s Research Reviews takes a look at the OmniSkins paper from 2018.

[ RRR ]

When thinking about robot ethics (and robots in general), it’s typical to use humans and human ethics as a baseline. But what if we considered animals as a point of comparison instead? Ryan Calo, Kate Darling, and Paresh Kathrani were on a panel at the Animal Law Conference last month entitled Persons yet Unknown: Animals, Chimeras, Artificial Intelligence and Beyond where this idea was explored.

[ YouTube ]

Sasha Iatsenia, who was until very recently head of product at Kiwibot, gives a candid talk about “How (not) to build autonomous robots.”

We should mention that Kiwibot does seem to still be alive.

[ CCC ]

On this episode of the Artificial Intelligence Podcast, Lex Fridman interviews Sebastian Thrun.

[ AI Podcast ]

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland
DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA
ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores

Let us know if you have suggestions for next week, and enjoy today’s videos.

Thank you to our readers and Happy Holidays from IEEE Spectrum’s robotics team!
—Erico, Evan, and Fan

Happy Holidays from FZI Living Lab!

This is what a robot holiday video should be. Amazing work from FZI!

[ FZI ]

Thanks Arne!

This is the robot I’m most excited about for 2020:

[ IIT ]

Happy Holidays from ETH Zurich’s Autonomous Systems Lab!

[ ASL ]

Digit v2 demonstrates autonomous pick and place with multiple boxes.

[ Agility Robotics ]

Happy Holidays from EPFL LMTS, whose soft robots we wrote about this week!

NOW SMACK THEM!

[ LMTS ]

Happy Holidays from ETH Zurich’s Robotic Systems Lab!

[ RSL ]

Happy Holidays from OTTO Motors!

OTTO Motors is based in Ontario, which, being in Canada, is basically the North Pole.

[ OTTO Motors ]

Happy Holidays from FANUC!

[ FANUC ]

Brain Corp makes the brains required to turn manual cleaning machines into autonomous robotic cleaning machines.

Braaains.

[ Brain Corp ]

Happy Holidays from RE2 Robotics!

[ RE2 ]

Happy Holidays from Denso Robotics!

[ Denso ]

Happy Holidays from Robodev!

That sandwich thing looks pretty good, but I'm not sold on the potato.

[ Robodev ]

Thanks Andreas!

Happy Holidays from Kawasaki Robotics!

[ Kawasaki ]

On Dec. 17, 2019, engineers took NASA’s next Mars rover for its first spin. The test took place in the Spacecraft Assembly Facility clean room at NASA’s Jet Propulsion Laboratory in Pasadena, California. This was the first drive test for the new rover, which will move to Cape Canaveral, Florida, in the beginning of next year to prepare for its launch to Mars in the summer. Engineers are checking that all the systems are working together properly, the rover can operate under its own weight, and the rover can demonstrate many of its autonomous navigation functions. The launch window for Mars 2020 opens on July 17, 2020. The rover will land at Mars' Jezero Crater on Feb. 18, 2021.

[ JPL ]

Happy Holidays from Laval University’s Northern Robotics Laboratory!

[ Norlab ]

The Chaparral is a hybrid-electric vertical takeoff and landing (VTOL) cargo aircraft being developed by the team at Elroy Air in San Francisco, Calif. The system will carry 300 lbs of cargo over a 300-mile range. This video reveals a bit more about the system than we’ve shown in the past. Enjoy!

[ Elroy Air ]

FANUC's new CRX-10iA and CRX-10iA/L collaborative robots feature quick setup, easy programming and reliable performance.

[ FANUC ]

Omron’s ping pong robot is pretty good at the game, as long as you’re only pretty good at the game. If you’re much better than pretty good, it’s pretty bad.

[ Omron ]

The Voliro drone may not look like it’s doing anything all that difficult but wait until it flips 90 degrees and stands on its head!

[ Voliro ]

Based on a unique, patented technology, ROVéo can swiftly tackle rough terrain, as well as steps and stairs, by simply adapting to their shape. It is ideal for monitoring security both outside and inside big industrial sites.

[ Rovenso ]

A picture says more than a thousand words, a video more than a thousand pictures. For this reason, we have produced a series of short films that present the researchers at the Max Planck Institute for Intelligent Systems, their projects and goals. We want to give an insight into our institute, making the work done here understandable for everyone. We continue the series with a portrait of the "Dynamic Locomotion" Max Planck research group led by Dr. Alexander Badri-Spröwitz.

[ Max Planck ]

Thanks Fan!

This is a 13-minute-long IREX demo of Kawasaki’s Kaleido humanoid.

[ Kawasaki ]

Learn how TRI is working to build an uncrashable car, use robotics to amplify people’s capabilities as they age and leverage artificial intelligence to enable discovery of new materials for batteries and fuel cells.

[ Girl Geek X ]

Researchers at EPFL have developed a soft robotic insect that uses artificial soft muscles called dielectric elastomer actuators to drive tiny feet that propel the little bot along at a respectable speed. And since the whole thing is squishy and already mostly flat, you can repeatedly smash it into the ground with a fly swatter, and then peel it off and watch it start running again. Get ready for one of the most brutal robot abuse videos you’ve ever seen.

We’re obligated to point out that the version of the robot that survives being raged on with the swatter is a tethered one, not the autonomous version with the battery and microcontroller and sensors, which might not react so well to repeated batterings. But still, it’s pretty cool to see it get peeled right off and keep on going, and the researchers say they’ve been able to do this smash n’ peel eight times in a row without destroying the robot.

Powered by dielectric elastomer actuators

One of the tricky things about building robots like these (that rely on very high-speed actuation) is power—the power levels themselves are usually low, in the milliwatt range, but the actuators generally require several kilovolts to function, meaning that you need a bunch of electronics that can boost the battery voltage up to something you can use. Even miniaturized power systems are in the tens of grams, which is obviously impractical for a robot that weighs one gram or less. Dielectric elastomer actuators, or DEAs, are no exception to this, so the researchers instead used a stack of DEAs that could run at a significantly lower voltage. These low-voltage stacked DEAs (LVSDEAs, because more initialisms are better) run at just 450 volts, but cycle at up to 600 hertz, using power electronics weighing just 780 milligrams.

Image: EPFL Each soft robot uses three LVSDEAs to operate three independent legs.

The LVSDEA actuation is converted into motion by using flexible angled legs, similar to a bristlebot. One leg on each side allows the robot to turn, pivoting around a third supporting leg in the front. Top speed of the 190-mg tethered robot is 18 mm/s (0.5 body-lengths/s), while the autonomous version, with an 800-mg payload of batteries, electronics, and sensors, could move at 12 mm/s for 14 minutes before running out of juice. Interestingly, stiffening the structure of the robot by holding it in a curved shape with a piece of tape significantly increased its performance, nearly doubling its speed to 30 mm/s (0.85 body-lengths/s) and boosting its payload capacity as well.
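As a quick sanity check on those speed figures (the body length isn't stated in the article; roughly 36 mm is inferred here from 18 mm/s being quoted as 0.5 body lengths per second):

```python
# Convert the reported speeds into body lengths per second.
# BODY_LENGTH_MM is an inference, not a number from the paper.
BODY_LENGTH_MM = 36.0

def body_lengths_per_s(speed_mm_s, body_length_mm=BODY_LENGTH_MM):
    """Speed normalized by body length, a common metric for small robots."""
    return speed_mm_s / body_length_mm

print(body_lengths_per_s(18))  # 0.5, matching the tethered robot's quoted figure
print(body_lengths_per_s(30))  # ~0.83, close to the quoted 0.85 for the stiffened robot
```

The small discrepancy at 30 mm/s suggests the authors rounded the body length slightly differently; the numbers are consistent either way.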

What we’re all waiting for, of course, is a soft robot that can be smashable and untethered at the same time. This is always the issue with soft robots—they’re almost always just mostly soft, requiring either off-board power or rigid components in the form of electronics or batteries. The EPFL researchers say that they’re “currently working on an untethered and entirely soft version” in partnership with Stanford, which we’re very excited to see.

[ EPFL ]

For the most part, robots are a mystery to end users. And that’s part of the point: Robots are autonomous, so they’re supposed to do their own thing (presumably the thing that you want them to do) and not bother you about it. But as humans start to work more closely with robots, in collaborative tasks or social or assistive contexts, it’s going to be hard for us to trust them if their autonomy is such that we find it difficult to understand what they’re doing.

In a paper published in Science Robotics, researchers from UCLA developed a robotic system that can generate different kinds of real-time, human-readable explanations about its actions, and then ran tests to figure out which explanations were most effective at improving a human’s trust in the system. Does this mean we can totally understand and trust robots now? Not yet—but it’s a start.

This work was funded by DARPA’s Explainable AI (XAI) program, which has a goal of being able to “understand the context and environment in which they operate, and over time build underlying explanatory models that allow them to characterize real world phenomena.” According to DARPA, “explainable AI—especially explainable machine learning—will be essential if [humans] are to understand, appropriately trust, and effectively manage an emerging generation of artificially intelligent machine partners.”

There are a few different issues that XAI has to tackle. One of those is the inherent opaqueness of machine learning models, where you throw a big pile of training data at some kind of network, which then does what you want it to do most of the time but also sometimes fails in weird ways that are very difficult to understand or predict. A second issue is figuring out how AI systems (and the robots that they inhabit) can effectively communicate what they’re doing with humans, via what DARPA refers to as an explanation interface. This is what UCLA has been working on.

The present project aims to disentangle explainability from task performance, measuring each separately to gauge the advantages and limitations of two major families of representations—symbolic representations and data-driven representations—in both task performance and fostering human trust. The goals are to explore (i) what constitutes a good performer for a complex robot manipulation task? (ii) How can we construct an effective explainer to explain robot behavior and foster human trust?

UCLA’s Baxter robot learned how to open a safety-cap medication bottle (tricky for robots and humans alike) by learning a manipulation model from haptic demonstrations provided by humans opening medication bottles while wearing a sensorized glove. This was combined with a symbolic action planner to allow the robot to adjust its actions to adapt to bottles with different kinds of caps, and it does a good job without the inherent mystery of a neural network.

Intuitively, such an integration of the symbolic planner and haptic model enables the robot to ask itself: “On the basis of the human demonstration, the poses and forces I perceive right now, and the action sequence I have executed thus far, which action has the highest likelihood of opening the bottle?”
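That "ask itself" step is essentially a scored argmax over candidate actions. Here's a toy sketch in that spirit; the scoring function, the candidate actions, and the `haptic_likelihood` interface are all illustrative assumptions, not the paper's actual model:

```python
# Toy action selector: combine a symbolic plan prior with a (hypothetical)
# haptic likelihood learned from glove demonstrations.
def select_action(candidates, plan_prior, haptic_likelihood, pose, force):
    # score(a) = P(a | plan so far) * P(pose, force | a, demonstrations)
    return max(candidates,
               key=lambda a: plan_prior[a] * haptic_likelihood(a, pose, force))

# Illustrative numbers only.
plan_prior = {"grasp": 0.2, "push": 0.3, "twist": 0.5}
haptic = lambda a, pose, force: {"grasp": 0.5, "push": 0.9, "twist": 0.6}[a]
print(select_action(list(plan_prior), plan_prior, haptic, None, None))  # twist
```

The point of the structure, as opposed to an end-to-end network, is that both factors in the score are individually inspectable, which is what makes the explanations possible.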

Both the haptic model and the symbolic planner can be leveraged to provide human-compatible explanations of what the robot is doing. The haptic model can visually explain an individual action that the robot is taking, while the symbolic planner can show a sequence of actions that are (ideally) leading towards a goal. What’s key here is that these explanations are coming from the planning system itself, rather than something that’s been added later to try to translate between a planner and a human.

Image: Science Robotics

As the robot performs a set of actions (top row of images), its symbolic planner (middle row) and haptic model (bottom row) generate explanations for each action. The red on the robot gripper’s palm indicates a large magnitude of force applied by the gripper, and green indicates no force. These explanations are provided in real time as the robot executes the actions.

To figure out whether these explanations made a difference in the level of a human’s trust or confidence or belief that the robot would be successful at its task, the researchers conducted a psychological study with 150 participants. While watching a video of the robot opening a medicine bottle, groups of participants were shown the haptic planner, the symbolic planner, or both planners at the same time, while two other groups were either shown no explanation at all, or a human-generated one-sentence summary of what the robot did. Survey results showed that the highest trust rating came from the group that had access to both the symbolic and haptic explanations, although the symbolic explanation was more impactful.

In general, humans appear to need real-time, symbolic explanations of the robot’s internal decisions for performed action sequences to establish trust in machines performing multistep complex tasks… Information at the haptic level may be excessively tedious and may not yield a sense of rational agency that allows the robot to gain human trust. To establish human trust in machines and enable humans to predict robot behaviors, it appears that an effective explanation should provide a symbolic interpretation and maintain a tight temporal coupling between the explanation and the robot’s immediate behavior.

This paper focuses on a very specific interpretation of the word “explain.” The robot is able to explain what it’s doing (i.e. the steps that it’s taking) in a way that is easy for humans to interpret, and it’s effective in doing so. However, it’s really just explaining the “what” rather than the “why,” because at least in this case, the “why” (as far as the robot knows) is really just “because a human did it this way” due to the way the robot learned to do the task.

While the “what” explanations did foster more trust in humans in this study, long term, XAI will need to include “why” as well, and the example of the robot unscrewing a medicine bottle illustrates a situation in which it would be useful.

Image: Science Robotics In one study, the researchers showed participants a video of the robot opening the bottle (A). Different groups saw different explanation panels along with the video: (B) Symbolic explanation panel; (C) Haptic explanation panel; (D) Text explanation panel.

You can see that there are several repetitive steps in this successful bottle opening, and as an observer, I have no way of knowing if the robot is repeating an action because the first action failed, or if that was just part of its plan. Maybe opening the bottle really takes just a single grasp-push-twist sequence, but the robot’s gripper slipped the first time.

Personally, when I think of a robot explaining what it’s doing, this is what I’m thinking of. Knowing what a robot was “thinking,” or at least the reasoning behind its actions or non-actions, would significantly increase my comfort with and confidence around robotic systems, because they wouldn’t seem so… Dumb? For example, is that robot just sitting there and not doing anything because it’s broken, or because it’s doing some really complicated motion planning? Is my Roomba wandering around randomly because it’s lost, or is it wandering around pseudorandomly because that’s the most efficient way to clean? Does that medicine bottle need to be twisted again because a gripper slipped the first time, or because it takes two twists to open?

Even if the robot makes a decision that I would disagree with, this level of “why” explanation or “because” explanation means that I can have confidence that the robot isn’t dumb or broken, but is either doing what it was programmed to do, or dealing with some situation that it wasn’t prepared for. In either case, I feel like my trust in it would significantly improve, because I know it’s doing what it’s supposed to be doing and/or the best it can, rather than just having some kind of internal blue screen of death experience or something like that. And if it is dead inside, well, I’d want to know that, too.

Longer-term, the UCLA researchers are working on the “why” as well, but it’s going to take a major shift in the robotics community for even the “what” to become a priority. The fundamental problem is that right now, roboticists in general are relentlessly focused on optimization for performance—who cares what’s going on inside your black box system as long as it can successfully grasp random objects 99.9 percent of the time?

But people should care, says lead author of the UCLA paper Mark Edmonds. “I think that explanation should be considered along with performance,” he says. “Even if you have better performance, if you’re not able to provide an explanation, is that actually better?” He added: “The purpose of XAI in general is not to encourage people to stop going down that performance-driven path, but to instead take a step back, and ask, ‘What is this system really learning, and how can we get it to tell us?’ ”

It’s a little scary, I think, to have systems (and in some cases safety critical systems) that work just because they work—because they were fed a ton of training data and consequently seem to do what they’re supposed to do to the extent that you’re able to test them. But you only ever have the vaguest of ideas why these systems are working, and as robots and AI become a more prominent part of our society, explainability will be a critical factor in allowing us to comfortably trust them.

“A Tale of Two Explanations: Enhancing Human Trust by Explaining Robot Behavior,” by M. Edmonds, F. Gao, H. Liu, X. Xie, S. Qi, Y. Zhu, Y.N. Wu, H. Lu, and S.-C. Zhu from the University of California, Los Angeles, and B. Rothrock from the California Institute of Technology, in Pasadena, Calif., appears in the current issue of Science Robotics.

The more dynamic robots get, the more likely they are to break. Or rather, all robots are 100 percent guaranteed to break eventually (this is one of their defining characteristics). More dynamic robots will also break more violently. While they’re in the lab, this isn’t a big deal, but for long term real-world use, wouldn’t it be great if we could rely on robots to repair themselves?

Rather than give a robot a screwdriver and expect it to replace its own parts, though, a much more elegant solution is robots that can heal themselves more like animals, where for many common injuries, all you have to do is sit around for a little while and your body will magically fix itself. We’ve seen a few examples of this before using self-healing polymers, but for dynamic robots that run and jump, you need the strength of metal.

At the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) last month, roboticists from the University of Tokyo’s JSK Lab presented a prototype for a robot leg with a tendon “fuse” made out of a metal that can repair fractures. It does that by autonomously melting itself down and reforming into a single piece. It’s still a work in progress, but it’s basically a tiny little piece of the T-1000 Terminator. Great!

Image: University of Tokyo/JSK Lab In one test, the researchers equipped a robotic leg with the self-healing bolt and dropped it to the ground. The bolt broke, as designed, avoiding damage to other parts of the robot. At that point, internal heating elements were activated and the two halves of the bolt liquified and then melted back together again, healing the robotic leg within about half an hour. The leg wasn’t able to fully stand up, but that’s what the researchers want to achieve with future prototypes.

This is a life-sized robotic leg with an Achilles tendon made up of a cable that transmits force from the foot around the ankle to the lower leg bone. The cable is bisected by a module containing a bolt made out of a metallic alloy that will snap under stress lower than any other point in the system, meaning that it acts like a mechanical fuse—it’ll be the first thing that breaks, sacrificing itself to protect the robot’s other joints. 

Image: University of Tokyo/JSK Lab The self-healing module (top left) consists of two halves connected by magnets and springs. Each half has a cartridge that the researchers fill with a low melting point alloy (U-47). When the cartridges heat up, the alloy melts, fusing the two halves together.

The alloy has a very low melting point (just 50° Celsius), and the module around it is made up of two halves connected by magnets and springs. If the bolt breaks, the magnets and springs will come apart also, but then snap back together, realigning the two broken halves of the bolt. At that point, internal heaters fire up, the two halves of the bolt liquify, and then melt back together again, healing the tendon within about half an hour. This video shows the robot falling, the tendon breaking, and then the robot self-healing and starting to stand up again:

In the video, it’s not quite as good as new—it turns out that passive melting reduces the strength of the self-healing bolt to just 30 percent of where it was before the break. But after some additional experiments, the researchers discovered that gentle vibration during the melting and reforming process can bring the healed strength up above 90 percent of the original strength, and there’s likely even more optimization that can be done.

The researchers feel that this is a practical system for a real robot, and they plan to refine it to the point where it’s a realistic feature on a dynamic legged robot.

“An Approach of Facilitated Investigation of Active Self-healing Tension Transmission System Oriented for Legged Robots,” by Shinsuke Nakashima, Takuma Shirai, Kento Kawaharazuka, Yuki Asano, Yohei Kakiuchi, Kei Okada, and Masayuki Inaba from the University of Tokyo, was presented at IROS 2019 in Macau.

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland
DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA
ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores

Let us know if you have suggestions for next week, and enjoy today’s videos.

One of Digit’s autonomy layers ensures a minimum distance from obstacles, even mobile ones like pesky engineers. In this video, the vision system is active and Digit is operating under full autonomy.

I would pay money to watch a second video that’s just like this one except Agility has given Digit a voice and it’s saying this stuff dynamically.

[ Agility Robotics ]

The Intel RealSense lidar camera L515 is the world’s smallest and most power efficient hi-resolution lidar, featuring unparalleled depth and accuracy that makes it perfect for indoor use cases. The L515 is designed with proprietary technology that creates entirely new ways to incorporate lidar into smart devices to perceive the world in 3D.

[ Intel ]

This project investigates a design space, a fabrication system and applications of creating fluidic chambers and channels at millimeter scale for tangible actuated interfaces. The ability to design and fabricate millifluidic chambers allows one to create high frequency actuation, sequential control of flows and high resolution design on thin film materials. We propose a four dimensional design space of creating these fluidic chambers, a novel heat sealing system that enables easy and precise millifluidics fabrication, and application demonstrations of the fabricated materials for haptics, ambient devices and robotics.

[ MIT ]

This looks like it could be relaxing.

[ Eura Shin ]

This is only sort of robotics related, but it’s cool: changing the direction of a ping pong ball by nudging it with controllable ultrasound.

[ University of Tokyo ]

Check out the natural gait on this little running robot from Nagoya Institute of Technology:

[ Nagoya ]

UAV Turbines announced that it successfully demonstrated its Monarch Hybrid Range Extender (HREX), a microturbine powered generator technology that extends the range of electrically powered medium-sized unmanned aircraft.

In UAV Turbines’ HREX system, the engine extracts energy from the fuel and uses it to power the propulsion motor directly, with any excess electric power used to top off the battery charge. This greatly increases range before the weight of the added fuel becomes uneconomical. There are many tradeoffs in optimizing power for any specific system; for example, some energy is lost in the extraction process, but as the fuel is consumed, the net weight of the aircraft drops. It is this flexibility that enables engine optimizations not otherwise possible with a single power source.

[ UAV Turbines ]

Happy Holidays from Sphero!

[ Sphero ]

Happy Holidays from Yaskawa, which, despite having access to lots of real robots, stubbornly refuses to use them in their holiday videos.

[ Yaskawa ]

Join us in celebrating the life of Woodie Flowers, professor emeritus of mechanical engineering at MIT and co-founder of the FIRST Robotics Competition. A beloved teacher and pioneer in hands-on engineering education, Flowers developed design and robotics competitions at MIT, FIRST, and beyond, while promoting his concept of “gracious professionalism.”

[ MIT ]

I still really like the design of EMYS, although I admit that it looks a little strange when viewed from the side.

[ EMYS ]

Japanese college students compete to make the best laundry hanging robot, where speed and efficiency are a hilarious priority.

[ ROBOCON ]

The U in U-Drone stands for: Underground, Unlimited, Unjammable, User-Friendly and Unearthing. A prototype of a compact cable drone (U-Drone) has been developed in the one year project. It has been tested and demonstrated in underground tunnels (without GPS reception), whereby the drone can be operated at distances of up to 100 meters.

The commands to the U-Drone, and the images and data from the drone to the operator, run through an (unjammable) lightweight cable. The replaceable spool with the 100 meter cable is connected to the U-Drone and therefore unwinds from the drone during the flight in such a way that the drone is not stopped when the cable gets stuck.

[ Delft Dynamics ]

Interesting tiltrotor design for a drone that can apply a probe to a structure at any angle you want.

[ Skygauge ]

NASA has developed a flexible way to test new designs for aircraft that use multiple rotors to fly. The Multirotor Test Bed, or MTB, will let researchers study a wide variety of rotor configurations for different vehicles, including tiltrotor aircraft, mid-sized drones and even air taxis planned for the coming era of air travel called Urban Air Mobility.

[ NASA ]

Here’s a robot not to get on the wrong side of.

The Javelin Joint Venture team, a partnership of Lockheed Martin and Raytheon Company, successfully fired Javelin missiles from a Kongsberg remote weapon station integrated onto an unmanned vehicle platform. The demonstrations, conducted at the Redstone Test Center, Ala., validated the integration of the weapon station, missile and vehicle.

[ Raytheon ]

From Paul Scharre, who knows what he’s talking about more than most, a nuanced look at the lethal autonomous robots question.

[ Freethink ]

One year ago, for IEEE Spectrum’s special report on the Top Tech for 2019, Sarcos Robotics promised that by the end of the year they’d be ready to ship a powered exoskeleton that would be the future of industrial work. And late last month, Sarcos invited us to Salt Lake City, Utah, to see what that future looks like.

Sarcos has been developing powered exoskeletons and the robotic technologies that make them possible for decades, and the lobby of the company’s headquarters is a resting place for concepts and prototype hardware that’s been abandoned along the way. But now, Sarcos is ready to unveil the prototype of the Guardian XO, a strength-multiplying exoskeleton that’s about to begin shipping.

As our introductory briefing concludes, Sarcos CEO Ben Wolff is visibly excited to be able to show off what they’ve been working on in their lab. “If you were to ask the question, What does 30 years and $300 million look like,” Wolff tells us, “you're going to see it downstairs.”

This is what we see downstairs:

GIF: Evan Ackerman/IEEE Spectrum Guardian XO operator Fletcher Garrison demonstrates the company’s exosuit by lifting a 125-pound payload. Sarcos says this task usually requires three people.

How the Guardian XO Works

The Sarcos Guardian XO is a 24-degrees-of-freedom full-body robotic exoskeleton. While wearing it, a human can lift 200 pounds (90 kilograms) while feeling like they’re lifting just 10 pounds (4.5 kg). The Guardian XO is fully electric and untethered, with a runtime of 2 hours, and hot-swappable battery packs can keep it going for a full work day. It takes seconds to put on and take off, and Sarcos says new users can be trained to use the system in minutes. One Guardian XO costs $100,000 per year to rent, and the company will be shipping its first batch of alpha units to customers (including both heavy industry and the U.S. military) in January.

Photo: Evan Ackerman/IEEE Spectrum The prototype that Sarcos demonstrated had all of the functionality of the version that will ship in January, but later models will include plastic fairings over the suit as well as quick-change end-effectors.

In a practical sense, the Guardian XO is a humanoid robot that uses a real human as its command and control system. As companies of all kinds look towards increasing efficiency through automation, Sarcos believes that the most effective solution is a direct combination of humans and machines, enhancing the intelligence and judgement of humans with the strength and endurance of robots. (Investors in the company include Caterpillar, GE Ventures, Microsoft, and Schlumberger.)

The first thing to understand about the Guardian XO is that like a humanoid robot, it’s self-supporting. Since it has its own legs and feet, the 150 lb weight of the suit (and whatever it’s carrying) bypasses its user and is transferred directly into the ground. You don’t strap the robot to you—you strap yourself to the robot, a process that takes less than a minute. So although it looks heavy and bulky (and it is definitely both of those things), at least the weight of the system isn’t something that the user experiences directly. You can see how that works by watching Guardian XO operator Fletcher Garrison lifting all kinds of payloads in the video below.

Hands On With the Guardian XO

When Sarcos reached out and asked if we wanted to come to Salt Lake City to try out the XO, we immediately said yes (disclosure: Sarcos covered our costs to attend a media event last month). But we were disappointed when, in the end, we were only allowed to try out a one-armed version of the exoskeleton. I even offered to sign additional waivers but, alas, the company wouldn’t let me into the full suit. So my experience with the exo was pretty limited—a hands-on, literally, of a single XO arm.

Photo: Evan Ackerman/IEEE Spectrum That’s me trying out the one-arm XO system. It’s not quite like the full-body suit, but Sarcos still required me to sign a “waiver of liability, assumption of risk, and indemnity agreement.”

Still, it was an amazing sensation. The arm I tested, which Sarcos says uses the same control system as the full-body suit, was incredibly easy to operate. In terms of control, all the exo tries to do is get out of the way of your limbs: It uses force sensors to detect every motion that you make, and then moves its own limbs in parallel, smoothly matching your body with its own hardware. If you take a step, it takes a step with you. If you swing your arm back and forth, it swings its arm back and forth in the same way, right next to yours. There’s no discernible lag to this process, and it’s so intuitive that Sarcos says most people take just a minute or two to get comfortable using the system, and just an hour or two to be comfortable doing work in it.

The Guardian XO can augment the strength of the user all the way up to making a 200-pound load feel like it weighs zero pounds. Typically, this is not how the exoskeleton works, though, since it can be disconcerting to be lifting something heavy and not feel like you’re lifting anything at all. It’s better to think of the exo as a tool that makes you stronger rather than a tool that makes objects weightless, especially since you still have to deal with inertia. Remember, even if something has no apparent weight (either because you’re in space or because you’re holding it with a powered exoskeleton), it still has mass, which you have to be aware of when trying to move it or stop it from moving. The amount of help that the exo gives you is easy to adjust; it’s got a graphical control panel on the left wrist.
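The strength multiplication described above amounts to a force divider between the load and what the user feels. A minimal sketch, with the factor of 20 implied by the article's 200-pound-feels-like-10 example (the actual control law is certainly more involved, and the factor is adjustable from the wrist panel):

```python
# Hedged sketch of the exo as a force divider: the user feels the
# actual load scaled down by an adjustable assist factor.
def felt_load_lb(actual_lb, assist_factor):
    """Load perceived by the operator, in pounds."""
    return actual_lb / assist_factor

print(felt_load_lb(200, 20))  # 10.0 lb, matching the article's example
print(felt_load_lb(110, 9))   # roughly the ammo-crate demo's per-crate feel
```

Note that this only models weight, not inertia: as the article points out, a load with no apparent weight still has all of its mass when you try to move it or stop it.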

GIF: Evan Ackerman/IEEE Spectrum This ammo crate weighs 110 pounds, but the exoskeleton makes it feel like each arm is lifting just 6 pounds. The Guardian XO is designed for loads of up to 200 lbs.

How Safe Is the Exoskeleton?

With a robotic system this powerful (XO has a peak torque of about 4000 inch-pounds, or 450 newton-meters), Sarcos made safety a top priority. For example, to move the exo’s arms, your hands need to be holding down triggers. If you let go of the triggers (for whatever reason), the arms will lock in place, which has the added benefit of letting the exo hold stuff up for you while you, say, check your phone. All of the joints are speed limited, meaning that you can’t throw a punch with the exo—they told me this during my demo, so of course I tried it, and the joints locked themselves as soon as I exceeded their safety threshold. If the system loses power for any reason, current shunts back through the motors, bringing them down gradually rather than abruptly. And by design the joints are not capable of exceeding a human range of motion, which means that the exoskeleton can’t bend or twist in a way that would injure you. Interestingly, the Guardian XO’s joint speeds are easily fast enough to allow you to run, although that’s been limited for safety reasons as well.
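The trigger-and-speed-limit behavior described above can be sketched as a simple interlock. This is a toy model under my own assumptions (the latching-on-overspeed behavior and the threshold value are guesses, not Sarcos's implementation):

```python
class JointGuard:
    """Toy safety interlock: the joint holds position whenever the trigger
    is released, and latches locked if commanded speed exceeds the limit.
    Latching behavior is an assumption for illustration."""

    def __init__(self, max_speed_rad_s):
        self.max_speed = max_speed_rad_s
        self.latched = False

    def command(self, velocity, trigger_held):
        if abs(velocity) > self.max_speed:
            self.latched = True        # over-speed: latch the joint
        if self.latched or not trigger_held:
            return 0.0                 # hold position
        return velocity

g = JointGuard(max_speed_rad_s=2.0)
print(g.command(1.0, trigger_held=True))   # 1.0: normal motion passes through
print(g.command(1.5, trigger_held=False))  # 0.0: trigger released, joint holds
print(g.command(3.0, trigger_held=True))   # 0.0: over the limit, joint latches
```

The power-loss behavior (shunting current back through the motors so they slow gradually) is a separate electrical mechanism and isn't modeled here.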

We asked about whether falling down was much of a risk, but it turns out that having a human in the loop for control makes that problem much simpler. Sarcos hasn’t had to program the Guardian XO to balance itself, because the human inside does all of that naturally. Having someone try to push you over while you’re in the exoskeleton is no different than having someone try to push you over while you’re out of it, because you’ll keep your own balance in either case. If you do end up falling over, Sarcos claims that the exoskeleton is designed as a roll cage, so odds are you’ll be fine, although it’s not clear how easy it would be to get out of it afterwards (or get it off of you).

More of a concern is how the XO will operate around other people. While its mass and bulk may not make all that much of a difference to the user, it seems like working collaboratively could be a problem, as could working in small spaces or around anything fragile. The suit does have force feedback so that you’ll feel if you contact something, but by then it might be too late to prevent an accident.

GIF: Evan Ackerman/IEEE Spectrum With a pair of 12 lb 500 watt-hour battery packs, the exoskeleton can operate for over 2 hours during normal use.

Energy Efficiency and Reliability

Efficiency might not seem like a big deal for an exoskeleton like this, but what Sarcos has managed is very impressive. The Guardian XO uses about 500 watts while fully operational—that is, while carrying 160 lbs and walking at 3 mph. To put that in context, SRI’s DURUS robot, which was designed specifically for efficiency (and is significantly smaller and lighter than the Guardian XO), used 350 watts while just walking. “That’s really one of our key innovations,” says Sarcos COO Chris Beaufait. “There aren’t many robots in the world that are as efficient as what we’re doing.” These innovations come in the form of energy recovery mechanisms, reductions in the number of individual computers on-board, and getting everything as tightly integrated as possible. With a pair of 12 lb 500 watt-hour battery packs, the exoskeleton can operate for over 2 hours during normal use, and Sarcos expects to improve the efficiency from 500 watts to 425 watts or better by January.
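The runtime claim follows directly from the stated numbers: two 500 Wh packs divided by roughly 500 W of continuous draw.

```python
# Back-of-envelope runtime from the article's figures.
def runtime_hours(battery_wh, draw_w):
    """Idealized runtime, ignoring conversion losses and load variation."""
    return battery_wh / draw_w

print(runtime_hours(2 * 500, 500))           # 2.0 h at today's ~500 W draw
print(round(runtime_hours(2 * 500, 425), 2)) # 2.35 h at the targeted 425 W
```

Real runtime would depend on duty cycle and losses, which is presumably why Sarcos quotes "over 2 hours during normal use" rather than a fixed figure.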

Since the Guardian XO is a commercial product, it has to be reliable enough to be a practical tool that’s cost effective to use. “The difference between being an R&D shop that can prove a concept versus making a commercially viable product that’s robust—it takes an entirely different skill set and mind set,” Wolff, the CEO, told us. “That’s been a challenge. I think it’s the biggest challenge that robotics companies have, and we’ve put a lot of blood, sweat, and tears into that.”

Wolff says that future XO versions (not the alpha model that will ship in January) will be able to walk outdoors over challenging terrain, through a foot of mud, and in the rain or snow. It will be able to go up and down stairs, although they’re currently working on making sure that this will be safe. The expectation, Wolff tells us, is that there won’t be much ongoing service or maintenance required of the exo’s customers. We’re not sure we share Sarcos’ confidence yet—this is a complex system that’s going to be used by non-engineers in semi-structured and unstructured environments. A lot of unexpected scenarios can happen, and until they do, we won’t know for sure how well the Guardian XO will stand up to real-world use.

Guardian XO Applications

The Guardian XO has been designed to target some specific (but also very common) types of jobs that require humans to repetitively lift heavy things. These jobs are generally not automatable, or at least not automatable in a way that’s cost effective—the skill of a human is required. These jobs are also labor intensive, which creates both short term and long term problems for human workers. Short term, acute injuries (like back injuries) lead to lost productivity. Long term, these injuries add up to serious medical problems for workers, many of whom can only function for between five and eight years before their bodies become permanently damaged.

Wolff believes that this is where there’s an opportunity for powered exoskeletons. Using the Guardian XO to offload the kinds of tasks that put strain on a worker’s body means that humans can work at a job longer without injury. And they can keep working at that same job as they age, since the exoskeleton takes jobs that used to be about strength and instead makes them about skill and experience.

Photo: Evan Ackerman/IEEE Spectrum Sarcos says that one worker in an exoskeleton can handle tasks that would otherwise take between 4 and 10 people.

Of course, the sad fact is that none of this stuff about worker health would matter all that much if companies couldn’t be convinced that exoskeletons could also save them money. Fortunately for workers, it’s an easy argument to make. Since the Guardian XO can lift 200 pounds, Wolff says that it can improve the productivity of its user by up to an order of magnitude: “Overall, we’re seeing across the board improved productivity of somewhere between 4 and 10 times in use cases that we’ve looked at. So what that means is, one worker in an exoskeleton can do the work of between 4 and 10 employees without any stress or strain on their body.”

On the 4x end of the scale, it’s just about being able to lift more, and for longer. OSHA recommends a maximum one-person load of 51 pounds, a number that gets adjusted downwards if the object has to be lifted repetitively, held for a long time, or moved. The Guardian XO allows a worker to lift four times that, for hours, while walking at up to 3 mph. Things are a little more complicated on the 10x end of the scale, but you can imagine a single 200 pound object that requires an overhead crane plus several people to manage it. It’s not just about the extra people—it’s also about the extra time and infrastructure required, when a single worker in a Guardian XO could just pick up that same object and move it by themselves.
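That 51-pound figure matches the load constant of the NIOSH lifting equation, which OSHA guidance references: the recommended weight limit starts at 51 lb and is scaled down by six multipliers (each between 0 and 1) for less-than-ideal lifting conditions. A minimal sketch, with an illustrative frequency multiplier that is an assumption rather than a value from a specific NIOSH table:

```python
# NIOSH recommended weight limit (RWL): a 51 lb load constant scaled by
# horizontal, vertical, distance, asymmetry, frequency, and coupling
# multipliers, each <= 1.0 (1.0 means ideal conditions for that factor).

LOAD_CONSTANT_LB = 51

def recommended_weight_limit(hm=1.0, vm=1.0, dm=1.0,
                             am=1.0, fm=1.0, cm=1.0):
    """RWL in pounds for the given condition multipliers."""
    return LOAD_CONSTANT_LB * hm * vm * dm * am * fm * cm

# Ideal, infrequent lift: the full 51 lb.
print(recommended_weight_limit())  # 51.0
# Frequent, repetitive lifting (illustrative fm=0.65) cuts the limit to ~33 lb.
print(recommended_weight_limit(fm=0.65))
```

This is why "adjusted downwards" matters: a repetitive warehouse lift might be limited to well under 51 lb per person, while the article's claim is that an XO wearer handles 200 lb loads continuously.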

The obvious question at this point is whether introducing powered exoskeletons is going to put people out of work. Wolff insists that is not the reality of the industry right now, since the real problem is finding qualified workers to hire in the first place. “None of our customers are talking about firing people,” Wolff says. “All of them are talking about simply not being able to produce enough of their products or services to keep their customers happy.” It should keep workers happy as well. Wolff tells us that they’ve had “enthusiastic responses” from workers who’ve tried the Guardian XO out, with their only concern being whether the exoskeleton can be adjusted to fit folks of different shapes and sizes. While initial units will be adjustable for people ranging in height from 5’4” to 6’, by next year, Sarcos promises that they’ll be adjustable enough to cover 90 percent of the American workforce. 

Image: Sarcos A rendering of how the Guardian XO will look with fairings applied.

Cost and Availability

“We could not have made this an economically viable product three years ago,” Wolff says. “The size, power, weight, and cost of all of the components that we use—all of that has now gotten to a point where this is commercially feasible.” What that means, for Sarcos and the companies that they’re partnering with, is that each exoskeleton costs about $100,000 per year. The alpha units will be going to companies that can afford at least 10 of them at once, and Sarcos will send a dedicated engineer along with each batch. The Guardian XO is being sold as a service rather than a product—at least for now, it’s more of a rental with dedicated customer support. “The goal is this has to be stupid simple to manage and use,” says Wolff, adding that Sarcos expects to learn a lot over the next few months once the exoskeletons start being deployed. Commercial versions should ship later in 2020.

I made sure to ask Wolff when I might be able to rent one of these things from my local hardware store for the next time I have to move, but disappointingly, he doesn’t see that happening anytime soon. Sarcos still has a lot to learn about how to make a business out of exoskeletons, and they’d rather keep expectations realistic than promise anyone an Iron Man suit. It’s too late for me, though—I’ve seen what the Guardian XO can do. And I want one.

[ Sarcos ]
