
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland
DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA
HRI 2020 – March 23-26, 2020 – Cambridge, U.K.
ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores
ICRA 2020 – May 31-June 4, 2020 – Paris, France

Let us know if you have suggestions for next week, and enjoy today’s videos.

Apparently the whole “little home robot with a cute personality will seamlessly improve your life” thing is still going on, at least as far as Samsung is concerned.

Predictably, there’s no information on how much Ballie costs, when it might be available, what’s inside of it, and whether it can survive a shaggy carpet.

[ Samsung ]

In case you had the good sense to be somewhere besides CES this week, here’s the full demo of Agility Robotics’ Digit at the Ford booth.

Classy!

Because of the nightmarish Wi-Fi environment in the convention center, Digit is steered manually, but the box interaction is autonomous.

[ Agility Robotics ]

Stefano Mintchev from EPFL and his startup Foldaway Haptics are the folks behind the 33 individually actuated "bionic flaps" on the new Mercedes-Benz Vision AVTR concept car that was at CES this week.

The underlying technology, which is based on origami structures, can be used in a variety of other applications, like this robotic billboard:

[ Foldaway Haptics ] via [ Mercedes ]

Thanks Stefano!

The Sarcos Guardian XO alpha version is looking way more polished than the pre-alpha prototype that we saw late last year.

And Sarcos tells us that it’s now even more efficient, although despite my begging, they won’t tell us exactly how they’ve managed that.

[ Sarcos ]

It is our belief that in 5 years’ time, not one day will go by without most of us interacting with a robot. Reachy is the only humanoid service robot that is open source and can manipulate objects. He mimics human expressions and body language, with a cute free-moving head and antennas as well as bio-inspired arms. Reachy is the optimum platform to create real-world interactive & service applications right away.

[ Pollen Robotics ]

Ritsumeikan Humanoid System Laboratory is working on a promising hydraulic humanoid:

[ Ritsumeikan HSL ]

With the steep rise of automation and robotics across industries, the requirements for robotic grippers become increasingly demanding. By using acoustic levitation forces, No-Touch Robotics develops damage- and contamination-free contactless robotic grippers for handling highly fragile objects. Such grippers can beneficially be applied in the field of micro assembly and the semiconductor industry, resulting in increased production yield, reduced waste, and high production quality by completely eliminating damage inflicted during handling.

You can also experience the magic by building your own acoustic levitator.
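For a sense of the numbers involved (my back-of-the-envelope figures, not ETH Zurich’s): an acoustic levitator traps small objects near the pressure nodes of a standing wave, and those nodes sit half a wavelength apart, so the 40 kHz transducers commonly used in DIY levitators give a trap spacing of roughly 4 mm in room-temperature air.

# Back-of-the-envelope node spacing for a DIY ultrasonic levitator (illustrative only).
SPEED_OF_SOUND_AIR = 343.0   # m/s in air at about 20 degrees C
FREQUENCY = 40e3             # Hz; typical hobby ultrasonic transducer

wavelength = SPEED_OF_SOUND_AIR / FREQUENCY   # about 8.6 mm
node_spacing = wavelength / 2                 # pressure nodes every ~4.3 mm
print(f"wavelength = {wavelength * 1e3:.1f} mm, node spacing = {node_spacing * 1e3:.1f} mm")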

[ ETHZ ]

Preview of the Unitree A1. Maximum torque of each joint: 35.5 Nm. Weight (with battery): 12 kg. Price: less than $10k.

Under $10k? I’m going to start saving up!

[ Unitree ]

A team from the Micro Aerial Vehicle Lab (MAVLab) of TU Delft has won the 2019 Artificial Intelligence Robotic Racing (AIRR) Circuit, with a final breathtaking victory in the World Championship Race held in Austin, Texas, last December. The team takes home the $1 million grand prize, sponsored by Lockheed Martin, for creating the fastest and most reliable self-piloting aircraft this season.

[ MAVLab ]

After 10 years and 57 robots, hinamitetu brings you a few more.

[ Hinamitetu ]

Vision 60 legged robot managing unstructured terrain without vision or force sensors in its legs.

[ Ghost Robotics ]

In 2019, GRVC lived one of the best years of its life, with the latest developments of the GRIFFIN ERC Advanced Grant, the kick-off meeting of the H2020 AERIAL-CORE project, and other projects.

[ GRVC ]

The official wrap-up of ABU ROBOCON 2019 in Ulaanbaatar, Mongolia.

[ RoboCon 2019 ]

Roboy had a busy 2019:

[ Roboy ]

Very interesting talk from IHMC’s Jerry Pratt, at the Workshop on Teleoperation of Humanoid Robots at Humanoids 2019.

[ Workshop ]

We’ve been writing about robots here at IEEE Spectrum for a long, long time. Erico started covering robotics for Spectrum in 2007, about the same time that Evan started BotJunkie.com. We joined forces in 2011, and have published thousands of articles since then, chronicling as many aspects of the field as we could. Autonomous cars, humanoids, legged robots, drones, robots in space—the last decade in robotics has been incredible.

To kick off 2020, we’re taking a look back at our most popular posts of the last 10 years. In order, listed below are the 100 articles with the highest total page views, providing a cross-section of what was the most interesting in robotics from 2010 until now.

Also, sometime in the next several weeks, we plan to post a selection of our favorite stories, focusing on what we think were the biggest developments in robotics of the past decade (including a few things that, surprisingly, did not make the list below). If you have suggestions of important robot stories we should include, let us know. Thank you for reading!

#1 How Google’s Self-Driving Car Works

Google engineers explain the technology behind their self-driving car and show videos of road tests

By Erico Guizzo
Posted 18 Oct 2011

#2 This Robot Can Do More Push-Ups Because It Sweats

A robot that uses artificial sweat can cool its motors without bulky radiators

By Evan Ackerman
Posted 13 Oct 2016

#3 Meet Geminoid F, a Smiling Female Android

Geminoid F displays facial expressions more naturally than previous androids

By Erico Guizzo
Posted 3 Apr 2010

#4 Latest Geminoid Is Incredibly Realistic

Geminoid DK is a realistic android nearly indistinguishable from a real human

By Evan Ackerman
Posted 5 Mar 2011

#5 The Next Generation of Boston Dynamics’ ATLAS Robot Is Quiet, Robust, and Tether Free

The latest ATLAS is by far the most advanced humanoid robot in existence

By Evan Ackerman & Erico Guizzo
Posted 23 Feb 2016

#6 The Uncanny Valley: The Original Essay by Masahiro Mori

“The Uncanny Valley” by Masahiro Mori is an influential essay in robotics. This is the first English translation authorized by Mori.

By Masahiro Mori
Posted 12 Jun 2012

#7 NASA JSC Unveils Valkyrie DRC Robot

NASA’s DARPA Robotics Challenge entry is much more than Robonaut with legs: it’s a completely new humanoid robot

By Evan Ackerman
Posted 10 Dec 2013

#8 Origami Robot Folds Itself Up, Does Cool Stuff, Dissolves Into Nothing

Tiny self-folding magnetically actuated robot creates itself when you want it, disappears when you don’t

By Evan Ackerman
Posted 28 May 2015

#9 Robots Bring Couple Together, Engagement Ensues

Yes, you really can find love at an IEEE conference

By Evan Ackerman & Erico Guizzo
Posted 31 Mar 2014

#10 Facebook AI Director Yann LeCun on His Quest to Unleash Deep Learning and Make Machines Smarter

The Deep Learning expert explains how convolutional nets work, why Facebook needs AI, what he dislikes about the Singularity, and more

By Lee Gomes
Posted 18 Feb 2015

#11 This Is the Most Amazing Biomimetic Anthropomorphic Robot Hand We’ve Ever Seen

Luke Skywalker, your new robotic hand is ready

By Evan Ackerman
Posted 17 Feb 2016

#12 Dutch Police Training Eagles to Take Down Drones

Attack eagles are training to become part of the Dutch National Police anti-drone arsenal

By Evan Ackerman
Posted 1 Feb 2016

#13 You (YOU!) Can Take Stanford’s ’Intro to AI’ Course Next Quarter, For Free

Sebastian Thrun and Peter Norvig are offering Stanford’s "Introduction to Artificial Intelligence" course online, for free, grades and all

By Evan Ackerman
Posted 4 Aug 2011

#14 Robot Hand Beats You at Rock, Paper, Scissors 100% Of The Time

Watch this high-speed robot hand cheat at rock, paper, scissors

By Evan Ackerman
Posted 26 Jun 2012

#15 You’ve Never Seen a Robot Drive System Like This Before

Using just a single spinning hemisphere mounted on a gimbal, this robot demonstrates some incredible agility

By Evan Ackerman
Posted 7 Jul 2011

#16 Fukushima Robot Operator Writes Tell-All Blog

An anonymous worker at Japan’s Fukushima Dai-ichi nuclear power plant has written dozens of blog posts describing his experience as a lead robot operator at the crippled facility

By Erico Guizzo
Posted 23 Aug 2011

#17 Should Quadrotors All Look Like This?

Researchers say we’ve been designing quadrotors the wrong way

By Evan Ackerman
Posted 13 Nov 2013

#18 Boston Dynamics’ PETMAN Humanoid Robot Walks and Does Push-Ups

Boston Dynamics releases stunning video showing off its most advanced humanoid robot

By Erico Guizzo
Posted 31 Oct 2011

#19 Boston Dynamics’ Spot Robot Dog Goes on Sale

Here’s everything we know about Boston Dynamics’ first commercial robot

By Erico Guizzo
Posted 24 Sep 2019

#20 Agility Robotics Introduces Cassie, a Dynamic and Talented Robot Delivery Ostrich

One day, robots like these will be scampering up your steps to drop off packages

By Evan Ackerman
Posted 9 Feb 2017

#21 Superfast Scanner Lets You Digitize Book By Flipping Pages

Tokyo University researchers develop scanner that can capture 200 pages in one minute

By Erico Guizzo
Posted 17 Mar 2010

#22 A Robot That Balances on a Ball

Masaaki Kumagai has built wheeled robots, crawling robots, and legged robots. Now he’s built a robot that rides on a ball

By Erico Guizzo
Posted 29 Apr 2010

#23 Top 10 Robotic Kinect Hacks

Microsoft’s Kinect 3D motion detector has been hacked into lots of awesome robots, and here are our 10 favorites

By Evan Ackerman
Posted 7 Mar 2011

#24 Latest AlphaDog Robot Prototypes Get Less Noisy, More Brainy

New video shows Boston Dynamics and DARPA putting AlphaDog through its paces

By Evan Ackerman
Posted 11 Sep 2012

#25 How South Korea’s DRC-HUBO Robot Won the DARPA Robotics Challenge

This transformer robot took first place because it was fast, adaptable, and didn’t fall down

By Erico Guizzo & Evan Ackerman
Posted 9 Jun 2015

#26 U.S. Army Considers Replacing Thousands of Soldiers With Robots

The U.S. Army could slash personnel numbers and toss in more robots instead

By Evan Ackerman
Posted 22 Jan 2014

#27 Google Acquires Seven Robot Companies, Wants Big Role in Robotics

The company is funding a major new robotics group and acquiring a bunch of robot startups

By Evan Ackerman & Erico Guizzo
Posted 4 Dec 2013

#28 Who Is SCHAFT, the Robot Company Bought by Google and Winner of the DRC?

Here’s everything we know about this secretive robotics startup

By Erico Guizzo & Evan Ackerman
Posted 6 Feb 2014

#29 Ground-Effect Robot Could Be Key To Future High-Speed Trains

Trains that levitate on cushions of air could be the future of fast and efficient travel, if this robot can figure out how to keep them stable

By Evan Ackerman
Posted 10 May 2011

#30 Hobby Robot Rides a Bike the Old-Fashioned Way

I don’t know where this little robot got its awesome bicycle, but it sure knows how to ride

By Evan Ackerman
Posted 24 Oct 2011

#31 SRI Demonstrates Abacus, the First New Rotary Transmission Design in 50 Years

Finally a gear system that could replace costly harmonic drives

By Evan Ackerman
Posted 19 Oct 2016

#32 Robotic Micro-Scallops Can Swim Through Your Eyeballs

Tiny robots modeled on scallops are able to swim through all the fluids in your body

By Evan Ackerman
Posted 4 Nov 2014

#33 Boston Dynamics Officially Unveils Its Wheel-Leg Robot: "Best of Both Worlds"

Handle is a humanoid robot on wheels, and it’s amazing

By Erico Guizzo & Evan Ackerman
Posted 27 Feb 2017

#34 iRobot Brings Visual Mapping and Navigation to the Roomba 980

The new robot vacuum uses VSLAM to navigate and clean larger spaces in satisfyingly straight lines

By Evan Ackerman & Erico Guizzo
Posted 16 Sep 2015

#35 When Will We Have Robots To Help With Household Chores?

Google, Microsoft, and Apple are investing in robots. Does that mean home robots are on the way?

By Satyandra K. Gupta
Posted 2 Jan 2014

#36 Robots Playing Ping Pong: What’s Real, and What’s Not?

Kuka’s robot vs. human ping pong match looks to be more hype than reality

By Evan Ackerman
Posted 12 Mar 2014

#37 BigDog Throws Cinder Blocks with Huge Robotic Face-Arm

I don’t know why BigDog needs a fifth limb to throw cinder blocks, but it’s incredibly awesome

By Evan Ackerman
Posted 28 Feb 2013

#38 Children Beating Up Robot Inspires New Escape Maneuver System

Japanese researchers show that children can act like horrible little brats towards robots

By Kate Darling
Posted 6 Aug 2015

#39 Boston Dynamics’ AlphaDog Quadruped Robot Prototype on Video

Boston Dynamics has just released some absolutely incredible video of their huge new quadruped robot, AlphaDog

By Evan Ackerman
Posted 30 Sep 2011

#40 Building a Super Robust Robot Hand

Researchers have built an anthropomorphic robot hand that can endure even strikes from a hammer without breaking into pieces

By Erico Guizzo
Posted 25 Jan 2011

#41 Who’s Afraid of the Uncanny Valley?

To design the androids of the future, we shouldn’t fear exploring the depths of the uncanny valley

By Erico Guizzo
Posted 2 Apr 2010

#42 Why AlphaGo Is Not AI

Google DeepMind’s artificial intelligence AlphaGo is a big advance but it will not get us to strong AI

By Jean-Christophe Baillie
Posted 17 Mar 2016

#43 Freaky Boneless Robot Walks on Soft Legs

This soft, inflatable, and totally creepy robot from Harvard can get up and walk on four squishy legs

By Evan Ackerman
Posted 29 Nov 2011

#44 Sweep Is a $250 LIDAR With Range of 40 Meters That Works Outdoors

Finally an affordable LIDAR for robots and drones

By Evan Ackerman
Posted 6 Apr 2016

#45 How Google Wants to Solve Robotic Grasping by Letting Robots Learn for Themselves

800,000 grasps is just the beginning for Google’s large-scale robotic grasping project

By Evan Ackerman
Posted 28 Mar 2016

#46 Whoa: Boston Dynamics Announces New WildCat Quadruped Robot

A new robot from Boston Dynamics can run outdoors, untethered, at up to 25 km/h

By Evan Ackerman
Posted 3 Oct 2013

#47 SCHAFT Unveils Awesome New Bipedal Robot at Japan Conference

SCHAFT demos a new bipedal robot designed to "help society"

By Evan Ackerman & Erico Guizzo
Posted 8 Apr 2016

#48 Riding Honda’s U3-X Unicycle of the Future

It only has one wheel, but Honda’s futuristic personal mobility device is no pedal-pusher

By Anne-Marie Corley
Posted 12 Apr 2010

#49 Lingodroid Robots Invent Their Own Spoken Language

These little robots make up their own words to tell each other where they are and where they want to go

By Evan Ackerman
Posted 17 May 2011

#50 Disney Robot With Air-Water Actuators Shows Off "Very Fluid" Motions

Meet Jimmy, a robot puppet powered by fluid actuators

By Erico Guizzo
Posted 1 Sep 2016

#51 Kilobots Are Cheap Enough to Swarm in the Thousands

What can you do with a $14 robot? Not much. What can you do with a thousand $14 robots? World domination

By Evan Ackerman
Posted 16 Jun 2011

#52 Honda Robotics Unveils Next-Generation ASIMO Robot

We heard some rumors that Honda was working on something big, and here it is: a brand new ASIMO

By Evan Ackerman
Posted 7 Nov 2011

#53 Cybernetic Third Arm Makes Drummers Even More Annoying

It keeps proper time and comes with an off switch, making this robotic third arm infinitely better than a human drummer

By Evan Ackerman
Posted 18 Feb 2016

#54 Chatbot Tries to Talk to Itself, Things Get Weird

"I am not a robot. I am a unicorn."

By Evan Ackerman
Posted 29 Aug 2011

#55 Dean Kamen’s "Luke Arm" Prosthesis Receives FDA Approval

This advanced bionic arm for amputees has been approved for commercialization

By Erico Guizzo
Posted 13 May 2014

#56 Meet the Amazing Robots That Will Compete in the DARPA Robotics Challenge

Over the next two years, robotics will be revolutionized, and here’s how it’s going to happen

By Evan Ackerman
Posted 24 Oct 2012

#57 ReWalk Robotics’s New Exoskeleton Lets Paraplegic Stroll the Streets of NYC

Yesterday, a paralyzed man strapped on a pair of robotic legs and stepped out a hotel door in midtown Manhattan

By Eliza Strickland
Posted 15 Jul 2015

#58 Drone Uses AI and 11,500 Crashes to Learn How to Fly

Crashing into objects has taught this drone to fly autonomously, by learning what not to do

By Evan Ackerman
Posted 10 May 2017

#59 Lego Announces Mindstorms EV3, a More ’Hackable’ Robotics Kit

Lego’s latest Mindstorms kit has a new IR sensor, runs on Linux, and is compatible with Android and iOS apps

By Erico Guizzo & Stephen Cass
Posted 7 Jan 2013

#60 Boston Dynamics’ Marc Raibert on Next-Gen ATLAS: "A Huge Amount of Work"

The founder of Boston Dynamics describes how his team built one of the most advanced humanoids ever

By Erico Guizzo & Evan Ackerman
Posted 24 Feb 2016

#61 AR Drone That Infects Other Drones With Virus Wins DroneGames

Other projects included a leashed auto-tweeting drone, and code to control a swarm of drones all at once

By Evan Ackerman
Posted 6 Dec 2012

#62 DARPA Robotics Challenge: A Compilation of Robots Falling Down

Gravity is a bad thing for robots

By Erico Guizzo & Evan Ackerman
Posted 6 Jun 2015

#63 Bosch’s Giant Robot Can Punch Weeds to Death

A modular agricultural robot from Bosch startup Deepfield Robotics deals with weeds the old fashioned way: violently

By Evan Ackerman
Posted 12 Nov 2015

#64 How to Make a Humanoid Robot Dance

Japanese roboticists demonstrate a female android singing and dancing along with a troupe of human performers

By Erico Guizzo
Posted 2 Nov 2010

#65 What Technologies Enabled Drones to Proliferate?

Five years ago few people had even heard of quadcopters. Now they seem to be everywhere

By Markus Waibel
Posted 19 Feb 2010

#66 Video Friday: Professor Ishiguro’s New Robot Child, and More

Your weekly selection of awesome robot videos

By Evan Ackerman, Erico Guizzo & Fan Shi
Posted 3 Aug 2018

#67 Drone Provides Colorado Flooding Assistance Until FEMA Freaks Out

Drones can provide near real-time maps in weather that grounds other aircraft, but FEMA has banned them

By Evan Ackerman
Posted 16 Sep 2013

#68 A Thousand Kilobots Self-Assemble Into Complex Shapes

This is probably the most robots that have ever been in the same place at the same time, ever

By Evan Ackerman
Posted 14 Aug 2014

#69 Boston Dynamics’ SpotMini Is All Electric, Agile, and Has a Capable Face-Arm

A fun-sized version of Spot is the most domesticated Boston Dynamics robot we’ve seen

By Evan Ackerman
Posted 23 Jun 2016

#70 Kenshiro Robot Gets New Muscles and Bones

This humanoid is trying to mimic the human body down to muscles and bones

By Angelica Lim
Posted 10 Dec 2012

#71 Roomba Inventor Joe Jones on His New Weed-Killing Robot, and What’s So Hard About Consumer Robotics

The inventor of the Roomba tells us about his new solar-powered, weed-destroying robot

By Evan Ackerman
Posted 6 Jul 2017

#72 George Devol: A Life Devoted to Invention, and Robots

George Devol’s most famous invention—the first programmable industrial robot—started a revolution in manufacturing that continues to this day

By Bob Malone
Posted 26 Sep 2011

#73 World Robot Population Reaches 8.6 Million

Here’s an estimate of the number of industrial and service robots worldwide

By Erico Guizzo
Posted 14 Apr 2010

#74 U.S. Senator Calls Robot Projects Wasteful. Robots Call Senator Wasteful

U.S. Senator Tom Coburn criticizes the NSF for squandering "millions of dollars on wasteful projects," including three that involve robots

By Erico Guizzo
Posted 14 Jun 2011

#75 Inception Drive: A Compact, Infinitely Variable Transmission for Robotics

A novel nested-pulley configuration forms the heart of a transmission that could make robots safer and more energy efficient

By Evan Ackerman & Celia Gorman
Posted 20 Sep 2017

#76 iRobot Demonstrates New Weaponized Robot

iRobot has released video showing a Warrior robot deploying an anti-personnel obstacle breaching system

By John Palmisano
Posted 30 May 2010

#77 Robotics Trends for 2012

Nearly a quarter of the year is already behind us, but we thought we’d spend some time looking at the months ahead and make some predictions about what’s going to be big in robotics

By Erico Guizzo & Travis Deyle
Posted 20 Mar 2012

#78 DRC Finals: CMU’s CHIMP Gets Up After Fall, Shows How Awesome Robots Can Be

The most amazing run we saw all day came from CHIMP, which was the only robot to fall and get up again

By Evan Ackerman & Erico Guizzo
Posted 5 Jun 2015

#79 Lethal Microdrones, Dystopian Futures, and the Autonomous Weapons Debate

The future of weaponized robots requires a reasoned discussion, not scary videos

By Evan Ackerman
Posted 15 Nov 2017

#80 Every Kid Needs One of These DIY Robotics Kits

For just $200, this kit from a CMU spinoff company is a great way for total beginners to get started building robots

By Evan Ackerman
Posted 11 Jul 2012

#81 Beautiful Fluid Actuators from Disney Research Make Soft, Safe Robot Arms

Routing forces through air and water allows for displaced motors and safe, high-performance arms

By Evan Ackerman
Posted 9 Oct 2014

#82 Boston Dynamics Sand Flea Robot Demonstrates Astonishing Jumping Skills

Watch a brand new video of Boston Dynamics’ Sand Flea robot jumping 10 meters into the air

By Evan Ackerman
Posted 28 Mar 2012

#83 Eyeborg: Man Replaces False Eye With Bionic Camera

Canadian filmmaker Rob Spence has replaced his false eye with a bionic camera eye. He showed us his latest prototype

By Tim Hornyak
Posted 11 Jun 2010

#84 We Should Not Ban ‘Killer Robots,’ and Here’s Why

What we really need is a way of making autonomous armed robots ethical, because we’re not going to be able to prevent them from existing

By Evan Ackerman
Posted 28 Jul 2015

#85 Yale’s Robot Hand Copies How Your Fingers Work to Improve Object Manipulation

These robotic fingers can turn friction on and off to make it easier to manipulate objects with one hand

By Evan Ackerman
Posted 12 Sep 2018

#86 France Developing Advanced Humanoid Robot Romeo

Nao, the small French humanoid robot, is getting a big brother

By Erico Guizzo
Posted 13 Dec 2010

#87 DARPA Wants to Give Soldiers Robot Surrogates, Avatar Style

Soldiers controlling bipedal robot surrogates on the battlefield? It’s not science fiction, it’s DARPA’s 2012 budget

By Evan Ackerman
Posted 17 Feb 2012

#88 Whoa: Quadrotors Play Catch With Inverted Pendulum

Watch these quadrotors balance a stick on its end, and then toss it back and forth

By Evan Ackerman
Posted 21 Feb 2013

#89 Why We Should Build Humanlike Robots

Humans are brilliant, beautiful, compassionate, and capable of love. Why shouldn’t we aspire to make robots humanlike in these ways?

By David Hanson
Posted 1 Apr 2011

#90 DARPA Robotics Challenge Finals: Know Your Robots

All 25 robots in a single handy poster-size image

By Erico Guizzo & Evan Ackerman
Posted 3 Jun 2015

#91 Here’s That Extra Pair of Robot Arms You’ve Always Wanted

MIT researchers develop wearable robotic arms that can give you an extra hand (or two)

By Evan Ackerman
Posted 2 Jun 2014

#92 Rat Robot Beats on Live Rats to Make Them Depressed

A robotic rat can be used to depress live rats to make them suitable for human drug trials

By Evan Ackerman
Posted 13 Feb 2013

#93 MIT Cheetah Robot Bounds Off Tether, Outdoors

The newest version of MIT’s Cheetah is fast, it’s quiet, and it jumps

By Evan Ackerman
Posted 15 Sep 2014

#94 Bizarre Soft Robots Evolve to Run

These simulated robots may be wacky looking, but they’ve evolved on their own to be fast and efficient

By Evan Ackerman
Posted 11 Apr 2013

#95 Robot Car Intersections Are Terrifyingly Efficient

In the future, robots will blow through intersections without stopping, and you won’t be able to handle it

By Evan Ackerman
Posted 13 Mar 2012

#96 iRobot’s New Roomba 800 Series Has Better Vacuuming With Less Maintenance

A redesigned cleaning system makes the new vacuum way better at dealing with hair (and everything else)

By Evan Ackerman
Posted 12 Nov 2013

#97 Sawyer: Rethink Robotics Unveils New Robot

It’s smaller, faster, stronger, and more precise: meet Sawyer, Rethink Robotics’ new manufacturing robot

By Evan Ackerman & Erico Guizzo
Posted 19 Mar 2015

#98 Cynthia Breazeal Unveils Jibo, a Social Robot for the Home

The famed MIT roboticist is launching a crowdfunding campaign to bring social robots to consumers

By Erico Guizzo
Posted 16 Jul 2014

#99 These Robots Will Teach Kids Programming Skills

Startup Play-i says its robots can make computer programming fun and accessible

By Erico Guizzo
Posted 30 Oct 2013

#100 Watch a Swarm of Flying Robots Build a 6-Meter Brick Tower

This is what happens when a bunch of roboticists and architects get together in an art gallery

By Erico Guizzo
Posted 2 Dec 2011

As far as I know, the current state of the art in indoor mosquito management is frantically trying to determine where that buzzing noise is coming from so that you can whack the damn bug before it lands and you lose track of it.

This “system” rarely works, but at CES this week, we found something better: Israeli startup Bzigo (the name of both the company and the product) makes a mosquito detection and tracking system that combines an IR camera and a laser designator with computer vision algorithms to follow mosquitoes as they fly and tell you exactly where they land so you can smash them. It’s not quite as deadly as the backyard Star Wars system, but it’s a lot more practical, because you’ll be able to buy one.

Bzigo’s visual tracking system can reliably spot mosquitoes at distances of up to 8 meters. A single near-IR (850 nm) camera with a pair of IR illuminators and a wide-angle lens can spot mosquitoes across an entire room, day or night. Once a bug is detected, an eye-safe laser follows it until it lands and then draws a box around it so you can attack with your implement of choice.

At maximum range, you run into plenty of situations where the apparent size of a mosquito can be less than a single pixel. Bzigo’s AI relies on a mosquito’s motion rather than an identifiable image of the bug itself, and tracking those potentially intermittent and far-off pixel traces requires four 1 GHz cores running at 100% continuously (all on-device). That’s a lot of oomph, but the result is that false positives are down around 1%, and 90% of landings are detected. This is not to say that the system can detect only 90% of bugs: since mosquitoes take off and land frequently, they’re almost always detected after just a few flight cycles. It’s taken Bzigo four years to reach this level of accuracy and precision with detection and tracking, and it’s impressive.
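Bzigo hasn’t published how its tracker works, but the broad idea it describes (keying on motion rather than on the appearance of the insect) is easy to illustrate. Here’s a minimal frame-differencing sketch in Python with OpenCV; the camera index and thresholds are made-up placeholders, and a real sub-pixel tracker would accumulate evidence over many frames rather than trusting any single difference image.

import cv2

def detect_motion_boxes(prev_gray, curr_gray, diff_thresh=8):
    """Return bounding boxes of anything that moved between two grayscale frames."""
    diff = cv2.absdiff(curr_gray, prev_gray)          # change between frames, not appearance
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)       # grow tiny (even single-pixel) hits into blobs
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours]

cap = cv2.VideoCapture(0)                             # hypothetical near-IR camera feed
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detect_motion_boxes(prev_gray, gray):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 1)
    cv2.imshow("motion", frame)
    if cv2.waitKey(1) == 27:                          # Esc to quit
        break
    prev_gray = gray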

The super obvious missing feature is that this system only points at mosquitoes, as opposed to actually dealing with them in a direct (and lethal) way. You could argue that it’s the detection and tracking that’s the hard part (and it certainly is for humans), and that automated bug destruction is a lower priority, and you’d probably be right.

Or at least, Bzigo would agree with you, because that’s the approach they’ve taken. However, there are plans for a Bzigo V2, which adds a dedicated mosquito-killing feature. If you guessed that said feature would involve replacing the laser designator with a much higher-powered laser bug zapper, you’d be wrong, because we’re told that the V2 will rely on a custom nano-drone to destroy its winged foes at close range.

Bzigo has raised one round of funding, and they’re currently looking to raise a bit more to fund manufacturing of the device. Once that comes through, the first version of the system should be available in 12-14 months for about $170.

This robot is Hiro-chan. It’s made by Vstone, a Japanese robotics company known for producing a variety of totally normal educational and hobby robotics kits and parts. Hiro-chan is not what we would call totally normal, since it very obviously does not have a face. Vstone calls Hiro-chan a “healing communication device,” and while the whole faceless aspect is definitely weird, there is a reason for it, which unsurprisingly involves Hiroshi Ishiguro and his ATR Lab.

Hiro-chan’s entire existence seems to be based around transitioning from sad to happy in response to hugs. If left alone, Hiro-chan’s mood will gradually worsen and it’ll start crying. If you pick it up and hug it, an accelerometer will sense the motion, and Hiro-chan’s mood will improve until it starts to laugh. This is the extent of the interaction, but you’ll be glad to know that the robot has access to over 100 utterance variations collected from an actual baby (or babies) to make sure that mood changes are fluid and seamless. 
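Vstone hasn’t described its firmware, but the behavior as described boils down to a single mood variable that decays while the robot is left alone and recovers while the accelerometer registers rocking. A toy sketch (all thresholds and rates invented):

class HiroChanToy:
    """Toy model of the described hug-to-happy loop; nothing here is Vstone's actual code."""

    def __init__(self):
        self.mood = 0.5                       # 0.0 = crying, 1.0 = laughing

    def update(self, accel_g, dt):
        # Hypothetical rule: sustained acceleration above ~1.2 g means the robot is being
        # picked up and rocked; otherwise it is sitting still and its mood slowly decays.
        if accel_g > 1.2:
            self.mood = min(1.0, self.mood + 0.05 * dt)
        else:
            self.mood = max(0.0, self.mood - 0.01 * dt)
        return self.utterance()

    def utterance(self):
        # The real robot reportedly draws on 100-plus recorded baby utterances so the
        # transitions sound smooth; here we just bucket the mood into three sounds.
        if self.mood < 0.3:
            return "cry"
        if self.mood > 0.7:
            return "laugh"
        return "coo"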

According to Japanese blog RobotStart, the target demographic for Hiro-chan is seniors, although it’s simple enough in operation that pretty much anyone could likely pick one up and figure out what they’re supposed to do with it. The end goal is the “healing effect” (a sense of accomplishment, I guess?) that you’d get from making the robot feel better.

Photo: Vstone

At 5,500 JPY (about US $50), Vstone expects that Hiro-chan could be helpful with seniors in nursing homes.

So why doesn’t the robot have a face? Since the functionality of the robot depends on you getting it to go from sad to happy, Vstone says that giving the robot a face (and a fixed expression) would make that much less convincing and emotionally fulfilling—the robot would have the “wrong” expression half the time. Instead, the user can listen to Hiro-chan’s audio cues and imagine a face. Or not. Either way, the Uncanny Valley effect is avoided (as long as you can get over the complete lack of a face, which I personally couldn’t), and the cost of the robot is kept low since there’s no need for actuators or a display.

Photo: Hiroshi Ishiguro/Osaka University/ATR

The Telenoid robot developed by Hiroshi Ishiguro’s group at ATR in Japan.

This concept that a user could imagine or project features and emotions onto a robot as long as it provides a blank enough slate came from Hiroshi Ishiguro with Telenoid, followed by Elfoid and Hugvie. While Telenoid and Elfoid did have faces, those faces were designed to look neither young nor old, and neither male nor female. When you communicate with another human through Telenoid or Elfoid, the neutral look of the robot makes it easier for you to imagine that it looks something like whoever’s on the other end. Or that’s the idea, anyway. Hiro-chan itself was developed in cooperation with Hidenobu Sumioka, who leads the Presence Media Research Group at Hiroshi Ishiguro Laboratory at ATR.

Vstone says the lack of a face is expected to enhance user attachment to the robot, and that testing during product development “showed that designs without faces were as popular as designs with faces.” Users can also enhance attachment by making clothing for the robot, Vstone suggests, and will provide patterns on its website when Hiro-chan is released. Otherwise, there’s really not much to the robot: It runs on AA batteries, has an on-off switch, and mercifully, a volume control, although the FAQ on the robot suggests that it may sometimes laugh even if it’s all by itself in a different room, which is not creepy at all.

Photo: Vstone

Vstone says the lack of a face is expected to enhance user attachment to the robot.

At 5,500 JPY (about US $50), Vstone expects that Hiro-chan could be helpful with seniors in nursing homes, relating this anecdote: 

In tests at nursing homes that cooperated with the development of Hiro-chan, even residents who did not respond to facility staff spontaneously began crying when Hiro-chan started crying, and were seen smiling when Hiro-chan started laughing. By introducing Hiro-chan, you can expect not only a healing effect for the user, but also a reduction in the workload of facility staff.

Sounds like a great idea, but I still don’t want one.

[ Vstone ]

When Anki abruptly shut down in April of last year, things looked bleak for Vector, Cozmo, and the little Overdrive racing cars. Usually, abrupt shutdowns don’t end well, with assets and intellectual property getting liquidated and effectively disappearing forever. Despite some vague promises (more like hopes, really) from Anki at the time that their cloud-dependent robots would continue to operate, it was pretty clear that Anki’s robots wouldn’t have much of a future—at best, they’d continue to work only as long as there was money to support the cloud servers that gave them their spark of life.

A few weeks ago, The Robot Report reported that Anki’s intellectual property (patents, trademarks, and data) was acquired by Digital Dream Labs, an education tech startup based in Pittsburgh. Over the weekend, a new post on the Vector Kickstarter page (the campaign happened in 2018) from Digital Dream Labs CEO Jacob Hanchar announced that not only will Vector’s cloud servers keep running indefinitely, but that the next few months will see a new Kickstarter to add new features and future-proofing to Vectors everywhere.

Here’s the announcement from Hanchar:

I wanted to let you know that we have purchased Anki's assets and intend to restore the entire platform and continue to develop the robot we all know and love, Vector!

The most important part of this update is to let you know we have taken over the cloud servers and are going to maintain them going forward.  Therefore, if you were concerned about Vector 'dying' one day, you no longer have to worry!  

The next portion of this update is to let you know what we have planned next and we will be announcing a KickStarter under Digital Dream Labs in the next month or two.  While we are still brainstorming we are thinking the Kickstarter will focus on two features we have seen as major needs in the Vector community:

1)  We will develop an "Escape Pod".  This will, safely, expose settings and allow the user to move and set endpoints, and by doing so, remove the need for the cloud server.  In other words, if you're concerned Anki's demise could also happen to us, this is your guarantee that no matter what happens, you'll always get to play with Vector!

2)  We will develop a "Dev Vector".  Many users have asked us for open source and the ability to do more with their Vector even to the point of hosting him on their own servers.  With this feature, developers will be able to customize their robot through a bootloader we will develop.  With the robot unlocked, technologists and hobbyists across the globe will finally be able to hack, with safe guards in place, away at Vector for the ultimate AI and machine learning experience!

As a bonus, we will see about putting together an SDK so users can play with Vector's audio stream and system, which we have discovered is a major feature you guys love about this little guy!

This is just the beginning and subject to change, but because you have shown such loyalty and got this project off the ground in the first place, I felt it was necessary to communicate these developments as soon as possible! 

There are a few more details in the comments on this post—Hanchar notes that they didn’t get any of Anki’s physical inventory, meaning that at least for now, you won’t be able to buy any robots from them. However, Hanchar told The Robot Report that they’ve been talking with ex-Anki employees and manufacturers about getting new robots, with a goal of having the whole family (Vector, Cozmo, and Overdrive) available for the 2020 holidays. 

Photo: Anki Anki’s Cozmo robot.

Despite the announcement on the Vector Kickstarter page, it sounds like Cozmo will be the initial focus, because Cozmo works best with Digital Dream Labs’ existing educational products. The future of Vector, presumably, will depend on how well the forthcoming Kickstarter does. In its FAQ about the Anki acquisition, Digital Dream Labs says that they “will need to examine the business model surrounding Vector before we can relaunch that product,” and speaking with The Robot Report, Hanchar suggested that “monthly subscription packages” in a few different tiers might be the way to make sure that Vector stays profitable. 

It’s probably too early to get super excited about this, but it’s definitely far better news than we were expecting, and Anki’s robots now seem like they could potentially have a future. Hanchar even mentioned something about a “Vector 2.0,” whatever that means. In the short term, I think most folks would be pretty happy with a Vector 1.0 with support, some new features, and no expiration date, and that could be exactly what we’re getting. 

[ Anki Vector ]


Photo: Caltech This lower-body exoskeleton, developed by Wandercraft, will allow disabled users to walk more dynamically.

Bipedal robots have long struggled to walk as humans do—balancing on two legs and moving with that almost-but-not-quite falling forward motion that most of us have mastered by the time we’re a year or two old. It’s taken decades of work, but robots are starting to get comfortable with walking, putting them in a position to help people in need.

Roboticists at the California Institute of Technology have launched an initiative called RoAMS (Robotic Assisted Mobility Science), which uses the latest research in robotic walking to create a new kind of medical exoskeleton. With the ability to move dynamically, using neurocontrol interfaces, these exoskeletons will allow users to balance and walk without the crutches that are necessary with existing medical exoskeletons. This might not seem like much, but consider how often you find yourself standing up and using your hands at the same time.

“The only way we’re going to get exoskeletons into the real world helping people do everyday tasks is through dynamic locomotion,” explains Aaron Ames, a professor of civil and mechanical engineering at Caltech and colead of the RoAMS initiative. “We’re imagining deploying these exoskeletons in the home, where a user might want to do things like make a sandwich and bring it to the couch. And on the clinical side, there are a lot of medical benefits to standing upright and walking.”

The Caltech researchers say their exoskeleton is ready for a major test: They plan to demonstrate dynamic walking through neurocontrol this year.

Getting a bipedal exoskeleton to work so closely with a human is a real challenge. Ames explains that researchers have a deep and detailed understanding of how their robotic creations operate, but biological systems still present many unknowns. “So how do we get a human to successfully interface with these devices?” he asks.

There are other challenges as well. Ashraf S. Gorgey, an associate professor of physical medicine and rehabilitation at Virginia Commonwealth University, in Richmond, who has researched exoskeletons, says factors such as cost, durability, versatility, and even patients’ desire to use the device are just as important as the technology itself. But he adds that as a research system, Caltech’s approach appears promising: “Coming up with an exoskeleton that can provide balance to patients, I think that’s huge.”

Photo: Caltech Caltech researchers prepare for a walking demonstration with the exoskeleton.

One of Ames’s colleagues at Caltech, Joel Burdick, is developing a spinal stimulator that can potentially help bypass spinal injuries, providing an artificial connection between leg muscles and the brain. The RoAMS initiative will attempt to use this technology to exploit the user’s own nerves and muscles to assist with movement and control of the exoskeleton—even for patients with complete paraplegia. Coordinating nerves and muscles with motion can also be beneficial for people undergoing physical rehabilitation for spinal cord injuries or stroke, where walking with the support and assistance of an exoskeleton can significantly improve recovery, even if the exoskeleton does most of the work.

“You want to train up that neurocircuitry again, that firing of patterns that results in locomotion in the corresponding muscles,” explains Ames. “And the only way to do that is have the user moving dynamically like they would if they weren’t injured.”

Caltech is partnering with a French company called Wandercraft to transfer this research to a clinical setting. Wandercraft has developed an exoskeleton that has received clinical approval in Europe, where it has already enabled more than 20 paraplegic patients to walk. In 2020, the RoAMS initiative will focus on directly coupling brain or spine interfaces with Wandercraft’s exoskeleton to achieve stable dynamic walking with integrated neurocontrol, which has never been done before.

Ames notes that these exoskeletons are designed to meet very specific challenges. For now, their complexity and cost will likely make them impractical for most people with disabilities to use, especially when motorized wheelchairs can more affordably fulfill many of the same functions. But he is hoping that the RoAMS initiative is the first step toward bringing the technology to everyone who needs it, providing an option for situations that a wheelchair or walker can’t easily handle.

“That’s really what RoAMS is about,” Ames says. “I think this is something where we can make a potentially life-changing difference for people in the not-too-distant future.”

This article appears in the January 2020 print issue as “This Exoskeleton Will Obey Your Brain.”

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA HRI 2020 – March 23-26, 2020 – Cambridge, U.K. ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores ICRA 2020 – May 31-June 4, 2020 – Paris, France

Let us know if you have suggestions for next week, and enjoy today’s videos.

IIT’s new HyQReal quadruped robot was released in May 2019. This highlight video shows previously unpublished footage of how we prepared the robot to pull a 3.3 ton airplane. Additionally, it shows the robot walking over unstructured terrain and during public events in October 2019. Including a face-to-face with a dog.

[ IIT ]

Thanks Claudio!

Agility Robotics has had a very busy 2019, and all 10 minutes of this video is worth watching.

Also: double Digits.

[ Agility Robotics ]

Happy (belated) holidays from Franka Emika!

[ Franka Emika ]

Thanks Anna!

Happy (belated) holidays from the GRASP lab!

[ GRASP Lab ]

Happy (belated) holidays from the Autonomous Robots Lab at the University of Nevada!

[ ARL ]

Happy (belated) holidays from the Georgia Tech Systems Research Lab!

[ GA Tech ]

Thanks Qiuyang!

NASA’s Jet Propulsion Laboratory has attached the Mars 2020 Helicopter to the belly of the Mars 2020 rover.

[ JPL ]

This isn’t a Roomba, mind you—are we at the point where “Roomba” is like “Xerox” or “Velcro,” representing a category rather than a brand?—but it does have a flying robot vacuum in it.

[ YouTube ] via [ Gizmodo ]

We’ve said it before, and it’s still true: Every quadrotor should have failsafe software like this.

[ Verity ]

KUKA robots are on duty at one of the largest tea factories in the world located in Rize, Turkey.

[ Kuka ]

This year, make sure and take your robot for more walks.

[ Sphero ]

Dorabot’s recycling robot can identify, pick, and sort recyclable items such as plastic bottles, glass bottles, paper, cartons, and aluminum cans. The robot uses deep learning-based computer vision and dynamic planning to select items on a moving conveyor belt. It also includes customized, erosion-resistant grippers to pick irregularly shaped items, resulting in a cost-effective integrated solution.

[ Dorabot ]

This cute little boat takes hyperlapse pictures autonomously, while more or less not sinking.

[ rctestflight ] via [ PetaPixel ]

Roboy’s Research Reviews takes a look at the OmniSkins paper from 2018.

[ RRR ]

When thinking about robot ethics (and robots in general), it’s typical to use humans and human ethics as a baseline. But what if we considered animals as a point of comparison instead? Ryan Calo, Kate Darling, and Paresh Kathrani were on a panel at the Animal Law Conference last month, entitled “Persons yet Unknown: Animals, Chimeras, Artificial Intelligence and Beyond,” where this idea was explored.

[ YouTube ]

Sasha Iatsenia, who was until very recently head of product at Kiwibot, gives a candid talk about “How (not) to build autonomous robots.”

We should mention that Kiwibot does seem to still be alive.

[ CCC ]

On this episode of the Artificial Intelligence Podcast, Lex Fridman interviews Sebastian Thrun.

[ AI Podcast ]

Photo: FarmWise FarmWise’s AI-powered robots drive autonomously through crops, looking for weeds to kill.

At first glance, the crops don’t look any different from other crops blanketing the Salinas Valley, in California, which is often called “America’s salad bowl.” All you see are rows and rows of lettuce, broccoli, and cauliflower stretching to the horizon. But then the big orange robots roll through.

The machines are on a search-and-destroy mission. Their target? Weeds. Equipped with tractorlike wheels and an array of cameras and environmental sensors, they drive autonomously up and down the rows of produce, hunting for any leafy green invaders. Rather than spraying herbicides, they deploy a retractable hoe that kills the weeds swiftly and precisely.

The robots belong to FarmWise, a San Francisco startup that wants to use robotics and artificial intelligence to make agriculture more sustainable—and tastier. The company has raised US $14.5 million in a recent funding round, and in 2020 it plans to deploy its first commercial fleet of robots, with more than 10 machines serving farmers in the Salinas Valley.

FarmWise says that although its robots are currently optimized for weeding, future designs will do much more. “Our goal is to become a universal farming platform,” says cofounder and CEO Sébastien Boyer. “We want to automate pretty much all tasks from seeding all the way to harvesting.”

Boyer envisions the robots collecting vast amounts of data, including detailed images of the crops and parameters that affect their health such as temperature, humidity, and soil conditions. But it’s what the robots will do with the data that makes them truly remarkable. Using machine learning, they’ll identify each plant individually, determine whether it’s thriving, and tend to it accordingly. Thanks to these AI-powered robots, every broccoli stalk will get the attention it needs to be the best broccoli it can be.

Automation is not new to agriculture. Wheeled harvesters are increasingly autonomous, and farmers have long been flying drones to monitor their crops from above. Also under development are robots designed to pick fruits and vegetables—apples, peppers, strawberries, tomatoes, grapes, cucumbers, asparagus. More recently, a number of robotics companies have turned their attention to ways they can improve the quality or yield of crops.

Farming robots are still a “very nascent market,” says Rian Whitton, a senior analyst at ABI Research, in London, but it’s one that will “expand significantly over the next 10 years.” ABI forecasts that annual shipments of mobile robots for agriculture will exceed 100,000 units globally by 2030, 100 times the volume deployed today.

It’s still a small number compared with the millions of tractors and other farming vehicles sold each year, but Whitton notes that demand for automation will likely accelerate due to labor shortages in many parts of the world.

Photo: FarmWise FarmWise plans to deploy its first commercial fleet of robots in the Salinas Valley, in California.

FarmWise says it has worked closely with farmers to understand their needs and develop its robots based on their feedback. So how do they work? Boyer is not prepared to reveal specifics about the company’s technology, but he says the machines operate in three steps.

First, the sensor array captures images and other relevant data about the crops and stores that information on both onboard computers and cloud servers. The second step is the decision-making process, in which specialized deep-learning algorithms analyze the data. There’s an algorithm trained to detect plants in an image, and the robots combine that output with GPS and other location data to precisely identify each plant. Another algorithm is trained to decide whether a plant is, say, a lettuce head or a weed. The final step is the physical action that the machines perform on the crops—for example, deploying the weeding hoe.
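
To make that three-step loop concrete, here's a hedged Python sketch of a single sense-decide-act weeding pass. FarmWise hasn't published its code, so the detector, the confidence threshold, and the deploy_hoe call below are hypothetical stand-ins for whatever the company actually runs.

from dataclasses import dataclass

@dataclass
class Detection:
    x_cm: float        # position along the row, fused from GPS and odometry
    is_crop: bool      # True for lettuce/broccoli, False for a weed
    confidence: float

def detect_plants(image):
    """Step 1-2 (hypothetical): run a trained detector over a camera frame."""
    # A real system would call a deep-learning model here; we fake two detections.
    return [Detection(x_cm=12.0, is_crop=True, confidence=0.97),
            Detection(x_cm=19.5, is_crop=False, confidence=0.91)]

def deploy_hoe(x_cm):
    """Step 3 (hypothetical actuator call): strike the soil at x_cm."""
    print(f"hoe strike at {x_cm:.1f} cm")

def weeding_step(image, min_confidence=0.9):
    """One pass of the sense-decide-act loop described in the article."""
    for det in detect_plants(image):
        if not det.is_crop and det.confidence >= min_confidence:
            deploy_hoe(det.x_cm)         # only weeds above the threshold get the hoe

weeding_step(image=None)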

Boyer says the robots perform the three steps in less than a second. Indeed, the robots can drive through the fields clearing the soil at a pace that would be virtually impossible for humans to match. FarmWise says its robots have removed weeds from more than 10 million plants to date.

Whitton, the ABI analyst, says focusing on weeding as an initial application makes sense. “There are potentially billions of dollars to be saved from less pesticide use, so that’s the fashionable use case,” he says. But he adds that commercial success for agriculture automation startups will depend on whether they can expand their services to perform additional farming tasks as well as operate in a variety of regions and climates.

Already FarmWise has a growing number of competitors. Deepfield Robotics, a spin-out of the German conglomerate Robert Bosch, is testing an autonomous vehicle that kills weeds by punching them into the ground. The Australian startup Agerris is developing mobile robots for monitoring and spraying crops. And Sunnyvale, Calif.–based Blue River Technology, acquired by John Deere in 2017, is building robotic machines for weeding large field crops like cotton and soybeans.

FarmWise says it has recently completed a redesign of its robots. The new version is better suited to withstand the harsh conditions often found in the field, including mud, dust, and water. The company is now expanding its staff as it prepares to deploy its robotic fleet in California, and eventually in other parts of the United States and abroad.

Boyer is confident that farms everywhere will one day be filled with robots—and that they’ll grow some of the best broccoli you’ve ever tasted.

Photo: Boeing No cockpit mars the clean lines of this unpiloted blue streak.

If you drive along the main northern road through South Australia with a good set of binoculars, you may soon be able to catch a glimpse of a strange, windowless jet, one that is about to embark on its maiden flight. It’s a prototype of the next big thing in aerial combat: a self-piloted warplane designed to work together with human-piloted aircraft.

The Royal Australian Air Force (RAAF) and Boeing Australia are building this fighterlike plane for possible operational use in the mid-2020s. Trials are set to start this year, and although the RAAF won’t confirm the exact location, the quiet electromagnetic environment, size, and remoteness of the Woomera Prohibited Area make it a likely candidate. Named for ancient Aboriginal spear throwers, Woomera spans an area bigger than North Korea, making it the largest weapons-testing range on the planet.

The autonomous plane, formally called the Airpower Teaming System but often known as “Loyal Wingman,” is 11.7 meters (38 feet) long and clean cut, with sharp angles offset by soft curves. The look is quietly aggressive.

Three prototypes will be built under a project first revealed by Boeing and the RAAF in February 2019. Those prototypes are not meant to meet predetermined specifications but rather to help aviators and engineers work out the future of air combat. This may be the first experiment to truly portend the end of the era of crewed warplanes.

“We want to explore the viability of an autonomous system and understand the challenges we’ll face,” says RAAF Air Commodore Darren Goldie.

Australia has chipped in US $27 million (AU $40 million), but the bulk of the cost is borne by Boeing, and the company will retain ownership of the three prototypes. Boeing says the project is the largest investment in uncrewed aircraft it’s ever made outside the United States, although a spokesperson would not give an exact figure.

The RAAF already operates a variety of advanced aircraft, such as Lockheed Martin F-35 jets, but these $100 million fighters are increasingly seen as too expensive to send into contested airspace. You don’t swat a fly with a gold mallet. The strategic purpose of the Wingman project is to explore whether comparatively cheap and expendable autonomous fighters could bulk up Australia’s air power. Sheer strength in numbers may prove handy in deterring other regional players, notably China, which are expanding their own fleets.

“Quantity has a quality of its own,” Goldie says.

The goal of the project is to put cost before capability, creating enough “combat mass” to overload enemy calculations. During operations, Loyal Wingman aircraft will act as extensions of the piloted aircraft they accompany. They could collect intelligence, jam enemy electronic systems, and possibly drop bombs or shoot down other planes.

“They could have a number of uses,” Goldie says. “An example might be a manned aircraft giving it a command to go out in advance to trigger enemy air defense systems—similar to that achieved by [U.S.-military] Miniature Air-Launched Decoys.”

The aircraft are also designed to operate as a swarm. Many of these autonomous fighters with cheap individual sensors, for example, could fly in a “distributed antenna” geometry, collectively creating a greater electromagnetic aperture than you could get with a single expensive sensor. Such a distributed antenna could also help the system resist jamming.
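
A quick back-of-the-envelope sketch (mine, not Boeing's) shows why a swarm of cheap receivers can stand in for one expensive sensor: coherently combining N noisy copies of the same return improves the signal-to-noise ratio by roughly a factor of N, which is the basic argument for a distributed antenna. The sensor count and noise level below are arbitrary illustrative values.

import numpy as np

rng = np.random.default_rng(1)
N_SENSORS = 16                      # number of cheap, synchronized receivers (assumed)
samples = 4_000

signal = np.sin(2 * np.pi * 0.01 * np.arange(samples))                   # common target return
receivers = signal + rng.normal(scale=3.0, size=(N_SENSORS, samples))    # independent noise per sensor

def snr_db(x, reference):
    """Signal-to-noise ratio of x against the known clean reference, in dB."""
    noise = x - reference
    return 10 * np.log10(np.mean(reference**2) / np.mean(noise**2))

single = receivers[0]
combined = receivers.mean(axis=0)   # coherent combination after time alignment

print(f"single sensor SNR:    {snr_db(single, signal):5.1f} dB")
print(f"{N_SENSORS}-sensor array SNR: {snr_db(combined, signal):5.1f} dB")
# Expect roughly 10*log10(N) = 12 dB of improvement for N = 16.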

“This is a really big concept, because you’re giving the pilots in manned aircraft a bigger picture,” Boeing Australia director Shane Arnott says. These requirements have created two opposing goals: On one hand, the Wingman must be stealthy, fast, and maneuverable, with some level of autonomy. On the other, it must be cheap enough to be expendable.

The development of Wingman began with numerical simulations, as Boeing Australia and the RAAF applied computational fluid dynamics to calculate the aerodynamic properties of the plane. Physical prototypes were then built for testing in wind tunnels, designing electrical wiring, and the other stages of systems engineering. Measurements from sensors attached to a prototype were used to create and refine a “digital twin,” which Arnott describes as one of the most comprehensive Boeing has ever made. “That will become important as we upgrade the system, integrate new sensors, and come up with different approaches to help us with the certification phase,” Arnott says.

The physical result is a clean-sheet design with a custom exterior and a lot of off-the-shelf components inside. The composite exterior is designed to reflect radar as weakly as possible. Sharply angled surfaces, called chines, run from the nose to the air intakes on either side of the lower fuselage; chines then run further back from those intakes to the wings and to twin tail fins, which are slightly canted from the vertical.

This design avoids angles that might reflect radar signals straight back to the source, like a ball bouncing off the inside corner of a box. Instead, the design deflects them erratically. Payloads are hidden in the belly. Of course, if the goal is to trigger enemy air defense systems, such a plane could easily turn nonstealthy.
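
That “inside corner of a box” analogy is just geometry: two bounces off perpendicular surfaces send a ray straight back where it came from, which is exactly what stealth shaping tries to avoid. The short sketch below, my illustration rather than anything from Boeing, shows the effect in 2D.

import numpy as np

def reflect(ray, normal):
    """Reflect a 2D direction vector off a surface with normal vector `normal`."""
    normal = normal / np.linalg.norm(normal)
    return ray - 2 * np.dot(ray, normal) * normal

incoming = np.array([0.6, -0.8])                          # arbitrary incoming radar ray
after_wall = reflect(incoming, np.array([1.0, 0.0]))      # bounce off a vertical face
after_floor = reflect(after_wall, np.array([0.0, 1.0]))   # then off a horizontal face

# Two perpendicular bounces reverse the ray: it heads straight back toward the radar.
print(after_floor, np.allclose(after_floor, -incoming))   # [-0.6  0.8] True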

The design benefits from the absence of a pilot. There is no cockpit to break the line, nor a human who must be protected from the brain-draining forces of acceleration.

“The ability to remove the human means you’re fundamentally allowing a change in the design of the aircraft, particularly the pronounced forward part of the fuselage,” Goldie says. “Lowering the profile can lower the radar cross section and allow a widened flight envelope.”

The trade-off is cost. To keep it down, the Wingman uses what Boeing calls a “very light commercial jet engine” to achieve a range of about 3,700 km (2,300 miles), roughly the distance between Seville and Moscow. The internal sensors are derived from those miniaturized for commercial applications.

Additional savings have come from Boeing’s prior investments in automating its supply chains. The composite exterior is made using robotic manufacturing techniques first developed for commercial planes at Boeing’s aerostructures fabrication site in Melbourne, the company’s largest factory outside the United States.

The approach has yielded an aircraft that is cheaper, faster, and more agile than today’s drones. The most significant difference, however, is that the Wingman can make its own decisions. “Unmanned aircraft that are flown from the ground are just manned from a different part of the system. This is a different concept,” Goldie says. “There’s nobody physically telling the system to iteratively go up, left, right, or down. The aircraft could be told to fly to a position and do a particular role. Inherent in its design is an ability to achieve that reliably.”

Setting the exact parameters of the Loyal Wingman’s autonomy—which decisions will be made by the machine and which by a human—is the main challenge. If too much money is invested in perfecting the software, the Wingman could become too expensive; too little, however, may leave it incapable of carrying out the required operations.

The software itself has been developed using the digital twin, a simulation that has been digitally “flown” thousands of times. Boeing is also using 15 test-bed aircraft to “refine autonomous control algorithms, data fusion, object-detection systems, and collision-avoidance behaviors,” the company says on its website. These include five higher-performance test jets.

“We understand radar cross sections and g-force stress on an aircraft. We need to know more about the characteristics of the autonomy that underpins that, what it can achieve and how reliable it can be,” Goldie says.

“Say you have an autonomous aircraft flying in a fighter formation, and it suddenly starts jamming frequencies the other aircraft are using or [are] reliant upon,” he continues. “We can design the aircraft to not do those things, but how do we do that and keep costs down? That’s a challenge.”

Arnott also emphasizes the exploratory nature of the Loyal Wingman program. “Just as we’ve figured out what is ‘good enough’ for the airframe, we’re figuring out what level of autonomy is also ‘good enough,’ ” Arnott says. “That’s a big part of what this program is doing.”

The need to balance capability and cost also affects how the designers can protect the aircraft against enemy countermeasures. The Wingman’s stealth and maneuverability will make it harder to hit with antiaircraft missiles that rely on impact to destroy their targets, so the most plausible countermeasures are cybertechniques that hack the aircraft’s communications, perhaps to tell it to fly home, or electromagnetic methods that fry the airplane’s internal electronics.

Stealth protection can go only so far. And investing heavily in each aircraft’s defenses would raise costs. “How much do you build in resilience, or just accept this aircraft is not meant to be survivable?” Goldie says.

This year’s test flights should help engineers weigh trade-offs between resilience and cost. Those flights will also answer specific questions: Can the Wingman run low on fuel and decide to come home? Or can it decide to sacrifice itself to save a human pilot? And at the heart of it all is the fundamental question facing militaries the world over: Should air power be cheap and expendable or costly and capable?

Other countries have taken different approaches. The United Kingdom’s Royal Air Force has selected Boeing and several other contractors to produce design ideas for the Lightweight Affordable Novel Combat Aircraft program, with test flights planned in 2022. Boeing has also expressed interest in the U.S. Air Force’s similar Skyborg program, which uses the XQ-58 Valkyrie, a fighterlike drone made by Kratos, of San Diego.

China is also in the game. It has displayed the GJ-11 unmanned stealth combat aircraft and the GJ-2 reconnaissance and strike aircraft; the level of autonomy in these aircraft is not clear. China has also developed the LJ-1, a drone akin to the Loyal Wingman, which may also function as a cruise missile.

Military aerospace projects often have specific requirements that contractors must fulfill. The Loyal Wingman is instead trying to decide what the requirements themselves should be. “We are creating a market,” Arnott says.

The Australian project, in other words, is agnostic as to what role autonomous aircraft should play. It could result in an aircraft that is cheaper than the weapons that will shoot it down, meaning each lost Wingman is actually a net win. It could also result in an aircraft that can almost match a crewed fighter jet’s capabilities at half the cost.

This article appears in the January 2020 print issue as “A Robot Is My Wingman.”

Photo: United Parcel Service This large quadcopter delivers medical samples at a Raleigh hospital complex.

When Amazon made public its plans to deliver packages by drone six years ago, many skeptics scoffed—including some at this magazine. It just didn’t seem safe or practical to have tiny buzzing robotic aircraft crisscrossing the sky with Amazon orders. Today, views on the prospect of getting stuff swiftly whisked to you this way have shifted, in part because some packages are already being delivered by drone, including examples in Europe, Australia, and Africa, sometimes with life-saving consequences. In 2020, we should see such operations multiply, even in the strictly regulated skies over the United States.

There are several reasons to believe that package delivery by drone may soon be coming to a city near you. The most obvious one is that technical barriers standing in the way are crumbling.

The chief challenge, of course, is the worry that an autonomous package-delivery drone might collide with an aircraft carrying people. In 2020, however, it’s going to be easier to ensure that won’t happen, because as of 1 January, airplanes and helicopters are required to broadcast their positions by radio using what is known as automatic dependent surveillance–broadcast out (ADS-B Out) equipment carried on board. (There are exceptions to that requirement, such as for gliders and balloons, or for aircraft operating only in uncontrolled airspace.) This makes it relatively straightforward for the operator of a properly equipped drone to determine whether a conventional airplane or helicopter is close enough to be of concern.
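
To make “close enough to be of concern” concrete, here is a small Python sketch of the kind of geofence check a drone ground station could run on decoded ADS-B position reports. The 2-kilometer and 300-meter thresholds are arbitrary illustrative numbers, not anything specified by DJI or the FAA.

import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def traffic_alert(drone, aircraft, horiz_limit_m=2_000, vert_limit_m=300):
    """Flag an ADS-B target that is inside an (assumed) alerting cylinder."""
    horiz = haversine_m(drone["lat"], drone["lon"], aircraft["lat"], aircraft["lon"])
    vert = abs(drone["alt_m"] - aircraft["alt_m"])
    return horiz < horiz_limit_m and vert < vert_limit_m

drone = {"lat": 35.78, "lon": -78.64, "alt_m": 90}
helicopter = {"lat": 35.79, "lon": -78.65, "alt_m": 250}   # position decoded from an ADS-B message
print(traffic_alert(drone, helicopter))    # True: roughly 1.4 km away and 160 m above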

Indeed, DJI, the world’s leading drone maker, has promised that from here on out it will equip any drone it sells weighing over 250 grams (9 ounces) with the ability to receive ADS-B signals and to inform the operator that a conventional airplane or helicopter is flying nearby. DJI calls this feature AirSense. “It works very well,” says Brendan Schulman, vice president for policy and legal affairs at DJI—noting, though, that it works only “in one direction.” That is, pilots don’t get the benefit of ADS-B signals from drones.

Drones will not carry ADS-B Out equipment, Schulman explains, because the vast number of small drones would overwhelm air-traffic controllers with mostly useless information about their whereabouts. But it will eventually be possible for pilots and others to determine whether there are any drones close enough to worry about; the key is a system for the remote identification of drones that the U.S. Federal Aviation Administration is now working to establish. The FAA took the first formal step in that direction yesterday, when the agency published a Notice of Proposed Rulemaking on remote ID for drones.

Before the new regulations go into effect, the FAA will have to receive and react to public comments on its proposed rules for drone ID. That will take many months. But some form of electronic license plates for drones is definitely coming, and we’ll likely see that happening even before the FAA mandates it. This identification system will pave the way for package delivery and other beyond-line-of-sight operations that fly over people. (Indeed, the FAA has stated that it does not intend to establish rules for drone flights over people until remote ID is in place.)

Photo: United Parcel Service Technicians carry out certain preflight procedures, as with any airline.

One of the few U.S. sites where drones are making commercial deliveries already is Wake County, N.C. Since March of last year, drones have been ferrying medical samples at WakeMed’s sprawling hospital campus on the east side of Raleigh. Last September, UPS Flight Forward, the subsidiary of United Parcel Service that is carrying out these drone flights, obtained formal certification from the FAA as an air carrier. The following month, Wing, a division of Alphabet, Google’s parent company, launched the first residential drone-based delivery service to begin commercial operations in the United States, ferrying small packages from downtown Christiansburg, Va., to nearby neighborhoods. These projects in North Carolina and Virginia, two of a handful being carried out under the FAA’s UAS Integration Pilot Program, show that the idea of using drones to deliver packages is slowly but surely maturing.

“We’ve been operating this service five days a week, on the hour,” says Stuart Ginn, a former airline pilot who is now a head-and-neck surgeon at WakeMed. He was instrumental in bringing drone delivery to this hospital system in partnership with UPS and California-based Matternet.

Right now the drone flying at WakeMed doesn’t travel beyond the operators’ line of sight. But Ginn says that he and others behind the project should soon get FAA clearance to fly packages to the hospital by drone from a clinic located some 16 kilometers away. “I’d be surprised and disappointed if that doesn’t happen in 2020,” says Ginn. The ability to connect nearby medical facilities by drone, notes Ginn, will get used “in ways we don’t anticipate.”

This article appears in the January 2020 print issue as “The Delivery Drones Are Coming.”


How can headphone-wearing pedestrians tune out the chaotic world around them without compromising their own safety? One solution may come from the pedestrian equivalent of a vehicle collision warning system that aims to detect nearby vehicles based purely on sound.

The intelligent headphone system uses machine learning algorithms to interpret sounds and alert pedestrians to the location of vehicles up to 60 meters away. A prototype of the Pedestrian Audio Wearable System (PAWS) can only detect the location but not the trajectory of a nearby vehicle—never mind the locations or trajectories of multiple vehicles. Still, it’s a first step for a possible pedestrian-centered safety aid at a time when the number of pedestrians killed on U.S. roads reached a three-decade high in 2018.

“Sometimes the newer vehicles have sensors that can tell if there are pedestrians, but pedestrians usually don’t have a way to tell if vehicles are on a collision trajectory,” says Xiaofan Jiang, an assistant professor of electrical engineering and member of the Data Science Institute at Columbia University.

The idea first came to Jiang when he noticed that a new pair of noise-cancelling headphones was distracting him more than usual from his surroundings during a walk to work. That insight spurred Jiang and his colleagues at Columbia, the University of North Carolina at Chapel Hill, and Barnard College to develop PAWS and publish their work in the October 2019 issue of the IEEE Internet of Things Journal.

Photo: Electrical Engineering and Data Science Institute/Columbia University The Pedestrian Audio Wearable System detects nearby cars by using microphones and machine learning algorithms to analyze vehicle sounds. 

Many cars with collision warning systems rely upon visual cameras, radar, or lidar to detect nearby objects. But Jiang and his colleagues soon realized that a pedestrian-focused system would need a low-power sensor that could operate for more than six hours on standard batteries. “So we decided to go with an array of microphones, which are very inexpensive and low-power sensors,” Jiang says.

Read this article for free on IEEE Xplore until 28 January 2020.

The array’s four microphones are located in different parts of the headphones. But the wearable warning system’s main hardware is designed to fit inside the left ear housing of commercial headphones and draws power from a rechargeable lithium-ion battery. A custom integrated circuit saves on power by extracting only the most relevant sound features from the captured audio and transmitting that information to a paired smartphone app.

The smartphone hosts the machine learning algorithms that were trained on audio from 60 different types of vehicles in a variety of environments: a street adjacent to a university campus and residential area, the side of a windy highway during hurricane season, and the busy streets of Manhattan.
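
PAWS itself leans on a neural network for localization, but the underlying physics is time difference of arrival: a car off to one side reaches one microphone a fraction of a millisecond before the other. The sketch below estimates that delay with plain cross-correlation as a classical stand-in for intuition; it is not the PAWS algorithm, and the sample rate and microphone spacing are assumed values.

import numpy as np

FS = 48_000          # sample rate in Hz (assumed)
MIC_SPACING = 0.18   # meters between the two microphones (assumed)
SPEED_OF_SOUND = 343.0

def estimate_bearing(left, right):
    """Bearing in degrees; positive means the source is nearer the left mic."""
    corr = np.correlate(left, right, mode="full")
    lag = (len(right) - 1) - np.argmax(corr)       # samples by which the right channel lags the left
    sin_theta = np.clip(lag / FS * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Synthetic test: broadband engine-like noise arriving from about 30 degrees to the left.
rng = np.random.default_rng(0)
noise = rng.standard_normal(8_000)
true_lag = int(round(MIC_SPACING * np.sin(np.radians(30)) / SPEED_OF_SOUND * FS))
left, right = noise, np.roll(noise, true_lag)      # the far microphone hears the sound later
print(f"estimated bearing: {estimate_bearing(left, right):.1f} degrees")   # roughly 30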

However, relying purely on sound to detect vehicles has proven tricky. For one thing, the system tends to focus on localizing the loudest vehicle, which may not be the vehicle closest to the pedestrian. The system also still has trouble locating multiple vehicles or even estimating how many vehicles are present.

Photo: Electrical Engineering and Data Science Institute/Columbia University The hardware for the Pedestrian Audio Wearable System can fit inside the ear housing of commercial headphones.

As it stands, the PAWS capability to localize a vehicle up to 60 meters away might provide at least several seconds of warning, depending on the speed of the oncoming vehicle (at 50 km/h, a car covers 60 meters in just over 4 seconds). But a truly useful warning system would also be able to track the trajectory of a nearby vehicle and only provide a warning if it’s on course to potentially hit the pedestrian. That may require the researchers to figure out better ways to track both the pedestrian’s location and trajectory along with the same information for vehicles.

“If you imagine one person walking along the street, many cars may pass by but none will hit the person,” Jiang explains. “We have to take into account other information to make this collision detection more useful.”

More work continues on how the system would use noises or other signals to alert headphone wearers. Joshua New, a behavioral psychologist at Barnard College, plans to conduct experiments to see what warning cue works best to give people a heads up. For now, the team is leaning toward either providing a warning beep on one side of a stereo headphone or possibly simulating 3D warning sounds to provide more spatially-relevant information.

Beyond ordinary pedestrians, police officers performing a traffic stop on a busy road or construction workers wearing ear protection might also benefit from such technology, Jiang says. The PAWS project has already received US $1.3 million from the National Science Foundation, and the team has an eye on eventually handing a more refined version of the technology over to a company to commercialize it.

Of course, one technology will not solve the challenges of pedestrian safety. In its 2019 report, the Governors Highway Safety Association blamed higher numbers of pedestrian deaths on many factors such as a lack of safe road crossings, and generally unsafe driving by speeding, distracted, or drunk drivers. A headphone equipped with PAWS is unlikely to prevent even a majority of pedestrian deaths—but a few seconds’ warning might help spare some lives.

This post was updated on 8 January 2020. 

A version of this post appears in the February 2020 print issue as “Smart Headphones Warn of Nearby Cars.”

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

Robotic Arena – January 25, 2020 – Wrocław, Poland DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, Wash., USA ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores

Let us know if you have suggestions for next week, and enjoy today’s videos.

Thank you to our readers and Happy Holidays from IEEE Spectrum’s robotics team!
—Erico, Evan, and Fan

Happy Holidays from FZI Living Lab!

This is what a robot holiday video should be. Amazing work from FZI!

[ FZI ]

Thanks Arne!

This is the robot I’m most excited about for 2020:

[ IIT ]

Happy Holidays from ETH Zurich’s Autonomous Systems Lab!

[ ASL ]

Digit v2 demonstrates autonomous pick and place with multiple boxes.

[ Agility Robotics ]

Happy Holidays from EPFL LMTS, whose soft robots we wrote about this week!

NOW SMACK THEM!

[ LMTS ]

Happy Holidays from ETH Zurich’s Robotic Systems Lab!

[ RSL ]

Happy Holidays from OTTO Motors!

OTTO Motors is based in Ontario, which, being in Canada, is basically the North Pole.

[ OTTO Motors ]

Happy Holidays from FANUC!

[ FANUC ]

Brain Corp makes the brains required to turn manual cleaning machines into autonomous robotic cleaning machines.

Braaains.

[ Brain Corp ]

Happy Holidays from RE2 Robotics!

[ RE2 ]

Happy Holidays from Denso Robotics!

[ Denso ]

Happy Holidays from Robodev!

That sandwich thing looks pretty good, but I'm not sold on the potato.

[ Robodev ]

Thanks Andreas!

Happy Holidays from Kawasaki Robotics!

[ Kawasaki ]

On Dec. 17, 2019, engineers took NASA’s next Mars rover for its first spin. The test took place in the Spacecraft Assembly Facility clean room at NASA’s Jet Propulsion Laboratory in Pasadena, California. This was the first drive test for the new rover, which will move to Cape Canaveral, Florida, in the beginning of next year to prepare for its launch to Mars in the summer. Engineers are checking that all the systems are working together properly, the rover can operate under its own weight, and the rover can demonstrate many of its autonomous navigation functions. The launch window for Mars 2020 opens on July 17, 2020. The rover will land at Mars' Jezero Crater on Feb. 18, 2021.

[ JPL ]

Happy Holidays from Laval University’s Northern Robotics Laboratory!

[ Norlab ]

The Chaparral is a hybrid-electric vertical takeoff and landing (VTOL) cargo aircraft being developed by the team at Elroy Air in San Francisco, CA. The system will carry 300 lbs of cargo over a 300-mile range. This video reveals a bit more about the system than we've shown in the past. Enjoy!

[ Elroy Air ]

FANUC's new CRX-10iA and CRX-10iA/L collaborative robots feature quick setup, easy programming and reliable performance.

[ FANUC ]

Omron’s ping pong robot is pretty good at the game, as long as you’re only pretty good at the game. If you’re much better than pretty good, it’s pretty bad.

[ Omron ]

The Voliro drone may not look like it’s doing anything all that difficult, but wait until it flips 90 degrees and stands on its head!

[ Voliro ]

Based on a unique, patented technology, ROVéo can swiftly tackle rough terrain, as well as steps and stairs, by simply adapting to their shape. It is ideal for monitoring security both outside and inside big industrial sites.

[ Rovenso ]

A picture says more than a thousand words, a video more than a thousand pictures. For this reason, we have produced a series of short films that present the researchers at the Max Planck Institute for Intelligent Systems, their projects and goals. We want to give an insight into our institute, making the work done here understandable for everyone. We continue the series with a portrait of the "Dynamic Locomotion" Max Planck research group led by Dr. Alexander Badri-Spröwitz.

[ Max Planck ]

Thanks Fan!

This is a 13-minute-long IREX demo of Kawasaki’s Kaleido humanoid.

[ Kawasaki ]

Learn how TRI is working to build an uncrashable car, use robotics to amplify people’s capabilities as they age, and leverage artificial intelligence to enable the discovery of new materials for batteries and fuel cells.

[ Girl Geek X ]

Researchers at EPFL have developed a soft robotic insect that uses artificial soft muscles called dielectric elastomer actuators to drive tiny feet that propel the little bot along at a respectable speed. And since the whole thing is squishy and already mostly flat, you can repeatedly smash it into the ground with a fly swatter, and then peel it off and watch it start running again. Get ready for one of the most brutal robot abuse videos you’ve ever seen.

We’re obligated to point out that the version of the robot that survives being raged on with the swatter is a tethered one, not the autonomous version with the battery and microcontroller and sensors, which might not react so well to repeated batterings. But still, it’s pretty cool to see it get peeled right off and keep on going, and the researchers say they’ve been able to do this smash n’ peel eight times in a row without destroying the robot.

Powered by dielectric elastomer actuators

One of the tricky things about building robots like these (that rely on very high-speed actuation) is power—the power levels themselves are usually low, in the milliwatt range, but the actuators generally require several kilovolts to function, meaning that you need a bunch of electronics that can boost the battery voltage up to something you can use. Even miniaturized power systems are in the tens of grams, which is obviously impractical for a robot that weighs one gram or less. Dielectric elastomer actuators, or DEAs, are no exception to this, so the researchers instead used a stack of DEAs that could run at a significantly lower voltage. These low-voltage stacked DEAs (LVSDEAs, because more initialisms are better) run at just 450 volts, but cycle at up to 600 hertz, using power electronics weighing just 780 milligrams.

Image: EPFL

Each soft robot uses three LVSDEAs to operate three independent legs.

The LVSDEA actuation is converted into motion by using flexible angled legs, similar to a bristlebot. One leg on each side allows the robot to turn, pivoting around a third supporting leg in the front. Top speed of the 190-mg tethered robot is 18 mm/s (0.5 body-lengths/s), while the autonomous version with an 800-mg payload of batteries and electronics and sensors could move at 12 mm/s for 14 minutes before running out of juice. Interestingly, stiffening the structure of the robot by holding it in a curved shape with a piece of tape significantly increased its performance, nearly doubling its speed to 30 mm/s (0.85 body-lengths/s) and boosting its payload capacity as well.
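For readers who want to check the arithmetic behind those relative-speed figures, here is a minimal sketch of the body-length normalization; the roughly 36-mm body length is not stated in the post and is inferred from 18 mm/s being quoted as 0.5 body-lengths per second.

```python
# Body length is an inferred assumption (~36 mm), not a quoted spec.
BODY_LENGTH_MM = 36.0

def body_lengths_per_second(speed_mm_s: float) -> float:
    """Convert an absolute speed into body lengths per second."""
    return speed_mm_s / BODY_LENGTH_MM

for label, speed in [("tethered", 18.0), ("autonomous", 12.0), ("stiffened", 30.0)]:
    print(f"{label}: {body_lengths_per_second(speed):.2f} BL/s")
# tethered ~0.50, autonomous ~0.33, stiffened ~0.83 (quoted as 0.85)
```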

What we’re all waiting for, of course, is a soft robot that can be smashable and untethered at the same time. This is always the issue with soft robots—they’re almost always just mostly soft, requiring either off-board power or rigid components in the form of electronics or batteries. The EPFL researchers say that they’re “currently working on an untethered and entirely soft version” in partnership with Stanford, which we’re very excited to see.

[ EPFL ]

For the most part, robots are a mystery to end users. And that’s part of the point: Robots are autonomous, so they’re supposed to do their own thing (presumably the thing that you want them to do) and not bother you about it. But as humans start to work more closely with robots, in collaborative tasks or social or assistive contexts, it’s going to be hard for us to trust them if their autonomy is such that we find it difficult to understand what they’re doing.

In a paper published in Science Robotics, researchers from UCLA have developed a robotic system that can generate different kinds of real-time, human-readable explanations about its actions, and then did some testing to figure out which of the explanations were most effective at improving a human’s trust in the system. Does this mean we can totally understand and trust robots now? Not yet—but it’s a start.

This work was funded by DARPA’s Explainable AI (XAI) program, which has a goal of being able to “understand the context and environment in which they operate, and over time build underlying explanatory models that allow them to characterize real world phenomena.” According to DARPA, “explainable AI—especially explainable machine learning—will be essential if [humans] are to understand, appropriately trust, and effectively manage an emerging generation of artificially intelligent machine partners.”

There are a few different issues that XAI has to tackle. One of those is the inherent opaqueness of machine learning models, where you throw a big pile of training data at some kind of network, which then does what you want it to do most of the time but also sometimes fails in weird ways that are very difficult to understand or predict. A second issue is figuring out how AI systems (and the robots that they inhabit) can effectively communicate what they’re doing with humans, via what DARPA refers to as an explanation interface. This is what UCLA has been working on.

The present project aims to disentangle explainability from task performance, measuring each separately to gauge the advantages and limitations of two major families of representations—symbolic representations and data-driven representations—in both task performance and fostering human trust. The goals are to explore (i) what constitutes a good performer for a complex robot manipulation task? (ii) How can we construct an effective explainer to explain robot behavior and foster human trust?

UCLA’s Baxter robot learned how to open a safety-cap medication bottle (tricky for robots and humans alike) by learning a manipulation model from haptic demonstrations provided by humans who opened medication bottles while wearing a sensorized glove. This was combined with a symbolic action planner that allows the robot to adjust its actions to bottles with different kinds of caps, and it does a good job without the inherent mystery of a neural network.

Intuitively, such an integration of the symbolic planner and haptic model enables the robot to ask itself: “On the basis of the human demonstration, the poses and forces I perceive right now, and the action sequence I have executed thus far, which action has the highest likelihood of opening the bottle?”

Both the haptic model and the symbolic planner can be leveraged to provide human-compatible explanations of what the robot is doing. The haptic model can visually explain an individual action that the robot is taking, while the symbolic planner can show a sequence of actions that are (ideally) leading towards a goal. What’s key here is that these explanations are coming from the planning system itself, rather than something that’s been added later to try and translate between a planner and a human.
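To make that combination concrete, here is a minimal sketch, not the authors' code, of how a symbolic prior over action sequences and a haptic success likelihood might be fused to pick the next action and emit an explanation alongside it; the action names, scoring rules, and data structures are illustrative assumptions.

```python
from dataclasses import dataclass

ACTIONS = ["approach", "grasp", "push", "twist", "pull"]

@dataclass
class RobotState:
    gripper_pose: tuple   # simplified stand-in for the perceived poses
    gripper_force: float  # force magnitude sensed at the gripper
    history: list         # actions executed so far

def symbolic_prior(state: RobotState, action: str) -> float:
    """Planner score: how well this action fits the action grammar learned
    from human demonstrations, given the sequence executed so far."""
    # Illustrative rule: after a push, a twist is the most plausible next step.
    if state.history and state.history[-1] == "push":
        return 0.9 if action == "twist" else 0.1
    return 0.5

def haptic_likelihood(state: RobotState, action: str) -> float:
    """Haptic-model score: given the current poses and forces, how likely
    this action is to succeed, based on the sensorized-glove data."""
    # Illustrative rule: don't twist unless the gripper is actually gripping.
    if action == "twist" and state.gripper_force < 1.0:
        return 0.05
    return 0.6

def choose_action(state: RobotState) -> tuple[str, str]:
    """Pick the highest-scoring action and produce a human-readable reason."""
    scores = {a: symbolic_prior(state, a) * haptic_likelihood(state, a)
              for a in ACTIONS}
    best = max(scores, key=scores.get)
    explanation = (f"Executing '{best}': given the demonstrations, the forces "
                   f"I feel now, and the steps {state.history} taken so far, "
                   f"it has the highest likelihood ({scores[best]:.2f}) of "
                   f"opening the bottle.")
    return best, explanation

state = RobotState(gripper_pose=(0.0, 0.0, 0.0), gripper_force=2.5,
                   history=["approach", "grasp", "push"])
action, why = choose_action(state)
print(why)
```

The point of the sketch is that the explanation falls out of the same quantities the planner already uses to choose the action, rather than being produced by a separate module bolted on afterward.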

Image: Science Robotics

As the robot performs a set of actions (top row of images), its symbolic planner (middle row) and haptic model (bottom row) generate explanations for each action. The red on the robot gripper’s palm indicates a large magnitude of force applied by the gripper, and green indicates no force. These explanations are provided in real time as the robot executes the actions.

To figure out whether these explanations made a difference in the level of a human’s trust or confidence or belief that the robot would be successful at its task, the researchers conducted a psychological study with 150 participants. While watching a video of the robot opening a medicine bottle, groups of participants were shown the haptic explanation, the symbolic explanation, or both at the same time, while two other groups were either shown no explanation at all, or a human-generated one-sentence summary of what the robot did. Survey results showed that the highest trust rating came from the group that had access to both the symbolic and haptic explanations, although the symbolic explanation was the more impactful of the two.

In general, humans appear to need real-time, symbolic explanations of the robot’s internal decisions for performed action sequences to establish trust in machines performing multistep complex tasks… Information at the haptic level may be excessively tedious and may not yield a sense of rational agency that allows the robot to gain human trust. To establish human trust in machines and enable humans to predict robot behaviors, it appears that an effective explanation should provide a symbolic interpretation and maintain a tight temporal coupling between the explanation and the robot’s immediate behavior.

This paper focuses on a very specific interpretation of the word “explain.” The robot is able to explain what it’s doing (i.e., the steps that it’s taking) in a way that is easy for humans to interpret, and it’s effective in doing so. However, it’s really just explaining the “what” rather than the “why,” because in this case the “why” (as far as the robot knows) amounts to “because a human did it this way,” a consequence of how the robot learned the task.

While the “what” explanations did foster more trust in humans in this study, long term, XAI will need to include “why” as well, and the example of the robot unscrewing a medicine bottle illustrates a situation in which it would be useful.

Image: Science Robotics

In one study, the researchers showed participants a video of the robot opening the bottle (A). Different groups saw different explanation panels along with the video: (B) Symbolic explanation panel; (C) Haptic explanation panel; (D) Text explanation panel.

You can see that there are several repeated steps in this successful bottle opening, and as an observer, I have no way of knowing whether the robot is repeating an action because the first attempt failed, or whether that was just part of its plan. Maybe opening the bottle really takes just one grasp-push-twist sequence, but the robot’s gripper slipped the first time.

Personally, when I think of a robot explaining what it’s doing, this is what I’m thinking of. Knowing what a robot was “thinking,” or at least the reasoning behind its actions or non-actions, would significantly increase my comfort with and confidence around robotic systems, because they wouldn’t seem so… Dumb? For example, is that robot just sitting there and not doing anything because it’s broken, or because it’s doing some really complicated motion planning? Is my Roomba wandering around randomly because it’s lost, or is it wandering around pseudorandomly because that’s the most efficient way to clean? Does that medicine bottle need to be twisted again because a gripper slipped the first time, or because it takes two twists to open?

Even if the robot makes a decision that I would disagree with, this level of “why” explanation or “because” explanation means that I can have confidence that the robot isn’t dumb or broken, but is either doing what it was programmed to do, or dealing with some situation that it wasn’t prepared for. In either case, I feel like my trust in it would significantly improve, because I know it’s doing what it’s supposed to be doing and/or the best it can, rather than just having some kind of internal blue screen of death experience or something like that. And if it is dead inside, well, I’d want to know that, too.

Longer-term, the UCLA researchers are working on the “why” as well, but it’s going to take a major shift in the robotics community for even the “what” to become a priority. The fundamental problem is that right now, roboticists in general are relentlessly focused on optimization for performance—who cares what’s going on inside your black box system as long as it can successfully grasp random objects 99.9 percent of the time?

But people should care, says Mark Edmonds, lead author of the UCLA paper. “I think that explanation should be considered along with performance,” he says. “Even if you have better performance, if you’re not able to provide an explanation, is that actually better?” He added: “The purpose of XAI in general is not to encourage people to stop going down that performance-driven path, but to instead take a step back, and ask, ‘What is this system really learning, and how can we get it to tell us?’ ”

It’s a little scary, I think, to have systems (and in some cases safety-critical systems) that work just because they work—because they were fed a ton of training data and consequently seem to do what they’re supposed to do to the extent that you’re able to test them. But you only ever have the vaguest idea of why these systems work, and as robots and AI become a more prominent part of our society, explainability will be a critical factor in allowing us to comfortably trust them.

“A Tale of Two Explanations: Enhancing Human Trust by Explaining Robot Behavior,” by M. Edmonds, F. Gao, H. Liu, X. Xie, S. Qi, Y. Zhu, Y.N. Wu, H. Lu, and S.-C. Zhu from the University of California, Los Angeles, and B. Rothrock from the California Institute of Technology, in Pasadena, Calif., appears in the current issue of Science Robotics.
