


Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RSS 2022: 21 June–1 July 2022, NEW YORK CITY
ERF 2022: 28–30 June 2022, ROTTERDAM, NETHERLANDS
RoboCup 2022: 11–17 July 2022, BANGKOK
IEEE CASE 2022: 20–24 August 2022, MEXICO CITY
CLAWAR 2022: 12–14 September 2022, AZORES, PORTUGAL
ANA Avatar XPRIZE Finals: 4–5 November 2022, LOS ANGELES
CoRL 2022: 14–18 December 2022, AUCKLAND, NEW ZEALAND

Enjoy today’s videos!

The European Robocup Finals 2022, featuring Tech United vs. VDL Robotsports.

[ Tech United ]

As part of a European Union project, we aim to autonomously monitor habitats. Regular monitoring of individual plant species allows for more sophisticated decision-making. The video was recorded in Perugia, Italy.

[ RSL ]

ICRA 2023 is in London!

[ ICRA 2023 ]

What can we learn from nature? What skills from the animal world can be used for industrial applications? Festo has been dealing with these questions in the Bionic Learning Network for years. In association with universities, institutes and development companies, we are developing research platforms whose basic technical principles are based on nature. A recurring theme here is the unique movements and functions of the elephant’s trunk.

[ Festo ]

We are proud to announce the relaunch of Misty, providing you with a more intuitive and easier-to-use robot platform! So what’s new, we hear you ask? To begin with, we have updated Misty’s conversational skills, improving NLU capabilities and adding more languages. Python is now our primary programming language going forward, complemented by enhanced Blockly drag-and-drop functionality. We think you will really enjoy our brand-new Misty Studio, which is both more user friendly and packed with improved features.

[ Misty ]

We developed a self-contained end-effector for layouting on construction sites with aerial robots! The end-effector achieves high accuracy through the use of multiple contact points, compliance, and actuation.

[ Paper ]

The compliance and conformability of soft robots provide inherent advantages when working around delicate objects or in unstructured environments. However, rapid locomotion in soft robotics is challenging due to the slow propagation of motion in compliant structures, particularly underwater. Taking inspiration from cephalopods, here we present an underwater robot with a compliant body that can achieve repeatable jet propulsion by changing its internal volume and cross-sectional area to take advantage of jet propulsion as well as the added mass effect.

[ UCSD ]

I like this idea of making incidental art with robots.

[ RPL UCL ]

If you want to be at the cutting-edge of your research field and publish impactful research papers, you need the most cutting-edge hardware. Our technology is unique (we own the relevant IP), unrivaled and a must-have tool for those in robotics research.

[ Shadow ]

Hardware platforms for socially interactive robotics can be limited by cost or lack of functionality. This article presents the overall system—design, hardware, and software—for Quori, a novel, affordable, socially interactive humanoid robot platform for facilitating non-contact human-robot interaction (HRI) research.

[ Paper ]

Wyss Associate Faculty members, Conor Walsh and Rob Wood discuss their visions for the future of bio-inspired soft robotics.

[ Wyss Institute ]

Towel folding: still not easy for robots.

[ Ishikawa Lab ]

We present hybrid adhesive end-effectors for bimanual handling of deformable objects. The end-effectors are designed with features meant to accommodate surface irregularities in macroscale form, mesoscale waviness, and microscale roughness, achieving good shear adhesion on surfaces with little gripping force. The new gripping system combines passive mechanical compliance with a hybrid electrostatic-adhesive pad so that humanoid robots can grasp a wide range of materials including paperboard and textured plastics.

[ Paper ]

MIT CSAIL grad students speak about what they think is the most important unsolved problem in computer science today.

[ MIT CSAIL ]

At the National Centre of Competence in Research (NCCR) Robotics, a new generation of robots that can work side by side with humans—fighting disabilities, facing emergencies and transforming education—is developed.

[ NCCR ]

The OS-150 Robotics Laboratory is Lawrence Livermore National Laboratory’s facility for testing autonomous drones, vehicles, and robots of the future. The Lab, informally known as the “drone pen,” allows operators to pilot drones safely and build trust with their robotic teammates.

[ LLNL ]

I am not entirely certain whether a Roomba is capable of detecting and navigating pixelated poop IRL, but I’d like to think so.

[ iRobot ]

How Wing designed its hybrid drone for last-mile delivery.

[ Wing ]

Over the past ten years, AI has experienced breakthrough after breakthrough in fields as diverse as computer vision, speech recognition, and protein folding prediction. Many of these advancements hinge on the deep learning work conducted by our guest, Geoff Hinton, who has fundamentally changed the focus and direction of the field. Geoff joins Pieter Abbeel in our two-part season finale for a wide-ranging discussion inspired by insights gleaned from Hinton’s journey from academia to Google Brain.

[ Robot Brains ]



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. This week is going to be a little bit on the short side, because Evan is getting married this weekend [!!!!!! –Ed.] and is actually supposed to be writing his vows right now.

We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RSS 2022: 21 June–1 July 2022, NEW YORK CITY
ERF 2022: 28–30 June 2022, ROTTERDAM, NETHERLANDS
RoboCup 2022: 11–17 July 2022, BANGKOK
IEEE CASE 2022: 20–24 August 2022, MEXICO CITY
CLAWAR 2022: 12–14 September 2022, AZORES, PORTUGAL

Enjoy today’s videos!

These five videos from ICRA 2022 were created by David Garzón Ramos, a Ph.D. student at IRIDIA, Université libre de Bruxelles, and a member of the ERC DEMIURGE project. David won an award from the ICRA organizing committee to help him attend the conference and share his experiences, and here's how he described his approach to communicating the most exciting parts of ICRA:

At ICRA 2022, I collaborated with the Publicity Committee to portray some curious, interesting, and emotive moments of the conference in a series of video digests. I believe that working with robots is fun! However, I also believe that, quite often, the fascinating ecosystem of contemporary robots is reserved for a few fortunate researchers, makers, and engineers. In my videos, I tried to depict and share this rich ecosystem as it happened at ICRA 2022 in Philadelphia. I focused on creating stories that could be accessible and appealing for both specialized and nonspecialized audiences. I wandered around the conference capturing those moments that, at least to my eyes, could help to communicate an important message: robots and people can engage positively. What could be more engaging than having funky robots?! :)





Many thanks to David for producing and sharing these videos!

We’ll have more ICRA content in the coming weeks, but if you’re looking for the latest research being done on awesome robots, look no further than the annual Legged Locomotion workshop. All of the talks from the ICRA 2022 edition are now online, and you can watch the whole playlist (or just skip to your favorite humans and robots!) below.

[ Legged Robots ]




Buzzwire tasks are often used as benchmarks and as training environments for fine motor skills and high-precision path following. These tasks require moving a wire loop along an arbitrarily shaped wire obstacle in a collision-free manner. While there have been some demonstrations of buzzwire tasks with robotic manipulators using reinforcement learning and admittance control, there does not seem to be any example with humanoid robots. In this work, we consider the scenario where we control one arm of the REEM-C humanoid robot, with other joints fixed, as groundwork for eventual full-body control. In pursuit of this, we contribute by designing an optimal control problem that generates trajectories to solve the buzzwire task in a time-optimized manner. This is composed of task-space constraints to prevent collisions with the buzzwire obstacle, the physical limits of the robot, and an objective function to trade off reducing time and increasing margins from collision. The formulation can be applied to a very general set of wire shapes, and the objective and task constraints can be adapted to other hardware configurations. We evaluate this formulation using the arm of a REEM-C humanoid robot and provide an analysis of how the generated trajectories perform both in simulation and on hardware.
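The paper's formulation is far richer than this, but the core trade-off it describes, minimizing traversal time while keeping clearance from the wire, can be sketched as a small nonlinear program. Everything below (the loop radius, margin, segment length, speed limit, and weights) is invented for illustration and is not taken from the paper:

```python
import numpy as np
from scipy.optimize import minimize

# Invented parameters, not from the paper: loop radius R, minimum
# clearance margin, per-segment arc length DS, speed limit, margin weight.
R, M_MIN, DS, V_MAX, W = 0.03, 0.005, 0.05, 0.25, 50.0
N = 8  # number of path segments along the wire

def solve_buzzwire_ocp():
    """Trade total traversal time off against collision margin.

    Decision variables: lateral offsets d[0..N] of the loop centre from
    the wire axis, and per-segment durations dt[0..N-1]."""
    def unpack(x):
        return x[:N + 1], x[N + 1:]

    def cost(x):
        d, dt = unpack(x)
        # minimise time; penalise offsets, which shrink the margin R - |d|
        return np.sum(dt) + W * np.sum(d ** 2)

    def speed_ok(x):          # >= 0 when each segment respects V_MAX
        d, dt = unpack(x)
        step = np.sqrt(DS ** 2 + np.diff(d) ** 2)
        return V_MAX * dt - step

    def margin_ok(x):         # >= 0 when clearance stays above M_MIN
        d, _ = unpack(x)
        return (R - M_MIN) ** 2 - d ** 2

    x0 = np.concatenate([np.zeros(N + 1), np.full(N, 2 * DS / V_MAX)])
    return minimize(
        cost, x0, method="SLSQP",
        constraints=[{"type": "ineq", "fun": speed_ok},
                     {"type": "ineq", "fun": margin_ok}],
        bounds=[(-R, R)] * (N + 1) + [(1e-3, 5.0)] * N,
    )

res = solve_buzzwire_ocp()
d, dt = res.x[:N + 1], res.x[N + 1:]
print(f"total time {dt.sum():.3f} s, max offset {np.abs(d).max():.4f} m")
```

The optimizer pushes each segment to its speed limit while the quadratic penalty keeps the loop centred, which is exactly the time-versus-margin tension the abstract describes.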

Despite promises about the near-term potential of social robots to share our daily lives, they remain unable to form autonomous, lasting, and engaging relationships with humans. Many companies are deploying social robots into the consumer and commercial market; however, both the companies and their products are relatively short lived for many reasons. For example, current social robots succeed in interacting with humans only within controlled environments, such as research labs, and for short time periods since longer interactions tend to provoke user disengagement. We interviewed 13 roboticists from robot manufacturing companies and research labs to delve deeper into the design process for social robots and unearth the many challenges robot creators face. Our research questions were: 1) What are the different design processes for creating social robots? 2) How are users involved in the design of social robots? 3) How are teams of robot creators constituted? Our qualitative investigation showed that varied design practices are applied when creating social robots but no consensus exists about an optimal or standard one. Results revealed that users have different degrees of involvement in the robot creation process, from no involvement to being a central part of robot development. Results also uncovered the need for multidisciplinary and international teams to work together to create robots. Drawing upon these insights, we identified implications for the field of Human-Robot Interaction that can shape the creation of best practices for social robot design.



When we think of bipedal humanoid robots, we tend to think of robots that aren’t just human-shaped, but also human-sized. There are exceptions, of course—among them, a subcategory of smaller humanoids that includes research and hobby humanoids that aren’t really intended to do anything practical. But at the International Conference on Robotics and Automation (ICRA) last week, roboticists from Carnegie Mellon University (CMU) asked an interesting question: What happens if you try to scale down a bipedal robot? Like, way down? This line from the paper sums it up: “our goal with this project is to make miniature walking robots, as small as a LEGO Minifigure (1-centimeter leg) or smaller.”

The current robot, while small (its legs are 15cm long), is obviously much bigger than a LEGO minifig. But that’s okay, because it’s not supposed to be quite as tiny as the group's ultimate ambition would have it. At least not yet. It’s a platform that the CMU researchers are using to figure out how to proceed. They're still assessing what it’s going to take to shrink bipedal walking robots to the point where they could ride in Matchbox cars. At very small scales, robots run into all kinds of issues, including space and actuation efficiency. These crop up mainly because it’s simply not possible to cram the same number of batteries and motors that go into bigger bots into something that tiny. So, in order to make a tiny robot that can usefully walk, designers have to get creative.

Bipedal walking is already a somewhat creative form of locomotion. Despite how complex bipedal robots tend to be, if the only criteria for a bipedal robot is that it walks, then it’s kind of crazy how simple roboticists can make them. Here’s a 1990-ish (!) video from Tad McGeer, the first roboticist to explore the concept of passive dynamic walking by completely unpowered robots placed on a gentle downwards slope:


The above video comes from the AMBER Lab, which has been working on efficient walking for large humanoids for a long time (you remember DURUS, right?). For small humanoids, the CMU researchers are trying to figure out how to leverage the principle of dynamic walking to make robots that can move efficiently and controllably while needing the absolute minimum of hardware, and in a way that can be scaled. With a small battery and just one actuator per leg, CMU’s robot is fully controllable, with the ability to turn and to start and stop on its own.
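A minimal way to see why passive dynamic walking works is the textbook rimless-wheel model, a standard teaching example rather than anything from CMU's paper: each stride gains kinetic energy from descending the slope and loses a fixed fraction of angular velocity at spoke impact, and the stride-to-stride map converges to a steady gait with no actuation at all.

```python
import math

# Rimless wheel (a classic passive-dynamics toy model, not the CMU robot):
# spokes spaced 2*ALPHA apart rolling down a slope of angle GAMMA.
G, L = 9.81, 1.0          # gravity (m/s^2), leg length (m)
ALPHA = math.pi / 8       # half the inter-spoke angle (rad)
GAMMA = 0.08              # slope angle (rad)

def stride_map(w_sq):
    """One stride of the squared angular velocity: energy gained from the
    slope drop, then an impact loss w' = cos(2*ALPHA) * w at spoke touchdown.
    (Assumes the wheel always carries enough speed to pass the apex.)"""
    gain = (4 * G / L) * math.sin(ALPHA) * math.sin(GAMMA)
    return math.cos(2 * ALPHA) ** 2 * (w_sq + gain)

# Analytic fixed point of the map: the steady passive gait.
c2 = math.cos(2 * ALPHA) ** 2
w_star_sq = c2 * (4 * G / L) * math.sin(ALPHA) * math.sin(GAMMA) / (1 - c2)

# Start too slow or too fast: the gait converges with no controller at all.
for w_sq in (0.1, 4.0):
    for _ in range(50):
        w_sq = stride_map(w_sq)
    print(f"converged w^2 = {w_sq:.4f} (fixed point {w_star_sq:.4f})")
```

The contraction factor cos²(2α) is what makes the gait self-stabilizing, which is the property minimally actuated walkers like CMU's try to preserve as they shrink.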

“Building at a larger scale allows us to explore the parameter space of construction and control, so that we know how to scale down from there,” says Justin Yim, one of the authors of the ICRA paper. “If you want to get robots into small spaces for things like inspection or maintenance or exploration, walking could be a good option, and being able to build robots at that size scale is a first step.”

“Obviously [at that scale] we will not have a ton of space,” adds Aaron Johnson, who runs CMU’s Robomechanics Lab. “Minimally actuated designs that leverage passive dynamics will be key. We aren't there yet on the LEGO scale, but with this paper we wanted to understand the way this particular morphology walks before dealing with the smaller actuators and constraints.”


Scalable Minimally Actuated Leg Extension Bipedal Walker Based on 3D Passive Dynamics, by Sharfin Islam, Kamal Carter, Justin Yim, James Kyle, Sarah Bergbreiter, and Aaron M. Johnson from CMU, was presented at ICRA 2022.



One of the key challenges in implementing reinforcement learning methods for real-world robotic applications is the design of a suitable reward function. In field robotics, the absence of abundant datasets, limited training time, and high variation of environmental conditions complicate the task further. In this paper, we review reward learning techniques together with visual representations commonly used in current state-of-the-art works in robotics. We investigate a practical approach proposed in prior work to associate the reward with the stage of the progress in task completion based on visual observation. This approach was demonstrated in controlled laboratory conditions. We study its potential for a real-scale field application, autonomous pile loading, tested outdoors in three seasons: summer, autumn, and winter. In our framework, the cumulative reward combines the predictions about the process stage and the task completion (terminal stage). We use supervised classification methods to train prediction models and investigate the most common state-of-the-art visual representations. We use task-specific contrastive features for terminal stage prediction.
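As a rough sketch of the stage-based reward idea, though not the paper's actual pipeline, one can train a classifier on visual features to predict the progress stage and let the reward combine the stage index with a terminal-stage bonus. The synthetic feature clusters, stage labels, and bonus weight below are all invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for visual features: one Gaussian cluster per task
# stage (0 = approach, 1 = loading, 2 = terminal). In the paper's setting
# these would be learned visual representations, not random toys.
N_STAGES, DIM, PER_STAGE = 3, 16, 200
X = np.vstack([rng.normal(loc=3.0 * s, scale=1.0, size=(PER_STAGE, DIM))
               for s in range(N_STAGES)])
y = np.repeat(np.arange(N_STAGES), PER_STAGE)

stage_clf = LogisticRegression(max_iter=500).fit(X, y)

def stage_reward(obs, terminal_bonus=10.0):
    """Reward shaped by the predicted progress stage; the terminal stage
    earns an extra confidence-weighted bonus, mirroring the idea of a
    cumulative reward that combines stage and task-completion predictions."""
    probs = stage_clf.predict_proba(obs.reshape(1, -1))[0]
    stage = int(np.argmax(probs))
    r = float(stage)                      # later stages => more reward
    if stage == N_STAGES - 1:
        r += terminal_bonus * probs[-1]
    return r, stage

# An observation near the terminal cluster should out-earn one at the start.
r_terminal, s = stage_reward(rng.normal(loc=6.0, scale=0.5, size=DIM))
r_start, s0 = stage_reward(rng.normal(loc=0.0, scale=0.5, size=DIM))
print(r_start, r_terminal)
```

The appeal of this shaping is that the reward signal comes entirely from a supervised classifier over observations, so no hand-built instrumented reward is needed in the field.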

Grasping and dexterous manipulation remain fundamental challenges in robotics, above all when performed with multifingered robotic hands. Having simulation tools to design and test grasp and manipulation control strategies is paramount to get functional robotic manipulation systems. In this paper, we present a framework for modeling and simulating grasps in the Simulink environment, by connecting SynGrasp, a well established MATLAB toolbox for grasp simulation and analysis, and Simscape Multibody, a Simulink Library allowing the simulation of physical systems. The proposed approach can be used to simulate the grasp dynamics in Simscape, and then analyse the obtained grasps in SynGrasp. The devised functions and blocks can be easily customized to simulate different hands and objects.

In this article, I argue that the artificial components of hybrid bionic systems do not play a direct explanatory role, i.e., in simulative terms, in the overall context of the systems in which they are embedded. More precisely, I claim that the internal procedures determining the output of such artificial devices, replacing biological tissues and connected to other biological tissues, cannot be used to directly explain the corresponding mechanisms of the biological component(s) they substitute (and therefore cannot be used to explain the local mechanisms determining an overall biological or cognitive function replicated by such bionic models). I ground this analysis on the use of the Minimal Cognitive Grid (MCG), a novel framework proposed in Lieto (Cognitive design for artificial minds, 2021) to rank the epistemological and explanatory status of biologically and cognitively inspired artificial systems. Despite the lack of such a direct mechanistic explanation from the artificial component, however, I also argue that hybrid bionic systems can have an indirect explanatory role similar to the one played by some AI systems built by using an overall structural design approach (but including the partial adoption of functional components). In particular, the artificial replacement of part(s) of a biological system can provide i) a local functional account of those part(s) in the context of the overall functioning of the hybrid biological–artificial system and ii) global insights about the structural mechanisms of the biological elements connected to such artificial devices.



Your weekly selection of awesome robot videos

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IEEE ARSO 2022: 28–30 May 2022, LONG BEACH, CALIF.
RSS 2022: 21 June–1 July 2022, NEW YORK CITY
ERF 2022: 28–30 June 2022, ROTTERDAM, NETHERLANDS
RoboCup 2022: 11–17 July 2022, BANGKOK
IEEE CASE 2022: 20–24 August 2022, MEXICO CITY
CLAWAR 2022: 12–14 September 2022, AZORES, PORTUGAL
CoRL 2022: 14–18 December 2022, AUCKLAND, NEW ZEALAND

Enjoy today’s videos!

Finally, after the first Rocky movie in 1976, the Robotic Systems Lab presents a continuation of the iconic series. Our transformer robot visited Philly in 2022 as part of the International Conference on Robotics and Automation.

[ Swiss-Mile ]

Human cells grown in the lab could one day be used for a variety of tissue grafts, but these cells need the right kind of environment and stimulation. New research suggests that robot bodies could provide tendon cells with the same kind of stretching and twisting as they would experience in a real human body. It remains to be seen whether using robots to exercise human cells results in a better tissue for transplantation into patients.

[ Nature ]

Researchers from Carnegie Mellon University took an all-terrain vehicle on wild rides through tall grass, loose gravel and mud to gather data about how the ATV interacted with a challenging, off-road environment.

The resulting dataset, called TartanDrive, includes about 200,000 of these real-world interactions. The researchers believe the data is the largest real-world, multimodal, off-road driving dataset, both in terms of the number of interactions and types of sensors. The five hours of data could be useful for training a self-driving vehicle to navigate off road.

[ CMU ]

Chengxu Zhou from the University of Leeds writes, “We have recently done a demo with one operator teleoperating two legged manipulators for a bottle-opening task.”

[ Real Robotics ]

Thanks, Chengxu!

We recently hosted a Youth Fly Day, bringing together 75 Freshman students from ICA Cristo Rey All Girls Academy of San Francisco for a day of hands-on exposure to and education about drones. It was an exciting opportunity for the Skydio team to help inspire the next generation of women pilots and engineers.

[ Skydio ]

Legged robotic systems leverage ground contact and the reaction forces they provide to achieve agile locomotion. However, uncertainty coupled with the discontinuous nature of contact can lead to failure in real-world environments with unexpected height variations, such as rocky hills or curbs. To enable dynamic traversal of extreme terrain, this work introduces the utilization of proprioception to estimate and react to unknown hybrid events and elevation changes and a two-degree-of-freedom tail to improve control independent of contact.

If you like this and are in the market for a new open source quadruped controller, CMU’s got that going on, too.

[ Robomechanics Lab ]

A bolt-on 360 camera kit for your drone that costs $430.

[ Insta360 ]

I think I may be too old to have any idea what’s going on here.

[ Neato ]

I’m not the biggest fan of the way the Stop Killer Robots folks go about trying to make their point, but they have a new documentary out, so here you go.

[ Immoral Code ]

This symposium hosted by the U.S. Department of Commerce and National Institute of Standards and Technology, Stanford Institute for Human-Centered Artificial Intelligence (HAI), and the FinRegLab, brought together leaders from government, industry, civil society, and academia to explore potential opportunities and challenges posed by artificial intelligence and machine learning deployment across different economic sectors, with a particular focus on financial services and healthcare.

[ Stanford HAI ]




Simultaneously evolving morphologies (bodies) and controllers (brains) of robots can cause a mismatch between the inherited body and brain in the offspring. To mitigate this problem, the addition of an infant learning period was proposed some time ago by the so-called Triangle of Life approach. However, an empirical assessment has been lacking to date. In this paper, we investigate the effects of such a learning mechanism from different perspectives. Using extensive simulations we show that learning can greatly increase task performance and reduce the number of generations required to reach a certain fitness level compared to the purely evolutionary approach. Furthermore, we demonstrate that the evolved morphologies will also be different, even though learning only directly affects the controllers. This provides a quantitative demonstration that changes in the brain can induce changes in the body. Finally, we examine the learning delta, defined as the performance difference between the inherited and the learned brain, and find that it grows throughout the evolutionary process. This shows that evolution produces robots with increasing plasticity; that is, consecutive generations become better learners and, consequently, perform better at the given task. Moreover, our results demonstrate that the Triangle of Life is not only a concept of theoretical interest, but a system methodology with practical benefits.
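The interplay between inheritance and an infant learning period can be sketched with a toy co-evolution loop. The fitness landscape, hill-climbing learner, and Lamarckian inheritance below are simplifications invented here for illustration, not the paper's method:

```python
import random
random.seed(1)

# Toy model: each robot has a body gene and a brain gene; performance is
# best when the brain matches its body (brain ~ 2 * body), so an inherited
# brain can mismatch a mutated body -- the problem learning is meant to fix.
def fitness(body, brain):
    return -(body - 3.0) ** 2 - (brain - 2.0 * body) ** 2

def learn(body, brain, steps=25, sigma=0.2):
    """Infant learning phase: hill-climb the brain only, body fixed."""
    for _ in range(steps):
        trial = brain + random.gauss(0.0, sigma)
        if fitness(body, trial) > fitness(body, brain):
            brain = trial
    return brain

pop = [(random.uniform(0, 6), random.uniform(0, 12)) for _ in range(20)]
deltas = []  # learning delta: learned-brain fitness minus inherited fitness
for gen in range(15):
    scored = []
    for body, brain in pop:
        inherited = fitness(body, brain)
        brain = learn(body, brain)          # evaluate the *learned* brain
        deltas.append(fitness(body, brain) - inherited)
        scored.append((fitness(body, brain), body, brain))
    scored.sort(reverse=True)
    # top-5 survivors each produce 4 mutated offspring; learned brains are
    # passed on here for brevity (a Lamarckian simplification)
    pop = [(b + random.gauss(0, 0.3), br + random.gauss(0, 0.3))
           for _, b, br in scored[:5] for _ in range(4)]

best_f, best_body, best_brain = scored[0]
print(f"best fitness {best_f:.3f} at body {best_body:.2f}, "
      f"mean learning delta {sum(deltas) / len(deltas):.3f}")
```

Because the hill-climber only accepts improvements, the learning delta is never negative, and selection acting on post-learning fitness is what lets body shape and brain co-adapt, the qualitative effect the paper quantifies.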

Research on psychological novelty effects within the fields of Social Robotics and Human-Robot Interaction (together: SHRI) has so far failed to gather the momentum it deserves. Drawing on exemplary descriptions of how psychological novelty is currently approached and researched across (certain main regions of) the larger scientific landscape, I argue that the treatment of novelty effects within the multidisciplinary SHRI reflects a larger fragmentation and heterogeneity in novelty research in general. I further propose that while the concept of novelty may currently function as a Boundary Object between the contributing domains of SHRI, a properly integrated, interdisciplinary concept of novelty is needed in order to capture and investigate the scope and scale of novelty effects within research on social human-robot interaction. Building on research on the New Ontological Category Hypothesis and related studies, I argue that the novelty of social robots can be understood as radical to the extent that comprehending them requires revising traditional core categories of being. In order to investigate the sui generis effects of such novelty, which should not be narrowly understood as mere "noise" in the data, it is paramount that the field of SHRI begin by working out a shared, integrative framework of psychological novelty and novelty effects.

Enabled by advancing technology, coral reef researchers increasingly prefer image-based surveys over approaches that depend solely on divers' in situ observations, interpretations, and recordings. The images collected, and derivative products such as orthographic projections and 3D models, allow researchers to study a comprehensive digital twin of their field sites. Spatio-temporally located twins can be compared and annotated, enabling researchers to virtually return to sites long after they have left them. While these new data expand the variety and specificity of biological investigation that can be pursued, they have introduced the much-discussed Big Data Problem: research labs lack the human and computational resources required to process and analyze imagery at the rate it can be collected. The rapid development of unmanned underwater vehicles suggests researchers will soon have access to an even greater volume of imagery and other sensor measurements than can be collected by diver-piloted platforms, further exacerbating data-handling limitations. Thoroughly segmenting (tracing the extent of and taxonomically identifying) organisms enables researchers to extract the information image products contain, but is very time-consuming. Analytic techniques driven by neural networks offer the possibility of greatly accelerating the segmentation process through automation. In this study, we examine the efficacy of automated segmentation on three different image-derived data products: 3D models, and 2D and 2.5D orthographic projections thereof; we also contrast their relative accessibility and utility to different avenues of biological inquiry.
The various network architectures and parameters tested performed similarly, at ∼80% IoU for the genus Porites, suggesting that the primary limitations of an automated workflow are (1) the current capabilities of neural-network technology and (2) consistency and quality control in image-product collection and in generating human training/testing datasets.
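As a side note on the metric reported above, IoU (intersection over union) for one segmentation class can be computed with a minimal sketch like the following; the pixel sets are invented for illustration:

```python
def iou(pred: set, truth: set) -> float:
    """Intersection over union between predicted and ground-truth
    pixel coordinates for one class (e.g., the genus Porites)."""
    if not pred and not truth:
        return 1.0  # both masks empty: treat as perfect agreement
    return len(pred & truth) / len(pred | truth)

# Hypothetical masks: pixels labeled as the target genus.
pred = {(0, 0), (0, 1), (1, 1)}
truth = {(0, 0), (1, 1), (1, 2)}
print(iou(pred, truth))  # overlap of 2 pixels over a union of 4 -> 0.5
```

An IoU of ∼0.8, as the study reports, means the predicted and hand-traced regions overlap on about four-fifths of their combined area.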

The introduction of innovative technological devices to support the aging of frail elderly people does not necessarily improve their quality of life. The strongly technical orientation of telemedicine models often exposes limits in how usable these technologies are in responding to users' real needs. The theoretical framework of special pedagogy allows the assumption of a bio-psycho-social perspective and the constructs of quality of life and participation, and opens up inclusive approaches that prompt a deep, questioning reflection on all contexts of life, with the goal of exposing disabling processes and indicating valid support in the use of technological resources. The study, retracing the research phases of the Data System Platform for Smart Communities project (funded under the Innolabs 2018–2019 call and completed in 2020), investigates the needs of strategic stakeholders and explores the factors that influence the adoption and diffusion of telemedicine devices by frail elderly people.
