Machine learning to help develop self-healing robots that ‘feel pain’

The goal of the €3 million Self-healing soft robot (SHERO) project, funded by the European Commission, is to create a next-generation robot made from self-healing materials (flexible plastics) that can detect damage, take the necessary steps to temporarily heal itself and then resume its work – all without the need for human interaction.

Led by the Vrije Universiteit Brussel (VUB), the research consortium includes the Department of Engineering at the University of Cambridge, the École Supérieure de Physique et de Chimie Industrielles de la Ville de Paris (ESPCI), the Swiss Federal Laboratories for Materials Science and Technology (Empa), and the Dutch polymer manufacturer SupraPolix.

As part of the SHERO project, the Cambridge team, led by Dr Fumiya Iida from the Department of Engineering, is looking at integrating self-healing materials into soft robotic arms.

Dr Thomas George Thuruthel, also from the Department of Engineering, said self-healing materials could have future applications in modular robotics, educational robotics and evolutionary robotics, where a single robot can be ‘recycled’ to generate a fresh prototype.

“We will be using machine learning to work on the modelling and integration of these self-healing materials, to include self-healing actuators and sensors, damage detection, localisation and controlled healing,” he said. “The adaptation of models after the loss of sensory data and during the healing process is another area we are looking to address. The end goal is to integrate the self-healing sensors and actuators into demonstration platforms in order to perform specific tasks.”
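The consortium has not published implementation details, but the damage-detection task Thuruthel describes can be framed as anomaly detection on the arm’s sensor stream: learn what an undamaged arm ‘feels’ like, then flag and localise readings that deviate from it. The Python sketch below illustrates that framing only; the sensor layout, thresholds and data are all hypothetical, not taken from the project.

```python
import numpy as np

class DamageDetector:
    """Flags and localises damage in a soft arm from embedded sensor readings.

    Purely illustrative: fits a Gaussian baseline on data recorded from an
    undamaged arm, then reports any reading that strays too far from it.
    """

    def __init__(self, threshold_sigma=4.0):
        self.threshold_sigma = threshold_sigma
        self.mean = None
        self.std = None

    def fit(self, healthy_readings):
        # healthy_readings: (n_samples, n_sensors) array from an intact arm
        self.mean = healthy_readings.mean(axis=0)
        self.std = healthy_readings.std(axis=0) + 1e-9

    def check(self, reading):
        # z-score of each sensor channel against the healthy baseline
        z = np.abs((reading - self.mean) / self.std)
        damaged = bool(z.max() > self.threshold_sigma)
        # Localisation: the most anomalous channel hints at where the cut is
        location = int(z.argmax()) if damaged else None
        return damaged, location

# Usage sketch: sensors are polled while the arm works; a detected anomaly
# would trigger the pause-and-heal behaviour described in the article.
detector = DamageDetector()
detector.fit(np.random.normal(0.0, 1.0, size=(1000, 16)))  # stand-in data
damaged, channel = detector.check(np.random.normal(0.0, 1.0, size=16))
```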

Professor Bram Vanderborght, from VUB, who is leading the project with scientists from the robotics research centre Brubotics and the polymer research lab FYSC, said: “We are obviously very pleased to be working on the next generation of robots. Over the past few years, we have already taken the first steps in creating self-healing materials for robots. With this research we want to continue and, above all, ensure that robots that are used in our working environment are safer, but also more sustainable. Due to the self-repair mechanism of this new kind of robot, complex, costly repairs may be a thing of the past.”

Materials provided by the University of Cambridge

Robot uses machine learning to harvest lettuce

The ‘Vegebot’, developed by a team at the University of Cambridge, was initially trained to recognise and harvest iceberg lettuce in a lab setting. It has now been successfully tested in a variety of field conditions in cooperation with G’s Growers, a local fruit and vegetable co-operative.

Although the prototype is nowhere near as fast or efficient as a human worker, it demonstrates how the use of robotics in agriculture might be expanded, even for crops like iceberg lettuce, which are particularly challenging to harvest mechanically. The results are published in the Journal of Field Robotics.

Crops such as potatoes and wheat have been harvested mechanically at scale for decades, but many other crops have to date resisted automation. Iceberg lettuce is one such crop. Although it is the most common type of lettuce grown in the UK, iceberg is easily damaged and grows relatively flat to the ground, presenting a challenge for robotic harvesters.

“Every field is different, every lettuce is different,” said co-author Simon Birrell from Cambridge’s Department of Engineering. “But if we can make a robotic harvester work with iceberg lettuce, we could also make it work with many other crops.”

“At the moment, harvesting is the only part of the lettuce life cycle that is done manually, and it’s very physically demanding,” said co-author Julia Cai, who worked on the computer vision components of the Vegebot while she was an undergraduate student in the lab of Dr Fumiya Iida.

The Vegebot first identifies the ‘target’ crop within its field of vision, then determines whether a particular lettuce is healthy and ready to be harvested, and finally cuts the lettuce from the rest of the plant without crushing it so that it is ‘supermarket ready’. “For a human, the entire process takes a couple of seconds, but it’s a really challenging problem for a robot,” said co-author Josie Hughes.

The Vegebot has two main components: a computer vision system and a cutting system. The overhead camera on the Vegebot takes an image of the lettuce field and first identifies all the lettuces in the image, and then for each lettuce, classifies whether it should be harvested or not. A lettuce might be rejected because it’s not yet mature, or it might have a disease that could spread to other lettuces in the harvest.

The researchers developed and trained a machine learning algorithm on example images of lettuces. Once the Vegebot could recognise healthy lettuces in the lab, it was then trained in the field, in a variety of weather conditions, on thousands of real lettuces.
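As described above, the vision system is a two-stage pipeline: first localise every lettuce in the overhead image, then classify each one as harvest-ready or not. The Python sketch below shows only that structure; the trained detector and classifier are passed in as stand-ins, and all names here are illustrative rather than taken from the paper.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Box:
    left: int
    top: int
    right: int
    bottom: int

@dataclass
class Target:
    x: int         # cutting point in image coordinates
    y: int
    harvest: bool  # classifier verdict: mature and disease-free?

def harvest_pass(image,
                 detect_lettuces: Callable[..., List[Box]],
                 classify_lettuce: Callable[..., bool]) -> List[Target]:
    """One overhead-camera pass: localise every lettuce, then vet each one.

    `detect_lettuces` and `classify_lettuce` stand in for the trained vision
    models; only the two-stage structure described in the article is shown.
    """
    targets = []
    for box in detect_lettuces(image):
        crop = image[box.top:box.bottom, box.left:box.right]
        ready = classify_lettuce(crop)  # immature/diseased crops are skipped
        targets.append(Target(x=(box.left + box.right) // 2,
                              y=(box.top + box.bottom) // 2,
                              harvest=ready))
    return targets
```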

A second camera on the Vegebot is positioned near the cutting blade and helps ensure a smooth cut. The researchers were also able to adjust the pressure in the robot’s gripping arm so that it held the lettuce firmly enough not to drop it, but not so firm as to crush it. The force of the grip can be adjusted for other crops.
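The grip adjustment amounts to keeping the commanded force inside a crop-specific band: above the force at which the produce would slip, below the force at which it would bruise. A minimal sketch, with entirely made-up values:

```python
# Crop-specific grip-force bands in newtons (illustrative values only,
# not measurements from the paper): the lower bound prevents slipping,
# the upper bound prevents crushing.
GRIP_LIMITS_N = {
    "iceberg_lettuce": (2.0, 8.0),
    "broccoli": (4.0, 15.0),  # hypothetical second crop
}

def grip_force(crop: str, requested_n: float) -> float:
    """Clamp a requested grip force into the safe band for the given crop."""
    low, high = GRIP_LIMITS_N[crop]
    return max(low, min(requested_n, high))

print(grip_force("iceberg_lettuce", 12.0))  # -> 8.0, capped below crushing
```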

“We wanted to develop approaches that weren’t necessarily specific to iceberg lettuce so that they can be used for other types of above-ground crops,” said Iida, who leads the team behind the research.

In future, robotic harvesters could help address problems with labour shortages in agriculture, and could also help reduce food waste. At the moment, each field is typically harvested once, and any unripe vegetables or fruits are discarded. However, a robotic harvester could be trained to pick only ripe vegetables, and since it could harvest around the clock, it could perform multiple passes on the same field, returning at a later date to harvest the vegetables that were unripe during previous passes.

“We’re also collecting lots of data about lettuce, which could be used to improve efficiency, such as which fields have the highest yields,” said Hughes. “We’ve still got to speed our Vegebot up to the point where it could compete with a human, but we think robots have lots of potential in agri-tech.”

Iida’s group at Cambridge is also part of the world’s first Centre for Doctoral Training (CDT) in agri-food robotics. In collaboration with researchers at the University of Lincoln and the University of East Anglia, the Cambridge researchers will train the next generation of specialists in robotics and autonomous systems for application in the agri-tech sector. The Engineering and Physical Sciences Research Council (EPSRC) has awarded £6.6m for the new CDT, which will support at least 50 PhD students.

Reference:
Simon Birrell et al. ‘A Field-Tested Robotic Harvesting System for Iceberg Lettuce.’ Journal of Field Robotics (2019). DOI: 10.1002/rob.21888

Materials provided by the University of Cambridge

Machine learning unlocks mysteries of quantum physics

Understanding electrons’ intricate behaviour has led to discoveries that transformed society, such as the revolution in computing made possible by the invention of the transistor.

Today, through advances in technology, electron behaviour can be studied much more deeply than in the past, potentially enabling scientific breakthroughs as world-changing as the personal computer. However, the data these tools generate are too complex for humans to interpret.

A Cornell-led team has developed a way to use machine learning to analyse the data generated by scanning tunnelling microscopy (STM) – a technique that produces subatomic-scale images of electronic motions in material surfaces at varying energies, providing information unattainable by any other method.

“Some of those images were taken on materials that have been deemed important and mysterious for two decades,” said Eun-Ah Kim, professor of physics. “You wonder what kinds of secrets are buried in those images. We would like to unlock those secrets.”

Kim is senior author of “Machine Learning in Electronic Quantum Matter Imaging Experiments,” which was published in Nature on June 19. First authors are Yi Zhang, formerly a postdoctoral researcher in Kim’s lab and now at Peking University in China, and Andrej Mesaros, a former postdoctoral researcher in Kim’s lab now at the Université Paris-Sud in France.

Co-authors include J.C. Séamus Davis, Cornell’s James Gilbert White Distinguished Professor in the Physical Sciences, an innovator in STM-driven studies.

The research yielded new insights into how electrons interact – and showed how machine learning can be used to drive further discovery in experimental quantum physics.

At the subatomic scale, a given sample will include trillions of trillions of electrons interacting with each other and with the surrounding infrastructure. Electrons’ behaviour is determined partly by the tension between their two competing tendencies: to move around, associated with kinetic energy; and to stay far away from each other, associated with repulsive interaction energy.
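This competition between moving and repelling is conventionally written down as the Hubbard model, in which a hopping term of strength t (kinetic energy) competes with an on-site repulsion of strength U. The standard form is reproduced below for orientation; the article itself does not spell out the Hamiltonian.

```latex
% Hubbard Hamiltonian: hopping (kinetic, t) versus on-site repulsion (U)
H = -t \sum_{\langle i,j \rangle, \sigma}
        \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \text{h.c.} \right)
    + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
```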

In this study, Kim and collaborators set out to discover which of these tendencies is more important in a high-temperature superconductive material.

Using STM, electrons tunnel through a vacuum between the conducting tip of the microscope and the surface of the sample being examined, providing detailed information about the electrons’ behaviour.

“The problem is, when you take data like that and record it, you get image-like data, but it’s not a natural image, like an apple or a pear,” Kim said. The data generated by the instrument is more like a pattern, she said, and about 10,000 times more complicated than a traditional measurement curve. “We don’t have a good tool to study those kinds of data sets.”

To interpret this data, the researchers simulated an ideal environment and added factors that would cause changes in electron behaviour. They then trained an artificial neural network – a kind of artificial intelligence that can learn a specific task using methods inspired by how the brain works – to recognize the circumstances associated with different theories. When the researchers input the experimental data into the neural network, it determined which of the theories the actual data most resembled.
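In outline: generate labelled training images under each candidate theory, train a classifier to tell them apart, then ask which label it assigns to the experimental images. The Python sketch below reproduces only that outline, using a generic scikit-learn network and toy stand-in data; the study’s own simulations and network architecture are not described in this article, so everything here is illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def simulate_images(theory: int, n: int, pixels: int = 256) -> np.ndarray:
    """Stand-in for the physics simulation: each competing theory would
    produce images carrying its own characteristic modulation pattern."""
    base = rng.normal(size=(n, pixels))
    pattern = np.sin(np.linspace(0, 3 + theory, pixels))  # toy signature
    return base + pattern

# Build a labelled training set from the candidate theories
X = np.vstack([simulate_images(t, 500) for t in (0, 1)])
y = np.repeat([0, 1], 500)

net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
net.fit(X, y)

# Feed in (stand-in) experimental data and see which theory it resembles
experimental = simulate_images(1, 20)
votes = net.predict(experimental)
print("favoured theory:", np.bincount(votes).argmax())
```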

This method, Kim said, confirmed the hypothesis that the repulsive interaction energy was more influential in the electrons’ behaviour.

A better understanding of how many electrons interact on different materials and under different conditions will likely lead to more discoveries, she said, including the development of new materials.

“The materials that led to the initial revolution of transistors were actually pretty simple materials. Now we have the ability to design much more complex materials,” Kim said. “If these powerful tools can reveal important aspects leading to the desired property, we would like to be able to make a material with that property.”

Also contributing were researchers at Brookhaven National Laboratory, Stanford University, Harvard University, San Jose State University, the National Institute of Advanced Industrial Science and Technology in Japan, the University of Tokyo and Oxford University.

Materials provided by Cornell University