Earlier this year, a report from analyst firm Gartner found that 77% of retailers plan to deploy AI by 2021, with the deployment of robotics for warehouse picking as the primary use case.
At the time, Kelsie Marian, senior research director at Gartner, described how retailers were keen to experiment with new technologies to meet the ever-changing expectations of customers that challenge traditional retail.
For instance, greater automation could be implemented in warehouse picking, involving smart robots working independently or alongside humans. “This means the robot will have to ‘mesh’ with the human team – essentially meaning that both sides will need to learn how to ‘collaborate’ to operate effectively together,” said Marian.
It is an area Ocado has been investing in for the past five years.
Graham Deacon is a robotics research fellow at Ocado Technology, the part of the online grocery delivery business tasked with looking five to 10 years ahead to investigate how to deliver the next big thing.
Five years ago, a group of academics working as part of the EU Horizon 2020 SecondHands consortium came to Ocado with an idea. Explaining how Ocado got involved in the project, Deacon said: “They initially wanted to develop a robot for a train station.” The idea was to have the robot assist people.
The SecondHands project was given its name because the robot provides a “second pair of hands” to workers. The robot offers useful assistance – such as holding, lifting, reaching or passing objects. People can concentrate on the “skilled” part of a job while the robot takes responsibility for the heavy lifting and support roles – thus enabling human and machine to actively enhance each other’s complementary strengths.
Deacon said this concept can be applied in Ocado warehouses, not to support train passengers, but instead to help maintenance engineers. “Humanoid robots are key for improving flexibility and safety in industrial contexts in a way that is genuinely useful,” he said.
Each day, there is a maintenance window when the automation in the Ocado warehouses ramps down, allowing technicians to maintain the equipment. Ocado saw an opportunity to have SecondHands robots support human technicians in their daily maintenance duties at the company’s highly automated distribution centres.
Earlier this month, Ocado Technology and the consortium of academics working on the SecondHands project announced Armar-6, which represents the culmination of the five-year programme. This is a cobot, or collaborative robot, and differs significantly from the kind of robots that have been used in industries like car manufacturing.
According to a 2019 McKinsey paper on industrial robotics, the sector has been growing steadily since the 1960s. “After the first industrial robots appeared in the 1960s, a real growth spurt occurred as automotive OEMs [original equipment manufacturers] automated their weld shops,” McKinsey noted in the article.
But while it is acceptable to have robots work on a production line, Deacon said Ocado Technology was looking for a robot that could work in the less predictable environment of a warehouse, alongside human operators. On a production line, he said, an industrial robot can typically rely on a CAD file of the object it needs to pick up. “Things have to be pretty close to where they are expected to be in order for the robot task not to fail.”
To work alongside someone, the robot needs to understand the context of the job the person is undertaking. By “understanding” the task that needs to be achieved, the SecondHands project looked at how a robot could support a human, enabling the job to be completed more quickly and safely.
Another Horizon 2020 project, Soma, focused on investigating how a robot hand could be engineered for the soft manipulation of objects. The robot’s brain has to understand how much pressure to exert so it can pick up an object without dropping it, or damaging it by being too heavy-handed.
Deacon said the Armar-6 hand is soft, which means it can conform to the shape of the object, so there is no need for a totally accurate CAD representation of it. He said the softness of the grip is simply a parameter that can be programmed in software, based on the task that needs to be achieved.
“There is an example in the SecondHands project where the robot and a technician take a panel down off the underside of a piece of mechanical handling equipment,” he said. “Between them, they move the panel and put it on the floor. Normally, it would be flexible in the lateral direction, but if there is some kind of friction, the robot would stiffen up to avoid the obstacle.”
Deacon said that the programming that controls the robot’s arm movements needs to be context-aware. “Typically, we know when something is fragile and we’ll tune the robot to behave appropriately,” he said. But there is ongoing research into how sensors can be used to detect an object’s sensitivity.
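To illustrate the idea of stiffness as a software parameter, here is a minimal sketch of a task-dependent compliance setting. The controller class, preset names and numeric values are all invented for illustration; they are not Ocado’s or the SecondHands project’s actual API, and a real impedance controller would operate per joint or in Cartesian space.

```python
# Sketch: task-dependent stiffness as a programmable parameter.
# All names and values below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CompliancePreset:
    stiffness: float  # N/m along the compliant axis
    max_force: float  # safety clamp, in newtons

# Hypothetical presets chosen according to task context
PRESETS = {
    "hold_fragile_panel": CompliancePreset(stiffness=150.0, max_force=20.0),
    "push_through_friction": CompliancePreset(stiffness=900.0, max_force=60.0),
}

def commanded_force(preset: CompliancePreset, displacement_m: float) -> float:
    """Spring-like impedance law: corrective force proportional to
    displacement from the target, clamped to the preset's safety limit."""
    force = preset.stiffness * displacement_m
    return max(-preset.max_force, min(preset.max_force, force))

# For the same 5 cm displacement, the stiff preset pushes back harder --
# the behaviour Deacon describes when friction resists a panel's motion.
soft = commanded_force(PRESETS["hold_fragile_panel"], 0.05)     # 7.5 N
stiff = commanded_force(PRESETS["push_through_friction"], 0.05)  # 45.0 N
```

The key design point is that no new hardware is needed to switch behaviours: retuning the robot for a fragile object is just selecting a different preset in software.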
Beyond the soft grip, he said the robot also records over time the movements the technician makes while undertaking a maintenance duty. Technicians are given a set of maintenance tasks to complete, but they generally choose the order in which to complete these.
“We can predict what the technician is trying to do and have the robot understand how it can be useful,” said Deacon.
He said researchers from UCL have developed a way to make a wireframe diagram of someone based on a two-dimensional image. By collecting a database of different poses, a machine learning algorithm is able to work out the position of the technician’s joints. Deacon said this is used to enable the robot to understand how far through the maintenance task the technician has progressed and, as a consequence, provide the appropriate assistance.
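The matching step can be sketched as a nearest-neighbour lookup against a database of labelled poses. The pose format, joint names and task-stage labels below are assumptions for illustration; the UCL system uses a learned model over a far larger dataset rather than this toy lookup.

```python
# Sketch: estimating task progress by matching a detected 2D "wireframe"
# pose against a small database of poses labelled with task stages.
# Joint names, coordinates and stage labels are invented for illustration.
import math

# A pose maps joint names to (x, y) image coordinates.
Pose = dict

# Hypothetical database of poses recorded at known stages of a maintenance task
POSE_DB = [
    ({"hand": (0.2, 0.8), "elbow": (0.3, 0.6)}, "reaching_for_panel"),
    ({"hand": (0.5, 0.4), "elbow": (0.5, 0.5)}, "unscrewing_bolt"),
    ({"hand": (0.8, 0.9), "elbow": (0.7, 0.7)}, "lowering_panel"),
]

def pose_distance(a: Pose, b: Pose) -> float:
    """Sum of Euclidean distances over the joints both poses share."""
    return sum(math.dist(a[j], b[j]) for j in a.keys() & b.keys())

def classify_stage(observed: Pose) -> str:
    """Nearest-neighbour lookup: which recorded stage is this pose closest to?"""
    return min(POSE_DB, key=lambda entry: pose_distance(observed, entry[0]))[1]

print(classify_stage({"hand": (0.52, 0.42), "elbow": (0.49, 0.52)}))
# closest to the recorded "unscrewing_bolt" pose
```

Knowing the current stage is what lets the robot choose useful assistance, such as fetching the next tool before the technician asks for it.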
While Armar-6 has been developed for maintenance, Deacon said, the technology is not limited to this. “It could be generalised to assisted living, or use in hospitals, and could also be relevant to our delivery process. One of the challenges in online deliveries to people’s homes is how to get the order from the kerbside to the kitchen,” he said, adding that the answer is not just about mechatronics: “You need to interact with people in a way that is natural.”
This is why the SecondHands project also worked on building natural-language, voice-based interaction into Armar-6. Such technology is now widely used in people’s homes with devices like Google Home and Amazon Alexa. However, Deacon said these are quite different to how Armar-6 processes people’s sentences.
“While those devices wait for a sentence (an utterance) and then try to understand it, ours tries to understand each word, and reasons what it has to do next,” he said. “This is because it has to respond in the right time frame in order to be useful.”
Armar-6 tries to be contextually aware. It uses a neural network trained not just on utterances from technicians, but on other speech as well, to build enough of an understanding of the task that needs to be achieved.
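The difference Deacon describes can be sketched as incremental intent resolution: rather than buffering a whole utterance, process each word as it arrives and commit to an action the moment the intent is unambiguous. The intents and keyword sets below are invented for illustration; Armar-6’s actual language system is a trained neural network, not a keyword table.

```python
# Sketch: word-by-word intent resolution, committing mid-sentence.
# Intent names and keywords are illustrative assumptions.
INTENTS = {
    "hand_over": {"pass", "hand", "give"},
    "hold_part": {"hold", "steady", "grip"},
    "lift_part": {"lift", "raise"},
}

def incremental_intent(words):
    """Yield (word, resolved_intent_or_None) as each word arrives."""
    for word in words:
        matched = [name for name, keys in INTENTS.items() if word.lower() in keys]
        # Commit as soon as exactly one intent matches; keep listening otherwise.
        yield word, matched[0] if len(matched) == 1 else None

for word, intent in incremental_intent("please pass me the screwdriver".split()):
    if intent:
        print(f"acted on '{word}' -> {intent}")
        break  # the robot can start responding before the utterance ends
```

A home assistant, by contrast, would wait for the full sentence before parsing it; acting on the word “pass” part-way through is what lets the robot respond in a useful time frame.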
Armar-6 and the SecondHands project represent a leap forward in the merger of mechatronics and artificial intelligence, and illustrate the complexities involved in building a robot that can truly interact with people.
Since the 1950s, with the likes of Robby the Robot from Forbidden Planet, sci-fi buffs have imagined how robots could live and work in society. What Armar-6 and the SecondHands project have shown is that there are huge technical hurdles that need to be tackled before fiction can become reality.