
Working Hand in Claw

Robots Collaborate with Humans in Very Limited Ways, but What if We Could Teach Them to be More Intuitive About What We Need?


Lina Zeldovich is a technology writer based in Woodside, N.Y.

Mechanical Engineering 140(08), 36-41 (Aug 01, 2018) (6 pages) Paper No: ME-18-AUG2; doi: 10.1115/1.2018-AUG-2

Robots have been helping people for decades by doing tasks that are too hard, too dangerous, or physically impossible for humans to carry out themselves. But so far, the biggest successes for human-robot collaborations—such as those in manufacturing and automotive industries—still require the two to be separated for safety reasons. This article delves into how roboticists are working to upgrade this paradigm and make it possible for humans and robots to work together side by side, each excelling at what it does best and compensating for the other’s shortcomings.

“Hey Baxter, can you hand me the soldering iron?” Ross Knepper, an assistant professor in Cornell University’s computer science department, asks his red-and-black robot, which is being trained to understand human requests and assist accordingly.

As Baxter reaches out for the tool’s tip, Knepper interrupts. “It’s hot!” he shouts with a warning gesture. The robot stops midway, considers the warning, and moves its hulking arm two inches farther down to grab the iron by its handle—and then passes it to Knepper.

Robots have been helping people for decades by doing tasks that are too hard, too dangerous, or physically impossible for humans to carry out themselves. But so far, the biggest successes for human-robot collaborations—such as those in manufacturing and automotive industries—still require the two to be separated for safety reasons. Hardwired for a limited set of tasks, industrial robots don’t have the intelligence to know when their actions may endanger a human. They work in cages and are shut down when people enter. They are not true teammates.

Knepper and other roboticists want to upgrade this paradigm and make it possible for humans and robots to work together side by side, each excelling at what it does best and compensating for the other’s shortcomings. Humans are creative visionaries, a kind of thinking that is hard to model computationally. Robots, on the other hand, are better at precision, exact repetition, and withstanding harsh physical conditions.

“Robots are precise, repeatable, and can work all day long without a break,” Knepper said. “But they aren’t good at dealing with changes or failures, while humans excel at dealing with novelty.”

Human-robot partnerships offer substantial benefits. Many aspects of aircraft assembly are still done by hand because they require two people working in tandem inside and outside the airframe, which is hard to automate. Many smaller businesses could use robots for repetitive tasks that vary throughout the day, but to program the currently available models to do that for all but the simplest tasks usually takes someone with a Ph.D., said Brian Scassellati, who builds socially assistive robots at Yale University.

“From automotive industry to small businesses, there is a real need for smart robotic collaborators in the workplace,” Scassellati said.

Smart robots can also make human teammates safer and more efficient at their jobs, added Bradley Hayes, who works on machine learning techniques at the University of Colorado in Boulder. It would be so much better to have an expert looking over your shoulder, warning you when you’re about to do something dangerous, helping you hold something steady, or just generally offering you a robo-hand.

At Cornell University, a Baxter robot learns to interact with humans verbally by responding to commands and issuing warnings. Photo: Cornell University


Stryker’s Mako surgical robot helps surgeons make precise cuts for joint replacements by guiding their hands along a path previously selected by the clinician. Photo: Stryker


“I can use a sander but I can’t do a perfectly smooth finish,” Hayes said. “But if the robot was holding the tool with me, it could assure that I only move within the right plane, improving the execution of my desired intent.”

It turns out that such robots already exist. They are slowly coming out of their factory cages or research labs, and entering our workspaces, offices, and even homes. And in some cases, they’re already quite literally holding our hands.

Over the past few years, advances in sensing and control have allowed roboticists to build safer models like Rethink Robotics’ Sawyer and Baxter and Universal Robots’ UR arms, which use force sensors to trigger a stop if they unexpectedly bump into someone.

“You can approach the problem of safety from many angles, such as appropriately sized motors, force sensors, proximity sensors, and software,” said Matei Ciocarlie of Columbia University, who worked on developing manipulation skills for the Personal Robot 2 (PR2).
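At its core, that safety logic can be as simple as watching a force reading and stopping the instant it spikes. The snippet below is a minimal Python sketch of the idea, with made-up arm and sensor interfaces standing in for any manufacturer’s actual software:

# Minimal sketch of a force-threshold safety stop, the basic idea behind
# collaborative arms that halt when they unexpectedly bump into someone.
# The arm and force_sensor objects are hypothetical placeholders.

FORCE_LIMIT_NEWTONS = 25.0  # illustrative contact-force threshold

def run_until_contact(arm, force_sensor):
    """Drive the arm until an unexpected external force exceeds the limit."""
    while arm.is_moving():
        measured = force_sensor.read()  # net external force on the arm, in newtons
        if measured > FORCE_LIMIT_NEWTONS:
            arm.stop()           # halt motion immediately
            arm.hold_position()  # stay put and wait for a human to intervene
            break

Real systems layer this reflex with appropriately sized motors, proximity sensing, and certified software, as Ciocarlie notes, but the stop-on-contact idea is the common thread.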

PR2 was developed at Willow Garage, the Silicon Valley incubator that also created the nearly ubiquitous Robot Operating System (ROS). Equipped with a multitude of sensors and smart software, PR2 assisted a quadriplegic patient with shaving, scratching, bringing water, and other requests.

“We had to devise interfaces to control the robot and do things around the house,” Ciocarlie said. The team also had to make sure PR2 was completely safe to work with humans. “The robot and the person were sharing the ultimate human space—the home.”

Some robots are also wheeling into office environments. If you visit Manuela Veloso, who leads the machine learning department at Carnegie Mellon University, you’ll be ushered around the campus by one of her CoBots, or collaborative robots. CoBots buzz around between different labs, deliver packages to various departments, escort guests, fetch coffee, and do other human-requested chores. When they encounter something unfamiliar—like finding the location of the nearest coffee shop or running down today’s weather forecast—they can search Google for answers.

Essentially computers on wheels, CoBots lack arms to grab or move objects, though some have baskets where humans can place things. Yet they can interact with people through text and speech, freeing those individuals from time-consuming errands.

“CoBots are office helpers, but they can also be useful in other places where people need service—supermarkets, shopping malls, or nursing homes,” Veloso said.

Other robots are joining the human workforce as teammates. Launched about two years ago, ABB Robotics’ small, lightweight, and easily programmed YuMi was designed for testing electronic devices. It excels at the repetitive placing and sliding motions needed to align a printed circuit board with a test station, but it has also proved useful in assembling small USB gadgets. It can also stress-test parts, which requires repetitive picking up and dropping motions, or load medical samples, such as urine and blood, into testing machines.

CoBots deliver mail, fetch coffee, escort guests and answer questions in Carnegie Mellon’s hallways. Photo: Carnegie Mellon


“You can do it with any robot arm, but YuMi can do this alongside humans without safety gates,” ABB application manager Nicolas de Keijser emphasized. “It doesn’t necessarily do collaborative tasks, but the fact that you can put it into an office environment is an attractive feature.”

The “hand-holding” robots proved their worth in the most difficult of settings—the operating room. Stryker’s Mako is a surgical robot that helps surgeons achieve better precision in carving out bone for insertion of hip and knee replacement implants.

This is very different from the well-known Da Vinci surgical robot. The Da Vinci is controlled entirely by surgeons for minimally invasive procedures. Its cameras provide an enlarged view of the surgical area, and its instruments dampen the natural vibrations of the surgeon’s hands, enabling surgeons to perform intricate procedures very accurately. Yet the Da Vinci leaves all the decisions, from the length of incisions to the depth of cuts and choice of stitches, in the surgeons’ hands.

The Mako robot, on the other hand, does far more handholding. To fit the hip or knee implant, surgeons must carve out a precise region of the bone to mate with the artificial joint. This varies with patient anatomy and requires very complex saw blade movements.

The collaboration embraces the concept that humans are creative and robots are precise: the surgeon outlines the ideal carving trajectory based on the patient’s anatomy, and the Mako keeps the blade within those human-defined boundaries, essentially holding the surgeon’s hand steady, just as Hayes described in the sander example.

“The surgeon is pushing on the saw, but the robot arm defines where the saw goes, based on the surgeon’s plan,” explains Robert Cohen, vice president and general manager of global R&D for Stryker’s joint replacement division. “The surgeon can’t go too far to the left, or too far to the right, and can’t push too far down. The robot arm will stop him.”

But turning robots into real work buddies is a much more difficult task.

ABB, a company well known for large, powerful industrial robots, is now making smaller models that can work safely next to people. Photo: ABB


A good assistant isn’t the one who hands you a screwdriver when you ask for it, but the one who knows when you’re going to need that tool—and gets it ahead of time.

“That is extremely challenging, because you need to model the human thought process,” Scassellati said. “There’s a tremendous amount of deference, in terms of software capabilities that the robot needs to have.”

Even the simplest verbal commands can be ambiguous, which makes them hard for a robot to interpret. Human workers don’t use full sentences on the job—a mere “gimme the next one” or “hand me that one” does the trick. Yet the words “that one” and “next” can mean many different things. And the words “it’s hot!” can refer to a soldering iron or to the room’s temperature.

Scassellati’s workaround is something called a hierarchical task model. Instead of executing a task as a sequence of actions (like the instructions for assembling an IKEA chair), his robots assess the different steps involved in the task and how they depend on each other. This lets his team train the robot on how people transition within the task.
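In code, the difference between a fixed script and a hierarchical task model can be sketched roughly as a dependency graph of steps. The Python below is only an illustration of the concept, with invented step names and a toy dependency structure, not Scassellati’s actual system:

# Toy sketch of a hierarchical task model: the task is stored as steps with
# prerequisites rather than one fixed sequence, so the robot can reason about
# which steps are possible next. Step names are illustrative assumptions.

from typing import Dict, Set

# Each step maps to the set of steps that must be finished before it.
CHAIR_TASK: Dict[str, Set[str]] = {
    "attach_legs": set(),
    "attach_back": {"attach_legs"},
    "attach_seat": {"attach_legs"},
    "add_screws":  {"attach_back", "attach_seat"},
}

def available_steps(task: Dict[str, Set[str]], done: Set[str]) -> Set[str]:
    """Steps whose prerequisites are all complete and that aren't done yet."""
    return {step for step, prereqs in task.items()
            if step not in done and prereqs <= done}

# After the legs are attached, the back and seat can be done in either order,
# which is exactly the kind of ambiguity that prompts the robot to ask whether
# the order it observed is required or just one worker's habit.
print(available_steps(CHAIR_TASK, {"attach_legs"}))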

They also teach robots the spoken terms associated with the task, so the robot will understand verbal instructions. And sometimes the robots talk back.

Once they pick up some vocabulary, they begin to ask questions: “I saw that you put the legs together before you build the back. Do you always have to do this in that order?”

As robots learn how tasks are done, they graduate from reacting to behaving proactively. Instead of simply following human instructions, they build on their learned experience to act more intuitively.

In his research, Hayes has focused on teaching robots to reason through how to be helpful without being explicitly told: for example, figuring out that in eight seconds the human will need a particular wooden board, and fetching it within five seconds so it is ready in time. Robots are good learners, especially if trainers keep demonstrations constrained and change only one or two variables at a time, Hayes said. It is like learning on the job, and it takes place in small incremental steps.

“You don’t necessarily need a lot of demonstration to teach a very simple assistive behavior in a well-structured environment,” Hayes said. “Teaching a robot a helpful behavior is a fairly quick process, under an hour for one assistive behavior.”
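The timing arithmetic behind that kind of anticipation is simple, even if predicting what the human will do next is not. A toy Python sketch, with illustrative numbers and an invented function name, might look like this:

# Toy sketch of the anticipation logic in Hayes's example: if the worker will
# need a board in eight seconds and fetching it takes five, the robot must
# start within three seconds. The numbers and names are illustrative.

def latest_start_time(time_until_needed: float, fetch_duration: float) -> float:
    """Seconds from now by which fetching must begin for the part to be ready."""
    return time_until_needed - fetch_duration

slack = latest_start_time(time_until_needed=8.0, fetch_duration=5.0)
if slack <= 0:
    print("Start fetching the board immediately.")
else:
    print(f"Start fetching within the next {slack:.0f} seconds.")  # 3 seconds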

Scassellati said that after assembling a bunch of chairs alongside different humans, robots can build enough expertise to tell an expert from a rookie, and guide the beginner along.

“One of our robots automatically derives how confusing something can be to you,” Scassellati said. “It knows you’re a novice and it knows there may be parts that are confusing, so it will take the parts that you shouldn’t be using and remove them further away from you, so that you will only use the pieces that you should be using.”

Training robots to be self-learners takes a modular programming approach that is a bit of a balancing act. The robot needs dedicated code modules to handle specific tasks, yet those modules must be abstract enough so that researchers do not have to reinvent the wheel all the time.

“That makes the robots more flexible,” said Knepper. “You don’t want to write a different program when you want your robot to use nails instead of screws.”
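A rough Python sketch of that kind of abstraction, with invented tool names and motion primitives standing in for real robot code, might look like this:

# Minimal sketch of the modular idea Knepper describes: one generic "fasten"
# behavior parameterized by the fastener, instead of separate programs for
# nails and screws. Tools and motions here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Fastener:
    name: str
    tool: str
    motion: str  # low-level motion primitive the arm would execute

SCREW = Fastener(name="screw", tool="screwdriver", motion="rotate")
NAIL = Fastener(name="nail", tool="hammer", motion="strike")

def fasten(part_a: str, part_b: str, fastener: Fastener) -> None:
    """Generic fastening module: the same high-level steps for any fastener."""
    print(f"Pick up the {fastener.tool}")
    print(f"Align {part_a} with {part_b}")
    print(f"Apply the '{fastener.motion}' motion to drive the {fastener.name}")

# Swapping nails for screws changes a parameter, not the program.
fasten("chair back", "seat frame", SCREW)
fasten("chair back", "seat frame", NAIL)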

Mastering that level of intelligence requires an enormous amount of customized programming, but the end result is really close to a robotic henchman.

Programming those tasks sounds like an arduous job, but luckily, robots are just as good at knowledge sharing as humans, and perhaps even better. Once various modules or code libraries are written (usually in common engineering languages like Python or C++), they can be uploaded to the cloud and shared by different robots.

“One robot’s experiences can contribute to the collective robotic intelligence,” said Hayes.

What’s really important, though, is that once developers write enough code to turn robots into learners and the robots learn enough tasks, the entire robotic paradigm can change drastically.

“That’s when the lines really begin to blur,” said Ciocarlie. “Are you programming a tool or are you teaching a collaborator?”

That’s when the robot really stops being a machine and becomes a teammate and a work buddy who will hand people the parts they need, ask questions, talk back, and perhaps crack a joke now and then.

And more importantly, having learned that humans are easily damaged, the robot would shout out a warning to a person unwittingly about to do something dangerous, like grabbing a soldering iron by its hot tip.

Universal Robots developed the first low-speed robotic arms designed to work safely near people. Photo: Universal Robots


Copyright © 2018 by ASME
Topics: Robots