Collaborative Robots Learn to Collaborate

Accessible 3D vision unlocks the potential of machine learning for making our autonomous partners more humanlike.

To be truly collaborative, robots must be capable of more than working safely alongside human beings. Russell Toris, director of robotics at Fetch Robotics, says robots also need to act (and “think”) more like people.

This is particularly true of autonomous mobile robots (AMRs) like those manufactured by Fetch. Typically employed for material transport and data collection (such as counting inventory), these wheeled systems use vision sensors and navigation software to dynamically adapt to new environments and situations. Increasingly common in warehouses and distribution centers, this technology is likely to spread to other applications and industries, including our own. In fact, our January issue’s coverage of JIMTOF in Japan touched on the promise of machine-tending robot arms on wheels. Whatever the application, ensuring that a robot can safely occupy the same spaces as humans is one proposition; ensuring that its behavior does not make people uncomfortable, and thereby hinder implementation or waste resources, is another proposition entirely.

Mr. Toris cites the example of two people approaching each other from opposite directions in a narrow hallway. They typically will acknowledge each other in some way before passing, even if only by a nod or a glance. One or both will likely slow down, and perhaps step aside to allow the other a wider berth. A robot with a myopic focus on moving as efficiently as possible from point A to point B will not be nearly as considerate. It might not collide with the person striding toward it, but its movements will seem as cold as they are efficient, and possibly even threatening.

The robot is “aggressive.” The robot is “rude.” The robot is “acting drunk.” We cannot help but assign human traits to inanimate objects, particularly objects that act autonomously and purport to be our collaborators. This tendency influences our behavior, Mr. Toris says. Employees might lose time keeping a wary eye out for rampaging robots. They might even stop working entirely to observe odd behavior. Whatever the specifics of the situation, we are less likely to use any technology to its full potential, or even use it at all, if it evokes feelings of hesitation or intimidation.

Instead, what if AMRs could maintain a comfortable distance as they pass? What if they could differentiate between a person, a forklift and a pallet, and adjust their behavior accordingly? Robots may not be able to nod or glance, but what if they could use sound or light (say, a turn signal) to notify people of their intentions? Making behavior more natural and more predictable is a primary design philosophy at Fetch Robotics. “We design robots for people, not robots for robots,” Mr. Toris says.
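
What such class-dependent behavior might look like in software can be sketched briefly. The snippet below is purely illustrative: the object classes, speed caps and clearance distances are assumptions chosen to show how a detected label could map to a gentler pass and a turn-signal-style cue, not Fetch’s actual control logic.

```python
# Illustrative sketch: mapping a detected obstacle class to passing behavior.
# Classes, speeds and clearances are hypothetical values for this example.
from dataclasses import dataclass

@dataclass
class MotionPolicy:
    max_speed_m_s: float   # cap on travel speed while near this object
    clearance_m: float     # lateral berth to keep while passing
    announce: bool         # signal intent with sound or light (a "turn signal")

# People get the widest berth and a cue; a static pallet barely changes the plan.
POLICIES = {
    "person":   MotionPolicy(max_speed_m_s=0.5, clearance_m=1.0, announce=True),
    "forklift": MotionPolicy(max_speed_m_s=0.8, clearance_m=1.5, announce=True),
    "pallet":   MotionPolicy(max_speed_m_s=1.5, clearance_m=0.3, announce=False),
}

def plan_around(detected_class: str) -> MotionPolicy:
    """Pick a passing behavior for a detected obstacle; default to caution."""
    return POLICIES.get(detected_class, MotionPolicy(0.5, 1.0, True))

if __name__ == "__main__":
    policy = plan_around("person")
    print(f"slow to {policy.max_speed_m_s} m/s, keep {policy.clearance_m} m, "
          f"signal intent: {policy.announce}")
```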

This is possible through the intersection of two inherently intertwined technologies. The first is 3D vision systems, which are more affordable than ever due to advances in seemingly unrelated fields like autonomous vehicles, Mr. Toris says. Although the 2D laser sensors used for most AMRs are extremely accurate and capable of detecting distant objects, their vision is limited to a shin’s-eye view of the most basic geometric shapes. Add 3D cameras to complement the 2D sensors, as Fetch has done, and the robots can paint a more comprehensive picture of their environments. More robust visual data is critical not only for distinguishing objects, but also for fueling the machine-learning algorithms that enable the robots to determine how best to respond to those objects.
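
To make that complementarity concrete, here is a minimal sketch that merges a single-plane laser scan with points from a 3D camera into one set of obstacles. It is an illustration built on assumptions (the array shapes, a 10-meter range cutoff, a 5-centimeter floor threshold), not a description of Fetch’s actual perception pipeline.

```python
# Illustrative sketch only: complementing a 2D laser scan with 3D camera
# points. Shapes, frames and thresholds are assumptions for this example.
import numpy as np

def fuse_obstacles(laser_ranges: np.ndarray,   # (N,) ranges at shin height, meters
                   laser_angles: np.ndarray,   # (N,) beam angles, radians
                   camera_points: np.ndarray,  # (M, 3) x, y, z in the robot frame
                   max_range: float = 10.0) -> np.ndarray:
    """Return a single (K, 2) set of obstacle points in the robot's x-y plane."""
    # The 2D laser is accurate and long-range, but it only sees a slice of the
    # world at one height.
    valid = laser_ranges < max_range
    laser_xy = np.column_stack((laser_ranges[valid] * np.cos(laser_angles[valid]),
                                laser_ranges[valid] * np.sin(laser_angles[valid])))

    # The 3D camera is shorter-range but catches geometry above and below the
    # laser plane; project those points down onto the floor plane.
    above_floor = camera_points[:, 2] > 0.05   # ignore returns from the floor itself
    camera_xy = camera_points[above_floor, :2]

    return np.vstack((laser_xy, camera_xy))

if __name__ == "__main__":
    angles = np.linspace(-np.pi / 2, np.pi / 2, 360)
    ranges = np.full(360, 3.0)                               # a wall 3 m ahead
    cloud = np.array([[1.0, 0.2, 0.8], [1.1, 0.2, 0.9]])     # an overhanging box
    print(fuse_obstacles(ranges, angles, cloud).shape)       # (362, 2)
```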

Fetch must teach its robots in order for them to learn, and teaching requires masses of data. To collect that data, the company has constructed a mock warehouse to train AMRs at its facility in San Jose, California. Mobile robots have been navigating the aisles for four years now, filtering rich vision-sensor feedback through artificial neural networks (ANNs) to distinguish obstacles and determine not only how to navigate around them, but how to do so appropriately. These ANNs consist of layer upon layer of interconnected, computerized nodes, creating a vast web that filters data from the robot’s sensors (2D lasers complemented by 3D cameras). Each time the robot identifies and/or responds correctly to an obstacle, individual nodes are weighted accordingly. This makes the same outcome more likely in the future, even when the ANN is tasked with filtering novel sensor data from a novel environment.
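
The node-weighting idea can be shown in miniature with a generic feedforward network trained by gradient descent. Everything below is synthetic and assumed (random “sensor” features, three made-up obstacle classes, a single hidden layer); it is not Fetch’s model, but it shows how connection weights shift toward outputs that proved correct.

```python
# Minimal, generic feedforward classifier on synthetic data; for illustration
# of weighted, layered nodes only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "sensor" features and labels for three assumed obstacle classes:
# 0 = person, 1 = forklift, 2 = pallet.
X = rng.normal(size=(300, 8))
y = rng.integers(0, 3, size=300)

# One hidden layer of interconnected nodes; the weights are what get adjusted.
W1 = rng.normal(scale=0.1, size=(8, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 3)); b2 = np.zeros(3)

def forward(x):
    h = np.maximum(0, x @ W1 + b1)                 # hidden layer (ReLU)
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)     # class probabilities

lr = 0.1
for _ in range(200):
    h, p = forward(X)
    # Cross-entropy gradient: connections that contributed to correct answers
    # are reinforced, those that contributed to errors are weakened.
    grad = p.copy(); grad[np.arange(len(y)), y] -= 1.0; grad /= len(y)
    dh = (grad @ W2.T) * (h > 0)
    W2 -= lr * h.T @ grad;  b2 -= lr * grad.sum(axis=0)
    W1 -= lr * X.T @ dh;    b1 -= lr * dh.sum(axis=0)

_, p = forward(X)
# On random synthetic labels this only measures memorization, not real skill.
print("training accuracy:", (p.argmax(axis=1) == y).mean())
```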

Four years’ worth of data from the mock warehouse ensures that the latest AMRs benefit from all the experience of their predecessors, Mr. Toris says. Four years from now, the dataset will be even more robust, and new machine-learning techniques likely will be available. Whatever the future of AMRs in CNC machine shops, it is well worth considering how robot design might change as a result of both technological developments and changes in thinking about the nature of automation.

