Collaborative Robots Learn to Collaborate
Accessible 3D vision unlocks the potential of machine learning for making our autonomous partners more humanlike.
To be truly collaborative, robots must be capable of more than working safely alongside human beings. Russell Toris, director of robotics at Fetch Robotics, says robots also need to act (and “think”) more like people.
This is particularly true of autonomous mobile robots (AMRs) like those manufactured by Fetch. Typically employed for material transport and data collection (such as counting inventory), these wheeled systems use vision sensors and navigation software to dynamically adapt to new environments and situations. Increasingly common in warehouses and distribution centers, this technology is likely to spread to other applications and industries, including our own. In fact, our January issue’s coverage of JIMTOF in Japan touched on the promise of machine-tending robot arms on wheels. Whatever the application, ensuring that a robot can safely occupy the same spaces as humans is one thing; ensuring that its behavior does not hinder implementation or waste resources by making humans uncomfortable is quite another.
Mr. Toris cites the example of two people approaching each other from opposite directions in a narrow hallway. They typically will acknowledge each other in some way before passing, even if only by a nod or a glance. One or both will likely slow down, and perhaps step aside to allow the other a wider berth. A robot with a myopic focus on moving as efficiently as possible from point A to point B will not be nearly as considerate. It might not collide with the person striding toward it, but its movements will seem as cold as they are efficient, and possibly even threatening.
The robot is “aggressive.” The robot is “rude.” The robot is “acting drunk.” We cannot help but assign human traits to inanimate objects, particularly objects that act autonomously and purport to be our collaborators. This tendency influences our behavior, Mr. Toris says. Employees might lose time keeping a wary eye out for rampaging robots. They might even stop working entirely to observe odd behavior. Whatever the specifics of the situation, we are less likely to use any technology to its full potential, or even use it at all, if it evokes hesitation or intimidation.
Instead, what if AMRs could maintain a comfortable distance as they pass? What if they could differentiate between a person, a forklift and a pallet, and adjust their behavior accordingly? Robots may not be able to nod or glance, but what if they could use sound or light (say, a turn signal) to notify people of their intentions? Making behavior more natural and more predictable is a primary design philosophy at Fetch Robotics. “We design robots for people, not robots for robots,” Mr. Toris says.
This is possible through the intersection of two inherently intertwined technologies. The first is 3D vision systems, which are more affordable than ever due to advances in seemingly unrelated fields like autonomous vehicles, Mr. Toris says. Although the 2D laser sensors used for most AMRs are extremely accurate and capable of detecting distant objects, their vision is limited to a shin’s-eye view of the most basic geometric shapes. Add 3D cameras to complement the 2D sensors, as Fetch has done, and the robots can paint a more comprehensive picture of their environments. More robust visual data is critical not only for distinguishing objects, but also for fueling the machine-learning algorithms that enable the robots to determine how best to respond to those objects.
Fetch must teach its robots in order for them to learn, and teaching requires masses of data. To collect that data, the company has constructed a mock warehouse to train AMRs at its facility in San Jose, California. Mobile robots have been navigating the aisles for four years now, filtering rich vision-sensor feedback through artificial neural networks (ANNs) to distinguish obstacles and determine not only how to navigate around them, but how to do so appropriately. These ANNs consist of layer upon layer of interconnected, computerized nodes, creating a vast web that filters data from the robot’s sensors (2D lasers complemented by 3D cameras). Each time the robot identifies or responds correctly to an obstacle, individual nodes are weighted accordingly. This makes the same outcome more likely in the future, even when the ANN is tasked with filtering novel sensor data from a novel environment.
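The layered, weight-adjusting process described above can be sketched in a few lines of code. The following is a minimal illustration only, assuming a toy feedforward network that classifies an obstacle (person, forklift or pallet) from a handful of made-up sensor features; the features, labels and network shape are hypothetical and are not Fetch’s actual architecture or data. Each training pass nudges the node weights so that correct identifications become more likely next time:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sensor features per obstacle: [height_m, width_m, speed_m_s].
# These values and classes are illustrative assumptions.
X = np.array([
    [1.7, 0.5, 1.2],   # person: tall, narrow, moving
    [1.8, 0.6, 1.0],   # person
    [2.2, 1.2, 2.5],   # forklift: tall, wide, fast
    [2.0, 1.1, 2.0],   # forklift
    [0.15, 1.0, 0.0],  # pallet: low, wide, stationary
    [0.12, 1.2, 0.0],  # pallet
])
y = np.array([0, 0, 1, 1, 2, 2])  # 0=person, 1=forklift, 2=pallet

# Two layers of interconnected nodes; weights start random and are
# adjusted after every training pass.
W1 = rng.normal(0, 0.5, (3, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 3))
b2 = np.zeros(3)

def forward(X):
    """Filter sensor features through the layers; return class probabilities."""
    h = np.maximum(0, X @ W1 + b1)                  # ReLU hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)      # softmax

lr = 0.1
onehot = np.eye(3)[y]
for _ in range(2000):
    h, p = forward(X)
    grad_logits = (p - onehot) / len(X)             # cross-entropy gradient
    grad_h = grad_logits @ W2.T * (h > 0)           # backprop through ReLU
    W2 -= lr * h.T @ grad_logits                    # reweight output nodes
    b2 -= lr * grad_logits.sum(axis=0)
    W1 -= lr * X.T @ grad_h                         # reweight hidden nodes
    b1 -= lr * grad_h.sum(axis=0)

_, p = forward(X)
pred = p.argmax(axis=1)
print(pred.tolist())
```

After training, the network recovers the labels for these easily separated examples; the point of the sketch is simply that “weighting nodes accordingly” means gradient updates to the connection weights, which is what lets the same network generalize to sensor data it has never seen.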
Four years’ worth of data from the mock warehouse ensures that the latest AMRs will benefit from all the experience of their predecessors, Mr. Toris says. Four years from now, the dataset will be even more robust, and new machine-learning techniques likely will be available. Whatever the future of AMRs in CNC machine shops, it is well worth considering how robot design might change as a result of both technological developments and changes in thinking about the nature of automation.