The Modern AGV Delivers Flexibility
Automated guided vehicles are getting smarter and more versatile.
#robotics
The defining feature of many modern automated guided vehicle (AGV) systems is something you can't readily see: there is no visible guidance telling them how to roll from station to station. Early AGV systems followed magnetic tape or painted lines to wend their way through plants. But that meant they were limited in where they could go; the method allows no flexibility to deviate from the route.
More and more AGV developers are turning to navigation systems that offer greater flexibility. For example, machine builder Trumpf (trumpf.com) worked with German AGV supplier Stopa (stopa.com/en) on a factory in suburban Chicago built not only to produce products but also to serve as something of an Industry 4.0 showcase. Trumpf makes laser systems, so it isn't entirely surprising that the Stopa VarioCarts are equipped with laser sensors. Reflectors are placed around the factory, and the laser scanner detects them across its 270° field of view. A control room monitors the routes and can adjust them as required.
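To give a flavor of how reflector-based localization can work (this is an illustrative sketch only, not Stopa's or Trumpf's actual algorithm), a vehicle can trilaterate its position from measured distances to reflectors at surveyed locations. The function name and the sample numbers below are hypothetical:

```python
import numpy as np

def trilaterate(reflectors, ranges):
    """Estimate (x, y) from laser ranges to reflectors at known positions.

    reflectors: (N, 2) array of surveyed reflector coordinates, N >= 3
    ranges:     (N,) array of measured distances to each reflector
    Linearizes the range equations against the first reflector and
    solves the result in a least-squares sense.
    """
    x1, y1 = reflectors[0]
    r1 = ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(reflectors[1:], ranges[1:]):
        A.append([2 * (x1 - xi), 2 * (y1 - yi)])
        b.append(ri**2 - r1**2 + x1**2 - xi**2 + y1**2 - yi**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos  # [x, y] in the same frame as the reflector map

# Example: three wall-mounted reflectors and the ranges a scanner might report
reflectors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0]])
ranges = np.array([5.0, 8.06, 5.0])
print(trilaterate(reflectors, ranges))  # approximately [3. 4.]
```

In practice a scanner also reports the bearing to each reflector, which is what lets the vehicle recover its heading as well as its position.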
Yet that approach still depends on reflectors. OTTO Motors (ottomotors.com) takes another tack, one based primarily on LiDAR (light detection and ranging), which provides 3D information about the surrounding environment. The LiDAR unit, which bounces laser pulses off the environment via a rotating scanning mirror assembly, is well suited to "big-picture" imaging. The AGV's processor uses the data for object identification, collision prediction, and avoidance strategies.
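One small piece of such a pipeline, collision prediction, can be sketched very simply: project the vehicle's motion forward and flag any scan points that fall inside the corridor it is about to sweep. The function below is an illustrative toy, not OTTO's implementation; the names and thresholds are assumptions.

```python
import numpy as np

def imminent_collision(points, speed, width=0.8, horizon_s=2.0):
    """Flag LiDAR returns that lie in the corridor the AGV will sweep.

    points:    (N, 2) scan points in the vehicle frame (x forward, y left), meters
    speed:     current forward speed in m/s
    width:     vehicle width plus a safety margin, meters
    horizon_s: how far ahead in time to look
    """
    lookahead = speed * horizon_s                   # distance covered in the horizon
    ahead = (points[:, 0] > 0) & (points[:, 0] < lookahead)
    in_lane = np.abs(points[:, 1]) < width / 2
    return bool(np.any(ahead & in_lane))

# A scan with one return 1.5 m ahead, slightly off-center
scan = np.array([[1.5, 0.1], [3.0, 2.5], [-1.0, 0.0]])
print(imminent_collision(scan, speed=1.2))   # True: the AGV should slow or replan
```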
“Using LiDAR to map the facility provides a source of absolute position estimation in an indoor facility,” notes OTTO Motors perception manager James Servos. “It provides our position against the fixed environment, such as walls, and allows the AGV to maintain its precise position anywhere in the facility.”
The OTTO platform also integrates data from an inertial measurement unit, or IMU, a sensor also used in aircraft and satellites. The IMU's measurements of acceleration and rotation let the processor track the AGV's motion over time. And finally, a wheel encoder measures wheel rotation to determine how far, and how fast, the vehicle has rolled.
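In rough outline, that dead-reckoning step might look like the following sketch; the function name, variables, and numbers are illustrative assumptions rather than OTTO's code.

```python
import math

def dead_reckon(pose, distance, yaw_rate, dt):
    """Advance a 2D pose (x, y, heading) by one time step.

    distance: travel measured by the wheel encoder over the step, meters
    yaw_rate: turn rate reported by the IMU gyro, rad/s
    dt:       length of the time step, seconds
    """
    x, y, heading = pose
    heading += yaw_rate * dt                 # IMU: how much the vehicle has turned
    x += distance * math.cos(heading)        # encoder: how far it has rolled
    y += distance * math.sin(heading)
    return (x, y, heading)

# One second of travel at 0.5 m/s while turning gently
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = dead_reckon(pose, distance=0.05, yaw_rate=0.1, dt=0.1)
print(pose)
```

Dead reckoning drifts as small errors accumulate, which is exactly why the absolute LiDAR fix against walls and other fixed features matters.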
Integrating the internal data from these latter two sensors, which answer "How fast am I going? How far have I gone?", with the external LiDAR data, which answers "Where am I? Where am I trying to go?", gives the vehicle a dependable sense of, well, its place in the world. It also lets it recalculate, GPS-style, a better route to its destination when obstacles appear.
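That rerouting idea can be sketched with a textbook grid-based A* search: when a new obstacle shows up in the vehicle's map, the planner simply runs again and returns a fresh path. This is a minimal illustration under assumed names and a toy map, not any vendor's planner.

```python
import heapq

def astar(grid, start, goal):
    """Plan a path on an occupancy grid (0 = free, 1 = blocked).

    A minimal 4-connected A* search. When an obstacle appears, rerunning
    the search produces a new route, much as a satnav reroutes a car.
    """
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(heuristic(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier, (cost + 1 + heuristic((nr, nc)),
                                          cost + 1, (nr, nc), path + [(nr, nc)]))
    return None  # no route found

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],   # a pallet has been dropped across the aisle
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 3)))   # a detour around the blocked cells
```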
AGV/Cobot Hybrids
AGVs have traditionally been used only to move material or parts from place to place. That's changing. Robotics and automation developers including KUKA Robotics Corp. (kuka.com) and FANUC America Corp. (fanucamerica.com) are now combining AGVs with collaborative robots, enabling not only transportation but also loading/unloading and measurement tasks.
Such centaur-like robots include the KMR iiwa. The device combines a lightweight, collaborative LBR iiwa robot with an AGV platform. The LBR iiwa is equipped with highly sensitive joint sensors in each of its seven axes that cause it to stop when touched. The AGV platform is furnished with laser scanners that react immediately when a person or object is in its path.
Mobile robots are a "quantum leap in the evolution of AGVs," in the mind of KUKA president and CEO Joe Gemma. When robots "learn how to walk," it is no longer necessary to transport the workpiece to the robot, he notes. "Instead, the robot moves directly to the workpiece." And instead of investing in automation for each machining operation, a mobile robot can tend multiple machines.
FANUC has a similar combination—the company’s CR-14iA/L arm mounted on an AGV. The cobot arm, which can carry a payload of 31 pounds, can autonomously go to a workstation, load parts into bins on the AGV, then unload them at another station.
The challenge of making a mobile robot practical goes beyond getting the device from one place to another, notes Mike Cicco, president and CEO of FANUC America Corp. Relocating a robot used to require tediously reteaching all of its movement points with a pendant control. FANUC's mobile robots instead automatically recalibrate their movements with a vision system that reads fiducial markers (reference dots) placed on the workstation, saving hours of programming time.
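The underlying idea can be framed as a small geometry problem: find the rigid transform that maps the fiducials' positions at teach time onto where the camera sees them after the AGV parks, then apply that transform to every taught point. The following sketch is illustrative only; the function and sample numbers are assumptions, not FANUC's implementation.

```python
import numpy as np

def refit_points(taught_points, fiducials_taught, fiducials_seen):
    """Shift taught robot positions to match where the fiducials are seen now.

    Computes the planar rigid transform (rotation + translation) that maps
    the fiducial positions recorded at teach time onto the positions the
    vision system reports today, then applies it to every taught point.
    All inputs are (N, 2) arrays in the same working plane; at least two
    fiducials are needed.
    """
    ct, cs = fiducials_taught.mean(axis=0), fiducials_seen.mean(axis=0)
    H = (fiducials_taught - ct).T @ (fiducials_seen - cs)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[1, :] *= -1
        R = Vt.T @ U.T
    t = cs - R @ ct
    return taught_points @ R.T + t

# The workstation's two reference dots, at teach time and as seen today
taught = np.array([[0.0, 0.0], [0.1, 0.0]])
seen = np.array([[0.02, 0.01], [0.12, 0.01]])   # station has shifted ~2 cm
points = np.array([[0.3, 0.2], [0.3, 0.4]])     # taught pick positions
print(refit_points(points, taught, seen))       # the shifted pick positions
```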