Robotic Bin Picking Made Simple(r)
The result of “thinking inside the bin” is a design featuring a six-axis Yaskawa robot, an ifm efector 200 photoelectric distance-measuring sensor, and, mounted to an IAI servo-driven slide, a Cognex In-Sight 8000 camera.
#robotics
When Systematix (systematix-inc.com), a systems integrator, was presented with the task of developing an automated system to pick car seat lumbar actuator assemblies from a bin and place them into a wire nest for assembly, its first idea was to use a robot and a 3D sensor.
But then its engineers realized that each actuator in the bin didn’t have to be mapped in all three dimensions; two would suffice. They could mount a 2D camera on a vertical slide so that each component is measured only in X and Y.
Because sheets of cardboard separate the layers of randomly oriented parts, and each divider is removed once the parts on top of it have been picked, the Z axis (i.e., depth) would need to be measured just once per layer.
The result of this “thinking inside the bin” is a design featuring a six-axis Yaskawa robot (motoman.com), an ifm efector 200 photoelectric distance-measuring sensor (ifm.com), and, mounted to an IAI servo-driven slide (intelligentactuator.com), a Cognex (cognex.com) In-Sight 8000 camera.
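A rough sketch of the pick cycle this arrangement implies appears below. The device classes and method names are hypothetical stand-ins for illustration, not the integrator’s actual control code or any vendor’s API; only the control flow (one depth reading per layer, X-Y locations from the 2D camera) comes from the description above.

```python
# Hypothetical sketch of the once-per-layer pick cycle described above.
# The device classes are simple stand-ins, not real vendor APIs; only the
# control flow (one Z reading per layer, X/Y from the 2D camera) reflects
# the article.

CAMERA_STANDOFF_MM = 300.0  # assumed camera working distance, illustrative only


class DistanceSensor:
    """Stand-in for the photoelectric distance-measuring sensor."""
    def read_distance_mm(self) -> float:
        return 450.0  # dummy reading to the top of the current layer


class Slide:
    """Stand-in for the servo-driven vertical slide carrying the camera."""
    def move_to(self, position_mm: float) -> None:
        print(f"slide -> {position_mm:.1f} mm")


class Camera:
    """Stand-in for the 2D vision system; returns part centers in mm."""
    def find_parts(self) -> list:
        return [(120.0, 80.5), (240.5, 95.2)]  # dummy X/Y results


class Robot:
    """Stand-in for the six-axis robot."""
    def pick(self, x: float, y: float, z: float) -> None:
        print(f"pick at ({x:.1f}, {y:.1f}, {z:.1f})")

    def place_in_nest(self) -> None:
        print("place part in wire nest")

    def remove_divider(self) -> None:
        print("remove cardboard divider")


def pick_layer(sensor, camera, robot, slide) -> None:
    layer_z = sensor.read_distance_mm()           # one depth reading per layer
    slide.move_to(layer_z - CAMERA_STANDOFF_MM)   # set camera height for this layer
    for x, y in camera.find_parts():              # locate parts in X and Y only
        robot.pick(x, y, layer_z)                 # every part shares the layer's Z
        robot.place_in_nest()
    robot.remove_divider()                        # expose the next layer


if __name__ == "__main__":
    pick_layer(DistanceSensor(), Camera(), Robot(), Slide())
```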
The camera uses RedLine, the latest iteration of PatMax, the geometric pattern-matching technology that Cognex first patented in 1996. Until then, pattern-matching technology relied on a pixel-grid analysis process called normalized correlation, which looks for statistical similarity between a gray-level model (or reference image) of an object and portions of the image to determine the object’s X-Y position. PatMax instead learns an object’s geometry from a reference image using a set of boundary curves that are not tied to a pixel grid, and then looks for similar shapes in the image without relying on specific gray levels. This approach, now widely used by machine vision companies, greatly improves how accurately an object can be recognized despite differences in angle, size and shading.
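For contrast, the normalized-correlation technique described above can be sketched in a few lines of Python. This is a generic, brute-force textbook version, not Cognex’s implementation; the image and template in the demo are made-up arrays.

```python
# Minimal sketch of normalized correlation template matching, the
# pixel-grid method described above (not Cognex's implementation).
import numpy as np


def normalized_correlation(image: np.ndarray, template: np.ndarray) -> tuple:
    """Slide the gray-level template over the image and return the (row, col)
    of the window with the highest normalized correlation score."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            window = image[r:r + th, c:c + tw]
            w = window - window.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat window or flat template: no meaningful score
            score = float((w * t).sum() / denom)  # 1.0 means a perfect match
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos


# Tiny demo: recover the X-Y position of a 3x3 pattern planted in an image.
pattern = np.arange(9, dtype=float).reshape(3, 3)
img = np.zeros((20, 20))
img[5:8, 12:15] = pattern
print(normalized_correlation(img, pattern))  # -> (5, 12)
```

Because it compares raw gray levels window by window, the score degrades when the part is rotated, scaled, or differently lit, which is the limitation the geometric approach addresses.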
The system not only gets the job done in the required time, but the use of long-proven 2D vision technology was presumably more cost-effective than a less straightforward 3D approach would have been.