Discrete and Process Automation

Case study: Smart robot assistant uses 3D camera system

04 March 2022

To meet the growing demands of scaling and changing production environments on the way to fully automated and intelligently networked production, Germany's ONTEC Automation GmbH has developed an autonomously driving robotic assistance system. The Smart Robot Assistant consists of a powerful and efficient intralogistics platform, a flexible robot arm and a robust 3D stereo camera system from the Ensenso N series by IDS Imaging Development Systems GmbH.

The solution is versatile and takes over monotonous, heavy set-up and placement tasks. The autonomous transport system is suitable for floor-level lifting of Euro pallets up to container or industrial format, as well as mesh pallets of various sizes, with a maximum load of up to 1,200 kg. For a customer in the textile industry, the automated guided vehicle (AGV) is used for the automated loading of coil creels. For this purpose, it picks up pallets with yarn spools, transports them to the designated creel and loads it for further processing. Using a specially developed gripper system, up to 1,000 yarn packages per eight-hour shift are picked up and pushed onto a mandrel of the creel. The sizing scheme and the position of the coils are captured by an Ensenso 3D camera (N45 series) installed on the gripper arm.

The camera networks with the vehicle's PLC and can thus read out and pass on data. In the application, SPSComm controls the communication between the software components of the vehicle, gripper and camera. This way, the camera knows when the vehicle and the gripper are in position to take a picture. It then acquires an image and passes a point cloud to an ONTEC software solution based on the standard HALCON software, which reports the coordinates of the coils on the pallet to the robot.
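The article does not reproduce ONTEC's code, but the acquisition step can be sketched with the Ensenso NxLib C++ API. The following is a minimal, assumption-laden stand-in: the serial number "1234" is a placeholder, the PLC/SPSComm trigger is reduced to a comment, and the hand-off to the HALCON-based coordinate extraction is only indicated.

    // Minimal sketch of the capture step with the Ensenso NxLib C++ API.
    // Placeholder serial number; ONTEC's SPSComm/PLC signalling and the
    // HALCON-based coordinate extraction are not part of this sketch.
    #include "nxLib.h"
    #include <iostream>
    #include <vector>

    int main() {
        nxLibInitialize(true);                        // load and connect to the NxLib

        NxLibItem root;                               // root of the NxLib tree
        NxLibItem camera = root[itmCameras][itmBySerialNo]["1234"];

        NxLibCommand open(cmdOpen);                   // open the stereo camera
        open.parameters()[itmCameras] = "1234";
        open.execute();

        // Triggered once vehicle and gripper report "in position" via the PLC.
        NxLibCommand(cmdCapture).execute();           // acquire a stereo image pair
        NxLibCommand(cmdComputeDisparityMap).execute();
        NxLibCommand(cmdComputePointMap).execute();   // one 3D point (X, Y, Z) per pixel

        // Fetch the point cloud that would be handed to the HALCON-based
        // software reporting the coil coordinates to the robot.
        std::vector<float> pointMap;
        int width = 0, height = 0;
        camera[itmImages][itmPointMap].getBinaryDataInfo(&width, &height, 0, 0, 0, 0);
        camera[itmImages][itmPointMap].getBinaryData(pointMap, 0);
        std::cout << "Point map: " << width << " x " << height << " points\n";

        NxLibCommand(cmdClose).execute();
        nxLibFinalize();
        return 0;
    }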

The robot can then accurately pick up the coils and process them further. As soon as the gripper has cleared a layer of yarn spools, the Ensenso camera takes a picture of the packaging material lying between the layers and provides point clouds of it as well. These point clouds are processed in the same way, giving the robot the information it needs to remove the intermediate layers with a needle gripper. "This approach means that the number of layers and finishing patterns of the pallets do not have to be defined in advance, and even incomplete pallets can be processed without any problems," explained Tim Böckel, software developer at ONTEC. "The gripper does not have to be converted for the use of the needle gripper. For this application, it has a normal gripping component for the coils and a needle gripping component for the intermediate layers."

The Ensenso N45's 3D stereo electronics are completely decoupled from the housing, allowing a lightweight plastic composite to be used as the housing material. The low weight facilitates use on robot arms such as that of the Smart Robot Assistant. The camera also copes with demanding environmental conditions: even in difficult lighting, the integrated projector casts a high-contrast texture onto the object to be imaged by means of a pattern mask with a random dot pattern, supplementing the structure on featureless, homogeneous surfaces. The integrated camera thus meets the application's requirements exactly.

"By pre-configuring within NxView, the task was solved well." This sample programme with source code demonstrates the main functions of the NxLib library, which can be used to open one or more stereo and color cameras whose image and depth data are visualized. Parameters such as exposure time, binning, AOI and depth measuring range can be adjusted live for the matching method used.

The matching process enables the Ensenso 3D camera to identify a very large number of corresponding pixels, including their positional shift, by means of the auxiliary structures projected onto the surface, and to create complete, homogeneous depth information of the scene from them. This in turn ensures the precision with which the Smart Robot Assistant works. Other selection criteria for the camera included the Gigabit Ethernet standard vision interface and the 1.3 MP global shutter sensor. "The camera only takes one image pair of the entire pallet in favor of a faster throughput time, but it has to provide the coordinates from a relatively large distance with an accuracy in the millimeter range to enable the robot arm to grip precisely," explained Matthias Hofmann, IT specialist for application development at ONTEC. "We therefore need the high resolution of the camera to be able to safely record the edges of the coils with the 3D camera." The localization of the edges is important so that the position of the center of the spool can be passed to the gripper as accurately as possible.
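ONTEC's coordinate extraction runs in its HALCON-based software and is not published; purely to illustrate why precise edge localization matters, the invented sketch below estimates a spool center as the centroid of detected rim (edge) points, so any error in the edges propagates directly into the gripping position. All names and values are made up for the example.

    // Illustrative stand-in only, not ONTEC's HALCON implementation:
    // estimate a spool center from its detected rim points in the point cloud.
    #include <cstdio>
    #include <vector>

    struct Point3 { double x, y, z; };   // one 3D point in millimetres

    // For a roughly circular, evenly sampled rim the centroid of the rim
    // points coincides with the spool center that the gripper has to hit.
    Point3 estimateSpoolCenter(const std::vector<Point3>& rimPoints) {
        Point3 c{0.0, 0.0, 0.0};
        for (const Point3& p : rimPoints) {
            c.x += p.x; c.y += p.y; c.z += p.z;
        }
        const double n = static_cast<double>(rimPoints.size());
        c.x /= n; c.y /= n; c.z /= n;
        return c;
    }

    int main() {
        // Four synthetic rim points of a spool whose center lies at (250, 400, 1200) mm.
        std::vector<Point3> rim = {
            {300, 400, 1200}, {200, 400, 1200}, {250, 450, 1200}, {250, 350, 1200}
        };
        Point3 c = estimateSpoolCenter(rim);
        std::printf("Spool center: %.1f %.1f %.1f mm\n", c.x, c.y, c.z);
        return 0;
    }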

To contact the author of this article, email engineering360editors@globalspec.com

