Robot and 3D imaging to automatically debone poultry

20-03-2012

Researchers at the Georgia Tech Research Institute (GTRI) are developing an intelligent cutting and deboning system. It uses 3D imaging and a robotic cutting arm to automatically perform precision cuts that optimise yield while eliminating the risk of bone fragments in the finished product.

By Georgia Tech, Atlanta, GA, USA
“Each and every bird is unique in its size and shape. It seems only natural that, to maximise yield and minimise bone chips in the deboning process, we develop the sensing and actuation to allow an automated deboning system to adapt to the bird, as opposed to forcing the bird to conform to the machine,” says Gary McMurray, chief of the Food Processing Technology Division at the Georgia Tech Research Institute (GTRI).
McMurray is spearheading GTRI’s development of an intelligent cutting and deboning system that uses 3D imaging and a robotic cutting arm to automatically perform precision cuts that optimise yield while eliminating the risk of bone fragments in the finished product.
In a nutshell, vision analysis enables the system to perform optimal cuts for each bird regardless of size and shape, while a force feedback algorithm allows the system’s knife to make a cut without penetrating into the bone itself.
More specifically, a unique 3D vision system models the bird to determine where the cuts need to be made for that particular bird. Prior to making the cut, the bird is positioned under the vision system; images and measurements are taken; the cut geometry is computed; and the exact position and orientation of the knife are determined for the proper cut. Finally, the bird and cutting robot are aligned into this pose, and the cut is made.
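For illustration, that sequence can be sketched as a small software pipeline. The code below is a minimal, hypothetical mock-up of the steps described; the function names, data structures, and numbers are placeholders invented for this sketch, not GTRI's actual software.

```python
"""Minimal sketch of the per-bird cut-planning sequence described above.
All names, data, and numbers are hypothetical placeholders, not GTRI code."""

import numpy as np

def capture_landmarks(bird_id: int) -> np.ndarray:
    # Steps 1-2: image the bird and measure 3D points on its exterior.
    # Here we simply return made-up landmark coordinates (metres).
    rng = np.random.default_rng(bird_id)
    return rng.normal(loc=[0.0, 0.0, 0.3], scale=0.01, size=(8, 3))

def estimate_cut_geometry(landmarks: np.ndarray) -> np.ndarray:
    # Step 3: derive a cut path from the external measurements.
    # Placeholder: a short straight path offset from the landmark centroid.
    centre = landmarks.mean(axis=0)
    return np.array([centre + [0.00, -0.02, 0.0],
                     centre + [0.00, +0.02, 0.0]])

def solve_knife_pose(cut_path: np.ndarray) -> dict:
    # Step 4: the position and direction the knife must adopt for this cut.
    direction = cut_path[-1] - cut_path[0]
    return {"position": cut_path[0],
            "direction": direction / np.linalg.norm(direction)}

def process_bird(bird_id: int) -> None:
    landmarks = capture_landmarks(bird_id)
    cut_path = estimate_cut_geometry(landmarks)
    pose = solve_knife_pose(cut_path)
    # Step 5: align bird and cutting robot, then execute the cut.
    print(f"bird {bird_id}: knife at {pose['position'].round(3)}, "
          f"cutting along {pose['direction'].round(3)}")

if __name__ == "__main__":
    for bird in range(3):
        process_bird(bird)
```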
Measuring 3D locations
According to Michael Matthews, GTRI research engineer, the vision system works by measuring the 3D locations of various points on the outside of the bird. Using these points as inputs, algorithms estimate the positions of internal structures (bones, ligaments, etc.), which in turn define the proper cut.
“Since everything is registered to calibrated reference frames, we are able to solve for all cut geometries and precisely align the bird and the cutting robot to make these cuts. So far, our statistics research shows that our external measurements correlate very well to the internal structure of the birds, and therefore, will transition to ideal cutting paths,” adds Matthews.
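As a rough, hypothetical illustration of how external measurements might be mapped to internal structure, the sketch below fits a simple linear model from external landmark coordinates to an internal joint position. The synthetic data and the linear form are assumptions made for the example, not a description of GTRI's statistical model.

```python
"""Sketch: predicting an internal joint position from external landmarks.
Synthetic data and a plain linear model are assumptions for illustration."""

import numpy as np

rng = np.random.default_rng(0)

# Training data: external landmark measurements (flattened x,y,z of 8 points)
# and the corresponding internal shoulder-joint position for 50 birds.
n_birds, n_landmarks = 50, 8
X = rng.normal(size=(n_birds, n_landmarks * 3))              # external points
true_W = rng.normal(scale=0.1, size=(n_landmarks * 3, 3))    # hidden relationship
Y = X @ true_W + rng.normal(scale=0.002, size=(n_birds, 3))  # internal joint

# Fit a least-squares linear map from external measurements to joint position.
X1 = np.hstack([X, np.ones((n_birds, 1))])   # add an intercept column
W, *_ = np.linalg.lstsq(X1, Y, rcond=None)

# Predict the internal joint location for a new bird from its external scan.
x_new = rng.normal(size=(1, n_landmarks * 3))
joint_estimate = np.hstack([x_new, [[1.0]]]) @ W
print("estimated internal joint position (m):", joint_estimate.round(4))
```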
The system’s force feedback algorithm can detect the transition from tissue to bone and move the system’s cutting knife around the surface of the bone while maintaining a constant force. Since the ligaments are attached to the bone, maintaining contact with the bone will allow the system to cut all the ligaments around the shoulder joint. This allows the knife to continue making the best cut possible without actually cutting the bone itself. The idea is similar for other cuts that separate meat from bone, allowing meat to be optimally removed without cutting the bone.
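One way to picture this constant-force behaviour is as a simple force-regulation loop: if the measured force rises above the setpoint the knife backs off slightly, and if it falls below, the knife presses in. The sketch below simulates such a loop; the proportional control law, stiffness value, and bone profile are assumptions for illustration only, not GTRI's actual algorithm.

```python
"""Sketch: holding a knife against a bone surface at a constant force.
The control law, stiffness, and bone profile are illustrative assumptions."""

import numpy as np

F_TARGET = 2.0       # desired contact force against the bone (N)
K_P = 1e-4           # proportional gain (metres of correction per N of error)
STIFFNESS = 5000.0   # assumed bone contact stiffness (N/m)

def bone_height(x: float) -> float:
    """Assumed bone surface height along the cut direction (m)."""
    return 0.02 + 0.005 * np.sin(40.0 * x)

def contact_force(z: float, x: float) -> float:
    """Force grows linearly with penetration below the bone surface."""
    penetration = bone_height(x) - z
    return max(0.0, STIFFNESS * penetration)

z = 0.03  # knife tip height; starts clear of the bone
for x in np.linspace(0.0, 0.1, 2000):   # advance 10 cm along the joint
    force = contact_force(z, x)
    # Too much force -> lift the knife; too little -> press in further.
    z += K_P * (force - F_TARGET)

print(f"final contact force: {contact_force(z, 0.1):.2f} N "
      f"(target {F_TARGET} N)")
```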
Detour around bone
“The algorithm makes use of a force sensor affixed to the knife handle. During a cutting operation, the sensor enables the robot to detect imminent contact with a bone and to take an appropriate detour around the bone, instead of cutting straight through it. Cutting through bone would generate bone chips, which pose a safety hazard. Fine-tuning is needed to adjust the force thresholds so the system can tell the difference between meat, tendon, ligaments, and bone, each of which has different material properties,” explains Ai-Ping Hu, GTRI research engineer.
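A simplified view of that threshold tuning is a lookup that maps the measured cutting force to the material most likely under the knife. The threshold values below are invented for illustration; in a real system they would have to be tuned empirically, as Hu notes.

```python
"""Sketch: telling meat, tendon, ligament, and bone apart by cutting force.
The threshold values are invented for illustration, not measured data."""

# Hypothetical force thresholds in newtons (ascending order of resistance).
FORCE_THRESHOLDS_N = [
    (0.5, "meat"),
    (1.5, "tendon"),
    (3.0, "ligament"),
]
BONE_LABEL = "bone"   # anything stiffer than the last threshold

def classify_material(force_n: float) -> str:
    """Return the material the knife is most likely cutting through."""
    for threshold, label in FORCE_THRESHOLDS_N:
        if force_n < threshold:
            return label
    return BONE_LABEL

if __name__ == "__main__":
    for f in (0.2, 1.0, 2.4, 5.0):
        material = classify_material(f)
        action = "detour around it" if material == BONE_LABEL else "keep cutting"
        print(f"{f:4.1f} N -> {material:8s} -> {action}")
```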
With these accomplishments, researchers are now focused on demonstrating the system’s ability to perform cuts from the clavicle bone to the shoulder and from the shoulder to the scapula bone. This will allow them to determine the system’s performance relative to manual deboning.
The prototype (see photo) uses a fixed 2-degree-of-freedom cutting robot (the robot with the knife) for making simple planar cuts, while the bird is mounted on a 6-degree-of-freedom ABB robot arm that allows the bird and cutting robot to be aligned into any desired pose. The ABB robot arm places the bird under the vision system, and after the cut geometries and the required bird-knife relative pose are determined, it moves the bird into place with respect to the cutting robot. The cutting robot then executes a simple planar (2D) cut, using coordinates obtained from the vision system for that specific cut.
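The alignment step can be described with homogeneous transforms: given the calibrated pose of the fixed cutting robot and the bird-to-knife relative pose computed by the vision system, the arm must place the bird at a specific pose in the world frame. The sketch below works through that calculation with made-up matrices; none of the values are real calibration data.

```python
"""Sketch: solving for the bird pose the 6-DOF arm must reach so that the
fixed planar cutting robot ends up correctly placed relative to the bird.
All matrices are illustrative values, not real calibration data."""

import numpy as np

def pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Fixed, calibrated pose of the cutting robot's knife in the world frame.
T_world_knife = pose(np.eye(3), np.array([1.20, 0.00, 0.90]))

# Desired pose of the knife relative to the bird, computed per bird by the
# vision system (here: 30 degrees about the bird's z-axis, offset near the
# shoulder joint -- made-up numbers).
angle = np.deg2rad(30.0)
R = np.array([[np.cos(angle), -np.sin(angle), 0.0],
              [np.sin(angle),  np.cos(angle), 0.0],
              [0.0,            0.0,           1.0]])
T_bird_knife = pose(R, np.array([0.05, 0.02, 0.10]))

# The arm must place the bird so that T_world_bird @ T_bird_knife == T_world_knife,
# hence T_world_bird = T_world_knife @ inv(T_bird_knife).
T_world_bird = T_world_knife @ np.linalg.inv(T_bird_knife)

np.set_printoptions(precision=3, suppress=True)
print("target bird pose for the 6-DOF arm:\n", T_world_bird)
```

In practice, the solved bird pose would then be converted into joint targets for the ABB arm by its own inverse kinematics.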
Data driven process
“This prototype allows us to test all possible cut geometries, which should enable us to design a smaller and more simplified final system. Such a system would be more in line with a conventional cone conveyor, including a vision system and simple cutting robots,” says Matthews.
“The poultry plant of the future will be a data-driven process that optimises the operations to maximise yield and minimise costs for each and every bird. This project is another step into that future. At GTRI we are on our way to work with our partners to make this future a reality,” adds McMurray.
Reprinted from PoultryTech, a publication of the Agricultural Technology Research Programme of the Georgia Tech Research Institute, a programme conducted in cooperation with the Georgia Poultry Federation with funding from the Georgia Legislature.