DK201901374A1 - System for cutting and trimming meat cuts - Google Patents


Info

Publication number
DK201901374A1
Authority
DK
Denmark
Prior art keywords
workpiece
starting material
processing means
operator
cutting
Application number
DKPA201901374A
Inventor
Andreas Holger Dirac Paul
Bager Christensen Lars
Toftelund Madsen Niels
Askjær Hass Morten
Original Assignee
Teknologisk Inst
Application filed by Teknologisk Inst filed Critical Teknologisk Inst
Priority to DKPA201901374A priority Critical patent/DK180419B1/en
Application granted granted Critical
Publication of DK201901374A1 publication Critical patent/DK201901374A1/en
Publication of DK180419B1 publication Critical patent/DK180419B1/en

Classifications

    • A HUMAN NECESSITIES
    • A22 BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22C PROCESSING MEAT, POULTRY, OR FISH
    • A22C17/00 Other devices for processing meat or bones
    • A22C17/0073 Other devices for processing meat or bones using visual recognition, X-rays, ultrasounds, or other contactless means to determine quality or size of portioned meat
    • A22C17/0086 Calculating cutting patterns based on visual recognition
    • A HUMAN NECESSITIES
    • A22 BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B SLAUGHTERING
    • A22B5/00 Accessories for use during or after slaughtering
    • A22B5/0064 Accessories for use during or after slaughtering for classifying or grading carcasses; for measuring back fat
    • A22B5/007 Non-invasive scanning of carcasses, e.g. using image recognition, tomography, X-rays, ultrasound

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Food Science & Technology (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention provides a system and a related method for facilitating instructions to the executing operator about cutting and trimming meat cuts in slaughterhouses.

Description

DK 2019 01374 A1 1
SYSTEM FOR CUTTING AND TRIMMING MEAT CUTS
TECHNICAL FIELD
The present invention provides a system and a related method for facilitating instructions to the executing operator about cutting and trimming meat cuts in slaughterhouses.
BACKGROUND ART
When cutting carcasses and parts of carcasses, it is often a requirement that the resulting products must have certain predefined dimensions. For example, pork bellies are often cut and trimmed to have length and width dimensions specified by the customer. Often the size is important for the customer due to certain size requirements of further processing equipment.
The trimming is often done manually. To assist the operator to obtain the correct product size, a frame or a template, with the right dimensions, is placed upon the belly to guide the operator to trim away the parts outside the frame.
However, the use of this frame has several disadvantages. The operator must carry the frame and use time to place the frame onto the belly. If the belly dimensions are not suited for the intended product, i.e. do not comply with the frame dimensions, the operator cannot utilize the frame correctly, and will typically not process the belly.
The frame is usually drawn up to one side of the belly and frequently squeezes the belly together, which results in the belly being insufficiently trimmed, i.e. it will be larger than specified.
Contact between the frame and the underlying conveyor can result in the generation of foreign bodies, i.e. plastic fragments from the conveyor belt, which may end up as foreign objects in the products, thus reducing quality and resulting in consumer complaints.
Finally, the operator must himself decide how to place the frame on the belly and decide in what order to trim off excess product from the different sides of the belly, which can result in a suboptimal value of the trimmings.
Augmented Reality (AR) is an emerging technology for supporting industrial service and maintenance operations. The potential includes hands-free support for complicated procedures, and updated assistance for performing a specific maintenance task, e.g. replacement of a defective component.
WO2008102148 describes a method for handling food pieces by subjecting a food piece to an X-ray analysis for detection of foreign objects in the food piece, followed by
transportation of the food piece from the analysis position to an operator, and providing the operator with an indication of the location of the foreign object by a "static" marking on the food piece on the fly.
WO2014079448 describes a method for removing undesired parts or objects in pieces of meat, in which method the operator is assisted by an image on a screen that the operator carries in front of his eyes, and on which screen a (static) representation of the objects in question is projected.
However, a system for cutting and trimming meat cuts assisted by a dynamic presentation of suggested cutting lines as described herein has not been disclosed.
SUMMARY OF THE INVENTION
The present invention provides a system for assisting an operator in the cutting and trimming of meat cuts, including removal of foreign objects and/or trimming/cutting according to dimension requirements.
In contrast to known systems for representing cutting lines on a moving meat target, which are based on a static representation relative to the position of the meat piece as it was when the image was recorded, and which as a result can only project the image correctly onto the moving workpiece as long as it remains in the same relative position on the moving conveyor, the system of the present invention allows for a dynamic presentation of proposed cutting lines: the projected image aligns to, and fits with, the actual position of the workpiece in question, even if the workpiece has moved and changed position on the conveyor, e.g. as a result of the operator's work on the workpiece.
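The dynamic presentation described above amounts to re-anchoring the stored cutting lines to the workpiece's most recently observed pose. The following is a minimal sketch only; the function name and the (x, y, heading) pose representation are illustrative assumptions, not taken from the patent:

```python
import math

def align_overlay(points, pose):
    """Transform cutting-line points from the workpiece's own frame into
    projector coordinates, given the workpiece's current pose
    (x, y, heading in radians) as reported by the tracking camera.
    Hypothetical helper; pose format is an assumption for illustration."""
    x0, y0, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    return [(x0 + c * px - s * py, y0 + s * px + c * py) for px, py in points]

# The same stored cutting line follows the workpiece even after it has
# been shifted or rotated on the conveyor.
line = [(0.0, 0.0), (100.0, 0.0)]
print(align_overlay(line, (250.0, 40.0, 0.0)))        # [(250.0, 40.0), (350.0, 40.0)]
print(align_overlay(line, (250.0, 40.0, math.pi/2)))  # rotated 90 degrees, approx. [(250, 40), (250, 140)]
```

A static system would apply only the first transform, recorded at imaging time; a dynamic system recomputes the pose every frame.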
The system of the present invention may be applied at abattoirs, for processing incoming carcasses or parts of carcasses intended for being cut into meat pieces of a certain size and quality. The workpiece may in particular be a carcass, or a part thereof, e.g. a pork belly.
The system of the invention may also be used for detecting quality defects, such as the occurrence of foreign objects on the surface of the incoming meat item, and advise the operator of the occurrence and location of such quality defects; it may also propose preferred cutting lines, and even a preferred cutting sequence.
In addition, the system of the invention may include the use of AR-technology, e.g. by projecting the proposed cutting lines onto a screen in front of the operator.
Thus, in its first aspect, the invention provides a system for displaying instructions to a meat processing operator, which system is described in more detail below.
In another aspect, the invention provides a method for presenting information to a meat processing operator, which method is described in more detail below.
Other objects of the invention will be apparent to the person skilled in the art from reading the following detailed description and accompanying drawings.
Any combination of two or more of the embodiments described herein is considered within the scope of the present invention.
DETAILED DISCLOSURE OF THE INVENTION
The system of the invention
In its first aspect, the invention provides a system for assisting an operator (6) in the cutting and trimming of meat cuts, e.g. according to dimension requirements, or for detecting and correcting quality defects, such as the occurrence of foreign objects.
The system may be applied at abattoirs, for processing incoming carcasses or parts of carcasses intended for being cut into meat pieces of a certain size. The workpiece/starting material/meat item/target product (7) may in particular be a carcass, or a part thereof, e.g. a pork belly.
The system of the present invention may be characterised by comprising:
an in-let conveyor belt (2), comprising a position sensor in communication with the processing means (4), and capable of conveying the workpiece/starting material (7);
a machine vision device (3), comprising at least two camera stations (3A, 3B), each station in communication with the processing means (4), configured for identification of the incoming workpiece/starting material (7) and for making size measurements, and capable of assessing the dimensions of the workpiece/starting material (7) and transmitting the digitalised data to the processing means (4);
a processing means (4), in communication with the in-let conveyor (2), the machine vision device (3) and the projecting means (5), and capable of: calculating the speed of the in-let conveyor belt (2), to keep the projection means (5) synchronized with the conveyor belt (2); assessing the dimensions of the workpiece/starting material (7) and evaluating the compliance of each starting material's dimensions with the product requirements; calculating the optimal cutting/trimming pattern in view of the specifications for the final product/cut; and recognizing and tracking the workpiece/starting material/target product (7) while it is being transported from the site of measurement (3) to the location of the executing cutting operator (6); and
a projection means (5), in communication with the in-let conveyor (2) and the processing means (4), and capable of displaying an over-layer of information, on the workpiece/target product (7) to be processed, to the executing cutting operator (6), while the workpiece/target product (7) is being conveyed;
wherein the first camera station (3A) feeds data to the processing means (4) for identifying and locating the incoming workpiece/starting material (7) and for calculating cutting lines, and optionally also identifies foreign objects, if present, and notes their position on the surface of the incoming workpiece/starting material (7); and the second camera station (3B), in a dynamic process, feeds data to the processing means (4) for identifying and locating the incoming workpiece/starting material (7), and communicates the current position of the workpiece/starting material (7), the proposed cutting lines, and the location of foreign objects, if any, to the projecting means (5).
In this way the system of the invention is capable of generating a vision-based loop: the second camera station (3B) operates in a dynamic process involving data from the first camera station (3A) that are analysed and computed by the processing means (4), and the result is transmitted to the projecting means (5). This allows for "real-time" feed-back via the projecting means, thus propagating instructions to the operator.
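The loop above can be sketched as a per-frame cycle: locate the workpiece, re-anchor the cutting lines, project. In this toy sketch (all names hypothetical) each camera frame is reduced to a simple (dx, dy) offset of the workpiece for brevity:

```python
def vision_loop(frames, cutting_line, project):
    """One cycle per frame from the second camera station: take the
    workpiece's current offset, shift the stored cutting line to match,
    and hand the overlay to the projecting means.  'frames' stands in
    for camera 3B's output; a real system would extract the pose from
    an image instead of receiving it directly."""
    for dx, dy in frames:                        # poses seen by camera 3B
        overlay = [(x + dx, y + dy) for x, y in cutting_line]
        project(overlay)                         # real-time feedback to the operator

drawn = []
vision_loop([(0, 0), (5, 2)], [(0, 0), (100, 0)], drawn.append)
print(drawn[-1])  # overlay tracks the moved workpiece: [(5, 2), (105, 2)]
```

The key property is that the overlay is recomputed every frame, so an operator pushing the workpiece sideways does not break the projected guidance.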
In one embodiment, the system further comprises one or more buffer-conveyors (2A) and/or an outlet-conveyor (2B).
In another embodiment, the system of the invention further comprises a means for storing information (a server/database) (9).
In a third embodiment, the projection means (5) for use according to the invention is a projector capable of over-laying an image onto the workpiece/target product (7).
In a fourth embodiment, the projection means (5) for use according to the invention is a printer or a laser marker.
In a fifth embodiment, the projection means (5) for use according to the invention is a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene.
In a further embodiment, the system of the present invention may be characterised by comprising: an in-let conveyor belt (2), in communication with the processing means (4), and capable of conveying the starting material (7); a machine vision device (3), in communication with the processing means (4), configured for making size measurements, and capable of assessing the dimensions of the starting material (7) and transmitting the digitalised data to the processing means (4); a processing means (4), in communication with the conveyor (2), the machine vision device (3), and the projecting means (5), and capable of: calculating the speed of the in-let conveyor belt (2), to keep the projection means (5) synchronized with the conveyor belt (2);
assessing the dimensions of the starting material (7) and evaluating the compliance of each starting material's dimensions with the product requirements; calculating the optimal cutting/trimming pattern in view of the specifications for the final product/cut; and recognizing and tracking the starting material (7) while it is being transported from the site of measurement (3) to the location of the executing cutting operator (6); and a projection means (5), in communication with the in-let conveyor (2) and the processing means (4), and capable of displaying an over-layer of information, on the meat item (7) to be processed, to the executing cutting operator (6), while the meat item (7) is being conveyed.
In one embodiment, the system of the present invention may be characterised as described above, wherein: the machine vision device (3) comprises two camera stations (3A, 3B), each station, in communication with the processing means (4), configured for identification of the incoming starting material (7); and wherein the first camera station (3A) provides data to the processing means (4) for identification of the incoming starting material (7) and for calculating cutting lines, and optionally also identifies foreign objects, if present, and notes their position on the surface of the incoming starting material (7); and the second camera station (3B) provides data to the processing means (4) for identification and location of the incoming starting material (7), and communicates the position of cutting lines and foreign objects, if any, to the projecting means (5).
The in-let conveyor
For the processing means (4) to be able to calculate and track the position of the incoming workpiece/starting material (7) while in motion, and to keep the projection means (5) synchronized with the conveyor belt (2), the system must know the speed of the conveyor belt (2). This may be accomplished in various ways.
The in-let conveyor belt (2) for use according to the invention may be equipped with a position sensor, in communication with, and receiving operational guidance from, the processing means (4). This ensures that the system can perform certain functions, such as controlling (determining and adjusting) the speed and synchronizing with the projection means (5). There are four widely used methods of applying encoders to conveyors: motor mount, roller shaft mount, belt/chain driven and surface mount.
Any type of encoder may be employed according to the invention.
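For a shaft-mounted incremental encoder, belt speed follows directly from the pulse count over a sampling interval. A minimal sketch, assuming a roller of known circumference (the function name and parameters are illustrative):

```python
def belt_speed(counts, pulses_per_rev, roller_circumference_mm, dt_s):
    """Convert encoder pulses observed over dt_s seconds into belt speed
    in mm/s, assuming the encoder is mounted on a drive roller of the
    given circumference (hypothetical setup for illustration)."""
    revolutions = counts / pulses_per_rev
    return revolutions * roller_circumference_mm / dt_s

# 1024-pulse encoder on a 300 mm roller, 512 pulses seen in 0.5 s:
print(belt_speed(512, 1024, 300.0, 0.5))  # 300.0 mm/s
```

The same counts, accumulated rather than differenced, give the belt position used to keep the projection synchronized with a workpiece travelling downstream.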
In another embodiment, the system of the invention comprises more conveyors, e.g. one or more buffer-conveyors (2A), and/or an outlet-conveyor (2B).
After identification of the incoming workpiece/starting material (7) at the first camera station (3A), which feeds the obtained ID to the processing means (4), the workpiece may be removed from the system and stored on a (buffer) conveyor belt (2A) before it is re-introduced to the system for continued processing and identification/monitoring by a second (or further) camera station (3B). This may e.g. be accomplished by use of an outlet conveyor (2B).
The machine vision device
The machine vision device (3) for use according to the invention may be any machine vision device configured for making size measurements and capable of assessing the dimensions of the workpiece/starting material (7) while in motion.
For performing such operations, the vision device shall be in communication with, and be able to transmit data to, the processing means (4). In one embodiment, the vision device (3) comprises two camera stations: one camera station (3A) for identifying the arriving workpiece/starting material (7) and communicating image data to the processing means (4), and another camera station (3B) for tracing the conveyed workpiece/starting material (7) and communicating image data to the processing means (4).
Based on the image data established by the first camera station (3A), the processor (4) extracts characteristics associated with the meat product in question (7) and calculates optimal cutting lines, and, optionally, also identifies foreign objects, if present, and notes their position on the meat surface. Computed cutting lines, along with the specific location of foreign objects, if any, are reported to the projection means (5), which projects the findings onto the workpiece/target product (7), on the fly, in the form of over-lay lines.
Over-lays may e.g. be projected in different colours, for assisting the operator (6) in producing an optimized yield of the raw material by cutting along the projected lines in a specific order (e.g. green before red), and for performing a corrective action by removing any foreign objects detected as surface contamination.
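The colour-coded cutting order can be sketched as a simple mapping from cut sequence to overlay colour. This is an illustrative sketch only; the palette and the "green before red" convention follow the example in the text, while the function name is an assumption:

```python
def order_overlays(cut_lines):
    """Assign colours so the operator performs the cuts in the intended
    order: green first, then yellow, then red (and red for any further
    cuts).  Each cut line is a list of points; only the colour tag is
    added here."""
    palette = ["green", "yellow", "red"]
    return [(palette[min(i, len(palette) - 1)], line)
            for i, line in enumerate(cut_lines)]

# Three trim lines around a pork belly, in the order they should be cut:
cuts = [[(0, 0), (0, 90)], [(0, 90), (120, 90)], [(120, 90), (120, 0)]]
for colour, line in order_overlays(cuts):
    print(colour, line)
```

In a real system the cut order itself would come from the value-optimisation step in the processing means; the colour assignment is only the presentation layer.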
As the workpiece/target product (7) is being moved towards the processing area (8), the second camera station (3B), that is positioned above the processing area (8), identifies the incoming workpiece/target product (7), based on the information obtained by the first camera station and recorded by the processing means (4), and communicates the location and position of the moving workpiece/target product (7) to the projection means (5), in the form of cutting lines and the position of any foreign object to be removed.
In this way the second camera station (3B), by employing feed-back/loop processes in cooperation with the processing means (4) and the projecting means (5), is able to provide dynamic cutting lines to the operator (6).
Preferably, the machine vision device (3) for use according to the invention is a multispectral vision device. A multispectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, i.e. infrared and ultra-violet. Spectral imaging can allow extraction of additional information the human eye fails to capture with its receptors for red, green and blue. Multispectral imaging measures light in a small number (typically 3 to 15) of spectral bands.
Multispectral imaging can also be accomplished by using camera(s) sensitive to all the relevant spectral bands and sequentially illuminating the object with each spectral band, while capturing a frame for each band.
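The sequential-illumination scheme just described can be sketched as a loop over spectral bands: light one band, grab one frame. The lamp controller and camera below are toy stand-ins (all names are assumptions for illustration):

```python
def capture_multispectral(bands, illuminate, grab_frame):
    """Sequentially switch on one narrow-band light source at a time and
    grab a frame for each band; the result is a band-indexed image stack.
    'illuminate' and 'grab_frame' are hypothetical hardware callbacks."""
    stack = {}
    for band_nm in bands:
        illuminate(band_nm)          # only this band is lit
        stack[band_nm] = grab_frame()
    return stack

# Toy stand-ins for the lamp controller and the camera:
lit = []
frames = iter(["frame@450", "frame@660", "frame@850"])
stack = capture_multispectral([450, 660, 850], lit.append, lambda: next(frames))
print(stack[850])  # frame@850
```

On a moving conveyor the bands must be cycled fast enough that the workpiece displacement between frames stays below the required registration accuracy, or the frames must be re-registered afterwards.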
The machine vision device (3) for use according to the invention may comprise a light source capable of emitting electromagnetic waves in the ranges 350 nm to 700 nm (from the UV to the visible region) and 700 nm to 950 nm (i.e. NIR), and a sensor capable of receiving electromagnetic waves within the same ranges (i.e. 350 nm to 950 nm), and may be a device as described in e.g. WO 2017/121713.
Machine vision systems essentially come in three main categories: 1D vision analyses a digital signal one line at a time instead of looking at a whole picture at once, and is accomplished by use of a line-scan camera; 2D vision looks at the whole picture, and may be accomplished by use of an industrial camera/area-scan camera, e.g. a multispectral RGB (visible/colour images) camera; and 3D vision systems typically comprise multiple cameras or one or more laser displacement sensors. Commercially available 3D scanners or range cameras include time-of-flight cameras, structured-light cameras and stereo/3D cameras, e.g. the Sick 3D ruler.
In one embodiment, the vision device of the first camera station (3A) is a line-scan camera.
In one embodiment, the vision device of the second camera station (3B) is an industrial 2D multispectral RGB camera, or area scan camera.
In one embodiment, the machine vision device (3) for use according to the invention is calibrated for correct size measurement.
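Size calibration reduces to establishing a scale factor from a reference target of known dimensions in the camera's field of view. A minimal sketch (function names and the flat-target assumption are illustrative; a real calibration would also correct lens and perspective distortion):

```python
def calibrate_mm_per_px(reference_length_mm, measured_length_px):
    """Derive the scale factor from a reference target of known size
    imaged at the working distance (assumes a flat scene and negligible
    lens distortion, which is a simplification)."""
    return reference_length_mm / measured_length_px

def measure_mm(length_px, mm_per_px):
    """Convert a pixel measurement on the conveyor plane to millimetres."""
    return length_px * mm_per_px

scale = calibrate_mm_per_px(500.0, 1000.0)   # a 500 mm target spans 1000 px
print(measure_mm(840.0, scale))              # a belly 840 px long measures 420.0 mm
```

The calibration must be repeated if the camera height or optics change, since the scale factor is only valid for the plane of the conveyor belt.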
The processing means
The processing means (4) for use according to the invention may be any commercially available processor/PC, in communication with the conveyor (2), the machine vision device (3), and the projecting means (5), and shall be capable of: calculating the speed of the in-let conveyor belt (2), to keep the projection means (5) synchronized with the conveyor belt (2); assessing the dimensions of the workpiece/starting material (7) and evaluating the compliance of each starting material's dimensions with the product requirements; calculating the optimal cutting/trimming pattern in view of the specifications for the final product/cut; and recognizing and tracking the workpiece/target product (7) while it is being transported from the site of measurement (3) to the location of the executing cutting operator (6).
The processing means (4) used according to the invention may also be used for calculating the optimal cutting lines, and for locating existing foreign objects, if any.
Identification and feature extraction may be accomplished using pattern recognition and digital image processing techniques.
Means for storing information
In a particular embodiment, the method of the invention further comprises a step (c1) wherein the digitalised data analysed in step (c) by the processing means (4) is transmitted to a means for storing information (server/database) (9).
According to this embodiment, the processing means (4) used according to the invention may be in communication with a means for storing information, i.e. a central server or database (9). This central database (9) may contain pre-loaded information related to the products in question, e.g. origin of the product, product ID, product specifications, etc.
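Matching a measured workpiece against pre-loaded specifications can be sketched as a lookup in such a catalogue. The catalogue keys, field names and dimension ranges below are purely illustrative assumptions, not taken from the patent:

```python
# Hypothetical pre-loaded catalogue on the server/database (9); the keys
# and specification ranges are invented for illustration.
CATALOGUE = {
    "pork-belly-std": {"length_mm": (480, 520), "width_mm": (280, 320)},
    "pork-belly-xl":  {"length_mm": (540, 600), "width_mm": (300, 360)},
}

def match_product_id(length_mm, width_mm):
    """Return the first catalogue entry whose specification ranges contain
    the measured dimensions, or None if nothing fits."""
    for pid, spec in CATALOGUE.items():
        lo_l, hi_l = spec["length_mm"]
        lo_w, hi_w = spec["width_mm"]
        if lo_l <= length_mm <= hi_l and lo_w <= width_mm <= hi_w:
            return pid
    return None

print(match_product_id(500, 300))  # pork-belly-std
```

The returned ID is what the processing means would allocate to the workpiece and write back to the server/database (9) for traceability.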
After receipt of the digitalised data from the vision device (3), the processing means (4) can establish a product ID, e.g. by reference to a product catalogue or product specifications stored on the server (9), and the product ID may be allocated to each workpiece/target product (7) and transmitted to the server/database (9) for further action/use.
The projection means
By displaying an over-layer of information onto the workpiece/target product (7) to be processed, while in motion and lying on the conveyor belt (2), the projection means (5) for use according to the invention shall assist the executing operator (6) in cutting and/or trimming the meat item (7) in question.
For fulfilling its tasks, the projection means (5) shall be in communication with the in-let conveyor (2) and the processing means (4), and shall receive guidance from the processing means (4). A projection means (5) for use according to the invention may be a monitor, a projector, a headlight, a laser, or a printer, capable of creating and/or marking a guiding cutting/trimming pattern on the workpiece/target product (7) to be processed, but may also include augmented reality (AR) equipment (5A), e.g. smart-glasses or the like, which over-lay an image of the cutting curves onto the meat item (7). Via the projection means (5), the system may provide the executing cutting operator (6) with information about the cutting and trimming of meat cuts according to dimension requirements, or about quality defects such as the occurrence of foreign objects that shall be removed.
The projector may also display information about the order in which the operator (6) should carry out the suggested cuts and trims.
In one embodiment, the projection means (5) for use according to the invention is a projector capable of over-laying an image onto the workpiece/target product (7). In another embodiment, the projection means (5) for use according to the invention is a printer or a laser marker.
In a third embodiment, the projection means (5) for use according to the invention is a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene.
In a fourth embodiment, instructions about optimal cutting curves are displayed as an over-layer onto the workpiece/target product (7). In a fifth embodiment, instructions about the optimal sequence for cutting the curves are displayed as an over-layer onto the workpiece/target product (7). In a sixth embodiment, instructions about the optimal sequence by which cuts are performed are displayed by means of different colours or patterns for different curves.
In a seventh embodiment, instructions about the optimal cutting sequence are presented by displaying the cutting curves in a sequence.
In an eighth embodiment, instructions about the locations of foreign objects are displayed as an over-layer onto the workpiece/target product (7). In a ninth embodiment, the instructions displayed concern the location of excess fat, cartilage and/or bone to be removed from the piece.
The method of the invention
In another aspect, the invention provides a method for presenting information to a meat processing operator (6), by use of the system of the invention.
The method of the invention may be characterised by comprising the subsequent steps of:
(a) receiving a workpiece/starting material (7) on a conveyor belt (2);
(b) obtaining one or more images of the incoming workpiece/starting material (7) using a machine vision device (3) comprising two consecutive camera stations (3A, 3B), and transmitting the digitalized data to a processing means (4);
(c) analysing the digitalized data obtained by the first camera station (3A) in step (b) for determining the identity (ID) and characteristics of the incoming workpiece/starting material (7), and optionally for calculating cutting lines and the location of existing foreign objects, if any, in communication with the processing means (4);
(d) analysing the digitalized data obtained by the second camera station (3B) in step (b) for determining the identity (ID) and characteristics of the incoming workpiece/starting material (7), in communication with the processing means (4), and transmitting the digitalized information/instructions to a projection means (5);
(e) using the projection means (5) of step (d) for presenting the cutting lines and the location of foreign objects, if any, calculated in step (c) to the meat processing operator (6) in the form of an over-layer of the information on the workpiece/starting material (7) to be processed.
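The steps above can be sketched end-to-end as one pipeline function. Everything here is a toy stand-in (the analysis step and the pose returned by station 3B are hard-coded assumptions, not the patent's actual algorithms):

```python
def analyse(image):
    """Stand-in for step (c): derive an ID and a fixed trim line from the
    first camera station's image.  A real system would use feature
    extraction here."""
    return image["id"], [(0, 0), (100, 0)]

def run_pipeline(workpiece, station_a, station_b, project):
    """Steps (a)-(e) in order: image at station 3A, analyse for ID and
    cutting lines, re-locate at station 3B, shift the lines to the
    current pose, and project the overlay."""
    image_a = station_a(workpiece)                 # step (b), camera 3A
    wp_id, lines = analyse(image_a)                # step (c)
    pose = station_b(workpiece)                    # step (d), camera 3B
    overlay = [(x + pose[0], y + pose[1]) for x, y in lines]
    project(wp_id, overlay)                        # step (e)

shown = {}
run_pipeline({"id": "belly-7"},
             lambda wp: {"id": wp["id"]},          # toy camera 3A
             lambda wp: (10, 5),                   # toy camera 3B pose
             lambda wp_id, ov: shown.update({wp_id: ov}))
print(shown["belly-7"])  # [(10, 5), (110, 5)]
```

Note that the cutting lines computed in step (c) are only positioned at projection time, using the pose from step (d); that separation is what makes the presentation dynamic.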
In one embodiment, the method of the invention further comprises a step (c1) wherein the digitalised data analysed in step (c) by the processing means (4) is transmitted to a means for storing information (server/database) (9).
In another embodiment, the system further comprises one or more buffer-conveyors (2A) and/or an outlet-conveyor (2B).
In a third embodiment, the digitalized data obtained by the first camera station (3A) is analysed in step (c) by feature extraction techniques.
In a fourth embodiment, the projection means (5) for use according to the invention is a projector capable of over-laying an image onto the workpiece/target product (7).
In a fifth embodiment, the projection means (5) for use according to the invention is a printer or a laser marker.
In a sixth embodiment, the projection means (5) for use according to the invention is a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene.
In a further embodiment, the invention provides a method for presenting information to a meat processing operator (6), which method comprises the subsequent steps of:
(a) receiving the starting material (7) on a conveyor belt (2);
(b) obtaining one or more images of the incoming starting material (7) using a machine vision device (3), and transmitting the digitalized data to a processing means (4);
(c) analysing the digitalized data obtained in step (b) using a processing means (4), and transmitting the digitalized information/instructions to a projection means (5);
(d) using a projection means (5), presenting the information/instructions obtained in step (c) to the meat processing operator (6) in the form of an over-layer of the information on the meat item (7) to be processed.
In one embodiment, the method of the invention for presenting information to a meat processing operator (6) comprises the subsequent steps of:
(a) receiving a meat item/starting material/target product (7) on a conveyor belt (2);
(b) obtaining one or more images of the incoming meat item/starting material/target product (7) using a machine vision device (3) comprising two consecutive camera stations (3A, 3B), and transmitting the digitalized data to a processing means (4);
(c) analysing the digitalized data obtained by the first camera station (3A) in step (b) for defining identity and for extracting feature characteristics, and for calculating cutting lines and, optionally, the location of existing foreign objects, if any, in communication with the processing means (4);
(d) analysing the digitalized data obtained by the second camera station (3B) in step (b) for identification of the incoming meat item, in communication with the processing means (4), and transmitting the digitalized information/instructions to a projection means (5);
(e) using the projection means (5) of step (d) for presenting the information/instructions obtained in step (c) to the meat processing operator (6) in the form of an over-layer of the information on the meat item (7) to be processed.
The analysis in step (c) may be performed by the processing means (4) using pattern recognition, which is a branch of machine learning that focuses on the recognition of patterns and regularities in data, in this case data obtained from the vision device (3).
The projection means (5) used according to the method may e.g. be a projector capable of presenting (over-lay) an image onto the workpiece/target product (7), or a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene, or it may be accomplished by marking the cutting curves onto the meat item (7) by means of a printer or a laser marking or similar.
The method of the invention may provide the executing cutting operator (6) with information about the cutting and trimming of the workpiece/starting material/target product (7) according to dimension requirements, or about quality defects such as the occurrence of foreign objects that shall be removed. The projector may also display information about the order in which the operator (6) should carry out the suggested cuts and trims.
In one embodiment, a vision camera (3), calibrated for size measurements, is used to assess the dimensions of the workpiece/starting material/target product (7).
In another embodiment, the processing means (4) used according to the method makes use of algorithms to analyse data from the vision camera (3) in order to:
(a) prior to cutting, evaluate the compliance of the dimensions of each workpiece/starting material/target product (7) with the product requirements;
(b) calculate the optimum cutting curves/lines and/or cutting order to optimize the total product value (meat product and trimmings);
(c) recognize and track the workpiece/target product (7) at the location of the cutting operator; and/or
(d) display the cutting curves and/or the cutting order to the operator (6).
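The compliance check in (a) can be sketched as a simple rule: a workpiece can yield the specified product only if each dimension is at least the specified size, since meat can be trimmed down but never up. The function name, tuple layout and tolerance parameter are illustrative assumptions:

```python
def complies(dimensions_mm, spec_mm, tolerance_mm=0.0):
    """Check, prior to cutting, whether a workpiece can yield the
    specified product: each measured dimension must reach the specified
    size, optionally allowing a small shortfall via tolerance_mm.
    dimensions_mm and spec_mm are matching (length, width) tuples."""
    return all(dim + tolerance_mm >= spec
               for dim, spec in zip(dimensions_mm, spec_mm))

print(complies((520, 310), (500, 300)))  # True: can be trimmed down to spec
print(complies((480, 310), (500, 300)))  # False: too short, cannot be trimmed up
```

A workpiece failing this check would be flagged before reaching the operator, replacing the situation described earlier where the frame simply does not fit and the belly goes unprocessed.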
In a further embodiment, the method of the invention is used for providing the executing cutting operator (6) information about possible quality defects, e.g. the occurrence of foreign objects on the meat item (7).
BRIEF DESCRIPTION OF THE DRAWINGS

The present invention is further illustrated by reference to the accompanying drawings, in which:

Fig. 1 shows an embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A); A second camera station (3B); A processing means (4); A projection means (5); The executing operator (6); The workpiece/starting material (7);

Fig. 2 shows another embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A); A second camera station (3B); A processing means (4); A means for obtaining AR (5A); The executing operator (6); The workpiece/starting material (7);

Fig. 3A shows a front/side view of an embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A); A second camera station (3B); A projection means (5); The executing operator (6);

Fig. 3B shows a front view of an embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A); A second camera station (3B); A projection means (5);

Fig. 3C shows a side view of an embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A); A second camera station (3B); A projection means (5);
Fig. 4 shows another embodiment of the system according to the invention: An in-let conveyor (2); A (buffer) conveyor (2A); An outlet conveyor (2B); A first camera station (3A); A second camera station (3B); A projection means (5);

Fig. 5 shows another embodiment of the system according to the invention: An in-let conveyor (2); A machine vision device (3); A processing means (4); A projection means (5)/A means for obtaining AR (5A); The executing operator (6); The workpiece/starting material (7).

List of reference signs

In the figures, identical structures, elements or parts that appear in more than one figure are generally labelled with the same numeral in all the figures in which they appear.
1. System for displaying instructions to a meat processing operator;
2. In-let conveyor; 2A. Buffer conveyor belt; 2B. Outlet conveyor belt;
3. Machine vision device; 3A. First camera station; 3B. Second camera station;
4. Processing means;
5. Projection means; 5A. Augmented reality (AR) equipment;
6. Executing operator;
7. Workpiece/starting material/target product;
8. Processing area;
9. Means for storing information (server/database).
EXAMPLE

The invention is further illustrated with reference to the following example, which is not intended to limit the scope of the invention as claimed.

The application of Augmented Reality (AR) for speed/capacity and precision was validated in a meat production task. The candidate platforms/channels, covering contaminant detection and cutting recipe support, were benchmarked in a simulated trial environment, and the most promising channel was demonstrated in a meat production operation, with a vision sensor setup as the supporting information provider.
Four different AR platform candidates were benchmarked. Benchmarking was performed with a simulated piece of meat into which submerged contaminants were inserted to mimic bone fragments in meat.
The tested platforms included a tablet, a set of smart glasses, a video projector and a video monitor. Operator performance was measured using an HTC VIVE based tracking system with a pointer attached to the controller. From the controller, the 3D position of the pointer was recorded together with the time spent on the task, yielding a capacity and quality measure for the 3D operation of pointing out the submerged contamination.
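The capacity and quality measure described above can be computed per trial as the 3D pointing error together with the elapsed time. A minimal sketch, with hypothetical coordinates in millimetres:

```python
import math

def pointing_metrics(pointer_xyz, target_xyz, elapsed_s):
    """Quality = 3D distance between the pointed position and the true
    contaminant position; capacity = time taken for the pointing task."""
    error_mm = math.dist(pointer_xyz, target_xyz)
    return {"error_mm": error_mm, "time_s": elapsed_s}

# Hypothetical single trial: the operator pointed 5 mm off in x, 12 mm off in z.
print(pointing_metrics((105.0, 40.0, 12.0), (100.0, 40.0, 0.0), 3.2))
# -> {'error_mm': 13.0, 'time_s': 3.2}
```

Aggregating these per-trial values across operators and platforms is what allows the four AR channels to be ranked, as done in the experiment.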
Young volunteers, 48 operators in total, all novices to the specific task, were recruited. A statistically balanced experiment was set up with four groups, each assisted by a tablet, a video monitor, a video projector and a set of AR glasses (EPSON Moverio200), respectively. The pointer was tracked to give the position accuracy for pointing out the submerged target contamination. The time spent on the entire pointing procedure was recorded to give an estimated efficiency for each operator performing the task.
A setup was built with a projector as the preferred platform to illustrate the potential in a real meat production operation. The setup included a vision sensor to determine the size of the product (length and width) and the position of any surface contamination (Dyna-CQ™, available from DMRI, Denmark). The results from the vision sensor are communicated to the operator using tracking software for recognition and tracking of each product. Finally, the relevant cutting instruction and the calculated position of the detected surface contamination (e.g. polymer fragments from wrapping films or carrier trays) are presented to the operator via a video projector.
Input to the tracking/identification software is provided by a standard RGB camera placed in a fixed position relative to the projecting device.
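With the camera and projector in a fixed relative position, a camera-pixel coordinate can be mapped into projector pixels by a single planar transform calibrated once. The sketch below uses a plain affine transform for brevity (a full homography would additionally handle perspective); the coefficients are illustrative, not calibration results.

```python
# Hypothetical camera-to-projector affine calibration (row-major 2x3 matrix):
CAM_TO_PROJ = ((1.25, 0.0, -40.0),   # x' = 1.25*x + 0.0*y - 40
               (0.0, 1.25, -30.0))   # y' = 0.0*x + 1.25*y - 30

def to_projector(cam_xy):
    """Map a point from camera pixels to projector pixels so that an overlay
    drawn at the returned position lands on the tracked meat item."""
    x, y = cam_xy
    (a, b, tx), (c, d, ty) = CAM_TO_PROJ
    return (a * x + b * y + tx, c * x + d * y + ty)

print(to_projector((400.0, 300.0)))  # -> (460.0, 345.0)
```

In practice the transform would be estimated from point correspondences (e.g. projecting a known pattern and detecting it with the camera) rather than entered by hand.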
Results

The potential of using AR technology was demonstrated using a pilot scale production setup. The Dyna-CQ™ vision sensor gives the absolute size (length and width) of the incoming meat products and detects any surface contamination. The product in the pilot experiment had to meet specific size requirements. By comparing these to the actual measured size, a cutting recipe in the form of a set of cutting lines can be calculated for each product.
The calculated cutting lines are projected onto the meat surface in order to guide the operator in performing the optimized cutting operation. The sequential order of the two cutting procedures leaves a significant volume of meat on either of the two trimmings: the caudal (hip) or the ventral (belly) part.
Actual price differences between the two parts may influence the yield of a specific cutting sequence.
In the pilot setup, the sequential order of cutting is communicated to the operator using colour codes.
The sequential order of cutting leaves the lower left-hand cut-off either on the ventral or the caudal part.
As these by-products often are priced differently, an optimization potential is available for the operator by selecting a specific cutting sequential order.
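The by-product routing decision can be sketched as picking the cutting order that sends the shared cut-off to the higher-priced trim; the prices and the cut-off weight below are invented for illustration.

```python
# Hypothetical trim prices per kilogram for the two by-product streams.
PRICE_PER_KG = {"ventral": 2.10, "caudal": 2.60}

def best_cut_order(cutoff_kg):
    """Return which trim the shared cut-off should be routed to, i.e. which
    cutting order the operator should follow, and the value gained compared
    with routing it to the cheaper trim."""
    best = max(PRICE_PER_KG, key=PRICE_PER_KG.get)
    gain = cutoff_kg * (max(PRICE_PER_KG.values()) - min(PRICE_PER_KG.values()))
    return best, round(gain, 2)

print(best_cut_order(0.8))  # -> ('caudal', 0.4)
```

This mirrors how the pilot setup communicates the preferred sequence via colour codes: the computation selects the sequence, the projection conveys it.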
Conclusion

Despite the simplicity of the simulated meat product, the results appear relevant for the meat industry, as capacity and precision are parameters with a direct impact on the financial bottom line.
Benchmarking the four AR platforms is also relevant for the meat industry, as all components of the best performing system are commercially available off-the-shelf products.
Different tracking software platforms are also available from different vendors, and the potential indicated in this experiment aligns with previously demonstrated yield improvements in fresh meat production.

CLAIMS
1. A system (1) for displaying instructions to a meat processing operator (6), which system comprises: an in-let conveyor belt (2) comprising a position sensor, in communication with the processing means (4), and capable of conveying the workpiece/starting material (7); a machine vision device (3), comprising two camera stations (3A, 3B), each station being in communication with the processing means (4), configured for identification of the incoming workpiece/starting material (7), for making size measurements, and capable of assessing the dimensions of the workpiece/starting material (7), and transmitting the digitalised data to the processing means (4); a processing means (4), in communication with the in-let conveyor (2), the machine vision device (3), and the projection means (5), and capable of: calculating the speed of the in-let conveyor belt (2), for keeping the projection means (5) synchronized with the conveyor belt (2); assessing the dimensions of the workpiece/starting material (7) and evaluating the possibility of compliance of each starting material's dimensions with product requirements; calculating the optimal cutting/trimming pattern in view of the specifications for the final product/cut; and recognizing and tracking the workpiece/starting material (7) while being transported from the site of measurement (3) to the location of the executing cutting operator (6); and a projection means (5), in communication with the in-let conveyor (2) and the processing means (4), and capable of displaying an over-layer of information, on the meat item (7) to be processed, to the executing cutting operator (6), while the workpiece/target product (7) is being conveyed; wherein the first camera station (3A) feeds data to the processing means (4) for identifying, and for locating, the incoming workpiece/starting material (7), and for calculating cutting lines, and optionally also identifies foreign objects, if present, and notes their position on the surface of the incoming workpiece/starting material (7); and the second camera station (3B), in a dynamic process, feeds data to the processing means (4) for identifying and for locating the incoming workpiece/starting material (7), and communicates the current position of the workpiece/starting material (7) and the proposed cutting lines and location of foreign objects, if any, to the projection means (5).
2. The system according to claim 1, which system further comprises one or more buffer-conveyors (2A) and/or an outlet-conveyor (2B).
3. The system according to either one of claims 1-2, wherein the projection means (5) is a projector capable of over-laying an image onto the workpiece/target product (7).
4. The system according to claim 3, wherein the projection means (5) is a printer or a laser marker.
5. The system according to either one of claims 1-2, wherein the projection means (5) is a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene.
6. A method for presenting information to a meat processing operator (6), which method comprises the subsequent steps of: (a) receiving a meat item/starting material (7) on a conveyor belt (2); (b) obtaining one or more images of the incoming meat item/starting material (7) using a machine vision device (3) comprising two consecutive camera stations (3A, 3B), and transmitting the digitalized data to a processing means (4); (c) analysing the digitalized data obtained by the first camera station (3A) in step (b) for identity and characteristics, and optionally for calculating cutting lines and the location of existing foreign objects, if any, in communication with the processing means (4); (d) analysing the digitalized data obtained by the second camera station (3B) in step (b) for identification of the incoming workpiece/starting material (7), in communication with the processing means (4), and transmitting the digitalized information/instructions to a projection means (5); (e) using the projection means (5) of step (d) for presenting the cutting lines and the location of foreign objects, if any, calculated in step (c) to the meat processing operator (6) in the form of an over-layer of the information on the workpiece/starting material (7) to be processed.
7. The method according to claim 6, which method further comprises a step (c1) wherein the digitalised data analysed in step (c) by the processing means (4) is transmitted to a means for storing information (server/database) (9).
8. The method according to either one of claims 6-7, wherein the system further comprises one or more buffer-conveyors (2A) and/or an outlet-conveyor (2B).
9. The method according to any one of claims 6-8, wherein the projection means (5) for use according to the invention is a projector capable of over-laying an image onto the workpiece/target product (7).
10. The method according to any one of claims 6-8, wherein the projection means (5) for use according to the invention is a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene.
DKPA201901374A 2019-11-25 2019-11-25 System for cutting and trimming meat cuts DK180419B1 (en)
Publications (2)

DK201901374A1 (en), published 2021-04-22
DK180419B1 (en), published 2021-04-22