EP3887105A1 - System for cutting and trimming meat cuts - Google Patents
System for cutting and trimming meat cuts
- Publication number
- EP3887105A1 (application EP19812942.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- workpiece
- starting material
- operator
- processing means
- cutting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B26—HAND CUTTING TOOLS; CUTTING; SEVERING
- B26D—CUTTING; DETAILS COMMON TO MACHINES FOR PERFORATING, PUNCHING, CUTTING-OUT, STAMPING-OUT OR SEVERING
- B26D5/00—Arrangements for operating and controlling machines or devices for cutting, cutting-out, stamping-out, punching, perforating, or severing by means other than cutting
- B26D5/007—Control means comprising cameras, vision or image processing systems
-
- A—HUMAN NECESSITIES
- A22—BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
- A22C—PROCESSING MEAT, POULTRY, OR FISH
- A22C17/00—Other devices for processing meat or bones
- A22C17/0073—Other devices for processing meat or bones using visual recognition, X-rays, ultrasounds, or other contactless means to determine quality or size of portioned meat
- A22C17/0086—Calculating cutting patterns based on visual recognition
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B26—HAND CUTTING TOOLS; CUTTING; SEVERING
- B26D—CUTTING; DETAILS COMMON TO MACHINES FOR PERFORATING, PUNCHING, CUTTING-OUT, STAMPING-OUT OR SEVERING
- B26D2210/00—Machines or methods used for cutting special materials
- B26D2210/02—Machines or methods used for cutting special materials for cutting food products, e.g. food slicers
Definitions
- the present invention provides a system and a related method for facilitating instructions to the executing operator about cutting and trimming meat cuts in slaughterhouses.
- pork bellies are often cut and trimmed to have length and width dimensions specified by the customer. Often the size is important for the customer due to certain size requirements of further processing equipment.
- the trimming is often done manually.
- a frame or a template with the right dimensions, is placed upon the belly to guide the operator to trim away the parts outside the frame.
- this frame has several disadvantages.
- the operator must carry the frame and use time to place the frame onto the belly. If the belly dimensions are not suited for the intended product, i.e. do not comply with the frame dimensions, the operator cannot utilize the frame correctly, and will typically not process the belly.
- the frame is usually drawn up to one side of the belly and frequently squeezes the belly together, which results in the belly being insufficiently trimmed, i.e. it will be larger than specified.
- the operator must himself decide how to place the frame on the belly and decide in what order to trim off excess product from the different sides of the belly, which can result in a suboptimal value of the trimmings.
- Augmented Reality is an emerging technology for supporting industrial service and maintenance operations.
- the potential includes hands-free support for complicated procedures, and updated assistance for performing a specific maintenance task, e.g. replacement of a defective component.
- W02008102148 describes a method for handling food pieces by subjecting a food piece to an X-ray analysis for detection of foreign objects in the food piece, followed by transportation of the food piece from the analysis position to an operator, and providing the operator with an indication of the location of the foreign object on the food piece by a "static" marking on the food piece on the fly.
- WO2014079448 describes a method for removing undesired parts or objects in pieces of meat, in which method the operator is assisted by an image on a screen that the operator carries in front of his eyes, and on which screen a (static) representation of the objects in question are projected .
- the present invention provides a system for assisting an operator in the cutting and trimming of meat cuts, including removal of foreign objects and/or trimming/cutting according to dimension requirements.
- the system of the present invention allows for a dynamic presentation of proposed cutting lines, i.e. the projected image aligns to, and fits with, the actual position of the workpiece in question, even if the workpiece has been moved and changed position on the conveyor, e.g. as a result of the operator's work and impact on the workpiece.
- the system of the present invention may be applied at abattoirs, for processing incoming carcasses or parts of carcasses, intended for being cut into meat pieces of a certain size and quality.
- the workpiece may in particular be a carcass, or a part hereof, e.g. a pork belly.
- the system of the invention may also be used for detecting quality defects, such as the occurrence of foreign objects on the surface of the incoming meat item, and advise the operator of the occurrence and location of such quality defects, and may also propose preferred cutting lines, and even a preferred cutting sequence.
- system of the invention may include the use of AR-technology, e.g. by projecting the proposed cutting lines onto a screen in front of the operator.
- the invention provides a system for displaying instructions to a meat processing operator, which system is described in more detail below.
- the invention provides a method for presenting information to a meat processing operator, which method is described in more detail below.
- the invention provides a system for assisting an operator (6) in the cutting and trimming of meat cuts, e.g. according to dimension requirements, or for detecting and correcting quality defects, such as the occurrence of foreign objects.
- the system may be applied at abattoirs, for processing incoming carcasses or parts of carcasses, intended for being cut into meat pieces of a certain size.
- the workpiece/starting material/meat item/target product (7) may in particular be a carcass, or a part hereof, e.g. a pork belly.
- an in-let conveyor belt (2) comprising a position sensor, in communication with the processing means (4), and capable of conveying the workpiece/starting material (7);
- a machine vision device (3) comprising at least two camera stations (3A, 3B), each station, in communication with the processing means (4), configured for identification of the incoming workpiece/starting material (7), for making size measurements, and capable of accessing the dimensions of the workpiece/starting material (7), and transmitting the digitalised data to the processing means (4);
- a processing means (4) in communication with the in-let conveyor (2), the machine vision device (3), and the projecting means (5), and capable of:
- a projection means (5) in communication with the in-let conveyor (2) and the processing means (4), and capable of displaying an over-layer of information, on the workpiece/target product (7) to be processed, to the executing cutting operator (6), while the workpiece/target product (7) is being conveyed;
- the first camera station (3A) feeds data to the processing means (4) for identifying, and for locating, the incoming workpiece/starting material (7), and for calculating cutting lines, and, if present, optionally also identifies foreign objects and notes their position on the surface of the incoming workpiece/starting material (7);
- the second camera station (3B), in a dynamic process, feeds data to the processing means (4) for identifying and locating the incoming workpiece/starting material (7), and communicates the current position of the workpiece/starting material (7) and the proposed cutting lines and location of foreign objects, if any, to the projecting means (5).
- the system of the invention is capable of generating a vision-based loop, wherein the second camera station (3B) operates in a dynamic process involving data from the first camera station (3A) that are analysed and computed by the processing means (4).
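The vision-based loop described above amounts to a pose re-alignment step: camera station 3B reports the current position and orientation of the workpiece, and the processing means transforms the cutting lines computed from camera station 3A's data before they are projected. The following Python sketch illustrates only the geometry; all function and parameter names are illustrative and not taken from the patent.

```python
import numpy as np

def realign_overlay(cutting_lines_ref, pose_ref, pose_current):
    """Re-align precomputed cutting-line points to the workpiece's current pose.

    cutting_lines_ref : (N, 2) array of points in the frame of camera 3A
    pose_ref, pose_current : (x, y, angle_rad) of the workpiece as seen by
                             camera 3A and camera 3B, respectively
    """
    x0, y0, a0 = pose_ref
    x1, y1, a1 = pose_current
    da = a1 - a0
    rot = np.array([[np.cos(da), -np.sin(da)],
                    [np.sin(da),  np.cos(da)]])
    # translate to the reference origin, rotate by the pose change,
    # then translate to the current position of the workpiece
    return (np.asarray(cutting_lines_ref, float) - [x0, y0]) @ rot.T + [x1, y1]
```

Each time camera 3B reports a new pose, the overlay is recomputed and re-projected, so the lines stay on the workpiece even after the operator moves it.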
- the system further comprises one or more buffer-conveyors (2A) and/or an outlet-conveyor (2B).
- system of the invention further comprises a means for storing information (a server/database) (9).
- the projection means (5) for use according to the invention is a projector capable of over-laying an image onto the workpiece/target product (7).
- the projection means (5) for use according to the invention is a printer or a laser marker.
- the projection means (5) for use according to the invention is a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene.
- an in-let conveyor belt (2) in communication with the processing means (4), and capable of conveying the starting material (7);
- a machine vision device (3) in communication with the processing means (4), configured for making size measurements, and capable of accessing the dimensions of the starting material (7), and transmitting the digitalised data to the processing means (4);
- a processing means (4) in communication with the conveyor (2), the machine vision device (3), and the projecting means (5), and capable of:
- a projection means (5) in communication with the in-let conveyor (2), and the processing means (4), and capable of displaying an over-layer of information, on the meat item (7) to be processed, to the executing cutting operator (6), while the meat item (7) is being conveyed.
- system of the present invention may be characterised as described above, wherein:
- the machine vision device (3) comprises two camera stations (3A, 3B), each station, in communication with the processing means (4), configured for identification of the incoming starting material (7); and wherein
- the first camera station (3A) provides data to the processing means (4) for identification of the incoming starting material (7), and for calculating cutting lines, and, if present, optionally also identifies foreign objects and notes their position on the surface of the incoming starting material (7);
- the second camera station (3B) provides data to the processing means (4) for identification and for location of the incoming starting material (7) and communicates the position of cutting lines and foreign objects, if any, to the projecting means (5).
- The in-let conveyor
- For the processing means (4) to be able to calculate and track the position of the incoming workpiece/starting material (7) while in motion, and to keep the projection means (5) synchronized with the conveyor belt (2), the system must know the speed of the conveyor belt (2). This may be accomplished in various ways.
- the in-let conveyor belt (2) for use according to the invention may be equipped with a position sensor, in communication with, and receiving operational guidance from, the processing means (4). This ensures that the system can perform certain functions, like controlling (determining and adjusting) the speed, and synchronizing with the projection means (5).
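One common realisation of such a position sensor is a pulse encoder on the conveyor: since the workpiece moves with the belt, its current position can be dead-reckoned from the belt travel accumulated since the camera first detected it. A minimal illustrative sketch, with hypothetical names (the patent does not prescribe an encoder-based design):

```python
class BeltTracker:
    """Track a conveyed workpiece via a belt pulse encoder (position sensor)."""

    def __init__(self, mm_per_pulse):
        self.mm_per_pulse = mm_per_pulse  # calibration of the encoder
        self.pulses = 0                   # pulses seen so far

    def on_pulse(self, n=1):
        """Called by the encoder interrupt for each belt pulse."""
        self.pulses += n

    def belt_travel_mm(self):
        """Total belt travel since start-up."""
        return self.pulses * self.mm_per_pulse

    def workpiece_position(self, x_at_detection_mm, pulses_at_detection):
        """Dead-reckon the workpiece position: it moves with the belt,
        so its displacement equals the belt travel since detection."""
        return x_at_detection_mm + (self.pulses - pulses_at_detection) * self.mm_per_pulse
```

The projection means can then query `workpiece_position` at its frame rate to keep the overlay synchronized with the moving workpiece.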
- the system of the invention comprises further conveyors, e.g. one or more buffer-conveyors (2A), and/or an outlet-conveyor (2B).
- the workpiece may be removed from the system and stored on a (buffer) conveyor belt (2A) before it is re-introduced to the system for continued processing, and identification/monitoring by a second (or further) camera station (3B).
- This may e.g. be accomplished by use of an outlet conveyor (2B).
- the machine vision device is configured to:
- the machine vision device (3) for use according to the invention may be any machine vision device, configured for making size measurements, and capable of accessing the dimensions of the workpiece/starting material (7) while in motion.
- the vision device shall be in communication with, and be able to transmit data to, the processing means (4).
- the vision device (3) comprises the use of two cameras (3), one camera station (3A) for identifying the arriving workpiece/starting material (7) and for communicating image data to the processing means (4), and another camera station (3B) for tracing the conveyed workpiece/starting material (7) and communicating image data to the processing means (4).
- Based on the image data established by the first camera station (3A), the processor (4) extracts characteristics associated with the meat product in question (7), and calculates optimal cutting lines, and, optionally, also identifies foreign objects, if present, and notes their position on the meat surface. Computed cutting lines, along with the specific location of foreign objects, if any, are reported to the projection means (5), which projects the findings onto the workpiece/target product (7), on the fly, in the form of over-lay lines.
- Over-lays may e.g. be projected in different colours, for assisting the operator (6) in producing an optimized yield of the raw material by cutting along the projected lines in a specific order (e.g. green before red), and for performing a corrective action by removing any foreign objects detected as surface contamination.
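For the dimension-based case, the cutting lines can be derived by comparing the measured length and width against the customer-specified target dimensions, with one colour per line encoding the proposed cutting order. The sketch below is a deliberately simplified straight-line model; the names and the green-before-red convention are illustrative, not the patent's actual algorithm.

```python
def trim_lines(measured_len, measured_wid, target_len, target_wid):
    """Propose straight trim lines (offsets from the reference corner, in mm)
    that reduce an oversized belly to the target dimensions.

    Returns a list of (axis, offset_mm, colour) tuples; the colour encodes
    the proposed cutting order (green first, then red).
    """
    lines = []
    if measured_len > target_len:
        # trim excess length: cut across the length axis at the target offset
        lines.append(("length", target_len, "green"))
    if measured_wid > target_wid:
        # trim excess width: cut along the width axis at the target offset
        lines.append(("width", target_wid, "red"))
    return lines
```

An undersized piece yields no lines, mirroring the situation where the incoming belly does not comply with the product specification.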
- As the workpiece/target product (7) is being moved towards the processing area (8), the second camera station (3B), which is positioned above the processing area (8), identifies the incoming workpiece/target product (7), based on the information obtained by the first camera station and recorded by the processing means (4), and communicates the location and position of the moving workpiece/target product (7) to the projection means (5), in the form of cutting lines and the position of any foreign object to be removed.
- the second camera station (3B), by employing feed-back/loop processes in cooperation with the processing means (4) and the projecting means (5), is able to provide dynamic cutting lines to the operator (6).
- the machine vision device (3) for use according to the invention is a multispectral vision device.
- a multispectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, i.e. infrared and ultra-violet. Spectral imaging can allow extraction of additional information the human eye fails to capture with its receptors for red, green and blue. Multispectral imaging measures light in a small number (typically 3 to 15) of spectral bands.
- Multispectral imaging can also be accomplished by using camera(s) sensitive to all the relevant spectral bands and sequentially illuminating the object with each spectral band, while capturing a frame for each band.
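The sequential-illumination approach just described amounts to grabbing one frame per spectral band and stacking the frames into an image cube. A minimal sketch, assuming hypothetical `camera` and `light` interfaces (these are not interfaces defined by the patent):

```python
import numpy as np

def capture_multispectral(camera, light, bands_nm):
    """Build a multispectral cube by sequentially illuminating the object
    with one spectral band at a time and grabbing a frame per band.

    camera   : object with a grab_frame() -> 2D array method (assumed)
    light    : object with a set_wavelength(nm) method (assumed)
    bands_nm : iterable of illumination wavelengths in nanometres
    """
    frames = []
    for wavelength in bands_nm:
        light.set_wavelength(wavelength)    # switch the illumination band
        frames.append(camera.grab_frame())  # one 2D frame per band
    return np.stack(frames, axis=-1)        # cube of shape (H, W, n_bands)
```

With e.g. bands at 450, 650 and 850 nm, the resulting cube spans the visible-to-NIR range described for the light source above.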
- the machine vision device (3) for use according to the invention may comprise a light source capable of emitting electromagnetic waves in the ranges 350 nm to 700 nm (i.e. from UV to visible regions) and 700 nm to 950 nm (i.e. NIR), and a sensor capable of receiving electromagnetic waves within the same ranges (i.e. 350 nm to 950 nm), and may be a device as described in e.g. WO 2017/121713.
- Machine vision systems essentially come in three main categories:
- 1D vision analyses a digital signal one line at a time instead of looking at a whole picture at once, and is accomplished by use of a line-scan camera;
- 2D vision is looking at the whole picture, and may be accomplished by use of an industrial camera/area-scan camera, e.g. a multispectral RGB (visible/colour images) camera; and
- 3D vision systems typically comprise multiple cameras or one or more laser displacement sensors.
- Commercially available 3D scanners or range cameras include the time-of-flight camera, the structured-light camera, the stereo camera or 3D camera, e.g. the SICK 3D Ruler.
- the vision device of the first camera station (3A) is a line-scan camera.
- the vision device of the second camera station (3B) is an industrial 2D multispectral RGB camera, or area scan camera.
- the machine vision device (3) for use according to the invention is calibrated for correct size measurement.
- the processing means (4) for use according to the invention may be any commercially available processor/PC, in communication with the conveyor (2), the machine vision device (3), and the projecting means (5), and shall be capable of:
- the processing means (4) used according to the invention may also be used for calculating the optimal cutting lines, and for locating existing foreign objects, if any. Identification and feature extraction may be accomplished using pattern recognition and digital image processing, in particular feature extraction techniques.
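As an illustration only, a very simple feature-extraction step could threshold the image to separate the (brighter) meat item from the darker belt and measure its bounding box; a production system would use far more robust segmentation. The names and the brightness-threshold model below are assumptions for the sketch:

```python
import numpy as np

def extract_features(image, threshold):
    """Segment the meat item from a darker background by thresholding and
    measure its bounding-box dimensions and centroid, all in pixels.

    image     : 2D greyscale array
    threshold : intensity above which a pixel is taken as meat (assumed model)
    """
    mask = image > threshold
    ys, xs = np.nonzero(mask)           # pixel coordinates of the item
    return {
        "length_px": int(xs.max() - xs.min() + 1),  # extent along the belt
        "width_px":  int(ys.max() - ys.min() + 1),  # extent across the belt
        "centroid":  (float(xs.mean()), float(ys.mean())),
    }
```

With a calibrated camera, the pixel extents convert to the absolute length and width needed for the cutting-line calculation.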
- the method of the invention further comprises a step (c1) wherein the digitalised data analysed in step (c) by the processing means (4) is transmitted to a means for storing information (server/database) (9).
- the processing means (4) used according to the invention may be in communication with a means for storing information, i.e. a central server or database (9).
- This central database (9) may contain pre-loaded information related to the products in question, e.g. origin of the product, product ID, product specifications, etc.
- the processing means (4) can establish a product ID, e.g. by reference to a product catalogue or product specifications stored on the server (9), and the product ID may be allocated to each workpiece/target product (7), and transmitted to the server/database (9) for further action/use.
- The projection means
- the projection means (5) for use according to the invention shall assist the executing operator (6) in cutting and/or trimming the meat item (7) in question.
- the projection means (5) shall be in communication with the in-let conveyor (2), and the processing means (4), and shall receive guidance from the processing means (4).
- a projection means (5) for use according to the invention may be a monitor, a projector, a headlight, a laser, or a printer, capable of creating and/or marking a guiding cutting/trimming pattern on the workpiece/target product (7) to be processed, but may also include augmented reality (AR) equipment (5A), e.g. smart-glasses or the like, which over-lay an image of the cutting curves onto the meat item (7).
- the system may provide the executing cutting operator (6) information about the cutting and trimming of meat cuts according to dimension requirements, or about quality defects such as the occurrence of foreign objects that shall be removed.
- the projector may also display information about the order in which the operator (6) should carry out the suggested cuts and trims.
- the projection means (5) for use according to the invention is a projector capable of over-laying an image onto the workpiece/target product (7).
- the projection means (5) for use according to the invention is a printer or a laser marker.
- the projection means (5) for use according to the invention is a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene.
- instructions about optimal cutting curves are displayed as an over-layer onto the workpiece/target product (7).
- instructions about the optimal sequence for cutting the curves are displayed as an over-layer onto the workpiece/target product (7).
- instructions about the optimal sequence by which cuts are performed are displayed by means of different colours or patterns for different curves.
- instructions about the optimal cutting sequence are presented by displaying the cutting curves in a sequence.
- instructions about the locations of foreign objects are displayed as an over-layer onto the workpiece/target product (7).
- the instructions displayed may include the location of excess fat, cartilage and/or bone to be removed from the piece.
- the invention provides a method for presenting information to a meat processing operator (6), by use of the system of the invention.
- the method of the invention may be characterised by comprising the subsequent steps of:
- step (c) analysing the digitalized data obtained by the first camera station (3A) in step (b) for determining identity (ID) and characteristics of the incoming workpiece/starting material (7), and for calculating cutting lines and, optionally, the location of existing foreign objects, if any;
- step (d) analysing the digitalized data obtained by the second camera station (3B) in step (b) for determining identity (ID) and characteristics of the incoming workpiece/starting material (7), and transmitting the digitalized information/instructions to a projection means (5);
- step (e) using the projection means (5) of step (d) for presenting the cutting lines and the location of foreign objects, if any, calculated in step (c) to the meat processing operator (6) in the form of an over-layer of the information on the workpiece/starting material (7) to be processed.
- the method of the invention further comprises a step (c1) wherein the digitalised data analysed in step (c) by the processing means (4) is transmitted to a means for storing information (server/database) (9).
- system further comprises one or more buffer-conveyors (2A) and/or an outlet-conveyor (2B).
- the digitalized data obtained in step (c) by the first camera station (3A) is analysed by feature extraction techniques.
- the projection means (5) for use according to the invention is a projector capable of over-laying an image onto the workpiece/target product (7) .
- the projection means (5) for use according to the invention is a printer or a laser marker.
- the projection means (5) for use according to the invention is a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene.
- the invention provides a method for presenting information to a meat processing operator (6), which method comprises the subsequent steps of:
- step (b) obtaining one or more images of the incoming starting material (7) using a machine vision device (3), and transmitting the digitalized data to a processing means (4);
- step (c) analysing the digitalized data obtained in step (b) using a processing means (4), and transmitting the digitalized information/instructions to a projection means (5);
- step (d) using a projection means (5), presenting the information/instructions obtained in step (c) to the meat processing operator (6) in the form of an over-layer of the information on the meat item (7) to be processed.
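Steps (b) to (d) form a simple pipeline — image acquisition, analysis, presentation — which can be expressed with the three subsystems as plain callables. This is only a structural sketch; the interfaces and return values are invented for illustration and are not defined by the patent.

```python
def present_instructions(machine_vision, processing, projection, workpiece):
    """Run steps (b)-(d) for one workpiece.

    machine_vision : callable(workpiece) -> digitalised image data   (step b)
    processing     : callable(data) -> information/instructions      (step c)
    projection     : callable(workpiece, instructions) -> None       (step d)
    """
    data = machine_vision(workpiece)       # step (b): acquire and digitalise
    instructions = processing(data)        # step (c): analyse, compute lines
    projection(workpiece, instructions)    # step (d): over-lay on the item
    return instructions
```

A usage example with stub subsystems:

```python
log = []
present_instructions(
    lambda w: {"length_mm": 600, "width_mm": 300},   # stub vision device
    lambda d: ["trim length to 550 mm"],             # stub processing means
    lambda w, i: log.append((w, i)),                 # stub projection means
    "belly-001",
)
```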
- the method of the invention for presenting information to a meat processing operator (6) comprises the subsequent steps of:
- step (c) analysing the digitalized data obtained by the first camera station (3A) in step (b) for defining identity and for extracting feature characteristics, and for calculating cutting lines and, optionally, the location of existing foreign objects, if any, in communication with the processing means (4);
- step (d) analysing the digitalized data obtained by the second camera station (3B) in step (b) for identification of the incoming meat item, in communication with the processing means (4), and transmitting the digitalized information/instructions to a projection means (5);
- step (e) using the projection means (5) of step (d) for presenting the information/instructions obtained in step (c) to the meat processing operator (6) in the form of an over-layer of the information on the meat item (7) to be processed.
- step (c) may be performed by the processing means (4) using pattern recognition, which represents a branch of machine learning that focuses on the recognition of patterns and regularities in data, in this case data obtained from the vision device (3).
- the projection means (5) used according to the method may e.g. be a projector capable of presenting (over-lay) an image onto the workpiece/target product (7), or a pair of smart-glasses (5A), presenting information to the executing cutting operator (6) in the form of an augmented reality (AR) vision of the scene, or it may be accomplished by marking the cutting curves onto the meat item (7) by means of a printer or a laser marking or similar.
- the method of the invention may provide the executing cutting operator (6) information about the cutting and trimming of workpiece/starting material/target product (7) according to dimension requirements, or about quality defects such as the occurrence of foreign objects that shall be removed.
- the projector may also display information about the order in which the operator (6) should carry out the suggested cuts and trims.
- a vision camera (3) calibrated for size measurements is used to assess the dimensions of the workpiece/starting material/target product (7).
- the processing means (4) used according to the method makes use of algorithms to analyse data from the vision camera (3) in order to:
- the method of the invention is used for providing the executing cutting operator (6) information about possible quality defects, e.g. the occurrence of foreign objects on the meat item (7).
- FIG. 1 shows an embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A); A second camera station (3B); A processing means (4); A projection means (5); The executing operator (6); The workpiece/starting material (7);
- FIG. 2 shows another embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A); A second camera station (3B); A processing means (4); A means for obtaining AR (5A); The executing operator (6); The workpiece/starting material (7);
- FIG. 3A shows a front/side view of an embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A);
- FIG. 3B shows a front view of an embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A);
- FIG. 3C shows a side view of an embodiment of the system according to the invention: An in-let conveyor (2); A first camera station (3A); A second camera station (3B); A projection means (5);
- Fig. 4 shows another embodiment of the system according to the invention: An in-let conveyor (2); A (buffer) conveyor (2A); An outlet conveyor (2B); A first camera station (3A); A second camera station (3B); A projection means (5);
- FIG. 5 shows another embodiment of the system according to the invention: An in-let conveyor (2); A machine vision device (3); A processing means (4); A projection means (5)/A means for obtaining AR (5A); The executing operator (6); The workpiece/starting material (7); A means for storing information (server/database) (9).
- contaminant detection and cutting recipe support have been benchmarked in a simulated trial environment, and the most promising channel has been demonstrated in a meat production operation, with a vision sensor setup as the supporting information provider.
- Four different AR platform candidates were benchmarked. Benchmarking was done with a simulated piece of meat into which submerged contaminants were inserted to mimic bone fragments in meat.
- the tested platforms included a tablet, a set of smart glasses, a video projector and a video monitor.
- the operator performance was measured using an HTC VIVE-based tracking system with a pointer attached to the controller. From the controller, the 3D position of the pointer is recorded together with the time spent on the task, leading to a capacity and quality measure for the 3D operation of pointing out the submerged contamination.
- the pointer was tracked to give the position accuracy for pointing out the submerged target contamination.
- the time spent for the entire pointing procedure was recorded to give an estimated efficiency for each operator performing the task.
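The two measures just described — pointing accuracy against the known position of the submerged target, and the time spent on the task — can be computed directly from the recorded pointer samples. An illustrative sketch (field names and millimetre units are assumptions, not taken from the trial description):

```python
import math

def pointing_metrics(recorded_points, target, t_start, t_end):
    """Summarise one pointing trial.

    recorded_points : list of (x, y, z) pointer positions in mm
    target          : (x, y, z) position of the submerged contaminant in mm
    t_start, t_end  : timestamps (seconds) bracketing the pointing procedure
    """
    dists = [math.dist(p, target) for p in recorded_points]
    return {
        "mean_error_mm": sum(dists) / len(dists),  # accuracy measure
        "task_time_s": t_end - t_start,            # efficiency measure
    }
```

Comparing these metrics per operator and per AR platform gives the capacity/quality benchmark described above.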
- a setup was built with a projector as the preferred platform to illustrate the potential in a real meat production operation.
- the setup included a vision sensor, to determine the size of product (length and width) and the position of eventually surface contamination (Dyna-CQTM, available from DMRI, Denmark).
- The results from the vision sensor are communicated to the operator using tracking software for recognition and tracking of each product, together with the relevant cutting instruction and the calculated position of any detected surface contamination, e.g. polymer fragments from wrapping films or carrier trays.
- Input to the tracking/identification software comes from a standard RGB camera placed in a fixed position relative to the projecting device.
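Because the camera is fixed relative to the projector and the conveyor surface is approximately planar, a detection in camera pixels can be mapped to projector coordinates with a pre-calibrated transform. A minimal sketch assuming a simple similarity transform; the scale and offset values are hypothetical calibration constants, not taken from the source:

```python
def camera_to_projector(u, v, scale=1.25, offset_x=40.0, offset_y=-15.0):
    """Map a position detected in the RGB camera image (pixels) to
    projector coordinates.

    scale, offset_x, offset_y are hypothetical calibration constants that
    would be determined once, e.g. by projecting known markers and
    observing them with the camera.
    """
    return u * scale + offset_x, v * scale + offset_y

x, y = camera_to_projector(100.0, 200.0)
```

A full setup would typically use a planar homography rather than a similarity transform, but the calibration idea (fixed camera-projector geometry, one-time fit against known points) is the same.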
- The potential of using AR technology was demonstrated using a pilot-scale production setup.
- The Dyna-CQ™ vision sensor gives the absolute size (length and width) of the incoming meat products and detects any surface contamination.
- The products in the pilot experiment had to meet specific size requirements. By comparing these requirements to the actual measured size, a cutting recipe in the form of a set of cutting lines can be calculated for each product.
- The calculated cutting lines are projected onto the meat surface in order to guide the operator in performing the optimized cutting operation.
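The comparison between the required size and the measured size can be sketched as follows. The two-edge trimming scheme and all names are illustrative assumptions; the source states only that a set of cutting lines is calculated per product:

```python
def cutting_recipe(measured_length, measured_width, target_length, target_width):
    """Return straight cutting lines, expressed as trim offsets (mm) from
    two edges, that bring a measured product down to the target size.

    A hypothetical representation of the 'cutting recipe' -- the actual
    system projects the resulting lines onto the meat surface.
    """
    recipe = {}
    if measured_length > target_length:
        recipe["length_trim_mm"] = measured_length - target_length
    if measured_width > target_width:
        recipe["width_trim_mm"] = measured_width - target_width
    return recipe  # empty dict: product is already within spec

print(cutting_recipe(620, 310, 580, 290))
# {'length_trim_mm': 40, 'width_trim_mm': 20}
```

Each offset would then be converted to projector coordinates and drawn as a guide line for the operator.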
- The sequential order of the two cutting procedures leaves a significant volume of meat on one of the two trimmings: the caudal (hip) or the ventral (belly) part. Actual price differences between the two parts may influence the yield of a specific cutting sequence.
- The sequential order of cutting is communicated to the operator using colour codes.
- The sequential order of cutting leaves the lower left-hand cut-off either on the ventral or the caudal part.
- An optimization potential is thus available to the operator by selecting a specific sequential cutting order.
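The sequence choice above amounts to valuing the cut-off on each candidate trimming and picking the higher-value option. A sketch with hypothetical weights and prices; the source states only that price differences between the caudal and ventral parts influence the optimal sequence:

```python
def best_cutting_sequence(cutoff_weight_kg, price_caudal_per_kg, price_ventral_per_kg):
    """Choose the cutting order that places the lower left-hand cut-off
    on the higher-priced trimming (caudal/hip vs ventral/belly).

    Returns (sequence, value), where sequence names the part receiving
    the cut-off. Labels and prices are illustrative assumptions.
    """
    if price_caudal_per_kg >= price_ventral_per_kg:
        return "cut-off to caudal", cutoff_weight_kg * price_caudal_per_kg
    return "cut-off to ventral", cutoff_weight_kg * price_ventral_per_kg

# 0.4 kg cut-off; caudal trimmings priced higher than ventral
seq, value = best_cutting_sequence(0.4, 30.0, 25.0)
```

In the pilot setup this decision is conveyed to the operator as a colour code rather than as a number.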
- Benchmarking the four AR platforms is also relevant for the meat industry, as all components of the best-performing system are commercially available off the shelf. Different tracking software platforms are also available from different vendors, and the potential indicated in this experiment aligns with previously demonstrated yield improvements in fresh meat production.
Landscapes
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Forests & Forestry (AREA)
- Mechanical Engineering (AREA)
- Wood Science & Technology (AREA)
- Zoology (AREA)
- Food Science & Technology (AREA)
- Length Measuring Devices By Optical Means (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DKPA201800912 | 2018-11-26 | ||
PCT/EP2019/082374 WO2020109210A1 (en) | 2018-11-26 | 2019-11-25 | System for cutting and trimming meat cuts |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3887105A1 (en) | 2021-10-06 |
Family
ID=68733029
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19812942.1A Pending EP3887105A1 (en) | 2018-11-26 | 2019-11-25 | System for cutting and trimming meat cuts |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3887105A1 (en) |
WO (1) | WO2020109210A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102330181B1 (en) * | 2021-06-03 | 2021-11-24 | 주식회사 세종푸드 | Customized meat processing method |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9128810B1 (en) * | 2004-01-09 | 2015-09-08 | John Bean Technologies Corporation | Method and system for portioning workpieces to directly-controlled and/or indirectly-controlled characteristics |
GB0703496D0 (en) | 2007-02-22 | 2007-04-04 | Ishida Europ Ltd | Method and apparatus for handling food pieces |
DK177704B1 (en) | 2012-11-22 | 2014-03-24 | Attec Danmark As | Method and means for controlling and removing foreign matter in food |
EP3403075A1 (en) | 2016-01-11 | 2018-11-21 | Teknologisk Institut | A method and device for scanning of objects using a combination of spectral ranges within vision, nir and x-rays |
- 2019
- 2019-11-25 EP EP19812942.1A patent/EP3887105A1/en active Pending
- 2019-11-25 WO PCT/EP2019/082374 patent/WO2020109210A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
WO2020109210A1 (en) | 2020-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
RU2710146C1 (en) | Device for obtaining and analyzing characteristic data for food industry products, an apparatus comprising such a device and a method for processing food industry products | |
US6031567A (en) | Method and apparatus for video lumber grading | |
US20070193425A1 (en) | Slicing of food products | |
US20100267320A1 (en) | Fat Cover Measurement Device | |
US9675091B1 (en) | Automated monitoring in cutting up slaughtered animals | |
EP2503331A2 (en) | Method and system for the real-time automatic analysis of the quality of samples of processed fish meat based on the external appearance thereof, said samples travelling on a conveyor belt such that surface defects can be detected and the fish meat can be sorted according to quality standards | |
WO2001022081A1 (en) | Animal carcase analysis | |
KR20210122254A (en) | Food processing apparatus and method | |
JP2018146251A (en) | Foreign matter detection system, foreign matter detection method and program thereof | |
US20190116816A1 (en) | System for registration and presentation of performance data to an operator | |
WO2017118757A1 (en) | A system and method for determining the presence and/or position of at least one bone in a meat piece | |
WO2020109210A1 (en) | System for cutting and trimming meat cuts | |
AU2019208245B2 (en) | Image acquisition for meat grading | |
CN216931665U (en) | Device for automatically cutting pig carcass | |
DK180419B1 (en) | System for cutting and trimming meat cuts | |
EP4158590A1 (en) | Method and apparatus for identifying possible surface defects of a leather hide | |
JP2020024124A (en) | Food foreign matter inspection device and foreign matter inspection method | |
DK180440B1 (en) | On-line determination of quality characteristics of meat products | |
CN109187593A (en) | A kind of control method and system of detection device | |
EP4025058B1 (en) | Automatic removal of contamination on carcasses | |
WO2020109208A1 (en) | System and method for automatic removal of foreign objects from a food surface | |
WO2024182367A1 (en) | A vision-based quality control and audit system and method of auditing, for carcass processing facility | |
AU767212B2 (en) | Animal carcase analysis | |
NZ749419A (en) | Measuring device for multispectral measuring of quality features or defects of products and method therefor |
Legal Events
Code | Title | Description |
---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed | Effective date: 20210628 |
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
DAV | Request for validation of the european patent (deleted) | |
DAX | Request for extension of the european patent (deleted) | |
GRAP | Despatch of communication of intention to grant a patent | Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: GRANT OF PATENT IS INTENDED |
INTG | Intention to grant announced | Effective date: 20240312 |
GRAL | Information related to payment of fee for publishing/printing deleted | Free format text: ORIGINAL CODE: EPIDOSDIGR3 |
GRAS | Grant fee paid | Free format text: ORIGINAL CODE: EPIDOSNIGR3 |