WO2020104636A1 - A method at a slaughterhouse - Google Patents

A method at a slaughterhouse

Info

Publication number
WO2020104636A1
Authority
WO
WIPO (PCT)
Prior art keywords
tray
meat product
computer
classification
image
Prior art date
Application number
PCT/EP2019/082187
Other languages
French (fr)
Inventor
Thomas LAURIDSEN
Flemming Bonde NIELSEN
Original Assignee
Frontmatec Smørum A/S
Priority date
Filing date
Publication date
Application filed by Frontmatec Smørum A/S filed Critical Frontmatec Smørum A/S
Publication of WO2020104636A1 publication Critical patent/WO2020104636A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A22: BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B: SLAUGHTERING
    • A22B5/00: Accessories for use during or after slaughtering
    • A22B5/0064: Accessories for use during or after slaughtering for classifying or grading carcasses; for measuring back fat
    • A22B5/007: Non-invasive scanning of carcasses, e.g. using image recognition, tomography, X-rays, ultrasound
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00: Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02: Food
    • G01N33/12: Meat; Fish

Definitions

  • the slaughterhouse production line (101) may comprise a load cell (118).
  • the load cell (118) may be configured to provide a mass value representing the mass of the meat product (106) and the tray (104).
  • the mass value may be included in the feature descriptors input to the computer-implemented classifier (116), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106).
  • the image recording station (114) may comprise multiple cameras arranged above the trays passing on the conveyor (102) and inclined along respective multiple axes. The image recording station (114) may record respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor. Information about each of the respective multiple images, acquired from the respective cameras, may be included in the feature descriptors input to the computer-implemented classifier (116).
  • the slaughterhouse production line (101) may comprise an electronic device (120) with a user interface for interaction with a human operator, configured to be in communication with the manufacturing execution system (108).
  • the computer-implemented classifier (116) may perform classification in accordance with a mapping function.
  • a human operator at a visual inspection position (122) may perform visual inspection of the meat product (106) in the tray (104).
  • the human operator, at the visual inspection position (122), may provide input to the electronic device (120) via the user interface. The input may be selecting a product type in response to the visual inspection of the meat product (106) in the tray (104).
  • the computer-implemented classifier (116) may be part of a classification module (130).
  • the classification module (130) may further comprise a selector (126) and an image processor (128).
  • the manufacturing execution system (108) may comprise a database (124). Alternatively, the manufacturing execution system (108) may be connected with or may be in communication with a database (124).
  • Fig. 2 shows a flowchart of a method (201) of classification and assisting a computer-implemented classifier at a slaughterhouse.
  • the method is related to the features of fig. 1.
  • the method may comprise acquiring (203) an image recorded by an image recording station.
  • the method may comprise querying (204) a database of a manufacturing execution system for the current production recipes, each listing multiple meat product types.
  • the method may comprise reading (205) a mass value from a load cell.
  • the method may comprise inputting (206) the image, current production recipe and mass value to a computer-implemented classifier.
  • the method may comprise performing (207) classification.
  • the computer-implemented classifier may perform classification in accordance with a mapping function.
  • the method may comprise recording (208) classification at the manufacturing execution system together with an identifier.
  • the identifier may be selected by a selector based on an output from the computer-implemented classifier.
  • the output from the computer-implemented classifier may be an array of likelihood values with likelihood values associated with respective meat product types.
  • the likelihood values may be associated with meat product types selected from one or more production recipes each listing multiple meat product types.
  • the method may comprise reading (209) the identifier at a visual inspection position.
  • the method may comprise receiving (210) input from the human operator.
  • the input from the human operator may be associated with a tray identifier associated with the tray at a visual inspection position.
  • the method may comprise updating (211) the mapping function of the computer-implemented classifier based on a determination that the operator input corresponds to selection of a product type different from the classification product type.
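As a minimal sketch, the flow of fig. 2 (steps 203 to 211) can be expressed as one function that acquires the inputs, classifies, records the result, and updates the mapping function when the operator disagrees. Every class, function, and parameter name below is a hypothetical stand-in for the stations and systems described above, not an API from the application.

```python
# Sketch of the fig. 2 flow (steps 203-211). All names are assumed
# placeholders for the image recording station, MES, load cell,
# classifier, and operator interface described in the text.

class Tray:
    def __init__(self, identifier):
        self.identifier = identifier

def process_tray(tray, record_image, query_recipes, read_mass,
                 classify, record_at_mes, operator_input, update_mapping):
    image = record_image(tray)                     # step 203: image recording station
    recipes = query_recipes()                      # step 204: MES database query
    mass = read_mass(tray)                         # step 205: load cell
    product_type = classify(image, recipes, mass)  # steps 206-207: classification
    record_at_mes(tray.identifier, product_type)   # step 208: record at the MES
    chosen = operator_input(tray, product_type)    # steps 209-210: visual inspection
    if chosen != product_type:                     # step 211: update mapping function
        update_mapping(image, recipes, mass, chosen)
    return chosen

# Minimal exercise of the flow: the operator corrects "loin" to "belly".
log = {}
final = process_tray(
    Tray("tray-7"),
    record_image=lambda t: "IMG",
    query_recipes=lambda: ["pork side"],
    read_mass=lambda t: 7.3,
    classify=lambda img, r, m: "loin",
    record_at_mes=log.__setitem__,
    operator_input=lambda t, p: "belly",
    update_mapping=lambda *a: log.setdefault("updated", True))
print(final, log["tray-7"], log["updated"])  # belly loin True
```

The stand-in callables make the flow testable in isolation; in a production line each would be backed by the corresponding station or system.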


Abstract

A method, comprising at a slaughterhouse with a conveyor, transporting trays with meat products, and a manufacturing execution system in communication with one or more sensors tracking the trays with meat products and engaged in accordance with one or more production recipes: recording at least one image of a meat product while the meat product, placed in a tray, passes an image recording station at the conveyor; performing classification of the meat product by a computer-implemented classifier receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types; wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types; acquiring, from the manufacturing execution system, identification of one or more current production recipes; and assisting the computer-implemented classifier by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier and/or by restricting classification to product types comprised by the one or more current production recipes; recording with the manufacturing execution system, a classification product type associated with a tracking identifier associated with the tray and determined by the classification.

Description

A method at a slaughterhouse
A method at a slaughterhouse or abattoir for classifying meat products, also known as cuts.
It is observed that there is a need for more efficient and robust classification of meat products in slaughterhouses.
SUMMARY
There is provided a method comprising: at a slaughterhouse with a conveyor, transporting trays with meat products, and a manufacturing execution system in communication with one or more sensors tracking the trays with meat products and engaged in accordance with one or more production recipes: recording at least one image of a meat product while the meat product, placed in a tray, passes an image recording station at the conveyor; performing classification of the meat product by a computer-implemented classifier receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types; wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types; acquiring, from the manufacturing execution system, identification of one or more current production recipes; and assisting the computer-implemented classifier by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier and/or by restricting classification to product types comprised by the one or more current production recipes; and recording with the manufacturing execution system, a classification product type associated with a tracking identifier associated with the tray and determined by the classification.
Thereby it is possible to reduce misclassification rates for computer-implemented classification systems in slaughterhouses (also known as abattoirs) and/or to free human operators occupied with classifying meat products arriving in a fast and steady stream from that wearisome manual task. In one or more aspects, the likelihood values may be represented as is known in the art, e.g. as estimated probabilities. The classifier may be implemented as one classifier or an array of binary classifiers. The classifier may implement a so-called Sigmoid function or a so-called Softmax function.
In one or more aspects, one or more current production recipes may be acquired from the manufacturing execution system. There may be one current production recipe for the slaughterhouse or for a particular production line in the slaughterhouse to be accordingly acquired. In some situations, e.g. in connection with transitioning from one production recipe to another and/or in connection with receiving different inputs to a production line, there may be multiple current production recipes. A current production recipe represents which products the manufacturing execution system is engaged to produce along the production line at a current time slot or period of time.
It should be understood that restricting classification to product types comprised by the one or more current production recipes may comprise ignoring likelihood values associated with product types not comprised by one or more current production recipes.
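As an illustrative sketch of this restriction, the classifier's likelihood array can be masked so that product types outside the current production recipes are ignored before selecting the classification product type. The product-type names, likelihood values, and recipe contents below are assumed example data, not taken from the application.

```python
# Sketch: restrict classification to product types comprised by the
# current production recipes by ignoring all other likelihood values.
# Product types, likelihoods, and recipes here are illustrative assumptions.

def restrict_to_recipes(likelihoods, product_types, current_recipes):
    """Zero out likelihood values for product types not in any current recipe."""
    allowed = set()
    for recipe in current_recipes:
        allowed.update(recipe)
    return [p if name in allowed else 0.0
            for p, name in zip(likelihoods, product_types)]

def classify(likelihoods, product_types, current_recipes):
    """Pick the allowed product type with the highest likelihood value."""
    masked = restrict_to_recipes(likelihoods, product_types, current_recipes)
    best = max(range(len(masked)), key=masked.__getitem__)
    return product_types[best]

product_types = ["ham", "loin", "belly", "shoulder", "rind", "fat"]
likelihoods = [0.05, 0.30, 0.10, 0.05, 0.40, 0.10]  # e.g. a Softmax output
pork_side = ["ham", "loin", "belly", "shoulder"]    # current production recipe

# "rind" has the highest raw likelihood but is not in the current recipe,
# so the restricted classification yields "loin" instead.
print(classify(likelihoods, product_types, [pork_side]))  # loin
```

The example shows why the restriction reduces misclassification: a confusable product type that cannot occur under the current recipe can never be selected.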
As is common practice, a person skilled in the art knows how to record, at or with the manufacturing execution system, a classification product type associated with a tracking identifier associated with the tray. This may involve database queries.
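Such a recording may, for instance, amount to a single keyed write against the manufacturing execution system's database. The sketch below uses an in-memory SQLite table; the table and column names are hypothetical, chosen only for illustration.

```python
# Sketch: recording a classification product type against a tray's
# tracking identifier. The SQLite schema here is an assumed example,
# not a schema from the application.
import sqlite3

mes = sqlite3.connect(":memory:")
mes.execute("""CREATE TABLE classification (
    tray_id TEXT PRIMARY KEY,
    product_type TEXT NOT NULL)""")

def record_classification(db, tray_id, product_type):
    """Insert, or update on reclassification, the product type for a tray."""
    db.execute(
        "INSERT INTO classification (tray_id, product_type) VALUES (?, ?) "
        "ON CONFLICT(tray_id) DO UPDATE SET product_type = excluded.product_type",
        (tray_id, product_type))
    db.commit()

record_classification(mes, "tray-0042", "loin")
row = mes.execute(
    "SELECT product_type FROM classification WHERE tray_id = ?",
    ("tray-0042",)).fetchone()
print(row[0])  # loin
```

The upsert keeps one row per tray, so a later operator correction simply overwrites the earlier classification for the same tracking identifier.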
In one or more aspects, a production recipe comprises a list or another collection of meat product types. A production recipe may be related to a particular portion of the carcass input at the slaughterhouse. In some examples, a production recipe comprises four meat product types, e.g. for “pork side”: ham, loin, belly, and shoulder. In another example, a production recipe comprises five meat product types, e.g. for “pork loin”: bone-out loin, loin ribs, rind, fat, and trimmings. In some embodiments, the conveyor is configured with a load cell and the method may comprise acquiring from the load cell a mass value representing the mass of the meat product and the tray. The method may further comprise including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier. It has been observed during experimentation that including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier improves classification by reducing misclassification.
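One way to include the mass value and the recipe identification in the feature descriptors is simple concatenation: image-derived features, the load-cell reading, and a one-hot encoding of the current production recipe. All names, the recipe list, and the example feature values below are assumptions for illustration.

```python
# Sketch: assembling the feature descriptors input to the classifier
# from image-derived features, the load-cell mass value, and an
# encoding of the current production recipe. All names are assumed.

RECIPES = ["pork side", "pork loin"]  # known recipes, in a fixed order

def encode_recipe(current_recipe):
    """One-hot encoding of the current production recipe identification."""
    return [1.0 if r == current_recipe else 0.0 for r in RECIPES]

def build_feature_descriptors(image_features, mass_kg, current_recipe):
    """Concatenate image features, mass value, and recipe encoding."""
    return list(image_features) + [mass_kg] + encode_recipe(current_recipe)

features = build_feature_descriptors(
    image_features=[0.12, 0.87, 0.45],  # e.g. from a CNN or colour statistics
    mass_kg=7.3,                        # meat product plus tray, from the load cell
    current_recipe="pork loin")
print(features)  # [0.12, 0.87, 0.45, 7.3, 0.0, 1.0]
```

A fixed recipe ordering keeps the descriptor layout stable between training and classification, which is what lets the classifier exploit the recipe signal.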
The mass value may be represented in kilograms or pounds or in accordance with another system. In some embodiments, an electronic device with a user interface for interaction with a human operator may be in communication with the manufacturing execution system, and the computer-implemented classifier may perform classification in accordance with a mapping function; the method may comprise, via the user interface at a visual inspection position, receiving an operator input selecting a product type in response to the human operator performing visual inspection of the meat product in the tray. The method may further comprise associating the operator input with a tray identifier associated with the tray at the visual inspection position. The method may further comprise determining that the operator input corresponds to selection of a product type different from the classification product type and, in accordance therewith, updating the mapping function of the computer-implemented classifier.
Thereby the computer-implemented classifier can be trained or retrained by supervised learning in accordance with human visual inspection. Despite involving human supervision, such a method greatly reduces the amount of human labour and gradually improves the classification towards an acceptable level of classification performance.
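The supervision loop can be sketched as collecting a labelled example whenever the operator's selection disagrees with the classification, for later retraining of the mapping function. Function and variable names are hypothetical; how the retraining itself is performed depends on the chosen classifier.

```python
# Sketch: turning operator corrections into labelled training examples.
# Names are assumed; the retraining step itself is classifier-specific.

corrections = []  # (feature_descriptors, operator_product_type) pairs

def handle_operator_input(features, classified_type, operator_type):
    """Record a labelled example when the operator disagrees.

    Returns True when the mapping function should be updated/retrained.
    """
    if operator_type != classified_type:
        corrections.append((features, operator_type))
        return True
    return False

# The operator confirms one tray and corrects another.
handle_operator_input([0.1, 0.9], "loin", "loin")
handle_operator_input([0.4, 0.6], "loin", "belly")
print(len(corrections))  # 1
```

Only disagreements are stored here; confirmations could equally be kept as positive examples if the retraining scheme benefits from them.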
The user interface may comprise a touch-sensitive display screen encapsulated in accordance with standard specifications applicable or required at slaughterhouses.
In some embodiments, the user interface may display a graphical indicator representing a meat product type associated with a tray currently at the visual inspection position.
The graphical indicator may comprise one or more of a name, e.g. a short-name of the product, and a graphical image or icon. Displaying such a graphical indicator greatly assists the human operator in correcting or entering a classification.
In some embodiments, the computer-implemented classifier may comprise one or more of: a neural network, such as a deep neural network, a convolutional neural network, a support vector machine, and a reinforcement learning algorithm.
In some aspects images acquired at the image recording station are provided as feature descriptors input to the classifier. In some aspects the feature descriptors are supplemented by a mass value and the identification of one or more current production recipes. The manufacturing execution system sets or gets such production recipes including current production recipes.
In some embodiments, the image recording station may comprise multiple cameras arranged above the trays passing on the conveyor and inclined along respective multiple axes; the method may comprise recording respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor. The method may further comprise including information about each of the respective multiple images, acquired from the respective cameras, in the feature descriptors input to the computer-implemented classifier.
In some embodiments, the method may comprise generating a computer-readable three-dimensional model of the meat product in the tray and including information about the computer-readable three-dimensional model in the feature descriptors input to the computer-implemented classifier.
Information about the computer-readable three-dimensional model may comprise information about geometrical dimensions of identified portions of the three-dimensional model (3D model) and/or information about texture. The geometrical dimensions may include one or more of distances and curvatures. In some embodiments, the at least one image of a product may be one or more of: a colour image representing visible light, an image representing infrared light, an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product. In particular, one or more colour images representing visible light may be supplemented by an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product.
In some embodiments, the image recording station may comprise one or more cameras recording radiation at: one or more visible colours, infrared light, ultraviolet light, or Roentgen radiation. In some embodiments, a tray may comprise an identification member encoded with an identifier for the tray and wherein the manufacturing execution system keeps track of the tray among other trays.
The identification member may be a label with a printed code e.g. a bar code or a QR code or similar. Alternatively or additionally the identification member may comprise a radio frequency transmitter and receiver e.g. a so-called RFID tag.
In some embodiments, the manufacturing execution system may register and may maintain unique sequence numbers for respective unique trays. In some embodiments, the method may comprise routing the tray on a conveyor system with multiple tracks and/or storage compartments in accordance with the classification product type associated with the tray.
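The routing step can be sketched as a lookup from classification product type to a conveyor track, with a default track for anything unrecognised. The track numbers and product-type names are illustrative assumptions.

```python
# Sketch: routing a tray on a multi-track conveyor system in accordance
# with its classification product type. Track assignments are assumed.

TRACK_BY_TYPE = {"ham": 1, "loin": 2, "belly": 3, "shoulder": 4}
DEFAULT_TRACK = 0  # e.g. a track leading to manual inspection

def route_tray(product_type):
    """Return the conveyor track for a classification product type."""
    return TRACK_BY_TYPE.get(product_type, DEFAULT_TRACK)

print(route_tray("belly"))    # 3
print(route_tray("unknown"))  # 0
```

Routing unknown or out-of-recipe types to a default inspection track keeps misclassified trays from entering the wrong storage compartment.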
Generally, herein, “associated with” may be understood as a relation between data items in a database, a list, a table, or another data structure. In case the term is used in connection with a physical item that is “associated with” a data item, some type of code attached to, connected to, or located in a predefined way relative to the physical item serves to establish “associated with”.
Generally, herein, a manufacturing execution system (MES) is a computer-implemented system which may comprise programmed computers, sensors and actuators communicating via a data network. A manufacturing execution system is used in slaughterhouses to track and document the transformation from livestock or carcass bodies to meat products, e.g. as generally known to a consumer.
An MES may provide information that helps manufacturing decision makers understand how current conditions on a production line can be optimized to improve production output. An MES works in real time to enable the control of multiple elements of the production process (e.g. inputs, personnel, machines and support services). An MES may operate across multiple function areas, for example: management of product definitions, resource scheduling, order execution and dispatch, production analysis and downtime management for overall equipment effectiveness (OEE), product quality, or materials track and trace. MES are especially important in regulated industries, such as the food industry, where documentation and proof of processes, events and actions may be required.
BRIEF DESCRIPTION OF THE FIGURES
A more detailed description follows below with reference to the drawing, in which:

fig. 1 shows a block diagram of a slaughterhouse production line and a classification module in communication with a manufacturing execution system; and

fig. 2 shows a flowchart of classification and assisting a computer-implemented classifier at a slaughterhouse.

DETAILED DESCRIPTION
Fig. 1 shows a block diagram of a slaughterhouse production line (101) and a classification module (116) in communication with a manufacturing execution system (108). The slaughterhouse production line (101) comprises a conveyor (102) for transporting trays (104) with meat products (106) along the conveyor. The slaughterhouse production line (101) may comprise an image recording station (114), configured such that at least one image (IMGx) is recorded when the trays (104) with the meat products (106) pass by. The slaughterhouse production line (101) may comprise one or more sensors (110a, 110b) along the conveyor. A manufacturing execution system (108) may be in communication with the one or more sensors (110a, 110b) for tracking the trays (104) with meat products (106).
A computer-implemented classifier (116) may be configured to perform classification of the meat product (106). The classifier (116) may receive input, such as feature descriptors comprising information about the at least one image (IMGx). The classifier (116) may provide output, such as an array of likelihood values associated with respective meat product types (P1,...,Pi). The likelihood values may be associated with meat product types (P1,...,Pi) selected from one or more production recipes (R1,...,Ri), each listing multiple meat product types (P1,...,Pi).
The manufacturing execution system (108) may be configured to identify one or more current production recipes (R1,...,Ri). The identification of one or more current production recipes (R1,...,Ri) may be included in the feature descriptors input to the computer-implemented classifier (116), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The classification of the meat product (106) may be restricted to classification of meat product types (P1,...,Pi) comprised by the one or more current production recipes (R1,...,Ri), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The manufacturing execution system (108) may record a classification product type associated with a tracking identifier associated with the tray (104) and determined by the classification.
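Restricting classification to the product types of the current production recipes can be illustrated as masking the classifier's likelihood array. The following is a minimal sketch under assumed names; the product types, recipe contents and values are illustrative only, not the patent's actual data.

```python
# Hypothetical sketch: zero out likelihoods for meat product types not listed
# in any current production recipe, then renormalize the remaining values.

def restrict_to_recipes(likelihoods, product_types, current_recipes):
    """Mask the likelihood array to the product types allowed by the
    current production recipes and renormalize to sum to 1."""
    allowed = set()
    for recipe in current_recipes:
        allowed.update(recipe)
    masked = [p if t in allowed else 0.0
              for p, t in zip(likelihoods, product_types)]
    total = sum(masked)
    if total == 0:
        raise ValueError("no allowed product type has non-zero likelihood")
    return [p / total for p in masked]

product_types = ["loin", "belly", "shoulder", "ham"]   # illustrative types
likelihoods = [0.5, 0.3, 0.1, 0.1]                     # classifier output
recipes = [["loin", "belly"]]                          # one active recipe
print(restrict_to_recipes(likelihoods, product_types, recipes))
```

Only "loin" and "belly" keep non-zero (renormalized) likelihoods; types outside the active recipe are excluded from the classification outcome.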
The slaughterhouse production line (101 ) may comprise a load cell (118). The load cell (118) may be configured to provide a mass value representing the mass of the meat product (106) and the tray (104).
The mass value may be included in the feature descriptors input to the computer-implemented classifier (116), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The image recording station (114) may comprise multiple cameras arranged above the trays passing on the conveyor (102) and inclined along respective multiple axes. The image recording station (114) may record respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor. Information about each of the respective multiple images, acquired from the respective cameras, may be included in the feature descriptors input to the computer-implemented classifier (116).
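Assembling the feature descriptors from the multi-camera images and the load-cell mass value can be sketched as below. The field names, the container type and the stand-in feature extractor are assumptions for illustration; the patent does not specify a descriptor format.

```python
# Hedged sketch of feature-descriptor assembly: per-camera image information
# plus the mass value (meat product + tray) from the load cell.
from dataclasses import dataclass, field

@dataclass
class FeatureDescriptors:
    image_features: list          # one feature vector per camera image
    mass_grams: float             # from the load cell: meat product + tray
    current_recipes: list = field(default_factory=list)

def build_descriptors(images, mass_grams, current_recipes):
    # Stand-in feature extractor: mean pixel intensity per image; a real
    # system would use richer image features.
    feats = [sum(img) / len(img) for img in images]
    return FeatureDescriptors(feats, mass_grams, current_recipes)

d = build_descriptors([[10, 20, 30], [40, 60]], mass_grams=1250.0,
                      current_recipes=["R1"])
print(d.image_features)   # [20.0, 50.0]
```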
The slaughterhouse production line (101) may comprise an electronic device (120) with a user interface for interaction with a human operator, configured to be in communication with the manufacturing execution system (108). The computer-implemented classifier (116) may perform classification in accordance with a mapping function. A human operator at a visual inspection position (122) may perform visual inspection of the meat product (106) in the tray (104). The human operator, at the visual inspection position (122), may provide input to the electronic device (120) via the user interface. The input may be selecting a product type in response to the visual inspection of the meat product (106) in the tray (104).
The computer-implemented classifier (116) may be part of a classification module (130). The classification module (130) may further comprise a selector (126) and an image processor (128).
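The selector's role of choosing a product type from the classifier's likelihood array can be sketched as a simple argmax with a confidence gate. The 0.8 threshold and the "INSPECT" routing sentinel are assumptions for illustration, not part of the patent's disclosure.

```python
# Minimal selector sketch: pick the product type with the highest likelihood;
# defer to human visual inspection when the classifier is not confident.

def select_product_type(likelihoods, product_types, threshold=0.8):
    best = max(range(len(likelihoods)), key=lambda i: likelihoods[i])
    if likelihoods[best] < threshold:
        return "INSPECT"   # hypothetical: route tray to visual inspection
    return product_types[best]

print(select_product_type([0.05, 0.9, 0.05], ["loin", "belly", "ham"]))
# confident case: the highest-likelihood type is returned
print(select_product_type([0.4, 0.35, 0.25], ["loin", "belly", "ham"]))
# low-confidence case: the sentinel is returned instead
```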
The manufacturing execution system (108) may comprise a database (124). Alternatively, the manufacturing execution system (108) may be connected with or may be in communication with a database (124). Fig. 2 shows a flowchart of a method (201) of classification and assisting a computer-implemented classifier at a slaughterhouse.
The method is related to the features of fig. 1.
The method may comprise acquiring (203) an image recorded by an image recording station. The method may comprise querying (204) a database of a manufacturing execution system for the current production recipes, each listing multiple meat product types. The method may comprise reading (205) a mass value from a load cell. The method may comprise inputting (206) the image, the current production recipes and the mass value to a computer-implemented classifier. The method may comprise performing (207) classification. The computer-implemented classifier may perform classification in accordance with a mapping function.
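Steps (203) through (207) can be sketched as a single pipeline function. The device interfaces (`camera`, `mes_db`, `load_cell`, `classifier`) are stand-ins invented for illustration; real equipment interfaces will differ.

```python
# Hypothetical sketch of the flowchart steps (203)-(207) as one pipeline.

def classify_tray(camera, mes_db, load_cell, classifier):
    image = camera.acquire()                    # step 203: acquire image
    recipes = mes_db.query_current_recipes()    # step 204: query MES database
    mass = load_cell.read()                     # step 205: read mass value
    descriptors = {"image": image, "recipes": recipes, "mass": mass}  # 206
    return classifier.classify(descriptors)    # step 207: classify

# Trivial stubs standing in for the real devices, for a runnable example.
class _Stub:
    def __init__(self, value): self.value = value
    def acquire(self): return self.value
    def query_current_recipes(self): return self.value
    def read(self): return self.value
    def classify(self, d): return {"likelihoods": [0.9, 0.1], "input": d}

result = classify_tray(_Stub("IMG1"), _Stub(["R1"]), _Stub(1250.0), _Stub(None))
print(result["likelihoods"])   # [0.9, 0.1]
```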
The method may comprise recording (208) the classification at the manufacturing execution system together with an identifier. The identifier may be selected by a selector based on an output from the computer-implemented classifier. The output from the computer-implemented classifier may be an array of likelihood values with likelihood values associated with respective meat product types. The likelihood values may be associated with meat product types selected from one or more production recipes each listing multiple meat product types.
The method may comprise reading (209) the identifier at a visual inspection position. The method may comprise receiving (210) input from the human operator. The input from the human operator may be associated with a tray identifier associated with the tray at the visual inspection position. The method may comprise updating (211) the mapping function of the computer-implemented classifier, based on a determination that the operator input corresponds to selection of a product type different from the classification product type.
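The feedback step (211) can be sketched as collecting operator corrections for later retraining of the mapping function. The buffer-and-retrain trigger shown here is one plausible realization under stated assumptions, not the patent's prescribed update mechanism.

```python
# Hedged sketch: when the operator's selected product type differs from the
# classification product type, store the feature descriptors with the
# corrected label so the mapping function can be updated (e.g. retrained).

class FeedbackBuffer:
    def __init__(self, retrain_after=100):
        self.samples = []              # (feature_descriptors, corrected_label)
        self.retrain_after = retrain_after

    def record(self, descriptors, classified_type, operator_type):
        # Only a mismatch between classifier and operator triggers a stored
        # correction; agreements need no update.
        if operator_type != classified_type:
            self.samples.append((descriptors, operator_type))
        # Returns True when enough corrections have accumulated to retrain.
        return len(self.samples) >= self.retrain_after

buf = FeedbackBuffer(retrain_after=2)
print(buf.record([0.1, 0.2], "loin", "loin"))    # agreement: nothing stored
print(buf.record([0.1, 0.2], "loin", "belly"))   # first mismatch stored
print(buf.record([0.3, 0.1], "ham", "belly"))    # second mismatch: retrain due
```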

Claims

1. A method, comprising: at a slaughterhouse with a conveyor (102), transporting trays (104) with meat products (106), and a manufacturing execution system (108) in communication with one or more sensors (110) tracking the trays (104) with meat products (106) and engaged in accordance with one or more production recipes: recording at least one image of a meat product (106) while the meat product (106), placed in a tray (104), passes an image recording station (114) at the conveyor (102); performing classification of the meat product (106) by a computer-implemented classifier (116) receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types; wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types; acquiring, from the manufacturing execution system (108), identification of one or more current production recipes; and assisting the computer-implemented classifier (116) by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier (116) and/or by restricting classification to product types comprised by the one or more current production recipes; recording, with the manufacturing execution system (108), a classification product type associated with a tracking identifier associated with the tray (104) and determined by the classification.
2. A method according to claim 1, wherein the conveyor is configured with a load cell, the method comprising: acquiring from the load cell a mass value representing the mass of the meat product and the tray; and including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier.
3. A method according to any of the above claims, wherein an electronic device with a user interface for interaction with a human operator is in communication with the manufacturing execution system; and wherein the computer-implemented classifier performs classification in accordance with a mapping function; the method comprising: via the user interface, at a visual inspection position, receiving an operator input selecting a product type in response to the human operator performing visual inspection of the meat product in the tray; associating the operator input with a tray identifier associated with the tray at a visual inspection position; and determining that the operator input corresponds to selection of a product type different from the classification product type; and in accordance therewith updating the mapping function of the computer-implemented classifier.

4. A method according to claim 3, wherein the user interface displays a graphical indicator representing a meat product type associated with a tray currently at the visual inspection position.
5. A method according to any of the above claims, wherein the computer-implemented classifier comprises one or more of: a neural network, such as a deep neural network, a convolutional neural network, a support vector machine, and a reinforcement learning algorithm.
6. A method according to any of the above claims, wherein the image recording station comprises multiple cameras arranged above the trays passing on the conveyor and inclined along respective multiple axes; comprising: recording respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor; including information about each of the respective multiple images, acquired from the respective cameras, in the feature descriptors input to the computer-implemented classifier.
7. A method according to any of the above claims, comprising: generating a computer-readable three-dimensional model of the meat product in the tray and including information about the computer-readable three-dimensional model in the feature descriptors input to the computer-implemented classifier.

8. A method according to any of the above claims, wherein the at least one image of a product is one or more of: a colour image representing visible light, an image representing infrared light, an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product.
9. A method according to any of the above claims, wherein the image recording station comprises one or more cameras recording radiation at: one or more visible colours, infrared light, ultraviolet light, or Roentgen radiation.
10. A method according to any of the above claims, wherein a tray comprises an identification member encoded with an identifier for the tray and wherein the manufacturing execution system keeps track of the tray among other trays.
11. A method according to any of the above claims, wherein the manufacturing execution system registers and maintains unique sequence numbers for respective unique trays.
12. A method according to any of the above claims, comprising routing the tray on a conveyor system with multiple tracks and/or storage compartments in accordance with the classification product type associated with the tray.
13. A manufacturing system for a slaughterhouse with a conveyor (102) for transporting trays (104) with meat products (106), comprising: one or more sensors (110) tracking the trays (104) with meat products (106); one or more cameras, at an image recording station, for recording at least one image of a meat product (106) while the meat product (106), placed in a tray (104), passes the image recording station; and a computer configured to perform the method in accordance with any of claims 1-12.
PCT/EP2019/082187 2018-11-23 2019-11-22 A method at a slaughterhouse WO2020104636A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DKPA201870776 2018-11-23
DKPA201870776A DK179792B1 (en) 2018-11-23 2018-11-23 A method for classifying meat products in a slaughterhouse

Publications (1)

Publication Number Publication Date
WO2020104636A1 true WO2020104636A1 (en) 2020-05-28

Family

ID=69156041

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/082187 WO2020104636A1 (en) 2018-11-23 2019-11-22 A method at a slaughterhouse

Country Status (2)

Country Link
DK (1) DK179792B1 (en)
WO (1) WO2020104636A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11803958B1 (en) 2021-10-21 2023-10-31 Triumph Foods Llc Systems and methods for determining muscle fascicle fracturing

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2362710A (en) * 2000-05-25 2001-11-28 Intelligent Mfg Systems Ltd Scanning meat samples for bones where sample container has a data storage element read by control means
WO2018167089A1 (en) * 2017-03-13 2018-09-20 Carometec A/S 3d imaging system and method of imaging carcasses


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
J. LECLERE ET AL: "3.II-P9 A BEEF CARCASS CLASSIFICATION BY ON-LINE IMAGE ANALYSIS", 1 January 2001 (2001-01-01), XP055659177, Retrieved from the Internet <URL:http://icomst-proceedings.helsinki.fi/papers/2000_06_12.pdf> [retrieved on 20200117] *


Also Published As

Publication number Publication date
DK201870776A1 (en) 2019-06-20
DK179792B1 (en) 2019-06-20

Similar Documents

Publication Publication Date Title
AU2007242918B2 (en) Method and system for associating source information for a source unit with a product converted therefrom
US11175650B2 (en) Product knitting systems and methods
US10282600B2 (en) Visual task feedback for workstations in materials handling facilities
EP0820029A2 (en) Indentification and tracking of articles
US20180181913A1 (en) Shelf allocation assistance device, shelf allocation assistance system, shelf allocation assistance method, and recording medium
CN108573368B (en) Management device and readable recording medium
EP4088233A1 (en) Systems and methods for anomaly recognition and detection using lifelong deep neural networks
CN110942035A (en) Method, system, device and storage medium for acquiring commodity information
KR20240001241A (en) Image-based anomaly detection based on machine learning analysis of objects
CN116596441A (en) Intelligent warehouse service management method and system based on cloud computing
WO2020104636A1 (en) A method at a slaughterhouse
CN114186933A (en) Cold chain food intelligent supervision platform
Chen et al. AI applications to shop floor management in lean manufacturing
CN108480220A (en) A kind of materials-sorting system
CN113971574A (en) Meat product processing process safety tracing method based on Internet of things
US20130006697A1 (en) Using prime numbers and prime number factorization to track articles through transit points in a supply chain
US20240000088A1 (en) A method of tracking a food item in a processing facility, and a system for processing food items
US20240159856A1 (en) Determining a location of rfid tag
US20230153978A1 (en) Methods and systems for grading devices
US10962961B2 (en) Systems and methods for tracking cutting implements in processing facilities
US20230252407A1 (en) Systems and methods of defining and identifying product display areas on product display shelves
US20240104495A1 (en) System and method for tracking inventory inside warehouse with put-away accuracy using machine learning models
US20220067645A1 (en) Systems and Methods for Automatic Determination of a Packing Configuration to Pack Items in Shipping Boxes
Zhao et al. Online Fault Detection Based on Kernel Perceptron for Evolving Features
Prakash Integrating structured and unstructured data for imbalanced classification using meat-cut images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19808583

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19808583

Country of ref document: EP

Kind code of ref document: A1