DK179792B1 - A method for classifying meat products in a slaughterhouse - Google Patents

A method for classifying meat products in a slaughterhouse

Info

Publication number
DK179792B1
Authority
DK
Denmark
Prior art keywords
tray
meat product
computer
classification
image
Prior art date
Application number
DKPA201870776A
Other languages
Danish (da)
Inventor
Lauridsen Thomas
Bonde Nielsen Flemming
Original Assignee
Frontmatec Smørum A/S
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Frontmatec Smørum A/S filed Critical Frontmatec Smørum A/S
Priority to DKPA201870776A priority Critical patent/DK179792B1/en
Application granted granted Critical
Publication of DK201870776A1 publication Critical patent/DK201870776A1/en
Publication of DK179792B1 publication Critical patent/DK179792B1/en
Priority to PCT/EP2019/082187 priority patent/WO2020104636A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A22 - BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B - SLAUGHTERING
    • A22B 5/00 - Accessories for use during or after slaughtering
    • A22B 5/0064 - Accessories for use during or after slaughtering for classifying or grading carcasses; for measuring back fat
    • A22B 5/007 - Non-invasive scanning of carcasses, e.g. using image recognition, tomography, X-rays, ultrasound
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 33/00 - Investigating or analysing materials by specific methods not covered by groups G01N 1/00 - G01N 31/00
    • G01N 33/02 - Food
    • G01N 33/12 - Meat; fish

Abstract

A method, comprising at a slaughterhouse with a conveyor, transporting trays with meat products, and a manufacturing execution system in communication with one or more sensors tracking the trays with meat products and engaged in accordance with one or more production recipes: recording at least one image of a meat product while the meat product, placed in a tray, passes an image recording station at the conveyor; performing classification of the meat product by a computer-implemented classifier receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types; wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types; acquiring, from the manufacturing execution system, identification of one or more current production recipes; and assisting the computer-implemented classifier by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier and/or by restricting classification to product types comprised by the one or more current production recipes; and recording, with the manufacturing execution system, a classification product type associated with a tracking identifier associated with the tray and determined by the classification.

Description

A method at a slaughterhouse
A method at a slaughterhouse or abattoir for classifying meat products, also known as cuts.
It is observed that there is a need for more efficient and robust classification of meat products in slaughterhouses.
SUMMARY
There is provided a method comprising:
at a slaughterhouse with a conveyor, transporting trays with meat products, and a manufacturing execution system in communication with one or more sensors tracking the trays with meat products and engaged in accordance with one or more production recipes:
recording at least one image of a meat product while the meat product, placed in a tray, passes an image recording station at the conveyor;
performing classification of the meat product by a computer-implemented classifier receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types;
wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types;
acquiring, from the manufacturing execution system, identification of one or more current production recipes; and assisting the computer-implemented classifier by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier and/or by restricting classification to product types comprised by the one or more current production recipes; and recording, with the manufacturing execution system, a classification product type associated with a tracking identifier associated with the tray and determined by the classification.
Thereby it is possible to reduce misclassification rates for computer-implemented classification systems in slaughterhouses (also known as abattoirs) and/or to free human operators, occupied with classifying meat products arriving in a fast and steady stream, from that wearisome manual task.
In one or more aspects, the likelihood values may be represented as is known in the art, e.g. as estimated probabilities. The classifier may be implemented as one classifier or as an array of binary classifiers. The classifier may implement a so-called Sigmoid function or a so-called Softmax function.
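As an illustration, the following minimal Python sketch (with made-up scores; the patent does not specify an implementation) shows how a Softmax function turns raw classifier scores into an array of likelihood values, and how a Sigmoid function would serve an array of binary classifiers:

```python
import math

def softmax(scores):
    """Map raw classifier scores to likelihood values that sum to 1."""
    m = max(scores)  # subtract the maximum for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(score):
    """Per-type likelihood when using an array of binary classifiers."""
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical scores for the four types of a "pork side" recipe:
# ham, loin, belly, shoulder.
likelihoods = softmax([2.1, 0.3, -1.2, 0.8])
```

Either function yields values in (0, 1); Softmax additionally normalises them so the array can be read as a probability distribution over the meat product types.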
In one or more aspects, one or more current production recipes may be acquired from the manufacturing execution system. There may be one current production recipe for the slaughterhouse or for a particular production line in the slaughterhouse to be accordingly acquired. In some situations, e.g. in connection with transitioning from one production recipe to another and/or in connection with receiving different inputs to a production line, there may be multiple current production recipes. A current production recipe represents which products the manufacturing execution system is engaged to produce along the production line at a current time slot or period of time.
It should be understood that restricting classification to product types comprised by the one or more current production recipes may comprise ignoring likelihood values associated with product types not comprised by one or more current production recipes.
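A minimal sketch of that restriction follows; the type names and likelihood values are hypothetical, chosen to match the recipe examples in this description:

```python
# Hypothetical classifier output: likelihood values keyed by product type,
# covering types from two recipes ("pork side" and "pork loin").
likelihoods = {
    "ham": 0.15, "loin": 0.05, "belly": 0.08, "shoulder": 0.02,
    "bone out loin": 0.40, "rind": 0.20, "fat": 0.06, "trimmings": 0.04,
}

def classify_restricted(likelihoods, current_recipe_types):
    """Ignore likelihood values for product types not comprised by the
    current production recipes, then pick the most likely remaining type."""
    allowed = {t: p for t, p in likelihoods.items() if t in current_recipe_types}
    return max(allowed, key=allowed.get)

# With "pork side" as the only current recipe, the overall maximum
# ("bone out loin") is ignored and "ham" is selected.
result = classify_restricted(likelihoods, {"ham", "loin", "belly", "shoulder"})
```

This is the mechanism by which knowledge of the current production recipes reduces misclassification: types that cannot occur are simply excluded from the decision.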
As it is common practice, a person skilled in the art knows how to record a classification product type associated with a tracking identifier associated with the tray at or with the manufacturing execution system. This may involve database queries.
In one or more aspects, a production recipe comprises a list or another collection of meat product types. A production recipe may be related to a particular portion of the carcass input at the slaughterhouse. In some examples, a production recipe comprises four meat product types, e.g. for “pork side”: ham, loin, belly, and shoulder. In another example, a production recipe comprises five meat product types, e.g. for “pork loin”: bone out loin, loin ribs, rind, fat, and trimmings.
In some embodiments, the conveyor is configured with a load cell and the method may comprise acquiring from the load cell a mass value representing the mass of the meat product and the tray. The method may further comprise including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier.
It has been observed during experimentation that including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier improves classification by reducing misclassification.
The mass value may be represented in kilograms or pounds, or in accordance with another unit system.
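For illustration, a sketch of appending the load-cell mass value, normalised to kilograms, to the feature descriptors; the helper names and descriptor layout are assumptions, as the patent does not prescribe a representation:

```python
LB_PER_KG = 2.20462  # conversion factor if the load cell reports pounds

def pounds_to_kg(mass_lb):
    """Normalise a pound reading to kilograms before classification."""
    return mass_lb / LB_PER_KG

def build_feature_descriptors(image_features, mass_kg, recipe_ids):
    """Concatenate image-derived features, the mass value from the load
    cell and the identification of current production recipes into one
    classifier input vector."""
    return list(image_features) + [mass_kg] + list(recipe_ids)

descriptors = build_feature_descriptors([0.12, 0.87, 0.45],
                                        mass_kg=9.6, recipe_ids=[1])
```

Keeping one canonical unit in the descriptors matters: a classifier trained on kilogram values would silently misweight trays reported in pounds.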
In some embodiments, an electronic device with a user interface for interaction with a human operator may be in communication with the manufacturing execution system, and the computer-implemented classifier may perform classification in accordance with a mapping function. The method may comprise receiving, via the user interface at a visual inspection position, an operator input selecting a product type in response to the human operator performing visual inspection of the meat product in the tray. The method may further comprise associating the operator input with a tray identifier associated with the tray at the visual inspection position. The method may further comprise determining that the operator input corresponds to selection of a product type different from the classification product type, and in accordance therewith updating the mapping function of the computer-implemented classifier.
Thereby the computer-implemented classifier can be trained or retrained by supervised learning in accordance with human visual inspection. Despite involving human supervision, such a method greatly reduces the amount of human labour and gradually improves the classification towards an acceptable level of classification performance.
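One way such supervision could be collected, as a sketch with hypothetical data structures (the patent leaves the training mechanism open):

```python
training_set = []  # descriptor/label pairs for retraining the mapping function

def handle_operator_input(tray_id, descriptors, classifier_type, operator_type):
    """If the operator's selection differs from the classification product
    type, store the pair as a supervised training sample and report that
    the mapping function should be updated."""
    if operator_type != classifier_type:
        training_set.append({"tray": tray_id,
                             "descriptors": descriptors,
                             "label": operator_type})
        return True   # trigger an update of the mapping function
    return False      # classification confirmed; no update needed

updated = handle_operator_input("tray-0042", [0.1, 0.2], "loin", "belly")
```

Confirmed classifications cost the operator nothing, while disagreements accumulate into labelled data for the next retraining pass.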
The user interface may comprise a touch-sensitive display screen encapsulated in accordance with standard specifications applicable or required at slaughterhouses.
In some embodiments, the user interface may display a graphical indicator representing a meat product type associated with a tray currently at the visual inspection position.
The graphical indicator may comprise one or more of a name, e.g. a short name of the product, and a graphical image or icon. Displaying such a graphical indicator greatly assists the human operator in correcting or entering a classification.
In some embodiments, the computer-implemented classifier may comprise one or more of: a neural network, such as a deep neural network, a convolutional neural network, a support vector machine, and a reinforcement learning algorithm.
In some aspects, images acquired at the image recording station are provided as feature descriptors input to the classifier. In some aspects, the feature descriptors are supplemented by a mass value and the identification of one or more current production recipes. The manufacturing execution system sets or retrieves such production recipes, including the current production recipes.
In some embodiments, the image recording station may comprise multiple cameras arranged above the trays passing on the conveyor and inclined along respective multiple axes; the method may comprise recording respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor. The method may further comprise including information about each of the respective multiple images, acquired from the respective cameras, in the feature descriptors input to the computer-implemented classifier.
In some embodiments, the method may comprise generating a computer-readable three-dimensional model of the meat product in the tray and including information about the computer-readable three-dimensional model in the feature descriptors input to the computer-implemented classifier.
Information about the computer-readable three-dimensional model may comprise information about geometrical dimensions of identified portions of the three-dimensional model (3D model) and/or information about texture. The geometrical dimensions may include one or more of distances and curvatures.
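A minimal sketch of deriving such geometrical dimensions from a 3D model represented as points; this is a simplification for illustration, as the patent does not fix a model format:

```python
import math

def bounding_dimensions(points):
    """Length, width and height of the axis-aligned box enclosing a
    three-dimensional model given as (x, y, z) points."""
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

def point_distance(p, q):
    """Euclidean distance between two model points, e.g. across an
    identified portion of the meat product."""
    return math.dist(p, q)

dims = bounding_dimensions([(0, 0, 0), (2, 1, 3), (1, 0.5, 2)])
```

Distances and box dimensions like these could be appended to the feature descriptors alongside texture information; curvature estimation would require a richer surface representation than a point list.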
In some embodiments, the at least one image of a product may be one or more of: a colour image representing visible light, an image representing infrared light, an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product.
In particular, one or more colour images representing visible light may be supplemented by an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product.
In some embodiments, the image recording station may comprise one or more cameras recording radiation at: one or more visible colours, infrared light, ultraviolet light, or Roentgen radiation.
In some embodiments, a tray may comprise an identification member encoded with an identifier for the tray and wherein the manufacturing execution system keeps track of the tray among other trays.
The identification member may be a label with a printed code e.g. a bar code or a QR code or similar. Alternatively or additionally the identification member may comprise a radio frequency transmitter and receiver e.g. a so-called RFID tag.
In some embodiments, the manufacturing execution system may register and may maintain unique sequence numbers for respective unique trays.
In some embodiments, the method may comprise routing the tray on a conveyor system with multiple tracks and/or storage compartments in accordance with the classification product type associated with the tray.
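A sketch of such routing; the track names and the routing table are made up for illustration:

```python
# Hypothetical mapping from classification product type to a conveyor
# track or storage compartment.
ROUTING_TABLE = {"ham": "track-1", "loin": "track-2",
                 "belly": "track-3", "shoulder": "track-4"}

def route_tray(classification_product_type, routing_table,
               default="manual-inspection"):
    """Select the track/compartment for a tray from its classification
    product type; unrecognised types fall back to manual inspection."""
    return routing_table.get(classification_product_type, default)

track = route_tray("loin", ROUTING_TABLE)
```

The fallback route is a design choice: a tray whose classification does not map to any track is better diverted to an operator than sent down an arbitrary one.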
Generally, herein, “associated with” may be understood as a relation between data items in a database, a list, a table, or another data structure. Where the term is used in connection with a physical item that is “associated with” a data item, some type of code attached to, connected to, or located in a predefined way relative to the physical item serves to establish the association.
Generally, herein, a manufacturing execution system (MES) is a computer-implemented system which may comprise programmed computers, sensors and actuators communicating via a data network. A manufacturing execution system is used in slaughterhouses to track and document the transformation from livestock or carcass bodies to meat products, e.g. as generally known to a consumer.
An MES may provide information that helps manufacturing decision makers understand how current conditions on a production line can be optimized to improve production output. An MES works in real time to enable the control of multiple elements of the production process (e.g. inputs, personnel, machines and support services). An MES may operate across multiple function areas, for example: management of product definitions, resource scheduling, order execution and dispatch, production analysis and downtime management for overall equipment effectiveness (OEE), product quality, or materials track and trace. MES are especially important in regulated industries, such as the food industry, where documentation and proof of processes, events and actions may be required.
BRIEF DESCRIPTION OF THE FIGURES
A more detailed description follows below with reference to the drawing, in which:
fig. 1 shows a block diagram of a slaughterhouse production line and a classification module in communication with a manufacturing execution system; and fig. 2 shows a flowchart of classification and assisting a computer-implemented classifier at a slaughterhouse.
DETAILED DESCRIPTION
Fig. 1 shows a block diagram of a slaughterhouse production line (101) and a classification module (116) in communication with a manufacturing execution system (108). The slaughterhouse production line (101) comprises a conveyor (102) for transporting trays (104) with meat products (106) along the conveyor. The slaughterhouse production line (101) may comprise an image recording station (114), configured such that at least one image (IMGx) is recorded when the trays (104) with the meat products (106) pass by. The slaughterhouse production line (101) may comprise one or more sensors (110a, 110b) along the conveyor. A manufacturing execution system (108) may be in communication with the one or more sensors (110a, 110b) for tracking the trays (104) with meat products (106).
A computer-implemented classifier (116) may be configured to perform classification of the meat product (106). The classifier (116) may receive input such as feature descriptors comprising information about the at least one image (IMGx). The classifier (116) may provide output such as an array of likelihood values with likelihood values associated with respective meat product types (P1, ..., Pi). The likelihood values may be associated with meat product types (P1, ..., Pi) selected from one or more production recipes (R1, ..., Ri) each listing multiple meat product types (P1, ..., Pi).
The manufacturing execution system (108) may be configured to identify one or more current production recipes (R1, ..., Ri). The identification of one or more current production recipes (R1, ..., Ri) may be included in the feature descriptors input to the computer-implemented classifier (116), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The classification of the meat product (106) may be restricted to classification of meat product types (P1, ..., Pi) comprised by the one or more current production recipes (R1, ..., Ri), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The manufacturing execution system (108) may record a classification product type associated with a tracking identifier associated with the tray (104) and determined by the classification.
The slaughterhouse production line (101) may comprise a load cell (118). The load cell (118) may be configured to provide a mass value representing the mass of the meat product (106) and the tray (104).
The mass value may be included in the feature descriptors input to the computer-implemented classifier (116), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106).
The image recording station (114) may comprise multiple cameras arranged above the trays passing on the conveyor (102) and inclined along respective multiple axes. The image recording station (114) may record respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor. Information about each of the respective multiple images, acquired from the respective cameras, may be included in the feature descriptors input to the computer-implemented classifier (116).
The slaughterhouse production line (101) may comprise an electronic device (120) with a user interface for interaction with a human operator, configured to be in communication with the manufacturing execution system (108). The computer-implemented classifier (116) may perform classification in accordance with a mapping function. A human operator at a visual inspection position (122) may perform visual inspection of the meat product (106) in the tray (104). The human operator, at the visual inspection position (122), may provide input to the electronic device (120) via the user interface. The input may select a product type in response to the visual inspection of the meat product (106) in the tray (104).
The computer-implemented classifier (116) may be part of a classification module (130). The classification module (130) may further comprise a selector (126) and an image processor (128).
The manufacturing execution system (108) may comprise a data base (124). Alternatively, the manufacturing execution system (108) may be connected with or may be in communication with a data base (124).
Fig. 2 shows a flowchart of a method (201) of classification and assisting a computer-implemented classifier at a slaughterhouse.
The method is related to the features of fig. 1.
The method may comprise acquiring (203) an image recorded by an image recording station. The method may comprise querying (204) a data base of a manufacturing execution system for the current production recipes each listing multiple meat product types. The method may comprise reading (205) a mass value from a load cell. The method may comprise inputting (206) the image, current production recipe and mass value to a computer-implemented classifier. The method may comprise performing (207) classification. The computer-implemented classifier may perform classification in accordance with a mapping function.
The method may comprise recording (208) the classification at the manufacturing execution system together with an identifier. The identifier may be selected by a selector based on an output from the computer-implemented classifier. The output from the computer-implemented classifier may be an array of likelihood values with likelihood values associated with respective meat product types. The likelihood values may be associated with meat product types selected from one or more production recipes each listing multiple meat product types.
The method may comprise reading (209) the identifier at a visual inspection position. The method may comprise receiving (210) input from the human operator. The input from the human operator may be associated with a tray identifier associated with the tray at the visual inspection position. The method may comprise updating (211) the mapping function of the computer-implemented classifier, based on a determination that the operator input corresponds to selection of a product type different from the classification product type.
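The steps of fig. 2 can be sketched end-to-end as follows; all interfaces are hypothetical callables, since the patent does not specify them:

```python
def classification_cycle(acquire_image, query_recipes, read_mass,
                         classify, record):
    """One pass of steps 203-208: gather inputs, build the feature
    descriptors, classify, and record the result at the MES."""
    image = acquire_image()       # (203) image from the recording station
    recipes = query_recipes()     # (204) current production recipes from MES
    mass = read_mass()            # (205) mass value from the load cell
    descriptors = {"image": image, "recipes": recipes, "mass": mass}  # (206)
    product_type = classify(descriptors)   # (207) classification
    record(product_type)          # (208) record at the MES with an identifier
    return product_type

# A dry run with stub callables standing in for the station, MES,
# load cell and classifier:
recorded = []
result = classification_cycle(
    acquire_image=lambda: "IMG1",
    query_recipes=lambda: ["pork side"],
    read_mass=lambda: 9.6,
    classify=lambda d: "ham" if d["mass"] > 5 else "trimmings",
    record=recorded.append,
)
```

Steps 209-211 (reading the identifier at the visual inspection position, receiving operator input, and updating the mapping function) would then run asynchronously to this per-tray cycle.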

Claims (5)

Patentkravclaims 1. Fremgangsmåde, som omfatter:A method comprising: i et slagteri med en transportør (102), der transporterer bakker (104) med kødprodukter (106), og et produktionseksekveringssystem (108), som kommunikerer med en eller flere sensorer (110), der sporer bakkerne (104) med kødprodukter (106), og er i funktion i overensstemmelse med en eller flere produktionsrecepter:in a slaughterhouse with a conveyor (102) transporting trays (104) with meat products (106) and a production execution system (108) communicating with one or more sensors (110) tracking the trays (104) with meat products (106) ), and function according to one or more production recipes: at optage mindst et billede af et kødprodukt (106) mens kødproduktet (106), der er anbragt i bakken (104), passerer en billedoptagelsesstation (114) ved transportøren (102);receiving at least one image of a meat product (106) while the meat product (106) disposed in the tray (104) passes an image recording station (114) at the conveyor (102); at udføre klassificering af kødproduktet (106) ved hjælp af en computerimplementeret klassificeringsindretning (116), der som sin indlæsning modtager egenskabsdeskriptorer, der omfatter informationer om det mindst ene billede og udsender et sæt af sandsynlighedsværdier med sandsynlighedsværdier, som er associeret med respektive kødprodukttyper;performing the classification of the meat product (106) by means of a computer-implemented rating device (116) which receives as its input property descriptors which include information on the at least one image and outputs a set of probability values with probability values associated with respective meat product types; hvor sandsynlighedsværdierne er associeret med kødprodukttyper udvalgt blandt en eller flere produktionsrecepter, som hver oplister flere kødprodukttyper;wherein the probability values are associated with meat product types selected from one or more production recipes, each listing several meat product types; at 
opnå identifikation af en eller flere aktuelle produktionsrecepter fra produktionseksekveringssystemet (108); og at assistere den computerimplementerede klassificeringsindretning (116) ved at inkludere identifikationen af en eller flere aktuelle produktionsrecepter i egenskabsdeskriptorerne, der er indlæst i den computerimplementerede klassificeringsindretning (116), og/eller ved at indskrænke klassificering til produkttyper, der er omfattet af den ene eller flere aktuelle produktionsrecepter;obtaining identification of one or more current production prescriptions from the production execution system (108); and assisting the computer-implemented rating device (116) by including the identification of one or more current production prescriptions in the property descriptors loaded into the computer-implemented rating device (116) and / or by restricting classification to product types covered by one or more several current production recipes; at registrere med produktionseksekveringssystemet (108) en klassificeringsprodukttype, som er associeret med en sporingsidentifikator, som er associeret med bakken (104) og bestemt af klassificeringen.registering with the production execution system (108) a classification product type associated with a trace identifier associated with the tray (104) and determined by the classification. 2. 
Fremgangsmåde ifølge krav 1, hvor transportøren er konfigureret med en vejecelle, hvilken fremgangsmåde omfatter:The method of claim 1, wherein the conveyor is configured with a weighing cell, comprising: at opnå fra vejecellen en vægtværdi, der repræsenterer vægten af kødproduktet og bakken; og at inkludere vægtværdien, der er opnået fra vejecellen, i egenskabsdeskriptorerne, der er indlæst i den computerimplementerede klassificeringsindretning.obtaining from the weighing cell a weight value representing the weight of the meat product and the tray; and including the weight value obtained from the weighing cell in the property descriptors loaded into the computer-implemented rating device. 3. Fremgangsmåde ifølge et hvilket som helst af de foregående krav, hvor en elektronisk indretning med en brugergrænseflade til interaktion med en operatør kommunikerer med produktionseksekveringssystemet; og hvor den computerimplementerede klassificeringsindretning udfører klassificering i overensstemmelse med en mapping-funktion; hvilken fremgangsmåde omfatter:A method according to any one of the preceding claims, wherein an electronic device with a user interface for interaction with an operator communicates with the production execution system; and wherein the computer-implemented rating device performs classification in accordance with a mapping function; which method comprises: at modtage via brugergrænsefladen i en visuel inspektionsposition en operatørindlæsning, der udvælger en produkttype som reaktion på operatøren, der udfører en visuel inspektion af kødproduktet i bakken;receiving through the user interface in a visual inspection position an operator loading which selects a product type in response to the operator performing a visual inspection of the meat product in the tray; at associere operatørindlæsningen med en bakkeidentifikator, som er associeret med bakken i en visuel inspektionsposition; og at bestemme, at operatørindlæsningen svarer til udvælgelse afen 
produkttype, som er forskellig fra klassificeringsprodukttypen; og i overensstemmelse dermed opdatere mapping-funktionen i den computerimplementerede klassificeringsindretning.associating the operator input with a tray identifier associated with the tray in a visual inspection position; and determining that the operator input corresponds to selecting a product type different from the classification product type; and accordingly update the mapping feature of the computer-implemented rating device. 4. Fremgangsmåde ifølge krav 3, hvor brugergrænsefladen viser en grafisk indikator, som repræsenterer en kødprodukttype, der er associeret med en bakke, som aktuelt befinder sig i den visuelle inspektionsposition.The method of claim 3, wherein the user interface shows a graphical indicator representing a meat product type associated with a tray currently in the visual inspection position. 5. Fremgangsmåde ifølge et hvilket som helst af de foregående krav, hvor den computerimplementerede klassificeringsindretning omfatter et eller flere af: et neuralt netværk såsom et dybt neuralt netværk (deep neural network), et konvolutionelt neuralt netværk (convolutional neural network), en støttevektormaskine (support vector machine) og en forstærkningslæringsalgoritme (reinforcement learning algorithm).A method according to any one of the preceding claims, wherein the computer implemented classifier comprises one or more of: a neural network such as a deep neural network, a convolutional neural network, a support vector machine ( support vector machine) and a reinforcement learning algorithm. 6. 
Fremgangsmåde ifølge et hvilket som helst af de foregående krav, hvor billedoptagelsesstationen omfatter flere kameraer, som er anbragt over bakkerne, der passerer på transportøren og er hældende langs respektive flere akser; omfattende:A method according to any one of the preceding claims, wherein the image capture station comprises multiple cameras disposed over the trays passing on the conveyor and inclined along respective multiple axes; comprehensive: at optage respektive flere billeder af et kødprodukt, mens kødproduktet, der er anbragt i en bakke, passerer billedoptagelsesstationen ved transportøren;recording respective multiple images of a meat product while the meat product placed in a tray passes the image capture station at the conveyor; at inkludere informationer om hvert af de respektive flere billeder, der er hentet fra de respektive kameraer, i egenskabsdeskriptorerne, der er indlæst i den computerimplementerede klassificeringsindretning.including information about each of the respective multiple images retrieved from the respective cameras in the property descriptors loaded in the computer implemented rating device. 7. Fremgangsmåde ifølge et hvilket som helst af de foregående krav, omfattende:A method according to any one of the preceding claims, comprising: at generere en computerlæsbar tredimensionel model af kødproduktet i bakken og inkludere informationer om den computerlæsbare tredimensionelle model i egenskabsdeskriptorerne, derer indlæst i den computerimplementerede klassificeringsindretning.generating a computer-readable three-dimensional model of the meat product in the tray and including information about the computer-readable three-dimensional model in the property descriptors loaded in the computer-implemented rating device. 8. 
Fremgangsmåde ifølge et hvilket som helst af de foregående krav, hvor det mindst ene billede af et produkt er et eller flere af: et farvebillede, der viser et synligt lys, et billede, der viser infrarødt lys, et billede, der viser ultraviolet lys, et billede, der viser absorption af røntgenstråling, og et billede, der optages, mens struktureret lys projiceres på kødproduktet.A method according to any one of the preceding claims, wherein the at least one image of a product is one or more of: a color image showing a visible light, an image showing an infrared light, an image showing an ultraviolet light, an image that shows X-ray absorption, and an image that is recorded while textured light is projected onto the meat product. 9. Fremgangsmåde ifølge et hvilket som helst af de foregående krav, hvor billedoptagelsesstationen omfatter et eller flere kameraer, som optager stråling ved: en eller flere synlige farver, infrarødt lys, ultraviolet lys eller røntgenstråling.A method according to any one of the preceding claims, wherein the image capture station comprises one or more cameras which record radiation by: one or more visible colors, infrared light, ultraviolet light or x-ray radiation. 10. Fremgangsmåde ifølge et hvilket som helst af de foregående krav, hvor en bakke omfatter et identifikationselement, som er kodet med en identifikator for bakken, og hvor produktionseksekveringssystemet holder styr på bakken blandt andre bakker.A method according to any one of the preceding claims, wherein a tray comprises an identification element encoded with an identifier for the tray and wherein the production execution system keeps track of the tray among other trays. 11. 
11. A method according to any one of the preceding claims, wherein the production execution system records and maintains unique sequence numbers for respective unique trays. 12. A method according to any one of the preceding claims, comprising routing the tray in a multi-track conveyor system and/or a storage chamber in accordance with the classification product type associated with the tray.
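Claims 11 and 12 — unique sequence numbers per unique tray, and routing by the classification product type associated with the tray — can be sketched together. The route table and function names are illustrative assumptions; the patent does not specify particular tracks or product types:

```python
import itertools

_seq = itertools.count(1)

def assign_sequence_number(tray_id, registry):
    # Record and maintain a unique sequence number per unique tray:
    # a tray seen before keeps its number; a new tray gets the next one.
    if tray_id not in registry:
        registry[tray_id] = next(_seq)
    return registry[tray_id]

# Illustrative mapping from classification product type to a conveyor
# track or storage chamber (assumed names, not from the patent).
ROUTES = {"loin": "track-1", "belly": "track-2", "ham": "chamber-A"}

def route_tray(product_type, default="manual-inspection"):
    # Direct the tray in the multi-track conveyor system and/or to a
    # storage chamber according to its classification product type.
    return ROUTES.get(product_type, default)

registry = {}
print(assign_sequence_number("tray-7", registry))  # 1
print(assign_sequence_number("tray-7", registry))  # 1 (same tray, same number)
print(route_tray("belly"))  # track-2
```

The default route models the practical case of a product type the table does not cover; a production system would raise an alarm or divert such trays rather than guess.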
DKPA201870776A 2018-11-23 2018-11-23 A method for classifying meat products in a slaughterhouse DK179792B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DKPA201870776A DK179792B1 (en) 2018-11-23 2018-11-23 A method for classifying meat products in a slaughterhouse
PCT/EP2019/082187 WO2020104636A1 (en) 2018-11-23 2019-11-22 A method at a slaughterhouse

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DKPA201870776A DK179792B1 (en) 2018-11-23 2018-11-23 A method for classifying meat products in a slaughterhouse

Publications (2)

Publication Number Publication Date
DK201870776A1 DK201870776A1 (en) 2019-06-20
DK179792B1 true DK179792B1 (en) 2019-06-20

Family

ID=69156041

Family Applications (1)

Application Number Title Priority Date Filing Date
DKPA201870776A DK179792B1 (en) 2018-11-23 2018-11-23 A method for classifying meat products in a slaughterhouse

Country Status (2)

Country Link
DK (1) DK179792B1 (en)
WO (1) WO2020104636A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11803958B1 (en) 2021-10-21 2023-10-31 Triumph Foods Llc Systems and methods for determining muscle fascicle fracturing

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2362710B (en) * 2000-05-25 2002-07-17 Intelligent Mfg Systems Ltd Analysis of samples
EP3595453B1 (en) * 2017-03-13 2021-08-25 Frontmatec Smørum A/S 3d imaging system and method of imaging carcasses

Also Published As

Publication number Publication date
DK201870776A1 (en) 2019-06-20
WO2020104636A1 (en) 2020-05-28

Similar Documents

Publication Publication Date Title
US10762378B2 (en) Photo analytics calibration
US20190220692A1 (en) Method and apparatus for checkout based on image identification technique of convolutional neural network
AU2007242918A1 (en) Method and system for associating source information for a source unit with a product converted therefrom
US20180181913A1 (en) Shelf allocation assistance device, shelf allocation assistance system, shelf allocation assistance method, and recording medium
US20230011901A1 (en) Systems and methods for anomaly recognition and detection using lifelong deep neural networks
DK179792B1 (en) A method for classifying meat products in a slaughterhouse
JP2021149142A (en) Meat discrimination device
CN114186933A (en) Cold chain food intelligent supervision platform
Jarkas et al. ResNet and Yolov5-enabled non-invasive meat identification for high-accuracy box label verification
CN109685002A (en) A kind of dataset acquisition method, system and electronic device
CN111595237B (en) Distributed system and method for measuring fabric size based on machine vision
CN112069841B (en) X-ray contraband parcel tracking method and device
CN113971574A (en) Meat product processing process safety tracing method based on Internet of things
US10181108B1 (en) Item attribute collection
US20240000088A1 (en) A method of tracking a food item in a processing facility, and a system for processing food items
AU2022100022A4 (en) Meat processing tracking, tracing and yield measurement
US10814354B1 (en) Computerized system, method and processor-executable code to autonomously screen mailable items to identify candidate items for content inspection or testing
US20230252407A1 (en) Systems and methods of defining and identifying product display areas on product display shelves
US20230153978A1 (en) Methods and systems for grading devices
US20230169452A1 (en) System Configuration for Learning and Recognizing Packaging Associated with a Product
US20240054447A1 (en) Inventory characterization and identification system
WO2023152893A1 (en) Management device, management system, management method, and program
Prakash Integrating structured and unstructured data for imbalanced classification using meat-cut images
CN114998722A (en) Information management method and device
GB2605760A (en) Determining location of RFID tag

Legal Events

Date Code Title Description
PAT Application published

Effective date: 20190617

PME Patent granted

Effective date: 20190620

PPF Opposition filed

Opponent name: DK:TEKNOLOGISK INSTITUT

Effective date: 20200320

PIU Opposition: patent invalid

Opponent name: DK:TEKNOLOGISK INSTITUT

Effective date: 20210705