DK201870776A1 - A method at a slaughterhouse
- Publication number: DK201870776A1
- Authority: DK (Denmark)
Classifications
- A—HUMAN NECESSITIES
- A22—BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
- A22B—SLAUGHTERING
- A22B5/00—Accessories for use during or after slaughtering
- A22B5/0064—Accessories for use during or after slaughtering for classifying or grading carcasses; for measuring back fat
- A22B5/007—Non-invasive scanning of carcasses, e.g. using image recognition, tomography, X-rays, ultrasound
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
- G01N33/02—Food
- G01N33/12—Meat; Fish
Abstract
A method, comprising at a slaughterhouse with a conveyor, transporting trays with meat products, and a manufacturing execution system in communication with one or more sensors tracking the trays with meat products and engaged in accordance with one or more production recipes: recording at least one image of a meat product while the meat product, placed in a tray, passes an image recording station at the conveyor; performing classification of the meat product by a computer-implemented classifier receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types; wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types; acquiring, from the manufacturing execution system, identification of one or more current production recipes; and assisting the computer-implemented classifier by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier and/or by restricting classification to product types comprised by the one or more current production recipes; and recording, with the manufacturing execution system, a classification product type associated with a tracking identifier associated with the tray and determined by the classification.
Description
A method at a slaughterhouse
A method at a slaughterhouse or abattoir for classifying meat products, also known as cuts.
It is observed that there is a need for more efficient and robust classification of meat products in slaughterhouses.
SUMMARY
There is provided a method comprising:
at a slaughterhouse with a conveyor, transporting trays with meat products, and a manufacturing execution system in communication with one or more sensors tracking the trays with meat products and engaged in accordance with one or more production recipes:
recording at least one image of a meat product while the meat product, placed in a tray, passes an image recording station at the conveyor;
performing classification of the meat product by a computer-implemented classifier receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types;
wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types;
acquiring, from the manufacturing execution system, identification of one or more current production recipes; and assisting the computer-implemented classifier by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier and/or by restricting
classification to product types comprised by the one or more current production recipes; and recording, with the manufacturing execution system, a classification product type associated with a tracking identifier associated with the tray and determined by the classification.
Thereby it is possible to reduce misclassification rates for computer-implemented classification systems in slaughterhouses (also known as abattoirs) and/or to free human operators, occupied with classifying meat products arriving in a fast and steady stream, from that wearisome manual task.
In one or more aspects, the likelihood values may be represented as is known in the art, e.g. as estimated probabilities. The classifier may be implemented as one classifier or as an array of binary classifiers. The classifier may implement a so-called Sigmoid function or a so-called Softmax function.
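The Softmax function mentioned above can be sketched in a few lines; a minimal stand-alone illustration in Python, where the raw scores and the set of four product types are hypothetical examples rather than values from the application:

```python
import math

def softmax(scores):
    """Convert raw classifier scores into likelihood values that sum to 1."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]  # subtract the max for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for four product types, e.g. ham, loin, belly, shoulder.
scores = [2.0, 1.0, 0.5, 0.1]
likelihoods = softmax(scores)
```

A Sigmoid function would instead map each score independently to a value between 0 and 1, which suits an array of binary classifiers rather than a single multi-class one.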
In one or more aspects, one or more current production recipes may be acquired from the manufacturing execution system. There may be one current production recipe for the slaughterhouse or for a particular production line in the slaughterhouse to be accordingly acquired. In some situations, e.g. in connection with transitioning from one production recipe to another and/or in connection with receiving different inputs to a production line, there may be multiple current production recipes. A current production recipe represents which products the manufacturing execution system is engaged to produce along the production line at a current time slot or period of time.
It should be understood that restricting classification to product types comprised by the one or more current production recipes may comprise ignoring likelihood values associated with product types not comprised by one or more current production recipes.
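Restricting classification in this way can be sketched as a masking step over the likelihood array; a minimal illustration with hypothetical product types and values:

```python
def restrict_to_recipe(likelihoods, product_types, recipe_types):
    """Zero out likelihoods of product types outside the current production
    recipes, then renormalise so the remaining values still sum to 1."""
    masked = [p if t in recipe_types else 0.0
              for p, t in zip(likelihoods, product_types)]
    total = sum(masked)
    return [m / total for m in masked] if total > 0 else masked

product_types = ["ham", "loin", "belly", "shoulder"]
likelihoods = [0.30, 0.35, 0.20, 0.15]
# Only "ham" and "loin" are listed by the current (hypothetical) recipe.
restricted = restrict_to_recipe(likelihoods, product_types, {"ham", "loin"})
```

Note how a close call between "loin" and "belly" is resolved decisively once "belly" is known not to be in production.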
As it is common practice, a person skilled in the art knows how to record a classification product type associated with a tracking identifier associated with
the tray at or with the manufacturing execution system. This may involve database queries.
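As one illustration of such a database query, the recording step might be a simple insert keyed by the tracking identifier; the table and column names below are hypothetical, not taken from the application:

```python
import sqlite3

# In-memory stand-in for the MES database; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tray_log (tray_id TEXT PRIMARY KEY, product_type TEXT)")

# Record the classification product type against the tray's tracking identifier.
conn.execute("INSERT INTO tray_log VALUES (?, ?)", ("tray-0042", "loin"))
row = conn.execute("SELECT product_type FROM tray_log WHERE tray_id = ?",
                   ("tray-0042",)).fetchone()
```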
In one or more aspects, a production recipe comprises a list or another collection of meat product types. A production recipe may be related to a particular portion of the carcass input at the slaughterhouse. In some examples, a production recipe comprises four meat product types, e.g. for “pork side”: ham, loin, belly, and shoulder. In another example, a production recipe comprises five meat product types, e.g. for “pork loin”: bone-out loin, loin ribs, rind, fat, and trimmings.
In some embodiments, the conveyor is configured with a load cell and the method may comprise acquiring from the load cell a mass value representing the mass of the meat product and the tray. The method may further comprise including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier.
It has been observed during experimentation that including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier improves classification by reducing misclassification.
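Including the mass value in the feature descriptors can be sketched as simple feature concatenation; all names and values below are illustrative assumptions, and a real system would derive the image features from the recorded images:

```python
def build_feature_descriptors(image_features, mass_kg, current_recipes, known_recipes):
    """Concatenate image-derived features, the load-cell mass value, and a
    one-hot encoding of the current production recipes into one input vector."""
    recipe_onehot = [1.0 if r in current_recipes else 0.0 for r in known_recipes]
    return list(image_features) + [mass_kg] + recipe_onehot

features = build_feature_descriptors(
    image_features=[0.12, 0.80, 0.33],            # e.g. colour/texture statistics
    mass_kg=4.7,                                   # tray plus product, from the load cell
    current_recipes={"pork_side"},
    known_recipes=["pork_side", "pork_loin"],
)
```

The mass helps distinguish product types that look alike in an image but differ in size, which is one plausible reason the observed misclassification rate drops.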
The mass value may be represented in kilograms or pounds, or in accordance with another system of units.
In some embodiments, an electronic device with a user interface for interaction with a human operator may be in communication with the manufacturing execution system; and the computer-implemented classifier may perform classification in accordance with a mapping function; the method may comprise via the user interface, at a visual inspection position, receiving an operator input selecting a product type in response to the human operator performing visual inspection of the meat product in the tray. The method may further comprise associating the operator input with a tray identifier associated with the tray at a visual inspection position. The method may further comprise determining that the operator input corresponds to selection of a product type
different from the classification product type; and in accordance therewith updating the mapping function of the computer-implemented classifier.
Thereby the computer-implemented classifier can be trained or retrained by supervised learning in accordance with human visual inspection. Despite involving human supervision, such a method greatly reduces the amount of human labour and gradually improves the classification towards an acceptable level of classification performance.
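The update loop above can be sketched with a toy lookup-table stand-in for the mapping function; a real system would retrain the underlying model (e.g. a neural network) on the collected correction pairs, and the feature key and product names here are hypothetical:

```python
class RetrainableClassifier:
    """Toy stand-in for a classifier with an updatable mapping function: a real
    system would retrain a model on the collected correction pairs."""

    def __init__(self, mapping):
        self.mapping = dict(mapping)   # feature key -> product type
        self.corrections = []          # (feature key, operator-selected type) pairs

    def classify(self, feature_key):
        return self.mapping.get(feature_key, "unknown")

    def operator_feedback(self, feature_key, operator_type):
        """Update the mapping only when the operator's selection disagrees."""
        if self.classify(feature_key) != operator_type:
            self.corrections.append((feature_key, operator_type))
            self.mapping[feature_key] = operator_type

clf = RetrainableClassifier({"long_lean": "loin"})
clf.operator_feedback("long_lean", "belly")   # operator disagrees at inspection
```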
The user interface may comprise a touch-sensitive display screen encapsulated in accordance with standard specifications applicable or required at slaughterhouses.
In some embodiments, the user interface may display a graphical indicator representing a meat product type associated with a tray currently at the visual inspection position.
The graphical indicator may comprise one or more of a name e.g. a shortname of the product and a graphical image or icon. Displaying such a graphical indicator greatly assists the human operator in correcting or entering a classification.
In some embodiments, the computer-implemented classifier may comprise one or more of: a neural network, such as a deep neural network, a convolutional neural network, a support vector machine, and a reinforcement learning algorithm.
In some aspects, images acquired at the image recording station are provided as feature descriptors input to the classifier. In some aspects, the feature descriptors are supplemented by a mass value and the identification of one or more current production recipes. The manufacturing execution system sets and gets such production recipes, including the current production recipes.
In some embodiments, the image recording station may comprise multiple cameras arranged above the trays passing on the conveyor and inclined along
respective multiple axes; the method may comprise recording respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor. The method may further comprise including information about each of the respective multiple images, acquired from the respective cameras, in the feature descriptors input to the computer-implemented classifier.
In some embodiments, the method may comprise generating a computer-readable three-dimensional model of the meat product in the tray and including information about the computer-readable three-dimensional model in the feature descriptors input to the computer-implemented classifier.
Information about the computer-readable three-dimensional model may comprise information about geometrical dimensions of identified portions of the three-dimensional model (3D model) and/or information about texture. The geometrical dimensions may include one or more of distances and curvatures.
In some embodiments, the at least one image of a product may be one or more of: a colour image representing visible light, an image representing infrared light, an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product.
In particular, one or more colour images representing visible light may be supplemented by an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product.
In some embodiments, the image recording station may comprise one or more cameras recording radiation at: one or more visible colours, infrared light, ultraviolet light, or Roentgen radiation.
In some embodiments, a tray may comprise an identification member encoded with an identifier for the tray and wherein the manufacturing execution system keeps track of the tray among other trays.
The identification member may be a label with a printed code e.g. a bar code or a QR code or similar. Alternatively or additionally the identification member may comprise a radio frequency transmitter and receiver e.g. a so-called RFID tag.
In some embodiments, the manufacturing execution system may register and may maintain unique sequence numbers for respective unique trays.
In some embodiments, the method may comprise routing the tray on a conveyor system with multiple tracks and/or storage compartments in accordance with the classification product type associated with the tray.
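Routing by classification product type can be sketched as a lookup from product type to track; the track names and the fallback to a manual-inspection track are illustrative assumptions:

```python
def route_tray(classification_type, track_map, default_track="manual_inspection"):
    """Choose a conveyor track or storage compartment from the classification
    product type recorded for the tray; track names are illustrative."""
    return track_map.get(classification_type, default_track)

track_map = {"ham": "track_1", "loin": "track_2", "belly": "track_3"}
```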
Generally, herein, “associated with” may be understood as a relation in a database, a list, a table, or another data structure between data items. Where the term is used in connection with a physical item that is “associated with” a data item, some type of code attached to, connected to, or located in a predefined way relative to the physical item serves to establish the association.
Generally, herein, a manufacturing execution system (MES) is a computer-implemented system which may comprise programmed computers, sensors and actuators communicating via a data network. A manufacturing execution system is used in slaughterhouses to track and document the transformation from livestock or carcass bodies to meat products, e.g. as generally known to a consumer.
MES may provide information that helps manufacturing decision makers understand how current conditions on a production line can be optimized to improve production output. MES works in real time to enable the control of multiple elements of the production process (e.g. inputs, personnel, machines
and support services). MES may operate across multiple function areas, for example: management of product definitions, resource scheduling, order execution and dispatch, production analysis and downtime management for overall equipment effectiveness (OEE), product quality, or materials track and trace. MES are especially important in regulated industries, such as the food industry, where documentation and proof of processes, events and actions may be required.
BRIEF DESCRIPTION OF THE FIGURES
A more detailed description follows below with reference to the drawing, in which:
fig. 1 shows a block diagram of a slaughterhouse production line and a classification module in communication with a manufacturing execution system; and fig. 2 shows a flowchart of classification and assisting a computer-implemented classifier at a slaughterhouse.
DETAILED DESCRIPTION
Fig. 1 shows a block diagram of a slaughterhouse production line (101) and a classification module (116) in communication with a manufacturing execution system (108). The slaughterhouse production line (101) comprises a conveyor (102) for transporting trays (104) with meat products (106) along the conveyor. The slaughterhouse production line (101) may comprise an image recording station (114), configured such that at least one image (IMGx) is recorded when the trays (104) with the meat products (106) pass by. The slaughterhouse production line (101) may comprise one or more sensors (110a, 110b) along the conveyor. A manufacturing execution system (108) may be in communication with the one or more sensors (110a, 110b) for tracking the trays (104) with meat products (106).
A computer-implemented classifier (116) may be configured to perform classification of the meat product (106). The classifier (116) may receive input such as feature descriptors comprising information about the at least one image (IMGx). The classifier (116) may provide output such as an array of likelihood values with likelihood values associated with respective meat product types (P1,...Pi). The likelihood values may be associated with meat product types (P1,...Pi) selected from one or more production recipes (R1,...Ri) each listing multiple meat product types (P1,...Pi).
The manufacturing execution system (108) may be configured to identify one or more current production recipes (R1,...Ri). The identification of one or more current production recipes (R1,...Ri) may be included in the feature descriptors input to the computer-implemented classifier (116), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The classification of the meat product (106) may be restricted to classification of meat product types (P1,...Pi) comprised by the one or more current production recipes (R1,...Ri), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106). The manufacturing execution system (108) may record a classification product type associated with a tracking identifier associated with the tray (104) and determined by the classification.
The slaughterhouse production line (101) may comprise a load cell (118). The load cell (118) may be configured to provide a mass value representing the mass of the meat product (106) and the tray (104).
The mass value may be included in the feature descriptors input to the computer-implemented classifier (116), thereby assisting the computer-implemented classifier (116) in classification of the meat product (106).
The image recording station (114) may comprise multiple cameras arranged above the trays passing on the conveyor (102) and inclined along respective multiple axes. The image recording station (114) may record respective
multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor. Information about each of the respective multiple images, acquired from the respective cameras, may be included in the feature descriptors input to the computer-implemented classifier (116).
The slaughterhouse production line (101) may comprise an electronic device (120) with a user interface for interaction with a human operator, configured to be in communication with the manufacturing execution system (108). The computer-implemented classifier (116) may perform classification in accordance with a mapping function. A human operator at a visual inspection position (122) may perform visual inspection of the meat product (106) in the tray (104). The human operator, at the visual inspection position (122), may provide input to the electronic device (120) via the user interface. The input may be a selection of a product type in response to the visual inspection of the meat product (106) in the tray (104).
The computer-implemented classifier (116) may be part of a classification module (130). The classification module (130) may further comprise a selector (126) and an image processor (128).
The manufacturing execution system (108) may comprise a database (124). Alternatively, the manufacturing execution system (108) may be connected with or may be in communication with a database (124).
Fig. 2 shows a flowchart of a method (201) of classification and assisting a computer-implemented classifier at a slaughterhouse.
The method is related to the features of fig. 1.
The method may comprise acquiring (203) an image recorded by an image recording station. The method may comprise querying (204) a database of the manufacturing execution system for the current production recipes, each listing multiple meat product types. The method may comprise reading (205) a mass
value from a load cell. The method may comprise inputting (206) the image, current production recipe and mass value to a computer-implemented classifier. The method may comprise performing (207) classification. The computer-implemented classifier may perform classification in accordance with a mapping function.
The method may comprise recording (208) the classification at the manufacturing execution system together with an identifier. The identifier may be selected by a selector based on an output from the computer-implemented classifier. The output from the computer-implemented classifier may be an array of likelihood values with likelihood values associated with respective meat product types. The likelihood values may be associated with meat product types selected from one or more production recipes each listing multiple meat product types.
The method may comprise reading (209) the identifier at a visual inspection position. The method may comprise receiving (210) input from the human operator. The input from the human operator may be associated with a tray identifier associated with the tray at the visual inspection position. The method may comprise updating (211) the mapping function of the computer-implemented classifier, based on a determination that the operator input corresponds to selection of a product type different from the classification product type.
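The flowchart steps 203 through 208 can be sequenced as one function; the classifier and MES below are simple stand-ins for the modules of fig. 1, and every name and value is a hypothetical illustration:

```python
def classify_tray(tray_id, image_features, mass_value, recipes, classifier, mes_db):
    """Sequence steps 203-208: gather inputs, build feature descriptors,
    classify, and record the result against the tray's tracking identifier."""
    descriptors = {"image": image_features, "mass": mass_value, "recipes": recipes}
    likelihoods = classifier(descriptors)                 # {product type: likelihood}
    product_type = max(likelihoods, key=likelihoods.get)  # role of the selector (126)
    mes_db[tray_id] = product_type                        # step 208, recorded at the MES
    return product_type

# Stand-in classifier that always favours "loin" when it is in the current recipes.
def toy_classifier(descriptors):
    return {t: (0.9 if t == "loin" else 0.1) for t in descriptors["recipes"]}

mes_db = {}
result = classify_tray("tray-0042", [0.1, 0.2], 4.7, ["ham", "loin"], toy_classifier, mes_db)
```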
Claims (12)
1. A method, comprising:
at a slaughterhouse with a conveyor (102), transporting trays (104) with meat products (106), and a manufacturing execution system (108) in communication with one or more sensors (110) tracking the trays (104) with meat products (106) and engaged in accordance with one or more production recipes:
recording at least one image of a meat product (106) while the meat product (106), placed in a tray (104), passes an image recording station (114) at the conveyor (102);
performing classification of the meat product (106) by a computer-implemented classifier (116) receiving as its input feature descriptors comprising information about the at least one image and outputting an array of likelihood values with likelihood values associated with respective meat product types;
wherein the likelihood values are associated with meat product types selected from one or more production recipes each listing multiple meat product types;
acquiring, from the manufacturing execution system (108), identification of one or more current production recipes; and assisting the computer-implemented classifier (116) by including the identification of one or more current production recipes in the feature descriptors input to the computer-implemented classifier (116) and/or by restricting classification to product types comprised by the one or more current production recipes;
recording, with the manufacturing execution system (108), a classification product type associated with a tracking identifier associated with the tray (104) and determined by the classification.
2. A method according to claim 1, wherein the conveyor is configured with a load cell, the method comprising:
acquiring from the load cell a mass value representing the mass of the meat product and the tray; and including the mass value, acquired from the load cell, in the feature descriptors input to the computer-implemented classifier.
3. A method according to any of the above claims, wherein an electronic device with a user interface for interaction with a human operator is in communication with the manufacturing execution system; and wherein the computer-implemented classifier performs classification in accordance with a mapping function; the method comprising:
via the user interface, at a visual inspection position, receiving an operator input selecting a product type in response to the human operator performing visual inspection of the meat product in the tray;
associating the operator input with a tray identifier associated with the tray at a visual inspection position; and determining that the operator input corresponds to selection of a product type different from the classification product type; and in accordance therewith updating the mapping function of the computer-implemented classifier.
4. A method according to claim 3, wherein the user interface displays a graphical indicator representing a meat product type associated with a tray currently at the visual inspection position.
5. A method according to any of the above claims, wherein the computer-implemented classifier comprises one or more of: a neural network, such as a deep neural network, a convolutional neural network, a support vector machine, and a reinforcement learning algorithm.
6. A method according to any of the above claims, wherein the image recording station comprises multiple cameras arranged above the trays passing on the conveyor and inclined along respective multiple axes; comprising:
recording respective multiple images of a meat product while the meat product, placed in a tray, passes the image recording station at the conveyor;
including information about each of the respective multiple images, acquired from the respective cameras, in the feature descriptors input to the computer-implemented classifier.
7. A method according to any of the above claims, comprising:
generating a computer-readable three-dimensional model of the meat product in the tray and including information about the computer-readable three-dimensional model in the feature descriptors input to the computer-implemented classifier.
8. A method according to any of the above claims, wherein the at least one image of a product is one or more of: a colour image representing visible light, an image representing infrared light, an image representing ultra-violet light, an image representing Roentgen radiation absorption, and an image recorded while projecting structured light onto the meat product.
9. A method according to any of the above claims, wherein the image recording station comprises one or more cameras recording radiation at: one or more visible colours, infrared light, ultraviolet light, or Roentgen radiation.
10. A method according to any of the above claims, wherein a tray comprises an identification member encoded with an identifier for the tray and wherein the manufacturing execution system keeps track of the tray among other trays.
11. A method according to any of the above claims, wherein the manufacturing execution system registers and maintains unique sequence numbers for respective unique trays.
12. A method according to any of the above claims, comprising routing the tray on a conveyor system with multiple tracks and/or storage compartments in accordance with the classification product type associated with the tray.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DKPA201870776A DK179792B1 (en) | 2018-11-23 | 2018-11-23 | A method for classifying meat products in a slaughterhouse |
PCT/EP2019/082187 WO2020104636A1 (en) | 2018-11-23 | 2019-11-22 | A method at a slaughterhouse |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DKPA201870776A DK179792B1 (en) | 2018-11-23 | 2018-11-23 | A method for classifying meat products in a slaughterhouse |
Publications (2)
Publication Number | Publication Date |
---|---|
DK179792B1 (en) | 2019-06-20 |
DK201870776A1 (en) | 2019-06-20 |
Family
ID=69156041
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
DKPA201870776A DK179792B1 (en) | 2018-11-23 | 2018-11-23 | A method for classifying meat products in a slaughterhouse |
Country Status (2)
Country | Link |
---|---|
DK (1) | DK179792B1 (en) |
WO (1) | WO2020104636A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11803958B1 (en) | 2021-10-21 | 2023-10-31 | Triumph Foods Llc | Systems and methods for determining muscle fascicle fracturing |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2362710B (en) * | 2000-05-25 | 2002-07-17 | Intelligent Mfg Systems Ltd | Analysis of samples |
AU2018233961B2 (en) * | 2017-03-13 | 2023-12-21 | Frontmatec Smørum A/S | 3D imaging system and method of imaging carcasses |
- 2018
  - 2018-11-23 DK DKPA201870776A patent/DK179792B1/en not_active IP Right Cessation
- 2019
  - 2019-11-22 WO PCT/EP2019/082187 patent/WO2020104636A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
DK179792B1 (en) | 2019-06-20 |
WO2020104636A1 (en) | 2020-05-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
20190617 | PAT | Application published | |
20190620 | PME | Patent granted | |
20200320 | PPF | Opposition filed | Opponent name: DK:TEKNOLOGISK INSTITUT |
20210705 | PIU | Opposition: patent invalid | Opponent name: DK:TEKNOLOGISK INSTITUT |