DK180199B1 - Cellular meat production - Google Patents

Cellular meat production

Info

Publication number
DK180199B1
DK180199B1 (application DKPA201801006A)
Authority
DK
Denmark
Prior art keywords
processing
robot
workpiece
cell
robotic
Prior art date
Application number
DKPA201801006A
Other languages
Danish (da)
Inventor
Wu Haiyan
Grothe Henrik
Nielsen Jespersen Klaus
Philip Philipsen Mark
Original Assignee
Teknologisk Inst
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Teknologisk Inst filed Critical Teknologisk Inst
Priority to DKPA201801006A priority Critical patent/DK180199B1/en
Priority to PCT/EP2019/085048 priority patent/WO2020126890A1/en
Priority to EP19831624.2A priority patent/EP3897162A1/en
Publication of DK201801006A1 publication Critical patent/DK201801006A1/en
Application granted granted Critical
Publication of DK180199B1 publication Critical patent/DK180199B1/en

Classifications

    • A: HUMAN NECESSITIES
    • A22: BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B: SLAUGHTERING
    • A22B 5/00: Accessories for use during or after slaughtering
    • A22B 5/0064: Accessories for use during or after slaughtering for classifying or grading carcasses; for measuring back fat
    • A22B 5/007: Non-invasive scanning of carcasses, e.g. using image recognition, tomography, X-rays, ultrasound
    • A22C: PROCESSING MEAT, POULTRY, OR FISH
    • A22C 17/00: Other devices for processing meat or bones
    • A22C 17/0073: Other devices for processing meat or bones using visual recognition, X-rays, ultrasounds, or other contactless means to determine quality or size of portioned meat
    • A22C 17/0086: Calculating cutting patterns based on visual recognition
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

This invention relates to robotic carcass processing systems and methods for production in parallel cell stations, and the system and method of the invention solve several of the disadvantages associated with conventional meat production methods.

Description

DK 180199 B1 1
CELLULAR MEAT PRODUCTION
TECHNICAL FIELD This invention relates to robotic carcass processing systems and methods for production in parallel cell stations, and the system and method of the invention solve several of the disadvantages associated with conventional meat production methods.
BACKGROUND ART For many years, the meat industry has been industrialised in terms of organisation, work specialisation and the use of long production lines. Although robots and automation have gained ground, many slaughterhouse processes are still carried out partly or entirely by hand, i.a. due to the biological variation of the carcasses to be processed.
The production methods of conventional slaughterhouses are largely characterized by long production lines with specialized workstations operating serially, one after the other. While specialization and the use of few production lines generally increase efficiency, this set-up also has its drawbacks.
Losses following a breakdown somewhere along the production line are greater than for other forms of production. Sometimes carcasses can be damaged by previous operating processes, and the sensitivity to fluctuations in commodity supply increases, as production is concentrated on fewer process units. Machines are not always suited for processing a variety of sizes, and biological variations, contaminants, etc., significantly reduce the overall efficiency. Equipment errors, as well as cleaning and maintenance, often result in a complete stop of the entire production line, affecting all products on the line, and greatly influence the capacity.
Long production lines are also unsuited for producing small series, with frequent conversions/changes of product focus, and are unsuited for a product mix where multiple products are run simultaneously. Moreover, highly specialized production lines involve much internal transport of the goods, which negatively affects the overall efficiency/economy.
Finally, long production lines increase the risk of repetitive work at the individual workstations.
For many years, robots have represented the solution to much of the labour-intensive work that is undertaken in various industries.
Thus, WO 2006/085744 describes a robot cell and a method for changing and storing elements in a robot cell.
WO 2015/168511 describes a robotic carcass processing method and system.
However, the robotic carcass processing methods and systems described herein have not previously been disclosed.
SUMMARY OF THE INVENTION The present invention provides an alternative system and a related method of processing carcasses at abattoirs. The system and method of the invention differ from those conventionally used in abattoirs by taking place in several parallel cell stations rather than a few serial production lines.
Prominent features of the present invention are e.g.: rather than using specialised machinery, multi-function robots are introduced; rather than running a fixed production flow, a programmable and varied production flow is accomplished; and rather than undertaking a fixed schedule for cleaning and maintenance, need-adapted cleaning and maintenance schedules are introduced.
Another essential feature of the method of the invention is increased flexibility, allowing the handling of small product series, the handling of a varied range of products, and customised production, focusing on different customers' special needs. Production in robot cells according to the invention makes it possible to run several orders/productions in parallel, simply by configuration via software.
Moreover, the method of the invention facilitates platform-based development rather than single development projects, and the method of the invention also allows for better capacity adaptation. As the production units are programmable, there are fewer physical limitations, and the production plant can easily be adapted to fluctuations in the delivery of animals. While the cell may be equipped with a variety of tools, which provides access to a palette of operations, the arrangement, using programmable robots, also allows for use of the same tool for different operations, thus reducing the number of tools necessary. Losses as a result of stops are reduced, down-time as a result of cleaning and maintenance is reduced, and the plant may run 24/7.
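The idea of running several orders in parallel purely by software configuration can be illustrated with a small allocation sketch; the cell names, order descriptions and least-loaded policy below are illustrative assumptions, not details taken from the patent:

```python
# Hypothetical sketch: orders are assigned to parallel robot cells in
# software, instead of being fed through one fixed serial line.
from dataclasses import dataclass, field

@dataclass
class RobotCell:
    name: str
    queue: list = field(default_factory=list)  # pending orders

def allocate(order: str, cells: list) -> RobotCell:
    """Place an order on the currently least-loaded cell."""
    cell = min(cells, key=lambda c: len(c.queue))
    cell.queue.append(order)
    return cell

cells = [RobotCell("RC1"), RobotCell("RC2"), RobotCell("RC3")]
for order in ["loin trim", "shoulder debone", "belly cut", "loin trim"]:
    allocate(order, cells)

# Four orders spread over three cells: no cell holds more than two.
assert max(len(c.queue) for c in cells) <= 2
```

Reconfiguring production then amounts to changing the allocation policy or the cell list, with no physical rearrangement of a line.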
Composed of separate, delimited, self-contained units, each unit can be isolated for better cleaning. Because each cell does not need to be served by an operator, or because the operator can supervise the process from outside the closed cell, the process may involve the use of X-ray, NMR equipment, and similar hazardous processes that might otherwise be harmful to an operator.
Finally, the method of the invention converts repetitive work into process monitoring and management. In this respect, the method of the invention may include the use of virtual
reality (VR) or augmented reality (AR), as the operator, working from outside the cell, may monitor and correct the production process, e.g. by wearing a virtual reality headset, which presents a virtual environment with a digital twin of the real robot cell.
Therefore, in its first aspect, the invention provides a robotic carcass processing system as described below.
In another aspect, the invention provides a method for automatic processing of carcasses. Further aspects of the invention relate to the use of the robotic carcass processing system of the invention in the method of the invention for automatic processing of carcasses.
Other objects of the invention will be apparent to the person skilled in the art from reading the following detailed description and accompanying drawings.
Any combination of two or more of the embodiments described herein is considered within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS The present invention is further illustrated by reference to the accompanying drawings, in which: Fig. 1 shows an overview of an in-cell processing phase, taking place inside a robot cell (RCx) of the invention: starting material/intermediate products arrive via an in-cell (in-let) conveyor (5in); starting material/intermediate products optionally are placed on a work bench/processing table (4), and are being tracked and monitored by the in-cell sensing means (3), e.g. by the in-cell sensor (3A), and/or by the in-cell vision device (3B); the active robot (Rx) chooses a tool suited for the intended process from a toolbox (6x) and performs the intended operation; the processed product/end-product is placed on an internal out-let conveyor (5out) for transport to the outside of the robot cell (RCx); all processes are carried out in operation with the processing means (2), optionally in communication with a means for storing information (i.e. a server/database) (10), and optionally using machine learning methodology applied to data obtained by the sensing means (3); and the robot cell and the processes taking place herein may be monitored from outside the cell by an operator, optionally equipped with a VR/AR device; Fig. 2 shows an overview of the external system of the invention: starting material arrives via an in-let conveyor (7A), and the starting material is being tracked (localised/analysed) by the external sensing means (9);
starting material is being distributed to one of the Robot Cells (RC1…) via a conductor (8), in operation with the in-let conveyor (5in); starting material is processed in one of the robot cells (cf. Fig. 1), and processed starting material (intermediate products) may then be re-distributed to another Robot Cell (RC2…) via an out-let conveyor (5out), optionally guided by the in-cell sensing means (3), and/or the external sensing means (9), both in operation with the processing means (2); this process (re-distribution of intermediate products) may be repeated; the processed material is being transported to an out-let conveyor (7B); and all processes are carried out in operation with the processing means (2), optionally in communication with a means for storing information (a server/database) (10); and Fig. 3 shows a robot (Rx) for use according to the invention, mounted with a working tool (6), and onto which robot a vision device (3Bx), in communication with the processing means (2), is mounted on the tip of the robotic arm, within operating distance of the working tool (6), allowing monitoring of the process and/or guidance of the robot.
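The routing of workpieces between parallel cells outlined for Fig. 2 can be sketched as a simple conductor loop; the cell capabilities and operation names below are hypothetical, chosen only to show the re-distribution idea:

```python
# Minimal sketch of the conductor (8) idea: a workpiece is forwarded
# through one or more robot cells until every required operation is done,
# then it goes to the out-let conveyor.
def route(workpiece_ops, cell_capabilities):
    """Return the sequence of cells visited; each visit removes the
    operations that cell can perform from the pending list."""
    visits = []
    remaining = list(workpiece_ops)
    while remaining:
        # pick a cell able to perform the next pending operation
        cell = next(c for c, ops in cell_capabilities.items()
                    if remaining[0] in ops)
        visits.append(cell)
        remaining = [op for op in remaining
                     if op not in cell_capabilities[cell]]
    return visits

cells = {"RC1": {"cut"}, "RC2": {"trim", "debone"}}
assert route(["cut", "trim", "debone"], cells) == ["RC1", "RC2"]
```

A workpiece needing only operations one cell can do never leaves that cell; mixed jobs are re-distributed, mirroring the repeated re-distribution step in the figure description.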
DETAILED DISCLOSURE OF THE INVENTION The system of the invention In its first aspect, the invention provides a robotic carcass processing system, as described in more details in the claims.
The robotic carcass processing system of the invention takes its starting point in incoming starting material, which may be any carcasses, or parts thereof, conventionally processed in slaughterhouses. During the further processing, the starting material turns into i.a. processed products, meat items, intermediate products, and, eventually, end-products.
The system of the invention may also be characterised by comprising an in-let processing step, an out-let processing step, and, in between these steps, internal processing steps taking place in robot cells/production cells, which robot cells represent a closed or possibly sealed environment. Moreover, all processes are carried out in communication with, and guided or assisted by, a processing means.
Moreover, the entire production may be monitored, and possibly corrected, from outside the closed environment of the robot cell, by an operator.
The workstation
The system of the invention may be characterised by comprising the following (in-cell) elements:
one or more robotic workstations (RC1, RC2, … RCx), configured for operation in parallel, and independently of each other, each of which robotic workstations comprises:
- one or more industrial robots (R1, R2, … Rx), in operation with a processing means (2), each robot capable of holding, and configured for operating, a working tool (6);
- one or more in-let/out-let conveyors (5), in operation with a processing means (2), and capable of transporting a workpiece into the robotic cell (RC), and/or the processed material out of the robotic cell (RC);
- one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, e.g. as assessed by a processing means (2);
- one or more processing tables/workbenches/support stands (4), onto which a workpiece can be placed for optimal support during processing and conclusion of the intended task;
- an in-cell sensing means (3), configured for determining the location and/or characteristics of the workpiece, comprising: one or more in-cell sensors (3A); and/or one or more in-cell machine vision devices (3B); and
- one or more processing means (2), in collaboration with each other, and in operation with said industrial robots (R), said processing table/workbench/support stand (4), and said in-cell sensing means (3), and configured for processing digitalised data obtained by said in-cell sensing means (3), and for applying machine learning to said obtained digitalised data.
Each robotic workstation (RC) of the invention is configured for, and undertakes, one or more of the following tasks:
- receives the workpieces in question entering the cell via an in-let internal conveyor (5in);
- processes the workpiece in question, according to a pre-programmed schedule, according to real-time, online measurements, and/or, optionally, by applying machine learning methodology to digitalised data obtained by the sensing means (3);
- determines the character, structure, nature, size, orientation, location, quality, etc., of the meat item to be processed;
- makes quality assessments of the workpiece in question, providing feed-back to the robot in question, optionally accomplished by a self-learning algorithm;
- guides the motion of the industrial robot (R) during operation/processing;
- selects and picks a working tool (6) from a tool box, and mounts this tool on the industrial robot (Rx) in question, in accordance with the intended (programmed or self-taught) task;
- replaces the working tool (6) on the industrial robot (Rx) in question as needed, and in accordance with the intended task;
- performs self-cleaning according to a predetermined (programmed) schedule, or according to an actual measured value; and/or
- places the processed material on the out-let conveyor (5out), for transport to outside the cell.
In one embodiment, the system of the invention comprises two or more robotic workstations, optionally in inter-communication with one or more of the other robotic workstations.
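The need-adapted self-cleaning in the task list (clean on a fixed schedule, or when an actual measured value demands it) can be expressed as a small decision rule; the thresholds and the soiling measure below are illustrative assumptions:

```python
# Hypothetical decision rule combining a predetermined schedule with an
# actual measured soiling value, as the task list describes.
def needs_cleaning(hours_since_clean, soil_level,
                   max_hours=8.0, soil_limit=0.7):
    """Clean either when the scheduled interval has elapsed, or when the
    measured soil level (0..1, illustrative units) exceeds the limit."""
    return hours_since_clean >= max_hours or soil_level >= soil_limit

assert needs_cleaning(9.0, 0.1)       # schedule expired
assert needs_cleaning(2.0, 0.8)       # measured value triggers early clean
assert not needs_cleaning(2.0, 0.1)   # neither condition met
```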
In another embodiment, the robotic carcass processing system of the invention comprises the following additional (external) elements:
- one or more transporting means (5, 7), in operation with the processing means (2), for commodity supply, and configured for distributing the workpiece to one of the two or more robotic workstations (RC1, … RCx), or for redistributing processed material/intermediate products to another of the available robot cells (RC), or for transport of processed material/end-product to the out-let conveyor (7B);
- an in-cell sensing means (3), installed at, or within operating distance of, the product supply means (7), which sensing means is in operation with the processing means (2), and configured for determining the location/position and/or character of the starting material, which sensing means (3) comprises: one or more sensors (3A); and/or one or more machine vision devices (3B);
- one or more conductors (8), in operation with the processing means (2), configured for allocating each identified workpiece from the in-let conveyor (7A) to a robotic workstation (RC), or for allocating processed material, returning from one robotic workstation, to another robotic workstation for further processing, or for allocating the end-products to the out-let conveyor (7B); and
- one or more processing means (2), in operation with, and configured for processing digitalised data obtained by, said in-let conveyor (7A), said in-cell and/or external sensing means (3, 9), said conductors (8), said conveyors (5, 7), said robotic workstations (RC), said industrial robots (R), and said out-let conveyor (7B), for determining the localisation, the character, structure, nature, size, orientation, quality, etc., and for guiding the motion of the industrial robot during operation/processing.
The robotic workstation (RC)
The system of the invention comprises one or more robotic workstations, also termed robot cells (RC1, RC2, RC3, … RCx), in which workstations/cells the actual processing of the meat items takes place. The robot cells for use according to the
invention shall be configured for operation in parallel, and independently of each other, but may be inter-communicating with each other via the processing means (2).
The robotic carcass processing workstation (RC) of the invention may be characterised by comprising the following elements:
- one or more industrial robots (R1, … Rx), in operation with the processing means (2), each robot configured for holding, and capable of operating, a working tool (6);
- one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, as assessed by the processing means (2);
- one or more processing tables/workbenches/support stands (4), onto which the workpiece can be placed for optimal support during processing and conclusion of the intended task;
- an in-cell sensing means (3), configured for determining the location and/or character of the workpiece, comprising: one or more in-cell sensors (3A); and/or one or more in-cell machine vision devices (3B); and
one or more processing means (2), in operation with said industrial robots (R), and configured for processing digitalized data obtained by said sensing means (3).
In one embodiment, the robotic carcass processing workstation (RC) of the invention further comprises a means for storing information (a server/database) (10).
The robot cells of the invention may be regarded as closed or possibly sealed environments, suited for the implementation of clean-room techniques, and they should be established using cleaning-friendly materials.
Each robot cell receives material to be processed via an in-let conveyor (5in), and delivers processed material via an out-let conveyor (5out). In one embodiment, supply and delivery of the goods are implemented using one and the same conveyor (5in/out), simply by reversing the transporting direction of the conveyor.
Each robotic workstation of the invention comprises one or more industrial robots (R1, R2, R3, … Rx), one or more working tools (6), one or more processing tables/workbenches/support stands (4), and an in-cell sensing means (3). The robot(s) (R), and the in-cell sensing means (3), shall be in communication with, and receive operational guidance from, the processing means (2).
In another embodiment, the robot cell and the processes taking place herein may be monitored from outside the cell by an operator (13), optionally by use of projection mapping, or equipped with virtual reality (VR) or augmented reality (AR) equipment.
The workstations/cells (RC) for use according to the invention shall be configured for, and able to perform, one or more of the following tasks:
- receipt of the workpiece in question entering the cell via an in-let internal conveyor (5in);
- processing of the workpiece in question, according to a pre-programmed schedule, according to real-time, online measurements, and/or according to a self-learning algorithm (reinforced learning/machine learning);
- selecting a working tool (6) for mounting on the industrial robot (Rx) in question, in accordance with the intended (programmed) task;
- replacement of the working tool (6) on the industrial robot (Rx) in question as needed, and in accordance with the intended task;
- quality assessment of the workpiece in question, and providing feed-back to the robot in question;
- performing self-cleaning according to a predetermined (programmed) schedule, or according to an actual measured value; and/or
- placing the processed material on an internal out-let conveyor (5out), for transport to the out-let conveyor (7B).
The industrial robot (R)
The system of the invention comprises the use of one or more robots (R1, R2, R3, … Rx), including the use of single robots, and the use of multiple robots working together.
The robot for use according to the invention may be any available automated industrial robot, capable of being programmed, and capable of moving on two or more axes. The robot shall be in communication with, and receive operational guidance from, the processing means (2).
The industrial robot shall be configured for holding, and capable of operating a working tool (6).
In another embodiment, the robot also shall be able to choose and change tools (6), e.g. by choosing from a toolbox.
In a further embodiment, the robot (Rx) for use according to the invention is mounted with a working tool (6), and a vision device (3Bx), in communication with the processing means (2), is mounted on the tip of the robotic arm, within operating distance of the working tool (6), allowing monitoring of the process and/or guidance of the robot.
The robot for use according to the invention may be any commercially available industrial robot. Industrial robots may be classified based on their coordinate systems, i.e. based on reachable coordinates of a point on the end-effector, and include Cartesian robots (when arms of a robot move in the XYZ rectangular coordinate system), Cylindrical robots (when arms of a robot move in one angular and two linear directions), Spherical robots (the arms move in two angular and one linear direction), SCARA robots (Selective Compliance Arm for Robotic Assembly; have two parallel revolute joints
providing compliance in a selected plane), and Articulated robots (also known as anthropomorphic robots; the robot arm has 3 revolute joints).
The working tools (6)
For performing the intended tasks, the system of the invention also comprises one or more working tools (6), which can be attached to the industrial robot (Rx) and are suited for the intended task.
The working tools (6) may be available from a tool box, from which one or more robots can pick and choose the desired tool, according to the intended task.
The working tool (6) may be introduced by use of one or more robots (R). One tool (6) may be mounted on one robot, and two or more robots may work together to solve the intended task.
In another embodiment, a multi-tool is mounted on a robot (R), configured for operating the multi-tool, and able to switch between the individual tools, as need be.
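Need-based selection of a tool from the toolbox, including reuse of one tool for several operations, can be sketched as a simple lookup; the tool names and operation sets below are illustrative assumptions, not a list from the patent:

```python
# Hypothetical toolbox mapping: one tool may serve several operations,
# which is what reduces the number of tools the cell must hold.
TOOLBOX = {
    "rotary knife": {"cut", "trim"},
    "band saw": {"split"},
    "brush": {"clean"},
}

def pick_tool(operation):
    """Return the first toolbox tool capable of the requested operation."""
    for tool, ops in TOOLBOX.items():
        if operation in ops:
            return tool
    raise ValueError(f"no tool for {operation!r}")

# The same rotary knife is chosen for two different operations:
assert pick_tool("cut") == pick_tool("trim") == "rotary knife"
```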
Examples of working tools frequently used in abattoirs are knives, e.g. fixed knives, rotary knives, oscillating knives, and wizard knives; saws, e.g. round saws, band saws and chain saws. Tools for use according to the invention also may include scissors, e.g. scissors having multiple blades, including blades of different sizes.
The system of the invention may also comprise tools for cleaning of the robot cell and its interior, including tools for cleaning other tools. Cleaning tools include e.g. brushes, brooms, and pressure devices.
The processing means (2)
For receiving input from, and for guiding, the relevant devices used according to the invention, the system of the invention comprises one or more processing means (2). If two or more processing means are employed, these processors may be in inter-communication with one or more of the other processing means.
For performing the necessary processing actions, the processing means for use according to the invention shall be in communication with, and/or configured for processing digitalized data obtained by, one or more of the following devices:
- the robot(s) (R);
- the in-cell sensing means (3), i.e. the in-cell sensor (3A) and/or the in-cell machine vision device (3B);
- the processing table/workbench/support stand (4);
- the transporting means (5, 7);
- the external sensing means (9), i.e. the external sensor (9A) and/or the external machine vision device (9B);
- the conductors (8); and
- the robotic workstations (RC).
The processor(s) for use according to the invention shall also be configured for making assessments and determinations, and be able to perform one or more of the following tasks:
- determine the position/location, the character, structure, nature, size, orientation, quality, etc. of a workpiece;
- guide the motion of the industrial robot (R) during operation/processing;
- calculate the speed of the transporting means/conveyor (5, 7);
- calculate the optimal cutting/trimming pattern in view of the specifications for the processed material;
- recognize and track the workpiece while being transported from arrival (5in, 7A) to the processing table/workbench/support stand (4); and
- recognize and track processed materials while being transported out of the cell (5out) to another workstation (RC), or to be delivered as processed material via the out-let conveyor (7B).
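One of the listed tasks, calculating a cutting/trimming pattern against a specification, can be illustrated by a deliberately simplified portioning rule; the target-weight approach below is a hypothetical example, not the patent's actual method:

```python
# Hypothetical cutting-pattern calculation: portion a piece of known total
# weight into equal cuts as close as possible to a target portion weight.
def portion_weights(total, target):
    """Return a list of portion weights summing to the total."""
    n = max(1, round(total / target))  # number of portions nearest target
    return [total / n] * n

cuts = portion_weights(2.5, 0.8)  # 2.5 kg into roughly 0.8 kg portions
assert len(cuts) == 3
assert abs(sum(cuts) - 2.5) < 1e-9
```

A real processing means would of course weigh in the measured shape, quality grading and customer specification; this only shows where such a calculation sits in the control flow.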
In one embodiment, the processing means (2) for use according to the invention shall be programmed for performing i.a. machine vision and/or machine learning (more details below).
The carcass processing system of the invention may also be in operation with an external Enterprise Resource Planning (ERP) system. ERP represents a software application that manages functional areas across the business; ERP integrates company functions such as order processing, sales, procurement, inventory management and financial systems. In another embodiment, the processing means (2) for use according to the invention shall be configured for running a local ERP application, or for communicating with an external ERP application.
The sensing means (3, 9)
For tracking incoming products, as well as processed products, the system needs to know the position/location, and possibly also the characteristics, of each item. This may be accomplished by use of sensors and/or machine vision devices.
Tracking of the products takes place inside the robotic cell, but may also take place outside the robotic cell, when dealing with starting materials and/or processed/end-products.
Sensing means for use according to the invention comprise sensors (3A, 9A) and/or machine vision devices (3B, 9B).
As defined herein, a sensor is a device, module, or subsystem whose purpose is to detect events, or changes in its environment, and send the information to other electronics, and in particular the processing device (2) used according to the invention.
The sensor (3A) for use according to the invention may be a mechanical (electro-mechanical) sensor or an optically based sensor, e.g. a force/torque sensor, a light beam or electro-magnetic sensor.
Like sensors, machine vision devices can determine positions or locations of specific subjects, but machine vision devices can also determine e.g. the character or the structure of a given object.
Machine vision
Machine vision represents a combination of hardware and software capable of providing operational guidance to other devices based on the capture and processing of images. Such systems usually rely on digital sensors protected inside industrial cameras, with specialised optics to acquire images, so that computer hardware and software can process, analyse and measure various characteristics for decision making.
A machine vision system typically comprises lighting, a camera with a lens, an image sensor, a vision processing means, and communication means. The lens captures the image and presents it to the sensor in the form of light. The sensor in a machine vision camera converts this light into a digital image, which is then sent to the processor for analysis. Lighting illuminates the part to be inspected, and allows its features to stand out so they can be clearly seen by the camera.
Processing may be accomplished by conventional processors, including central processing units (CPU) and/or graphics processing units (GPU), e.g. in a PC-based system, or in an embedded vision system, and is performed by software and may consist of several steps. First, an image is acquired from the sensor. In some cases, pre-processing may be required to optimize the image and ensure that all the necessary features stand out. Next, the software locates the specific features, runs measurements, and compares these to the specification. Finally, a decision is made and the results are communicated.
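The acquire, pre-process, measure and decide steps described above can be mimicked on a toy grayscale image; the threshold value and area specification below are illustrative, and the nested-list "image" stands in for a real camera frame:

```python
# Toy version of the vision pipeline: threshold (pre-process), measure the
# bright-feature area, and compare the measurement to a specification.
def measure_bright_area(image, threshold=128):
    """Binarise the image, then count pixels belonging to the feature."""
    binary = [[1 if px >= threshold else 0 for px in row] for row in image]
    return sum(sum(row) for row in binary)

def inspect(image, min_area=3):
    """Decide pass/fail by comparing the measured area to the spec."""
    return "pass" if measure_bright_area(image) >= min_area else "fail"

img = [[0, 200, 210],
       [0, 190,   0],
       [0,   0,   0]]   # a 3x3 "frame" with one bright feature
assert measure_bright_area(img) == 3
assert inspect(img) == "pass"
```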
Machine vision systems essentially come in three main categories: 1D vision captures a digital signal one line at a time instead of looking at a whole picture at once, and is accomplished by use of a line-scan camera; 2D vision looks at the whole picture, and may be accomplished by use of an industrial camera; and 3D vision systems comprise multiple sensors, including one or more laser displacement sensors.
The machine vision device for use according to the invention also comprises the use of e.g. X-ray and/or NMR equipment.
Any category, or combination of categories, may be implemented in the processing system of the invention.
The machine vision device (3B, 9B) for use according to the invention shall be configured for locating the position, and/or for determining the characteristics, and/or for determining the quality of the workpiece to be processed, and shall be in communication with, and provide (digitalised) data to the processing means (2), which data, eventually, shall be used by a robot (R) for determining the particular motion of the industrial robot in respect of a given workpiece to be processed.
If the incoming workpiece is considered represented by a spatial distribution, the vision device for use according to the invention preferably comprises a combination of a 1D and a 3D scanning device (3B, 9B), to determine the height and calculate the angle of the food surface, thus allowing optimal guidance of the robotic arm to position the working tool (6) at the right angle.
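Deriving a tool angle from scanned surface points can be sketched as follows; estimating the local surface normal from three 3D points via a cross product is a hypothetical simplification of what the 1D/3D scanner combination would deliver:

```python
# Sketch: tilt of the local food surface relative to vertical, from three
# scanned points (x, y, z). A robot controller could use this angle to
# orient the working tool perpendicular to the surface.
import math

def surface_tilt_deg(p0, p1, p2):
    """Angle between the surface normal and the z-axis, in degrees."""
    u = [p1[i] - p0[i] for i in range(3)]   # first edge vector
    v = [p2[i] - p0[i] for i in range(3)]   # second edge vector
    n = [u[1]*v[2] - u[2]*v[1],             # normal = u x v
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0]]
    norm = math.sqrt(sum(c*c for c in n))
    return math.degrees(math.acos(abs(n[2]) / norm))

# A flat horizontal patch has zero tilt:
assert abs(surface_tilt_deg((0, 0, 0), (1, 0, 0), (0, 1, 0))) < 1e-9
```

A production system would fit a plane to many range-camera points rather than three, but the geometry is the same.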
The machine vision hardware components for use according to the invention, such as sensors and processors, are commercially available, and machine vision systems can be assembled from single components, or purchased as an integrated system, with all components in a single device.
The 3D scanning devices (3B, 9B) for use according to the invention may be any commercially available 3D scanner or range camera, such as a time-of-flight camera, structured light camera, or stereo camera, e.g. a SICK 3D Ruler.
The transporting means (5, 7)
For transporting the meat items from in-put to out-put, and for inter-cell transport, the carcass processing system of the invention must comprise one or more transporting means (5, 7), and these transporting means shall be in communication with, and receive guidance from, the processing means (2).
Transporting according to the invention is for commodity supply and for distributing workpieces to one of the robotic workstations (RC1, ... RCx), and/or for redistributing processed material/intermediate products to another of the available robot cells (RC), and/or for transporting processed material to the out-let conveyor (7B).
Transport may take place by use of a conveyor belt, a conveyor/overhead rail, a lift, or an automated guided vehicle (AGV).
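The distribution logic described above — routing each workpiece to one of the parallel robot cells — can be sketched as a simple load-balancing dispatcher. This is an illustrative sketch only; the cell capabilities, the least-loaded policy, and all names are assumptions, not details from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class RobotCell:
    """A robotic workstation (RC) with a set of product types it can process."""
    name: str
    capabilities: Set[str]
    queue: List[str] = field(default_factory=list)  # workpieces awaiting processing

def allocate(workpiece_id: str, product_type: str, cells: List[RobotCell]) -> RobotCell:
    """Route a workpiece to the least-loaded capable cell (conductor-style sketch)."""
    capable = [c for c in cells if product_type in c.capabilities]
    if not capable:
        raise ValueError(f"no cell can process {product_type!r}")
    target = min(capable, key=lambda c: len(c.queue))
    target.queue.append(workpiece_id)
    return target
```

The least-loaded policy is one plausible choice; a real conductor (8) would be guided by the processing means (2) and could equally route on product ID, quality grade, or tool availability.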
The processing table/workbench/support stand (4)
For optimal support during processing, the system of the invention includes one or more processing tables/workbenches/support stands (4), onto which the workpiece can be placed.
In contrast to conventional abattoir processes, which are performed while the carcasses are hanging on an overhead rail, processing on tables/workbenches/support stands allows for maximal support of the meat item during processing.
The use of workbenches may involve a transition from a hanging position to a lay-down position, which transition may be accomplished by use of a lay-down mechanism.

The external conductor (8)
The system of the invention also comprises one or more conductors (8). As defined herein, a conductor represents equipment that can redistribute the incoming meat items to other conveyors. For fulfilling this task, the conductor (8) shall be in communication with, and receive operational guidance from, the processing means (2). In the system of the invention, the conductor (8) is configured for allocating each identified workpiece from the in-let conveyor (7A) to a robotic workstation (RC), or for allocating processed material/intermediate products (12), returning from one robotic workstation, to another robotic workstation for further processing, or for allocating the end-products to the out-let conveyor (7B).

The operator (13)
Moreover, the entire production may be monitored and possibly corrected from outside the closed environment of the robot cell (RC) by an operator (13).
Monitoring of the carcass processing may be accomplished by use of projection mapping, or by use of virtual reality (VR) or augmented reality (AR) equipment.

Machine learning
Machine learning is an application of artificial intelligence (AI) which uses statistical techniques to perform automated decision-making, and optionally improve performance on specific tasks based on experience, without being explicitly programmed.
In supervised learning, the computer is presented with example inputs and their desired outputs, given by the supervisor, and the goal is to learn a general rule that maps inputs to outputs. Using supervised learning, large sets of reference data, covering different product types, are created and stored on a means for storing information (a server/database) (10).
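The supervised-learning idea above — labelled reference measurements accumulating in a database, against which new workpieces are classified — can be sketched with a minimal nearest-neighbour classifier. The feature vectors (e.g. length, weight) and class labels are hypothetical; the patent does not prescribe a particular model.

```python
import numpy as np

class ReferenceClassifier:
    """Minimal 1-nearest-neighbour sketch of classification against a
    growing reference dataset (stand-in for the server/database (10))."""

    def __init__(self):
        self._features = []  # reference feature vectors
        self._labels = []    # supervisor-provided product types

    def add_reference(self, features, product_type):
        """Store one labelled reference measurement (supervised example)."""
        self._features.append(np.asarray(features, dtype=float))
        self._labels.append(product_type)

    def classify(self, features):
        """Return the label of the closest stored reference."""
        X = np.stack(self._features)
        d = np.linalg.norm(X - np.asarray(features, dtype=float), axis=1)
        return self._labels[int(np.argmin(d))]
```

As more labelled references are added, classification of new workpieces improves — mirroring the description's point that the system becomes more autonomous as data builds up.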
Data thus becomes a reference dataset, located on a means for storing information (a server/database) (10), and accessible for further processing by the processing means (2).
As data is building up, the system becomes more and more automatic and autonomous, but occasionally, e.g. caused by quality errors or recognition errors, manual assistance may be needed for additional training of the system.

After receipt of the digitalised data from the machine vision device (3B, 9B), information about the product type can be determined, and, e.g. by reference to a product catalogue also stored on the server (10), a product ID can be allocated to the workpiece/mixture of workpieces, and transmitted to the server (10) for further action/use.
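The product-catalogue lookup described above can be sketched as a simple mapping from a recognised product type to a product ID and a downstream destination. The catalogue entries, field names, and destinations are illustrative assumptions, not taken from the patent.

```python
# Hypothetical product catalogue (stand-in for the catalogue on the server (10)).
PRODUCT_CATALOGUE = {
    "ham":  {"product_id": "P-001", "destination": "packing_line_1"},
    "loin": {"product_id": "P-002", "destination": "packing_line_2"},
}

def allocate_product_id(product_type: str) -> dict:
    """Look up product ID and destination for a vision-recognised product type."""
    try:
        return PRODUCT_CATALOGUE[product_type]
    except KeyError:
        # Mirrors the description: unrecognised items need manual assistance
        # and additional training of the system.
        raise KeyError(f"unknown product type {product_type!r}; manual assistance needed")
```

The returned destination corresponds to the embodiment where the product ID specifies where the processed material ends up, e.g. a specific place in the packing room.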
In one embodiment, information about the product type or product ID is used for specifying the destination of the processed material, e.g. a specific place in the packing room.

The method of the invention
In another aspect, the invention provides a method for automatic processing of carcasses.
The method of the invention for automatic processing of carcasses may be characterised by comprising the subsequent steps of:
(a) analysis of the incoming workpiece for identification and decision on the further processing, using the external sensing means (9);
(b) distribution of the analysed workpiece to a robotic workstation (RC);
(c1) in the robotic workstation (RC), subjecting the incoming workpiece to in-cell analysis by use of the in-cell sensing means (3), and processing the workpiece according to a predetermined schedule, or according to an actual measured value, and, optionally, by use of a machine learning methodology applied to the data obtained by the in-cell sensing means (3);
(c2) optionally, redistribution of a processed workpiece, processed according to step c1, to another robotic workstation (RC) for further processing; and
(d) transporting the processed material to the out-let conveyor (7B) for final processing.
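The control flow of steps (a)–(d) can be sketched as a small orchestration function. The callables are hypothetical stand-ins for the external sensing means (9), the conductor (8), the robot cells (RC), and the out-let conveyor (7B); only the step sequence itself follows the method as described.

```python
def process_carcass(workpiece, external_analyse, dispatch, cell_process, needs_more, ship):
    """Orchestration sketch of method steps (a)-(d).

    external_analyse: step (a), analysis by the external sensing means (9)
    dispatch:         step (b)/(c2), allocation to a robotic workstation (RC)
    cell_process:     step (c1), in-cell analysis and processing
    needs_more:       decision whether another processing pass is required
    ship:             step (d), transport to the out-let conveyor (7B)
    """
    info = external_analyse(workpiece)      # (a) identify and decide
    cell = dispatch(workpiece, info)        # (b) distribute to a robot cell
    result = cell_process(cell, workpiece)  # (c1) in-cell analysis + processing
    while needs_more(result):               # (c2) optional redistribution
        cell = dispatch(result, info)
        result = cell_process(cell, result)
    return ship(result)                     # (d) out-let for final processing
```

A usage example with trivial stand-ins: two processing passes before shipping.

```python
result = process_carcass(
    "carcass-1",
    external_analyse=lambda w: {"type": "pork"},
    dispatch=lambda w, info: "RC1",
    cell_process=lambda cell, w: w + "|cut",
    needs_more=lambda r: r.count("|cut") < 2,
    ship=lambda r: ("7B", r),
)
```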
The analysis carried out according to step (a) is performed using an (external) sensing means (9), and in particular an (external) sensor (9A), and/or an (external) machine vision device (9B), in communication with the processing means (2). While a sensor (9A) may help locate and track the incoming workpiece, the vision device (9B) may help provide additional information about the incoming product.
By help of the processing means (2), location and identification of the product are determined, and a decision on the further processing is calculated according to a predetermined (programmed or self-taught) schedule, or according to actual measured values, i.e. information obtained using the sensor (9A) or the vision device (9B). Based on the information obtained in step (a), the incoming workpiece is distributed to a robotic workstation (RC) for processing.
This transport is accomplished using a conductor (8) in collaboration with an (in-let) conveyor (5in). The conductor (8) ensures that the product is directed onto the correct in-let conveyor, as determined by the processing means (2). Having arrived at the robot cell (RC), the actual processing of the product takes place.
First the workpiece should be placed in a supported position, i.e. on a processing table, workbench, or a support stand (4), onto which the workpiece can be placed for optimal support during processing and conclusion of the intended task.
Transition from the in-let conveyor (5in) to the processing table/stand (4) may optionally be accomplished using a "lay-down mechanism", frequently used in abattoirs.
Next, for determining the location and/or character of the incoming workpiece, it is analysed using the in-cell sensing means (3). In one embodiment, this analysis takes place using one or more in-cell sensors (3A), in communication with the processor (2). The sensor (3A) for use according to the invention may be a mechanical (electro-mechanical) sensor or an optically based sensor, e.g. a force/torque sensor, a light beam or an electro-magnetic sensor.
In another embodiment, analysis takes place using an in-cell machine vision device (3B), in communication with the processor (2). The machine vision device (3B) for use according to the invention shall be configured for locating the position, and/or for determining the characteristics, and/or for determining the quality of the workpiece to be processed, and shall be in communication with, and provide (digitalised) data to, the processing means (2), which data shall eventually be used by a robot (R) for determining the particular motion of the industrial robot in respect of a given workpiece to be processed.
Having examined the workpiece, a robot (R), guided by the processing means (2), and already mounted with a working tool (6), or after having picked a working tool from the toolbox (6), starts working on the workpiece, properly supported on a processing table, workbench, or support stand (4). If needed, the robot (R) may change working tool for completion of its task.
Processing may be accomplished by use of one robot (R) only, or using multiple robots working together.
Processing of the workpiece may be carried out according to a predetermined schedule, or according to an actual, measured value.
A scheduled programme may be stored on a means for storing information (server/database) (10), from which the processor (2) may get the necessary information.
In another embodiment, processing of the workpiece may be carried out according to a self-taught schedule, applying the machine learning techniques described above.
Having completed the intended processing of the workpiece, whether completed by use of a single robot (R), or by use of two or more robots, each equipped with one working tool (6), or one or more equipped with a multi-tool, it may be necessary to subject the workpiece to further processing, optionally to take place in another robot cell (RC), as determined and guided by the processing means (2), or, optionally, as determined and guided by the operator (13), who is optionally equipped with a virtual reality (VR) or augmented reality (AR) device.
In case the workpiece shall be further processed, it may be redistributed to another robotic workstation (RC) for further processing (i.e. step (c2)). This may be accomplished via an out-let conveyor (5out), optionally via the conductor (8), and an in-let conveyor leading to the other robot cell (RC). For practical reasons, the in-let conveyor may turn into the out-let conveyor, simply by reversing its transport direction.
After arrival at another robot cell (RC), the process of step (c1) may be repeated.
After finishing the workpiece, the processed material is transported out of the robot cell (RC), via the out-let conveyor (5out), via the conductor (8), and via the out-let conveyor (7B), for final processing. Final processing/finishing may include packaging, shipping, etc.
Any combination of two or more of the embodiments described herein is considered within the scope of the present invention.
List of reference signs In the figures, identical structures, elements or parts that appear in more than one figure are generally labelled with the same numeral in all the figures in which they appear.
2 Processing means/CPU
3 In-cell sensing means
3A In-cell sensor
3B In-cell machine vision device
4 Workbench/processing table/support stand
5 In-cell conveyors
5in In-let conveyor
5out Out-let conveyor
6 Tool box/Working tool
7 External conveyors
7A In-let conveyor
7B Out-let conveyor
8 Conductor
9 External sensing means
9A External sensor
9B External machine vision device
10 Means for storing information (server/database)
13 Operator
RC Robotic workstation (Robot Cell)
R Industrial robot

Claims (3)

1. A robotic carcass processing system, comprising: one or more robotic workstations (RC1, RC2, ... RCx), configured to operate in parallel and independently of each other, each workstation containing: one or more industrial robots (R1, R2, ... Rx), in communication with a processing means (2), each robot being capable of holding, and configured to operate, a working tool (6); one or more working tools (6) for mounting on said industrial robot (Rx) and suitable for the intended task as assessed by the processing means (2); one or more processing tables/workbenches/support stands (4), on which a workpiece can be placed for optimal support during processing and completion of the intended task; an in-cell sensing means (3), configured for determining the location and/or character of the workpiece, comprising: one or more in-cell sensors (3A); and/or one or more machine vision devices (3B); one or more processing means (2), in communication with said industrial robots (R), and configured for processing digitised data obtained by means of the sensing means (3), and optionally configured for applying machine learning methodologies to the digitised data obtained by said sensing means (3); which system is further characterised by comprising: one or more conveyors (5, 7), in communication with the processing means (2), for commodity supply and configured for distributing the workpiece to one of the two or more robotic workstations (RC1, RC2, RC3, ... RCx), or for redistributing processed material to another of the available robotic workstations (RC), or for transporting processed material to an out-let conveyor (7B); an external sensing means (9), installed at or within operating distance of an in-let conveyor (7A), which sensing means is in communication with the processing means (2), configured for determining the location/position and/or the character of the starting material, which sensing means (9) comprises: one or more sensors (9A); and/or one or more machine vision devices (9B); one or more conductors (8), in communication with the processing means (2), configured for allocating each identified workpiece from the in-let conveyor (7A) to a robotic workstation (RC), or for allocating processed material, returning from one robotic workstation, to another robotic workstation for further processing, or for allocating end-products to the out-let conveyor (7B); one or more processing means (2), in communication with, and configured for processing digitised data obtained by, said in-let conveyor (7A), said in-cell and/or external sensing means (3, 9), said conductor (8), said conveyors (5, 7), said robotic workstations (RC), said industrial robots (R), and said out-let conveyor (7B), for determining location, character, structure, kind, size, orientation, quality, etc., and for controlling the movement of the industrial robot during operation/processing.

2. The robotic carcass processing system according to claim 1, which system further comprises a means for storing information (a server/database) (10) in communication with the processing means (2).

3. A method for automatic processing of carcasses, which method comprises the subsequent steps of: (a) analysis of the incoming workpiece for identification and decision on the further processing, by use of the external sensing means (9); (b) distribution of the analysed workpiece to a robotic workstation (RC); (c1) in the robotic workstation (RC), subjecting the incoming workpiece to in-cell analysis by use of the in-cell sensing means (3), and processing the workpiece according to a predetermined schedule, or according to an actual measured value, and, optionally, by use of a machine learning methodology applied to the data obtained by the in-cell sensing means (3); (c2) optionally, redistribution of a processed workpiece, processed according to step c1, to another robotic workstation (RC) for further processing; and (d) transporting the processed workpiece to an out-let conveyor (7B) for final processing.

4. The robotic carcass processing system according to any one of claims 1-2, for use in the method for automatic processing of carcasses according to claim 3.
DKPA201801006A 2018-12-17 2018-12-17 Cellular meat production DK180199B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DKPA201801006A DK180199B1 (en) 2018-12-17 2018-12-17 Cellular meat production
PCT/EP2019/085048 WO2020126890A1 (en) 2018-12-17 2019-12-13 Cellular meat production
EP19831624.2A EP3897162A1 (en) 2018-12-17 2019-12-13 Cellular meat production

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DKPA201801006A DK180199B1 (en) 2018-12-17 2018-12-17 Cellular meat production

Publications (2)

Publication Number Publication Date
DK201801006A1 DK201801006A1 (en) 2020-07-31
DK180199B1 true DK180199B1 (en) 2020-08-13

Family

ID=69104360

Family Applications (1)

Application Number Title Priority Date Filing Date
DKPA201801006A DK180199B1 (en) 2018-12-17 2018-12-17 Cellular meat production

Country Status (3)

Country Link
EP (1) EP3897162A1 (en)
DK (1) DK180199B1 (en)
WO (1) WO2020126890A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DK181162B1 (en) * 2021-01-12 2023-03-17 Teknologisk Inst Robotic packing system and method for use at slaughterhouses
DK181011B1 (en) * 2021-07-09 2022-09-21 Teknologisk Inst Digital process monitoring
DK181203B1 (en) 2021-09-28 2023-04-27 Teknologisk Inst A robotic carcass processing system and method for use in a slaughterhouse

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1484979B1 (en) * 2002-03-18 2006-08-09 Scanvaegt International A/S Method and system for monitoring the processing of items
NL1027332C2 (en) 2004-10-25 2006-04-26 Meerpaal B V De Robot cell for exchanging and storing e.g. tools for CNC machines, uses processor and memory to select suitable location in magazine for objects to be transferred from or to
MX2011010360A (en) * 2009-04-03 2012-04-30 Robotic Technologies Ltd Carcass cutting methods and apparatus.
EP3136869A4 (en) 2014-05-01 2018-02-21 Jarvis Products Corporation Robotic carcass processing method and system
US9955702B1 (en) * 2016-10-28 2018-05-01 Jarvis Products Corporation Beef splitting method and system
US10117438B2 (en) * 2016-10-28 2018-11-06 Jarvis Products Corporation Beef splitting method and system
ES2897481T3 (en) * 2017-03-13 2022-03-01 Frontmatec Smoerum As 3D Imaging System and Imaging Method of Carcasses of Sacrificed Animals
EP3476555A1 (en) 2017-10-27 2019-05-01 Creaholic SA Production cell

Also Published As

Publication number Publication date
WO2020126890A1 (en) 2020-06-25
DK201801006A1 (en) 2020-07-31
EP3897162A1 (en) 2021-10-27

Similar Documents

Publication Publication Date Title
DK180199B1 (en) Cellular meat production
US10399778B1 (en) Identification and planning system and method for fulfillment of orders
US10060857B1 (en) Robotic feature mapping and motion control
US11504853B2 (en) Robotic system architecture and control processes
US11667030B2 (en) Machining station, workpiece holding system, and method of machining a workpiece
US20230278811A1 (en) Robotic system for processing packages arriving out of sequence
CN106672345B (en) A kind of method and system of industrial robot automatic sorting
US11701777B2 (en) Adaptive grasp planning for bin picking
EP3761245A1 (en) Robotic sortation system
JP2010519013A (en) Method and system for processing items
WO2020231319A1 (en) Robot cell setup system and process
CN110520259A (en) Control device, picking up system, logistics system, program and control method
CN104603704A (en) Ultra-flexible production manufacturing
Boschetti A picking strategy for circular conveyor tracking
DK180283B1 (en) Automatic suspension of meat items
CN110494258A (en) Control device, picking up system, logistics system, program, control method and production method
Coelho et al. Simulation Of An Order Picking System In A Manufacturing Supermarket Using Collaborative Robots.
WO2021185805A2 (en) A relocatable robotic system for production facilities
Rieder et al. Robot-human-learning for robotic picking processes
CN110914021A (en) Operating device with an operating device for carrying out at least one work step, and method and computer program
US11312584B2 (en) Method of palletizing non-uniform articles
EP3909715A1 (en) Pallet conveyance system, pallet conveyance method, and pallet conveyance program
Wu et al. Application of visual servoing for grasping and placing operation in slaughterhouse
EP4125015A1 (en) Management system for goods picking and packing
Ferreira et al. Smart system for calibration of automotive racks in Logistics 4.0 based on CAD environment

Legal Events

Date Code Title Description
PAT Application published

Effective date: 20200618

PME Patent granted

Effective date: 20200813