DK201801006A1 - Cellular meat production

Cellular meat production

Info

Publication number
DK201801006A1
Authority
DK
Denmark
Prior art keywords
processing
robotic
cell
robot
sensing means
Application number
DKPA201801006A
Inventor
Wu Haiyan
Grothe Henrik
Nielsen Jespersen Klaus
Philip Philipsen Mark
Original Assignee
Teknologisk Inst
Application filed by Teknologisk Inst filed Critical Teknologisk Inst
Priority to DKPA201801006A priority Critical patent/DK180199B1/en
Priority to PCT/EP2019/085048 priority patent/WO2020126890A1/en
Priority to EP19831624.2A priority patent/EP3897162A1/en
Publication of DK201801006A1 publication Critical patent/DK201801006A1/en
Application granted granted Critical
Publication of DK180199B1 publication Critical patent/DK180199B1/en

Classifications

    • A - HUMAN NECESSITIES
    • A22 - BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B - SLAUGHTERING
    • A22B 5/00 - Accessories for use during or after slaughtering
    • A22B 5/0064 - Accessories for use during or after slaughtering for classifying or grading carcasses; for measuring back fat
    • A22B 5/007 - Non-invasive scanning of carcasses, e.g. using image recognition, tomography, X-rays, ultrasound
    • A - HUMAN NECESSITIES
    • A22 - BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B - SLAUGHTERING
    • A22B 5/00 - Accessories for use during or after slaughtering
    • A - HUMAN NECESSITIES
    • A22 - BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22C - PROCESSING MEAT, POULTRY, OR FISH
    • A22C 17/00 - Other devices for processing meat or bones
    • A - HUMAN NECESSITIES
    • A22 - BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22C - PROCESSING MEAT, POULTRY, OR FISH
    • A22C 17/00 - Other devices for processing meat or bones
    • A22C 17/0073 - Other devices for processing meat or bones using visual recognition, X-rays, ultrasounds, or other contactless means to determine quality or size of portioned meat
    • A22C 17/0086 - Calculating cutting patterns based on visual recognition
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Food Science & Technology (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Manipulator (AREA)

Abstract

This invention relates to robotic carcass processing systems and methods for production in parallel cell stations, and the system and method of the invention solve several of the disadvantages associated with conventional meat production methods.

Description

DK 2018 01006 A1 1
CELLULAR MEAT PRODUCTION
TECHNICAL FIELD
This invention relates to robotic carcass processing systems and methods for production in parallel cell stations, and the system and method of the invention solve several of the disadvantages associated with conventional meat production methods.
BACKGROUND ART
For many years, the meat industry has been industrialised in terms of organisation, work specialisation and use of long production lines. Although robots and automation have gained ground, many slaughterhouse processes are still carried out partly or entirely by hand, i.a. due to the biological variation of the carcasses to be processed.
The production methods of conventional slaughterhouses are largely characterized by long production lines with specialized work stations operating serially, one after the other. While specialization and the use of few production lines generally increase efficiency, this set-up also has its drawbacks.
Losses following a breakdown somewhere along the production line are greater than for other forms of production. Sometimes carcasses can be damaged by previous operating processes, and the sensitivity to fluctuations in commodity supply increases, as production is concentrated on fewer process units. Machines are not always suited for processing a variety of sizes, and biological variations, contaminants, etc., significantly reduce the overall effectiveness. Equipment errors, as well as cleaning and maintenance, often result in a complete stop of the entire production line, affecting all products on the line, and greatly influence the capacity.
Long production lines are also unsuited for producing small series, with frequent conversions/changes of product focus, and are unsuited for a product mix where multiple products are run simultaneously. Moreover, highly specialized production lines involve much internal transport of the goods, which negatively affects the overall efficiency/economy.
Finally, long production lines increase the risk of repetitive work at the individual work stations.
For many years robots have represented the solution to much of the labour intensive work that is undertaken in various industries.
Thus WO 2006/085744 describes a robot cell and a method for changing and storing elements in a robot cell.
WO 2015/168511 describes a robotic carcass processing method and system.
However, robotic carcass processing methods and systems as described herein have never been disclosed.
SUMMARY OF THE INVENTION
The present invention provides an alternative system and a related method of processing carcasses at abattoirs. The system and method of the invention differ from those conventionally used in abattoirs by taking place in several parallel cell stations rather than a few serial production lines.
Prominent features of the present invention are e.g.: Rather than using specialised machinery, multi-function robots are introduced; Rather than running a fixed production flow, a programmable and varied production flow is accomplished; Rather than undertaking a fixed schedule for cleaning and maintenance, need-adapted cleaning and maintenance schedules are introduced.
Another essential feature of the method of the invention is increased flexibility, allowing the handling of small product series, handling of a varied range of products, and customised production, focusing on different customers' special needs. Production in robot cells according to the invention makes it possible to settle several orders/productions in parallel, just by configuration via software.
Moreover, the method of the invention facilitates platform-based development rather than single development projects, and the method of the invention also allows for a better capacity adaptation. As the production units are programmable, there are fewer physical limitations, and the production plant can easily be adapted to fluctuations in the delivery of animals. While the cell may be equipped with a variety of tools, which provides access to a palette of operations, the arrangement, using programmable robots, also allows for use of the same tool for different operations, thus reducing the number of tools necessary. Losses as a result of stops are reduced, down-time as a result of cleaning and maintenance is reduced, and the plant may run 24/7.
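The idea of settling several orders in parallel purely by software configuration can be sketched as a minimal allocator that assigns incoming orders to free robot cells. All names and interfaces below are illustrative assumptions, not part of the patent disclosure:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class RobotCell:
    """A programmable production cell (RCx); attributes are hypothetical."""
    cell_id: str
    busy: bool = False

def allocate_orders(orders: List[str], cells: List[RobotCell]) -> Dict[str, str]:
    """Assign each order to the next free cell; orders without a free cell wait.
    This is the software-only reconfiguration the text describes, in toy form."""
    assignment: Dict[str, str] = {}
    free = [c for c in cells if not c.busy]
    for order, cell in zip(orders, free):
        cell.busy = True
        assignment[order] = cell.cell_id
    return assignment
```

With two free cells and three orders, two orders are dispatched and the third simply waits for the next free cell, mirroring the capacity adaptation described above.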
Composed of separate, delimited, self-contained units, each unit can be isolated for better cleaning. Because each cell does not need to be served by an operator, or the operator can supervise the process from outside the closed cell, the process may involve the use of X-ray, NMR equipment, and similar hazardous processes that may otherwise be harmful to an operator.
Finally, the method of the invention converts repetitive work into process monitoring and management. In this respect, the method of the invention may include the use of virtual reality (VR) or augmented reality (AR), as the operator, working from outside the cell, may monitor and correct the production process e.g. by wearing a virtual reality headset, which presents a virtual environment with a digital twin of the real robot cell.
Therefore, in its first aspect, the invention provides a robotic carcass processing system as described below.
In another aspect, the invention provides a method for automatic processing of carcasses. Further aspects of the invention relate to use of the robotic carcass processing system of the invention in the method of the invention for automatic processing of carcasses.
Other objects of the invention will be apparent to the person skilled in the art from reading the following detailed description and accompanying drawings.
Any combination of two or more of the embodiments described herein is considered within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is further illustrated by reference to the accompanying drawings, in which:
Fig. 1 shows an overview of an in-cell processing phase, taking place inside a robot cell (RCx) of the invention: starting material/intermediate products arrive via an in-cell (in-let) conveyor (5in); starting material/intermediate products optionally are placed on a work bench/processing table (4), and are being tracked and monitored by the in-cell sensing means (3), e.g. by the in-cell sensor (3A), and/or by the in-cell vision device (3B); the active robot (Rx) chooses a tool suited for the intended process from a toolbox (6x) and performs the intended operation; the processed product/end-product is placed on an internal out-let conveyor (5out) for transport to the outside of the robot cell (RCx); all processes are carried out in operation with the processing means (2), optionally in communication with a means for storing information (i.e. a server/database) (10), and optionally using machine learning methodology applied to data obtained by the sensing means (3); and the robot cell and the processes taking place herein may be monitored from outside the cell by an operator, optionally equipped with a VR/AR device (14);
Fig. 2 shows an overview of the external system of the invention: starting material arrives via an in-let conveyor (7A), and the starting material is being tracked (localised/analysed) by the external sensing means (9);
starting material is being distributed to one of the robot cells (RC1, …) via a conductor (8), in operation with the in-let conveyor (5in); starting material is processed in one of the robot cells (cf. Fig. 1), and processed starting material (intermediate products) may then be re-distributed to another robot cell (RC2, …) via an out-let conveyor (5out), optionally guided by the in-cell sensing means (3), and/or the external sensing means (9), both in operation with the processing means (2); this process (re-distribution of intermediate products) may be repeated; the final product (12) is being transported to an out-let conveyor (7B); and all processes are carried out in operation with the processing means (2), optionally in communication with a means for storing information (a server/database) (10); and
Fig. 3 shows a robot (Rx) for use according to the invention, mounted with a working tool (6), and onto which robot a vision device (3Bx), in communication with the processing means (2), is mounted on the tip of the robotic arm, within operating distance of the working tool (6), allowing monitoring of the process and/or guidance of the robot.
DETAILED DISCLOSURE OF THE INVENTION
The system of the invention
In its first aspect, the invention provides a robotic carcass processing system.
The robotic carcass processing system of the invention takes its starting point in incoming starting material, which may be any carcasses, or parts thereof, conventionally processed in slaughterhouses. During the further processing, the starting material turns into i.a. processed products, meat items, intermediate products, and, eventually, end-products.
The system of the invention may also be characterised by comprising an in-let processing step, an out-let processing step, and, in between these steps, internal processing steps taking place in robot cells/production cells, which robot cells represent a closed or possibly sealed environment. Moreover, all processes are carried out in communication with, and guided or assisted by, a processing means.
Moreover, the entire production may be monitored, and possibly corrected from outside the closed environment of the robot cell, by an operator.
The work station
The system of the invention (1) may be characterised by comprising the following (in-cell) elements: one or more robotic work stations (RC1, RC2, ... RCx), configured for operation in parallel, and independently of each other, each of which robotic work stations comprises:
one or more industrial robots (R1, R2, ... Rx), in operation with the processing means (2), each robot capable of holding, and configured for operating, a working tool (6); one or more in-let/out-let conveyors (5), in operation with the processing means (2), and capable of transporting the starting material (11) into the robotic cell (RC), and/or the end-product (12) out of the robotic cell (RC); one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, e.g. as assessed by the processing means (2); one or more processing tables/workbenches/support stands (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task; an in-cell sensing means (3), configured for determining the location and/or characteristics of the starting material (11), comprising: one or more in-cell sensors (3A); and/or one or more in-cell machine vision devices (3B); and one or more processing means (2), in collaboration with each other, and in operation with said industrial robots (R), said processing table/workbench/support stand (4), said in-cell sensing means (3), and configured for processing digitalized data obtained by said in-cell sensing means (3), and configured for applying machine learning to said obtained digitalised data.
Each robotic work station (RC) of the invention is configured for, and undertakes, one or more of the following tasks:
- receives the starting material (11) in question entering the cell via an in-let internal conveyor (5in);
- processes the starting material (11) in question, according to a pre-programmed schedule, according to real-time, online measurements, and/or, optionally, by applying machine learning methodology on digitalised data obtained by the sensing means (3);
- determines the character, structure, nature, size, orientation, location, quality, etc., of the meat item to be processed;
- makes quality assessment of the workpiece in question, and provides feedback to the robot in question, optionally accomplished by a self-learning algorithm;
- guides the motion of the industrial robot (R) during operation/processing;
- selects and picks a working tool (6) from a tool box, and mounts this tool on the industrial robot (Rx) in question, in accordance with the intended (programmed or self-taught) task;
- replaces the working tool (6) on the industrial robot (Rx) in question as needed, and in accordance with the intended task;
- performs self-cleaning according to a predetermined (programmed) schedule, or according to an actual measured value; and/or
- places the end-product (12) on the out-let conveyor (5out), for transport to outside the cell.
In one embodiment, the system of the invention comprises two or more robotic work stations, optionally in inter-communication with one or more of the other robotic work stations.
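The task list above can be sketched as a minimal in-cell control cycle: receive the item, mount a tool matching the intended task, process, and release to the out-let conveyor. All names and the dictionary-based interfaces are illustrative assumptions, not the patent's actual implementation:

```python
def run_cell_cycle(item: dict, toolbox: dict) -> list:
    """Return the ordered task log for one cycle of a robot cell (RCx).
    `item` carries attributes already determined by the sensing means (3);
    `toolbox` maps task names to tools (6). Hypothetical interfaces."""
    log = [f"received {item['name']} via in-let conveyor (5in)"]
    tool = toolbox.get(item["task"], "default knife")   # select and pick a tool
    log.append(f"mounted {tool}")
    mode = item.get("mode", "pre-programmed schedule")  # or real-time measurements
    log.append(f"processed {item['name']} according to {mode}")
    log.append("placed end-product on out-let conveyor (5out)")
    return log
```

A quality-assessment or self-cleaning step would slot into the same loop as further log entries, driven by measured values rather than a fixed schedule.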
In another embodiment, the robotic carcass processing system of the invention comprises the following additional (external) elements: one or more transporting means (5, 7), in operation with the processing means (2), for commodity supply and configured for distributing starting materials (11) to one of the two or more robotic work stations (RC1, … RCx), or for redistributing processed material/intermediate products to another of the available robot cells (RC), or for transport of processed material/end-product to the out-let conveyor (7B); an external sensing means (9), installed at, or within operating distance of, the product supply means (7), which sensing means is in operation with the processing means (2), configured for determining the location/position and/or character of the starting material, which sensing means (9) comprises: one or more sensors (9A); and/or one or more machine vision devices (9B); one or more conductors (8), in operation with the processing means (2), configured for allocating each identified starting material (11) from the in-let conveyor (7A) to a robotic work station (RC), or for allocating processed material/intermediate products (12), returning from one robotic work station, to another robotic work station for further processing, or for allocating the end-products to the out-let conveyor (7B); one or more processing means (2), in operation with, and configured for processing digitalized data obtained by, said in-let conveyor (7A), said in-cell and/or external sensing means (3, 9), said conductors (8), said conveyors (5, 7), said robotic work stations (RC), said industrial robots (R), and said out-let conveyor (7B), for determining the localisation, the character, structure, nature, size, orientation, quality, etc., and for guiding the motion of the industrial robot during operation/processing.
The robotic work station (RC)
The system of the invention comprises one or more robotic work stations, also termed robot cells (RC1, RC2, … RCx), in which work stations/cells the actual processing of the meat items takes place. The robot cells for use according to the invention shall be configured for operation in parallel, and independently of each other, but may be inter-communicating with each other via the processing means (2).
The robotic carcass processing work station (RC) of the invention may be characterised by comprising the following elements: one or more industrial robots (R1, ... Rx), in operation with the processing means (2), each robot configured for holding, and capable of operating, a working tool (6); one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, as assessed by the processing means (2); one or more processing tables/workbenches/support stands (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task; an in-cell sensing means (3), configured for determining the location and/or character of the starting material (11), comprising: one or more in-cell sensors (3A); and/or one or more in-cell machine vision devices (3B); and
one or more processing means (2), in operation with said industrial robots (R), and configured for processing digitalized data obtained by said sensing means (3).
In one embodiment, the robotic carcass processing work station (RC) of the invention further comprises a means for storing information (a server/database) (10).
The robot cell of the invention may be regarded as a closed or possibly sealed environment, suited for implementation of clean-room techniques, and it should be established using cleaning-friendly materials.
Each robot cell receives material to be processed via an in-let conveyor (5in), and delivers processed material via an out-let conveyor (5out). In one embodiment, supply and delivery of the goods is implemented using one and the same conveyor (5in/out), just by reversing the transporting direction of the conveyor.
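The single reversible conveyor (5in/out) of that embodiment can be modelled as a small state machine; the class and attribute names below are illustrative assumptions:

```python
class Conveyor:
    """Toy model of the shared in-cell conveyor (5in/out): the same belt
    supplies goods into the cell and delivers them out, by reversing its
    transport direction."""
    def __init__(self):
        self.direction = "in"   # "in": supply toward the cell; "out": delivery

    def reverse(self) -> str:
        """Flip the transport direction and return the new direction."""
        self.direction = "out" if self.direction == "in" else "in"
        return self.direction
```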
Each robotic work station of the invention comprises one or more industrial robots (R1, R2, R3, ... Rx), one or more working tools (6), one or more processing tables/workbenches/support stands (4), and an in-cell sensing means (3); and the robot(s) (R) and the in-cell sensing means (3) shall be in communication with, and receive operational guidance from, the processing means (2).
In another embodiment, the robot cell and the processes taking place herein may be monitored from outside the cell by an operator (13), optionally by use of projection mapping, or equipped with virtual reality (VR) or augmented reality (AR) equipment (14).
The work stations/cells (RC) for use according to the invention shall be configured for, and able to perform, one or more of the following tasks:
- receipt of the starting material (11) in question entering the cell via an in-let internal conveyor (5in);
- processing of the starting material (11) in question, according to a pre-programmed schedule, according to real-time, online measurements, and/or according to a self-learning algorithm (reinforced learning/machine learning);
- selecting a working tool (6) for mounting on the industrial robot (Rx) in question, in accordance with the intended (programmed) task;
- replacement of the working tool (6) on the industrial robot (Rx) in question as needed, and in accordance with the intended task;
- quality assessment of the workpiece in question, and providing feedback to the robot in question;
- performing self-cleaning according to a predetermined (programmed) schedule, or according to an actual measured value; and/or
- placing the end-product (12) on an internal out-let conveyor (5out), for transport to the out-let conveyor (7B).
The industrial robot (R)
The system of the invention comprises the use of one or more robots (R1, R2, R3, ... Rx), including the use of single robots, and the use of multiple robots working together.
The robot for use according to the invention may be any available automated industrial robot, capable of being programmed, and capable of moving on two or more axes. The robot shall be in communication with, and receive operational guidance from, the processing means (2).
The industrial robot shall be configured for holding, and capable of operating a working tool (6).
In another embodiment, the robot also shall be able to choose and change tools (6), e.g. by choosing from a toolbox.
In a further embodiment, the robot (Rx) for use according to the invention is mounted with a working tool (6), and a vision device (3Bx), in communication with the processing means (2), is mounted on the tip of the robotic arm, within operating distance of the working tool (6), allowing monitoring of the process and/or guidance of the robot.
The robot for use according to the invention may be any commercially available industrial robot. Industrial robots may be classified based on their coordinate systems, i.e. based on reachable coordinates of a point on the end-effector, and include Cartesian robots (when arms of a robot move in the XYZ rectangular coordinate system), Cylindrical robots (when arms of a robot move in one angular and two linear directions), Spherical robots (the arms move in two angular and one linear direction), SCARA robots (Selective Compliance Arm for Robotic Assembly; have two parallel revolute joints
providing compliance in a selected plane), and Articulated robots (also known as the anthropomorphic robot; the robot arm has 3 revolute joints).
The working tools (6)
For performing the intended tasks, the system of the invention also comprises one or more working tools (6), that can be attached to the industrial robot (Rx), and are suited for the intended task.
The working tools (6) may be available from a tool box, from which one or more robots can pick and choose the desired tool, according to the intended task.
The working tool (6) may be introduced by use of one or more robots (R). One tool (6) may be mounted on one robot, and two or more robots may work together to solve the intended task.
In another embodiment, a multi-tool is mounted on a robot (R), configured for operating the multi-tool, and able to switch between the individual tools, as need be.
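The multi-tool embodiment can be sketched as a holder with one robot mount and several switchable tools; the class and tool names below are illustrative assumptions, not from the patent:

```python
class MultiTool:
    """Toy model of a multi-tool mounted on a robot (R): several individual
    tools on one mount, with the robot switching between them as need be."""
    def __init__(self, tools):
        if not tools:
            raise ValueError("a multi-tool needs at least one tool")
        self.tools = list(tools)
        self.active = self.tools[0]

    def switch(self, name: str) -> str:
        """Make `name` the active tool; reject tools not fitted on the mount."""
        if name not in self.tools:
            raise ValueError(f"tool {name!r} not fitted on this multi-tool")
        self.active = name
        return self.active
```

Switching between fitted tools is then a software operation rather than a physical tool change, which is the point of the embodiment.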
Examples of working tools frequently used in abattoirs are knives, e.g. fixed knives, rotary knives, oscillating knives, and wizard knives; saws, e.g. round saws, band saws and chain saws. Tools for use according to the invention also may include scissors, e.g. scissors having multiple blades, including blades of different sizes.
The system of the invention may also comprise tools for cleaning of the robot cell and its interior, including tools for cleaning other tools. Cleaning tools include e.g. brushes, brooms, and pressure devices.
The processing means (2)
For receiving input from, and for guiding, the relevant devices used according to the invention, the system of the invention comprises one or more processing means (2). If two or more processing means are employed, these processors may be in inter-communication with one or more of the other processing means.
For performing the necessary processing actions, the processing means for use according to the invention shall be in communication with, and/or configured for processing digitalized data obtained by one or more of the following devices: the robot(s) (R); the in-cell sensing means (3), i.e. the in-cell sensor (3A) and/or the in-cell machine vision device (3B); the processing table/workbench/support stand (4); the transporting means (5, 7); the external sensing means (9), i.e. the external sensor (9A) and/or the external machine vision device (9B); the conductors (8); and
the robotic work stations (RC).
The processor(s) for use according to the invention shall also be configured for making assessments and determinations, and be able to perform one or more of the following tasks: determine the position/location, the character, structure, nature, size, orientation, quality, etc. of a workpiece (11); guide the motion of the industrial robot (R) during operation/processing; calculate the speed of the transporting means/conveyor (5, 7); calculate the optimal cutting/trimming pattern in view of the specifications for the final product/cut (12); recognize and track the starting material (11) while being transported from arrival (5in, 7A) to the processing table/workbench/support stand (4); recognize and track processed materials while being transported outside the cell (5out) to another work station (RC), or to be delivered as the final product (12) via the out-let conveyor (7B).
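One of the simpler calculations in that task list, the conveyor speed, can be illustrated with toy arithmetic; the function and parameter names are assumptions for illustration only:

```python
def conveyor_speed(items_per_min: float, item_pitch_m: float) -> float:
    """Belt speed (m/min) required so that `items_per_min` workpieces pass a
    point when they are spaced `item_pitch_m` metres apart on the conveyor.
    Tracking then predicts an item's position as start + speed * time."""
    return items_per_min * item_pitch_m

def predicted_position(start_m: float, speed_m_per_min: float, minutes: float) -> float:
    """Where the processing means would expect a tracked item to be."""
    return start_m + speed_m_per_min * minutes
```

For instance, 12 carcass parts per minute at a 0.5 m pitch requires a 6 m/min belt, and an item entering at the 0 m mark is expected 3 m along after half a minute.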
In one embodiment, the processing means (2) for use according to the invention shall be programmed for performing i.a. machine vision and/or machine learning (more details below).
The carcass processing system of the invention may also be in operation with an external Enterprise Resource Planning (ERP) system. ERP represents a software application that manages functional areas across the business; ERP integrates company functions such as order processing, sales, procurement, inventory management and financial systems. In another embodiment, the processing means (2) for use according to the invention shall be configured for running a local ERP application, or for communicating with an external ERP application.
The sensing means (3, 9)
For tracking incoming products, as well as processed products, the system needs to know the position/location, and possibly also the characteristics, of each item. This may be accomplished by use of sensors and/or machine vision devices. Tracking of the products takes place inside the robotic cell, but may also take place outside the robotic cell, when dealing with starting materials and/or processed/end-products.
Sensing means for use according to the invention comprises sensors (3A, 9A), and/or machine vision devices (3B, 9B).
As defined herein, a sensor is a device, module, or subsystem, whose purpose is to detect events, or changes in its environment, and send the information to other electronics, and in particular the processing device (2) used according to the invention.
The sensor (3A) for use according to the invention may be a mechanical (electro-mechanical) sensor or an optically based sensor, e.g. a force/torque sensor, a light beam or electro-magnetic sensor.
Like sensors, machine vision devices can determine positions or locations of specific subjects, but machine vision devices also can determine e.g. the character or the structure of a given object.
Machine vision
Machine vision represents a combination of hardware and software capable of providing operational guidance to other devices based on the capture and processing of images, and usually relies on digital sensors protected inside industrial cameras, with specialised optics to acquire images, so that computer hardware and software can process, analyse and measure various characteristics for decision making.
A machine vision system typically comprises lighting, a camera with a lens, an image sensor, a vision processing means, and communication means. The lens captures the image and presents it to the sensor in the form of light. The sensor in a machine vision camera converts this light into a digital image, which is then sent to the processor for analysis. Lighting illuminates the part to be inspected, and allows its features to stand out so they can be clearly seen by the camera.
Processing may be accomplished by conventional processors, including central processing units (CPU) and/or graphics processing units (GPU), e.g. in a PC-based system, or in an embedded vision system, and is performed by software and may consist of several steps. First, an image is acquired from the sensor. In some cases, pre-processing may be required to optimize the image and ensure that all the necessary features stand out. Next, the software locates the specific features, runs measurements, and compares these to the specification. Finally, a decision is made and the results are communicated.
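The acquire, pre-process, measure, compare, decide sequence just described can be shown on a toy greyscale image held as nested lists; thresholds, the specification, and all names are illustrative assumptions, not the patent's actual pipeline:

```python
def inspect(image, threshold=128, min_area=3):
    """Toy vision pipeline: `image` plays the acquired frame; thresholding is
    the pre-processing step; counting foreground pixels is the measurement;
    comparing the area to `min_area` (the specification) yields the decision."""
    binary = [[1 if px >= threshold else 0 for px in row] for row in image]
    area = sum(sum(row) for row in binary)
    return {"area": area, "pass": area >= min_area}
```

A real system would locate specific features (edges, contours, fat/lean boundaries) instead of a raw pixel count, but the step structure is the same.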
Machine vision systems essentially come in three main categories: 1D vision captures a digital signal one line at a time instead of looking at a whole picture at once, and is accomplished by use of a line-scan camera; 2D vision looks at the whole picture, and may be accomplished by use of an industrial camera; and 3D vision systems comprise multiple sensors, including one or more laser displacement sensors.
The machine vision device for use according to the invention also comprises the use of e.g. X-ray and/or NMR equipment.
Any category, or combination of categories, may be implemented in the processing system of the invention.
The machine vision device (3B, 9B) for use according to the invention shall be configured for locating the position, and/or for determining the characteristics, and/or for determining the quality of the workpiece (11) to be processed, and shall be in communication with, and provide (digitalised) data to the processing means (2), which data, eventually, shall be used by a robot (R) for determining the particular motion of the industrial robot in respect of a given workpiece to be processed.
If the incoming meat product (11) is considered represented by a spatial distribution, the vision device for use according to the invention preferably comprises a combination of a 1D and a 3D scanning device (3B, 9B), to determine the height and calculate the angle of the food surface, thus allowing optimal guidance of the robotic arm to position the working tool (6) at the right angle.
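One plausible way to turn a scanned height profile into a tool angle is a least-squares slope fit; this is a sketch under that assumption, not the method the patent prescribes, and all names are hypothetical:

```python
import math

def surface_angle_deg(heights, dx):
    """Estimate the inclination (degrees) of a food surface from a 1D height
    profile (metres) sampled every `dx` metres along the scan direction, as the
    arctangent of the least-squares slope. A guidance system could use this to
    tilt the working tool (6) perpendicular to the surface."""
    n = len(heights)
    xs = [i * dx for i in range(n)]
    mx = sum(xs) / n
    mh = sum(heights) / n
    slope = sum((x - mx) * (h - mh) for x, h in zip(xs, heights)) \
        / sum((x - mx) ** 2 for x in xs)
    return math.degrees(math.atan(slope))
```

A surface rising 1 cm over each 1 cm of travel is inclined at 45 degrees, so the robotic arm would be commanded to tilt the tool accordingly.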
The machine vision hardware components for use according to the invention, such as sensors and processors, are commercially available, and machine vision systems can be assembled from single components, or purchased as an integrated system, with all components in a single device.
The 3D scanning devices (3B, 9B) for use according to the invention may be any commercially available 3D scanner or range camera, such as a time-of-flight camera, structured light camera, or stereo camera, e.g. a Sick 3D ruler.
The transporting means (5, 7)

For transporting the meat items from in-put to out-put, and for inter-cell transport, the carcass processing system of the invention must comprise one or more transporting means (5, 7), and these transporting means shall be in communication with, and receive guidance from, the processing means (2).
Transporting according to the invention is for commodity supply and for distributing starting materials (11) to one of the robotic work stations (RC1, … RCx), and/or for redistributing processed material/intermediate products to another of the available robot cells (RC), and/or for transport of processed material/end-product (12) to the out-let conveyor (7B). Transport may take place by use of a conveyor belt, a conveyor/overhead rail, a lift, or an automated guided vehicle (AGV).
The processing table/workbench/support stand (4)

For optimal support during processing, the system of the invention includes one or more processing tables/workbenches/support stands (4), onto which the workpiece (11) can be placed.
In contrast to conventional abattoir processes, which are performed while the carcasses are hanging on an overhead rail, processing on tables/workbenches/support stands allows for maximal support of the meat item during processing.
The use of workbenches may involve a transition from a hanging position to a lay-down position, which transition may be accomplished by use of a lay-down mechanism.

The external conductor (8)

The system of the invention also comprises one or more conductors (8). As defined herein, a conductor represents equipment that can redistribute the incoming meat items to other conveyors. For fulfilling this task, the conductor (8) shall be in communication with, and receive operational guidance from, the processing means (2).
In the system of the invention, the conductor (8) is configured for allocating each identified starting material (11) from the in-let conveyor (7A) to a robotic work station (RC), or for allocating processed material/intermediate products (11), returning from one robotic work station, to another robotic work station for further processing, or for allocating the end-products to the out-let conveyor (7C).

The operator (13)

Moreover, the entire production may be monitored and possibly corrected from outside the closed environment of the robot cell (RC) by an operator (13).
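The conductor's allocation role described above — send an item to a capable robot cell, or to the out-let conveyor when processing is complete — can be sketched as a small routing function. The cell names, the "steps remaining" field, and the "wait" fallback are invented for the illustration.

```python
# Hypothetical sketch of the conductor (8) allocating identified items to a
# robot cell, or to the out-let conveyor (7C) once no processing remains.

def route(item, free_cells):
    """Return the destination for an identified item."""
    if not item["steps_remaining"]:              # fully processed end-product
        return "outlet_conveyor_7C"
    for cell in free_cells:                      # first free cell able to do
        if item["steps_remaining"][0] in cell["skills"]:  # the next step
            return cell["name"]
    return "wait"                                # no capable cell is free

cells = [{"name": "RC1", "skills": {"cut"}},
         {"name": "RC2", "skills": {"debone"}}]
dest = route({"steps_remaining": ["debone"]}, cells)
```

Here the item's next step is deboning, so it is routed to the (hypothetical) cell RC2.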
Monitoring of the carcass processing may be accomplished by use of projection mapping, or by use of virtual reality (VR) or augmented reality (AR) equipment (14).

Machine learning

Machine learning is an application of artificial intelligence (AI) which uses statistical techniques to perform automated decision-making, and optionally improve performance on specific tasks based on experience, without being explicitly programmed.
In supervised learning, the computer is presented with example inputs and their desired outputs, given by the supervisor, and the goal is to learn a general rule that maps inputs to outputs. Using supervised learning, large sets of reference data, covering different product types, are created and stored on a means for storing information (a server/database) (10).
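The supervised-learning idea above — a stored reference dataset of example inputs with their desired outputs, used to label new measurements — can be illustrated with a nearest-neighbour rule. The feature values, labels, and the choice of a 1-nearest-neighbour model are invented stand-ins for whatever model and reference data the system would actually use.

```python
# Hypothetical sketch: classify a new measurement against a stored reference
# dataset of (feature vector, label) pairs, as in supervised learning.

reference_db = [([12.0, 3.1], "loin"),
                ([25.0, 8.4], "shoulder"),
                ([40.0, 2.0], "belly")]

def classify(features):
    """Label a new measurement by its nearest stored reference example."""
    def sq_dist(ref):
        return sum((a - b) ** 2 for a, b in zip(features, ref[0]))
    return min(reference_db, key=sq_dist)[1]

label = classify([24.0, 8.0])
```

The new measurement lies closest to the stored "shoulder" example, so that label is returned.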
Data thus becomes a reference dataset, located on a means for storing information (a server/database) (10), and accessible for further processing by the processing means (2).
As data builds up, the system becomes more and more automatic and autonomous, but occasionally, e.g. caused by quality errors or recognition errors, manual assistance may be needed for additional training of the system.

After receipt of the digitalized data from the machine vision device (3B, 9B), information about the product type can be determined, and, e.g. by reference to a product catalogue also stored on the server (10), a product ID can be allocated to each meat product/mixture of meat products (11) and transmitted to the server (10) for further action/use.
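The catalogue lookup described above can be reduced to a simple mapping from recognised product type to product ID. The catalogue contents, ID format, and the fallback for unrecognised types are illustrative assumptions.

```python
# Hypothetical sketch: allocate a product ID from a catalogue stored on the
# server (10), keyed by the product type recognised from the vision data.

catalogue = {"pork_middle": "P-1001", "pork_leg": "P-1002"}

def allocate_product_id(product_type):
    """Look up the ID for a recognised type; flag unknown types for review."""
    return catalogue.get(product_type, "UNKNOWN-review")

pid = allocate_product_id("pork_leg")
```

Unrecognised types fall through to a review marker rather than receiving a fabricated ID, which is one way the manual-assistance loop mentioned above could be triggered.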
In one embodiment, information about the product type or product ID is used for specifying the destination of the meat product(s) (12), e.g. a specific place in the packing room.

The method of the invention

In another aspect, the invention provides a method for automatic processing of carcasses.
The method of the invention for automatic processing of carcasses may be characterised by comprising the subsequent steps of:
(a) analysis of incoming starting material (11) for identification and decision on the further processing, using the external sensing means (9);
(b) distribution of the analysed starting material (11) to a robotic work station (RC);
(c1) in the robotic work station (RC), subjecting the incoming starting material (11) to in-cell analysis by use of the in-cell sensing means (3), and processing the material (11) according to a predetermined schedule, or according to an actual measured value, and, optionally, by use of a machine learning methodology applied to the data obtained by the in-cell sensing means (3);
(c2) optionally, redistribution of a processed workpiece, processed according to step (c1), to another robotic work station (RC) for further processing; and
(d) transporting the end-product (12) to the out-let conveyor (7B) for final processing.
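The steps above can be sketched as a simple control flow, with the optional redistribution of step (c2) expressed as a loop. Every function here is a hypothetical stand-in for the hardware action named; the patent does not prescribe this structure.

```python
# Hypothetical orchestration of steps (a)-(d): analyse, distribute, process
# in-cell (repeating via redistribution as needed), then ship the end-product.

def process_carcass(item, analyse, distribute, cell_process, needs_more, ship):
    analyse(item)                      # (a) external analysis and decision
    cell = distribute(item)            # (b) distribute to a robot cell
    while True:
        cell_process(cell, item)       # (c1) in-cell analysis + processing
        if not needs_more(item):
            break
        cell = distribute(item)        # (c2) redistribute to another cell
    ship(item)                         # (d) to the out-let conveyor

log = []
process_carcass({"id": 1},
                analyse=lambda i: log.append("analyse"),
                distribute=lambda i: log.append("distribute") or "RC1",
                cell_process=lambda c, i: log.append("process"),
                needs_more=lambda i: False,
                ship=lambda i: log.append("ship"))
```

With a single processing pass (no redistribution), the actions run in the order (a), (b), (c1), (d).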
The analysis carried out according to step (a) is performed using an (external) sensing means (9), and in particular an (external) sensor (9A), and/or an (external) machine vision device (9B), in communication with the processing means (2). While a sensor (9A) may help locate and track the incoming meat item (11), the vision device (9B) may help provide additional information about the incoming product.
With the help of the processing means (2), the location and identification of the product are determined, and a decision on the further processing is calculated according to a predetermined (programmed or self-taught) schedule, or according to actual measured values, i.e. information obtained using the sensor (9A), or using the vision device (9B).
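The decision rule above — follow the predetermined schedule unless actual measured values dictate otherwise — can be sketched as a small override function. The weight measurement, the threshold, and the plan names are invented purely for illustration.

```python
# Hypothetical sketch: choose the processing plan from a predetermined
# (programmed or self-taught) schedule, unless an actual measured value
# is out of range and overrides it.

def choose_plan(schedule_plan, measured_weight_kg, heavy_limit_kg=90.0):
    """Fall back to a special plan when the measurement is out of range."""
    if measured_weight_kg > heavy_limit_kg:
        return "heavy_carcass_plan"
    return schedule_plan

plan = choose_plan("standard_plan", measured_weight_kg=82.5)
```

An in-range measurement leaves the scheduled plan in force; an out-of-range one swaps in the special plan.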
Based on the information obtained in step (a), the incoming meat product (11) is distributed to a robotic work station (RC) for processing. This transport is accomplished using a conductor (8) in collaboration with an (in-let) conveyor (5). The conductor (8) ensures that the product is directed onto the correct in-let conveyor, as determined by the processing means (2).
Having arrived at the robot cell (RC), the actual processing of the product takes place. First, the starting material (11) should be placed in a supported position, i.e. on a processing table, workbench, or a support stand (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task. Transition from the in-let conveyor (5in) to the processing table/stand (4) may optionally be accomplished using a "lay-down mechanism", frequently used in abattoirs.
Next, for determining the location and/or character of the incoming meat product (11), it is analysed using the in-cell sensing means (3).
In one embodiment, this analysis takes place using one or more in-cell sensors (3A), in communication with the processor (2).
The sensor (3A) for use according to the invention may be a mechanical (electro-mechanical) sensor or an optically based sensor, e.g. a force/torque sensor, a light beam or electro-magnetic sensor.
In another embodiment, analysis takes place using an in-cell machine vision device (3B), in communication with the processor (2).
The machine vision device (3B) for use according to the invention shall be configured for locating the position, and/or for determining the characteristics, and/or for determining the quality of the workpiece (11) to be processed, and shall be in communication with, and provide (digitalised) data to the processing means (2), which data, eventually, shall be used by a robot (R) for determining the particular motion of the industrial robot in respect of a given workpiece to be processed.
Having examined the starting product (11), a robot (R), guided by the processing means (2), and already mounted with a working tool (6), or after having picked a working tool from the toolbox (6), starts working on the meat item (11), properly supported on a processing table, workbench, or a support stand (4). If needed, the robot (R) may change working tool for completion of its task.
Processing may be accomplished by use of one robot (R) only, or using multiple robots working together.
Processing of the meat item (11) may be carried out according to a predetermined schedule, or according to an actual, measured value. A scheduled programme may be stored on a means for storing information (server/database) (10), from which the processor (2) may get the necessary information.
In another embodiment, processing of the meat item (11) may be carried out according to a self-taught schedule, applying the machine learning techniques described above.
Having completed the intended processing of the workpiece (11), whether completed by use of a single robot (R), or by use of two or more robots, each equipped with one working tool (6), or one or more equipped with a multi-tool, it may be necessary to subject the workpiece (11) to another processing, optionally to take place in another robot cell (RC), as determined and guided by the processing means (2), or, optionally, as determined and guided by the operator (13), who is optionally equipped with a virtual reality (VR) or augmented reality (AR) device (14).
In case the workpiece (11) shall be further processed, it may be redistributed to another robotic work station (RC) for further processing (i.e. step (c2)). This may be accomplished via an out-let conveyor (5out), optionally via the conductor (8), and an in-let conveyor leading to the other robot cell (RC). For practical reasons, the in-let conveyor may turn into the out-let conveyor, simply by reversing its transport direction.
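The remark above — that an in-let conveyor can serve as the out-let conveyor simply by reversing its transport direction — amounts to a two-state model of the conveyor's role. This tiny state sketch is an illustrative assumption, not the patent's control logic.

```python
# Hypothetical sketch: a conveyor whose in-let/out-let role is toggled by
# reversing its transport direction.

class ReversibleConveyor:
    def __init__(self):
        self.direction = "in"            # feeding the robot cell

    def reverse(self):
        """Toggle between the in-let and out-let roles."""
        self.direction = "out" if self.direction == "in" else "in"
        return self.direction

c = ReversibleConveyor()
role = c.reverse()
```

After one reversal the conveyor acts as an out-let; a second reversal restores the in-let role.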
After arrival at another robot cell (RC), the process of step (c1) may be repeated.
After finishing the workpiece (11), the end-product (12) is transported out of the robot cell (RC), via the out-let conveyor (5out), via the conductor (8), and via the out-let conveyor (7B), for final processing. Final processing/finishing may include packaging, shipping, etc.
Any combination of two or more of the embodiments described herein is considered within the scope of the present invention.
List of reference signs

In the figures, identical structures, elements or parts that appear in more than one figure are generally labelled with the same numeral in all the figures in which they appear.
1 A robotic carcass processing system
2 Processing means/CPU
3 In-cell sensing means
3A In-cell sensor
3B In-cell machine vision device
4 Workbench/processing table
5 In-cell conveyors
5in In-let conveyor
5out Out-let conveyor
6 Tool box/Working tool
7 External conveyors
7A In-let conveyor
7B Out-let conveyor
8 Conductor
9 External sensing means
9A External sensor
9B External machine vision device
10 Means for storing information (server/database)
11 Starting material/workpiece
12 End-product
13 Operator
14 Virtual reality (VR)/augmented reality (AR) equipment
RC Robotic work station (Robot Cell)
R Industrial robot

Claims (9)

CLAIMS
1. A robotic carcass processing system (1), which system comprises: one or more robotic work stations (RC1, RC2, ... RCx), configured for operation in parallel, and independently of each other, each of which robotic work stations comprise: one or more industrial robots (R1, R2, ... Rx), in operation with the processing means (2), each robot capable of holding, and configured for operating, a working tool (6); one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, as assessed by the processing means (2); one or more processing tables/workbenches/support stands (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task; an in-cell sensing means (3), configured for determining the location and/or character of the starting material (11), comprising: one or more in-cell sensors (3A); and/or one or more in-cell machine vision devices (3B); one or more processing means (2), in operation with said industrial robots (R), and configured for processing digitalized data obtained by said sensing means (3), and, optionally, configured for applying machine learning methodologies to the digitalized data obtained by said sensing means (3).
2. A robotic carcass processing system of claim 1, which system further comprises: one or more transporting means (5, 7), in operation with the processing means (2), for commodity supply and configured for distributing starting materials (11) to one of the two or more robotic work stations (RC1, RC2, RC3, … RCx), or for redistributing processed material/intermediate products to another of the available robot cells (RC), or for transport of processed material/end-product to the out-let conveyor (7C); an external sensing means (9), installed at, or within operating distance of, the product supply means (7A), which sensing means is in operation with the processing means (2), configured for determining the location/position and/or character of the starting material, which sensing means (9) comprises: one or more sensors (9A); and/or one or more machine vision devices (9B);
one or more conductors (8), in operation with the processing means (2), configured for allocating each identified starting material (11) from the in-let conveyor (7A) to a robotic work station (RC), or for allocating processed material/intermediate products (12), returning from one robotic work station, to another robotic work station for further processing, or for allocating the end-products to the out-let conveyor (7C); one or more processing means (2), in operation with, and configured for processing digitalized data obtained by, said in-let conveyor (7A), said in-cell and/or external sensing means (3, 9), said conductors (8), said conveyors (5, 7), said robotic work stations (RC), said industrial robots (R), said out-let conveyor (7C), for determining the localisation, the character, structure, nature, size, orientation, quality, etc., and for guiding the motion of the industrial robot during operation/processing.
3. The robotic carcass processing system of either one of claims 1-2, which system further comprises a means for storing information (a server/database) (10), in communication with the processing means (2).
4. A robotic carcass processing work station (RC), which work station comprises: one or more industrial robots (R1, … Rx), in operation with the processing means (2), each robot configured for holding, and capable of operating, a working tool (6); one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, as assessed by the processing means (2); one or more processing tables/workbenches/support stands (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task; an in-cell sensing means (3), configured for determining the location and/or character of the starting material (11), comprising: one or more in-cell sensors (3A); and/or one or more in-cell machine vision devices (3B); one or more processing means (2), in operation with said industrial robots (R), and configured for processing digitalized data obtained by said sensing means (3).
5. The robotic carcass processing work station (RC) of claim 4, which work station further comprises a means for storing information (a server/database) (10).
6. A method for automatic processing of carcasses, which method comprises the subsequent steps of:
(a) analysis of incoming starting material (11) for identification and decision on the further processing, using the external sensing means (9); (b) distribution of the analysed starting material (11) to a robotic work station (RC); (c1) in the robotic work station (RC), subjecting the incoming starting material (11) to in-cell analysis by use of the in-cell sensing means (3), and processing the material (11) according to a predetermined schedule, or according to an actual measured value, and, optionally, by use of a machine learning methodology applied to the data obtained by the in-cell sensing means (3); (c2) optionally, redistribution of a processed workpiece, processed according to step (c1), to another robotic work station (RC) for further processing; and (d) transporting the end-product (12) to the out-let conveyor (7B) for final processing.
7. The robotic carcass processing work station (RC) of either one of claims 4-5, for use in the robotic carcass processing system (1) of claims 1-3.
8. The robotic carcass processing system (1), of claims 1-3 for use in the method for automatic processing of carcasses of claim 6.
9. The robotic carcass processing work station (RC) of either one of claims 4-5, for use in the method for automatic processing of carcasses of claim 6.
DKPA201801006A 2018-12-17 2018-12-17 Cellular meat production DK180199B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DKPA201801006A DK180199B1 (en) 2018-12-17 2018-12-17 Cellular meat production
PCT/EP2019/085048 WO2020126890A1 (en) 2018-12-17 2019-12-13 Cellular meat production
EP19831624.2A EP3897162A1 (en) 2018-12-17 2019-12-13 Cellular meat production

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DKPA201801006A DK180199B1 (en) 2018-12-17 2018-12-17 Cellular meat production

Publications (2)

Publication Number Publication Date
DK201801006A1 true DK201801006A1 (en) 2020-07-31
DK180199B1 DK180199B1 (en) 2020-08-13

Family

ID=69104360

Family Applications (1)

Application Number Title Priority Date Filing Date
DKPA201801006A DK180199B1 (en) 2018-12-17 2018-12-17 Cellular meat production

Country Status (3)

Country Link
EP (1) EP3897162A1 (en)
DK (1) DK180199B1 (en)
WO (1) WO2020126890A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DK181162B1 (en) * 2021-01-12 2023-03-17 Teknologisk Inst Robotic packing system and method for use at slaughterhouses
DK181011B1 (en) * 2021-07-09 2022-09-21 Teknologisk Inst Digital process monitoring
DK181203B1 (en) 2021-09-28 2023-04-27 Teknologisk Inst A robotic carcass processing system and method for use in a slaughterhouse

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2478030C (en) * 2002-03-18 2011-05-31 Scanvaegt International A/S Method and system for monitoring the processing of items
NL1027332C2 (en) 2004-10-25 2006-04-26 Meerpaal B V De Robot cell for exchanging and storing e.g. tools for CNC machines, uses processor and memory to select suitable location in magazine for objects to be transferred from or to
CN102421297B (en) * 2009-04-03 2015-11-25 机器人技术有限公司 Carcass cutting methods and equipment
CN106231910A (en) 2014-05-01 2016-12-14 亚维斯产品公司 Robot trunk processing method and system
US9955702B1 (en) * 2016-10-28 2018-05-01 Jarvis Products Corporation Beef splitting method and system
US10117438B2 (en) * 2016-10-28 2018-11-06 Jarvis Products Corporation Beef splitting method and system
WO2018167089A1 (en) * 2017-03-13 2018-09-20 Carometec A/S 3d imaging system and method of imaging carcasses
EP3476555A1 (en) 2017-10-27 2019-05-01 Creaholic SA Production cell

Also Published As

Publication number Publication date
DK180199B1 (en) 2020-08-13
WO2020126890A1 (en) 2020-06-25
EP3897162A1 (en) 2021-10-27

Similar Documents

Publication Publication Date Title
US10752442B2 (en) Identification and planning system and method for fulfillment of orders
DK180199B1 (en) Cellular meat production
DE102019130902B4 (en) A robot system with a dynamic packing mechanism
US10060857B1 (en) Robotic feature mapping and motion control
US11504853B2 (en) Robotic system architecture and control processes
CA2759740C (en) Methods, apparatuses and computer program products for utilizing near field communication to guide robots
US11701777B2 (en) Adaptive grasp planning for bin picking
EP3761245A1 (en) Robotic sortation system
CN106672345B (en) A kind of method and system of industrial robot automatic sorting
WO2020231319A1 (en) Robot cell setup system and process
Boschetti A picking strategy for circular conveyor tracking
WO2021185805A2 (en) A relocatable robotic system for production facilities
Rieder et al. Robot-human-learning for robotic picking processes
CN113601501A (en) Flexible operation method and device for robot and robot
CN110914021A (en) Operating device with an operating device for carrying out at least one work step, and method and computer program
Lourenço et al. On the design of the ROBO-PARTNER intra-factory logistics autonomous robot
Wu et al. Application of visual servoing for grasping and placing operation in slaughterhouse
Ferreira et al. Smart system for calibration of automotive racks in Logistics 4.0 based on CAD environment
Suszyński et al. No Clamp Robotic Assembly with Use of Point Cloud Data from Low-Cost Triangulation Scanner
US11370124B2 (en) Method and system for object tracking in robotic vision guidance
Tomzik et al. Requirements for a cloud-based control system interacting with soft bodies
EP4137780A1 (en) Autonomous measuring robot system
CN113763462A (en) Method and system for automatically controlling feeding
DK202100195A1 (en) A relocatable robotic system for production facilities
US11660757B2 (en) Robot control system simultaneously performing workpiece selection and robot task

Legal Events

Date Code Title Description
PAT Application published

Effective date: 20200618

PME Patent granted

Effective date: 20200813