WO2020126890A1 - Cellular meat production - Google Patents

Cellular meat production

Info

Publication number
WO2020126890A1
WO2020126890A1 (PCT/EP2019/085048)
Authority
WO
WIPO (PCT)
Prior art keywords
processing
robotic
cell
workpiece
robot
Prior art date
Application number
PCT/EP2019/085048
Other languages
French (fr)
Inventor
Niels-Henrik GROTHE
Haiyan Wu
Klaus Nielsen JESPERSEN
Mark Philip PHILIPSEN
Kristian Damlund GREGERSEN
Original Assignee
Teknologisk Institut
Priority date
Filing date
Publication date
Application filed by Teknologisk Institut filed Critical Teknologisk Institut
Priority to EP19831624.2A priority Critical patent/EP3897162A1/en
Publication of WO2020126890A1 publication Critical patent/WO2020126890A1/en


Classifications

    • A: HUMAN NECESSITIES
    • A22: BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B: SLAUGHTERING
    • A22B5/00: Accessories for use during or after slaughtering
    • A22B5/0064: Accessories for use during or after slaughtering for classifying or grading carcasses; for measuring back fat
    • A22B5/007: Non-invasive scanning of carcasses, e.g. using image recognition, tomography, X-rays, ultrasound
    • A22C: PROCESSING MEAT, POULTRY, OR FISH
    • A22C17/00: Other devices for processing meat or bones
    • A22C17/0073: Other devices for processing meat or bones using visual recognition, X-rays, ultrasounds, or other contactless means to determine quality or size of portioned meat
    • A22C17/0086: Calculating cutting patterns based on visual recognition
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • This invention relates to robotic carcass processing systems and methods for production in parallel cell stations.
  • the system and method of the invention solve several of the disadvantages associated with conventional meat production methods.
  • Losses following a breakdown somewhere along the production line are greater than for other forms of production.
  • carcasses can be damaged by previous operating processes, and the sensitivity to fluctuations in commodity supply increases, as production is concentrated on fewer process units.
  • Machines are not always suited for processing a variety of sizes, and biological variations, contaminants, etc., significantly reduce overall effectiveness.
  • Equipment errors, as well as cleaning and maintenance, often result in a complete stop of the entire production line, affecting all products on the line and greatly reducing capacity.
  • WO 2006/085744 describes a robot cell and a method for changing and storing elements in a robot cell.
  • WO 2015/168511 describes a robotic carcass processing method and system.
  • WO 2019/081663 discloses a production cell comprising a robot.
  • the present invention provides an alternative system and a related method of processing carcasses at abattoirs.
  • the system of the invention may be described as a robotic carcass processing system, or automated carcass manufacturing system, composed of a plurality of collaborative robot cells or process modules, and the system and method of the invention differ from those conventionally used in abattoirs by taking place in several parallel cell stations rather than a few serial production lines.
  • Prominent features of the present invention are e.g.: rather than using specialised machinery, multi-functional robots are introduced; rather than running a fixed production flow, a programmable and varied production flow is accomplished; rather than undertaking a fixed schedule for cleaning and maintenance, need-adapted cleaning and maintenance schedules are introduced.
  • Another essential feature of the method of the invention is increased flexibility, allowing the handling of small product series, handling of a varied range of products, and customised production, focusing on different customers' special needs.
  • Production in robot cells according to the invention makes it possible to settle several orders/productions in parallel, just by configuration via software.
  • the method of the invention facilitates platform-based development rather than single development projects, and the method of the invention also allows for a better capacity adaptation.
  • the production units are programmable, there are fewer physical limitations, and the production plant can easily be adapted to fluctuations in the delivery of animals.
  • the cell may be equipped with a variety of tools, which provides access to a palette of operations. The arrangement, using programmable robots, also allows use of the same tool for different operations, thus reducing the number of tools necessary. Losses as a result of stops are reduced, down-time as a result of cleaning and maintenance is reduced, and the plant may run 24/7.
  • Clean-in-place (CIP) is a method of cleaning the interior surfaces of various process equipment without disassembly. Composed of separate, delimited, self-contained units, each unit can be isolated for better cleaning, including CIP. Because each cell does not need to be served by an operator, or the operator can supervise the process from outside the closed cell, the process may involve the use of X-ray, NMR equipment, and similar hazardous processes that might otherwise be harmful to an operator.
  • the method of the invention converts repetitive work into process monitoring and management.
  • the method of the invention may include the use of virtual reality (VR) or augmented reality (AR), as the operator, working from outside the cell, may monitor and correct the production process e.g. by wearing a virtual reality headset, which presents a virtual environment with a digital twin of the real robot cell.
  • the invention provides a robotic carcass processing system as described below.
  • the invention provides a method for automatic processing of carcasses.
  • Fig. 1 shows an overview of an in-cell processing phase, taking place inside a robot cell (RCx) of the invention:
  • starting material/intermediate products arrive via an in-cell (in-let) conveyor (5in); starting material/intermediate products optionally are placed on a work bench/processing table (4), and are being tracked and monitored by the in-cell sensing means (3), e.g. by the in-cell sensor (3A), and/or by the in-cell vision device (3B);
  • the active robot (Rx) chooses a tool (6x) suited for the intended process from a toolbox (6A) and performs the intended operation;
  • the processed product/end-product is placed on an internal out-let conveyor (5out) for transport to the outside of the robot cell (RCx);
  • Fig. 2 shows an overview of the external system of the invention:
  • starting material arrives via an in-let conveyor (7A), and the starting material is being tracked (localised/analysed) by the external sensing means (9);
  • starting material is being distributed to one of the robot cells (RC1-x) via a conductor (8), in operation with the in-let conveyor (5in);
  • starting material is processed in one of the robot cells (cf. Fig. 1), and processed starting material (intermediate products) may then be re-distributed to another robot cell (RC1-x) via an out-let conveyor (5out), optionally guided by the in-cell sensing means (3) and/or the external sensing means (9), both in operation with the processing means (2);
  • the processed product (12) is being transported to an out-let conveyor (7B); and all processes are carried out in operation with the processing means (2), optionally in communication with a means for storing information (a server/database) (10);
  • Fig. 3 shows a robot (Rx) for use according to the invention, mounted with a working tool (6), and onto which robot a vision device (3Bx), in communication with the processing means (2), is mounted on the tip of the robotic arm, within operating distance of the working tool (6), allowing monitoring of the process and/or guidance of the robot; and
  • Fig. 4 shows an example of a robot cell of the invention, equipped with three industrial robots (Rx1, Rx2, Rx3), individually working on the two half-carcasses (11), presented on and supported by the workbench (4) after being delivered inside the production cell (RCx) by an in-let conveyor (5in), optionally assisted by one or more of the robots, which production cell is guarded by a safety fence (15), and the entire in-cell processes may be monitored by one or more cameras (3B), mounted in the ceiling (not shown in the figure), and being in communication with the processing means (2) (not shown in the figure).
  • the invention provides a robotic carcass processing system, composed of one or more collaborative or non-collaborative robot cells.
  • the robotic carcass processing system of the invention takes its starting point in incoming starting material, which may be any carcasses, or parts thereof, conventionally processed in slaughterhouses. During the further processing, the starting material turns into i.a. processed products, meat items, intermediate products, and, eventually, end-products.
  • the system of the invention may also be characterised by comprising an in-let processing step, an out-let processing step, and, in between these steps, internal processing steps taking place in robot cells/production cells, which robot cells represent a closed or possibly sealed environment. Moreover, all processes are carried out in communication with, and guided or assisted by, a processing means.
  • the entire production may be monitored, and possibly corrected from outside the closed environment of the robot cell, by an operator.
  • the system of the invention (1) may be characterised by comprising the following (in-cell) elements:
  • robotic workstations RC1, RC2, ... RCx, which workstations comprise:
  • one or more industrial robots (R1, R2, ... Rx), in operation with the processing means (2), each robot capable of holding, and configured for operating, a working tool (6);
  • one or more in-let/out-let conveyors (5) in operation with the processing means (2), and capable of transporting the workpiece (11) into the robotic cell (RC), and/or the end-product (12) out of the robotic cell (RC);
  • one or more working tools (6) for mounting on said industrial robot (Rx), and suited for the intended task, e.g. as assessed by the processing means (2);
  • processing tables/workbenches/support stands (4) onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task;
  • an in-cell sensing means (3) configured for determining the location and/or characteristics of the workpiece (11), comprising:
  • one or more processing means (2) in collaboration with each other, and in operation with said industrial robots (R), said processing table/workbench/support stand (4), said in-cell sensing means (3), and configured for processing digitalized data obtained by said in-cell sensing means (3), and configured for applying machine learning to said obtained digitalised data.
  • Each robotic workstation (RC) of the invention is configured for, and undertakes, one or more of the following tasks: - receives the workpiece (11) in question entering the cell via an in-let internal conveyor (5in);
  • the system of the invention comprises two or more robotic workstations, optionally in inter-communication with one or more of the other robotic workstations.
  • the robotic carcass processing system of the invention comprises the following additional (external) elements:
  • an in-cell sensing means (3) installed at, or within operating distance of, the product supply means (7), which sensing means is in operation with the processing means (2), configured for determining the location/position and/or character of the starting material, which sensing means (3) comprises:
  • machine vision devices (3B);
  • one or more conductors (8) in operation with the processing means (2), configured for allocating each identified workpiece (11) from the in-let conveyor (7A) to a robotic workstation (RC), or for allocating processed material/intermediate products (12), returning from one robotic workstation, to another robotic workstation for further processing, or for allocating the end-products to the out-let conveyor (7C);
  • one or more processing means (2) in operation with, and configured for processing digitalized data obtained by, said in-let conveyor (7A), said in-cell and/or external sensing means (3, 9), said conductors (8), said conveyors (5, 7), said robotic workstations (RC), said industrial robots (R), said out-let conveyor (7C), for determining the localisation, the character, structure, nature, size, orientation, quality, etc., and for guiding the motion of the industrial robot during operation/processing.
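The assessments attributed to the processing means (2) above can be pictured as a small decision routine. The following Python sketch is illustrative only: the Workpiece fields, the size threshold, and the station and tool names are invented, not taken from the patent.

```python
from dataclasses import dataclass

# Hypothetical summary of the data the processing means (2) aggregates for one
# workpiece (11); all field names and units are assumptions.
@dataclass
class Workpiece:
    workpiece_id: str
    position_mm: tuple      # (x, y) on the conveyor, from the sensing means (3, 9)
    length_mm: float        # size estimate from the vision device
    orientation_deg: float  # rotation relative to the conveyor axis

def plan_processing(wp: Workpiece) -> dict:
    """Derive a processing decision from measured values: choose a
    workstation and a tool from the workpiece's characteristics.
    Thresholds are invented for illustration."""
    station = "RC1" if wp.length_mm < 900 else "RC2"   # small vs large pieces
    tool = "knife-small" if wp.length_mm < 900 else "knife-large"
    # The robot compensates for how the piece lies on the conveyor.
    rotate_by = -wp.orientation_deg
    return {"station": station, "tool": tool, "rotate_deg": rotate_by}

plan = plan_processing(Workpiece("W-001", (120.0, 45.0), 750.0, 12.5))
```

In a real plant the thresholds would come from the scheduled programme or from the machine-learning component rather than being hard-coded.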
  • the robotic carcass processing system of the invention further comprises one or more of the following hardware elements:
  • a buffer/conveyor for receiving products (5, 7);
  • the robotic carcass processing system of the invention further comprises one or more of the following software elements, executed on the processing means (2):
  • AI/machine learning algorithm for extraction of key information from sensor input, e.g. a 2D image
  • The robotic workstation (RC)
  • the system of the invention comprises one or more robotic workstations, which may also be termed manufacturing cells or robot cells (RCi, RC2, RC3, ... RCx), in which workstations/cells the actual processing of the meat items takes place.
  • the robot cells for use according to the invention shall be configured for operation in parallel, and independently of each other, but may also be inter-communicating with each other via the processing means (2).
  • the robotic carcass processing workstation (RC) of the invention may be characterised by comprising the following elements:
  • each robot configured for holding, and capable of operating a working tool (6); one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, as assessed by the processing means (2);
  • an in-cell sensing means (3) configured for determining the location and/or character of the workpiece (11), comprising :
  • in-cell machine vision devices (3B).
  • one or more processing means (2) in operation with said industrial robots (R), and configured for processing digitalized data obtained by said sensing means (3).
  • the robotic carcass processing workstation (RC) of the invention further comprises a means for storing information (a server/database) (10).
  • the robot cell of the invention may be regarded as a closed or possibly sealed environment, suited for implementation of clean-room techniques, and they should be established using cleaning-friendly materials.
  • Each robot cell receives material to be processed via an in-let conveyor (5in) and delivers processed material via an out-let conveyor (5out).
  • supply and delivery of the goods is implemented using one and the same conveyor (5in/out), just by reversing the transporting direction of the conveyor.
  • Each robotic workstation of the invention comprises one or more industrial robots (R1, R2, R3, ... Rx), one or more working tools (6), one or more processing tables/workbenches/support stands (4), and an in-cell sensing means (3); the robot(s) (R) and the in-cell sensing means (3) shall be in communication with, and receive operational guidance from, the processing means (2).
  • the robot cell and the processes taking place herein may be monitored from outside the cell by an operator (13), optionally by use of projection mapping, or equipped with virtual reality (VR) or augmented reality (AR) equipment (14).
  • the workstations/cells (RC) for use according to the invention shall be configured for, and able to perform, one or more of the following tasks: - receipt of the workpiece (11) in question entering the cell via an in-let internal conveyor (5in);
  • the workstations/cells (RC) for use according to the invention shall comprise a protective casing surrounding the robot cell (RC), in which casing a door assembly is provided in order to provide access to the workstation, via an opening, through which the station can be served by an operator.
  • The industrial robot (R)
  • the system of the invention comprises the use of one or more robots (R1, R2, R3, ... Rx), including the use of single robots, and the use of multiple robots working together.
  • the robot for use according to the invention may be any available automated industrial robot (or robotic manipulator), capable of being programmed, and capable of moving on two or more axes.
  • the robot shall be in communication with, and receive operational guidance from, the processing means (2).
  • the industrial robot is understood to comprise manipulators, so it can be configured for holding, and capable of operating a working tool (6).
  • the robot also shall be able to choose and change tools (6), e.g. a removable or exchangeable tool or hardware module chosen from a toolbox or tool magazine (6A).
  • the robot (Rx) for use according to the invention is mounted with a working tool (6), and a vision device (3Bx), in communication with the processing means (2), is mounted on the tip of the robotic arm, within operating distance of the working tool (6), allowing monitoring of the process and/or guidance of the robot.
  • the robot for use according to the invention may be any commercially available industrial robot. Industrial robots may be classified based on their coordinate systems, i.e. based on the reachable coordinates of a point on the end-effector, and include:
  • Cartesian robots: the arms of the robot move in the XYZ rectangular coordinate system;
  • Cylindrical robots: the arms of the robot move in one angular and two linear directions;
  • Spherical robots: the arms move in two angular and one linear direction;
  • SCARA robots (Selective Compliance Arm for Robotic Assembly): have two parallel revolute joints providing compliance in a selected plane;
  • Articulated robots: also known as anthropomorphic robots; the robot arm has three revolute joints.
  • the system of the invention also comprises one or more working tools (6) that can be attached to the industrial robot (Rx) and are suited for the intended task.
  • the working tools (6) may be available from a toolbox or tool magazine (6A), that may comprise one or more removable or exchangeable tools or hardware modules, from which toolbox (6A) one or more robots can pick and choose the desired tool, depending on the intended task.
  • the working tool (6) may be introduced by use of one or more robots (R).
  • One tool (6) may be mounted on one robot, and two or more robots may work together to solve the intended task.
  • a multi-functional tool is mounted on a robot (R) configured for operating the multi-functional tool, and able to switch between the individual tools, as need be.
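The tool-selection and tool-change behaviour described above can be sketched as a simple routine. This is a hedged illustration; the task names and the task-to-tool mapping are assumptions, not from the patent.

```python
# Illustrative toolbox (6A) contents; names are invented for the sketch.
TOOLBOX = {"cut": "knife", "saw": "bone-saw", "clean": "brush"}

class Robot:
    """Minimal stand-in for a robot (R) that can pick and change tools (6)."""
    def __init__(self, name):
        self.name = name
        self.mounted_tool = None

    def pick_tool(self, task):
        """Choose and mount the tool suited for the intended task,
        switching only when the currently mounted tool does not match."""
        needed = TOOLBOX[task]
        if self.mounted_tool != needed:
            self.mounted_tool = needed  # tool change via the tool magazine
        return self.mounted_tool

rx = Robot("Rx")
first = rx.pick_tool("cut")   # mounts the knife
second = rx.pick_tool("saw")  # switches to the bone saw
```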
  • Tools for use according to the invention also may include scissors, e.g. scissors having multiple blades, including blades of different sizes.
  • the system of the invention may also comprise tools for cleaning of the robot cell and its interior, including tools for cleaning other tools.
  • Cleaning tools include e.g. brushes, brooms, and pressure devices.
  • the system of the invention comprises one or more processing means (2), which shall include one or more central processing units (CPU), that may e.g. include a standard PC. If two or more processing means are employed, these processors may be in inter-communication with one or more of the other processing means.
  • the processing means for use according to the invention shall be in communication with, and/or configured for processing digitalized data obtained by one or more of the following devices:
  • the in-cell sensing means (3) i.e. the in-cell sensor (3A) and/or the in-cell machine vision device (3B);
  • the processor(s) for use according to the invention shall also be configured for making assessments and determinations, and be able to perform one or more of the following tasks:
  • the processing means (2) for use according to the invention shall be programmed for performing i.a. machine vision and/or machine learning (more details below).
  • the carcass processing system of the invention may also be in operation with an external Enterprise Resource Planning (ERP) system.
  • ERP represents a software application that manages functional areas across the business. ERP integrates company functions such as order processing, sales, procurement, inventory management and financial systems.
  • the processing means (2) for use according to the invention shall be configured for running a local ERP application, or for communicating with an external ERP application.
  • The sensing means (3, 9)
  • For tracking incoming products, as well as processed products, the system needs to know the position/location, and possibly also the characteristics, of each item. This may be accomplished by use of sensors and/or machine vision devices.
  • Tracking of the products takes place inside the robotic cell, but may also take place outside the robotic cell, when dealing with starting materials and/or end-products.
  • Sensing means for use according to the invention comprises sensors (3A, 9A), and/or machine vision devices (3B, 9B).
  • a sensor is a device, module, or subsystem, whose purpose is to detect events, or changes in its environment, and send the information to other electronics, and in particular the processing device (2) used according to the invention.
  • the sensor (3A) for use according to the invention may be a mechanical (electro-mechanical) sensor or an optically based sensor, e.g. a force/torque sensor, a light beam or electro-magnetic sensor.
  • machine vision devices can determine positions or locations of specific subjects, but machine vision devices can also determine e.g. the character or the structure of a given object.
  • Machine vision represents a combination of hardware and software capable of providing operational guidance to other devices based on the capture and processing of images. Such systems usually rely on digital sensors protected inside industrial cameras, with specialised optics to acquire images, so that computer hardware and software can process, analyse and measure various characteristics for decision making, including determining the identity and the location of objects in the production cell, and performing feature extraction, e.g. by using machine learning.
  • a machine vision system typically comprises lighting, a camera with a lens, an image sensor, a vision processing means, and communication means.
  • the lens captures the image and presents it to the sensor in the form of light.
  • the sensor in a machine vision camera converts this light into a digital image, which is then sent to the processor for analysis. Lighting illuminates the part to be inspected and allows its features to stand out so they can be clearly seen by the camera.
  • Processing may be accomplished by conventional processors, including central processing units (CPU) and/or graphics processing units (GPU), e.g. in a PC-based system, or in an embedded vision system, and is performed by software and may consist of several steps. First, an image is acquired from the sensor.
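The processing steps just described (acquire an image, process it, measure characteristics for decision making) can be sketched without camera hardware or a vision library, using a toy grayscale image held in nested lists. All values are invented for illustration.

```python
def acquire_image():
    """Stand-in for reading a frame from the camera sensor:
    a 5x5 grayscale image with one bright region (the workpiece)."""
    return [
        [0, 0,   0,   0, 0],
        [0, 200, 210, 0, 0],
        [0, 220, 230, 0, 0],
        [0, 0,   0,   0, 0],
        [0, 0,   0,   0, 0],
    ]

def segment(image, threshold=128):
    """Threshold the image into foreground pixels (the inspected part)."""
    return [(r, c) for r, row in enumerate(image)
                   for c, v in enumerate(row) if v > threshold]

def centroid(pixels):
    """Measure one characteristic for decision making: the object's centre."""
    n = len(pixels)
    return (sum(r for r, _ in pixels) / n, sum(c for _, c in pixels) / n)

pix = segment(acquire_image())
cx, cy = centroid(pix)
```

A production system would of course use an industrial camera and a vision library, but the acquire/segment/measure structure is the same.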
  • Machine vision systems essentially come in three main categories:
  • 1D vision captures a digital signal one line at a time instead of looking at a whole picture at once, and is accomplished by use of a line-scan camera;
  • 2D vision looks at the whole picture, and may be accomplished by use of an industrial camera;
  • 3D vision systems comprise multiple sensors, including one or more distance sensors.
  • the machine vision device for use according to the invention also may include the use of e.g. X-ray and/or NMR-equipment.
  • Any category, or combination of categories, may be implemented in the processing system of the invention.
  • the machine vision device (3B, 9B) for use according to the invention shall be configured for determining the identity and/or location of objects in the camera's field of view, and/or for determining the characteristics, and/or for determining the quality of the workpiece (11) to be processed, and shall be in communication with, and provide (digitalised) data to the processing means (2), which data, eventually, shall be used by a robot (R) for determining its particular motion with respect to a given workpiece to be processed.
  • the vision device (3B) may be configured to detect both static as well as moving objects, and determine their position in space, e.g. in order to prevent collisions.
  • the vision device for use according to the invention may comprise a combination of 1D, 2D and 3D scanning devices (3B, 9B), to determine the position and dimensions of the product and calculate the angle of its surface, thus allowing optimal guidance of the robotic arm to position the working tool (6) at the right angle.
  • the machine vision hardware components for use according to the invention are commercially available, and machine vision systems can be assembled from single components, or purchased as an integrated system, with all components in a single device.
  • the 3D scanning devices (3B, 9B) for use according to the invention may be any commercially available 3D scanner or range camera, such as a time-of-flight camera, structured-light camera, or stereo camera, e.g. a SICK 3D Ruler or Microsoft Kinect.
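One way the 3D data mentioned above could yield the surface angle used to orient the working tool (6) is a plane fit through sampled points. The following is a minimal stdlib-only sketch using three points and a cross product; the sample points and the choice of method are assumptions, not from the patent.

```python
import math

def surface_normal(p0, p1, p2):
    """Unit normal of the plane through three 3D points
    (cross product of the two edge vectors)."""
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1],
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0]]
    length = math.sqrt(sum(c*c for c in n))
    return [c / length for c in n]

def tilt_deg(normal):
    """Angle between the surface normal and the vertical (z) axis,
    i.e. how far the surface is tilted from horizontal."""
    return math.degrees(math.acos(abs(normal[2])))

# Invented sample: a surface sloping upward along x by 45 degrees.
n = surface_normal((0, 0, 0), (1, 0, 1), (0, 1, 0))
angle = tilt_deg(n)
```

A real system would fit a plane to many range-camera points (e.g. by least squares) rather than exactly three, but the normal-and-angle computation is the same.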
  • The transporting means (5, 7)
  • the carcass processing system of the invention must comprise one or more transporting means (5, 7), and these transporting means shall be in communication with the processing means (2).
  • Transporting according to the invention is for commodity supply and for distributing the workpiece (11) to one of the robotic workstations (RC1, ... RCx), and/or for redistributing processed material/intermediate products to another of the available robot cells (RC), and/or for transport of processed material/end-product (12) to the out-let conveyor (7B).
  • Transport may take place by use of a conveyor belt, a conveyor/overhead rail, a lift, or an automated guided vehicle (AGV).
  • the system of the invention includes one or more processing tables/workbenches/support/fixation stands (4), onto which the workpiece (11) can be placed.
  • workbenches may involve a transition from a hanging position to a lay-down position, which transition may be accomplished by use of a lay-down mechanism.
  • the system of the invention also comprises one or more conductors (8).
  • a conductor represents equipment that can redistribute the incoming meat items to other conveyors.
  • the conductor (8) shall be in communication with, and receive operational guidance from, the processing means (2).
  • the conductor (8) is configured for allocating each identified starting workpiece (11) from the in-let conveyor (7A) to a robotic workstation (RC), or for allocating processed material (12), returning from one robotic workstation, to another robotic workstation for further processing, or for allocating the end-products to the out-let conveyor (7C).
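The conductor's allocation role can be sketched as a routing function. The least-loaded selection rule below is an assumption made for illustration; the patent only requires that the processing means (2) determines the destination.

```python
# Invented cell names and a simple load counter per robot cell (RC).
cell_load = {"RC1": 0, "RC2": 0, "RC3": 0}

def route(workpiece_id, fully_processed=False):
    """Decide where the conductor (8) should send a workpiece:
    finished products go to the out-let conveyor, everything else
    to the robot cell with the fewest queued workpieces."""
    if fully_processed:
        return "out-let conveyor (7C)"
    cell = min(cell_load, key=cell_load.get)  # least-loaded cell
    cell_load[cell] += 1
    return cell

first = route("W-001")   # all cells empty, ties resolved in dict order
second = route("W-002")  # RC1 now busy, so the next piece goes elsewhere
done = route("W-001", fully_processed=True)
```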
  • the entire production may be monitored and possibly corrected from outside the closed environment of the robot cell (RC) by an operator (13).
  • Monitoring of the carcass processing may be accomplished by visual inspection, or by use of projection mapping, or by use of virtual reality (VR) or augmented reality (AR) equipment (14).
  • Machine learning is an application of artificial intelligence (AI) which uses statistical techniques to perform automated decision-making, and optionally improve performance on specific tasks based on experience, without being explicitly programmed.
  • in supervised learning, e.g., the computer is presented with example inputs and their desired outputs, given by the supervisor, and the goal is to learn a general rule that maps inputs to outputs.
  • a supervised learning algorithm is trained to learn a general rule that maps inputs to outputs.
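As a toy instance of the supervised-learning idea above, the learned rule mapping inputs to outputs can be a nearest-neighbour lookup over labelled examples. The measurements and cut labels below are invented, and the patent does not specify any particular algorithm; this is only a sketch of the input/output mapping.

```python
# Example inputs with desired outputs, as given by the "supervisor":
# invented (weight kg, length cm) measurements labelled by cut class.
training = [
    ((6.0, 30.0), "shoulder"),
    ((5.5, 28.0), "shoulder"),
    ((12.0, 60.0), "loin"),
    ((11.0, 55.0), "loin"),
]

def predict(x):
    """Map a new input to an output via the closest training example
    (a 1-nearest-neighbour rule, standing in for any trained model)."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    _, label = min(training, key=lambda ex: sq_dist(ex[0], x))
    return label

label = predict((11.5, 58.0))  # closest to the "loin" examples
```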
  • a product ID can be allocated to each workpiece/mixture of workpieces (11), and transmitted to the server (10) for further action/use.
  • information about the product type or product ID is used for specifying the destination of the meat product(s) (12), e.g. a specific place in the packing room.
  • the system of the invention may include a safety system to protect human operators from harm.
  • Safety in human-robot collaborative manufacturing can be ensured by various types of safety systems, including guard-free workspaces and fenceless robotic systems, that come in many forms, shapes and sizes.
  • active vision-based safety systems have gained momentum due to their affordable price (e.g. off-the-shelf RGB or depth cameras and projectors), flexible installation and easy tailoring.
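At its simplest, a depth-camera-based safety check of the kind referred to above reduces to a distance test between the operator's estimated position and the robot's tool position. The thresholds and zone names in this sketch are invented for illustration, not specified in the patent.

```python
import math

STOP_M = 0.5   # emergency-stop radius around the robot (assumed value)
SLOW_M = 1.5   # reduced-speed radius (assumed value)

def safety_state(robot_xyz, operator_xyz):
    """Classify the safety zone from the operator-robot distance,
    as an active vision system might after locating the operator
    in a depth image."""
    d = math.dist(robot_xyz, operator_xyz)
    if d < STOP_M:
        return "stop"   # halt the robot immediately
    if d < SLOW_M:
        return "slow"   # continue at reduced speed
    return "run"        # normal operation

state_far = safety_state((0, 0, 1), (3, 0, 1))     # operator 3 m away
state_near = safety_state((0, 0, 1), (0.3, 0, 1))  # operator 0.3 m away
```

An industrial implementation would use certified safety hardware and track the whole human body, not a single point, but the zone logic is the same shape.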
  • the invention provides a method for automatic processing of carcasses.
  • step c2: optionally, redistribution of a processed workpiece, processed according to step c1, to another robotic workstation (RC) for continued or further processing;
  • The analysis carried out according to step (a) is performed using an (external) sensing means (9), and in particular an (external) sensor (9A) and/or an (external) machine vision device (9B), in communication with the processing means (2). While a sensor (9A) may help locate and track the incoming workpiece (11), the vision device (9B) may help provide additional information about the incoming product.
  • using the processing means (2), the location and identification of the product is determined, and a decision on the further processing is calculated according to a predetermined (programmed or self-taught) schedule, or according to actual measured values, i.e. information obtained using the sensor (9A) or using the vision device (9B).
  • the incoming workpiece ( 11) is distributed to a robotic workstation (RC) for processing .
  • This transport is accomplished using a conductor (8) in collaboration with an (in-let) conveyor (5in).
  • the conductor (8) secu res that the product is being directed onto the correct in-let conveyor, as determined by the processing means (2) .
  • the actual processing of the product takes place.
  • this analysis takes place using one or more in-cell sensors (3A), in communication with the processor (2).
  • the sensor (3A) for use according to the invention may be a mechanical (electro-mechanical) sensor or an optically based sensor, e.g. a force/torque sensor, a light beam or electro-magnetic sensor.
  • analysis takes place using an in-cell machine vision device (3B), in communication with the processor (2).
  • the machine vision device (3B) for use according to the invention shall be configured for locating the position, and/or for determining the characteristics, and/or for determining the quality of the workpiece (11) to be processed, and shall be in communication with, and provide (digitalised) data to the processing means (2), which data, eventually, shall be used by a robot (R) for determining the particular motion of the industrial robot in respect of a given workpiece to be processed.
  • a robot (R) guided by the processing means (2), and already mounted with a working tool (6), or after having picked a working tool from the toolbox (6A), starts working on the workpiece (11), properly supported on a processing table, workbench, or a support stand (4). If needed, the robot (R) may change working tool for completion of its task.
  • Processing may be accomplished by use of one robot (R) only, or using multiple robots working together.
  • Processing of the workpiece (11) may be carried out according to a scheduled programme.
  • a scheduled programme may be stored on a means for storing information (server/database) (10), from which the processor (2) may get the necessary information.
  • processing of the workpiece (11) may be carried out according to a self-taught schedule, applying the machine learning techniques described above.
  • step (c2) may be redistribution to another robotic workstation (RC) for further processing.
  • RC robotic workstation
  • This may be accomplished via an out-let conveyor (5out), optionally via the conductor (8), and an in-let conveyor leading to the other robot cell (RC).
  • the in-let conveyor may turn into the out-let conveyor, simply by reversing its transport direction.
  • step (c1) After arrival at another robot cell (RC), the process of step (c1) may be repeated.
  • the end-product (12) is being transported out of the robot cell (RC), via the out-let conveyor (5out), via the conductor (8), and via the out-let conveyor (7B), for final processing.
  • Final processing/finishing may include packaging, shipping, etc.
  • External machine vision device/camera (9B)
  • Means for storing information (server/database) (10)
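The supervised-learning and product-ID allocation steps above can be sketched as follows; the features (weight, length), class labels, and training values below are purely hypothetical stand-ins, not data from the invention, and a 1-nearest-neighbour rule is used only as a minimal example of a learned mapping from inputs to outputs:

```python
from math import dist

# Labelled examples given by the "supervisor": (weight_kg, length_cm) -> class.
# All values and labels are hypothetical.
training = [
    ((78.0, 120.0), "ham_line"),
    ((82.0, 125.0), "ham_line"),
    ((55.0, 100.0), "loin_line"),
    ((58.0, 104.0), "loin_line"),
]

def classify(features):
    """Learned rule: 1-nearest-neighbour over the labelled examples."""
    nearest = min(training, key=lambda ex: dist(ex[0], features))
    return nearest[1]

_next_id = 0

def allocate_product_id(features):
    """Allocate a product ID to a workpiece (11) and pair it with the
    destination decided by the learned rule, ready for transmission
    to the server (10)."""
    global _next_id
    _next_id += 1
    return {"product_id": _next_id, "destination": classify(features)}

print(allocate_product_id((80.0, 122.0)))
```

In a real system, the learned rule would of course be replaced by a model trained on sensor or camera data, but the input-to-output structure is the same.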

Abstract

This invention relates to robotic carcass processing systems and methods for production in parallel cell stations. The system and method of the invention solve several of the disadvantages associated with conventional meat production methods.

Description

CELLULAR MEAT PRODUCTION
TECHNICAL FIELD
This invention relates to robotic carcass processing systems and methods for production in parallel cell stations. The system and method of the invention solve several of the disadvantages associated with conventional meat production methods.
BACKGROUND ART
For many years, the meat industry has been industrialised in terms of organisation, work specialisation and use of long production lines. Although robots and automation have gained ground, many slaughterhouse processes are still carried out partly or entirely by hand, i.a. due to the biological variation of the carcasses to be processed.
The production methods of conventional slaughterhouses are largely characterized by long production lines with specialized workstations operating serially, one after the other. While specialization and the use of few production lines generally increase efficiency, this set-up also has its drawbacks.
Losses following a breakdown somewhere along the production line are greater than for other forms of production. Sometimes carcasses can be damaged by previous operating processes, and the sensitivity to fluctuations in commodity supply increases, as production is concentrated on fewer process units. Machines are not always suited for processing a variety of sizes, and biological variations, contaminants, etc., significantly reduce the overall effectiveness. Equipment errors, as well as cleaning and maintenance, often result in a complete stop of the entire production line, affecting all products on the line, and greatly influence the capacity.
Long production lines are also unsuited for producing small series, with frequent conversions/changes of product focus, and for a product mix where multiple products are run simultaneously. Moreover, highly specialized production lines involve much internal transport of the goods, which negatively affects the overall efficiency/economy.
Finally, long production lines increase the risk of repetitive work at the individual workstations.
For many years robots have represented the solution to much of the labour-intensive work that is undertaken in various industries. Thus, WO 2006/085744 describes a robot cell and a method for changing and storing elements in a robot cell.
WO 2015/168511 describes a robotic carcass processing method and system. WO 2019/081663 discloses a production cell comprising a robot.
However, the robotic carcass processing methods and systems described herein have never been disclosed.
SUMMARY OF THE INVENTION
The present invention provides an alternative system and a related method of processing carcasses at abattoirs. The system of the invention may be described as a robotic carcass processing system, or automated carcass manufacturing system, composed of a plurality of collaborative robot cells or process modules, and the system and method of the invention differ from those conventionally used in abattoirs by taking place in several parallel cell stations rather than a few serial production lines.
Prominent features of the present invention are e.g.: Rather than using specialised machinery, multi-functional robots are introduced; Rather than running a fixed production flow, a programmable and varied production flow is accomplished; Rather than undertaking a fixed schedule for cleaning and maintenance, need-adapted cleaning and maintenance schedules are introduced.
Another essential feature of the method of the invention is increased flexibility, allowing the handling of small product series, handling of a varied range of products, and customised production, focusing on different customers' special needs. Production in robot cells according to the invention makes it possible to settle several orders/productions in parallel, just by configuration via software.
Moreover, the method of the invention facilitates platform-based development rather than single development projects, and the method of the invention also allows for a better capacity adaptation. As the production units are programmable, there are fewer physical limitations, and the production plant can easily be adapted to fluctuations in the delivery of animals. While the cell may be equipped with a variety of tools, which provides access to a palette of operations, the arrangement, using programmable robots, also allows for use of the same tool for different operations, thus reducing the number of tools necessary. Losses as a result of stops are reduced, down-time as a result of cleaning and maintenance is reduced, and the plant may run 24/7.
Clean-in-place (CIP) is a method of cleaning the interior surfaces of various process equipment without disassembly. Composed of separate, delimited, self-contained units, each unit can be isolated for better cleaning, including CIP. Because each cell does not need to be served by an operator, or the operator can supervise the process from outside the closed cell, the process may involve the use of X-ray, NMR equipment, and similar hazardous processes that might otherwise be harmful to an operator.
Finally, the method of the invention converts repetitive work into process monitoring and management. In this respect, the method of the invention may include the use of virtual reality (VR) or augmented reality (AR), as the operator, working from outside the cell, may monitor and correct the production process e.g. by wearing a virtual reality headset, which presents a virtual environment with a digital twin of the real robot cell.
Therefore, in its first aspect, the invention provides a robotic carcass processing system as described below.
In another aspect, the invention provides a method for automatic processing of carcasses.
Further aspects of the invention relate to use of the robotic carcass processing system of the invention in the method of the invention for automatic processing of carcasses.
Other objects of the invention will be apparent to the person skilled in the art from reading the following detailed description and accompanying drawings.
Any combination of two or more of the embodiments described herein is considered within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is further illustrated by reference to the accompanying drawing, in which :
Fig. 1 shows an overview of an in-cell processing phase, taking place inside a robot cell (RCx) of the invention:
starting material/intermediate products arrive via an in-cell (in-let) conveyor (5in); starting material/intermediate products optionally are placed on a work bench/processing table (4), and are being tracked and monitored by the in-cell sensing means (3), e.g. by the in-cell sensor (3A), and/or by the in-cell vision device (3B);
the active robot (Rx) chooses a tool (6x) suited for the intended process from a toolbox (6A) and performs the intended operation;
the processed product/end-product is placed on an internal out-let conveyor (5out) for transport to the outside of the robot cell (RCx);
all processes are carried out in operation with the processing means (2), optionally in communication with a means for storing information (i.e. a server/database) (10), and optionally using machine learning methodology applied to data obtained by the sensing means (3); and the robot cell and the processes taking place herein may be monitored from outside the cell by an operator, optionally equipped with a VR/AR device (14);
Fig. 2 shows an overview of the external system of the invention:
starting material arrives via an in-let conveyor (7A), and the starting material is being tracked (localised/analysed) by the external sensing means (9);
starting material is being distributed to one of the Robot Cells (RC1-x) via a conductor (8), in operation with the in-let conveyor (5in);
starting material is processed in one of the robot cells (cf. Fig. 1), and processed starting material (intermediate products) may then be re-distributed to another Robot Cell (RC2-x) via an out-let conveyor (5out), optionally guided by the in-cell sensing means (3), and/or the external sensing means (9), both in operation with the processing means (2);
this process (re-distribution of intermediate products) may be repeated;
the processed product (12) is being transported to an out-let conveyor (7B); and all processes are carried out in operation with the processing means (2), optionally in communication with a means for storing information (a server/database) (10);
Fig. 3 shows a robot (Rx) for use according to the invention, mounted with a working tool (6), and onto which robot a vision device (3Bx), in communication with the processing means (2), is mounted on the tip of the robotic arm, within operating distance of the working tool (6), allowing monitoring of the process and/or guidance of the robot; and
Fig. 4 shows an example of a robot cell of the invention, equipped with three industrial robots (Rx1, Rx2, Rx3), individually working on the two half-carcasses (11), presented on and supported by the workbench (4) after being delivered inside the production cell (RCx) by an in-let conveyor (5in), optionally assisted by one or more of the robots, which production cell is guarded by a safety fence (15), and the entire in-cell processes may be monitored by one or more cameras (3B), mounted in the ceiling (not shown on the figure), and being in communication with the processing means (2) (not shown on the figure).
DETAILED DISCLOSURE OF THE INVENTION
The system of the invention
In its first aspect, the invention provides a robotic carcass processing system, composed of one or more collaborative or non-collaborative robot cells.
The robotic carcass processing system of the invention takes its starting point in incoming starting material, which may be any carcasses, or parts thereof, conventionally processed in slaughterhouses. During the further processing, the starting material turns into i.a. processed products, meat items, intermediate products, and - eventually - end-products.
The system of the invention may also be characterised by comprising an in-let processing step, an out-let processing step, and, in between these steps, internal processing steps taking place in robot cells/production cells, which robot cells represent a closed or possibly sealed environment. Moreover, all processes are carried out in communication with, and guided or assisted by, a processing means.
Moreover, the entire production may be monitored, and possibly corrected from outside the closed environment of the robot cell, by an operator.
The workstation
The system of the invention (1) may be characterised by comprising the following (in-cell) elements:
one or more robotic workstations (RC1, RC2, ... RCx), configured for operation in parallel, and independently of each other, each of which robotic
workstations comprises:
one or more industrial robots (R1, R2, ... Rx), in operation with the processing means (2), each robot capable of holding, and configured for operating, a working tool (6);
one or more in-let/out-let conveyors (5), in operation with the processing means (2), and capable of transporting the workpiece (11) into the robotic cell (RC), and/or the end-product (12) out of the robotic cell (RC);
one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, e.g. as assessed by the processing means (2);
one or more processing tables/workbenches/support stands (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task;
an in-cell sensing means (3), configured for determining the location and/or characteristics of the workpiece (11), comprising:
one or more in-cell sensors (3A); and/or
one or more in-cell machine vision devices (3B); and
one or more processing means (2), in collaboration with each other, and in operation with said industrial robots (R), said processing table/workbench/support stand (4), said in-cell sensing means (3), and configured for processing digitalized data obtained by said in-cell sensing means (3), and configured for applying machine learning to said obtained digitalised data.
Each robotic workstation (RC) of the invention is configured for, and undertakes one or more of the following tasks: - receives the workpiece (11) in question entering the cell via an in-let internal conveyor (5in);
- processes the workpiece (11) in question, according to a pre-programmed schedule, according to real-time, online measurements, and/or, optionally, by applying machine learning methodology on digitalised data obtained by the sensing means (3);
- determines the character, structure, nature, size, orientation, location, quality, etc., of the meat item to be processed;
- makes quality assessment of the workpiece in question, and provides feedback to the robot in question, optionally accomplished by a self-learning algorithm;
- guides the motion of the industrial robot (R) during operation/processing;
- selects and picks a working tool (6) from a toolbox (6A), and mounts this tool on the industrial robot (Rx) in question, in accordance with the intended (programmed or self-taught) task;
- replaces the working tool (6) on the industrial robot (Rx) in question as needed, and in accordance with the intended task;
- performs self-cleaning according to a predetermined (programmed) schedule, or according to an actual measured value; and/or
- places the end-product (12) on the out-let conveyor (5out), for transport to outside the cell.
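The task sequence above can be illustrated as a single in-cell control loop. This is a sketch only: the function names, the dictionary of subsystems, and the tool-selection rule are hypothetical stand-ins for the real sensing means (3), processing means (2) and robot (R):

```python
def run_cell(workpiece, processing_means):
    """One pass of a robotic workstation (RC), following the task list above."""
    # 1. Receive the workpiece (11) from the in-let internal conveyor (5in)
    item = workpiece
    # 2./3. Sense: determine location, size, orientation, quality (sensing means (3))
    features = processing_means["sense"](item)
    # 4. Decide tool and schedule (pre-programmed or self-taught)
    tool = processing_means["select_tool"](features)
    # 5. Process, with the robot (R) guided by the processing means (2)
    result = processing_means["process"](item, tool)
    # 6. Place the end-product (12) on the out-let conveyor (5out)
    return {"product": result, "conveyor": "5out", "tool_used": tool}

# Hypothetical stand-ins for the real subsystems:
means = {
    "sense": lambda item: {"size": len(item)},
    "select_tool": lambda f: "rotary_knife" if f["size"] > 4 else "fixed_knife",
    "process": lambda item, tool: f"{item} cut with {tool}",
}
print(run_cell("half-carcass", means))
```

The point of the sketch is the ordering of the steps, not the stand-in logic; each dictionary entry would be a subsystem in communication with the processing means (2).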
In one embodiment, the system of the invention comprises two or more robotic workstations, optionally in inter-communication with one or more of the other robotic workstations.
In another embodiment, the robotic carcass processing system of the invention comprises the following additional (external) elements:
one or more transporting means (5, 7), in operation with the processing means (2), for commodity supply and configured for distributing the workpiece (11) to one of the two or more robotic workstations (RC1, ... RCx), or for redistributing processed material/intermediate products to another of the available robot cells (RC), or for transport of processed material/end-product to the out-let conveyor (7C);
an in-cell sensing means (3), installed at, or within operating distance of, the product supply means (7), which sensing means is in operation with the processing means (2), configured for determining the location/position and/or character of the starting material, which sensing means (3) comprises:
one or more sensors (3A); and/or
one or more machine vision devices (3B);
one or more conductors (8), in operation with the processing means (2), configured for allocating each identified workpiece (11) from the in-let conveyor (7A) to a robotic workstation (RC), or for allocating processed material/intermediate products (12), returning from one robotic workstation, to another robotic workstation for further processing, or for allocating the end-products to the out-let conveyor (7C);
one or more processing means (2), in operation with, and configured for processing digitalized data obtained by, said in-let conveyor (7A), said in-cell and/or external sensing means (3, 9), said conductors (8), said conveyors (5, 7), said robotic workstations (RC), said industrial robots (R), and said out-let conveyor (7C), for determining the localisation, the character, structure, nature, size, orientation, quality, etc., and for guiding the motion of the industrial robot during operation/processing.
In further embodiments, the robotic carcass processing system of the invention further comprises one or more of the following hardware elements:
a buffer/conveyor for receiving products (5, 7);
an industrial robot (R);
an operation tool (6) mounted on the industrial robot (R);
a support stand (4);
a 3D camera (3B, 9B);
a processing means/computer for running software (2); and
a human safety system.
In further embodiments, the robotic carcass processing system of the invention further comprises one or more of the following software elements, executed on the processing means (2):
AI/machine learning algorithm for extraction of key information from sensor input, e.g. a 2D image;
software for communication among different software components;
adaptive path planning on robot controller with input from detection system; software for data collection;
software for data preparation;
software for training model;
software for running model online;
robot pose generation from 3D data;
software for human safety control;
software for online data collection;
software for online situation recognition; and
software for online optimisation with self-learning and execution planning.
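One of the listed software elements, robot pose generation from 3D data, might be sketched as follows. This is an illustrative assumption, not the invention's implementation: a detected cutting line is given as 3D points, and each point is turned into a pose (x, y, z, yaw) with the tool oriented along the direction of travel:

```python
import math

def poses_from_cutline(points):
    """Generate one (x, y, z, yaw) pose per detected 3D cut-line point,
    with yaw following the direction to the next point."""
    poses = []
    for i, (x, y, z) in enumerate(points):
        nx, ny, _ = points[min(i + 1, len(points) - 1)]
        if (nx, ny) == (x, y):
            # Last point (or a repeated point): reuse the previous heading.
            yaw = poses[-1][3] if poses else 0.0
        else:
            yaw = math.atan2(ny - y, nx - x)
        poses.append((x, y, z, yaw))
    return poses

line = [(0.0, 0.0, 0.1), (0.1, 0.0, 0.1), (0.2, 0.1, 0.1)]
for pose in poses_from_cutline(line):
    print(pose)
```

A real system would feed such poses to the robot controller's path planner, together with tool-orientation constraints in all three rotational axes.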
The robotic workstation (RC)
The system of the invention comprises one or more robotic workstations, which may also be termed manufacturing cells or robot cells (RC1, RC2, RC3, ... RCx), in which workstations/cells the actual processing of the meat items takes place. The robot cells for use according to the invention shall be configured for operation in parallel, and independently of each other, but may also be inter-communicating with each other via the processing means (2).
The robotic carcass processing workstation (RC) of the invention may be characterised by comprising the following elements:
one or more industrial robots (R1, ... Rx), in operation with the processing means (2), each robot configured for holding, and capable of operating, a working tool (6); one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, as assessed by the processing means (2);
one or more processing tables/workbenches/support stand (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task;
an in-cell sensing means (3), configured for determining the location and/or character of the workpiece (11), comprising:
one or more in-cell sensors (3A); and/or
one or more in-cell machine vision devices (3B); and
one or more processing means (2), in operation with said industrial robots (R), and configured for processing digitalized data obtained by said sensing means (3).
In one embodiment, the robotic carcass processing workstation (RC) of the invention further comprises a means for storing information (a server/database) (10).
The robot cell of the invention may be regarded as a closed or possibly sealed environment, suited for implementation of clean-room techniques, and they should be established using cleaning-friendly materials.
Each robot cell receives material to be processed via an in-let conveyor (5in) and delivers processed material via an out-let conveyor (5out). In one embodiment, supply and delivery of the goods is implemented using one and the same conveyor (5in/out), just by reversing the transporting direction of the conveyor.
Each robotic workstation of the invention comprises one or more industrial robots (R1, R2, R3, ... Rx), one or more working tools (6), one or more processing tables/workbenches/support stands (4), and an in-cell sensing means (3); the robot(s) (R) and the in-cell sensing means (3) shall be in communication with, and receive operational guidance from, the processing means (2).
In another embodiment, the robot cell and the processes taking place herein may be monitored from outside the cell by an operator (13), optionally by use of projection mapping, or equipped with virtual reality (VR) or augmented reality (AR) equipment (14).
The workstations/cells (RC) for use according to the invention shall be configured for, and able to perform, one or more of the following tasks: - receipt of the workpiece (11) in question entering the cell via an in-let internal conveyor (5in);
- processing of the workpiece (11) in question, according to a pre-programmed schedule, according to real-time, online measurements, and/or according to a self-learning algorithm (reinforcement learning/machine learning);
- selecting a working tool (6) for mounting on the industrial robot (Rx) in question, in accordance with the intended (programmed) task;
- replacement of the working tool (6) on the industrial robot (Rx) in question as needed, and in accordance with the intended task;
- quality assessment of the workpiece in question, and providing feedback to the processing means (2);
- performing self-cleaning according to a predetermined (programmed) schedule, or according to an actual measured value; and/or
- placing the end-product (12) on an internal out-let conveyor (5out), for transport to the out-let conveyor (7C).
In a further embodiment, the workstations/cells (RC) for use according to the invention shall comprise a protective casing surrounding the robot cell (RC), in which casing a door assembly is provided in order to provide access to the workstation, via an opening, through which the station can be served by an operator.
The industrial robot (R)
The system of the invention comprises the use of one or more robots (R1, R2, R3, ... Rx), including the use of single robots, and the use of multiple robots working together.
The robot for use according to the invention may be any available automated industrial robot (or robotic manipulator), capable of being programmed, and capable of moving on two or more axes. The robot shall be in communication with, and receive operational guidance from, the processing means (2).
The industrial robot is understood to comprise manipulators, so it can be configured for holding, and capable of operating, a working tool (6). In another embodiment, the robot shall also be able to choose and change tools (6), e.g. a removable or exchangeable tool or hardware module chosen from a toolbox or tool magazine (6A).
In a further embodiment, the robot (Rx) for use according to the invention is mounted with a working tool (6), and a vision device (3Bx), in communication with the processing means (2), is mounted on the tip of the robotic arm, within operating distance of the working tool (6), allowing monitoring of the process and/or guidance of the robot. The robot for use according to the invention may be any commercially available industrial robot. Industrial robots may be classified based on their coordinate systems, i.e. based on reachable coordinates of a point on the end-effector, and include Cartesian robots (when arms of a robot move in the XYZ rectangular coordinate system),
Cylindrical robots (when arms of a robot move in one angular and two linear directions), Spherical robots (the arms move in two angular and one linear direction), SCARA robots (Selective Compliance Arm for Robotic Assembly; have two parallel revolute joints providing compliance in a selected plane), and Articulated robots (also known as the anthropomorphic robot, the robot arm has 3 revolute joints).
The working tools (6)
For performing the intended tasks, the system of the invention also comprises one or more working tools (6), that can be attached to the industrial robot (Rx) and is suited for the intended task.
The working tools (6) may be available from a toolbox or tool magazine (6A), that may comprise one or more removable or exchangeable tools or hardware modules, from which toolbox (6A) one or more robots can pick and choose the desired tool, depending on the intended task.
The working tool (6) may be introduced by use of one or more robots (R). One tool (6) may be mounted on one robot, and two or more robots may work together to solve the intended task.
In another embodiment, a multi-functional tool is mounted on a robot (R) configured for operating the multi-functional tool, and able to switch between the individual tools, as need be.
Examples of working tools frequently used in abattoirs are knives, e.g. fixed knives, rotary knives, oscillating knives, and wizard knives; saws, e.g. round saws, band saws and chain saws; or combinations hereof (i.e. multi-functional tools). Tools for use according to the invention may also include scissors, e.g. scissors having multiple blades, including blades of different sizes.
The system of the invention may also comprise tools for cleaning of the robot cell and its interior, including tools for cleaning other tools. Cleaning tools include e.g. brushes, brooms, and pressure devices.
The processing means (2)
For receiving input from, and for guiding the relevant devices used according to the invention, the system of the invention comprises one or more processing means (2), which shall include one or more central processing units (CPU), that may e.g. include a standard PC. If two or more processing means are employed, these processors may be in inter-communication with one or more of the other processing means.
For performing the necessary processing actions, the processing means for use according to the invention shall be in communication with, and/or configured for processing digitalized data obtained by one or more of the following devices:
the robot(s) (R);
the in-cell sensing means (3), i.e. the in-cell sensor (3A) and/or the in-cell machine vision device (3B);
the processing table/workbench/support stand (4);
the transporting means (5, 7);
the external sensing means (9), i.e. the external sensor (9A) and/or the external machine vision device (9B);
the conductors (8); and
the robotic workstations (RC).
The processor(s) for use according to the invention shall also be configured for making assessments and determinations, and be able to perform one or more of the following tasks:
determine the position/location, the character, structure, nature, size, orientation, quality, etc. of a workpiece (11);
guide the motion of the industrial robot (R) during operation/processing;
calculate the speed of the transporting means/conveyor (5, 7);
calculate the optimal cutting/trimming pattern in view of the specifications for the processed product/cut (12);
recognize and track the workpiece (11) while being transported from arrival (5in, 7A) to the processing table/workbench/support stand (4);
recognize and track processed materials while being transported to outside (5out) the cell to another workstation (RC), or to be delivered as the processed product (12) via the out-let conveyor (7B).
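Two of the processing-means (2) tasks above, calculating conveyor speed and tracking a workpiece (11) from arrival to the workbench, can be combined into a minimal dead-reckoning sketch. The units, speeds and the workbench distance are hypothetical assumptions for illustration only:

```python
def conveyor_speed(pos1_m, t1_s, pos2_m, t2_s):
    """Conveyor (5, 7) speed from two sensor sightings of the same workpiece."""
    return (pos2_m - pos1_m) / (t2_s - t1_s)

def predicted_position(last_seen_m, speed_m_s, elapsed_s):
    """Dead-reckoned position of the workpiece (11) along the conveyor."""
    return last_seen_m + speed_m_s * elapsed_s

def has_reached_workbench(last_seen_m, speed_m_s, elapsed_s, workbench_at_m=3.5):
    # workbench_at_m is a hypothetical distance from the sensing means
    # to the processing table (4).
    return predicted_position(last_seen_m, speed_m_s, elapsed_s) >= workbench_at_m

speed = conveyor_speed(0.0, 0.0, 1.0, 2.0)   # two sightings 1 m and 2 s apart
print(has_reached_workbench(0.0, speed, 8.0))
```

In practice the prediction would be corrected continuously by new sensor (3A/9A) or camera (3B/9B) observations rather than run open-loop.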
In one embodiment, the processing means (2) for use according to the invention shall be programmed for performing i.a. machine vision and/or machine learning (more details below).
The carcass processing system of the invention may also be in operation with an external Enterprise Resource Planning (ERP) system. ERP represents a software application that manages functional areas across the business. ERP integrates company functions such as order processing, sales, procurement, inventory management and financial systems. In another embodiment, the processing means (2) for use according to the invention shall be configured for running a local ERP application, or for communicating with an external ERP application.
The sensing means (3, 9)
For tracking incoming products, as well as processed products, the system needs to know the position/location, and possibly also the characteristics, of each item. This may be accomplished by use of sensors and/or machine vision devices.
Tracking of the products takes place inside the robotic cell, but may also take place outside the robotic cell, when dealing with starting materials and/or processed/end-products.
Sensing means for use according to the invention comprises sensors (3A, 9A), and/or machine vision devices (3B, 9B).
As defined herein, a sensor is a device, module, or subsystem, whose purpose is to detect events, or changes in its environment, and send the information to other electronics, and in particular the processing device (2) used according to the invention.
The sensor (3A) for use according to the invention may be a mechanical (electro-mechanical) sensor or an optically based sensor, e.g. a force/torque sensor, a light beam or electro-magnetic sensor.
Like sensors, machine vision devices can determine positions or locations of specific subjects, but machine vision devices also can determine e.g. the character or the structure of a given object.
Machine vision
Machine vision represents a combination of hardware and software capable of providing operational guidance to other devices based on the capture and processing of images. It usually relies on digital sensors protected inside industrial cameras, with specialised optics to acquire images, so that computer hardware and software can process, analyse and measure various characteristics for decision making, including determining the identity and the location of objects in the production cell, and performing feature extraction, e.g. by using machine learning.
A machine vision system typically comprises lighting, a camera with a lens, an image sensor, a vision processing means, and communication means. The lens captures the image and presents it to the sensor in the form of light. The sensor in a machine vision camera converts this light into a digital image, which is then sent to the processor for analysis. Lighting illuminates the part to be inspected and allows its features to stand out so they can be clearly seen by the camera. Processing may be accomplished by conventional processors, including central processing units (CPU) and/or graphics processing units (GPU), e.g. in a PC-based system or in an embedded vision system, and is performed by software and may consist of several steps. First, an image is acquired from the sensor. In some cases, pre-processing may be required to optimise the image and ensure that all the necessary features stand out. Next, the software locates the specific features, runs measurements, and, optionally, compares these to the specification. Finally, a decision is made, and the results are communicated. Machine vision systems essentially come in three main categories:
1D vision captures a digital signal one line at a time instead of looking at a whole picture at once, and is accomplished by use of a line-scan camera;
2D vision is looking at the whole picture, and may be accomplished by use of an industrial camera; and
3D vision systems comprise multiple sensors, including one or more distance sensors.
The machine vision device for use according to the invention also may include the use of e.g. X-ray and/or NMR-equipment.
Any category, or combination of categories, may be implemented in the processing system of the invention.
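The acquire, pre-process, locate, measure and decide steps described above can be illustrated with a minimal sketch. The following hypothetical Python example (not part of the disclosure; plain NumPy stands in for a real camera and vision library, and all names are illustrative) locates a bright object in a synthetic 2D image and reports its bounding box and centroid:

```python
import numpy as np

def locate_object(image, threshold=0.5):
    """Minimal 'locate features' step: segment foreground pixels and
    return their bounding box (x0, y0, x1, y1) and centroid (x, y)."""
    ys, xs = np.nonzero(image > threshold)
    if xs.size == 0:
        return None                          # nothing detected, no decision
    bbox = (xs.min(), ys.min(), xs.max(), ys.max())
    centroid = (xs.mean(), ys.mean())
    return {"bbox": bbox, "centroid": centroid}

# Acquire: a synthetic 100x100 grayscale image with one bright "workpiece"
img = np.zeros((100, 100))
img[30:60, 20:80] = 1.0
result = locate_object(img)
# result["bbox"] == (20, 30, 79, 59); result["centroid"] == (49.5, 44.5)
```

In a real installation the image would come from the camera (3B, 9B) and the result would be passed to the processing means (2); this sketch only shows the shape of the computation.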
The machine vision device (3B, 9B) for use according to the invention shall be configured for determining the identity and/or location of objects in the camera's field of view, and/or for determining the characteristics, and/or for determining the quality of the workpiece (11) to be processed, and shall be in communication with, and provide (digitalised) data to the processing means (2), which data, eventually, shall be used by a robot (R) for determining its particular motion with respect to a given workpiece to be processed.
The vision device (3B) may be configured to detect both static as well as moving objects, and determine their position in space, e.g. in order to prevent collisions.
If the incoming workpiece (11) is considered represented by a spatial distribution, the vision device for use according to the invention comprises a combination of 1D, 2D and 3D scanning devices (3B, 9B), to determine the position and dimensions of the product and calculate the angle of its surface, thus allowing optimal guidance of the robotic arm to position the working tool (6) at the right angle.
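The surface-angle calculation mentioned above is not specified in the disclosure; one standard possibility is a least-squares plane fit over a patch of scanned 3D points, sketched below as a hypothetical illustration (function name and sample data are invented):

```python
import numpy as np

def surface_normal(points):
    """Estimate the unit normal of a roughly planar patch of scanned 3D
    points by a least-squares plane fit: the normal is the singular
    vector of the centred point cloud with the smallest singular value."""
    pts = np.asarray(points, dtype=float)
    centred = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centred)
    return vt[-1]                       # already unit length

# A patch of points lying exactly in the z = 0 plane
patch = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
n = surface_normal(patch)               # ±(0, 0, 1) for a flat patch
# Tool tilt relative to vertical, in degrees (0 for this flat patch):
angle = np.degrees(np.arccos(np.clip(abs(n[2]), -1.0, 1.0)))
```

The resulting normal could then be used by the processing means (2) to orient the working tool (6) perpendicular to, or at a chosen angle against, the surface of the workpiece.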
The machine vision hardware components for use according to the invention, such as sensors and processors, are commercially available, and machine vision systems can be assembled from single components, or purchased as an integrated system, with all components in a single device. The 3D scanning devices (3B, 9B) for use according to the invention may be any commercially available 3D scanner or range camera, such as a time-of-flight camera, structured-light camera, or stereo camera, e.g. a Sick 3D ruler or Microsoft Kinect.
The transporting means (5, 7)
For transporting of the meat items from in-put to out-put, and for inter-cell transport, including for use as a buffer area for intermediate storage of products inside the production cell, the carcass processing system of the invention must comprise one or more transporting means (5, 7), and these transporting means shall be in communication with, and receive guidance from, the processing means (2).
Transporting according to the invention is for commodity supply and for distributing of the workpiece (11) to one of the robotic work stations (RC1, ... RCx), and/or for redistributing processed material/intermediate products to another of the available robot cells (RC), and/or for transport of processed material/end-product (12) to the out-let conveyor (7B).
Transport may take place by use of a conveyor belt, a conveyor/overhead rail, a lift, or an automated guided vehicle (AGV).
The processing table/workbench/support stand (4)
For optimal support during processing, the system of the invention includes one or more processing tables/workbenches/support/fixation stands (4), onto which the workpiece (11) can be placed.
In contrast to conventional abattoir processes, which are performed while the carcasses are hanging on an overhead rail, processing on tables/workbenches/support stands allows for maximal support of the meat item during processing.
The use of workbenches may involve a transition from a hanging position to a lay-down position, which transition may be accomplished by use of a lay-down mechanism.
The external conductor (8)
The system of the invention also comprises one or more conductors (8). As defined herein, a conductor represents equipment that can redistribute the incoming meat items to other conveyors. For fulfilling this task, the conductor (8) shall be in communication with, and receive operational guidance from, the processing means (2).
In the system of the invention, the conductor (8) is configured for allocating each identified starting workpiece (11) from the in-let conveyor (7A) to a robotic work station (RC), or for allocating processed material (12), returning from one robotic work station, to another robotic work station for further processing, or for allocating the end-products to the out-let conveyor (7B).
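The conductor's allocation decision can be sketched as a small routing function. The example below is hypothetical: the disclosure does not prescribe a routing policy, so a least-loaded heuristic is assumed, and the names (`allocate`, `outlet-7B`, the cell-load dictionary) are illustrative only:

```python
def allocate(item, cell_load, outlet="outlet-7B"):
    """Route one identified item, in the spirit of the conductor (8):
    finished items go to the out-let conveyor; unfinished items go to
    the least-loaded robot cell (an assumed heuristic, not from the
    disclosure)."""
    if not item["remaining_steps"]:
        return outlet
    return min(cell_load, key=cell_load.get)

cell_load = {"RC1": 2, "RC2": 0, "RC3": 1}            # queued items per robot cell
allocate({"remaining_steps": ["debone"]}, cell_load)  # → "RC2" (least loaded)
allocate({"remaining_steps": []}, cell_load)          # → "outlet-7B"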
The operator (13)
Moreover, the entire production may be monitored, and possibly corrected, from outside the closed environment of the robot cell (RC) by an operator (13).
Monitoring of the carcass processing may be accomplished by visual inspection, or by use of projection mapping, or by use of virtual reality (VR) or augmented reality (AR) equipment (14).
Machine learning
Machine learning is an application of artificial intelligence (AI) which uses statistical techniques to perform automated decision-making, and optionally improve performance on specific tasks based on experience, without being explicitly programmed.
Several approaches to machine learning exist, e.g. supervised learning, reinforcement learning, imitation learning and unsupervised learning.
In supervised learning, for example, the computer is presented with example inputs and their desired outputs, given by the supervisor, and the goal is to learn a general rule that maps inputs to outputs. Using large sets of reference data, covering different product types, along with the desired output for each element in the data sets, a supervised learning algorithm is trained to learn such a general rule.
As data is building up, the precision and robustness of the system increases, but occasionally, e.g. caused by quality errors or recognition errors, manual assistance may be needed, for additional training of the system.
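Such a learned input-to-output rule can be illustrated with a deliberately simple supervised learner. The sketch below is hypothetical (the disclosure does not name an algorithm; the product types, feature values and class names are invented): it "trains" by storing one mean feature vector per product type, then maps new inputs to the nearest stored type.

```python
import numpy as np

class NearestCentroid:
    """A deliberately simple supervised learner: the 'general rule' it
    learns is one mean feature vector (centroid) per product type."""

    def fit(self, X, y):
        self.labels = sorted(set(y))
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.centroids = np.array(
            [X[y == label].mean(axis=0) for label in self.labels])
        return self

    def predict(self, x):
        # Classify a new item by its nearest class centroid.
        d = np.linalg.norm(self.centroids - np.asarray(x, dtype=float), axis=1)
        return self.labels[int(np.argmin(d))]

# Invented reference data: (weight in kg, length in cm) per product type
X = [[8, 30], [9, 32], [25, 60], [27, 63]]
y = ["loin", "loin", "ham", "ham"]
clf = NearestCentroid().fit(X, y)
clf.predict([26, 61])   # → "ham"
```

As more reference data accumulates, the stored centroids (or, in a real system, a far richer model) become more representative, which is the sense in which precision and robustness increase with data.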
After receipt of the digitalized data from the machine vision device (3B, 9B), information about the product type can be determined, and, e.g. by reference to a product catalogue also stored on the server (10), a product ID can be allocated to each workpiece/mixture of workpieces (11), and transmitted to the server (10) for further action/use.
In one embodiment, information about the product type or product ID is used for specifying the destination of the meat product(s) (12), e.g. a specific place in the packing room.
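The catalogue lookup described above amounts to a simple mapping from recognised product type to product ID and destination. The sketch below is purely illustrative: the catalogue entries, IDs and destination names are invented, and a real catalogue would live on the server (10):

```python
# Hypothetical product catalogue, standing in for the one on the server (10)
PRODUCT_CATALOGUE = {
    "loin": {"product_id": "P-001", "destination": "packing-room-A"},
    "ham":  {"product_id": "P-002", "destination": "packing-room-B"},
}

def assign_product_id(product_type):
    """Look up the product ID and destination for a recognised type."""
    entry = PRODUCT_CATALOGUE[product_type]
    return entry["product_id"], entry["destination"]

assign_product_id("ham")   # → ("P-002", "packing-room-B")
```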
Safety system
In a further embodiment, the system of the invention may include a safety system to protect human operators from harm.
Safety in human-robot collaborative manufacturing can be ensured by various types of safety systems, including guard-free workspaces and fenceless robotic systems, that come in many forms, shapes and sizes. Recently, active vision-based safety systems have gained momentum due to their affordable price (e.g. off-the-shelf RGB or depth cameras and projectors), flexible installation and easy tailoring.
The method of the invention
In another aspect, the invention provides a method for automatic processing of carcasses.
The method of the invention for automatic processing of carcasses may be characterised by comprising the subsequent steps of:
(a) subjecting the incoming workpiece (11) to analysis for identification, and for a decision on the further processing, using the external sensing means (9);
(b) distribution of the analysed workpiece (11) to a robotic workstation (RC);
(c1) in the robotic work station (RC), subjecting the incoming workpiece (11) to in-cell analysis by use of the in-cell sensing means (3), and processing of the workpiece (11) according to a predetermined schedule, or according to an actual measured value, and, optionally, by use of a machine learning methodology applied to the data obtained by the in-cell sensing means (3);
(c2) optionally, redistribution of a processed workpiece, processed according to step (c1), to another robotic workstation (RC) for continued or further processing; and
(d) transporting the processed material (12) to the out-let conveyor (7B) for final processing.
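The flow of steps (a)-(d) can be sketched as a small orchestration function. The code below is a hypothetical illustration only (function and step names invented): each entry in `cell_steps` stands for the processing done in one robot cell (step (c1)), and moving to the next entry models the optional redistribution of step (c2).

```python
def process_carcass(workpiece, cell_steps):
    """Sketch of method steps (a)-(d) as a simple event log."""
    product_type = workpiece["type"]                 # step (a): external analysis (9)
    log = [f"distribute {product_type} to RC1"]      # step (b): conductor (8) + conveyor
    for i, step in enumerate(cell_steps, start=1):   # steps (c1)/(c2): in-cell work
        log.append(f"RC{i}: {step}")
    log.append("transport to out-let conveyor 7B")   # step (d): final processing
    return log

log = process_carcass({"type": "half-carcass"}, ["split", "debone"])
# log[1] == "RC1: split"; log[2] == "RC2: debone"
```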
The analysis carried out according to step (a) is performed using an (external) sensing means (9), and in particular an (external) sensor (9A), and/or an (external) machine vision device (9B), in communication with the processing means (2). While a sensor (9A) may help locate and track the incoming workpiece (11), the vision device (9B) may help provide additional information about the incoming product.
With the help of the processing means (2), the location and identity of the product are determined, and a decision on the further processing is calculated according to a predetermined (programmed or self-taught) schedule, or according to actual measured values, i.e. information obtained using the sensor (9A) or the vision device (9B).
Based on the information obtained in step (a), the incoming workpiece (11) is distributed to a robotic workstation (RC) for processing. This transport is accomplished using a conductor (8) in collaboration with an (in-let) conveyor (5in). The conductor (8) ensures that the product is directed onto the correct in-let conveyor, as determined by the processing means (2).
Having arrived at the robot cell (RC), the actual processing of the product takes place. First, the workpiece (11) should be placed in a supported position, i.e. on a processing table, workbench, or a support stand (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task. Transition from the in-let conveyor (5in) to the processing table/stand (4) may optionally be accomplished using a "lay-down mechanism", frequently used in abattoirs.
Next, for determining the location and/or character of the incoming workpiece (11), this is being analysed using the in-cell sensing means (3).
In one embodiment, this analysis takes place using one or more in-cell sensors (3A), in communication with the processor (2).
The sensor (3A) for use according to the invention may be a mechanical (electro-mechanical) sensor or an optically based sensor, e.g. a force/torque sensor, a light beam or electro-magnetic sensor.
In another embodiment, analysis takes place using an in-cell machine vision devices (3B), in communication with the processor (2).
The machine vision device (3B) for use according to the invention shall be configured for locating the position, and/or for determining the characteristics, and/or for determining the quality of the workpiece (11) to be processed, and shall be in communication with, and provide (digitalised) data to, the processing means (2), which data, eventually, shall be used by a robot (R) for determining the particular motion of the industrial robot with respect to a given workpiece to be processed.
Having examined the workpiece (11), a robot (R), guided by the processing means (2), and already mounted with a working tool (6), or after having picked a working tool from the toolbox (6A), starts working on the workpiece (11), properly supported on a processing table, workbench, or a support stand (4). If needed, the robot (R) may change working tool for completion of its task.
Processing may be accomplished by use of one robot (R) only, or using multiple robots working together.
Processing of the workpiece (11) may be carried out according to a
predetermined schedule, or according to an actual, measured value. A scheduled programme may be stored on a means for storing information (server/database) (10), from which the processor (2) may get the necessary information.
In another embodiment, processing of the workpiece (11) may be carried out according to a self-taught schedule, applying the machine learning techniques described above.
Having completed the intended processing of the workpiece (11), whether completed by use of a single robot (R), or by use of two or more robots, each equipped with one working tool (6), or one or more equipped with a multi-functional tool, it may be necessary to subject the workpiece (11) to another processing, optionally to take place in another robot cell (RC), as determined and guided by the processing means (2), or, optionally, as determined and guided by the operator (13), who is optionally equipped with a virtual reality (VR) or augmented reality (AR) device (14).
In case the workpiece (11) shall be further processed, it may be redistributed to another robotic workstation (RC) for further processing (i.e. step (c2)). This may be accomplished via an out-let conveyor (5out), optionally via the conductor (8), and an in-let conveyor leading to the other robot cell (RC). For practical reasons, the in-let conveyor may turn into the out-let conveyor, simply by reversing its transport direction.
After arrival at another robot cell (RC), the process of step (c1) may be repeated.
After finishing the workpiece (11), the end-product (12) is transported out of the robot cell (RC), via the out-let conveyor (5out), via the conductor (8), and via the out-let conveyor (7B), for final processing. Final processing/finishing may include packaging, shipping, etc.
Any combination of two or more of the embodiments described herein is considered within the scope of the present invention.
List of reference signs
In the figures, identical structures, elements or parts that appear in more than one figure are generally labelled with the same numeral in all the figures in which they appear.
1 A robotic carcass processing system
2 Processing means/CPU
3 In-cell sensing means
3A In-cell sensor
3B In-cell machine vision device/camera
4 Workbench/processing table/support stand/fixation stand
5 In-cell conveyors
5 in In-let conveyor
5out Out-let conveyor
6 Working tool/end of arm tool
6A Toolbox
7 External conveyors
7A In-let conveyor
7B Out-let conveyor
8 Conductor
9 External sensing means
9A External sensor
9B External machine vision device/camera
10 Means for storing information (server/data base)
11 Workpiece/starting material
12 Processed material/end-product
13 Operator
14 Virtual reality (VR)/augmented reality (AR) equipment
15 Safety-fence
RC Robotic workstation/production cell (Robot Cell)
R Industrial robot/manipulator

Claims

1. A robotic carcass processing system (1), which system comprises:
one or more robotic workstations (RC1, RC2, ... RCx), configured for operation in parallel, and independently of each other, each of which robotic workstations comprises:
one or more industrial robots (R1, R2, ... Rx), in operation with the processing means (2), each robot capable of holding, and configured for operating, a working tool (6);
one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, as assessed by the processing means (2);
one or more workbenches/processing tables/support stands (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task;
an in-cell sensing means (3), configured for determining the location and/or character of the workpiece (11), comprising:
one or more in-cell sensors (3A); and/or
one or more in-cell machine vision devices (3B);
one or more processing means (2), in operation with said industrial robots (R), and configured for processing digitalized data obtained by said sensing means (3), and, optionally, configured for applying machine learning methodologies to the digitalized data obtained by said sensing means (3).
2. The robotic carcass processing system of claim 1, which system further comprises:
one or more transporting means (5, 7), in operation with the processing means (2), for commodity supply and configured for distributing of the workpiece (11) to one of the two or more robotic work stations (RC1, RC2, RC3, ... RCx), or for redistributing processed material/intermediate products to another of the available robot cells (RC), or for transport of processed material (12) to the out-let conveyor (7B);
an external sensing means (9), installed at, or within operating distance of, the in-let conveyor (7A), which sensing means is in operation with the processing means (2), configured for determining the location/position and/or character of the workpiece (11), which sensing means (9) comprises:
one or more sensors (9A); and/or
one or more machine vision devices (9B);
one or more conductors (8), in operation with the processing means (2), configured for allocating each identified workpiece (11) from the in-let conveyor (7A) to a robotic work station (RC), or for allocating processed material (12), returning from one robotic work station, to another robotic work station for continued or further processing, or for allocating processed materials (12) to the out-let conveyor (7B);
one or more processing means (2), in operation with, and configured for processing digitalized data obtained by, said in-let conveyor (7A), said in-cell and/or external sensing means (3, 9), said conductors (8), said conveyors (5, 7), said robotic work stations (RC), said industrial robots (R), said out-let conveyor (7B), for determining the localisation, the character, structure, nature, size, orientation, quality, etc., and for guiding the motion of the industrial robot during operation/processing.
3. The robotic carcass processing system of either one of claims 1-2, which system further comprises a means for storing information (a server/data base) (10), in communication with the processing means (2).
4. A robotic carcass processing workstation (RC), which workstation comprises:
one or more industrial robots (R1, ... Rx), in operation with the processing means (2), each robot configured for holding, and capable of operating, a working tool (6);
one or more working tools (6), for mounting on said industrial robot (Rx), and suited for the intended task, as assessed by the processing means (2);
one or more workbenches/processing tables/support stands (4), onto which the workpiece (11) can be placed for optimal support during processing and conclusion of the intended task;
an in-cell sensing means (3), configured for determining the location and/or character of the workpiece (11), comprising:
one or more in-cell sensors (3A); and/or
one or more in-cell machine vision devices (3B);
one or more processing means (2), in operation with said industrial robots (R), and configured for processing digitalized data obtained by said sensing means (3).
5. The robotic carcass processing workstation (RC) of claim 4, which workstation further comprises a means for storing information (a server/data base) (10).
6. A method for automatic processing of carcasses, which method comprises the subsequent steps of:
(a) subjecting the incoming workpiece (11) to analysis for identification, and for a decision on the further processing, using the external sensing means (9);
(b) distribution of the analysed workpiece (11) to a robotic workstation (RC);
(c1) in the robotic work station (RC), subjecting the incoming workpiece (11) to in-cell analysis by use of the in-cell sensing means (3), and processing of the workpiece (11) according to a predetermined schedule, or according to an actual measured value, and, optionally, by use of a machine learning methodology applied to the data obtained by the in-cell sensing means (3);
(c2) optionally, redistribution of a processed workpiece, processed according to step (c1), to another robotic workstation (RC) for continued or further processing; and
(d) transporting the processed material (12) to the out-let conveyor (7B) for final processing.
7. The method of claim 6, wherein the analysis of step (a) is performed using an (external) sensing means (9), and in particular an (external) sensor (9A), and/or an (external) machine vision device (9B), in communication with the processing means (2).
8. The robotic carcass processing workstation (RC) of either one of claims 4-5, for use in the robotic carcass processing system (1) of claims 1-3.
9. The robotic carcass processing system (1) of claims 1-3, for use in the method for automatic processing of carcasses of claim 6.
10. The robotic carcass processing workstation (RC) of either one of claims 4-5, for use in the method for automatic processing of carcasses of claim 6.
PCT/EP2019/085048 2018-12-17 2019-12-13 Cellular meat production WO2020126890A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP19831624.2A EP3897162A1 (en) 2018-12-17 2019-12-13 Cellular meat production

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DKPA201801006 2018-12-17
DKPA201801006A DK180199B1 (en) 2018-12-17 2018-12-17 Cellular meat production

Publications (1)

Publication Number Publication Date
WO2020126890A1 true WO2020126890A1 (en) 2020-06-25

Family

ID=69104360

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/085048 WO2020126890A1 (en) 2018-12-17 2019-12-13 Cellular meat production

Country Status (3)

Country Link
EP (1) EP3897162A1 (en)
DK (1) DK180199B1 (en)
WO (1) WO2020126890A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022152638A1 (en) * 2021-01-12 2022-07-21 Teknologisk Institut Robotic meat packing plant and method of operating a meat packing plant
WO2023280606A1 (en) * 2021-07-09 2023-01-12 Teknologisk Institut A method for digital process monitoring in a slaughterhouse
WO2023052227A1 (en) 2021-09-28 2023-04-06 Teknologisk Institut A robotic carcass processing system for use in a slaughterhouse

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050186896A1 (en) * 2002-03-18 2005-08-25 Scanvaegt International A/S Method and system for monitoring the processing of items
WO2006085744A1 (en) 2004-10-25 2006-08-17 De Meerpaal B.V. Robot cell and method for changing and storing elements in a robot cell
WO2010114397A1 (en) * 2009-04-03 2010-10-07 Robotic Technologies Limited Carcass cutting methods and apparatus
WO2015168511A2 (en) 2014-05-01 2015-11-05 Jarvis Products Corporation Robotic carcass processing method and system
US20180116234A1 (en) * 2016-10-28 2018-05-03 Jarvis Products Corporation Beef splitting method and system
US20180153179A1 (en) * 2016-10-28 2018-06-07 Jarvis Products Corporation Beef splitting method and system
WO2018167089A1 (en) * 2017-03-13 2018-09-20 Carometec A/S 3d imaging system and method of imaging carcasses
WO2019081663A1 (en) 2017-10-27 2019-05-02 Creaholic S.A. Production cell



Also Published As

Publication number Publication date
EP3897162A1 (en) 2021-10-27
DK180199B1 (en) 2020-08-13
DK201801006A1 (en) 2020-07-31

Similar Documents

Publication Publication Date Title
US11667030B2 (en) Machining station, workpiece holding system, and method of machining a workpiece
WO2020126890A1 (en) Cellular meat production
US11701777B2 (en) Adaptive grasp planning for bin picking
Yang et al. Collaborative mobile industrial manipulator: a review of system architecture and applications
WO2020231319A1 (en) Robot cell setup system and process
Boschetti A picking strategy for circular conveyor tracking
WO2021185805A2 (en) A relocatable robotic system for production facilities
CN110914021A (en) Operating device with an operating device for carrying out at least one work step, and method and computer program
Gusan et al. Industrial robots versus collaborative robots-The place and role in nonconventional technologies
Wu et al. Application of visual servoing for grasping and placing operation in slaughterhouse
Bard An assessment of industrial robots: Capabilities, economics, and impacts
US11370124B2 (en) Method and system for object tracking in robotic vision guidance
US11953888B2 (en) Production cell
DK202100195A1 (en) A relocatable robotic system for production facilities
Saukkoriipi Design and implementation of robot skill programming and control
EP4137780A1 (en) Autonomous measuring robot system
Groen et al. Multi-sensor robot assembly station
US11660757B2 (en) Robot control system simultaneously performing workpiece selection and robot task
Bartoš et al. Conceptual Design and Simulation of Cable-driven Parallel Robot for Inspection and Monitoring Tasks
WO2024038323A1 (en) Item manipulation system and methods
Shen et al. Considerations on deploying a model-based safety system into human-robot co-operation
SK512019A3 (en) Method and device for automatic calibration of an industrial robot workplace
Mair et al. Computer Integrated Manufacture

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19831624

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019831624

Country of ref document: EP

Effective date: 20210719