AU2023200298B1 - A robotic system for processing an animal body part - Google Patents

A robotic system for processing an animal body part

Info

Publication number
AU2023200298B1
Authority
AU
Australia
Prior art keywords
remote
orientation
robotic
body part
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2023200298A
Inventor
Connor Eatwell
Anthony Fitzpatrick
Tom Glover
Chris Hopkins
Jamie Spyker
Barbara Webster
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mimeo Industrial Ltd
Original Assignee
Mimeo Industrial Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mimeo Industrial Ltd filed Critical Mimeo Industrial Ltd
Publication of AU2023200298B1
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A22 BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B SLAUGHTERING
    • A22B5/00 Accessories for use during or after slaughtering
    • A22B5/0017 Apparatus for cutting, dividing or deboning carcasses
    • A22B5/0041 Electronic, robotic or computer assisted cutting, dividing or deboning carcasses
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/0045 Manipulators used in the food industry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • A HUMAN NECESSITIES
    • A22 BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22C PROCESSING MEAT, POULTRY, OR FISH
    • A22C17/00 Other devices for processing meat or bones
    • A22C17/02 Apparatus for holding meat or bones while cutting
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1684 Tracking a line or surface by means of sensors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Abstract

A robotic system (100) for processing a body part of an animal is provided. The robotic system (100) comprises a retaining unit (10) for holding the body part (M), a device (20) for processing the body part (M), an object tracking system (30) for tracking a position and/or orientation of a remote object (31) representing the body part (M), and a robot (40) having at least one position and/or orientation controllable robotic member (41) attached to at least one of the retaining unit (10) and the device (20) for controlling the position and/or orientation thereof. The robotic system (100) further comprises a controller (50) arranged to control (53) the position and/or orientation of the robotic member (41) based on the tracked position and/or orientation of the remote object (31). Figure to be published with Abstract: Figure 1

Description

A ROBOTIC SYSTEM FOR PROCESSING AN ANIMAL BODY PART
TECHNICAL FIELD
The present disclosure relates in general to a robotic system for processing an animal body part. More particularly, the present disclosure relates to a robotic system using a remote object tracking system for cutting the animal body part.
BACKGROUND
Traditionally, meat processing, e.g. cutting, has generally been conducted by butchers using knives and saws, e.g. band saws. More recently, semi-automatic or fully automatic robotic systems have been developed to minimise the risk of injury to the butcher as well as allowing for increased productivity.
While some recent robotic systems may be said to be fully automatic, most current systems are semi-automatic, meaning they require some operator input. While some are capable of butchering carcasses into selected cuts of meat along a conveyor assembly line, current systems are not accurate enough to replace the fine motor skills of an experienced butcher. Hence, it is not possible using existing automatic systems to make precise fine cuts at a level meeting or exceeding that of an experienced butcher. Further, fully automatic systems are very expensive as they have to cope with the natural variability of animals. On the other hand, semi-automatic systems, making use of the skills of an experienced operator, offer an effective approach for managing the natural variability in animals during butchering.
Thus, an improved system for processing meat would be advantageous.
SUMMARY
The present disclosure relates to a robotic system and an associated remote control system allowing an operator to control the processing from a location remote to the location where the meat is cut, trimmed or packaged, and which overcomes problems or provides the public with a useful choice.
According to a first aspect a robotic system for processing an animal body part is provided. The robotic system comprises a retaining unit for holding the body part, a device for processing the body part, an object tracking system for tracking a position and/or orientation of a remote object representing the body part, and a robot having at least one position and/or orientation controllable robotic member attached to at least one of the retaining unit and the device for controlling the position and/or orientation thereof. Further, the robotic system comprises a controller arranged to: access the tracked position and/or orientation of the remote object, identify a mapped corresponding position and/or orientation of the robotic member based on the tracked position and/or orientation of the remote object, and control the position and/or orientation of the robotic member based on the mapped corresponding position and/or orientation, thereby controlling the relative position and/or orientation of the retaining unit in relation to the device and/or the relative position and/or orientation of the device in relation to the retaining unit.
According to a second aspect an animal processing assembly line is provided. The animal processing assembly line comprises the robotic system as disclosed with reference to the embodiments disclosed herein, and at least one of a first transfer unit arranged upstream of the robotic system for transporting the body part to be held by the retaining unit, and a second transfer unit arranged downstream of the robotic system for transporting the processed body part to a packing station.
BRIEF DESCRIPTION OF THE DRAWINGS
A number of embodiments will now be shown, by way of example, with reference to the following drawings, in which:
Figure 1 illustrates a robotic system according to an embodiment;
Figure 2 is a top view of a robotic system according to an embodiment;
Figure 3 illustrates an isometric view of a processing device and a robot of the robotic system according to an embodiment;
Figure 4 illustrates a diagram showing the calibration mapping between the tracked remote object and the robot according to an embodiment;
Figure 5 illustrates an operator operating the robotic system according to an embodiment;
Figure 6 illustrates the robotic system comprising a robotic member connected to a retaining unit retaining a piece of meat, a processing device, and a number of cuts of meat according to an embodiment;
Figure 7 illustrates a flow diagram of a controller's processes according to an embodiment; and
Figure 8 illustrates a flow diagram of a controller's processes according to an embodiment.
DETAILED DESCRIPTION
The present disclosure relates to a robotic system for processing a body part of an animal. The robotic system is at least partly controlled by an operator. The body part may e.g. relate to a piece of meat and the associated processing may relate to cutting, trimming or packaging the piece of meat. Alternatively, the body part may pertain to a pelted body part of an animal and the associated processing may relate to de-pelting the body part. While the term "body part" is used generically throughout the present specification, it should be appreciated that "body part" may relate to either a part of an animal carcass, or the whole carcass of the animal in the embodiments disclosed herein.
Figure 1 illustrates a robotic system 100 for processing a body part M of an animal according to an embodiment. The robotic system 100 comprises a retaining unit 10 for holding the body part M. The robotic system 100 further comprises a processing device 20 for processing the body part M. An object tracking system 30 is provided for tracking a position and/or orientation of a remote object 31 representing the body part M. Further, the robotic system 100 comprises a robot 40 having at least one position and/or orientation controllable robotic member 41 attached to at least one of the retaining unit 10 and the processing device 20 for controlling the position and/or orientation thereof.
The robotic system 100 may further comprise a controller 50. The controller 50 may include microprocessor(s) and a memory to execute software instructions optionally stored thereon. The controller 50 is configured to access 51 the tracked position and/or orientation of the remote object 31. The controller 50 may further be configured to identify 52 a mapped corresponding position and/or orientation of the robotic member 41 that is based on the tracked position and/or orientation of the remote object 31. Moreover, the controller 50 is configured to control 53 the position and/or orientation of the robotic member 41 based on the mapped corresponding position and/or orientation of the robotic member 41. By controlling the position and/or orientation of the robotic member 41, the relative position and/or orientation of the retaining unit 10 in relation to the processing device 20 and/or the relative position and/or orientation of the processing device 20 in relation to the retaining unit 10 may be controlled.
In some embodiments the robotic member 41 may be rigidly attached to the retaining unit 10. When the robotic member 41 is rigidly attached to the retaining unit 10, controlling the position and/or orientation of the robotic member 41, via the controller 50, will also control the position and/or orientation of the retaining unit 10. The position and/or orientation of the retaining unit 10 may thus be controlled in relation to other components of the robotic system 100, e.g. the processing device 20.
In some embodiments the robotic member 41 may be rigidly attached to the processing device 20. When the robotic member 41 is rigidly attached to the processing device 20, controlling the position and/or orientation of the robotic member 41, via the controller 50, will also control the position and/or orientation of the processing device 20. The position and/or orientation of the processing device 20 may thus be controlled in relation to other components of the robotic system 100, e.g. the retaining unit 10.
Based on information provided by the object tracking system 30 that is tracking the position and/or orientation of the remote object 31, the robotic member 41, via the controller, shadows or mimics the movement and/or orientation of the remote object 31.
According to an embodiment, the robotic system 100 is designed for cutting a piece of meat. In such an application, the body part M pertains to a piece of meat. The robotic system 100 is not limited to any particular type of meat. The robotic system 100 may be calibrated to cut any type or piece of meat. For example, the robotic system 100 may be used to cut up chops from mid loin, e.g. in straight cuts at fixed width or at a fixed scanned volume (with variable width depending on scanned meat volume). The piece of meat M may for example be scanned using a laser scanner (not shown) operatively connected to and/or in communication with the controller 50.
The robotic system 100 may also be configured to cut meat according to a weight-based and/or other specification. For example, an unprocessed, not-yet-cut portion of meat - which may be relatively larger than a cut of meat - may be scanned, examined, weighed and the like, so that an optimal cutting plan to achieve one or more particular specifications for each cut (such as weight and/or size per cut - e.g. 200g per cut) can be put in place. The cutting plan may be calculated automatically by the robotic system 100 or another system, or manually by an operator, or a combination thereof.
According to an embodiment, the robotic system 100 disclosed herein may be configured for a slaughter floor application, wherein typically the carcass will be held stationary and the processing device 20 is rigidly attached to the robotic member 41. The robotic member 41 may use external support structures and/or equipment to align and/or support the mass of the carcass during processing. Further, the processing device 20 may be used to remove hocks from the carcass using external or embedded tooling.
In an alternative embodiment, the processing device 20 may be configured to grip the pelt of the carcass for an external device to remove, or the same robotic member 41 may remove the pelt.
In an embodiment the robotic system 100 may be configured for a boning room application, wherein each retaining unit 10 moves a carcass through existing processing devices 20 to obtain different portions of the carcass, which may then be passed to further/other robotic systems 100 for further breakdown of the carcass.
The robotic system 100 may comprise one or more processing devices 20 in order to perform multiple operations within the same robotic system 100. Further, the processing devices 20 may be releasably connected to the robotic system 100 such that they are mobile and operator selectable. This enables the operator to perform multiple operations by simply replacing or substituting the processing device 20 that is releasably connected to the robotic system 100. Further, the retaining unit 10 may also be releasably connected to the robotic member 41 such that the retaining unit 10 may be replaced by another alternate retaining unit 10.
In some embodiments, the processing device 20 is configured to remove the chine bone of a carcass. Alternatively, or additionally, the processing device 20 may be configured to halve bones to make canoes. Alternatively, or additionally, the processing device 20 may be configured to cut the entire carcass into portions for smaller sub-systems.
In an embodiment, the robotic system 100 may be configured for a packing application, wherein processed body parts, e.g. cuts of meat, may be individually positioned and oriented by the retaining unit 10 or the processing device 20, whichever is connected to the robotic member 41, as the operator moves and orients the remote object 31. In some embodiments, the object tracking system 30 may track the position and/or orientation of each processed body part, and output an ideal position and/or orientation to be displayed by the remote display unit 80. The ideal position and/or orientation may be based at least partly on a known packaging size, and/or the position and/or orientation of previous processed body parts. The operator may position and orient each processed body part via the remote object 31. The object tracking system 30 may be arranged to calculate an ideal position and/or orientation of a following processed body part once a former processed body part has been positioned and/or oriented by the operator, or is located in a certain region of interest within the local coordinate system.
For a meat cutting application, the processing device 20 may comprise a saw, e.g. a band saw, blade, or knife for cutting the piece of meat M. In some embodiments, the saw, blade, or knife 20 is attached to the robotic member 41, whereby the retaining unit 10 holding the piece of meat is arranged separate from the robotic member 41. Alternatively, the saw, e.g. band saw, knife or blade may be arranged separate from the robotic member 41, whereby the retaining unit 10 holding the piece of meat may be attached to the robotic member 41. This embodiment may include using a knife to trim fat from a piece of meat.
In the example of the robotic system 100 shown with reference to Figure 1, the robot 40 is situated adjacent to a bandsaw, forming the processing device 20, and the meat to be cut is held by the robotic member 41 using a custom-built retaining unit 10, acting as an end-effector. The position of the retaining unit 10 is controlled by an operator who is situated at an operator station 60 out of reach of the robot 40 and bandsaw 20. When the operator moves a tracked remote object 31 on a 2D plane formed by a tracking table 61 within the operator's workspace, the controller 50 controls the retaining unit 10 so as to "shadow" or follow the motion of the remote object 31 in real-time, which allows the operator to manipulate the animal body part comprising a piece of meat to be cut by the bandsaw. An isometric top view of the robotic system 100 of Figure 1 is shown in Figure 2.
The controller 50 of the robotic system 100 may be arranged to scale the motion of the tracked remote object such that the robot 40 movements are enlarged or reduced by a specified scale factor. In some embodiments, the controller 50 is arranged to limit transverse motion when the piece of meat is engaging and/or moving through the processing device 20.
In some embodiments, the controller 50 is arranged to output feedback to the operator. The feedback may be displayed on a display unit 80 or by the touch screen of a tracking table 61. The feedback may optionally comprise information associated with the current width/thickness of the cut. The cut information (e.g. width/thickness) may be derived from the difference between the robot x-coordinate (i.e. the coordinate direction perpendicular to the cutting direction) from the previous cut and the robot local x-coordinate of the current cut.
For example, if a first cut at a first local x coordinate (representing 30 mm) is made, followed by a cut at a second local robot x coordinate (representing 50 mm), the width of the current cut may be calculated as 50 mm - 30 mm = 20 mm. In some embodiments, the cut information may additionally or alternatively include other coordinates. This can include z-axis coordinates, wherein the associated z-axis may extend in a substantially vertical direction. The z-axis coordinates may thus provide information regarding the height of the coordinate from a reference point (e.g. the bench/table of the processing device 20). This essentially provides sufficient cut information for the operator to move the remote object in three dimensions to control the position and/or orientation of the robotic member 41 in 3D. Alternatively, the object tracking system may be configured to measure the width/thickness of each cut using a known scaling factor converting the local coordinate system into e.g. millimetres (mm).
The robotic system according to some embodiments is capable of a cutting accuracy of ±5 mm, or ±2 mm, or ±1 mm, or ±0.5 mm at a speed comparable to or faster than an experienced butcher. As a non-limiting example, the robot 40 may e.g. be a KUKA KR10 R1100-2 robot. The controller 50 may be a KRC5 microcontroller, running KSS 8.7 robot control software.
In some embodiments, the robot 40 is arranged with a member, e.g. a base, that is rigidly attached to a corresponding member, i.e. base, of the processing device 20. When the processing device 20 is a bandsaw, the robot 40 may be arranged so that its x and y-axes in the local coordinate system are aligned with the plane of the bandsaw table such that moving the robot 40 in the x direction will change the thickness of a cut and moving the robot 40 in the y direction will perform a cut. For a two-dimensional coordinate system embodiment, the height above the bandsaw table (the global z-axis) may be held constant. Furthermore, movement limits or motion restrictions, accessed by the controller 50, may be established for any one or more of the robot x-, y- and z-axes so that the robot 40 has sufficient range of motion to perform the required processing actions, e.g. cutting or de-pelting, whilst not allowing the retaining unit 10 or any part of the robot 40 to contact the processing device 20. Consequently, the movement limits may form the position boundary of the local coordinate system, e.g. as shown with reference to Figure 4. In the embodiment shown with reference to Figures 1 and 2, the position of the bandsaw blade in the robot's global x and y coordinates is known and does not change as it is stationary.
In an embodiment, the retaining unit 10 comprises a gripping member for securably and/or controllably holding the body part. The gripping member may comprise one or more clamps to hold the animal body part M. Alternatively, the clamps may comprise swivel heads arranged to screw down onto the body part M and adjust to shape. In an embodiment, the retaining unit 10 or processing device 20 comprises a "Jaws of life" style cutter or gripper, or optionally a reverse "Jaws of life" style gripper for getting between joints and pulling things apart.
The object tracking system 30 is provided for tracking a position and/or orientation of the remote object 31 representing the body part M. The remote object 31 has features or fiducial markers designed to allow for said tracking. In an embodiment, the remote object 31 is a feature tracking object.
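By way of illustration only (not part of the patent's disclosure), the width-feedback calculation described above reduces to a subtraction between successive robot x-coordinates; the hypothetical function below reproduces the 50 mm - 30 mm = 20 mm example.

```python
# Minimal sketch of the cut-width feedback: the width of the current cut is the
# difference between the robot x-coordinate of the previous cut and that of the
# current cut. The function name is illustrative, not taken from the patent.

def cut_width_mm(previous_cut_x_mm, current_cut_x_mm):
    return abs(current_cut_x_mm - previous_cut_x_mm)

print(cut_width_mm(30.0, 50.0))   # -> 20.0, matching the worked example above
```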
The feature tracking object may be designed to have a shape that is detectable by the object tracking system 30. For example, the shape may be defined by a number of characteristic features, e.g. edges, corners, and/or surfaces that are detectable by the object tracking system 30.
In some embodiments, the remote object 31 may be said to be passive, meaning the remote object does not require any electrical power for the object tracking system 30 to be able to track its position and/or orientation. For example, the feature tracking object 31 may be made of a block of suitable material, e.g. plastic, wood, or metallic material. When the remote object 31 is passive, the tracking of the remote object 31 is fully conducted by other electrically powered components of the object tracking system 30.
In some embodiments, the remote object 31 may be said to be active, meaning the remote object 31 will require electrical power for the object tracking system 30 to be able to track its position and/or orientation. In an embodiment, the active remote object 31 may be a peripheral computer device, or a motion capture garment as will be further elucidated below. The tracking may still be said to be passive in this case, e.g. when the peripheral device comprises fiducial markers.
In an embodiment, the peripheral computer device may comprise a keyboard, and/or a special controller, e.g. a joystick, an Xbox/PlayStation-style controller, an arcade-style station with a joystick, buttons and a screen, a replica of a meat carcass/cut portion/target meat-related product with touch-sensitive points on the surface for directing the tool to, with or without buttons, optical switches, or similar. For example, using a touch screen with a projected image of the body part, the operator may draw a cut line/multiple points on the touch screen projection of the meat, press Go, and then the system may use the drawn lines on the projection as cutting lines to position the processing device 20 in relation to the body part to make said cuts.
In an embodiment, the object tracking system 30 is a visual object tracking system comprising at least one remote camera 32 arranged to capture image data or information of a remote operator station 60 comprising the remote object 31.
The object tracking system 30 may comprise an object tracking system controller 33 that is configured to recognise 33a the remote object 31 in the image information based on machine vision. The controller 33 may be further configured to access 33b information about the remote coordinate system, including at least one region of interest (ROI) boundary, from the captured image data or information. The controller 33 may in some embodiments identify the ROI boundary using machine vision algorithms, e.g. by identifying the position boundary marked in the associated tracking table 61. Alternatively, the remote ROI boundary may be pre-programmed into the controller 33, e.g. by manual input by an operator. The controller 33 is further configured to identify 33c the position and/or orientation of the recognized remote object 31 in the remote coordinate system as a tracked position and/or orientation.
In an embodiment, with reference to Figure 1, a single remote camera 32 may be used for tracking the position and/or orientation of the remote object 31 in a 2D remote coordinate system.
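As an illustrative, non-authoritative sketch of the recognition step 33a, the code below tracks a remote object carrying an ArUco fiducial marker (one of the options the description mentions further below). It assumes OpenCV 4.7 or newer and a single marker in view; none of the specific parameter values come from the patent.

```python
import cv2
import numpy as np

# Detector for a small dictionary of 4x4 ArUco markers (illustrative choice).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def track_remote_object(frame_bgr):
    """Return the marker centre (pixels) and in-plane angle (degrees), or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is None or len(corners) == 0:
        return None                              # remote object not recognised
    c = corners[0].reshape(4, 2)                 # four marker corners in pixel coordinates
    centre = c.mean(axis=0)                      # tracked position of the remote object
    top_edge = c[1] - c[0]                       # vector along the marker's top edge
    angle_deg = float(np.degrees(np.arctan2(top_edge[1], top_edge[0])))
    return centre, angle_deg                     # tracked position and orientation
```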
The remote camera 32 may e.g. be arranged above an operator station 60 to capture image data of the remote object 31, e.g. when moving over the associated tracking table 61 according to some embodiments. For a 3D remote coordinate system one or more cameras 32 could be used. In an embodiment, a second remote camera 32 (and optionally further additional cameras) could be arranged in other positions, e.g. adjacent or below the tracking table 61, to capture image data of the remote object 31 from an angle different from that of the first remote camera 32.
When using a remote object with fiducial markers, a single camera may derive a 3D position and orientation of the remote object, so each further camera may be used for redundancy should the first camera lose "sight" of one or more of the fiducial markers. However, in other embodiments, two or more cameras may be used for 3D tracking.
In an embodiment, the tracking table 61 may be configured with one or more lasers projecting one or more point(s) or crosshairs or the like onto the top surface of the tracking table 61. The projected light from the laser(s) acts as a visual guide for the operator. The laser point or crosshairs in the operator's workstation may correspond to the position of the saw or blade (e.g. in a fixed-position bandsaw application) to give the operator an indication as to where the cut will occur.
In some embodiments, alternatively or in addition to the laser(s) for projecting the one or more point(s) or crosshair(s) on the tracking table 61, there may be provided additional point(s) or crosshair(s) for projecting onto locations of the operator station 60. These additional point(s) or crosshair(s) may be provided by additional laser(s). The additional point(s) or crosshair(s) provide several advantages for the system 100, including providing the operator with the ability to perform trimmings in several and/or multiple directions (e.g. 2D as well as 3D cuts/trims).
The position of the points or crosshairs in the remote coordinate system may be selected to represent the position of the processing device 20, such as the saw blade, in the local coordinate system, whereby an operator may use the crosshairs as a reference when moving and orienting the remote object 31. As mentioned above, such points or crosshairs representing the processing device 20 (or other components of the robotic system 100) may be projected onto the tracking table 61 and/or other locations of the operator station 60. The tracking table 61 may be painted in a homogenous colour, e.g. dark or black, to allow for good vision tracking of the remote object 31 when moving on the tracking table 61 surface.
In some embodiments, the controller is arranged to register one or more cutting reference points in the local coordinate system as the operator moves the remote object 31 relative to the crosshairs in the remote coordinate system. In this way, several local coordinate system cutting reference points may be stored by the controller 33. The controller 33 may subsequently be configured to control the robotic member 41 (connected to the processing device 20 or retaining unit 10) to execute the cuts based on the cutting reference points in accordance with a set specification in an automatic fashion without further input from the operator. For example, this mode of operation may be suitable for removing flaps or cutting short ribs.
In some embodiments one or more reference points may be stored for each cut. Alternatively, for some cuts of meat one reference point per cut may be used.
The controller 33 may be arranged to execute one or more computer vision algorithms configured to identify the remote object and determine its position and/or orientation. For example, the computer vision algorithms may relate to OpenCV functions identifying fiducial ArUco markers of the remote object, or ChArUco markers.
Alternatively, or additionally, the computer vision algorithms may relate to colour tracking, wherein in a first iteration the controller is arranged to detect the largest area of a certain colour, place the largest area in a bounding box if greater than a threshold size, followed by tracking a centre, edge or defined point within that bounding box. The bounding box may be defined as the smallest rectangle that can be created on the camera image (i.e. the four corners of the rectangle correspond to four pixels of the image) that completely contains an area of a certain colour. If the rectangle is greater than a predefined size (number of pixels) then the algorithm will consider the rectangle as the location of the tracked object in the image.
The remote camera 32 may be a conventional digital video camera with a relatively high resolution. Optionally, the remote camera 32 may be a Time of Flight (TOF) camera that is able to determine the distance between the camera 32 and the remote object 31. A TOF camera may be used when the remote coordinate system is three dimensional, allowing the operator to move the remote object in three dimensions to control the position and/or orientation of the robotic member 41 in 3D.
Additionally or alternatively, the object tracking system 30 may comprise a light detection and ranging (LIDAR) unit or laser scanner to identify the remote object 31 and its position and/or orientation in the remote coordinate system. Structured light and fiducial feature tracking techniques may also be used to provide 3D positioning and guidance in addition to 2D positioning and guidance.
In an alternative embodiment, the object tracking system 30 may be based on motion capture. In this embodiment the remote object 31 may e.g. relate to a motion capture garment to be worn by the operator. The motion capture garment may for example be a glove, or an upper body garment such as a shirt or suit capturing the movement of the operator wearing the garment. The motion capture garment may for example comprise accelerometers or gyros to detect associated motion.
In some embodiments the remote object 31 may comprise a vibrator or speaker to allow for providing haptic feedback to the operator, in use. For example, the object tracking system may be configured to detect when the remote object 31 touches the projected laser light, and trigger a signal that is received by the remote object 31 to activate the vibrator. This will assist the operator in recognising when the cuts are actually made. The triggered signal could also result in the provision of verbal feedback by the speaker. Alternatively, an actuator could be used to push back against the operator's motion to let the operator know they may have hit or found something.
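The colour-tracking iteration described above can be sketched with standard OpenCV calls as follows; the HSV range and pixel threshold are illustrative assumptions rather than values from the patent.

```python
import cv2
import numpy as np

LOWER_HSV = np.array([100, 120, 50])      # example: a blue remote object (assumed)
UPPER_HSV = np.array([130, 255, 255])
MIN_BOX_PIXELS = 400                      # assumed threshold size in pixels

def track_by_colour(frame_bgr):
    """Find the largest region of the target colour and return its bounding-box centre."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)   # largest area of the colour
    x, y, w, h = cv2.boundingRect(largest)         # smallest enclosing rectangle
    if w * h < MIN_BOX_PIXELS:
        return None                                # below the threshold size: ignore
    return (x + w // 2, y + h // 2)                # tracked point: centre of the box
```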
In order to move and orient the robotic member 41 in a predetermined manner based on the tracked position and/or orientation of the remote object 31, mapping the position and/or orientation of the remote object 31 to a corresponding position and/or orientation of the robotic member 41 is required. To this end the controller may be arranged to execute 54 a calibration process for mapping each tracked position and/or orientation of the remote object 31 in a remote coordinate system to a corresponding position and/or orientation of the robotic member 41 in a local coordinate system.
In some embodiments, the calibration process comprises registering 54a at least one position boundary of the remote object 31 in the remote coordinate system and at least one corresponding position boundary of the robotic member 41 in the local coordinate system. Based on the information associated with the registered position boundaries, the controller may be configured to calculate a mapped position of the robotic member 41 within the position boundary of the local coordinate system for each position of the remote object 31 within the position boundary of the remote coordinate system.
Figure 4 illustrates an example diagram demonstrating a calibration process where the associated local and remote coordinate systems are two dimensional. It should be appreciated that the calibration process is not limited to two-dimensional coordinate systems. Three-dimensional coordinate systems or higher are also within the scope of the present disclosure. The position boundaries of the remote object 31 in the remote coordinate system may be observed to the left in Figure 4. In this example, the remote position boundary forms a rectangular position boundary. A corresponding position boundary of the robotic member 41 in the local coordinate system is shown to the right. As mentioned previously, the position boundary of the robotic member 41 may e.g. be inputted by an operator at an operator station 60 using a graphical user interface of a remote display unit 80. It should be appreciated that other forms or types of user interfaces may also be used in the robotic system disclosed herein. These include but are not limited to a holographic user interface, a user interface using Pepper's ghost effect, virtual reality or augmented reality.
Individual locations within each position boundary are indicated by cells. The controller 50, not shown, when having access to the position boundaries for the respective coordinate system, may calculate a resulting corresponding location of the robotic member 41 for any given location, e.g. cell, of the position boundary of the remote coordinate system. Non-exhaustive examples of mapped locations may be seen at the bottom of Figure 4 with the associated locations shown within the respective position boundary. It may be observed from Figure 4 that any of the four pixels within the square region at the top left of the tracking camera position boundary ((1,10),(2,10),(1,9),(2,9)) would map to the robot position indicated by the top left square region of the robot position diagram (1,5). As the tracked remote object 31 may be represented by a single pixel corresponding to the centre of the left edge of the object's identified position boundary, if the pixel representing the tracked remote object 31 is within the tracking position boundary, the robotic member 41 will be moved to the position that corresponds to the tracked remote object.
For example, using the diagram of Figure 4, if the pixel representing the tracked remote object 31 is identified as (x=9, y=5), the robotic member 41 will be moved to the corresponding position of (x=5, y=3). Consequently, moving the tracked remote object 31 within the tracking position boundary moves the robotic member 41, in real-time, to a corresponding (x,y) coordinate position in the local coordinate system.
In some embodiments, the object tracking system 30 comprises a remote camera 32 arranged to capture image data of the remote object 31, in use. Referring to the remote position boundary of Figure 4, the respective cells may be seen as individual pixels of the remote camera 32. For example, in embodiments where the remote object 31 has a square shape, the squares illustrated within the remote position boundary may be seen as relating to image data of the square-shaped remote object as 'seen' by the remote camera 32. It should be appreciated that while Figure 4 only shows the mapped location, the controller is also arranged to map the orientation of the remote object 31 to a corresponding orientation of the robotic member 41 according to some embodiments.
In one embodiment, during the calibration process, the operator may physically move the remote object 31 through a number of positions to create the position boundary of the remote coordinate system. However, in some embodiments, the remote position boundary may be pre-programmed into the controller. Alternatively, the remote position boundary forming a region of interest may be input by the operator at an operator station 60, e.g. by inputting the remote position boundary into a graphical user interface shown on a remote display unit 80.
In some embodiments, the remote position boundary and/or the position and/or orientation of the remote object 31 may be obtained in Cartesian coordinates and/or in millimetres instead of pixels. This arrangement may use a fiducial tracking system. The system may use fiducial markers and/or provide positions in Cartesian coordinates and/or millimetres as an output from a fiducial tracking algorithm. The fiducial tracking system may be configured to yield a tracked position in Cartesian coordinates and/or relative to the remote location (e.g. camera location). In other words, the origin/reference point of the tracking system may be located at the remote (e.g. camera) position in application/the real world. Additionally or alternatively, the robot 40 may utilise a Cartesian coordinate system. Thus, the remote position boundary and/or the position and/or orientation of the remote object 31 can be obtained in Cartesian coordinates and/or millimetres, which can be communicated to, utilised and/or replicated in the local coordinate system without needing any conversions.
Further, using the fiducial tracking system, it is possible to allow for essentially "zeroing" the current tracked position to the robot home position. "Zeroing" in this context may comprise applying a constant offset to the camera coordinate system so that the origins of the local and remote coordinate systems are aligned.
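A minimal sketch of the Figure 4 position mapping follows, assuming both position boundaries are axis-aligned grids; the grid sizes are taken from the Figure 4 example, and the proportional-scaling rule is an illustrative reading of that diagram rather than the patent's stated algorithm.

```python
import math

def map_remote_to_local(px, py, remote_max=(10, 10), local_max=(5, 5)):
    """Map a 1-based cell (px, py) in the remote grid to a 1-based cell in the local grid
    by proportional scaling of the two registered position boundaries."""
    sx = local_max[0] / remote_max[0]
    sy = local_max[1] / remote_max[1]
    return math.ceil(px * sx), math.ceil(py * sy)

print(map_remote_to_local(9, 5))    # -> (5, 3), matching the worked example above
print(map_remote_to_local(1, 10))   # -> (1, 5), the top-left square region of Figure 4
```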
However, in some embodiments, it may be desirable to align an arbitrary (but strategically chosen) point in the tracking system with the home position of the robot. In such embodiments, it is not the origins (zero points) of the local and remote coordinate systems that would be aligned. Rather, the "zeroing" in such embodiments may result in the tracked position (at the time of the "zeroing") aligning with the local (e.g. robot) home position as opposed to the "zero" positions in the local and remote coordinate systems being aligned.
In an embodiment, the remote camera 32 is stationary, arranged to overlook a stationary tracking table 61 over which the remote object 31 may be moved. The tracking table 61 may comprise markings indicating the remote position boundary. Since the remote camera 32 and the tracking table 61 remain stationary, in use, the remote position boundary marked on the tracking table 61 may be identified from the image data and accessed by the controller 50. Hence, in such an embodiment, the operator does not need to move the remote object 31 during the calibration process.
Once the body part is retained by the retaining unit 10, the object tracking system 30, e.g. via a local camera 70, may be arranged to scan the exterior boundary of the body part to determine the local position boundary. The identified local position boundary may then be accessed by the controller. In this way the extremities of the body part may be determined to allow the robotic system to know the body part boundaries in relation to the robotic member 41.
Further, the relevant locations and/or orientations of the robotic member 41 in the local coordinate system may be predetermined and stored for access by the controller 50 according to some embodiments. The relevant range of locations and/or orientations of the robotic member 41 may be determined by taking into consideration the location of the robotic member 41 in relation to the processing device 20 and/or retaining unit 10, the size and shape of the body part to be processed, and the available range of locations and orientations of the robotic member 41.
It should be appreciated that other methods of calibrating the robotic system are also possible. For example, calibration could be done using commonly available OpenCV programming functions that may be built into the remote camera and/or the controller etc. In some embodiments, the calibration process does not need to rely on computer vision using a remote camera 32. For example, the calibration process could utilise sensors, laser scanners, mechanical switches, X-ray technology, or human measurement.
In some embodiments the controller 33 of the object tracking system 30 or the controller 50 may be configured to adopt a scaling function when mapping or translating the position and/or orientation of the remote object 31 in the remote coordinate system to the corresponding position and/or orientation of the robotic member 41 in the local coordinate system. In an embodiment, the scaling function may relate to a constant scaling factor, so that when an operator moves the tracked remote object a known distance, the robot will move the known distance multiplied by the scale factor.
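As an illustrative sketch (not the patent's implementation), the constant scale factor just described can be combined with the earlier "zeroing" offset: the tracked position at zeroing time is mapped onto the robot home position, and any subsequent remote movement is multiplied by the scale factor. The factor, home position and coordinate values are placeholder assumptions.

```python
SCALE_FACTOR = 1.5            # assumed constant scale factor
ROBOT_HOME_MM = (0.0, 250.0)  # assumed robot home position (x, y) in mm

def make_mapper(tracked_at_zeroing_mm):
    """Return a function mapping a tracked (x, y) in mm to a robot (x, y) in mm."""
    zx, zy = tracked_at_zeroing_mm
    def to_robot(tracked_xy_mm):
        dx = (tracked_xy_mm[0] - zx) * SCALE_FACTOR   # scaled movement since zeroing
        dy = (tracked_xy_mm[1] - zy) * SCALE_FACTOR
        return ROBOT_HOME_MM[0] + dx, ROBOT_HOME_MM[1] + dy
    return to_robot

to_robot = make_mapper((412.0, 87.0))   # "zero" at the current tracked pose
print(to_robot((412.0, 87.0)))          # -> (0.0, 250.0), the home position
print(to_robot((422.0, 87.0)))          # a 10 mm remote move becomes 15 mm at the robot
```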
In some embodiments, the scaling function may be a linear scaling function. A linear scaling function implies that each unitary movement along an axis in the remote coordinate system will translate into a corresponding unitary movement (scaled with the scale factor) along the corresponding axis in the local coordinate system.
In some embodiments, the controller 33, 50 is arranged to apply a first linear scaling function for any movement along a first axis of the remote coordinate system, and a second linear scaling function for any movement along a second axis of the remote coordinate system, wherein the first linear scaling function and the second linear scaling function are different. For example, when the retaining unit 10 is attached to the robotic member 41 and the processing device 20 is a saw, using different linear scaling functions allows the operator to achieve a high accuracy when moving the remote object 31 transversely, i.e. resulting in moving the robotic member 41 and the retaining unit 10 transversely in relation to the saw 20. A different linear scale, such as a higher scale, could be utilised when moving the remote object translationally, e.g. forward and/or backward, resulting in moving the robotic member 41 translationally forward/backward at a higher speed for a clean cut in a single motion.
The controller 33, 50 may be further configured to execute a linear scaling correction function, wherein correction factors are applied to the captured image data and information, for example to account for distortion and/or radial correction factors. In some embodiments, the linear scaling correction could be utilised during the calibration process. For example, a standard OpenCV calibration step may be run where a checkerboard or grid pattern is moved in front of the camera in a variety of orientations and positions, and the associated OpenCV function uses regression to determine distortion coefficients and a camera intrinsic matrix (e.g. focal lengths, and centre pixel translation). This allows for the location and orientation of known shapes to be determined with sub-pixel accuracy.
In an alternative embodiment the controller may be configured to adopt a logarithmic scaling when translating the position and/or orientation of the remote object 31 in the remote coordinate system to the corresponding position and/or orientation of the robotic member 41 of the local coordinate system. A logarithmic scaling may be advantageous for gradual speed build-ups and to give the operator more time to adjust the cut position transversely before and/or during the translational forward movement forcing the piece of meat M through the saw 20. Logarithmic scaling functions may also be adopted by the controller 33, 50 for different axes of the remote coordinate system, e.g. to move the robotic member 41 to and from a home/calibration location quickly while maintaining fractional scaling in the associated cut zone for extra accuracy.
In some embodiments the controller 33, 50 is configured to adopt a custom n-space scaling function. A custom n-space scaling function may be derived by considering important cut zones, home positions, operator perspective etc.
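A hedged sketch of the per-axis scaling just described follows: a small linear factor for the transverse (thickness) axis, a larger one for the feed axis, and an optional logarithmic profile for a gradual build-up. All factor values are assumptions, not figures from the patent.

```python
import math

X_SCALE = 0.5   # assumed transverse-axis factor: fine control of cut thickness
Y_SCALE = 2.0   # assumed feed-axis factor: faster travel through the blade

def scale_linear(dx_remote_mm, dy_remote_mm):
    """Different linear scaling on each axis of the remote coordinate system."""
    return dx_remote_mm * X_SCALE, dy_remote_mm * Y_SCALE

def scale_logarithmic(d_remote_mm, gain=20.0):
    """Logarithmic profile: the response flattens as the remote displacement grows,
    giving a gradual build-up rather than a proportional jump."""
    return math.copysign(gain * math.log1p(abs(d_remote_mm)), d_remote_mm)

print(scale_linear(10.0, 10.0))   # -> (5.0, 20.0)
print(scale_logarithmic(10.0))    # -> about 47.96
```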
In an embodiment, the controller 33, 50 is arranged to apply a transformation matrix used for changing the translation and rotation of each point between coordinates of the remote coordinate system and the coordinates of the local coordinate system.
Depending on the type of application, the local and/or remote coordinate system may refer to a two-dimensional (2D) coordinate system, a three-dimensional (3D) coordinate system or a higher-dimensional coordinate system. For example, in a 2D coordinate system the robotic member 41 may be configured to move laterally and towards and away from either the retaining unit 10, when the processing device 20 is attached to the robotic member 41, or the processing device 20, when the retaining unit 10 is attached to the robotic member 41. A flat top surface of a tracking table 61, as shown with reference to Figures 1, 2 and 5, may serve to form a 2D surface over which the remote object 31 is moved, in use. By moving the remote object 31 over the flat surface, the distance between the remote object 31 and the camera may be maintained within a limited range, which in turn may reduce the overall processing requirement of the controller in terms of scaling.
As stated previously, the robotic system 100 may comprise a remote operation station 60 having a tracking table 61. The tracking table 61 may act to provide a support surface for the remote object, so that the remote object 31 may move in position and/or orientation over the tracking table 61. A tracking table 61 may increase the robustness of the system, by forming a barrier against movement of the remote object 31 in the direction of the tracking table 61. Hence, it acts to somewhat limit the range of movement of the remote object 31 so as to prevent the remote object 31 from moving outside the remote coordinate system range of the object tracking system.
In some embodiments, the operator station 60 may be arranged at a remote location away from the robot 40, retaining unit 10 and processing device 20. The remote location may be another room in the animal processing facility or geographically remote from the animal processing facility. The object tracking system 30 and the remote display unit 80 may be arranged to communicate with the controller 50 over the Internet or by any other suitable means of communication.
In an embodiment, the tracking table 61 comprises a touch screen configured to display a specific image or images, such that key cut data may be quickly entered. Examples of key cut data could be: type of cut, cut pathway, e.g. following the intercostal spacing between ribs or the like, width of the cut, or the number of ribs of each cut, etc.
In an embodiment, the object tracking system 30 further comprises at least one local camera 70 arranged to capture image data or information of the retaining unit 10 and the processing device 20. The captured image data of the local camera 70 may be presented on a remote display unit 80 arranged in the operator station 60. The local or remote camera 32 may e.g. be a Basler Ace type camera, e.g. model a2A1920-160ucBAS or a2A1920-160ucPRO.
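The transformation-matrix approach mentioned above can be illustrated with a homogeneous 2D transform that rotates and translates a point from the remote coordinate system into the local (robot) coordinate system; the rotation angle and translation below are placeholder values, not parameters from the patent.

```python
import numpy as np

def make_transform(theta_rad, tx, ty):
    """Homogeneous 2D transform: rotation by theta, then translation by (tx, ty)."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0,  0,  1]])

def remote_to_local(point_xy, T):
    """Apply the transform to a remote-coordinate point."""
    x, y, _ = T @ np.array([point_xy[0], point_xy[1], 1.0])
    return x, y

# Example: remote frame rotated 90 degrees and offset by (100, 50) mm.
T = make_transform(np.deg2rad(90.0), 100.0, 50.0)
print(remote_to_local((10.0, 0.0), T))   # -> approximately (100.0, 60.0)
```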
The image data or information captured by the local camera may be processed by the controller 33 to determine a position boundary of the animal body part M. The position boundary of the animal body part M may be used to determine when the animal body part is in the vicinity of the processing device 20. The animal body part position boundary allows the system to know where the edges of the body part are in the local coordinate system.
When the animal body part M is within close proximity to the processing device 20, the controller may be arranged to limit the transverse movement of the robotic member 41 in relation to the processing device 20, in order to limit any sideways motion of the animal body part M while cutting. This allows for making precise straight cuts through the animal body part M. In addition, the prevented sideways movement of the robotic member 41 when cutting reduces the wear of the processing device 20.
In some embodiments, the remote display unit 80 comprises a user interface allowing the operator to control the robotic system from the remote display unit 80. The remote display unit may for example be a touch screen allowing the operator to input parameters for controlling the robotic system. Alternatively, the remote display unit 80 may comprise a projector arranged to project an image onto a surface such as a tracking table 61 adjacent to the operator or another shaped object to be used as a tracking object. Alternatively, when the tracking table has a touch screen, the image may be displayed by the tracking table 61 itself. It should be appreciated that other forms or types of user interfaces can be provided. These include but are not limited to a holographic user interface, a user interface using Pepper's ghost effect, virtual reality or augmented reality.
The information presented by the remote display unit 80, via the controller 33, 50, could e.g. comprise any image data or information captured by the local camera 70, e.g. image data associated with the robotic member 41, the processing device 20, and/or the animal body part.
In applications where the operator station 60 is arranged adjacent to the robot 40, the retaining unit 10 and the processing device 20, the robotic system 100 may further comprise a safety barrier 90 separating the robotic system 100 from the adjacent environment and the operator. The safety barrier 90 prevents any person from inadvertently accessing the robotic system 100, and thereby minimises the risk of health and safety accidents. The safety barrier 90 may for example comprise Satech Greenfast fencing, 1x BAM02HA E-Stop from Balluff and 1x STR-SAFM03P8 gate switch from SICK chained together into an RLY3 unit from SICK and looped through the robot's in-built safety. Further examples of the safety barrier 90 include a safety light curtain, which may e.g. comprise one or more LEDs which can provide "expected" pulses in a predetermined timed sequence, which, when interrupted by an object, can trigger an automatic safety/stop signal.
Additional or alternative safety measures may also be included in the robotic system 100. For example, the robot 40 may include a soft and/or pressure-sensitive safety skin (e.g. Airskin), which may be configured to safely stop or shut down the robot upon an application of e.g. an external force on the safety skin, by e.g. detecting internal pressure changes due to the application of the external force.
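An illustrative sketch of the transverse-motion limit described above: sideways commands are held once the body part's boundary comes within a proximity threshold of the blade. The threshold value and coordinate conventions are assumptions for the sake of the example.

```python
BLADE_Y_MM = 0.0       # assumed blade position in the local coordinate system
PROXIMITY_MM = 30.0    # assumed distance below which sideways motion is frozen

def limit_transverse(target_x_mm, target_y_mm, body_part_leading_edge_y_mm, locked_x_mm):
    """Return the position command actually sent to the robotic member."""
    if abs(body_part_leading_edge_y_mm - BLADE_Y_MM) <= PROXIMITY_MM:
        return locked_x_mm, target_y_mm   # hold x: no sideways motion while cutting
    return target_x_mm, target_y_mm       # free motion away from the blade

print(limit_transverse(12.0, -5.0, body_part_leading_edge_y_mm=10.0, locked_x_mm=8.0))
# -> (8.0, -5.0): x is held while the body part is near or moving through the blade
```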
According to an embodiment, the robotic system 100 may form part of an animal processing assembly line (not shown), e.g. a meat processing assembly line. The meat processing assembly line may comprise a first transfer unit arranged upstream of the robotic system 100 for transporting the body part M to be held by the retaining unit 10. Further, the animal processing assembly line may comprise a second transfer unit arranged downstream of the robotic system 100 for transporting the processed body part to a packing station. The robotic system 100 may comprise a trimming station (e.g. for trimming fat from the cuts of meat) positioned before or after the first transfer unit (i.e. before or after the body part M is transported to be held by the retaining unit 10) or before or after the second transfer unit (i.e. before or after the processed body part is transported to the packing station). Additionally or alternatively, there may be provided a bagging station that is configured to bag, seal or wrap the processed body part. The bagging station may be provided as a component of the packing station, or may be positioned before the packing station.
In an embodiment, the first transfer unit or second transfer unit may comprise a conveyor belt.
In an embodiment, the controller 50 is arranged to control the operation of the gripping member of the retaining unit 10 when attached to the robotic member 41. When an animal body part is detected at a loading position of the first transfer unit, the controller 50 may be configured to control the gripping member of the retaining unit 10 to its open position. Next, the controller 50 is arranged to move the position of the robotic member 41 to the position of the detected body part until the gripping member at least partly surrounds the animal body part. The controller 50 may subsequently control the gripping member towards its closed position, resulting in the retaining unit 10 holding the animal body part. The controller may subsequently control the robotic member 41 to move to a predetermined location in the vicinity of the processing device 20. Detection of the animal body part at the loading position of the first transfer unit may be based on machine learning or computer vision algorithms.
In an embodiment the second transfer unit may relate to a chute or slide.
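The loading sequence described above can be sketched as a simple controller routine; the robot and gripper interfaces below are hypothetical stand-ins and do not represent the KRC5/KSS API or any interface named in the patent.

```python
# Hypothetical sketch of the loading sequence: open the gripper, approach the
# detected body part, close the gripper, then move near the processing device.

class LoadingSequence:
    def __init__(self, robot, gripper, staging_pose):
        self.robot = robot                # assumed object exposing move_to(pose)
        self.gripper = gripper            # assumed object exposing open() / close()
        self.staging_pose = staging_pose  # predetermined pose near the processing device

    def run(self, detected_body_part_pose):
        self.gripper.open()                           # gripping member to its open position
        self.robot.move_to(detected_body_part_pose)   # until it partly surrounds the body part
        self.gripper.close()                          # retaining unit now holds the body part
        self.robot.move_to(self.staging_pose)         # move to the vicinity of the device
```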

Claims (15)

1. A robotic system (100) for processing an animal body part, comprising a retaining unit (10) for holding the body part (M), a device (20) for processing the body part (M), an object tracking system (30) for tracking a position and/or orientation of a remote object (31) representing the body part (M), a robot (40) having at least one position and/or orientation controllable robotic member (41) attached to at least one of the retaining unit (10) and the device (20) for controlling the position and/or orientation thereof, and a controller (50) arranged to: access (51) the tracked position and/or orientation of the remote object (31), identify (52) a mapped corresponding position and/or orientation of the robotic member (41) based on the tracked position and/or orientation of the remote object (31), and control (53) the position and/or orientation of the robotic member (41) based on the mapped corresponding position and/or orientation, thereby controlling the relative position and/or orientation of the retaining unit (10) in relation to the device (20) and/or the relative position and/or orientation of the device (20) in relation to the retaining unit (10).
2. The robotic system (100) of claim 1, wherein the position and/or orientation of the at least one robotic member (41) is controllable in relation to a local coordinate system, and the controller (50) is arranged to identify (52) the mapped corresponding position and/or orientation of the robotic member (41) by: accessing (52a) mapping information associated with mapping each tracked position and/or orientation of the remote object (31) in a remote coordinate system to a corresponding position and/or orientation of the robotic member (41) in the local coordinate system, and identifying (52b) the corresponding position and/or orientation of the robotic member (41) mapped to the accessed tracked position and/or orientation of the remote object (31) based on the accessed mapping information.
3. The robotic system (100) of claim 1 or 2, wherein the remote object is selected from the group consisting of: a feature tracking object, a peripheral computer device, a touch screen, a mouse, a joystick, and a motion capture glove.
4. The robotic system (100) of any one of the preceding claims, wherein the controller (50) is further arranged to: execute (54) a calibration process for mapping each tracked position and/or orientation of the remote object (31) in a remote coordinate system to a corresponding position and/or orientation of the robotic member (41) in a local coordinate system.
5. The robotic system (100) of claim 4, wherein the calibration process comprises:
registering (54a) a position boundary of the remote object (31) in the remote coordinate system and a position boundary of the robotic member (41) in the local coordinate system, and
based on the registering, calculating a mapped position of the robotic member (41) within the position boundary of the local coordinate system for each position of the remote object (31) within the position boundary of the remote coordinate system.
6. The robotic system (100) of any one of claims 2 to 5, wherein the local and/or remote coordinate system is a two-dimensional (2D) or three-dimensional (3D) coordinate system.
7. The robotic system (100) of claim 6, wherein the remote coordinate system is a 2D coordinate system, aligned with a 2D surface of a remote operator station (60) over which an operator can move and change the orientation of the remote object (31), in use.
8. The robotic system (100) of any one of the preceding claims, wherein the retaining unit (10) comprises a gripping member for controllably holding the body part.
9. The robotic system (100) of any one of the preceding claims, wherein the object tracking system (30) is a visual object tracking system, comprising
at least one remote camera (32) arranged to capture image information of a remote operator station (60), and
an object tracking system controller (33) arranged to:
recognise (33a) the remote object in the image information based on machine vision,
access (33b) information about the remote coordinate system, including at least one region of interest (ROI) boundary, from the captured image information based on machine vision or by manual input by an operator, and
identify (33c) the position and/or orientation of the recognised remote object in the defined remote coordinate system as a tracked position and/or orientation.
10. The robotic system (100) of any one of the preceding claims, further comprising at least one local camera (70) arranged to capture image information of the retaining unit (10) and the device (20), and a remote display unit (80) for displaying the captured image information of the retaining unit (10) and the device (20) to an operator.
11. The robotic system (100) of any one of the preceding claims, further comprising a remote operator station (60) having a tracking table (61), wherein the remote object (31) is configured to move in position and/or orientation over the tracking table (61).
12. The robotic system (100) according to claim 11, wherein the tracking table (61) comprises a touch screen for controlling parameters of the controller (50, 33) via operator input.
13. The robotic system (100) of any one of the preceding claims, wherein the body part (M) comprises a piece of meat, and the device (20) comprises a saw, a band saw, a blade, or a knife for cutting the piece of meat (M).
14. The robotic system (100) of any one of claims 1 to 13, wherein
the retaining unit (10) is configured to hold a first portion of the body part (M), and the device (20) comprises a gripping member for holding a second portion of the body part, wherein
the first portion or second portion includes a pelt of the body part (M), wherein
the relative position and/or orientation of the device (20) in relation to the retaining unit (10), or vice versa, allows for de-pelting of the body part (M).
15. An animal processing assembly line, comprising
the robotic system (100) of any one of claims 1 to 14, and at least one of
a first transfer unit arranged upstream of the robotic system (100) for transporting the body part (M) to be held by the retaining unit (10), and
a second transfer unit arranged downstream of the robotic system (100) for transporting the processed body part to a packing station.
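Claims 1, 2, 4 and 5 together describe a teleoperation mapping: the controller accesses the tracked position of the remote object, identifies a corresponding position of the robotic member via mapping information produced by a calibration step that registers the position boundaries of the remote and local coordinate systems, and then commands the robotic member accordingly. The sketch below shows one simple way such a boundary-registered mapping could look; the linear-interpolation scheme and every name in it are illustrative assumptions, not the method claimed.

```python
# Illustrative sketch only: boundary-registered linear mapping from a remote 2D
# coordinate system to the robot's local coordinate system (cf. claims 2, 4, 5).
# The linear-interpolation scheme and all names are assumptions for illustration.

from dataclasses import dataclass


@dataclass
class Boundary:
    """Axis-aligned position boundary: (min_x, min_y) to (max_x, max_y)."""
    min_x: float
    min_y: float
    max_x: float
    max_y: float


def calibrate(remote: Boundary, local: Boundary):
    """Register the two position boundaries and return a mapping function that
    converts a remote (x, y) into the corresponding local (x, y)."""
    scale_x = (local.max_x - local.min_x) / (remote.max_x - remote.min_x)
    scale_y = (local.max_y - local.min_y) / (remote.max_y - remote.min_y)

    def remote_to_local(x: float, y: float) -> tuple[float, float]:
        # Clamp to the registered remote boundary, then interpolate linearly.
        x = min(max(x, remote.min_x), remote.max_x)
        y = min(max(y, remote.min_y), remote.max_y)
        return (local.min_x + (x - remote.min_x) * scale_x,
                local.min_y + (y - remote.min_y) * scale_y)

    return remote_to_local


def control_step(tracked_xy, mapping, move_robotic_member):
    """Claim-1-style loop body: access the tracked pose, identify the mapped
    pose, and command the robotic member (move_robotic_member is a stand-in)."""
    local_xy = mapping(*tracked_xy)
    move_robotic_member(local_xy)


# Example: a 400 x 300 mm tracking table mapped onto a 1.2 x 0.9 m working area.
mapping = calibrate(Boundary(0, 0, 400, 300), Boundary(0.0, 0.0, 1.2, 0.9))
control_step((200, 150), mapping, print)  # centre of table -> centre of working area
```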
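Claim 9 recites a visual object tracking system: a remote camera images the operator station, a controller recognises the remote object by machine vision within a region-of-interest boundary, and the object's position and orientation in the remote coordinate system are reported as the tracked pose. Below is a rough sketch of that kind of ROI-based tracking loop, using OpenCV colour thresholding purely as a placeholder recogniser; the camera index, ROI, HSV thresholds and function names are all assumptions and not part of the patent.

```python
# Rough sketch of an ROI-based visual tracker for the remote object (cf. claim 9).
# Colour thresholding is used only as a stand-in recogniser; the patent does not
# prescribe this method. Camera index, ROI and thresholds are assumptions.

import cv2
import numpy as np

ROI = (100, 100, 500, 400)                   # x0, y0, x1, y1 of the region of interest
LOWER, UPPER = (40, 60, 60), (80, 255, 255)  # assumed HSV range of the tracked marker


def track_remote_object(frame):
    """Return (x, y, angle) of the recognised object in ROI coordinates, or None."""
    x0, y0, x1, y1 = ROI
    roi = frame[y0:y1, x0:x1]
    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(LOWER), np.array(UPPER))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    (cx, cy), _, angle = cv2.minAreaRect(largest)  # position and orientation
    return cx, cy, angle


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # remote camera observing the operator station
    ok, frame = cap.read()
    if ok:
        print(track_remote_object(frame))
    cap.release()
```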
AU2023200298A 2022-01-20 2023-01-20 A robotic system for processing an animal body part Active AU2023200298B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NZ78444722 2022-01-20
NZ784447 2022-01-20

Publications (1)

Publication Number Publication Date
AU2023200298B1 true AU2023200298B1 (en) 2023-07-27

Family

ID=87315321

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2023200298A Active AU2023200298B1 (en) 2022-01-20 2023-01-20 A robotic system for processing an animal body part

Country Status (1)

Country Link
AU (1) AU2023200298B1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10813710B2 (en) * 2017-03-02 2020-10-27 KindHeart, Inc. Telerobotic surgery system using minimally invasive surgical tool with variable force scaling and feedback and relayed communications between remote surgeon and surgery station
US11534252B2 (en) * 2017-11-16 2022-12-27 Intuitive Surgical Operations, Inc. Master/slave registration and control for teleoperation
WO2020127575A1 (en) * 2018-12-19 2020-06-25 Poly-Clip System Gmbh & Co. Kg System for picking up, conveying, and delivering a product, and method for controlling such a system

Similar Documents

Publication Publication Date Title
EP3383593B1 (en) Teaching an industrial robot to pick parts
JP6474179B2 (en) Learning data set creation method, and object recognition and position and orientation estimation method
JP2019508134A5 (en)
JP2016099257A (en) Information processing device and information processing method
Zou et al. Fault-tolerant design of a limited universal fruit-picking end-effector based on vision-positioning error
US20150071491A1 (en) Method And Device For Optically Determining A Position And/Or Orientation Of An Object In Space
CN105225225B (en) A kind of leather system for automatic marker making method and apparatus based on machine vision
McIvor Calibration of a laser stripe profiler
Kohan et al. Robotic harvesting of rosa damascena using stereoscopic machine vision
KR102500626B1 (en) Apparatus for controlling movement of robot and therapeutic robot related thereto
JP2019089157A (en) Holding method, holding system, and program
CN112276936A (en) Three-dimensional data generation device and robot control system
EP3837095B1 (en) Method of programming an industrial robot
CN114173699A (en) Obstacle avoidance techniques for surgical navigation
JP2016170050A (en) Position attitude measurement device, position attitude measurement method and computer program
CN107848117B (en) Robot system and control method
AU2023200298B1 (en) A robotic system for processing an animal body part
Shen et al. A multi-view camera-projector system for object detection and robot-human feedback
Ming et al. Design of porcine abdomen cutting robot system based on binocular vision
CN111297615A (en) Medical device and method for operating a medical device
JP2019060695A (en) Three-dimensional object detector, robot, and program
US20220168902A1 (en) Method And Control Arrangement For Determining A Relation Between A Robot Coordinate System And A Movable Apparatus Coordinate System
Wu et al. Application of visual servoing for grasping and placing operation in slaughterhouse
CN114786468A (en) Man-machine guidance system for agricultural object detection in unstructured and noisy environments by integrating laser and vision
CN117794704A (en) Robot control device, robot control system, and robot control method

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)