CN113386146A - Method and device for cutting objects or food - Google Patents

Method and device for cutting objects or food

Info

Publication number
CN113386146A
Authority
CN
China
Prior art keywords
cutting
cut
area
robot arm
light sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010178657.2A
Other languages
Chinese (zh)
Inventor
于毅欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yiqi Shanghai Intelligent Technology Co Ltd
Original Assignee
Yiqi Shanghai Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yiqi Shanghai Intelligent Technology Co Ltd filed Critical Yiqi Shanghai Intelligent Technology Co Ltd
Priority to CN202010178657.2A priority Critical patent/CN113386146A/en
Priority to PCT/CN2021/080645 priority patent/WO2021185187A1/en
Publication of CN113386146A publication Critical patent/CN113386146A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0045Manipulators used in the food industry
    • AHUMAN NECESSITIES
    • A22BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22CPROCESSING MEAT, POULTRY, OR FISH
    • A22C17/00Other devices for processing meat or bones
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/02Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
    • B25J9/04Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type by rotating at least one arm, excluding the head movement itself, e.g. cylindrical coordinate type or polar coordinate type
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls

Abstract

An apparatus for cutting objects or food comprises a robot arm, a structured light sensor, and a control module. The robot arm is a device with degrees of freedom and/or a cutting part. The structured light sensor collects depth information of a cutting area and/or an operation area. The control module controls the robot arm to cut the object according to the structured light sensor data and/or an instruction or cutting strategy. Because one feature of the invention is that often only a single cutting edge (typically a knife) is used, the device is at least somewhat easier to clean than a method using a multi-knife arrangement (more than five blades side by side).

Description

Method and device for cutting objects or food
Technical Field
The invention relates to the technical field of robots, in particular to a method and a device for cutting objects or food.
Background
The invention relates to a robot arm for cutting food. It addresses the need to cut food or objects with a robot arm and can autonomously formulate a cutting strategy according to the condition of the object to be cut. Because one feature of the invention is that often only a single cutting edge (typically a knife) is used, the device is at least somewhat easier to clean (the total knife surface area in contact with the object or food is reduced) than a cutting machine or method using a multi-knife arrangement (more than five knives side by side), which helps ensure, at least to some extent, the hygiene of the object after cutting.
Disclosure of Invention
According to one aspect of the present disclosure, an apparatus for cutting objects or food comprises a robot arm, a structured light sensor, and a control module.
The robot arm is a device with degrees of freedom and/or a cutting part;
the structured light sensor is a sensor used for collecting depth information of a cutting area and/or an operation area;
the control module is used for controlling the robot arm to cut the object according to the structured light sensor data and/or an instruction or cutting strategy.
Preferably, a degree of freedom is a freely rotating joint (machine joint), and a gear motor, a servo, or a plain motor can be used as its drive or power source.
According to one aspect of the present disclosure, a method of cutting an object or food comprises:
3D modeling and/or 3D information collection is carried out on the cutting area;
after the object to be cut is placed, 3D modeling and/or 3D information collection is carried out on the cutting area again;
a cutting strategy is formulated, and a robot arm is used to perform the cutting.
Preferably, 3D modeling is performed based on a depth image acquired by the structured light sensor: each pixel value in the depth image is converted into a distance and a 3D coordinate to create a vertex, and neighboring vertices may be connected to form triangles.
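A minimal sketch of this conversion, assuming a simple pinhole camera model; the intrinsic parameters `fx`, `fy`, `cx`, `cy` and the function names are hypothetical, since the patent does not specify a camera model:

```python
def depth_to_vertices(depth, fx, fy, cx, cy):
    """Convert a depth image (one depth value per pixel, in metres)
    into a list of 3D vertices. Each pixel (u, v) with depth z maps to
    x = (u - cx) * z / fx,  y = (v - cy) * z / fy  in camera coordinates."""
    vertices = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            vertices.append((x, y, z))
    return vertices

def grid_triangles(width, height):
    """Connect neighboring vertices of a width x height pixel grid into
    triangles: two triangles per quad of adjacent pixels."""
    tris = []
    for v in range(height - 1):
        for u in range(width - 1):
            i = v * width + u
            tris.append((i, i + 1, i + width))
            tris.append((i + 1, i + width + 1, i + width))
    return tris
```

A real implementation would typically use a point-cloud library for this step; the sketch only shows the per-pixel arithmetic the description refers to.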
Preferably, a dedicated structured light sensor may be deployed above the cutting area, pointing top-down, to collect 3D information of the cutting area.
Preferably, the area of the object to be cut is determined according to a comparison of the 3D information of the cutting area without the object and the cutting area with the object to be cut placed.
Preferably, as an alternative embodiment, the depth images of the cutting area without and with the placed object may be compared to obtain the top profile of the placed object.
Further, these two depth images may be compared pixel by pixel to obtain the top profile of the placed object.
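This pixel-by-pixel comparison can be sketched as follows; the function name and the nested-list representation of the depth images are illustrative assumptions, and the threshold is in the same units as the depth values:

```python
def top_profile(depth_empty, depth_loaded, threshold=0.1):
    """Compare the depth image of the empty cutting area against the one
    with the object placed. With a top-down sensor, pixels covered by the
    object report a *smaller* depth, so a pixel belongs to the object when
    the depth decreased by more than `threshold`."""
    mask = []
    for row_empty, row_loaded in zip(depth_empty, depth_loaded):
        mask.append([(e - l) > threshold for e, l in zip(row_empty, row_loaded)])
    return mask
```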
Preferably, the cutting strategy is formulated according to the size of the object to be cut, its thickness (width) and height, the user-defined cutting direction, and/or the number of cuts.
Preferably, as an alternative embodiment, the cutting strategy may consist of several cutting-plane (section) descriptions that guide the movement of the robot arm.
Preferably, as an alternative embodiment, the robotic arm may be coordinate calibrated prior to use.
Further, as an alternative embodiment, when initially calibrating the coordinates of the robot arm, each degree of freedom of the robot arm may be moved to a maximum angle to facilitate calibration.
Preferably, the cutting strategy may be specified according to the size of the object to be cut and a preset cutting width.
Preferably, as an alternative embodiment, the cutting width can be set by connecting a wireless terminal with the control module.
Preferably, the control module controls the robotic arm to perform the cutting according to a cutting strategy.
Preferably, as an alternative embodiment, a dedicated robot arm may be used to hold the object in place, so that the cut follows the cutting strategy more closely.
Further, as an alternative embodiment, the end of such a robot arm may be a fork or bifurcated member to facilitate holding.
Further, as an alternative embodiment, the position at which such a robot arm holds the object to be cut may be moved as the position being cut changes.
Further, as an alternative embodiment, there may be two or more such robot arms.
Preferably, as an alternative embodiment, a camera may be added above the cutting area, shooting downwards, to avoid, at least to some extent, cutting a hand (bare, gloved, or in work clothes) that enters the cutting area while the device operates.
Preferably, as an alternative embodiment, the arm responsible for cutting may carry a motor that rotates a weight-biased item (an eccentric rotor) to generate vibrations.
Further, as an alternative embodiment, such a vibrating device may be mounted on the cutting tool (the knife) itself.
Preferably, as an alternative embodiment, the robot arm responsible for cutting may instead drive a piston whose weight is biased to one side (an off-center piston), or a heavier piston, to generate vibrations in a direction parallel to the cutting edge.
Preferably, as an alternative embodiment, the robot arm may intentionally move out of the way while 3D information of the cutting area is being collected.
Preferably, as an alternative embodiment, special cutting trajectories (blade paths) may be used, for example a zigzag (sawing) motion.
Preferably, as an alternative embodiment, there may be a plurality of structured light sensors used in combination to obtain 3D information of the cutting area or the operation area.
Preferably, as an alternative embodiment, the degrees of freedom of the robot arm may be driven by gear motors or servos.
Preferably, as an alternative embodiment, a robot arm may be added exclusively to hold the object to be cut, and the end of that arm may carry a suction device (a suction port connected to an exhaust chamber with a suction fan).
Preferably, as an alternative embodiment, a table-cleaning mode may be added in which a robot arm cleans the table of the cutting area.
Preferably, the coordinate system of the control module corresponds to the coordinate system used when converting the structured light sensor's depth image into 3D information.
Preferably, as an alternative embodiment, the cutting edge of the cutting module (knife) may be a straight cutting edge.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed for the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 schematically shows a layout of a robot arm, a structured light sensor and a cutting area, in a front view.
Detailed Description
The following examples are intended to illustrate the invention but are not intended to limit the scope of the invention.
Example 1
Step 1: the structured light sensor scans the empty cutting area to obtain a depth map of the region, and the depth map is used to create 3D vertex information and/or 3D mesh information.
Step 2: the object to be cut is placed in the cutting area, the structured light sensor acquires the depth map of the cutting area again, and 3D vertex and/or mesh information is created from it. The 3D information of the cutting area with the object placed is then compared with that of the empty cutting area to determine the upper surface profile of the object to be cut.
Further, the two collected depth images can be compared pixel by pixel to determine the depth difference. Where the difference exceeds a threshold (for example, a value in the range 0.01 mm to 5 mm, such as 0.1 mm), the pixel is considered to belong to the object to be cut. The shape of this region is then measured, and the number of cuts is determined from the desired slice thickness (a width value along the horizontal direction of the local coordinate system, perpendicular to the knife's cutting direction) and the width of the object to be cut.
Step 3: for example, if the object to be cut is a rectangle about 15 cm long by about 5 cm wide, and slices about 1 cm thick are needed, the cutting strategy is to cut 14 times (15 divided by 1, minus 1), with the cuts perpendicular to the object's length.
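The cut-count arithmetic in step 3 can be sketched as follows; the function name is illustrative, and rounding up is an assumption for the case where the length is not an exact multiple of the slice thickness:

```python
import math

def plan_cuts(object_length_cm, slice_thickness_cm):
    """Number of cuts needed to divide the object into slices of the
    requested thickness: slices = ceil(length / thickness), and each
    boundary between adjacent slices is one cut, giving slices - 1."""
    slices = math.ceil(object_length_cm / slice_thickness_cm)
    return slices - 1
```

For the worked example, `plan_cuts(15, 1)` gives 14 cuts, matching the strategy in step 3.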
Step 4: when cutting, another robot arm can hold the object to be cut; the destination to which the end of this arm moves may be set a fixed distance (for example, 2 mm) below the 3D coordinates of the object's upper surface.
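The target computation for the holding arm in step 4 can be sketched as follows, assuming a coordinate frame whose z axis points up and metric units; the function name and the 2 mm default are taken from the example above:

```python
def fixing_arm_target(surface_point, clearance=0.002):
    """Destination for the end of the holding arm: the 3D coordinate of
    the object's upper surface, lowered by a fixed clearance (metres)
    so the fork presses into the object rather than hovering above it."""
    x, y, z = surface_point
    return (x, y, z - clearance)
```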
Example 2
Step 1: during initialization, the control module can read robot arm information stored in a configuration file, including the length of the blade, the coordinate range of the robot arm's local coordinate system, and the length of each segment of the arm.
Further, the length and width of the edge of the cutting area may be stored in advance and compared against the edge depth features detected in the depth image, so that the coordinate value of each depth-image pixel in the 3D horizontal plane can be calculated.
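One simple way to realize this calibration step is to recover a millimetres-per-pixel scale by matching the stored physical edge length against the same edge's length in the image; the function names and the single uniform scale factor are simplifying assumptions not spelled out in the description:

```python
def pixel_scale(known_edge_length_mm, edge_length_pixels):
    """Scale factor (mm per pixel) from a stored physical edge length of
    the cutting area and the length of that edge detected in the image."""
    return known_edge_length_mm / edge_length_pixels

def pixel_to_plane(u, v, scale_mm, origin_px=(0, 0)):
    """Map an image pixel (u, v) to coordinates in the 3D horizontal
    plane (mm), relative to a chosen origin pixel."""
    return ((u - origin_px[0]) * scale_mm, (v - origin_px[1]) * scale_mm)
```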
Step 2: the structured light sensor scans the empty cutting area to obtain a depth map of the region, and the depth map is used to create 3D vertex information and/or 3D mesh information.
Step 3: the object to be cut is placed in the cutting area, the structured light sensor acquires the depth map of the cutting area again, and 3D vertex and/or mesh information is created from it. The 3D information of the cutting area with the object placed is then compared with that of the empty cutting area to determine the upper surface profile of the object to be cut.
Further, the two collected depth images can be compared pixel by pixel to determine the depth difference. Where the difference exceeds a threshold (for example, a value in the range 0.01 mm to 2 mm, such as 0.1 mm), the pixel is considered to belong to the object to be cut. The shape of this region is then measured, and the number of cuts is determined from the desired slice thickness (a width value along the horizontal direction of the local coordinate system, perpendicular to the knife's cutting direction) and the width of the object to be cut.
Step 4: for example, if the object to be cut is a rectangle about 15 cm long by about 5 cm wide, and slices about 1 cm thick are needed, the cutting strategy is to cut 14 times (15 divided by 1, minus 1), with the cuts perpendicular to the object's length.
Step 5: when cutting, another robot arm can hold the object to be cut; the destination to which the end of this arm moves may be set a fixed distance (for example, 2 mm) below the 3D coordinates of the object's upper surface.
The above is a specific embodiment of the present invention, but the scope of the present invention should not be limited thereto. Any changes or substitutions that can be easily made by those skilled in the art within the technical scope of the present invention are included in the protection scope of the present invention, and therefore, the protection scope of the present invention is subject to the protection scope defined by the appended claims.

Claims (7)

1. An apparatus for cutting objects or food, comprising: a robot arm, a structured light sensor, and a control module;
the robot arm is a device with degrees of freedom and/or a cutting part;
the structured light sensor is a sensor used for collecting depth information of a cutting area and/or an operation area;
the control module is used for controlling the robot arm to cut the object according to the structured light sensor data and/or an instruction or cutting strategy.
2. A method of cutting an object or food, comprising:
3D modeling or/and 3D information collection is carried out on the cutting area;
after the object to be cut is placed, 3D modeling or/and 3D information collection is carried out on the cutting area again;
and (5) making a cutting strategy, and using a robot arm to cut.
3. The method of claim 2, wherein the 3D modeling and/or 3D information collection of the cutting area further comprises:
performing 3D modeling according to a depth image acquired by a structured light sensor, converting each pixel value in the depth image into a distance and a 3D coordinate to create a vertex, and connecting neighboring vertices to form triangles;
a dedicated structured light sensor may be deployed above the cutting area, pointing top-down, to collect 3D information of the cutting area.
4. The method of claim 2, wherein after placing the object to be cut, 3D modeling and/or 3D information collection is performed again on the cutting area, further comprising:
comparing the 3D information of the cutting area without the object with that of the cutting area with the object placed, to determine the region of the object to be cut;
as an alternative embodiment, the depth images of the cutting area without and with the placed object may be compared to determine the top profile of the placed object;
the cutting strategy is formulated according to the size of the object to be cut, its thickness (width) and height, the user-defined cutting direction, and/or the number of cuts.
5. The method of claim 2, wherein a cutting strategy is formulated, cutting being performed using a robotic arm, further comprising:
the cutting strategy can be specified according to the size of the object to be cut and a preset cutting width;
as an alternative embodiment, the cutting width can be set via a wireless terminal connected to the control module;
the control module controls the robot arm to cut according to the cutting strategy;
as an alternative embodiment, a dedicated robot arm may be used to hold the object in place, so that the cut follows the cutting strategy more closely.
6. A computer-readable storage medium on which a computer program and related data are stored, characterized in that the program, when executed by a processor, implements the computing functions of the invention described above.
7. An electronic device, comprising:
one or more processors;
a storage device to store one or more programs.
CN202010178657.2A 2020-03-14 2020-03-14 Method and device for cutting objects or food Pending CN113386146A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010178657.2A CN113386146A (en) 2020-03-14 2020-03-14 Method and device for cutting objects or food
PCT/CN2021/080645 WO2021185187A1 (en) 2020-03-14 2021-03-13 Method and apparatus for cutting object or food

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010178657.2A CN113386146A (en) 2020-03-14 2020-03-14 Method and device for cutting objects or food

Publications (1)

Publication Number Publication Date
CN113386146A true CN113386146A (en) 2021-09-14

Family

ID=77616315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010178657.2A Pending CN113386146A (en) 2020-03-14 2020-03-14 Method and device for cutting objects or food

Country Status (2)

Country Link
CN (1) CN113386146A (en)
WO (1) WO2021185187A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140026156A1 (en) * 2012-07-18 2014-01-23 David Deephanphongs Determining User Interest Through Detected Physical Indicia
CN106078818B (en) * 2016-06-17 2019-07-23 盐津铺子食品股份有限公司 A kind of yellow peach cuts semi-machine hand
SE544338C2 (en) * 2018-05-15 2022-04-12 Robot Grader Ab A portioning device and a method for packaging of food products
CN110299009A (en) * 2019-07-22 2019-10-01 上海工程技术大学 A kind of method and electronic equipment of the prediction short-term traffic flow based on KNN algorithm
CN110421292A (en) * 2019-08-14 2019-11-08 异起(上海)智能科技有限公司 A kind of method and apparatus to objects' contour positioning of welding robot
CN110539312A (en) * 2019-08-29 2019-12-06 南京禹智智能科技有限公司 Efficient and accurate livestock and poultry meat dividing robot
CN110602464A (en) * 2019-10-17 2019-12-20 异起(上海)智能科技有限公司 Method and device for saving image storage space during monitoring
CN110838142B (en) * 2019-11-05 2023-09-08 沈阳民航东北凯亚有限公司 Luggage size recognition method and device based on depth image

Also Published As

Publication number Publication date
WO2021185187A1 (en) 2021-09-23

Similar Documents

Publication Publication Date Title
JP6968700B2 (en) Systems, methods, and equipment for guide tools
RU2748005C2 (en) Systems, methods and device for sharing tool manufacturing and design data
JP5295828B2 (en) Object gripping system and interference detection method in the system
US20180004188A1 (en) Robot, robot control apparatus and robot system
JP5854815B2 (en) Information processing apparatus, information processing apparatus control method, and program
JP2016099257A (en) Information processing device and information processing method
JP5604647B2 (en) Fruit cutting mechanism
US20150127162A1 (en) Apparatus and method for picking up article randomly piled using robot
JP7376268B2 (en) 3D data generation device and robot control system
JP2019162684A (en) Gripping control device, gripping system, and program
JP5544464B2 (en) 3D position / posture recognition apparatus and method for an object
JP6450788B2 (en) Work removal system
CN113386146A (en) Method and device for cutting objects or food
JP7188574B2 (en) Suction pad and deformation measuring device
US20020130862A1 (en) System and method for modeling virtual object in virtual reality environment
JP2563390B2 (en) Cutting equipment for food processing
WO2021001882A1 (en) Information processing device and information processing method
JP2021088011A (en) Picking system, picking method, and program
EP2714327B1 (en) Improvements in knife sharpening methods
CN115805588A (en) Workpiece holding device, workpiece holding method, computer-readable medium, and control device
CN111496795B (en) Method and device for grabbing multilayer materials
CN117794704A (en) Robot control device, robot control system, and robot control method
JP2019124609A (en) Three-d shape auto-tracing method and measuring machine
US20200071094A1 (en) Article picking system
CN115087843A (en) Three-dimensional measuring device for generating three-dimensional point position information

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210914
