CN110919648A - Automatic picking and stacking device and method based on Raspberry Pi - Google Patents

Automatic picking and stacking device and method based on Raspberry Pi

Info

Publication number
CN110919648A
CN110919648A (application CN201911087515.9A)
Authority
CN
China
Prior art keywords
robot
image
processed product
gripper
stacking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911087515.9A
Other languages
Chinese (zh)
Inventor
解尘轩
简文娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lattice Power Jiangxi Corp
Nanchang University
Original Assignee
Lattice Power Jiangxi Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lattice Power Jiangxi Corp
Priority to CN201911087515.9A
Publication of CN110919648A
Current legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1687 Assembly, peg and hole, palletising, straight line, weaving pattern movement
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G47/00 Article or material-handling devices associated with conveyors; Methods employing such devices
    • B65G47/74 Feeding, transfer, or discharging devices of particular kinds or types
    • B65G47/90 Devices for picking-up and depositing articles or materials
    • B65G47/902 Devices for picking-up and depositing articles or materials provided with drive systems incorporating rotary and rectilinear movements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes

Abstract

The invention discloses an automatic picking and stacking device and a picking and stacking method based on a Raspberry Pi, and relates to the technical field of production and processing. The automatic picking and stacking device comprises a grabbing system and an image acquisition and processing system; the grabbing system comprises a robot and a robot gripper, and the image acquisition and processing system comprises a camera, the Raspberry Pi and a robot controller; the robot gripper is installed at the action end of the robot. The camera, the robot gripper and the Raspberry Pi work in cooperation: the camera collects image information of a processed product aligned at the same angle as the robot gripper and image information of a processed product placed in a random posture; the two images are processed and compared by an image processing algorithm; the spatial motion trajectory by which the robot gripper must be adjusted is obtained from the linear relation between actual length and pixel size; and the processed product is accurately picked up and stacked according to a fixed method. The device reduces the manual labour intensity of workpiece picking and stacking and improves picking and stacking efficiency.

Description

Automatic picking and stacking device and method based on Raspberry Pi
Technical Field
The invention relates to the technical field of production and processing, in particular to an automatic material picking and stacking device and method based on a Raspberry Pi.
Background
In most existing factories, randomly placed processed products are picked up and stacked manually. This approach is labour-intensive and inefficient, and when the processed products are high-temperature objects, manual picking and stacking can injure the operators' hands.
Disclosure of Invention
In order to solve the problems in the prior art, the invention uses a camera, a robot gripper and a Raspberry Pi in cooperation. The camera acquires image information of a processed product aligned at the same angle as the robot gripper and image information of a processed product placed in a random posture; the two images are processed and compared by an image processing algorithm; the spatial motion trajectory by which the robot gripper must be adjusted is obtained from the linear relation between actual length and pixel size; and the processed product is then accurately picked up and stacked according to a fixed method. This reduces the manual labour intensity of workpiece picking and stacking and improves picking and stacking efficiency.
The invention specifically adopts the following technical scheme:
an automatic picking and stacking device based on a Raspberry Pi comprises a grabbing system and an image acquisition and processing system;
the grabbing system comprises a robot and a robot gripper, the robot gripper is mounted at the action end of the robot, and the robot is fixed on a robot mounting table;
the image acquisition and processing system comprises a camera, a Raspberry Pi and a robot controller, wherein the camera is connected with the Raspberry Pi, and the Raspberry Pi is connected with the robot controller;
the robot gripper is a clamping plate gripper, and the camera is fixed to the top end of the interior of the robot gripper.
A picking and stacking method using the automatic picking and stacking device based on a Raspberry Pi comprises the following steps:
Step 1: before picking and stacking, with the robot gripper at its initial position, a processed product is placed below the robot gripper at the same angle as the gripper, and image data of the processed product is acquired through the camera and stored in the Raspberry Pi as standard processed-product image data;
Step 2: picking begins; image data of the processed product to be picked and stacked below the robot gripper is acquired in real time through the camera and transmitted to the Raspberry Pi;
Step 3: the Raspberry Pi receives the real-time processed-product image data, obtains the spatial motion trajectory data of the robot gripper through an image processing algorithm, and generates a data signal;
Step 4: the Raspberry Pi sends the spatial motion trajectory data signal to the robot controller, and after receiving the signal the robot controller sends a grabbing command to the robot;
Step 5: after the robot receives the grabbing command, it adjusts its posture so that the robot gripper moves to the edge of the processed product and grabs it, completing the automatic picking of the processed product;
Step 6: the robot controller sends a return command and the robot returns to the initial position;
Step 7: the robot controller sends a preset stacking command; the robot rotates 90 degrees to the right, moves above the stacking position, the robot gripper descends, and the processed product is placed at the stacking position;
Step 8: after the processed product is placed at the stacking position, the robot controller sends a return command and the robot returns to the initial position;
Step 9: steps 2 to 8 are repeated to pick up and stack processed products continuously.
Further, the spatial motion trajectory data comprises the lateral and vertical movement distances and the rotation angle of the robot gripper.
Further, the image processing algorithm comprises the following steps:
s1: gaussian filtering to smooth the image;
s2: carrying out image graying processing;
s3: carrying out Canny algorithm-based edge detection on the image to obtain all the contours of the image;
s4: screening all detected contours and selecting the contour with the most distinct features; if no sufficiently distinct contour features exist, repeating S1-S3;
s5: carrying out image binarization on the selected contour and applying the linear conversion between actual length and pixel size to obtain the lateral and vertical movement distances and the rotation angle required of the robot.
Further, the Canny algorithm adopted in S3 for edge detection specifically comprises the following steps:
a: finding the intensity gradient of each pixel point of the image;
b: applying non-maximum suppression to eliminate spurious edge responses;
c: applying a double-threshold method to determine potential boundaries;
d: tracking the boundaries using hysteresis.
The invention has the beneficial effects that:
by adopting the matching of the camera, the robot gripper and the raspberry group, the camera acquires the image information of the processed product with the same angle as the robot gripper and the image information of the processed product when the processed product is placed in a random posture, the image information of the processed product and the image information of the processed product are compared after the images are processed by an image processing algorithm, the spatial motion track of the robot gripper to be adjusted is obtained through the linear relation between the actual length and the pixel point size, and the processed product is accurately picked up and stacked according to a fixed method. Reduce the manual operation intensity that the work piece picked up the pile, promote the efficiency of picking up and piling up.
Drawings
Fig. 1 is a schematic structural diagram of the automatic picking and stacking device based on a Raspberry Pi according to embodiment 1 of the present invention;
Fig. 2 is a flowchart of the picking and stacking method in embodiment 2 of the present invention;
Fig. 3 is a flowchart of the image processing algorithm in embodiment 2 of the present invention.
the attached drawings are marked as follows: 10-a robot; 11-a robot gripper; 12-a mounting table; 20-camera.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings.
As shown in fig. 1, embodiment 1 of the present invention discloses an automatic picking and stacking device based on a Raspberry Pi, which includes a grabbing system and an image acquisition and processing system. The grabbing system includes a robot 10 and a robot gripper 11; the robot gripper 11 is installed at the action end of the robot 10, and the robot 10 is fixed on a robot mounting table 12. The image acquisition and processing system includes a camera 20, a Raspberry Pi and a robot controller; the camera 20 is connected with the Raspberry Pi, and the Raspberry Pi is connected with the robot controller. The robot gripper 11 is a clamping plate gripper, and the camera 20 is fixed at the top end of the interior of the robot gripper 11. The camera is used for collecting image information of the processed product, and the robot controller is used for controlling the robot gripper to automatically pick up and stack the processed product.
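The patent does not give implementation details for reading the camera 20 from the Raspberry Pi; the following is only a minimal sketch of the image acquisition under the assumption that the camera appears as a standard OpenCV-readable video device, with an illustrative device index and file name.

```python
# A minimal sketch of image acquisition on the Raspberry Pi, assuming the
# camera 20 is exposed as a standard OpenCV-readable video device; the device
# index and file name are illustrative assumptions, not taken from the patent.
import cv2

def capture_reference_image(device_index=0, save_path="standard_product.png"):
    """Grab one frame of the reference product aligned with the gripper and store it."""
    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError("Camera not available on the Raspberry Pi")
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("Failed to read a frame from the camera")
    cv2.imwrite(save_path, frame)   # stored as the standard processed-product image
    return frame
```

The same routine, called once with the product aligned to the gripper, would provide the standard image used in step 1 of the method below.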
As shown in figs. 2-3, embodiment 2 of the present invention discloses a picking and stacking method based on the Raspberry Pi automatic picking and stacking device described in embodiment 1, which includes the following steps:
Step 1: before picking and stacking, with the robot gripper 11 at its initial position, a processed product is placed below the robot gripper 11 at the same angle as the gripper, and image data of the processed product is acquired through the camera 20 and stored in the Raspberry Pi as standard processed-product image data;
Step 2: picking begins; image data of the processed product to be picked and stacked below the robot gripper 11 is acquired in real time through the camera 20 and transmitted to the Raspberry Pi;
Step 3: the Raspberry Pi receives the real-time processed-product image data, obtains the spatial motion trajectory data of the robot gripper 11 through an image processing algorithm, and generates a data signal;
Step 4: the Raspberry Pi sends the spatial motion trajectory data signal to the robot controller, and after receiving the signal the robot controller sends a grabbing command to the robot 10;
Step 5: after the robot 10 receives the grabbing command, it adjusts its posture so that the robot gripper 11 moves to the edge of the processed product and grabs it, completing the automatic picking of the processed product;
Step 6: the robot controller sends a return command and the robot 10 returns to the initial position;
Step 7: the robot controller sends a preset stacking command; the robot 10 rotates 90 degrees to the right, moves above the stacking position, the robot gripper 11 descends, and the processed product is placed at the stacking position;
Step 8: after the processed product is placed at the stacking position, the robot controller sends a return command and the robot 10 returns to the initial position;
Step 9: steps 2 to 8 are repeated to pick up and stack processed products continuously.
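The interface between the Raspberry Pi and the robot controller is not specified in the patent, so the sketch below of the step 2 to step 9 loop assumes a plain TCP text link; the host address, port, the "GRAB" message format and the compute_trajectory placeholder are illustrative assumptions rather than part of the disclosed method.

```python
# A minimal sketch of the embodiment-2 picking and stacking loop (steps 2-9);
# the TCP connection, message format and compute_trajectory placeholder are
# assumptions, since the patent does not specify the controller interface.
import socket
import time
import cv2

def compute_trajectory(live_image, reference_image):
    """Placeholder for the image processing algorithm (S1-S5): returns the
    lateral/vertical offsets in millimetres and the rotation angle in degrees."""
    raise NotImplementedError  # see the image-processing sketches further below

def pick_and_stack_loop(reference_image, host="192.168.0.10", port=5000):
    cap = cv2.VideoCapture(0)
    with socket.create_connection((host, port)) as controller:
        while True:
            ok, frame = cap.read()                                      # step 2: live image
            if not ok:
                continue
            dx, dy, angle = compute_trajectory(frame, reference_image)  # step 3
            msg = f"GRAB {dx:.1f} {dy:.1f} {angle:.1f}\n"               # step 4: trajectory signal
            controller.sendall(msg.encode())
            # Steps 5-8 (grab, return, rotate 90 degrees, place, return) are carried
            # out by the robot controller; here we simply wait for its acknowledgement.
            controller.recv(64)
            time.sleep(0.5)                                             # step 9: repeat
```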
In embodiment 2, the spatial motion trajectory data includes the lateral and vertical movement distances and the rotation angle of the robot gripper 11, so that the robot gripper can pick up the processed product more accurately.
In embodiment 2, the image processing algorithm comprises the following steps:
s1: gaussian filtering to smooth the image;
s2: carrying out image graying processing;
s3: carrying out Canny algorithm-based edge detection on the image to obtain all the contours of the image;
s4: screening all detected contours and selecting the contour with the most distinct features; if no sufficiently distinct contour features exist, repeating S1-S3;
s5: carrying out image binarization on the selected contour and applying the linear conversion between actual length and pixel size to obtain the lateral and vertical movement distances and the rotation angle required of the robot.
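A minimal OpenCV sketch of steps S1 to S5 follows; the Gaussian kernel size, Canny thresholds and minimum contour area are assumed values rather than parameters given in the patent.

```python
# A minimal sketch of S1-S5; kernel size, thresholds and area cut-off are assumptions.
import cv2
import numpy as np

def extract_main_contour(image_bgr, min_area=500.0):
    blurred = cv2.GaussianBlur(image_bgr, (5, 5), 0)           # S1: Gaussian smoothing
    gray = cv2.cvtColor(blurred, cv2.COLOR_BGR2GRAY)           # S2: graying
    edges = cv2.Canny(gray, 50, 150)                           # S3: Canny edge detection
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # S4: screen the contours, keeping only sufficiently distinct (large) ones
    candidates = [c for c in contours if cv2.contourArea(c) > min_area]
    if not candidates:
        return None, None                                      # caller repeats S1-S3
    largest = max(candidates, key=cv2.contourArea)
    # S5: binary mask of the selected contour, ready for the centre-of-gravity
    # and length-to-pixel conversion described below
    mask = np.zeros(gray.shape, dtype=np.uint8)
    cv2.drawContours(mask, [largest], -1, 255, thickness=-1)
    return largest, mask
```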
The Gaussian filtering discretizes a Gaussian function, takes the Gaussian function values at the discrete points as weights, and performs a weighted average over a neighbourhood of each pixel of the acquired grey-level matrix, which effectively removes Gaussian noise. The image is then binarized; taking the weight of a white pixel as 0 and the weight of a black pixel as 1, the centre of gravity of the image is calculated as (where n denotes the number of object pixels): x = (x1 + x2 + ... + xn)/n, y = (y1 + y2 + ... + yn)/n. The centre of gravity is taken as the centre position, with coordinates (x, y). The centre-of-gravity coordinates of the standard processed product are compared with those of the actual processed product, and the linear conversion between actual length and pixel size, X1 = K·X0 and Y1 = K·Y0, gives the movement distance and rotation angle of the robot gripper.
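The centre-of-gravity comparison and the linear length-to-pixel conversion described above could be sketched as follows; the scale factor mm_per_pixel (the constant K) would come from a calibration measurement, so the value used here is an assumption, and the rotation angle (which could be obtained, for example, by comparing the orientations returned by cv2.minAreaRect for the two contours) is omitted for brevity.

```python
# A minimal sketch of the centre-of-gravity comparison; mm_per_pixel (the K in
# X1 = K*X0) is an assumed calibration value. The patent weights black pixels as 1;
# here the masks mark object pixels as non-zero, which is equivalent for the centroid.
import numpy as np

def centroid_of_object(mask):
    """Centre of gravity (x, y) of the object pixels in a binary mask."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("Empty mask: no object pixels found")
    return xs.mean(), ys.mean()          # x = (x1+...+xn)/n, y = (y1+...+yn)/n

def pixel_offset_to_mm(standard_mask, live_mask, mm_per_pixel=0.5):
    """Offset of the live product from the standard product, in millimetres."""
    x0, y0 = centroid_of_object(standard_mask)
    x1, y1 = centroid_of_object(live_mask)
    dx_mm = (x1 - x0) * mm_per_pixel     # linear conversion between pixels and length
    dy_mm = (y1 - y0) * mm_per_pixel
    return dx_mm, dy_mm
```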
In embodiment 2, the specific steps of performing edge detection by using the Canny algorithm in S3 are as follows:
a: finding the intensity gradient of each pixel point of the image;
b: applying non-maximum suppression to eliminate spurious edge responses;
c: applying a double-threshold method to determine potential boundaries;
d: tracking the boundaries using hysteresis.
Canny edge detection is a multi-stage detection algorithm through which the optimal edge features of the processed product can be found.
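For illustration only, the four stages a to d can be written out explicitly as below; in practice a single cv2.Canny call performs all of them, and the low and high thresholds are assumed values.

```python
# An explicit sketch of the four Canny stages (a-d); thresholds are assumptions.
import cv2
import numpy as np

def canny_stages(gray, low=50, high=150):
    # a: intensity gradient of every pixel (Sobel derivatives)
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.hypot(gx, gy)
    direction = (np.rad2deg(np.arctan2(gy, gx)) + 180.0) % 180.0

    # b: non-maximum suppression - keep a pixel only if its magnitude is the
    #    largest along the local gradient direction
    h, w = gray.shape
    nms = np.zeros_like(magnitude)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            d = direction[i, j]
            if d < 22.5 or d >= 157.5:
                n1, n2 = magnitude[i, j - 1], magnitude[i, j + 1]
            elif d < 67.5:
                n1, n2 = magnitude[i + 1, j - 1], magnitude[i - 1, j + 1]
            elif d < 112.5:
                n1, n2 = magnitude[i - 1, j], magnitude[i + 1, j]
            else:
                n1, n2 = magnitude[i - 1, j - 1], magnitude[i + 1, j + 1]
            if magnitude[i, j] >= n1 and magnitude[i, j] >= n2:
                nms[i, j] = magnitude[i, j]

    # c: double threshold - strong edges above `high`, weak edges between `low` and `high`
    strong = nms >= high
    weak = (nms >= low) & ~strong

    # d: hysteresis - keep weak edges only while they connect to a strong edge
    edges = strong.copy()
    changed = True
    while changed:
        changed = False
        neighbours = np.zeros_like(edges)
        neighbours[1:-1, 1:-1] = (edges[:-2, :-2] | edges[:-2, 1:-1] | edges[:-2, 2:] |
                                  edges[1:-1, :-2] | edges[1:-1, 2:] |
                                  edges[2:, :-2] | edges[2:, 1:-1] | edges[2:, 2:])
        promote = weak & neighbours & ~edges
        if promote.any():
            edges |= promote
            changed = True
    return (edges * 255).astype(np.uint8)
```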
Finally, only specific embodiments of the present invention have been described in detail above; the invention is not limited to these embodiments. Equivalent alterations, modifications and substitutions made by those skilled in the art without departing from the spirit and scope of the invention are likewise within the scope of the present invention.

Claims (5)

1. An automatic picking and stacking device based on a Raspberry Pi, comprising a grabbing system and an image acquisition and processing system, characterized in that:
the grabbing system comprises a robot (10) and a robot gripper (11), wherein the robot gripper (11) is installed at the action end of the robot (10), and the robot (10) is fixed on a robot mounting table (12);
the image acquisition and processing system comprises a camera (20), a Raspberry Pi and a robot controller, wherein the camera (20) is connected with the Raspberry Pi, and the Raspberry Pi is connected with the robot controller;
the robot gripper (11) is a clamping plate gripper, and the camera (20) is fixed to the top end of the interior of the robot gripper (11).
2. A picking and stacking method using the automatic picking and stacking device based on a Raspberry Pi according to claim 1, characterized by comprising the following steps:
Step 1: before picking and stacking, with the robot gripper (11) at its initial position, a processed product is placed below the robot gripper (11) at the same angle as the gripper, and image data of the processed product is acquired through the camera (20) and stored in the Raspberry Pi as standard processed-product image data;
Step 2: picking begins; image data of the processed product to be picked and stacked below the robot gripper (11) is acquired in real time through the camera (20) and transmitted to the Raspberry Pi;
Step 3: the Raspberry Pi receives the real-time processed-product image data, obtains the spatial motion trajectory data of the robot gripper (11) through an image processing algorithm, and generates a data signal;
Step 4: the Raspberry Pi sends the spatial motion trajectory data signal to the robot controller, and after receiving the signal the robot controller sends a grabbing command to the robot (10);
Step 5: after the robot (10) receives the grabbing command, it adjusts its posture so that the robot gripper (11) moves to the edge of the processed product and grabs it, completing the automatic picking of the processed product;
Step 6: the robot controller sends a return command and the robot (10) returns to the initial position;
Step 7: the robot controller sends a preset stacking command; the robot (10) rotates 90 degrees to the right, moves above the stacking position, the robot gripper (11) descends and places the processed product at the stacking position;
Step 8: after the processed product is placed at the stacking position, the robot controller sends a return command and the robot (10) returns to the initial position;
Step 9: steps 2 to 8 are repeated to pick up and stack processed products continuously.
3. The picking and stacking method of claim 2, wherein:
the spatial motion trajectory data comprises the lateral and vertical movement distances and the rotation angle of the robot gripper (11).
4. The picking and stacking method of claim 2, wherein:
the image processing algorithm comprises the following steps:
s1: gaussian filtering to smooth the image;
s2: carrying out image graying processing;
s3: carrying out Canny algorithm-based edge detection on the image to obtain all the contours of the image;
s4: screening all detected contours and selecting the contour with the most distinct features; if no sufficiently distinct contour features exist, repeating S1-S3;
s5: carrying out image binarization on the selected contour and applying the linear conversion between actual length and pixel size to obtain the lateral and vertical movement distances and the rotation angle required of the robot.
5. The picking and stacking method of claim 4, wherein:
the specific steps of performing edge detection with the Canny algorithm in S3 are as follows:
a: finding the intensity gradient of each pixel point of the image;
b: applying non-maximum suppression to eliminate spurious edge responses;
c: applying a double-threshold method to determine potential boundaries;
d: tracking the boundaries using hysteresis.
CN201911087515.9A 2019-11-08 2019-11-08 Automatic picking and stacking device and method based on Raspberry Pi Pending CN110919648A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911087515.9A CN110919648A (en) 2019-11-08 2019-11-08 Automatic picking and stacking device and method based on Raspberry Pi

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911087515.9A CN110919648A (en) 2019-11-08 2019-11-08 Automatic picking and stacking device and method based on Raspberry Pi

Publications (1)

Publication Number Publication Date
CN110919648A true CN110919648A (en) 2020-03-27

Family

ID=69852633

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911087515.9A Pending CN110919648A (en) 2019-11-08 2019-11-08 Automatic picking and stacking device and method based on Raspberry Pi

Country Status (1)

Country Link
CN (1) CN110919648A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108499054A (en) * 2018-04-04 2018-09-07 清华大学深圳研究生院 A kind of vehicle-mounted mechanical arm based on SLAM picks up ball system and its ball picking method
CN108555908A (en) * 2018-04-12 2018-09-21 同济大学 A kind of identification of stacking workpiece posture and pick-up method based on RGBD cameras
CN108527311A (en) * 2018-06-27 2018-09-14 佛山科学技术学院 A kind of taking care of books robot
CN109739133A (en) * 2019-01-08 2019-05-10 太原工业学院 Tomato picking robot system and its control method based on radar fix
CN109785317A (en) * 2019-01-23 2019-05-21 辽宁工业大学 The vision system of automatic stacking truss robot
CN109955265A (en) * 2019-03-08 2019-07-02 武汉理工大学 A kind of indoor range complex intelligence shell case cleaning robot
CN109954254A (en) * 2019-03-19 2019-07-02 武汉理工大学 Based on omnidirectional come the court intelligent ball collecting robot of good fortune wheel

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111891745A (en) * 2020-08-10 2020-11-06 珠海格力智能装备有限公司 Processing method and device for loading and unloading and loading and unloading system
CN111975782A (en) * 2020-08-25 2020-11-24 北京华航唯实机器人科技股份有限公司 Object placing method and device and robot

Similar Documents

Publication Publication Date Title
CN106000904B (en) A kind of house refuse Automated Sorting System
CN109279373B (en) Flexible unstacking and stacking robot system and method based on machine vision
CN108399639B (en) Rapid automatic grabbing and placing method based on deep learning
CN109785317B (en) Automatic pile up neatly truss robot's vision system
CN110580725A (en) Box sorting method and system based on RGB-D camera
WO2017015898A1 (en) Control system for robotic unstacking equipment and method for controlling robotic unstacking
CN111015662B (en) Method, system and equipment for dynamically grabbing object and method, system and equipment for dynamically grabbing garbage
CN107009391B (en) Robot grabbing method
CN110919648A (en) Automatic picking and stacking device and method based on raspberry group
CN114751153B (en) Full-angle multi-template stacking system
CN110640741A (en) Grabbing industrial robot with regular-shaped workpiece matching function
CN113666028B (en) Garbage can detecting and grabbing method based on fusion of laser radar and camera
CN113643280A (en) Plate sorting system and method based on computer vision
CN110404803B (en) Parallel robot sorting system and sorting method based on vision
CN110125036B (en) Self-recognition sorting method based on template matching
CN111169871A (en) Method for grabbing garbage can by intelligent manipulator of garbage truck and manipulator
CN114155301A (en) Robot target positioning and grabbing method based on Mask R-CNN and binocular camera
CN110980276A (en) Method for implementing automatic casting blanking by three-dimensional vision in cooperation with robot
CN113460716A (en) Remove brick anchor clamps and intelligent sign indicating number brick robot based on visual identification
CN114789452A (en) Robot grabbing method and system based on machine vision
CN109625922A (en) A kind of automatic loading and unloading system and method for intelligence manufacture
WO2020010876A1 (en) Mechanical arm control method based on least squares method for use in robot experimental teaching
CN113034526B (en) Grabbing method, grabbing device and robot
CN114055501A (en) Robot grabbing system and control method thereof
CN110533717B (en) Target grabbing method and device based on binocular vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200327)