CN110640741A - Grabbing industrial robot with regular-shaped workpiece matching function - Google Patents

Grabbing industrial robot with regular-shaped workpiece matching function

Info

Publication number
CN110640741A
CN110640741A (application CN201911019398.2A)
Authority
CN
China
Prior art keywords
workpiece
image
industrial robot
control system
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911019398.2A
Other languages
Chinese (zh)
Inventor
孙法君
Original Assignee
Ningbo Sailang Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Sailang Technology Co Ltd filed Critical Ningbo Sailang Technology Co Ltd
Publication of CN110640741A
Legal status: Pending


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes

Abstract

A grabbing industrial robot with regular-shaped workpiece matching comprises an industrial robot control system, a sensor system, an operation and control system, a network integrated control system, a visual tracking system and an actuating mechanism, the sensor system being connected with the industrial robot control system. The image recognition process for a workpiece is: step 1, gray-scale conversion; step 2, contrast equalization; step 3, median filtering of the gray-scale image; step 4, workpiece positioning; step 5, detecting the tilt angle of the rectangular region with the Hough transform and then rotating the region by interpolation according to the detected angle so that the workpiece lies horizontally within it; and step 6, shape recognition.

Description

Grabbing industrial robot with regular-shaped workpiece matching function
Technical Field
The invention belongs to the field of industrial robots, and particularly relates to a grabbing industrial robot with regular-shaped workpiece matching.
Background
As the level of industrial automation rises, industrial robots are gaining more and more importance: they can skillfully and accurately perform complex tasks that people may be unable to accomplish because of environmental or other constraints. Industrial robots are mainly applied to automobile part manufacturing and assembly, automated mechanical manufacturing, toxic chemical production, standard assembly-line operation, equipment installation in high-risk environments, operation in nuclear radiation fields, operation in extreme environments, and the like.
The prior art lacks a grabbing industrial robot that matches simple workpieces by shape.
Disclosure of Invention
To solve the technical problem of how to grab regular-shaped workpieces quickly, the invention provides a grabbing industrial robot with regular-shaped workpiece matching.
the technical scheme of the invention is as follows: a grabbing industrial robot matched with a regular-shaped workpiece comprises an industrial robot control system, a sensor system, an operation system, a network integrated control system, a visual tracking system and an execution mechanism, wherein the sensor system is connected with the industrial robot control system,
the operation and control system receives data from the sensor system and the visual tracking system through the network integrated control system and sends control instructions to the industrial robot control system, which controls the actuating mechanism; the sensor system is mounted on the actuating mechanism and monitors its pose in real time; the sensor system and the visual tracking system are also connected with the industrial robot control system to feed back the working state of the actuating mechanism in real time and to monitor the surrounding environment,
wherein the industrial robot control system comprises a teach pendant and a motion controller,
wherein the operation and control system comprises an industrial personal computer,
wherein the visual tracking system comprises an RGB camera, a laser scanner, a tracking camera and a radio frequency transmitting and receiving device,
wherein the sensor system comprises a plurality of six-axis sensors, an optical sensor, a motion sensor and a Hall current sensor,
wherein the actuating mechanism comprises a mechanical part and an electric part,
the network integrated control system searches for industrial robots present on the local area network and connects to the corresponding motion controllers; the motion sensors are started and the six-axis sensors are zeroed; the motion sensors collect the pose information of the end effector of the actuating mechanism to guide the motion of the industrial robot, and the pose information is displayed on the screen of the teach pendant in real time.
Industrial robot communication is divided into two levels: the first-level communication, between the operation and control system and the industrial robot control system, adopts serial communication or network communication; the second-level communication, among the industrial robot control system, the sensor system and the visual tracking system, adopts industrial field bus communication.
The RGB camera, the workpiece conveying line, the workpiece placing area and the actuating mechanism form the industrial robot grasping system. Image information acquired by the RGB camera in real time is processed by the industrial personal computer to obtain the shape and position of the target workpiece, and the upper computer sends control information so that the industrial robot performs the specified grabbing and placing operations.
A trigger is placed in the middle of the workpiece conveying line within the field of view of the RGB camera. When a workpiece enters the field of view and touches the trigger, the RGB camera captures an image, which is transmitted to the industrial personal computer for image processing; after the image is analyzed, the shape and grabbing position of the target workpiece are obtained and sent to the industrial robot, which then grabs at the designated position and stacks the target workpiece at different places in the workpiece placing area according to its shape. The acquired image is analyzed to identify the shape of the target workpiece and obtain its center position.
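Purely as an illustration of this capture-recognize-grab-place cycle, a minimal sketch is given below; the component objects and their methods (camera, ipc, robot, placement_areas) are hypothetical interfaces, since no software API is defined here:

```python
# A minimal sketch of one grabbing cycle; every interface below is hypothetical.
def run_grab_cycle(camera, ipc, robot, placement_areas):
    """One capture-recognize-grab-place cycle of the grasping system."""
    camera.wait_for_trigger()                  # workpiece reaches the trigger on the conveying line
    image = camera.capture()                   # RGB camera captures an image
    shape, grab_position = ipc.process(image)  # industrial personal computer extracts shape and grab position
    robot.move_to(grab_position)               # robot moves to the designated position
    robot.grab()                               # grab the target workpiece
    robot.place(placement_areas[shape])        # stack it in the placing area according to its shape
```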
The specific process of image recognition of the workpiece is as follows:
step 1, gray level conversion;
step 2, contrast equalization;
step 3, performing median filtering on the gray level image;
step 4, positioning a workpiece;
step 5, detecting the inclination angle of the rectangular area by adopting Hough transform, and then performing interpolation rotation according to the detected inclination angle to horizontally arrange the workpieces in the rectangular area;
and 6, identifying the shape.
The invention has the beneficial effects that:
(1) simple workpieces are quickly identified through a regular-shape matching algorithm;
(2) the center position of the workpiece is identified and located through image processing;
(3) the optical coupling isolation circuit greatly improves the reliability of the hardware;
(4) two-level communication enables flexible control of the industrial robot;
(5) the actuating mechanism uses a parallelogram structure, which increases the rigidity of the whole structure and improves the stability of the system.
Drawings
Fig. 1 is a block diagram of an industrial robot system of the present invention;
FIG. 2 is a mechanical block diagram of the actuator of the present invention;
fig. 3 is a schematic view of the gripping operation of an industrial robot according to the invention;
FIG. 4 is a flow chart of the shape contrast matching method for simple workpieces according to the present invention;
FIG. 5 is a flow chart of the workpiece center position acquisition process of the present invention.
Detailed Description
The invention will be further described with reference to the accompanying drawings.
A grabbing industrial robot with regular-shaped workpiece matching comprises an industrial robot control system, a sensor system, an operation and control system, a network integrated control system, a visual tracking system and an actuating mechanism, wherein the sensor system is connected with the industrial robot control system,
the operation and control system receives data from the sensor system and the visual tracking system through the network integrated control system and sends control instructions to the industrial robot control system, which controls the actuating mechanism; the sensor system is mounted on the actuating mechanism and monitors its pose in real time; the sensor system and the visual tracking system are also connected with the industrial robot control system to feed back the working state of the actuating mechanism in real time and to monitor the surrounding environment,
wherein the industrial robot control system comprises a teach pendant and a motion controller,
wherein the operation and control system comprises an industrial personal computer,
wherein the visual tracking system comprises an RGB camera, a laser scanner, a tracking camera and a radio frequency transmitting and receiving device,
wherein the sensor system comprises a plurality of six-axis sensors, an optical sensor, a motion sensor and a Hall current sensor,
wherein the actuating mechanism comprises a mechanical part and an electric part,
the network integrated control system searches for industrial robots present on the local area network and connects to the corresponding motion controllers; the motion sensors are started and the six-axis sensors are zeroed; the motion sensors collect the pose information of the end effector of the actuating mechanism to guide the motion of the industrial robot, and the pose information is displayed on the screen of the teach pendant in real time.
Industrial robot communication is divided into two levels: the first-level communication, between the operation and control system and the industrial robot control system, adopts serial communication or network communication; the second-level communication, among the industrial robot control system, the sensor system and the visual tracking system, adopts industrial field bus communication.
The mechanical part comprises a base, a connecting piece, a big arm, a small arm, a wrist, an end effector and rotary joints; rotary joints are located between the base and the connecting piece, between the connecting piece and the big arm, between the big arm and the small arm, and between the wrist and the end effector. The base is the load-bearing part and is fixed on the ground or on a support. The connecting piece supports the big arm and realizes the rotation function of the robot by rotating on the base. The big arm supports the small arm; the swing of the big arm changes the stroke of the end effector in the horizontal direction, while the pitch of the small arm changes the position of the end effector in the vertical direction. The rotary joint between the wrist and the end effector adjusts the rotation angle and the bearing position.
A rotary joint whose axis is perpendicular to the ground is connected to the joint seat of the base. The joint seat is mounted on the base and supports the big arm; the small arm and the connecting rod that keeps the wrist horizontal are mounted on the joint seat. The big arm, the small arm and the connecting rod form a parallelogram, which increases the rigidity of the whole arm, and the superposition of the serial parallelogram mechanisms makes the wrist easy to control. The wrist is a flange plate, and a vacuum chuck is attached to the flange plate according to the user's requirements.
This structure increases the rigidity of the whole arm. The interaction of the parallelograms increases the rigidity of the whole transmission system of the robot, reduces the vibration caused by starting and sudden stops, enlarges the stroke, reduces the system inertia and saves cost, while also increasing the stability of the system. Using the parallelogram principle of the transfer robot simplifies the control of the robot pose, reduces the difficulty of process control, and shortens the working cycle and the research, development and design cost of the robot.
The electric part comprises an encoder, a decoding circuit, an optical coupling isolation circuit, a permanent magnet synchronous servo motor (PMSM), a speed reducer and an intelligent power module (IPM). The Hall current sensor collects the U-phase and V-phase currents of the permanent magnet synchronous servo motor and feeds them back to the motion controller, and the encoder feeds the actual position of the motor back to the motion controller in real time through the decoding circuit. The motion controller receives the target position over the serial bus; single-axis logic control of the target position, actual position and actual current is carried out in the motion controller, and pulse-width-modulated signals are generated by the timing schedule of the vector control, passed through the optical coupling isolation circuit to the intelligent power module and converted into power control signals. The optical coupling isolation circuit completely isolates the control circuitry from the power circuitry, which greatly improves the reliability of the hardware. The intelligent power module drives the permanent magnet synchronous servo motor; the output shaft of the motor is connected to a speed reducer, the speed reducer is connected to a rotary joint of the mechanical part, and the speed reducer is controlled by the motion controller to realize fine adjustment of the motion.
The RGB camera, the workpiece conveying line, the workpiece placing area and the actuating mechanism form the industrial robot grasping system. Image information acquired by the RGB camera in real time is processed by the industrial personal computer to obtain the shape and position of the target workpiece, and the upper computer sends control information so that the industrial robot performs the specified grabbing and placing operations.
A trigger is placed in the middle of the workpiece conveying line within the field of view of the RGB camera. When a workpiece enters the field of view and touches the trigger, the RGB camera captures an image, which is transmitted to the industrial personal computer for image processing; after the image is analyzed, the shape and grabbing position of the target workpiece are obtained and sent to the industrial robot, which then grabs at the designated position and stacks the target workpiece at different places in the workpiece placing area according to its shape. The acquired image is analyzed to identify the shape of the target workpiece and obtain its center position or centroid position.
For simple workpieces with regular shapes, the workpiece shape is obtained by a contrast matching method. The specific process of image recognition of the workpiece is as follows:
step 1, gray-scale conversion,
the acquired workspace image is a 24-bit BMP bitmap in RGB format, and the color image is converted to a gray-scale image before further processing; f(i, j) denotes the gray value of pixel (i, j), R(i, j) the red component, G(i, j) the green component and B(i, j) the blue component of pixel (i, j), and the gray-scale conversion formula is:
f(i,j) = 0.299×R(i,j) + 0.587×G(i,j) + 0.114×B(i,j),
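A minimal NumPy sketch of this conversion is given below; the H × W × 3 uint8 array layout and the RGB channel order are assumptions made for illustration:

```python
import numpy as np

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image (uint8) to the gray image f(i, j)."""
    r = rgb[..., 0].astype(np.float64)   # R(i, j)
    g = rgb[..., 1].astype(np.float64)   # G(i, j)
    b = rgb[..., 2].astype(np.float64)   # B(i, j)
    f = 0.299 * r + 0.587 * g + 0.114 * b
    return f.astype(np.uint8)
```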
step 2, contrast equalization: let g(i, j), i = 1, 2, ..., M, j = 1, 2, ..., N, be the image after gray-scale conversion, where M and N are the height and width of the image in pixels and the gray values of the image range over [0, 255];
step 2.1, from the original image [f(i, j)] of size M × N, construct the 256-dimensional histogram vector h_f(t), t = 0, 1, 2, ..., 255;
step 2.2, calculate the gray-level distribution probability p_f(t) of the original image,
p_f(t) = h_f(t) / N_f,
where N_f is the total number of pixels of the image;
step 2.3, calculate the cumulative distribution probability p_a(k) of each gray value of the image,
p_a(k) = Σ_{t=0}^{k} p_f(t),
where p_a(0) = 0;
Step 2.4, histogram equalization: the pixel value of the processed image is g(i, j) = 255 × p_a(k).
In this way, the equalized gray value of each pixel can be obtained from the statistics of the original image; after equalization the contrast of the image is enhanced, interference with the workpiece region is reduced, and workpieces under uneven illumination become clearer.
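A compact NumPy sketch of steps 2.1 to 2.4 under the same assumptions (here p_a is the ordinary cumulative sum, so p_a(0) equals p_f(0)):

```python
import numpy as np

def equalize(gray: np.ndarray) -> np.ndarray:
    """Histogram equalization of a uint8 gray image following steps 2.1-2.4."""
    h_f = np.bincount(gray.ravel(), minlength=256)   # step 2.1: 256-dimensional histogram h_f(t)
    p_f = h_f / gray.size                            # step 2.2: p_f(t) = h_f(t) / N_f
    p_a = np.cumsum(p_f)                             # step 2.3: cumulative distribution p_a(k)
    lut = np.round(255 * p_a).astype(np.uint8)       # step 2.4: g(i, j) = 255 * p_a(k)
    return lut[gray]
```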
Step 3, performing median filtering on the gray level image, performing edge detection by using a Canny operator and a gradient operator, and eliminating boundary points by using corrosion operation to enable a workpiece area to be highlighted;
Median filtering effectively protects image edges while removing noise; the Canny operator is a double-threshold method that is not easily disturbed by noise and can detect true weak edges; the gradient operator can detect along a specific direction, and combining it with the Canny operator greatly reduces the amount of computation and data and effectively removes irrelevant parts of the image.
Step 4, positioning a workpiece;
step 4.1, construct a structuring element and apply a closing operation to the image, filling the workpiece area into a connected region, eliminating small regions, connecting the workpiece parts and marking the preselected regions;
step 4.2, set the preselected regions to white with pixel value 0 and the other regions to black with pixel value 1, and label each closed white block in the image, specifically:
scan the image and give the first pixel whose value is 1 the label LAB; then examine its 8-neighborhood, and if the neighboring pixel values are 1, mark those pixels with the same label LAB; continue scanning until a pixel with value 0 is reached, indicating that the region has been completely labeled; when a pixel with value 1 is encountered again, set LAB = LAB + 1, and scan the whole image in this way;
step 4.3, calculate the width-to-height ratio of each labeled region, specifically:
after all the regions are labeled, calculate the height and width of each region; the right, left, bottom and top extreme coordinates of each region are denoted W_max, W_min, H_max and H_min respectively, so that the width W = W_max - W_min and the height H = H_max - H_min;
step 4.4, region screening, specifically: use the width-to-height ratio W/H as a constraint condition and delete regions whose W/H lies outside [3, 5]; if several regions satisfy the ratio requirement, further reduce the interference regions according to the absolute width W and height H; calculate the area of the remaining regions by accumulating and averaging the number of pixels in each region, and remove connected regions smaller than 500 pixels to obtain the workpiece region;
and 4.5, obtain the specific position of the workpiece by locating the rectangular region, specifically: determine the row start and end positions and the column start and end positions of the workpiece by accumulating the gray values of the row and column pixels, and combine the row and column positions to determine the accurate position of the workpiece;
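One possible OpenCV sketch of steps 4.1 to 4.5 using connected-component statistics; the structuring-element size and the choice of keeping the largest qualifying region are assumptions:

```python
import cv2
import numpy as np

def locate_workpiece(edges: np.ndarray):
    """Steps 4.1-4.5: close, label connected regions, screen them and return the workpiece box."""
    # Step 4.1: the closing operation merges the workpiece into one connected region
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)
    # Step 4.2: label every connected block (label 0 is the background)
    n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(closed)
    best = None
    for lab in range(1, n_labels):
        x, y, w, h, area = stats[lab]
        # Steps 4.3-4.4: keep regions whose width-to-height ratio W/H lies in [3, 5]
        # and whose area is at least 500 pixels; keep the largest such region
        if h == 0 or not 3 <= w / h <= 5 or area < 500:
            continue
        if best is None or area > stats[best, cv2.CC_STAT_AREA]:
            best = lab
    if best is None:
        return None
    x, y, w, h, _ = stats[best]
    # Step 4.5: row/column start and end positions of the workpiece
    return x, y, x + w, y + h
```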
step 5, detecting the inclination angle of the rectangular area by adopting Hough transform, and then performing interpolation rotation according to the detected inclination angle to horizontally arrange the workpieces in the rectangular area;
hough transform has strong anti-interference capability on the extraction of straight lines.
And step 6, shape recognition,
step 6.1, project the workpiece shape in the vertical direction using a projection method, namely scan the workpiece column by column along the vertical direction and accumulate the number of pixels with value 1 in each column as the ordinate, the abscissa being the horizontal position in the image; the vertical projection curve has a peak-valley-peak character, and the valley positions on the projection are selected as the segmentation positions of the workpiece projection.
Step 6.2, normalization, specifically: calculate the height and width of the workpiece projection from the upper and lower boundaries obtained when the projection is extracted and the left and right boundaries obtained when it is segmented, collect statistics of the results and take the maximum value as the normalized template size; after calculation and testing the workpiece projection is normalized to 50 × 25 pixels;
step 6.3, adopt a coarse-grid feature extraction method: divide the workpiece projection evenly into 10 × 5 grids and count the number of black pixels in each grid;
step 6.4, neural network identification, which comprises the following specific processes:
step 6.4.1, inputting the characteristic sample into a BP neural network for learning, and establishing an identification model;
6.4.2, matching the established workpiece shape library with the recognized workpiece shape;
and 6.4.3, outputting a matching result.
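A sketch of steps 6.1 to 6.4 under stated assumptions: the projection-based segmentation is reduced here to a plain resize to the 50 × 25 template, and scikit-learn's MLPClassifier stands in for the BP neural network; the hidden-layer size and iteration count are arbitrary choices:

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def coarse_grid_features(workpiece_mask: np.ndarray) -> np.ndarray:
    """Steps 6.2-6.3: normalize to 50 x 25 pixels and count workpiece pixels in a 10 x 5 grid."""
    norm = cv2.resize(workpiece_mask, (50, 25), interpolation=cv2.INTER_NEAREST)
    features = []
    for gy in range(5):              # 5 rows of grid cells, each 5 pixels high
        for gx in range(10):         # 10 columns of grid cells, each 5 pixels wide
            cell = norm[gy * 5:(gy + 1) * 5, gx * 5:(gx + 1) * 5]
            features.append(np.count_nonzero(cell))   # pixels belonging to the workpiece
    return np.asarray(features, dtype=np.float64)

# Step 6.4: a BP-style network trained on feature samples from the workpiece shape library
model = MLPClassifier(hidden_layer_sizes=(30,), max_iter=2000)
# model.fit(training_features, training_shape_labels)        # step 6.4.1: learn the model
# shape = model.predict([coarse_grid_features(mask)])[0]     # steps 6.4.2-6.4.3: match and output
```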
The specific process of acquiring the center position of the workpiece by using image processing is as follows:
step 1, acquiring original images of a workpiece conveying line and a workpiece, wherein a lens optical axis of an RGB (red, green and blue) camera is parallel to the workpiece conveying line;
and 2, enhancing the image, and respectively filtering and denoising the RGB values in the original image. Filtering and denoising the original image, wherein the noise comprises equipment noise, salt and pepper noise and quantization noise, and the filtering process of the noise is shown as the following formula:
a rectangular coordinate system x-0-y is established with the center of the original image as the origin; f_R(x, y), f_G(x, y) and f_B(x, y) are the R, G and B value functions of the pixel at coordinate (x, y) in the original image, with x = 0, 1, ..., 255 and y = 0, 1, ..., 255, and F_R(x, y), F_G(x, y) and F_B(x, y) are the corresponding filtered R, G and B value functions; the filtering is performed over an N × N window, N = 3, 5, 7, ..., preferably N = 3, where P denotes the set of points made up of the pixels within the window;
filtering the R, G and B values of the pixels separately in this way suppresses useless information while preserving the color information of the original picture well;
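The window filter itself is characterized above only by its N × N window; as a hedged sketch, a median filter (a natural choice for the salt-and-pepper noise mentioned in step 2) is assumed below and applied to each channel with the preferred N = 3:

```python
import cv2
import numpy as np

def denoise_rgb(image: np.ndarray, n: int = 3) -> np.ndarray:
    """Step 2: filter the R, G and B channels separately over an n x n window (median assumed)."""
    channels = cv2.split(image)                        # split into the three color channels
    filtered = [cv2.medianBlur(c, n) for c in channels]
    return cv2.merge(filtered)                         # recombine the filtered channels
```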
and 3, segmenting the image to obtain a target image of the workpiece.
Step 3.1, converting the RGB color space to generate a new color space U1U2U3
The filtered F_R(x, y), F_G(x, y) and F_B(x, y) are transformed into the corresponding component functions of the new color space,
where U1(x, y) is a red-green correlation function, U2(x, y) is a red-blue correlation function, and U3(x, y) is a green-blue correlation function;
and 3.2, distinguish the workpiece from the workpiece conveying line.
A segmentation function G_S(x, y) is constructed for the workpiece and the workpiece conveying line, with U1(x, y) and U2(x, y) used as the judgment conditions:
wherein T_S is a segmentation threshold; the workpiece and the conveying line are separated by comparing U1(x, y) and U2(x, y) against T_S, and the segmentation threshold T_S may be a preset fixed value, e.g. T_S = 4.
And 4, denoising the image.
The image of the target workpiece is obtained by the above calculation, but some small-area noise, i.e. speckle noise, is inevitably present in the image; this speckle noise is clearly not part of the workpiece and needs to be filtered out, and the opening and closing operations of mathematical morphology are used for denoising.
step 4.1, construct a binary segmentation function G'_A(x, y): before the morphological operations, the segmentation function G_A(x, y) is binarized, giving the binary segmentation function G'_A(x, y);
step 4.2, opening operation: first apply erosion and then dilation to the binary image;
and 4.3, closing operation: first apply dilation and then erosion to the binary image;
step 4.4, generate the final workpiece target function G_F(x, y): the binary image obtained after the opening and closing operations is ANDed with the binary image formed by G'_A(x, y), and in the region where the result is 1 the values of G_A(x, y) are assigned coordinate by coordinate to form the final workpiece target function G_F(x, y);
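An OpenCV sketch of steps 4.1 to 4.4 is given below; the 3 × 3 structuring element is an assumption:

```python
import cv2
import numpy as np

def denoise_target(g_prime_a: np.ndarray, g_a: np.ndarray) -> np.ndarray:
    """Steps 4.2-4.4: opening, closing, then AND the cleaned mask back onto G_A(x, y)."""
    kernel = np.ones((3, 3), np.uint8)
    opened = cv2.morphologyEx(g_prime_a, cv2.MORPH_OPEN, kernel)   # erosion followed by dilation
    closed = cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)     # dilation followed by erosion
    mask = (closed > 0) & (g_prime_a > 0)   # AND with the binary segmentation image
    return np.where(mask, g_a, 0)           # final workpiece target function G_F(x, y)
```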
Step 5, obtain the center position (x_cen, y_cen) of the workpiece target; its purpose is to let the industrial robot locate the workpiece and thus realize grabbing.
A Gaussian filter is used to obtain the maximum response value in the image and thereby determine the center position of the target in the image; the Gaussian response function is constructed as:
g(x, y) = (1 / (2πδ²)) × exp(-(x² + y²) / (2δ²)),
wherein, δ is a scale factor and can be set according to actual conditions;
for GF(x, y) performing convolution calculation to obtain a Gaussian convolution response function:
h(x, y) = G_F(x, y) * g(x, y),
the coordinates at which h(x, y) attains its maximum value give the center position (x_cen, y_cen) of the target.
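A short sketch of step 5, using SciPy's Gaussian filter as the convolution with g(x, y); the value of the scale factor δ is an assumption:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def workpiece_center(g_f: np.ndarray, delta: float = 5.0):
    """Step 5: convolve G_F(x, y) with the Gaussian and take the position of the maximum response."""
    h = gaussian_filter(g_f.astype(np.float64), sigma=delta)   # h(x, y) = G_F(x, y) * g(x, y)
    y_cen, x_cen = np.unravel_index(np.argmax(h), h.shape)     # coordinates of the peak of h
    return int(x_cen), int(y_cen)
```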
The above-described embodiment merely represents one embodiment of the present invention and is not to be construed as limiting the scope of the present invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these all fall within the scope of the present invention.

Claims (10)

1. A grabbing industrial robot with regular-shaped workpiece matching, comprising an industrial robot control system, a sensor system, an operation and control system, a network integrated control system, a visual tracking system and an actuating mechanism, wherein the sensor system is connected with the industrial robot control system, the industrial robot control system comprises a teach pendant and a motion controller, the operation and control system comprises an industrial personal computer, the visual tracking system comprises an RGB camera, a laser scanner, a tracking camera and a radio frequency transmitting and receiving device, the sensor system comprises a plurality of six-axis sensors, an optical sensor, a motion sensor and a Hall current sensor, and the actuating mechanism comprises a mechanical part and an electric part,
the RGB camera, the workpiece conveying line, the workpiece placing area and the actuating mechanism form the industrial robot grasping system; image information acquired by the RGB camera in real time is processed by the industrial personal computer to obtain the shape and position of the target workpiece; the upper computer sends control information so that the industrial robot performs the specified grabbing and placing operations; and the acquired image is analyzed to identify the shape of the target workpiece and obtain its center position, the image recognition process of the workpiece being:
step 1, gray level conversion;
step 2, contrast equalization;
step 3, performing median filtering on the gray level image;
step 4, positioning a workpiece;
step 5, detecting the inclination angle of the rectangular area by adopting Hough transform, and then performing interpolation rotation according to the detected inclination angle to horizontally arrange the workpieces in the rectangular area;
and 6, identifying the shape.
2. A regularly shaped workpiece matched gripping industrial robot according to claim 1, characterized in that: a trigger is placed in the middle of the workpiece conveying line within the field of view of the RGB camera; when a workpiece enters the field of view and touches the trigger, the RGB camera captures an image, which is transmitted to the industrial personal computer for image processing; after the image is analyzed, the shape and grabbing position of the target workpiece are obtained and sent to the industrial robot, which then grabs at the designated position and stacks the target workpiece at different places in the workpiece placing area according to its shape.
3. A regularly shaped workpiece matching gripping industrial robot according to claim 1, characterised in that step 1 is embodied as: the acquired workspace image is a 24-bit BMP bitmap in RGB format, and the color image is converted to a gray-scale image before further processing; f(i, j) denotes the gray value of pixel (i, j), R(i, j) the red component, G(i, j) the green component and B(i, j) the blue component of pixel (i, j), and the gray-scale conversion formula is:
f(i,j) = 0.299×R(i,j) + 0.587×G(i,j) + 0.114×B(i,j).
4. A regularly shaped workpiece matching gripping industrial robot according to claim 1, characterised in that step 2 is embodied by:
let g(i, j), i = 1, 2, ..., M, j = 1, 2, ..., N, be the image after gray-scale conversion, where M and N are the height and width of the image in pixels and the gray values of the image range over [0, 255];
step 2.1, from the original image [f(i, j)] of size M × N, construct the 256-dimensional histogram vector h_f(t), t = 0, 1, 2, ..., 255;
step 2.2, calculate the gray-level distribution probability p_f(t) of the original image,
p_f(t) = h_f(t) / N_f,
wherein N_f is the total number of pixels of the image;
step 2.3, calculate the cumulative distribution probability p_a(k) of each gray value of the image,
p_a(k) = Σ_{t=0}^{k} p_f(t),
wherein p_a(0) = 0;
step 2.4, histogram equalization: the pixel value of the processed image is g(i, j) = 255 × p_a(k).
5. A regularly shaped workpiece matching gripping industrial robot according to claim 1, characterised in that step 4 is embodied by:
step 4.1, constructing a structuring element and applying a closing operation to the image, filling the workpiece area into a connected region, eliminating small regions, connecting the workpiece parts and marking the preselected regions;
step 4.2, setting the preselected regions to white with pixel value 0 and the other regions to black with pixel value 1, and labeling each closed white block in the image: scan the image and give the first pixel whose value is 1 the label LAB; then examine its 8-neighborhood, and if the neighboring pixel values are 1, mark those pixels with the same label LAB; continue scanning until a pixel with value 0 is reached, indicating that the region has been completely labeled; when a pixel with value 1 is encountered again, set LAB = LAB + 1, and scan the whole image in this way;
step 4.3, calculating the width-to-height ratio of each labeled region, specifically:
after all the regions are labeled, calculating the height and width of each region, the right, left, bottom and top extreme coordinates of each region being denoted W_max, W_min, H_max and H_min respectively, so that the width W = W_max - W_min and the height H = H_max - H_min;
step 4.4, region screening, specifically: using the width-to-height ratio W/H as a constraint condition and deleting regions whose W/H lies outside [3, 5]; if several regions satisfy the ratio requirement, further reducing the interference regions according to the absolute width W and height H; calculating the area of the remaining regions by accumulating and averaging the number of pixels in each region, and removing connected regions smaller than 500 pixels to obtain the workpiece region;
and 4.5, obtaining the specific position of the workpiece by locating the rectangular region, specifically: the row start and end positions and the column start and end positions of the workpiece are determined by accumulating the gray values of the row and column pixels, and the row and column positions are combined to determine the accurate position of the workpiece.
6. A regularly shaped workpiece matching gripping industrial robot according to claim 1, characterised in that step 6 is embodied by:
step 6.1, projecting the workpiece shape in the vertical direction using a projection method, namely scanning the workpiece column by column along the vertical direction and accumulating the number of pixels with value 1 in each column as the ordinate, the abscissa being the horizontal position in the image; the vertical projection curve has a peak-valley-peak character, and the valley positions on the projection are selected as the segmentation positions of the workpiece projection;
step 6.2, normalization, specifically: calculating the height and width of the workpiece projection from the upper and lower boundaries obtained when the projection is extracted and the left and right boundaries obtained when it is segmented, collecting statistics of the results and taking the maximum value as the normalized template size, the workpiece projection being normalized to 50 × 25 pixels after calculation and testing;
step 6.3, adopting a coarse-grid feature extraction method, dividing the workpiece projection evenly into 10 × 5 grids and counting the number of black pixels in each grid;
and 6.4, neural network recognition.
7. A regularly shaped workpiece matching gripping industrial robot according to claim 6, characterised in that step 6.4 is embodied as:
step 6.4.1, inputting the characteristic sample into a BP neural network for learning, and establishing an identification model;
6.4.2, matching the established workpiece shape library with the recognized workpiece shape;
and 6.4.3, outputting a matching result.
8. A regularly shaped workpiece matching gripping industrial robot according to claim 1, characterised in that the specific procedure for obtaining the center position of the workpiece using image processing is as follows:
step 1, acquiring an original image of the workpiece conveying line and the workpiece, the lens optical axis of the RGB camera being parallel to the workpiece conveying line;
step 2, image enhancement: the R, G and B values of the original image are filtered and denoised separately, the noise including device noise, salt-and-pepper noise and quantization noise;
step 3, image segmentation to obtain the target image of the workpiece;
step 4, image denoising;
and 5, acquiring the center position of the workpiece target.
9. A regularly shaped workpiece matched gripping industrial robot according to claim 1, characterized in that: the operation and control system receives data from the sensor system and the visual tracking system through the network integrated control system and sends control instructions to the industrial robot control system; the industrial robot control system controls the actuating mechanism; the sensor system is mounted on the actuating mechanism and monitors its pose in real time; and the sensor system and the visual tracking system are connected with the industrial robot control system to feed back the working state of the actuating mechanism in real time and to monitor the surrounding environment.
10. A regularly shaped workpiece matched gripping industrial robot according to claim 1, characterized in that: the network integrated control system searches for industrial robots present on the local area network and connects to the corresponding motion controllers; the motion sensors are started and the six-axis sensors are zeroed; the motion sensors collect the pose information of the end effector of the actuating mechanism to guide the motion of the industrial robot, and the pose information is displayed on the screen of the teach pendant in real time;
industrial robot communication is divided into two levels: the first-level communication, between the operation and control system and the industrial robot control system, adopts serial communication or network communication; the second-level communication, among the industrial robot control system, the sensor system and the visual tracking system, adopts industrial field bus communication.
CN201911019398.2A 2018-11-07 2019-10-24 Grabbing industrial robot with regular-shaped workpiece matching function Pending CN110640741A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811320522 2018-11-07
CN2018113205224 2018-11-07

Publications (1)

Publication Number Publication Date
CN110640741A (en) 2020-01-03

Family

ID=69013436

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911019398.2A Pending CN110640741A (en) 2018-11-07 2019-10-24 Grabbing industrial robot with regular-shaped workpiece matching function

Country Status (1)

Country Link
CN (1) CN110640741A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103353976A (en) * 2013-07-12 2013-10-16 江苏远洋数据股份有限公司 Visual quality inspection device of packaging machine
CN103336966A (en) * 2013-07-15 2013-10-02 山东奥泰机械有限公司 Weed image identification method applied to intelligent agricultural machine
CN108025438A (en) * 2015-09-21 2018-05-11 亚马逊技术股份有限公司 The robotic manipulator of networking
CN107738255A (en) * 2017-09-26 2018-02-27 山东工商学院 Mechanical arm and the Mechanical arm control method based on Leap Motion
CN108335396A (en) * 2018-04-15 2018-07-27 珠海市华清创新科技有限公司 A kind of artificial intelligence visitor management system with Car license recognition

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Lei: "Design and Implementation of a Six-Axis Industrial Robot Servo System Based on FPGA", China Master's Theses Full-text Database, Information Science and Technology *
Lu Guannan: "Research on a Machine-Vision-Based Industrial Robot Grasping System", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111483803A (en) * 2020-04-17 2020-08-04 湖南视比特机器人有限公司 Control method, capture system and storage medium
CN111483803B (en) * 2020-04-17 2022-03-04 湖南视比特机器人有限公司 Control method, capture system and storage medium
CN112587036A (en) * 2020-12-10 2021-04-02 苏州阿甘机器人有限公司 Unmanned sweeper based on machine vision and working method thereof
CN113145942A (en) * 2021-03-12 2021-07-23 重庆市永川区中川科技发展有限责任公司 Work control method of gear shaping machine
CN113934174A (en) * 2021-10-22 2022-01-14 中冶赛迪上海工程技术有限公司 Steel grabbing control method and system
CN114762977A (en) * 2022-05-19 2022-07-19 深圳市大族机器人有限公司 Six-axis assisting robot based on double-joint module
CN114762977B (en) * 2022-05-19 2023-01-10 深圳市大族机器人有限公司 Six-axis assisting robot based on double-joint module

Similar Documents

Publication Publication Date Title
CN110640741A (en) Grabbing industrial robot with regular-shaped workpiece matching function
CN109785317B (en) Automatic pile up neatly truss robot's vision system
CN111462154B (en) Target positioning method and device based on depth vision sensor and automatic grabbing robot
CN110666801A (en) Grabbing industrial robot for matching and positioning complex workpieces
CN102514002B (en) Monocular vision material loading and unloading robot system of numerical control lathe and method thereof
JP5869583B2 (en) Filtering method of target object image in robot system
CN110580725A (en) Box sorting method and system based on RGB-D camera
CN110480637B (en) Mechanical arm part image recognition and grabbing method based on Kinect sensor
WO2016055031A1 (en) Straight line detection and image processing method and relevant device
CN105930854A (en) Manipulator visual system
CN111923053A (en) Industrial robot object grabbing teaching system and method based on depth vision
CN112845143A (en) Household garbage classification intelligent sorting system and method
CN109839384B (en) Visual detector and detection method for detecting defects of micro vibration motor
CN112926503B (en) Automatic generation method of grabbing data set based on rectangular fitting
Djajadi et al. A model vision of sorting system application using robotic manipulator
CN107527368A (en) Three-dimensional attitude localization method and device based on Quick Response Code
CN113822810A (en) Method for positioning workpiece in three-dimensional space based on machine vision
CN110640744A (en) Industrial robot with fuzzy control of motor
CN111823224A (en) Automatic balance system of optical lens
CN110640743A (en) Monocular industrial robot with visual tracking function
CN102735690A (en) Intelligent high speed online automation detection method based on machine vision, and system thereof
CN113034526B (en) Grabbing method, grabbing device and robot
CN113715012A (en) Automatic assembly method and system for remote controller parts
CN114029941A (en) Robot grabbing method and device, electronic equipment and computer medium
Gao et al. An automatic assembling system for sealing rings based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right
Effective date of registration: 20200730
Address after: Room 1207, building 8, block B, R & D Park, Ningbo, Zhejiang Province, 315000
Applicant after: Zhang Huanhuan
Address before: 315000 room 014, floor 3, building 2, No.128, Jingyuan Road, hi tech Zone, Ningbo City, Zhejiang Province
Applicant before: NINGBO SAILANG TECHNOLOGY Co.,Ltd.
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200103