CN102645219B - Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection - Google Patents

Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection

Info

Publication number
CN102645219B
CN102645219B (application CN201210150845.XA / CN201210150845A; published as CN102645219A, CN102645219B)
Authority
CN
China
Prior art keywords
image
pixel
gray
cross
weld seam
Prior art date
Legal status
Active
Application number
CN201210150845.XA
Other languages
Chinese (zh)
Other versions
CN102645219A (en)
Inventor
Zhang Liguo
Xiao Bo
Jiao Jianbin
Gao Xueshan
Liu Lu
Zhang Lei
Current Assignee
Harbin Fenghua Co ltd China Aerospace Science & Industry Corp
Original Assignee
Harbin Fenghua Co ltd China Aerospace Science & Industry Corp
Priority date
Filing date
Publication date
Application filed by Harbin Fenghua Co ltd China Aerospace Science & Industry Corp
Priority to CN201210150845.XA
Publication of CN102645219A
Application granted
Publication of CN102645219B
Legal status: Active
Anticipated expiration tracked

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to a visual navigation system for a wall-climbing robot for weld inspection, a weld seam localization method and a weld seam offset acquisition method. The invention addresses the problem that, in sites with harsh natural environments, the prior art cannot carry out overhaul and maintenance of a tower cylinder by teach-and-playback. A charge-coupled-device (CCD) camera and a cross-shaped laser transmitter of the visual navigation system are fixed at the front end of the head of the wall-climbing robot, with the cross-shaped laser transmitter directly above the CCD camera. The laser emitted by the cross-shaped laser transmitter irradiates the surface of the welded workpiece and forms a cross-shaped light spot; the CCD camera photographs the cross-shaped light spot on the workpiece surface, and the data output of the CCD camera is connected to the data input of a computer. The visual navigation system, the weld seam localization method and the weld seam offset acquisition method disclosed by the invention are applicable to the field of tower cylinder overhaul and maintenance.

Description

Weld seam localization method and weld seam offset acquisition method for the visual navigation system of a wall-climbing robot for weld inspection
Technical field
The present invention is applicable to the field of overhaul and maintenance of wind-power tower cylinders.
Background art
In recent years the wind-power industry has flourished, the number of wind-power towers has surged, and the demand for tower cylinder overhaul and maintenance has grown accordingly. A typical wind-power tower is 50 to 100 metres high and is divided into several sections, each welded from a single rolled steel plate. Owing to the limits of the welding process and welding precision, the weld regions inevitably exhibit defects such as cold welds, pores and slag inclusions, which can easily develop into cracks during service at a wind farm. Because most wind farms are built in places with harsh natural environments such as offshore sites, valleys and mountain passes, these defects become major safety hazards during turbine operation. Overhaul and maintenance of wind-power towers has always been carried out by suspended (rope-access) work, which carries a high risk, so machines able to work under such extreme conditions are urgently needed to replace manual labour. Under these extreme conditions it is difficult to teach the robot's working process by teach-and-playback, so the robot needs a more intelligent control system in order to adapt to the working environment and carry out the related work.
Summary of the invention
In order to solve the problem that the prior art cannot carry out overhaul and maintenance of tower cylinders by teach-and-playback in places with harsh natural environments, the present invention provides a weld seam localization method and a weld seam offset acquisition method for the visual navigation system of a wall-climbing robot for weld inspection.
The visual navigation system of the wall-climbing robot for weld inspection according to the present invention comprises a charge-coupled (CCD) camera, a cross laser transmitter and a computer.
The CCD camera and the cross laser transmitter are fixed at the front end of the robot head, with the cross laser transmitter directly above the CCD camera. The laser emitted by the cross laser transmitter irradiates the surface of the welded workpiece and forms a cross-shaped light spot; the CCD camera photographs the cross-shaped light spot on the workpiece surface; the angle between the optical axis of the laser beam emitted by the cross laser transmitter and the optical axis of the CCD camera is 45°; and the data output of the CCD camera is connected to the data input of the computer.
The weld seam localization method comprises the following steps:
Step 1: the cross laser transmitter emits a red cross-shaped line laser beam; the beam irradiates the surface of the welded workpiece and forms a cross-shaped light spot; the beam is used to scan the workpiece surface, and during scanning the CCD camera collects video of the workpiece surface; proceed to step 2;
Step 2: the computer receives the video collected by the CCD camera;
Step 3: the computer processes each frame of the video as follows:
Step 3.1: divide the frame into pixels, the value of each pixel representing the red, yellow and blue color values of that image point;
Step 3.2: store the red, yellow and blue component values of the frame in three separate arrays;
Step 3.3: apply the two-dimensional maximum-entropy segmentation method to the array of red components of the frame to obtain the optimal threshold of the gray values corresponding to the red component;
Step 3.4: compare the gray value of every pixel of the frame with the optimal threshold obtained in step 3.3, set the gray values of pixels above the threshold to 255 and of pixels below the threshold to 0, and convert the frame to a binary image;
Step 3.5: perform edge detection with the Canny operator on the binary image obtained in step 3.4 to obtain the edge information of the binary image, which appears as the edge of the cross-shaped light spot in the frame;
Step 3.6: extract the edge information from step 3.5 and take the centerline of the edge of the cross-shaped light spot as the skeleton, obtaining a smooth skeleton curve one pixel wide;
Step 3.7: apply the straight-line Hough transform to the skeleton curve obtained in step 3.6 to obtain the two straight lines on either side of the arc, mark the two end points of the arc, find the arc segment, and detect the position of the weld seam in the frame.
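The per-frame processing of step 3 can be summarized with a short code sketch. The following Python/OpenCV fragment is only an illustrative approximation under stated assumptions: the patent does not name an implementation or library, Otsu's method is used here merely as a stand-in for the two-dimensional maximum-entropy threshold of step 3.3, and all function names and parameter values are assumptions.

import cv2
import numpy as np

def locate_weld_in_frame(frame_bgr):
    # Steps 3.1-3.2: take the red component of the frame (OpenCV stores frames as BGR).
    red = frame_bgr[:, :, 2]
    # Steps 3.3-3.4: threshold the red component to a binary image.
    # Otsu is only a placeholder for the two-dimensional maximum-entropy threshold.
    _, binary = cv2.threshold(red, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Step 3.5: Canny edge detection gives the edge of the cross-shaped light spot.
    edges = cv2.Canny(binary, 50, 150)
    # Step 3.6 (thinning of the edge to a one-pixel skeleton) is omitted in this sketch.
    # Step 3.7: straight-line Hough transform; the weld seam appears as the arc
    # between the two fitted straight segments of the vertical laser stripe.
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 80)
    return binary, edges, lines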
The weld seam offset acquisition method: after the weld seam has been located, extract three feature points, namely the two end points of the weld arc and the center point of the cross light. The distance between the two end points is a, and the distance from the center point to the end point adjacent to the weld seam is b. Let the measured values of a and b be a' and b'. The deviation angle is then
θ = arccos((a + b)/(a' + b')) = arccos(b'/b),   (14)
and the position offset is
w = b' − b.   (15)
When a' = a and b' = b the robot is in the normal state and travelling normally; when a' > a and b' ≤ b the robot is offset to the left; when a' > a and b' ≥ b the robot is offset to the right.
With the visual navigation system of the wall-climbing robot for weld inspection, the robot can search for the weld seam on its own; the weld seam localization method determines the weld position and tracks the weld direction; and the weld seam offset acquisition method ensures that the flaw-detection equipment always moves along the weld seam without departing from it.
Brief description of the drawings
Fig. 1 is a structural schematic of the visual navigation system of the wall-climbing robot for weld inspection; Fig. 2 is the two-dimensional histogram of gray level versus neighbourhood gray average; Fig. 3 is a schematic of the weld bead contour; Fig. 4 is a perspective view of the laser projection intersecting the weld seam in an offset state; Fig. 5 compares the offset state and the normal state of the laser projection intersecting the weld seam; Fig. 6 is the equivalent diagram of Fig. 5.
Detailed description of the embodiments
Embodiment 1: This embodiment is described with reference to Fig. 1. The visual navigation system of the wall-climbing robot for weld inspection according to this embodiment comprises a charge-coupled (CCD) camera 1, a cross laser transmitter 2 and a computer.
The CCD camera 1 and the cross laser transmitter 2 are fixed at the front end of the robot head, with the cross laser transmitter 2 directly above the CCD camera 1. The laser emitted by the cross laser transmitter 2 irradiates the surface of the welded workpiece and forms a cross-shaped light spot; the CCD camera 1 photographs the cross-shaped light spot on the workpiece surface; the angle between the optical axis of the laser beam emitted by the cross laser transmitter 2 and the optical axis of the CCD camera 1 is 45°; and the data output of the CCD camera 1 is connected to the data input of the computer.
Embodiment 2: The weld seam localization method for the visual navigation system of the wall-climbing robot for weld inspection according to Embodiment 1 comprises the following steps:
Step 1: the cross laser transmitter 2 emits a red cross-shaped line laser beam; the beam irradiates the surface of the welded workpiece and forms a cross-shaped light spot; the beam is used to scan the workpiece surface, and during scanning the CCD camera 1 collects video of the workpiece surface; proceed to step 2;
Step 2: the computer receives the video collected by the CCD camera;
Step 3: the computer processes each frame of the video as follows:
Step 3.1: divide the frame into pixels, the value of each pixel representing the red, yellow and blue color values of that image point;
Step 3.2: store the red, yellow and blue component values of the frame in three separate arrays;
Step 3.3: apply the two-dimensional maximum-entropy segmentation method to the array of red components of the frame to obtain the optimal threshold of the gray values corresponding to the red component;
Step 3.4: compare the gray value of every pixel of the frame with the optimal threshold obtained in step 3.3, set the gray values of pixels above the threshold to 255 and of pixels below the threshold to 0, and convert the frame to a binary image;
Step 3.5: perform edge detection with the Canny operator on the binary image obtained in step 3.4 to obtain the edge information of the binary image, which appears as the edge of the cross-shaped light spot in the frame;
Step 3.6: extract the edge information from step 3.5 and take the centerline of the edge of the cross-shaped light spot as the skeleton, obtaining a smooth skeleton curve one pixel wide;
Step 3.7: apply the straight-line Hough transform to the skeleton curve obtained in step 3.6 to obtain the two straight lines on either side of the arc, mark the two end points of the arc, find the arc segment, and detect the position of the weld seam in the frame.
The concrete steps of the Hough transform in step 3.7 for obtaining the two straight lines on either side of the arc are:
Step 3.7.1: quantize the (ρ, θ) space: from the skeleton curve ρ = x cos θ + y sin θ obtain the two-dimensional accumulator matrix A(ρ, θ), initialized to all zeros;
Step 3.7.2: for each pixel coordinate (x, y) with non-zero gray value, compute the corresponding ρ for each quantized value of θ and increment the accumulator, a(i, j) + 1 → a(i, j), where a(i, j) is the element in row i, column j of A(ρ, θ);
Step 3.7.3: after all non-zero-gray pixel coordinates (x, y) have been processed, analyse A(ρ, θ): if A(ρ, θ) is greater than a threshold T, a line segment exists whose fitting parameters are (ρ, θ); T is a non-negative integer determined by prior knowledge of the scene in the image;
Step 3.7.4: determine the line segments in the image jointly from (ρ, θ) and (x, y), and connect broken portions to obtain straight-line segments.
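A compact sketch of the accumulator procedure of steps 3.7.1 to 3.7.4 is given below in Python/NumPy. Only the parameterization ρ = x·cos θ + y·sin θ and the increment a(i, j) + 1 → a(i, j) come from the text; the quantization step, the threshold value and all names are illustrative assumptions, and the final connection of broken segments (step 3.7.4) is omitted.

import numpy as np

def hough_line_peaks(skeleton, theta_bins=180, T=50):
    # skeleton: binary image whose non-zero pixels lie on the one-pixel-wide skeleton curve.
    ys, xs = np.nonzero(skeleton)
    diag = int(np.ceil(np.hypot(*skeleton.shape)))
    thetas = np.linspace(0.0, np.pi, theta_bins, endpoint=False)
    # Step 3.7.1: quantize (rho, theta) space into the accumulator A(rho, theta), initially all zeros.
    A = np.zeros((2 * diag + 1, theta_bins), dtype=np.int32)
    # Step 3.7.2: for every non-zero pixel (x, y), compute rho for each quantized theta
    # and increment the accumulator: a(i, j) + 1 -> a(i, j).
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        A[rhos + diag, np.arange(theta_bins)] += 1
    # Step 3.7.3: cells above the threshold T correspond to fitted line segments (rho, theta).
    peaks = np.argwhere(A > T)
    return [(int(r) - diag, float(thetas[j])) for r, j in peaks]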
Embodiment 3: This embodiment is described with reference to Fig. 2. It differs from Embodiment 2 in that the method of obtaining the optimal threshold of the gray values corresponding to the red component in step 3.3 is as follows:
The four pixels immediately above, below, left and right of a pixel, together with its four diagonal neighbours, form the 8-pixel neighbourhood of that pixel. Using the two-dimensional maximum-entropy segmentation method, build the two-dimensional histogram of pixel gray value versus neighbourhood gray average, and obtain the optimal threshold f(x, y) of the gray values corresponding to the red component by maximizing the two-dimensional entropy,
where (x, y) is the pixel coordinate.
Regions A and B of the two-dimensional histogram, distributed along the diagonal, represent the target and the background respectively, while regions C and D, away from the diagonal, represent edges and noise. Determining the optimal threshold in regions A and B with the point-gray / region-gray-average two-dimensional maximum-entropy method maximizes the information that truly represents the target and the background. The two-dimensional entropy is
H = −Σ_i Σ_j p_{i,j} log₂ p_{i,j},   (1)
the discriminant function of the entropy is
f(x₀, y₀) = log₂[P_A(1 − P_A)] + H_A/P_A + (H_L − H_A)/(1 − P_A),   (2)
and the optimal threshold is
f(x, y) = max{f(x₀, y₀)},   (3)
where
P_A = Σ_{i=1}^{s} Σ_{j=1}^{t} p_{i,j}   (4)
H_A = −Σ_{i=1}^{s} Σ_{j=1}^{t} p_{i,j} log₂ p_{i,j}   (5)
H_L = −Σ_{i=1}^{L} Σ_{j=1}^{L} p_{i,j} log₂ p_{i,j}   (6)
The number of gray levels of the image is L and the total number of pixels is N (= m × n); g_{i,j} is the number of pixels whose gray level is i and whose region gray average is j; p_{i,j} is the probability of occurrence of the gray / region-gray-average pair (i, j), i.e. p_{i,j} = g_{i,j}/N, where N (= m × n) is the total number of pixels of the image; and {p_{i,j}, i, j = 1, 2, …, L} is the two-dimensional histogram of the image with respect to gray level and region gray average.
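The following Python/NumPy fragment is a minimal sketch of the point-gray / neighbourhood-mean two-dimensional maximum-entropy threshold of equations (1)-(6), assuming an 8-bit red-channel image. The quantization to 64 levels, the use of SciPy's uniform filter (a 3×3 mean that includes the centre pixel rather than the strict 8-neighbour mean) and the exhaustive search are implementation choices of this sketch, not taken from the text.

import numpy as np
from scipy.ndimage import uniform_filter

def max_entropy_2d_threshold(red, levels=64):
    # Quantized point gray i and neighbourhood-mean gray j for every pixel.
    q = (red.astype(np.float64) / 256 * levels).astype(int)
    mean = (uniform_filter(red.astype(np.float64), size=3) / 256 * levels).astype(int)
    # Two-dimensional histogram g_{i,j} and probabilities p_{i,j} = g_{i,j} / N.
    hist, _, _ = np.histogram2d(q.ravel(), mean.ravel(),
                                bins=levels, range=[[0, levels], [0, levels]])
    p = hist / hist.sum()
    eps = 1e-12
    H_L = -np.sum(p * np.log2(p + eps))                 # total entropy, eqs. (1)/(6)
    best, best_st = -np.inf, (0, 0)
    for s in range(1, levels):
        for t in range(1, levels):
            P_A = p[:s, :t].sum()                       # eq. (4)
            if P_A <= 0.0 or P_A >= 1.0:
                continue
            H_A = -np.sum(p[:s, :t] * np.log2(p[:s, :t] + eps))   # eq. (5)
            f = np.log2(P_A * (1 - P_A)) + H_A / P_A + (H_L - H_A) / (1 - P_A)  # eq. (2)
            if f > best:                                 # eq. (3): keep the maximizing pair
                best, best_st = f, (s, t)
    return best_st                                       # (gray, neighbourhood-mean) threshold pair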
Embodiment 4: This embodiment differs from Embodiment 2 in that the concrete steps of performing edge detection on the binary image in step 3.5 are:
Step 3.5.1: convolve the thresholded image f(x, y) with a Gaussian filter:
g(x, y) = h(x, y, σ) * f(x, y)   (7)
h(x, y, σ) = (1/(2πσ²)) exp(−(x² + y²)/(2σ²))   (8)
where g(x, y) is the convolved image, h(x, y, σ) is the Gaussian filter function, σ is the standard deviation, and * denotes convolution;
Step 3.5.2: obtain the local gradient M(x, y) and the edge direction θ(x, y) of the convolved image g(x, y) by first-order finite differences:
g'_x(x, y) ≈ G_x(x, y) = [g(x+1, y) − g(x, y) + g(x+1, y+1) − g(x, y+1)]/2   (9)
g'_y(x, y) ≈ G_y(x, y) = [g(x, y+1) − g(x, y) + g(x+1, y+1) − g(x+1, y)]/2   (10)
M(x, y) = √(G_x(x, y)² + G_y(x, y)²)   (11)
θ(x, y) = arctan(G_x(x, y)/G_y(x, y))   (12)
where G_x(x, y) is the horizontal gradient of the image g(x, y), G_y(x, y) is the vertical gradient of g(x, y), M(x, y) is the local gradient of g(x, y), and θ(x, y) is the edge direction of g(x, y);
Step 3.5.3: for each pixel of the image, compare the local gradient M(x, y) with the gradients of the two pixels along the gradient line; if M(x, y) is less than either of those two pixels, set M(x, y) to zero and discard the point, since it is not an edge point; otherwise keep the point as a candidate edge point;
Step 3.5.4: choose two thresholds from the two-dimensional histogram and apply them to the image obtained by non-maximum suppression: set to 0 the gray values of pixels whose gradient is less than the smaller threshold to obtain image 1, then set to 0 the gray values of pixels whose gradient is less than the larger threshold to obtain image 2; using image 2 as the basis and image 1 as the supplement, link the edges to obtain the edge of the cross-shaped light spot in the image.
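A hedged Python/NumPy sketch of the Canny stages of steps 3.5.1-3.5.4 follows, using equations (7)-(12). The value of σ, the two thresholds and the coarse non-maximum suppression (which picks horizontal or vertical neighbours from the dominant gradient component instead of interpolating along θ) are simplifying assumptions of this sketch.

import numpy as np
from scipy.ndimage import gaussian_filter

def canny_stages(f, sigma=1.0, low=20.0, high=60.0):
    # Step 3.5.1, eqs. (7)-(8): Gaussian smoothing of the thresholded image f(x, y).
    g = gaussian_filter(f.astype(np.float64), sigma)
    # Step 3.5.2, eqs. (9)-(10): 2x2 finite-difference gradients (arrays are indexed [y, x]).
    Gx = (g[:-1, 1:] - g[:-1, :-1] + g[1:, 1:] - g[1:, :-1]) / 2.0
    Gy = (g[1:, :-1] - g[:-1, :-1] + g[1:, 1:] - g[:-1, 1:]) / 2.0
    M = np.hypot(Gx, Gy)                 # eq. (11): local gradient magnitude
    theta = np.arctan2(Gx, Gy)           # eq. (12): edge direction (not used by the coarse NMS below)
    # Step 3.5.3: non-maximum suppression, simplified to the horizontal or vertical
    # neighbours selected by the dominant gradient component.
    Mp = np.pad(M, 1)
    horiz = np.abs(Gx) >= np.abs(Gy)
    n1 = np.where(horiz, Mp[1:-1, 2:], Mp[2:, 1:-1])
    n2 = np.where(horiz, Mp[1:-1, :-2], Mp[:-2, 1:-1])
    M_nms = np.where((M >= n1) & (M >= n2), M, 0.0)
    # Step 3.5.4: double threshold. image1 keeps everything above the smaller threshold,
    # image2 only what exceeds the larger threshold; image2 is the basis, image1 the supplement.
    image1 = np.where(M_nms >= low, M_nms, 0.0)
    image2 = np.where(M_nms >= high, M_nms, 0.0)
    return image1, image2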
Embodiment 5: This embodiment differs from Embodiment 4 in that the concrete steps of linking the edge of the cross-shaped light spot in step 3.5.4 are:
Step 3.5.4.1: scan image 2; when a pixel with non-zero gray value is encountered, trace the contour line starting from it until the end point of the contour line;
Step 3.5.4.2: examine, in image 1, the 8-pixel neighbourhood of the pixel corresponding to the end position of the contour line in image 2; if a pixel with non-zero gray value exists in this 8-pixel neighbourhood, include it in image 2 as a new starting point and repeat step 3.5.4.1, until no continuation can be found in either image 1 or image 2;
Step 3.5.4.3: after completing the linking of the contour line containing a given pixel, mark that contour line as visited;
Repeat steps 3.5.4.1, 3.5.4.2 and 3.5.4.3 until no new contour line can be found in image 2.
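The linking rule of steps 3.5.4.1-3.5.4.3 can be rendered as the following Python sketch: contours traced in image 2 (strong edges) are extended through 8-neighbours found in image 1 (weak edges). The stack-based region growing used here is an equivalent simplification and an assumption of this sketch; the original traces explicit contour lines and marks them as visited.

import numpy as np

def link_edges(image1, image2):
    strong = image2 > 0
    weak = image1 > 0
    linked = strong.copy()
    stack = list(zip(*np.nonzero(strong)))            # start from every strong pixel of image 2
    h, w = strong.shape
    while stack:
        y, x = stack.pop()
        for dy in (-1, 0, 1):                          # 8-pixel neighbourhood of the contour end
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and weak[ny, nx] and not linked[ny, nx]:
                    linked[ny, nx] = True              # adopt the weak pixel into image 2
                    stack.append((ny, nx))             # and continue tracing from it
    return linked                                      # linked edge of the cross-shaped light spot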
Embodiment 6: This embodiment differs from Embodiment 2 in that the skeleton curve in step 3.6 satisfies:
ρ = x cos θ + y sin θ;   (13)
where a point (x, y) of the image space corresponds to a sinusoid in (ρ, θ) space, x is the horizontal coordinate of the pixel, y is the vertical coordinate of the pixel, ρ is the distance from the image coordinate origin to the straight line, and θ is the angle between the normal of the straight line and the x-axis.
Embodiment 7: This embodiment is described with reference to Figs. 3, 4, 5 and 6. The weld seam offset acquisition method based on the weld seam localization method of Embodiment 2 is as follows. After the weld seam has been located, extract three feature points, namely the two end points of the weld arc and the center point of the cross light. The distance between the two end points is a, and the distance from the center point to the adjacent end point is b. Let the measured values of a and b be a' and b'. The deviation angle is then
θ = arccos((a + b)/(a' + b')) = arccos(b'/b),   (14)
and the position offset is
w = b' − b.   (15)
When a' = a and b' = b the robot is in the normal state and travelling normally; when a' > a and b' ≤ b the robot is offset to the left; when a' > a and b' ≥ b the robot is offset to the right.
Fig. 5 shows the robot deviating to the right; Fig. 6 compares the offset state with the standard state, the angle θ being the deviation angle.
Whether the wall-climbing robot is tracked or wheeled, no sudden lateral displacement occurs while it crawls vertically, so the situation a' = a and b' ≥ b cannot arise instantaneously. Fig. 5 can therefore be treated as equivalent to Fig. 6, giving the deviation angle θ = arccos((a + b)/(a' + b')) = arccos(b'/b).
When the robot crawls along a horizontal weld seam, a small downward slip may occur because of gravity; in that case a' = a and b' ≥ b, and the position offset is w = b' − b.
The computed w and θ are fed back as input to the PID control section of the robot, allowing real-time correction of the running trajectory.
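A minimal sketch of the offset computation of equations (14)-(15) and its use as a feedback quantity is given below in Python. The clamping of the arccos argument, the PID gains, the incremental controller structure and the frame interval are illustrative assumptions; the patent only states that w and θ are fed to the robot's PID control section.

import math

def weld_offset(a, b, a_meas, b_meas):
    # a, b: reference distances in the normal state; a_meas, b_meas: measured a', b'.
    ratio = (a + b) / (a_meas + b_meas)
    theta = math.acos(max(-1.0, min(1.0, ratio)))   # deviation angle, eq. (14)
    w = b_meas - b                                  # position offset, eq. (15)
    return theta, w

class SimplePID:
    # Illustrative PID controller fed with w (or theta) as the error signal.
    def __init__(self, kp=1.0, ki=0.0, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt=0.04):                 # dt = 0.04 s assumes 25 fps video
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative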

Claims (6)

1. A weld seam localization method for a visual navigation system of a wall-climbing robot for weld inspection, the method being realized by a device comprising a charge-coupled (CCD) camera (1), a cross laser transmitter (2) and a computer,
wherein the CCD camera (1) and the cross laser transmitter (2) are fixed at the front end of the robot head, the cross laser transmitter (2) is directly above the CCD camera (1), the laser emitted by the cross laser transmitter (2) irradiates the surface of the welded workpiece and forms a cross-shaped light spot, the CCD camera (1) photographs the cross-shaped light spot on the workpiece surface, the angle between the optical axis of the laser beam emitted by the cross laser transmitter (2) and the optical axis of the CCD camera (1) is 45°, and the data output of the CCD camera (1) is connected to the data input of the computer,
characterized in that the weld seam localization method for the visual navigation system of the wall-climbing robot for weld inspection comprises the following steps:
Step 1: the cross laser transmitter (2) emits a red cross-shaped line laser beam; the beam irradiates the surface of the welded workpiece and forms a cross-shaped light spot; the beam is used to scan the workpiece surface, and during scanning the CCD camera (1) collects video of the workpiece surface; proceed to step 2;
Step 2: the computer receives the video collected by the CCD camera;
Step 3: the computer processes each frame of the video as follows:
Step 3.1: divide the frame into pixels, the value of each pixel representing the red, yellow and blue color values of that image point;
Step 3.2: store the red, yellow and blue component values of the frame in three separate arrays;
Step 3.3: apply the two-dimensional maximum-entropy segmentation method to the array of red components of the frame to obtain the optimal threshold of the gray values corresponding to the red component;
Step 3.4: compare the gray value of every pixel of the frame with the optimal threshold obtained in step 3.3, set the gray values of pixels above the threshold to 255 and of pixels below the threshold to 0, and convert the frame to a binary image;
Step 3.5: perform edge detection with the Canny operator on the binary image obtained in step 3.4 to obtain the edge information of the binary image, which appears as the edge of the cross-shaped light spot in the frame;
Step 3.6: extract the edge information from step 3.5 and take the centerline of the edge of the cross-shaped light spot as the skeleton, obtaining a smooth skeleton curve one pixel wide;
Step 3.7: apply the straight-line Hough transform to the skeleton curve obtained in step 3.6 to obtain the two straight lines on either side of the arc, mark the two end points of the arc, find the arc segment, and detect the position of the weld seam in the frame.
2. The weld seam localization method of the visual navigation system according to claim 1, characterized in that the method of obtaining the optimal threshold of the gray values corresponding to the red component in step 3.3 is:
The four pixels immediately above, below, left and right of a pixel, together with its four diagonal neighbours, form the 8-pixel neighbourhood of that pixel. Using the two-dimensional maximum-entropy segmentation method, build the two-dimensional histogram of pixel gray value versus neighbourhood gray average, and obtain the optimal threshold f(x, y) of the gray values corresponding to the red component by maximizing the two-dimensional entropy,
where (x, y) is the pixel coordinate.
Regions A and B of the two-dimensional histogram, distributed along the diagonal, represent the target and the background respectively, while regions C and D, away from the diagonal, represent edges and noise. Determining the optimal threshold in regions A and B with the point-gray / region-gray-average two-dimensional maximum-entropy method maximizes the information that truly represents the target and the background. The two-dimensional entropy is
H = −Σ_i Σ_j p_{i,j} log₂ p_{i,j},   (1)
the discriminant function of the entropy is
f(x₀, y₀) = log₂[P_A(1 − P_A)] + H_A/P_A + (H_L − H_A)/(1 − P_A),   (2)
and the optimal threshold is
f(x, y) = max{f(x₀, y₀)},   (3)
where
P_A = Σ_{i=1}^{s} Σ_{j=1}^{t} p_{i,j}   (4)
H_A = −Σ_{i=1}^{s} Σ_{j=1}^{t} p_{i,j} log₂ p_{i,j}   (5)
H_L = −Σ_{i=1}^{L} Σ_{j=1}^{L} p_{i,j} log₂ p_{i,j}   (6)
The number of gray levels of the image is L and the total number of pixels is N (= m × n); g_{i,j} is the number of pixels whose gray level is i and whose region gray average is j; p_{i,j} is the probability of occurrence of the gray / region-gray-average pair (i, j), i.e. p_{i,j} = g_{i,j}/N, where N (= m × n) is the total number of pixels of the image; and {p_{i,j}, i, j = 1, 2, …, L} is the two-dimensional histogram of the image with respect to gray level and region gray average.
3. The weld seam localization method of the visual navigation system according to claim 1, characterized in that the concrete steps of performing edge detection on the binary image in step 3.5 are:
Step 3.5.1: convolve the thresholded image f(x, y) with a Gaussian filter:
g(x, y) = h(x, y, σ) * f(x, y)   (7)
h(x, y, σ) = (1/(2πσ²)) exp(−(x² + y²)/(2σ²))   (8)
where g(x, y) is the convolved image, h(x, y, σ) is the Gaussian filter function, σ is the standard deviation, and * denotes convolution;
Step 3.5.2: obtain the local gradient M(x, y) and the edge direction θ(x, y) of the convolved image g(x, y) by first-order finite differences:
g'_x(x, y) ≈ G_x(x, y) = [g(x+1, y) − g(x, y) + g(x+1, y+1) − g(x, y+1)]/2   (9)
g'_y(x, y) ≈ G_y(x, y) = [g(x, y+1) − g(x, y) + g(x+1, y+1) − g(x+1, y)]/2   (10)
M(x, y) = √(G_x(x, y)² + G_y(x, y)²)   (11)
θ(x, y) = arctan(G_x(x, y)/G_y(x, y))   (12)
where G_x(x, y) is the horizontal gradient of the image g(x, y), G_y(x, y) is the vertical gradient of g(x, y), M(x, y) is the local gradient of g(x, y), and θ(x, y) is the edge direction of g(x, y);
Step 3.5.3: for each pixel of the image, compare the local gradient M(x, y) with the gradients of the two pixels along the gradient line; if M(x, y) is less than either of those two pixels, set M(x, y) to zero and discard the point, since it is not an edge point; otherwise keep the point as a candidate edge point;
Step 3.5.4: choose two thresholds from the two-dimensional histogram and apply them to the image obtained by non-maximum suppression: set to 0 the gray values of pixels whose gradient is less than the smaller threshold to obtain image 1, then set to 0 the gray values of pixels whose gradient is less than the larger threshold to obtain image 2; using image 2 as the basis and image 1 as the supplement, link the edges to obtain the edge of the cross-shaped light spot in the image.
4. The weld seam localization method of the visual navigation system according to claim 3, characterized in that the concrete steps of linking the edge of the cross-shaped light spot in step 3.5.4 are:
Step 3.5.4.1: scan image 2; when a pixel with non-zero gray value is encountered, trace the contour line starting from it until the end point of the contour line;
Step 3.5.4.2: examine, in image 1, the 8-pixel neighbourhood of the pixel corresponding to the end position of the contour line in image 2; if a pixel with non-zero gray value exists in this 8-pixel neighbourhood, include it in image 2 as a new starting point and repeat step 3.5.4.1, until no continuation can be found in either image 1 or image 2;
Step 3.5.4.3: after completing the linking of the contour line containing a given pixel, mark that contour line as visited;
Repeat steps 3.5.4.1, 3.5.4.2 and 3.5.4.3 until no new contour line can be found in image 2.
5. The weld seam localization method of the visual navigation system according to claim 1, characterized in that the skeleton curve in step 3.6 satisfies:
ρ = x cos θ + y sin θ;   (13)
where a point (x, y) of the image space corresponds to a sinusoid in (ρ, θ) space, x is the horizontal coordinate of the pixel, y is the vertical coordinate of the pixel, ρ is the distance from the image coordinate origin to the straight line, and θ is the angle between the normal of the straight line and the x-axis.
6. A weld seam offset acquisition method based on the weld seam localization method of claim 1, characterized in that, after the weld seam has been located, three feature points are extracted, namely the two end points of the weld arc and the center point of the cross light; the distance between the two end points is a, the distance from the center point to the adjacent end point is b, and the measured values of a and b are a' and b'; the deviation angle is then
θ = arccos((a + b)/(a' + b')) = arccos(b'/b),   (14)
and the position offset is
w = b' − b,   (15)
where, when a' = a and b' = b, the robot is in the normal state and travelling normally; when a' > a and b' ≤ b, the robot is offset to the left; and when a' > a and b' ≥ b, the robot is offset to the right.
CN201210150845.XA 2012-05-16 2012-05-16 Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection Active CN102645219B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210150845.XA CN102645219B (en) 2012-05-16 2012-05-16 Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210150845.XA CN102645219B (en) 2012-05-16 2012-05-16 Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection

Publications (2)

Publication Number Publication Date
CN102645219A CN102645219A (en) 2012-08-22
CN102645219B true CN102645219B (en) 2014-12-03

Family

ID=46658192

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210150845.XA Active CN102645219B (en) 2012-05-16 2012-05-16 Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection

Country Status (1)

Country Link
CN (1) CN102645219B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105988142A (en) * 2014-12-31 2016-10-05 新代科技股份有限公司 Pipe welding bead detection system and method

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103567607A (en) * 2013-11-06 2014-02-12 广东德科机器人技术与装备有限公司 Welding-seam tracking method
CN103853141A (en) * 2014-03-18 2014-06-11 航天科工哈尔滨风华有限公司 Control system of automatic detecting wall-climbing robot for fan tower cylinder welding seams
CN104121882B (en) * 2014-07-24 2017-06-13 中国石油集团渤海石油装备制造有限公司 Be in a pout detection method and the device of steel pipe seam
CN104634372B (en) * 2015-02-15 2017-03-22 易测智能科技(天津)有限公司 Terminal positioning device and positioning method for testing of mobile terminals
CN104776799B (en) * 2015-04-10 2017-06-13 清华大学 Detection device and method before the cosmetic welding of shadow feature are built using lateral light
CN106530269A (en) * 2015-09-15 2017-03-22 苏州中启维盛机器人科技有限公司 Weld detection method
CN105195888B (en) * 2015-10-09 2018-04-13 航天工程装备(苏州)有限公司 Agitating friction welds planar laser tracking compensation technique
CN105499865A (en) * 2016-01-22 2016-04-20 广西大学 Planar welding manipulator with function of automatic track seeking
CN106091936A (en) * 2016-06-01 2016-11-09 中国电子科技集团公司第四十研究所 A kind of cellophane offset detecting device based on machine vision technique and method
CN106382884A (en) * 2016-08-18 2017-02-08 广东工业大学 Point light source welding seam scanning detection method
CN106370113A (en) * 2016-11-01 2017-02-01 合肥超科电子有限公司 Water wheel offset detection device, automatic alignment device and water wheel support
CN106767401A (en) * 2016-11-26 2017-05-31 江苏瑞伯特视觉科技股份有限公司 A kind of shaft hole series part based on cross laser and machine vision determines appearance localization method
CN106695192B (en) * 2016-12-22 2018-04-10 江苏工程职业技术学院 A kind of climbing robot automatic welding control method
CN107414253B (en) * 2017-08-21 2022-09-06 河北工业大学 Welding seam tracking control device and method based on cross laser
CN108788550B (en) * 2018-06-27 2019-07-12 清华大学 Detection device, the control method and device that areola welding bead is detected using detection device
CN109058053B (en) * 2018-07-04 2020-11-03 苏州智能制造研究院有限公司 Method for measuring horizontal displacement of top end of wind driven generator tower
CN111918742B (en) * 2018-08-29 2022-04-15 深圳配天智能技术研究院有限公司 Gap detection method and system for visual welding system
CN109060262A (en) * 2018-09-27 2018-12-21 芜湖飞驰汽车零部件技术有限公司 A kind of wheel rim weld joint air-tight detection device and air-tightness detection method
CN109483018A (en) * 2018-11-06 2019-03-19 湖北书豪智能科技有限公司 The active vision bootstrap technique of weld seam in automatic welding of pipelines
CN109365998B (en) * 2018-12-24 2019-09-17 中南大学 Laser soldering device and vision positioning method based on machine vision positioning
CN109949245B (en) * 2019-03-25 2021-04-16 长沙智能驾驶研究院有限公司 Cross laser detection positioning method and device, storage medium and computer equipment
CN110310295B (en) * 2019-03-27 2021-09-14 广东技术师范学院天河学院 Weld contour extraction method and system
CN109993741B (en) * 2019-04-03 2022-10-04 南昌航空大学 Steel rail welding seam contour automatic positioning method based on K-means clustering
CN112388626B (en) * 2019-08-15 2022-04-22 广东博智林机器人有限公司 Robot-assisted navigation method
CN111044701A (en) * 2019-12-30 2020-04-21 中核武汉核电运行技术股份有限公司 Device and method for calibrating position of wall-climbing robot for spent pool inspection of nuclear power plant
CN111551565A (en) * 2020-06-19 2020-08-18 湖南恒岳重钢钢结构工程有限公司 Wind power tower cylinder weld defect detection device and method based on machine vision
CN112797895B (en) * 2020-12-24 2022-07-22 上海智殷自动化科技有限公司 Frame body positioning device based on vision and laser
CN112902876B (en) * 2021-01-14 2022-08-26 西北工业大学 Method for measuring weld deflection of spin forming curved surface member of tailor-welded blank
CN115018833B (en) * 2022-08-05 2022-11-04 山东鲁芯之光半导体制造有限公司 Processing defect detection method of semiconductor device
CN116558438B (en) * 2023-07-11 2023-09-15 湖南视觉伟业智能科技有限公司 Bottle blowing quality detection device and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1900701A (en) * 2006-07-19 2007-01-24 北京科技大学 Online detecting method and device for hot rolling strip surface fault based on laser line light source
CN101033953A (en) * 2007-02-02 2007-09-12 西安交通大学 Measurement method of planeness based on image processing and pattern recognizing
CN101718532A (en) * 2009-06-15 2010-06-02 三星重工业株式会社 Laser image module and non-contact type measurement device using same
CN101782552A (en) * 2010-02-05 2010-07-21 航天科工哈尔滨风华有限公司 Automatic on-line detecting device for welding lines of mast of wind driven generator

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3556589B2 (en) * 2000-09-20 2004-08-18 ファナック株式会社 Position and orientation recognition device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1900701A (en) * 2006-07-19 2007-01-24 北京科技大学 Online detecting method and device for hot rolling strip surface fault based on laser line light source
CN101033953A (en) * 2007-02-02 2007-09-12 西安交通大学 Measurement method of planeness based on image processing and pattern recognizing
CN101718532A (en) * 2009-06-15 2010-06-02 三星重工业株式会社 Laser image module and non-contact type measurement device using same
CN101782552A (en) * 2010-02-05 2010-07-21 航天科工哈尔滨风华有限公司 Automatic on-line detecting device for welding lines of mast of wind driven generator

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhu Zhihong, Li Jize, Peng Jinmin, Li Jun, Gao Xueshan. Research on the mobile platform of a miniature wall-climbing robot for wall-surface inspection. Journal of Mechanical Engineering, 2011, Vol. 47, No. 3 (full text). *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105988142A (en) * 2014-12-31 2016-10-05 新代科技股份有限公司 Pipe welding bead detection system and method

Also Published As

Publication number Publication date
CN102645219A (en) 2012-08-22

Similar Documents

Publication Publication Date Title
CN102645219B (en) Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection
CN108230344B (en) Automatic identification method for tunnel water leakage diseases
CN102541063B (en) Line tracking control method and line tracking control device for micro intelligent automobiles
CN110726726A (en) Quantitative detection method and system for tunnel forming quality and defects thereof
CN108445496A (en) Ranging caliberating device and method, distance-measuring equipment and distance measuring method
CN103473762B (en) A kind of method for detecting lane lines and device
KR102146451B1 (en) Apparatus and method for acquiring conversion information of coordinate system
CN103196418A (en) Measuring method of vehicle distance at curves
CN103940369A (en) Quick morphology vision measuring method in multi-laser synergic scanning mode
CN101067557A (en) Environment sensing one-eye visual navigating method adapted to self-aid moving vehicle
CN104964672A (en) Long-distance obstacle perception sensor based on line structured light
CN104330048B (en) A kind of railway snow depth measurement apparatus and method based on image
CN105678776A (en) Weld image feature point extraction method based on laser vision sensor
Sehestedt et al. Robust lane detection in urban environments
CN105488503A (en) Method for detecting circle center image coordinate of uncoded circular ring-shaped gauge point
CN102331795A (en) Method for controlling sunlight reflecting device to automatically track sun based on facula identification
CN104668738A (en) Cross type double-line laser vision sensing welding gun height real-time identification system and method
CN110889827A (en) Transmission line tower online identification and inclination detection method based on vision
CN112304954A (en) Part surface defect detection method based on line laser scanning and machine vision
CN108106617A (en) A kind of unmanned plane automatic obstacle-avoiding method
CN107014291A (en) A kind of vision positioning method of the accurate transfer platform of material
CN110322462B (en) Unmanned aerial vehicle visual landing method and system based on 5G network
CN106097423A (en) LiDAR point cloud intensity correction method based on k neighbour
CN109671059A (en) A kind of battery case image processing method and system based on OpenCV
CN110543612B (en) Card collection positioning method based on monocular vision measurement

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C53 Correction of patent for invention or patent application
CB03 Change of inventor or designer information

Inventor after: Zhang Liguo

Inventor after: Xiao Bo

Inventor after: Jiao Jianbin

Inventor after: Gao Xueshan

Inventor after: Liu Lu

Inventor after: Zhang Lei

Inventor before: Zhang Liguo

Inventor before: Xiao Bo

Inventor before: Jiao Jianbin

Inventor before: Gao Xueshan

COR Change of bibliographic data

Free format text: CORRECT: INVENTOR; FROM: ZHANG LIGUO XIAO BO JIAO JIANBIN GAO XUESHAN TO: ZHANG LIGUO XIAO BO JIAO JIANBIN GAO XUESHAN LIU LU ZHANG LEI

C14 Grant of patent or utility model
GR01 Patent grant