CN110490934A - Mixing machine vertical blade attitude detecting method based on monocular camera and robot - Google Patents
- Publication number: CN110490934A (application CN201910742194.5A)
- Authority
- CN
- China
- Prior art keywords
- blade
- axis
- rotation
- monocular camera
- robot
- Prior art date
- Legal status: Granted (status assumed by Google; not a legal conclusion)
Classifications
- G06T7/73 — Image analysis; determining position or orientation of objects or cameras using feature-based methods
- G06T2207/10004 — Image acquisition modality: still image; photographic image
Abstract
The present invention relates to a method for detecting the attitude of a vertical mixer blade using a monocular camera and a robot. The method first obtains the hand-eye calibration relationship between the robot and the monocular camera, then adjusts the robot so that the camera reaches a suitable position. After a mixing-blade image is acquired with the monocular camera, image processing and feature extraction are applied to it. The angular relationship between the vertical mixer blade and the camera is solved from the monocular imaging model and the associated geometry, and the calibrated camera/robot hand-eye relationship is then used to solve the blade's rotation angle in the robot base frame, which gives the blade attitude.
Description
Technical field
The present invention relates to the field of pose detection in high-risk explosive environments, in particular to a mixer vertical blade attitude detection method based on a monocular camera and a robot.
Background technique
The background of the invention is a space pyrotechnics workshop, a flammable and explosive high-risk environment. Attitude-recognition research on spacecraft is plentiful in the space industry, but literature on object attitude recognition in high-risk workshops is scarce. The vertical mixer blade mixes propellant in such a flammable, explosive workshop, and before a robot performs operations on it, its spatial attitude must be detected. Common vision-based attitude solutions use monocular or binocular cameras. Compared with a monocular camera, a binocular camera requires complex stereo calibration, rectification, and matching algorithms and is relatively inefficient; a binocular system is also physically larger, making it unsuitable for installation at an industrial site with limited working space or on an industrial robot end-effector. Considering that the vision sensor must also be explosion-proofed, a small, easily installed, and more stable monocular camera is chosen for detecting the vertical mixer blade attitude.
Invention patent CN201410016272.0 discloses a moving-target pose measurement method based on monocular vision at the end of a robotic arm. It first performs camera calibration and hand-eye calibration, then captures two images with the monocular camera, extracts and matches target feature points, solves the camera's rotation transformation matrix, performs three-dimensional reconstruction and scale correction on the feature points, and finally obtains the target's position and attitude relative to the camera from the reconstructed points. The method exploits the mobility of the robotic arm to acquire images of the target at two different positions; the principle is similar to a binocular system but more flexible and stable. However, the feature-point matching between the two images requires many feature points, which a target in a harsh industrial measurement environment often cannot provide. In addition, the three-dimensional reconstruction is computationally expensive and slow for targets with complex contours, so the method is unsuitable where algorithm speed matters.
Summary of the invention
Technical problems to be solved
The vertical mixer blade has a complex curved surface covered with propellant residue and cannot provide enough feature points, so blade attitude detection cannot be completed by feature-point matching and three-dimensional reconstruction. To overcome these harsh detection conditions and meet the detection requirements, a blade attitude detection method is needed that does not depend on a certain number of feature points, requires no three-dimensional reconstruction, is simple, and is highly feasible.
Technical solution
To solve the problems in the prior art, the invention proposes a mixer vertical blade attitude detection method based on a monocular camera and a robot. The method first obtains the hand-eye calibration relationship between the robot and the monocular camera, adjusts the robot so that the camera reaches a suitable position, acquires a mixing-blade image with the monocular camera, and applies image processing and feature extraction to it. The angular relationship between the vertical mixer blade and the camera is solved from the monocular imaging model and the associated geometry, and the calibrated hand-eye relationship is then used to solve the blade's rotation angle in the robot base frame, i.e., the blade attitude.
A mixer vertical blade attitude detection method based on a monocular camera and a robot, characterized by the following steps:
Step 1: Define the monocular camera image pixel frame as O_pix-X_pix-Y_pix. Install the monocular camera on the robot end-effector, calibrate its intrinsic parameters with a checkerboard calibration board, and solve the intrinsic matrix, as follows:
1a. Place the checkerboard board in the camera field of view, repeatedly vary the board pose, and acquire board images;
1b. Process the images and compute the intrinsic matrix K by Zhang's calibration method, K = [f/dx 0 u0; 0 f/dy v0; 0 0 1], where f is the camera focal length, dx and dy are the width and height of a single photosensitive cell of the sensor, and (u0, v0) is the pixel coordinate of the intersection of the optical axis with the imaging plane;
Step 2: Define the robot base frame as O_R-X_R-Y_R-Z_R, the robot flange frame as O_F-X_F-Y_F-Z_F, and the monocular camera frame as O_C-X_C-Y_C-Z_C. Using a hand-eye calibration method, calibrate the coordinate transformation matrix T_cf between the camera frame and the flange frame;
Step 3: Define the suitable detection position of the monocular camera as the pose in which the camera X_C-O_C-Y_C plane and the Y_C axis are parallel to the robot base Z_R axis. From T_cf obtained in step 2, compute the pose transformation the robot needs to bring the camera to the suitable position, and adjust the robot to move the camera there;
Step 4: With the camera at the suitable position after step 3, acquire one blade image with the monocular camera, then filter and denoise it and perform edge detection and feature extraction as follows:
4a. Run the Canny edge detector on the image, extract the pixel coordinates of the two edges of the measured blade rotation axis, and record them as the pixel point sets {x_pix1, y_pix1} and {x_pix2, y_pix2};
4b. Extract the pixel coordinate of one blade feature shoulder point and record it as (x_pix3, y_pix3);
Step 5: In the blade image, compute the pixel distance Δx1 between the two edges of the blade rotation axis and the pixel distance Δx2 between the rotation-axis centre line and the single blade feature shoulder point, as follows:
5a. By the monocular imaging model, when the blade rotation axis is imaged there are two generatrices that form its edge contour, with corresponding angles θ1 and θ2; θ1 and θ2 can be solved from the spatial position of the rotation axis and its radius R;
5b. Fit the pixel point sets {x_pix1, y_pix1} and {x_pix2, y_pix2} from step 4 by least squares to obtain the line equations l1 and l2 of the two rotation-axis edges in the image pixel frame, and from them compute the pixel distance Δx1 between the two edges, in pixels;
5c. From l1 and l2 obtained in step 5b and the shoulder-point pixel coordinate (x_pix3, y_pix3) from step 4, obtain the pixel distance Δx2' between the blade feature shoulder point and the nearer rotation-axis edge, in pixels;
5d. Let d1 and d2 be the distances from the projection of the rotation-axis centre line on the imaging plane to the projections of the two generatrices, and define i = d1/d2. By the monocular imaging model, i = d1/d2 = -cos θ2 / cos θ1, so in the image the centre line lies at pixel distances Δx1/(i+1) and iΔx1/(i+1) from the two edges. With Δx2' from step 5c, the pixel distance Δx2 between the rotation-axis centre line and the shoulder point is Δx2 = Δx2' + Δx1/(i+1), in pixels;
Step 6: Measure the distance L between the two blade feature shoulder points, and compute the distance L' between the rotation-axis centre line and a shoulder point, L' = L/2;
Step 7: Define the angle between the line joining the two blade feature shoulder points and the camera X_C-O_C-Y_C plane as the blade rotation angle θ' in the camera frame, computed as follows:
7a. Let m be the distance between the projections of the two rotation-axis contour lines on the imaging plane, corresponding to the pixel distance Δx1 from step 5, and let n be the distance between the projections of the rotation-axis centre line and the blade feature shoulder point, corresponding to Δx2 from step 5. By the monocular imaging model, m/n = Δx1/Δx2;
7b. Let γ be the angle between the line joining the camera optical centre and the blade feature shoulder point and the camera optical axis. Define D1 = R/cos(π - θ1), D2 = R/cos θ2 and T = L' cos θ' - L' sin θ' tan γ. By similar triangles, m/n = (D1 + D2)/T;
7c. By the monocular imaging model, tan γ = (u0 - x_pix3) dx / f, where f/dx and u0 are obtained in step 1 and x_pix3 in step 4;
7d. Combining steps 7a and 7b gives Δx1/Δx2 = (D1 + D2)/T = (R/(-cos θ1) + R/cos θ2)/(L' cos θ' - L' sin θ' tan γ), in which Δx1 and Δx2 are obtained in step 5, the rotation-axis radius R is known, L' is obtained in step 6, and tan γ comes from step 7c; simple trigonometric manipulation then solves for the single unknown θ';
Step 8: Define the angle between the line joining the two blade feature shoulder points and the robot base X_R-O_R-Z_R plane as the blade rotation angle θ in the robot base frame, computed as follows:
8a. Take T_cf from step 2, read the current flange-to-base transformation matrix T_fr from the robot teach pendant, and solve the current camera-to-base transformation T_cr = T_cf T_fr;
8b. T_cr can be written as [r1 r2 r3 t], where r1, r2, r3 are the rotation columns and t is the translation vector. From the current relative pose of camera and robot and the rules of coordinate-frame rotation, r1 = [-cos α, 0, sin α, 0]^T, where α is the angle between the robot base X_R axis and the camera X_C axis; α is solved by an inverse trigonometric function;
8c. From the relative positions of the frames, θ = α + θ', where θ' comes from step 7 and α from step 8b; this yields the blade rotation angle θ in the robot base frame, i.e., the blade attitude.
Beneficial effect
The proposed mixer vertical blade attitude detection method based on a monocular camera and a robot does not rely heavily on feature points of the vertical mixer blade and avoids complex three-dimensional reconstruction; it solves the blade attitude from the geometric characteristics of the blade shape and the monocular imaging model alone. The rotation-angle algorithm is computationally simple and accurate enough to meet the application requirements. The method is suitable for attitude detection of a variety of rotating rigid bodies and meets detection requirements at low cost and high precision.
Description of the drawings
Fig. 1 is the flow chart of the mixer vertical blade attitude detection based on a monocular camera and a robot;
Fig. 2 is a schematic diagram of the mixer vertical blade;
Fig. 3 is a schematic diagram of the installation environment of the blade, the robot, and the monocular camera;
Fig. 4 is a schematic diagram of the imaging of the mixer vertical blade rotation axis;
Fig. 5 is a top-view schematic of the monocular camera imaging of the mixer vertical blade;
Fig. 6 is a schematic diagram of the coordinate frames and the mixer vertical blade rotation angle;
Wherein: 1 - mixer vertical blade rotation axis; 2 - rotation-axis centre line; 3 - blade feature shoulder point; 4 - blade feature shoulder point; 5 - robot; 6 - robot flange; 7 - camera mount; 8 - monocular camera; 9 - light source; 10 - end-of-arm tooling; 11 - rotation-axis generatrix; 12 - rotation-axis generatrix; 13 - camera optical centre; 14 - camera virtual image plane (symmetric with the actual imaging plane about the optical centre); 15 - rotation-axis edge in the blade image; 16 - rotation-axis edge in the blade image; 17 - camera optical axis; 18 - vertical mixer blade rotation angle; 19 - camera frame; 20 - robot base frame.
Specific embodiment
The invention is now further described with reference to the embodiment and the drawings:
Referring to Figs. 1-6, the attitude detection method of this embodiment, based on a monocular camera and a robot, is applied to a mixer vertical blade. The blade rotation axis 1 is a cylinder located above the blade; the blade rotates about the centre line 2 of axis 1, and each time it stops, its rotation angle is uncertain. The mixing blade is a centrally symmetric rigid body, and the line joining feature shoulder points 3 and 4 passes through centre line 2. The monocular camera 8 is mounted on the end flange 6 of robot 5 through mount 7 and acquires blade images; light source 9 improves image quality, and tool 10 completes subsequent robot tasks. After the camera intrinsic matrix and the hand-eye relationship are obtained by camera calibration, the camera is brought to the suitable position by controlling the robot, and a blade image is acquired. The blade is imaged on the camera imaging plane 14; the optical axis 17 is perpendicular to plane 14, and the distance from optical centre 13 to plane 14 is the focal length. The two generatrices 11 and 12 of the rotation axis become the corresponding edges 15 and 16 of the blade in the image. The blade image is preprocessed with image processing algorithms, the pixel coordinates of the blade feature shoulder point and of the two rotation-axis edges are extracted, and the blade rotation angle 18 in the robot base frame 20 is obtained from the monocular imaging model, the associated geometry, and the hand-eye calibration relationship.
The specific steps of the method in this embodiment are given below:
Step 1. Install the monocular camera on the robot end-effector and define the pixel frame of the camera image plane as O_pix-X_pix-Y_pix. Calibrate the camera intrinsics with a checkerboard calibration board and solve the intrinsic matrix as follows:
a. Place the checkerboard board in the camera field of view, as close as possible to the blade rotation axis, and adjust the camera focus so that the checkerboard is sharp and fully inside the field of view. Repeatedly vary the board pose and acquire board images in 20 different poses;
b. Process the 20 acquired images and compute the camera intrinsic matrix K by Zhang's calibration method, K = [f/dx 0 u0; 0 f/dy v0; 0 0 1], where f is the focal length, dx and dy are the width and height of a single photosensitive cell of the sensor, and (u0, v0) is the pixel coordinate of the intersection of the optical axis with the imaging plane;
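As a minimal numeric sketch of the intrinsic model in step 1 (the calibration itself can be run with OpenCV's cv2.calibrateCamera once the 20 board images are in hand), the matrix K below uses illustrative values for f, dx, dy, u0 and v0, not the patent's calibration results:

```python
import numpy as np

# Illustrative intrinsics (assumed values): f = 8 mm focal length,
# 4 um square pixels, principal point at (640, 512).
f, dx, dy, u0, v0 = 8.0, 0.004, 0.004, 640.0, 512.0
K = np.array([[f / dx, 0.0,    u0],
              [0.0,    f / dy, v0],
              [0.0,    0.0,    1.0]])

def project(K, p_cam):
    """Pinhole projection of a camera-frame point to pixel coordinates."""
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

# A point on the optical axis lands on the principal point (u0, v0).
uv_axis = project(K, np.array([0.0, 0.0, 1.0]))
```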
Step 2. Define the robot base frame as O_R-X_R-Y_R-Z_R, the robot flange frame as O_F-X_F-Y_F-Z_F, and the camera frame as O_C-X_C-Y_C-Z_C. The robot is floor-mounted; the base frame origin is at the centre of the robot base, and at mechanical zero the base frame Z axis is parallel and opposite to gravity. Calibrate the transformation matrix T_cf between the camera frame and the flange frame by the following hand-eye calibration steps:
a. Set up a calibration object of known spatial pose and move the robot, which moves the camera; ensure the motion is not a pure translation. Obtain the pose transformation matrices T1 and T2 between the calibration object and the camera frame before and after the motion, and obtain the flange-frame pose transformation T3 across the motion from the robot controller. The spatial pose transformation relationship then yields the constraint equation T1 T_cf T3 = T2 T_cf;
b. Move the robot again, again ensuring the motion is not a pure translation, to obtain a new constraint equation; solving the two equations simultaneously yields the unknown matrix T_cf and completes the monocular camera hand-eye calibration;
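The constraint in step a can be checked numerically. The sketch below (with made-up poses, not calibration data) builds a ground-truth T_cf, a non-pure-translation flange motion T3, and camera poses T1, T2 consistent with them, and verifies T1·T_cf·T3 = T2·T_cf; a real calibration would instead solve this AX = XB system from measured poses, e.g. with cv2.calibrateHandEye:

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    T = np.eye(4)
    T[1:3, 1:3] = [[c, -s], [s, c]]
    return T

def trans(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Ground-truth hand-eye transform (illustrative values, metres/radians).
Tcf = rot_z(0.3) @ rot_x(-0.1) @ trans(0.05, 0.0, 0.12)

# A flange motion T3 that is not a pure translation, and camera poses
# T1, T2 relative to the calibration object before/after the motion.
T3 = rot_x(0.4) @ trans(0.1, -0.02, 0.0)
T1 = rot_z(-0.2) @ trans(0.0, 0.3, 0.8)
T2 = T1 @ Tcf @ T3 @ np.linalg.inv(Tcf)

# The patent's constraint T1 @ Tcf @ T3 == T2 @ Tcf then holds; with two
# such motions, Tcf is the unique solution of the resulting system.
residual = np.abs(T1 @ Tcf @ T3 - T2 @ Tcf).max()
```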
Step 3. For image acquisition, the camera imaging plane and its edges must be parallel to the blade rotation axis. The blade is mounted vertically, with its rotation-axis centre line parallel to the robot base Z_R axis, so the requirement becomes that the camera X_C-O_C-Y_C plane and the Y_C axis be parallel to the base Z_R axis. This pose is defined as the camera's suitable position, and the camera is adjusted to it as follows:
a. Step 2 gives the transformation T_cf = [R t] from the camera frame to the flange frame, where R is the rotation matrix. Assuming the rotation order from the camera frame to the flange frame is Z-Y-X, i.e. rotations βz, βy, βx about Z_C, Y_C, X_C respectively, the three Euler angles βz, βy, βx can be recovered from the rotation matrix R in T_cf, and from them the angles β1, β2, β3 between the flange-frame and camera-frame axes when the camera is at the suitable position;
b. When the camera X_C-O_C-Y_C plane and Y_C axis are parallel to the base Z_R axis, the transformation between the base frame and the camera frame is T_cr = T_cf T_fr, where T_cf comes from step 2 and the flange-to-base transformation T_fr is read directly from the robot controller. As in step a, the angles between the base-frame and camera-frame axes are solved; combined with β1, β2, β3, this gives the angles between the flange frame and the base frame at the suitable position, from which the required robot pose adjustment is computed. The robot pose is adjusted until the camera X_C-O_C-Y_C plane and Y_C axis are parallel to the base Z_R axis, at which point the camera is at the suitable position;
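The Euler-angle recovery in step a can be sketched as follows; the rotation utilities are generic, and the angle values are illustrative, not taken from the patent:

```python
import numpy as np

def rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def euler_zyx(R):
    """Invert R = Rz(bz) @ Ry(by) @ Rx(bx), valid away from by = +/-90 deg."""
    by = -np.arcsin(R[2, 0])
    bz = np.arctan2(R[1, 0], R[0, 0])
    bx = np.arctan2(R[2, 1], R[2, 2])
    return bz, by, bx

# Round trip: build R from known angles, then recover them.
bz, by, bx = 0.3, -0.5, 0.1
R = rz(bz) @ ry(by) @ rx(bx)
angles = euler_zyx(R)
```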
Step 4. With the camera at the suitable position after step 3, acquire one blade image with the monocular camera, then filter and denoise it and perform edge detection, feature extraction, and related image processing as follows:
a. Run the Canny edge detector on the image, extract the pixel coordinates of the two edges of the measured blade rotation axis, and record them as the pixel point sets {x_pix1, y_pix1} and {x_pix2, y_pix2};
b. Extract the pixel coordinate of one blade feature shoulder point and record it as (x_pix3, y_pix3);
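The edge extraction and the least-squares line fit of step 5 can be sketched with synthetic edge points standing in for the Canny output (cv2.Canny would supply the real point sets; all coordinates here are made up):

```python
import numpy as np

# Synthetic near-vertical rotation-axis edges, ~80 px apart, with noise.
rng = np.random.default_rng(0)
y = np.arange(100, 300, dtype=float)
edge1 = np.c_[600.0 + 0.01 * y + rng.normal(0, 0.2, y.size), y]
edge2 = np.c_[680.0 + 0.01 * y + rng.normal(0, 0.2, y.size), y]

def fit_x_of_y(pts):
    """Least-squares x = a*y + b, suited to a near-vertical edge."""
    A = np.c_[pts[:, 1], np.ones(len(pts))]
    (a, b), *_ = np.linalg.lstsq(A, pts[:, 0], rcond=None)
    return a, b

a1, b1 = fit_x_of_y(edge1)
a2, b2 = fit_x_of_y(edge2)

# Pixel distance between the (near-parallel) fitted edges at a height y0.
y0 = 200.0
dx1 = abs((a2 * y0 + b2) - (a1 * y0 + b1))
```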
Step 5. In the blade image, compute the pixel distance Δx1 between the two rotation-axis edges and the pixel distance Δx2 between the rotation-axis centre line and the single feature shoulder point, as follows:
a. The blade rotation axis is a cylinder. By the monocular imaging model, when it is imaged there are two generatrices that form its edge contour, with corresponding angles θ1 and θ2; S is the camera optical centre. The spatial position and diameter D of the blade are known, so after a series of coordinate transformations the coordinates of the rotation-axis centre line in the camera frame are available, and by step 3 the centre line is at this point parallel to the base Z_R axis, so all points on it share the same fixed x_c0 and z_c0. With rotation-axis radius R = D/2, θ1 and θ2 are computed by the following formulas:
When x_c0 < 0,
When x_c0 ≥ 0,
b. The two generatrices from step a correspond to the two rotation-axis edges of the blade image from step 4; the image edges are the projections of the generatrices on the imaging plane. Fit the pixel point sets {x_pix1, y_pix1} and {x_pix2, y_pix2} from step 4 by least squares to obtain the line equations l1 and l2 of the two edges in the image pixel frame, then compute the pixel distance Δx1 between the two edges, in pixels;
c. From l1 and l2 obtained in step b and the shoulder-point pixel coordinate (x_pix3, y_pix3) from step 4, obtain the pixel distance Δx2' between the blade feature shoulder point and the nearer rotation-axis edge, in pixels;
d. Let d1 and d2 be the distances from the projection of the rotation-axis centre line on the imaging plane to the projections of the two generatrices, and define i = d1/d2. By the monocular imaging model, i = d1/d2 = -cos θ2 / cos θ1. By similar triangles, in the blade image the centre line lies at pixel distances Δx1/(i+1) and iΔx1/(i+1) from the two edges; with Δx2' from step c, the pixel distance Δx2 between the centre line and the shoulder point is Δx2 = Δx2' + Δx1/(i+1), in pixels;
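A numeric sketch of step d, with illustrative generatrix angles and pixel distances (not measured values):

```python
import numpy as np

# Generatrix angles: theta1 obtuse, theta2 acute in the patent's convention.
theta1, theta2 = np.deg2rad(100.0), np.deg2rad(70.0)
i = -np.cos(theta2) / np.cos(theta1)     # i = d1/d2

dx1 = 80.0         # pixel distance between the two axis edges (step b)
dx2_prime = 150.0  # shoulder point to the nearer axis edge (step c)

# The centre line splits the axis image into dx1/(i+1) and i*dx1/(i+1),
# so the centre-line-to-shoulder distance is:
dx2 = dx2_prime + dx1 / (i + 1.0)
```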
Step 6. Measure the distance L between the two blade feature shoulder points; since the blade is centrally symmetric, the distance L' between the rotation-axis centre line and a shoulder point is L' = L/2;
Step 7. Define the angle between the line joining the two feature shoulder points and the camera X_C-O_C-Y_C plane as the blade rotation angle θ' in the camera frame, computed as follows:
a. Let m be the distance between the projections of the two rotation-axis contour lines on the imaging plane, corresponding to Δx1 from step 5, and let n be the distance between the projections of the rotation-axis centre line and the blade feature shoulder point, corresponding to Δx2 from step 5. By the monocular imaging model, m/n = Δx1/Δx2;
b. Let γ be the angle between the line joining the camera optical centre and the blade feature shoulder point and the camera optical axis. Define D1 = R/cos(π - θ1), D2 = R/cos θ2 and T = L' cos θ' - L' sin θ' tan γ. By similar triangles, m/n = (D1 + D2)/T;
c. By the camera imaging model, tan γ = (u0 - x_pix3) dx / f, where f/dx and u0 are obtained in step 1 and x_pix3 in step 4;
d. Combining steps a and b gives Δx1/Δx2 = (D1 + D2)/T = (R/(-cos θ1) + R/cos θ2)/(L' cos θ' - L' sin θ' tan γ), in which Δx1 and Δx2 come from step 5, the rotation-axis radius R is known, L' comes from step 6, and tan γ from step c; since L'(cos θ' - sin θ' tan γ) = L' cos(θ' + γ)/cos γ, simple trigonometric manipulation yields the single unknown
θ' = cos⁻¹(cos γ · (Δx2/Δx1)(-R/cos θ1 + R/cos θ2)/L') - γ;
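The closed form of step d can be checked numerically by a round trip: assume a blade angle, forward-compute the ratio Δx1/Δx2 from the similar-triangle relation, then invert (the cos γ factor comes from L'(cos θ' - sin θ' tan γ) = L' cos(θ' + γ)/cos γ). All values below are illustrative:

```python
import numpy as np

# Assumed geometry: axis radius R, half shoulder span L' (metres),
# generatrix angles theta1/theta2, shoulder viewing angle gamma.
R, Lp = 0.05, 0.30
theta1, theta2 = np.deg2rad(100.0), np.deg2rad(70.0)
gamma = np.deg2rad(5.0)
theta_true = np.deg2rad(25.0)            # blade angle to recover

# Forward: ratio dx1/dx2 = (D1 + D2) / T from step b.
D1 = R / np.cos(np.pi - theta1)
D2 = R / np.cos(theta2)
T = Lp * np.cos(theta_true) - Lp * np.sin(theta_true) * np.tan(gamma)
ratio = (D1 + D2) / T                    # stands in for measured dx1/dx2

# Inverse: theta' = arccos(cos(gamma)*(dx2/dx1)*(D1 + D2)/L') - gamma.
theta_rec = np.arccos(np.cos(gamma) * (1.0 / ratio) * (D1 + D2) / Lp) - gamma
```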
Step 8. Define the angle between the line joining the two feature shoulder points and the robot base X_R-O_R-Z_R plane as the blade rotation angle θ in the robot base frame, computed as follows:
a. Take T_cf from step 2, read the current flange-to-base transformation T_fr from the robot teach pendant, and solve the current camera-to-base transformation T_cr = T_cf T_fr;
b. T_cr can be written as [r1 r2 r3 t], where r1, r2, r3 are the rotation columns and t is the translation vector. By step 3, the camera Y_C axis is parallel to the base Z_R axis, so by the rules of Cartesian frame rotation r1 = [-cos α, 0, sin α, 0]^T, where α is the angle between the base X_R axis and the camera X_C axis; α is solved by an inverse trigonometric function;
c. From the relative positions of the frames, the rotation angle is θ = α + θ', where θ' comes from step 7 and α from step b; this yields the blade rotation angle θ in the robot base frame, i.e., the blade attitude.
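Steps b-c can be sketched numerically (the pose values are illustrative): given the first column of T_cr in the stated form, α follows from an inverse trigonometric function, and the base-frame angle is the sum θ = α + θ':

```python
import numpy as np

# First column of T_cr has the form [-cos(a), 0, sin(a), 0]^T when
# Y_C is parallel to Z_R. Build it from an assumed alpha of 40 degrees.
alpha_true = np.deg2rad(40.0)
r1 = np.array([-np.cos(alpha_true), 0.0, np.sin(alpha_true), 0.0])

# Invert with atan2 rather than a single arccos, to cover all quadrants.
alpha = np.arctan2(r1[2], -r1[0])

theta_prime = np.deg2rad(25.0)           # camera-frame blade angle, step 7
theta = alpha + theta_prime              # blade angle in the robot base frame
```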
Claims (1)
1. A mixer vertical blade attitude detection method based on a monocular camera and a robot, characterized by the following steps:
Step 1: Define the monocular camera image pixel frame as O_pix-X_pix-Y_pix. Install the monocular camera on the robot end-effector, calibrate its intrinsic parameters with a checkerboard calibration board, and solve the intrinsic matrix, as follows:
1a. Place the checkerboard board in the camera field of view, repeatedly vary the board pose, and acquire board images;
1b. Process the images and compute the intrinsic matrix K by Zhang's calibration method, K = [f/dx 0 u0; 0 f/dy v0; 0 0 1], where f is the camera focal length, dx and dy are the width and height of a single photosensitive cell of the sensor, and (u0, v0) is the pixel coordinate of the intersection of the optical axis with the imaging plane;
Step 2: Define the robot base frame as O_R-X_R-Y_R-Z_R, the robot flange frame as O_F-X_F-Y_F-Z_F, and the camera frame as O_C-X_C-Y_C-Z_C. Using a hand-eye calibration method, calibrate the coordinate transformation matrix T_cf between the camera frame and the flange frame;
Step 3: Define the suitable detection position of the camera as the pose in which the camera X_C-O_C-Y_C plane and the Y_C axis are parallel to the robot base Z_R axis. From T_cf obtained in step 2, compute the pose transformation the robot needs to bring the camera to the suitable position, and adjust the robot to move the camera there;
Step 4: With the camera at the suitable position after step 3, acquire one blade image with the monocular camera, then filter and denoise it and perform edge detection and feature extraction as follows:
4a. Run the Canny edge detector on the image, extract the pixel coordinates of the two edges of the measured blade rotation axis, and record them as the pixel point sets {x_pix1, y_pix1} and {x_pix2, y_pix2};
4b. Extract the pixel coordinate of one blade feature shoulder point and record it as (x_pix3, y_pix3);
Step 5: In the blade image, compute the pixel distance Δx1 between the two edges of the blade rotation shaft and the pixel distance Δx2 between the shaft's central axis and the blade's single-side feature shoulder point, as follows:
5a. By the monocular camera imaging principle, when the blade rotation shaft is imaged, two generatrices form its edge contours. Denote the angles corresponding to the two generatrices θ1 and θ2; from the spatial position of the shaft and its radius R, θ1 and θ2 can be solved;
5b. Fit the pixel coordinate point sets {x_pix1, y_pix1} and {x_pix2, y_pix2} obtained in Step 4 by least squares to obtain the line equations l1 and l2 of the two shaft edges in the image pixel coordinate system, and from them compute the pixel distance Δx1 between the two shaft edges, in pixels;
5c. From the line equations l1 and l2 obtained in Step 5b and the shoulder-point pixel coordinates (x_pix3, y_pix3) obtained in Step 4, compute the pixel distance Δx2' between the feature shoulder point and its nearest shaft edge, in pixels;
5d. Define d1 and d2 as the distances from the projection of the shaft's central axis on the imaging plane to the projections of the two generatrices, and define i as the ratio of d1 to d2. By the imaging principle, i = d1/d2 = −cos θ2/cos θ1, so in the image the shaft's central axis lies at pixel distances Δx1/(i+1) and iΔx1/(i+1) from the two shaft edges. With Δx2' from Step 5c, the pixel distance Δx2 between the central axis and the single-side feature shoulder point is Δx2 = Δx2' + Δx1/(i+1), in pixels;
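The least-squares fit of Step 5b and the distances of Steps 5b-5d can be sketched with synthetic edge points. All coordinates and the ratio `i` below are hypothetical; near-vertical shaft edges are fitted as x in terms of y:

```python
import numpy as np

def fit_line_x_of_y(xs, ys):
    """Least-squares fit x = a*y + b (suits the near-vertical shaft edges)."""
    a, b = np.polyfit(ys, xs, 1)
    return a, b

# Hypothetical edge point sets for the two shaft contours, 30 px apart.
ys = np.arange(100.0, 200.0)
edge1_x = 0.02 * ys + 210.0          # left contour l1, slight tilt
edge2_x = 0.02 * ys + 240.0          # right contour l2, parallel

a1, b1 = fit_line_x_of_y(edge1_x, ys)
a2, b2 = fit_line_x_of_y(edge2_x, ys)

# For (near-)parallel lines, the horizontal separation is the intercept gap.
dx1 = abs(b2 - b1)
print(round(dx1, 2))   # 30.0

# Step 5d: with i = d1/d2 (i = 1 when the generatrices are symmetric), the
# central axis sits dx1/(i+1) from the nearer edge; a hypothetical
# dx2' = 12.5 px from shoulder point to that edge then gives dx2.
i = 1.0
dx2 = 12.5 + dx1 / (i + 1.0)
print(round(dx2, 2))   # 27.5
```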
Step 6: Measure the distance L between the blade's two feature shoulder points, then compute the distance L' between the shaft's central axis and a feature shoulder point: L' = L/2;
Step 7: Define the angle between the line joining the blade's two feature shoulder points and the camera's X_C O_C Y_C plane as the blade's rotation angle θ' in the camera coordinate system, and compute θ' as follows:
7a. Define m as the distance between the projections of the shaft's two contour generatrices on the imaging plane, corresponding to the pixel distance Δx1 between the two shaft edges in Step 5, and n as the distance between the projections of the shaft's central axis and the feature shoulder point on the imaging plane, corresponding to the pixel distance Δx2 in Step 5. By the imaging principle, m/n = Δx1/Δx2;
7b. Define γ as the angle between the line from the camera optical centre to the feature shoulder point and the camera optical axis. Define D1 = R/cos(π−θ1), D2 = R/cos θ2, and T = L'cos θ' − L'sin θ' tan γ. By the similar-triangles theorem, m/n = (D1+D2)/T;
7c. By the imaging principle, tan γ = (u0 − x_pix3)·dx/f, where f/dx and u0 are obtained in Step 1 and x_pix3 in Step 4;
7d. Combining Steps 7a and 7b gives Δx1/Δx2 = (D1+D2)/T = (R/(−cos θ1) + R/cos θ2)/(L'cos θ' − L'sin θ' tan γ). Here Δx1 and Δx2 come from Step 5, the shaft radius R is known, the distance L' between the shaft's central axis and a feature shoulder point comes from Step 6, and tan γ from Step 7c is substituted in; simple trigonometric manipulation then yields the single remaining unknown θ';
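Step 7d's equation admits a closed-form solution, since cos θ' − tan γ sin θ' = cos(θ' + γ)/cos γ. A sketch with hypothetical geometry, verified by a round trip (pick θ', synthesise a consistent Δx1/Δx2, recover θ'); all values are illustrative:

```python
import numpy as np

def solve_theta_prime(dx1, dx2, R, L_half, theta1, theta2, gamma):
    """Solve dx1/dx2 = (R/(-cos t1) + R/cos t2) / (L'(cos th' - sin th' tan g))
    for th' in closed form via cos th' - tan g sin th' = cos(th'+g)/cos g."""
    rhs_num = R / (-np.cos(theta1)) + R / np.cos(theta2)
    C = rhs_num * dx2 / dx1              # equals L'(cos th' - sin th' tan g)
    return np.arccos(C * np.cos(gamma) / L_half) - gamma

# Hypothetical geometry: shaft radius 50 mm, L' = 200 mm, generatrix angles
# theta1 (obtuse) and theta2 (acute), small off-axis angle gamma.
R, L_half = 0.05, 0.20
theta1, theta2 = 2.0, 0.8
gamma = 0.1
theta_true = 0.5
T = L_half * (np.cos(theta_true) - np.sin(theta_true) * np.tan(gamma))
ratio = (R / (-np.cos(theta1)) + R / np.cos(theta2)) / T   # = dx1/dx2
print(round(solve_theta_prime(ratio, 1.0, R, L_half, theta1, theta2, gamma), 6))  # 0.5
```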
Step 8: Define the angle between the line joining the blade's two feature shoulder points and the X_R O_R Z_R plane of the robot base coordinate system as the blade's rotation angle θ in the robot base coordinate system, and compute θ as follows:
8a. Take the coordinate transformation matrix Tcf between the monocular camera coordinate system and the robot flange coordinate system from Step 2, and read the transformation matrix Tfr between the current flange coordinate system and the robot base coordinate system from the robot teach pendant; the coordinate transformation matrix Tcr between the current camera coordinate system and the robot base coordinate system is then Tcr = TcfTfr;
8b. Tcr can be written as [r1 r2 r3 t], where r1, r2, r3 are the rotation vectors and t is the translation vector. From the current camera-robot positional relationship and the principle of coordinate system rotation transformation, r1 = [−cos α, sin α, 0]^T, where α is the angle between the X_R axis of the robot base coordinate system and the X_C axis of the camera coordinate system; α is solved with inverse trigonometric functions;
8c. From the relationships between the coordinate systems, θ = α + θ', with θ' from Step 7 and α from Step 8b. This finally yields the blade's rotation angle θ in the robot base coordinate system, i.e. the blade attitude information.
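Steps 8b and 8c can be sketched as follows. The pose values are hypothetical, and `arctan2` is used as a robust inverse of the pair (−cos α, sin α):

```python
import numpy as np

def blade_angle_in_base(T_cr, theta_prime):
    """Recover alpha from the first column r1 = [-cos a, sin a, 0]^T of the
    camera-to-base rotation, then compose theta = alpha + theta'."""
    r1 = T_cr[:3, 0]
    alpha = np.arctan2(r1[1], -r1[0])
    return alpha + theta_prime

# Hypothetical pose: camera X axis oriented so that alpha = 30 deg,
# with theta' = 10 deg from Step 7.
alpha_true = np.deg2rad(30.0)
T_cr = np.eye(4)
T_cr[:3, 0] = [-np.cos(alpha_true), np.sin(alpha_true), 0.0]
theta = blade_angle_in_base(T_cr, np.deg2rad(10.0))
print(round(np.rad2deg(theta), 3))   # 40.0
```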
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910742194.5A CN110490934B (en) | 2019-08-13 | 2019-08-13 | Monocular camera and robot-based mixer vertical type blade attitude detection method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110490934A true CN110490934A (en) | 2019-11-22 |
CN110490934B CN110490934B (en) | 2022-04-19 |
Family
ID=68550743
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910742194.5A Active CN110490934B (en) | 2019-08-13 | 2019-08-13 | Monocular camera and robot-based mixer vertical type blade attitude detection method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110490934B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111325802A (en) * | 2020-02-11 | 2020-06-23 | 中国空气动力研究与发展中心低速空气动力研究所 | Circular mark point identification matching method in helicopter wind tunnel test |
CN112419375A (en) * | 2020-11-18 | 2021-02-26 | 青岛海尔科技有限公司 | Feature point matching method and device, storage medium and electronic device |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5499306A (en) * | 1993-03-08 | 1996-03-12 | Nippondenso Co., Ltd. | Position-and-attitude recognition method and apparatus by use of image pickup means |
CN102135776A (en) * | 2011-01-25 | 2011-07-27 | Xie Zexiao | Industrial robot control system based on visual positioning and control method thereof |
CN103759716A (en) * | 2014-01-14 | 2014-04-30 | Tsinghua University | Dynamic target position and attitude measurement method based on monocular vision at the end of a robotic arm |
CN104296725A (en) * | 2014-10-08 | 2015-01-21 | Nankai University | Method for parameter calibration of a deformable robot manipulator arm |
CN105160059A (en) * | 2015-07-11 | 2015-12-16 | Xi'an Technological University | BP- and GA-based optimization method for selecting blade machining cutting parameters |
CN105716525A (en) * | 2016-03-30 | 2016-06-29 | Northwestern Polytechnical University | Robot end effector coordinate system calibration method based on a laser tracker |
CN107966112A (en) * | 2017-12-03 | 2018-04-27 | China Helicopter Research and Development Institute | Large-scale rotor motion parameter measurement method |
CN109415119A (en) * | 2016-04-08 | 2019-03-01 | Leonardo S.p.A. | Rotor for a hover-capable aircraft and method for detecting the attitude of a blade relative to the hub of such a rotor |
CN109794938A (en) * | 2019-02-01 | 2019-05-24 | Nanjing University of Aeronautics and Astronautics | Robot drilling error compensation device and method for curved-surface structures |
DE102018101162A1 (en) * | 2018-01-19 | 2019-07-25 | Hochschule Reutlingen | Measuring system and method for extrinsic calibration |
- 2019-08-13: Application CN201910742194.5A filed; granted as CN110490934B (status: Active)
Non-Patent Citations (4)
Title |
---|
YONG-LIN KUO 等: "Pose Determination of a Robot Manipulator Based on Monocular Vision", 《IEEE ACCESS》 * |
ZHANXI WANG 等: "Base Detection Research of Drilling Robot System by Using Visual Inspection", 《HINDAWI JOURNAL OF ROBOTICS》 * |
WANG Jun et al.: "Relative pose estimation method for a monocular mobile robot", Journal of Applied Optics * |
XIN Feng et al.: "Design of a robot system for cleaning vertical mixers", Aerospace Manufacturing Technology * |
Also Published As
Publication number | Publication date |
---|---|
CN110490934B (en) | 2022-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111735479B (en) | Multi-sensor combined calibration device and method | |
CN106643699B (en) | Space positioning device and positioning method in virtual reality system | |
CN110728715B (en) | Intelligent inspection robot camera angle self-adaptive adjustment method | |
JP6121063B1 (en) | Camera calibration method, device and system | |
CN109405835B (en) | Relative pose measurement method based on non-cooperative target straight line and circular monocular image | |
CN109297413B (en) | Visual measurement method for large-scale cylinder structure | |
CN105716527B (en) | Laser seam tracking transducer calibration method | |
CN105716542B (en) | Three-dimensional data stitching method based on flexible feature points | |
CN106871787B (en) | Large space line scanning imagery method for three-dimensional measurement | |
CN105157592B (en) | The deformed shape of the deformable wing of flexible trailing edge and the measuring method of speed based on binocular vision | |
CN106197265B (en) | Precision visual localization method for a space free-flight simulator | |
CN108844459A (en) | Calibration method and device for a blade digital template detection system | |
CN110009682A (en) | Object recognition and detection method based on monocular vision | |
JP2009042162A (en) | Calibration device and method therefor | |
CN111220126A (en) | Space object pose measurement method based on point features and monocular camera | |
CN110312111A (en) | The devices, systems, and methods calibrated automatically for image device | |
JP2008205811A (en) | Camera attitude calculation target device and camera attitude calculation method using it, and image display method | |
JP2006234703A (en) | Image processing device, three-dimensional measuring device, and program for image processing device | |
CN110490934A (en) | Mixing machine vertical blade attitude detecting method based on monocular camera and robot | |
KR20150125767A (en) | Method for generating calibration indicator of camera for vehicle | |
CN110030979B (en) | Spatial non-cooperative target relative pose measurement method based on sequence images | |
CN114413958A (en) | Monocular vision distance and speed measurement method of unmanned logistics vehicle | |
CN110361001A (en) | Space debris motion measurement system and calibration method | |
JP6473188B2 (en) | Method, apparatus and program for generating depth map | |
CN106840137B (en) | Automatic positioning and orienting method of four-point type heading machine |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||