CN101419055A - Space target position and pose measuring device and method based on vision - Google Patents


Publication number
CN101419055A
Authority
CN
China
Prior art keywords
image
space target
point
binocular
displacement platform
Prior art date
Legal status
Granted
Application number
CNA2008102253292A
Other languages
Chinese (zh)
Other versions
CN101419055B (en
Inventor
张弘
贾瑞明
穆滢
李璐
王磊
Current Assignee
Beihang University
Beijing University of Aeronautics and Astronautics
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN2008102253292A priority Critical patent/CN101419055B/en
Publication of CN101419055A publication Critical patent/CN101419055A/en
Application granted granted Critical
Publication of CN101419055B publication Critical patent/CN101419055B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a vision-based space target position and attitude (pose) measuring device and a measuring method. The device consists of an electronically controlled displacement platform, an image acquisition device and an integrated information processing platform. The method consists of a camera calibration module for binocular images based on a 2D planar target, a feature point extraction module for binocular images, a feature point matching module for binocular images, and an attitude parameter measurement module. The integrated information processing platform controls the motion of the electronically controlled displacement platform and of the CCD cameras mounted on it; the moving CCD cameras transmit binocular images to a high-speed computer on the information processing platform; through a series of information processing steps (mainly image processing), the high-speed computer works out the relative position and attitude parameters of a space target during the approach to that target, and then compares the target parameters obtained by information processing with the motion parameters of the electronically controlled displacement platform, so as to verify the correctness of the measurement results and analyze the measurement precision.

Description

Vision-based space target pose measuring apparatus and method
(1) Technical field:
The present invention relates to a vision-based space target position and attitude (pose) measuring apparatus and method, and in particular to apparatus and methods covering pattern recognition and intelligent control, tracking of a space target while a chaser spacecraft approaches and closes in on it, image processing, and calculation of the space target's motion parameters. Through a series of information processing steps (mainly image processing) carried out during the approach, measurement of the space target's motion parameters is achieved. The invention belongs to the technical field of information processing.
(2) Background technology:
Research on satellite-based early warning has mainly concentrated on the long-range phase: discovering and warning of small, dim space targets in a timely manner at long distance. The emphasis has been on detecting and tracking space targets during the mid-course phase of a spacecraft's flight (at which time the visible and infrared signatures of the target are still weak relative to the boost phase, and the signal-to-noise ratio is low), and on studying how the spacecraft's imaging conditions and imaging characteristics affect accurate tracking of the target. Considerable progress has also been made on space target recognition and tracking in the long-range and mid-course phases.
On close-range space target feature extraction, attitude recognition and measurement, practitioners have done a great deal of work and obtained definite research results.
Many important results have been achieved in long-range and mid-course recognition and tracking, in space target detection and identification, in modeling space targets against complex backgrounds, and in space target trajectory prediction and tracking. Rendezvous and docking of spacecraft is one of the key technologies, and in that process close-range feature extraction and tracking, attitude recognition and motion measurement of the space target are the key factors for mission success.
(3) Summary of the invention:
The object of the present invention is to provide a vision-based space target pose measuring apparatus and method which, while a chaser spacecraft approaches and closes in on a space target, uses the principles of binocular imaging together with image processing methods to accurately calculate the position and attitude parameters of the space target.
The technical scheme of the present invention is as follows:
The vision-based space target pose measuring apparatus of the present invention consists of three parts: an electronically controlled displacement platform, an image acquisition device, and an integrated information processing platform.
1) Image acquisition device:
The image acquisition device mainly comprises two CCD (Charge-Coupled Device) cameras.
The concrete parameters of the CCD cameras are as follows:
The KP-F200SCL camera parameters are:
● Resolution (effective pixels): 1628 × 1236
● CCD area: 7.16 × 5.44 mm (1/1.8 inch)
● Frame rate: 15 fps
● Interface: Mini Camera Link
● Size: 29 (W) × 29 (H) × 29 (D) mm
● Weight: about 53 g
● Other: 10-bit output, serial port control supported, working temperature 0 to +40 °C
2) Electronically controlled displacement platform:
The electronically controlled displacement platform is composed of a six-degree-of-freedom high-precision digital servo displacement platform and a displacement platform control box.
The six-degree-of-freedom high-precision digital servo displacement platform is the support platform of the image acquisition device; it can perform six-degree-of-freedom motion according to control instructions from the displacement platform control box, which makes it possible to verify the measurement results of the binocular measuring process.
The displacement platform control box sends control instructions to the six-degree-of-freedom high-precision digital servo displacement platform through a serial port.
The parameters of the electronically controlled displacement platform are as follows:
● X and Y axes: ±0.35 m
● Z axis: ±0.5 m
● Rotation range about the three axes: ±45°
● Baseline length adjustment: 0.2 to 0.5 m
● Optical axis angle: ±45°
● X, Y, Z translation accuracy: ±0.1 mm
● Rotation accuracy about the three axes: ±0.05°
● Baseline length adjustment accuracy: 0.01 mm
● Optical axis angle adjustment accuracy: ±0.05°
3) Integrated information processing platform:
The integrated information processing platform is composed of an information interface, a servo control processing unit and a high-speed computer.
The high-speed computer receives the left and right image information imported from the two CCD cameras and completes a series of information processing steps on the two binocular images, including space target feature extraction, feature matching, feature tracking, motion parameter calculation and three-dimensional reconstruction algorithms.
The servo control processing unit can send control commands to the electronically controlled displacement platform, making the six-degree-of-freedom high-precision digital servo displacement platform move according to the input instructions.
The relationship between the components of the vision-based space target pose measuring apparatus is detailed as follows:
The connection relationship of the vision-based space target pose measuring apparatus is: the two CCD cameras of the image acquisition device are installed at the two ends of the support of the six-degree-of-freedom high-precision digital servo displacement platform of the electronically controlled displacement platform; their relative position can be adjusted, and they move together with the high-precision digital servo displacement platform. Both the image acquisition device and the electronically controlled displacement platform are connected to the integrated information processing platform, so that the images obtained by the image acquisition device are transmitted to the information processing platform and the control instructions of the information processing platform are transmitted to the servo platform.
The workflow of the vision-based space target pose measuring apparatus of the present invention is as follows: first, the servo control processing unit of the integrated information processing platform sends instructions to the six-degree-of-freedom high-precision digital servo displacement platform of the electronically controlled displacement platform, making it approach and close in on the space target according to a preset program. During this motion, the image acquisition device installed on the servo displacement platform continuously acquires binocular sequence images of the space target and passes the collected images to the high-speed computer of the information processing platform. The high-speed computer, through a series of information processing steps (mainly image processing), calculates the concrete relative position and attitude parameters of the space target during the approach. Finally, the space target parameters obtained through image processing are compared with the motion parameters of the six-degree-of-freedom high-precision digital servo displacement platform of the electronically controlled displacement platform, in order to confirm the correctness of the measurement results and to support subsequent analysis of measurement precision.
The vision-based space target pose measuring method of the present invention, shown in Figure 5, comprises the following steps:
1) Select the camera calibration method for binocular images based on a 2D planar target; this constitutes the camera calibration module for binocular images based on a 2D planar target.
The calibration algorithm adopted here is the camera calibration method based on a 2D planar target proposed by Zhang Zhengyou et al.
In this method, the camera photographs a planar target in two or more different orientations; both the camera and the 2D planar target can move freely, and the motion parameters need not be known. During calibration the internal parameters of the camera are assumed to remain constant: no matter from which angle the camera shoots the target, the internal parameters are constant and only the external parameters change.
After this calibration, epipolar rectification of the binocular images can be carried out according to the calibration result. This is a critical step: it plays a very important role in the speed and accuracy of the subsequent image matching, namely the epipolar constraint that will be mentioned later.
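As background to step 1, Zhang's planar-target calibration can be summarized mathematically (a standard textbook summary, not quoted from the patent): placing the target on the plane Z = 0, each view yields a homography H = [h1 h2 h3] = λK[r1 r2 t], and the orthonormality of the rotation columns r1, r2 gives two linear constraints per view on the symmetric matrix B:

```latex
h_1^{\top} B\, h_2 = 0, \qquad
h_1^{\top} B\, h_1 = h_2^{\top} B\, h_2, \qquad
B = K^{-\top} K^{-1}
```

With the target photographed in three or more orientations, B — and hence the intrinsic matrix K — can be solved linearly, after which the external parameters r1, r2, t of each view follow from H and K. This is why the method needs "two or more orientations" and no knowledge of the motion parameters.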
2) Design the feature point extraction module for binocular images
Corners in an image are points of high curvature. Corner extraction methods fall roughly into two classes: one class first extracts image edges and takes the points of maximum curvature on an edge, or the intersections of edges, as corners; the other class directly uses the grey levels of the image to compute the curvature of the grey-level distribution and takes the points of maximum curvature as corners. The Harris corner extraction method adopted in this experiment detects corners directly from image grey-level information.
The Harris operator is expressed as follows:

    M = G(σ) ⊗ | g_x²      g_x·g_y |
               | g_x·g_y   g_y²    |

    I = Det(M) − k·Trace²(M),  k = 0.04

where g_x is the gradient in the x direction; g_y is the gradient in the y direction; G(σ) is the Gaussian template; ⊗ denotes convolution; I is the interest value of each point; Det is the determinant of a matrix; Trace is the trace of a matrix; and k is an empirically chosen constant.
For each point of the grey-level image, the first derivatives in the horizontal and vertical directions, together with their product, are computed. This gives each pixel the attribute values g_x, g_y and g_x·g_y, from which the interest value I of each point is finally calculated.
The Harris algorithm regards a feature point as the pixel corresponding to a local maximum of the interest value. Therefore, after the interest value of every point has been calculated, all points of locally maximal interest value in the original image are extracted. In practice, for each pixel the 8 pixels of its 8-neighbourhood are taken in turn and the maximum over the centre pixel and these 8 pixels is found; if the interest value of the centre pixel is the maximum, that point is a feature point.
When extracting feature points, every pixel whose I exceeds a certain threshold T can be regarded as a feature point. The selection of feature points can also be controlled by some additional limiting factors.
The validity of feature extraction is between 50% and 65%. The main causes of feature extraction failure are:
● Because there is parallax between the two binocular images, occlusion occurs in imaging: a space target feature imaged from one viewpoint may not be imaged from the other; or, because of differences in illumination, the same feature shows different grey-level characteristics in the two images of the pair, so the extracted corners differ.
● Because of the corner extraction algorithm itself, which computes the characteristic values in a neighbourhood of the image grey levels and selects the corner location accordingly, the extracted position deviates from the position a human observer would judge the corner to occupy.
The failed corners need to be rejected through feature point matching in the binocular image pair, so as to guarantee that valid feature point pairs are obtained.
3) Design the feature point matching module for binocular images
Feature point matching means that, given a known point in one image of a binocular stereo image pair, the corresponding match point is sought in the other image according to a certain matching criterion. Image matching methods mainly include region-based matching, feature-based matching, interpretation-based matching, and combinations of several methods. Corresponding match points are established on the basis of disparity; because of the influence of noise, illumination, occlusion, perspective distortion and other factors, the features presented in the two images may differ, and several similar matches may exist within a certain region, so certain constraints are needed as auxiliary criteria to obtain accurate matches. Commonly adopted constraints include the epipolar constraint, the uniqueness constraint, the disparity continuity constraint and the ordering constraint.
This experiment adopts a stereo matching method that combines the epipolar constraint, feature matching and region matching, in accordance with the characteristics of the binocular stereo vision system.
First, using the calibration result of the binocular stereo vision system, the epipolar relationship between the left and right views is established and the images are rectified according to this relationship, adjusting the epipolar lines to the horizontal. In the rectified images, corresponding match points lie essentially on the same horizontal line. The corner extraction method described above is then used to extract feature points from the epipolar-rectified images. Taking the left image as the original image and the right image as the reference image, for each feature point in the left image the search for potential match points is carried out in the right image within a window centred on the left feature point's position and extending about 100 pixels left and right and 2 pixels up and down; the feature points of the right image inside this window are classified as potential match points. The epipolar constraint is used here to restrict the height of the search window to a very small range, embodying the constraint that matching feature points lie essentially on the same epipolar (horizontal) line. Each potential match point is then matched against the original feature point, using the grey values of its eight-neighbourhood, according to the normalized grey-level difference square matching criterion. The normalized grey-level difference square criterion is:
    S(x, y) = Σ_{y′=0}^{n−1} Σ_{x′=0}^{m−1} [T(x′, y′) − I(x + x′, y + y′)]²
              / sqrt( Σ_{y′=0}^{n−1} Σ_{x′=0}^{m−1} T(x′, y′)² · Σ_{y′=0}^{n−1} Σ_{x′=0}^{m−1} I(x + x′, y + y′)² )

where T(x′, y′) is the pixel grey level in the original image, I(x + x′, y + y′) is the pixel grey level in the reference image, and m, n are the neighbourhood dimensions. The feature point with the minimum value of S(x, y) is taken as the best match point.
By the same method, the right image is then taken as the original image and the left image as the reference image, and the match point judgement is repeated; that is, a symmetric calculation is carried out between the left and right images. Match point pairs that satisfy the matching conditions in only one direction, or in neither direction, are regarded as false matches; only pairs that are matched in both directions of the symmetric calculation are correct matching feature point pairs. Once correct match point pairs have been determined, the disparity can be calculated from the obtained point pairs, and the three-dimensional coordinates of the feature points in space can be further computed.
In the grey-level-based candidate matches established above, false matches are bound to exist owing to the influence of image brightness, similar features in the surroundings, and so on.
Further constraint conditions can be added to the matching result to eliminate more false matches. The harsher the conditions, the fewer the false matches, but the fewer the correct match pairs obtained as well, which is unfavourable for modeling the object; the screening can therefore be adjusted according to the actual situation. After the matched feature point pairs have been obtained, the disparity can be computed from the left and right images, and the three-dimensional coordinates of the feature points can be further calculated from the parameters of the binocular stereo vision system, serving as the basis for space target modeling and measurement in space.
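The matching procedure of step 3 — an epipolar-constrained search window, the normalized grey-level difference square score, and the symmetric left-right consistency check — can be sketched as follows (a minimal NumPy illustration under our own function names and window defaults of ±100 columns and ±2 rows, as in the text; not the authors' code):

```python
import numpy as np

def norm_ssd(T, I):
    """Normalized grey-level difference square between two patches."""
    num = np.sum((T - I) ** 2)
    den = np.sqrt(np.sum(T ** 2) * np.sum(I ** 2))
    return num / den if den > 0 else np.inf

def best_match(src, dst, pt, dst_pts, dx=100, dy=2, patch=1):
    """For feature point pt in src, pick the dst feature point with minimum
    S inside an epipolar-constrained window of +/-dx columns, +/-dy rows."""
    x0, y0 = pt
    T = src[y0 - patch:y0 + patch + 1, x0 - patch:x0 + patch + 1].astype(float)
    best, best_s = None, np.inf
    for (x, y) in dst_pts:
        if abs(x - x0) > dx or abs(y - y0) > dy:
            continue  # outside the epipolar search band
        P = dst[y - patch:y + patch + 1, x - patch:x + patch + 1].astype(float)
        if P.shape != T.shape:
            continue  # patch falls off the image border
        s = norm_ssd(T, P)
        if s < best_s:
            best, best_s = (x, y), s
    return best

def symmetric_matches(left, right, left_pts, right_pts):
    """Keep only pairs matched in both directions (left->right, right->left)."""
    pairs = []
    for p in left_pts:
        q = best_match(left, right, p, right_pts)
        if q is not None and best_match(right, left, q, left_pts) == p:
            pairs.append((p, q))
    return pairs
```

A point matched in only one direction is discarded as a false match, which is the symmetry rule the text describes.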
4) Design the attitude parameter measurement module
Binocular measurement of the relative pose of a space target obtains the range information of feature points on the target surface by the method of binocular vision, constructs a geometric model of the space target from the spatial position relationship of the feature points, and, by continuously tracking and measuring the feature points, uses the relationships of the geometric model to calculate the relative six-degree-of-freedom pose change of the space target. In this problem only the change of the pose is measured and calculated; its absolute value is not calculated.
To construct the spatial geometric model of the space target from feature points, at least three non-collinear feature points are needed.
First, image processing is used to extract three feature points M1 (x1, y1, z1), M2 (x2, y2, z2), M3 (x3, y3, z3) on the surface of the space target, and the binocular stereo vision principle is used to obtain the space coordinates of the three feature points in the camera coordinate system.
Three non-collinear points in space determine a plane. When the space target is assumed to be a rigid body, this plane constitutes a simple model of the space target, and the relative pose change of the target is reflected by calculating the relative pose information of this plane. The plane determined by the three feature points M1, M2, M3 is recorded as the feature plane.
When the space target is assumed to be a rigid body, no matter what changes occur in the position and orientation of the target, and no matter in which coordinate system the target is observed, the relative position relationship between the feature points of the object remains unchanged. If a coordinate system is set up on the space target itself and observed in the camera coordinate system, then the space target coordinate system after a pose change can be obtained by rotating and translating the original space target coordinate system.
The problem of measuring the relative pose of the space target is thus converted into solving for the transformation parameters of the space target coordinate system in the camera coordinate system. The space target coordinate system is established from the feature plane determined by the three feature points: the normal direction of the feature plane is taken as the positive Z′ axis; the feature plane is taken as the X′O′Y′ plane; and the Z′ axis together with the X′ and Y′ axes constitutes a right-handed coordinate system. The coordinate origin is taken as the centre of the triangle formed by the three feature points, recorded as O′ = ( (1/3)Σᵢ₌₁³ xi, (1/3)Σᵢ₌₁³ yi, (1/3)Σᵢ₌₁³ zi ). It is assumed that the O′Z′ axis is the longitudinal axis about which attitude changes occur; that is, the plane normal vector defined as the positive Z′ direction is taken as the longitudinal axis of the space target's attitude change.
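The construction of the feature plane and target frame can be sketched as follows (a minimal NumPy illustration of the centroid and plane-normal computation just described; the function name is our own):

```python
import numpy as np

def target_frame(M1, M2, M3):
    """Centroid O' and unit plane normal (positive Z' direction) of the
    feature plane spanned by three non-collinear feature points."""
    P = np.array([M1, M2, M3], dtype=float)
    O = P.mean(axis=0)                      # O' = (1/3) * sum of the points
    n = np.cross(P[1] - P[0], P[2] - P[0])  # normal of the feature plane
    return O, n / np.linalg.norm(n)         # normal defines the Z' axis
```

For three points lying in the XY plane the normal comes out along Z, as expected of the Z′ axis definition.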
The attitude change of the feature plane with respect to the camera coordinate system is described by three angles, the pitch angle θ, the yaw angle ψ and the spin angle φ; the relative attitude information is the change in these three angles before and after the attitude change. The camera coordinate system OXYZ is translated so that its origin coincides with the space target coordinate origin, which does not affect the attitude information of the feature plane in the camera coordinate system. The three angles and the relative attitude information are defined as follows:
● Pitch angle θ: the angle of rotation of the space target about the OX axis.
● Relative pitch angle θ′: the difference of the rotation angle about OX before and after the attitude change. According to the assumptions on the space target coordinate system, the relative pitch angle is the angle of the projection of the plane normal vector onto the YOZ plane of the camera coordinate system; when the space target rotates clockwise about the OX axis, the relative pitch angle takes a negative value, otherwise it is positive. As shown in Figure 6.
● Yaw angle ψ: the angle of rotation of the space target about the OY axis.
● Relative yaw angle ψ′: the difference of the rotation angle about OY before and after the attitude change. According to the assumptions on the space target coordinate system, the relative yaw angle is the angle of the projection of the plane normal vector onto the XOZ plane of the camera coordinate system; when the space target rotates clockwise about the OY axis, the relative yaw angle takes a negative value, otherwise it is positive. As shown in Figure 7.
● Spin angle φ: the angle of rotation of the space target about the OZ axis.
● Relative spin angle φ′: the difference of the rotation angle about OZ before and after the attitude change. According to the assumptions on the space target coordinate system, the plane normal vector can only characterize the attitude of the plane about the OX and OY axes and cannot characterize the attitude change of the plane about its own normal; therefore a vector lying in the feature plane is taken to characterize the spin angle, and the angle between the corresponding in-plane vectors before and after the rotation of the plane is the relative spin angle. When the space target rotates clockwise about the OZ axis, the relative spin angle takes a negative value, otherwise it is positive. As shown in Figure 8.
The relative displacement parameters of the feature plane are described by the displacement components along the X, Y and Z axes of the camera coordinate system. The components along the OX, OY and OZ axes of the vector from the previously defined centre O1′ to the centre O2′, each formed from the three feature points, are the X, Y and Z components of the relative displacement, as shown in Figure 9.
According to the measurement result of binocular vision, the coordinates of the initial three feature points in the camera coordinate system are known: M11 (x11, y11, z11), M12 (x12, y12, z12), M13 (x13, y13, z13); so are the coordinates of the corresponding three feature points after the attitude change: M21 (x21, y21, z21), M22 (x22, y22, z22), M23 (x23, y23, z23). The coordinates after the attitude change are obtained from the coordinates before the change through rotation and translation. The measurement result of the relative change of the space target's attitude is expressed by six parameters: the relative displacements in the X, Y and Z directions, the pitch angle θ of rotation about the X axis, the yaw angle ψ of rotation about the Y axis, and the spin angle φ of rotation about the Z axis.
(1) Calculate the centres of the triangles determined by the feature points:

    O1′ = ( (1/3)Σᵢ₌₁³ x1i, (1/3)Σᵢ₌₁³ y1i, (1/3)Σᵢ₌₁³ z1i )
    O2′ = ( (1/3)Σᵢ₌₁³ x2i, (1/3)Σᵢ₌₁³ y2i, (1/3)Σᵢ₌₁³ z2i )

The three components of the relative displacement are calculated from the displacement of the two centres:

    x_trans = xo2 − xo1
    y_trans = yo2 − yo1
    z_trans = zo2 − zo1
(2) Compute the projections of the plane vectors associated with the centre points O1′, O2′ onto each coordinate plane, and the corresponding angles
a) Project onto the YOZ plane
Record the projections as P1 and P2, and compute the angles angle_original and angle_new between the projected vectors and the Y axis; the relative pitch angle is then:
    θ = angle_new − angle_original
b) Project onto the XOZ plane
Record the projections as P1 and P2, and compute the angles angle_original and angle_new between the projected vectors and the Z axis; the relative yaw angle is then:
    ψ = angle_new − angle_original
c) Project onto the XOY plane
Record the projections as P1 and P2, and compute the angles angle_original and angle_new between the projected vectors and the X axis; the relative spin angle is then:
    φ = angle_new − angle_original
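The six-parameter measurement above can be sketched as follows (a minimal NumPy illustration under our own conventions: the normal vectors before and after motion are projected for pitch and yaw, and an in-plane vector for spin, with angle differences taken via arctan2; this is an interpretation of the projection steps, not the authors' code):

```python
import numpy as np

def relative_pose(O1, O2, n1, n2, v1, v2):
    """Six relative pose parameters from feature-plane centroids (O1, O2),
    plane normals (n1, n2) and an in-plane vector (v1, v2), before and
    after the motion. Angles are returned in degrees."""
    # translation: displacement of the triangle centres
    tx, ty, tz = np.subtract(O2, O1)
    # relative pitch: change of the normal's angle in the YOZ plane (about OX)
    pitch = np.degrees(np.arctan2(n2[2], n2[1]) - np.arctan2(n1[2], n1[1]))
    # relative yaw: change of the normal's angle in the XOZ plane (about OY)
    yaw = np.degrees(np.arctan2(n2[0], n2[2]) - np.arctan2(n1[0], n1[2]))
    # relative spin: change of the in-plane vector's angle in XOY (about OZ)
    spin = np.degrees(np.arctan2(v2[1], v2[0]) - np.arctan2(v1[1], v1[0]))
    return (tx, ty, tz), (pitch, yaw, spin)
```

Rotating the plane normal about OX by a known angle, or an in-plane vector about OZ, is recovered directly as the relative pitch or spin, which matches the angle-difference definitions in steps a) to c).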
The vision-based space target pose measuring method of the present invention is mainly composed of four modules: the camera calibration module for binocular images based on a 2D planar target, the feature point extraction module for binocular images, the feature point matching module for binocular images, and the attitude parameter measurement module.
The function completed by the camera calibration module for binocular images based on the 2D planar target is mainly the calculation of the camera distortion parameters and the epipolar rectification of the images; the function completed by the feature point extraction module is mainly the extraction of Harris feature corners from the left and right binocular images; the function completed by the feature point matching module is mainly the matching of the Harris feature corners of the left and right binocular images into left-right pairs; the function completed by the attitude parameter measurement module is mainly the measurement of the position and attitude parameters of the space target according to the matched feature points.
Through these four modules, the acquisition in software of the space target's position and attitude parameters from the initial left and right binocular images is accomplished.
The advantage of the vision-based space target pose measuring apparatus and method of the present invention is that, by means of vision and information processing (mainly image processing), the measurement of the space target's pose is completed with relatively high precision.
(4) Description of drawings:
Figure 1 is the block diagram of the vision-based space target pose measuring apparatus;
Figure 2 is the block diagram of the image acquisition device;
Figure 3 is the block diagram of the electronically controlled displacement platform;
Figure 4 is the block diagram of the integrated information processing platform;
Figure 5 is the flow chart of the vision-based space target pose measuring method;
Figure 6 shows the pitch angle and relative pitch angle of the attitude parameter measurement module;
Figure 7 shows the yaw angle and relative yaw angle of the attitude parameter measurement module;
Figure 8 shows the spin angle and relative spin angle of the attitude parameter measurement module;
Figure 9 shows the relative displacement and its X, Y, Z components in the attitude parameter measurement module.
The reference numbers in the figures are as follows:
1. image acquisition device;
2. electronically controlled displacement platform;
3. integrated information processing platform;
4. CCD camera;
5. six-degree-of-freedom high-precision digital servo displacement platform;
6. displacement platform control box;
7. information interface;
8. servo control processing unit;
9. high-speed computer.
(5) embodiment:
A kind of extraterrestrial target pose measuring apparatus based on vision of the present invention is made of following three parts: automatically controlled displacement platform, image capture device, integrated information processing platform.
1) Image capture device:
As shown in Figure 2, the image capture device 1 mainly comprises two CCD (Charge-Coupled Device image sensor) cameras 4.
The concrete parameters of the CCD cameras are as follows. The KP-F200SCL camera 4 parameters are:
● resolution (effective pixels): 1628 × 1236
● CCD area: 7.16 × 5.44 mm (1/1.8 inch)
● frame rate: 15 fps
● interface: Mini Camera Link
● dimensions: 29 (W) × 29 (H) × 29 (D)
● weight: approximately 53 g
● other: 10-bit output, serial-port control, operating temperature 0 to +40 °C
2) Electronically controlled displacement platform:
As shown in Figure 3, the electronically controlled displacement platform 2 consists of a six-degree-of-freedom high-precision digital servo displacement platform 5 and a displacement platform control box 6.
The six-degree-of-freedom high-precision digital servo displacement platform 5 is the supporting platform of the image capture device; it can move with six degrees of freedom according to the control instructions of the displacement platform control box 6, enabling verification of the measurement results during the binocular measuring process.
The displacement platform control box 6 sends control instructions to the six-degree-of-freedom high-precision digital servo displacement platform 5 over a serial port.
The parameters of the electronically controlled displacement platform are as follows:
● X-, Y-axis travel: ±0.35 m
● Z-axis travel: ±0.5 m
● three rotation ranges: ±45°
● baseline length adjustment: 0.2 to 0.5 m
● optical-axis angle: ±45°
● X-, Y-, Z-axis translation accuracy: ±0.1 mm
● three rotation accuracies: ±0.05°
● baseline length adjustment accuracy: 0.01 mm
● optical-axis angle adjustment accuracy: ±0.05°
3) Integrated information processing platform:
As shown in Figure 4, the integrated information processing platform 3 consists of an information interface 7, a servo control processing unit 8, and a high-speed computer 9.
The high-speed computer 9 receives the left and right image information from the two CCD cameras 4 and performs a series of information processing steps on the two binocular images, including space target feature extraction, feature matching, feature tracking, motion parameter calculation, and 3D reconstruction.
The servo control processing unit 8 sends control commands to the electronically controlled displacement platform 2, so that the six-degree-of-freedom high-precision digital servo displacement platform 5 moves according to the input instructions.
The relationships among the components of the vision-based space target pose measuring apparatus are detailed as follows:
As shown in Figure 1, the connection relationship of the vision-based space target pose measuring apparatus is: the image capture device 1 (the two CCD cameras 4) is installed at the two ends of the support of the six-degree-of-freedom high-precision digital servo displacement platform 5 of the electronically controlled displacement platform 2; its relative position can be adjusted, and it moves together with the servo displacement platform 5. Both the image capture device 1 and the electronically controlled displacement platform 2 are connected to the integrated information processing platform 3, which realizes the transmission of the images acquired by the image capture device 1 to the information processing platform 3 and the transmission of control instructions from the information processing platform 3 to the servo platform 2.
The workflow of the vision-based space target pose measuring apparatus of the present invention is: first, the servo control processing unit 8 of the integrated information processing platform 3 sends instructions to the six-degree-of-freedom high-precision digital servo displacement platform 5 of the electronically controlled displacement platform 2, making it approach the space target according to a preset program. Meanwhile, during the motion, the image capture device 1 installed on the servo displacement platform 5 continuously acquires binocular sequence images of the space target and passes the collected images to the high-speed computer 9 of the information processing platform 3. The high-speed computer 9, by a series of information processing steps (mainly image processing), calculates the concrete relative position and attitude parameters of the space target during the approach. Finally, the parameters of the space target obtained by the image processing computation are compared with the motion parameters of the servo displacement platform 5 of the electronically controlled displacement platform 2, in order to verify the correctness of the measurement results and support subsequent measurement accuracy analysis.
As shown in Figure 5, the vision-based method for measuring the pose parameters of a space target mainly comprises the following modules: 1) a camera calibration module for the binocular images based on a 2D planar target; 2) a feature point extraction module for the binocular images; 3) a feature point matching module for the binocular images; 4) an attitude parameter measurement module. Each module is described in detail below:
The steps of the vision-based space target pose measuring method of the present invention are:
1) Select the camera calibration method for the binocular images based on a 2D planar target; this method constitutes the camera calibration module for the binocular images.
The calibration algorithm adopted here is the camera calibration method based on a 2D planar target proposed by Zhang Zhengyou et al.
In this method, the camera photographs a planar target from two or more different orientations; both the camera and the 2D planar target can move freely, and the motion parameters need not be known. During calibration the intrinsic parameters of the camera are assumed to remain constant: no matter from which angle the camera photographs the target, the intrinsic parameters are fixed and only the extrinsic parameters change.
After this calibration, epipolar rectification of the binocular images can be performed according to the calibration result. This is a critical step: through the epipolar constraint introduced later, it plays a very important role in improving both the speed and the accuracy of subsequent image matching.
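A core linear step of Zhang's planar calibration is estimating the homography between the planar target and each image by direct linear transformation (DLT). The sketch below is illustrative only (the function names and synthetic test data are our own, not part of the invention); a full calibration would go on to recover the intrinsic parameters from several such homographies.

```python
import numpy as np

def estimate_homography(target_pts, image_pts):
    """Estimate the 3x3 homography H mapping 2D planar-target points to
    image points (the core linear step of Zhang's calibration) by DLT."""
    A = []
    for (X, Y), (u, v) in zip(target_pts, image_pts):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    # The homography is the right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # fix the arbitrary scale

def project(H, pts):
    """Apply a homography to an array of 2D points."""
    p = np.c_[pts, np.ones(len(pts))] @ H.T
    return p[:, :2] / p[:, 2:3]
```

At least four non-degenerate correspondences are needed; in practice many target corners are used and the SVD gives the least-squares solution.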
2) Design of the feature point extraction module for the binocular images
Corners are points of high curvature in an image. Corner extraction methods fall roughly into two classes: one extracts the image edges and takes the points of maximum curvature on the edges, or the intersections of edges, as corners; the other directly uses the gray levels of the image to compute the curvature of the intensity distribution and takes the points of maximum curvature as corners. The Harris corner extraction method adopted in this experiment detects corners directly from the gray-level information of the image.
The Harris operator is:

M = G(s) ⊗ [ g_x·g_x   g_x·g_y ]
            [ g_x·g_y   g_y·g_y ]

I = Det(M) − k·Trace²(M),  k = 0.04

where g_x is the gradient in the x direction; g_y is the gradient in the y direction; G(s) is the Gaussian template; ⊗ denotes convolution; I is the interest value of each point; Det is the matrix determinant; Trace is the matrix trace; and k is an empirical constant.
For each point of the gray-level image, compute the first derivatives in the horizontal and vertical directions, together with their product; this gives the attribute values g_x, g_y, and g_x·g_y for each pixel, from which the interest value I of each point is finally computed.
The Harris algorithm regards a feature point as the pixel whose interest value is maximal within a local range. Therefore, after the interest value of each point has been computed, the points of locally maximal interest value are extracted from the original image. In practice, for each pixel the 8 pixels of its 8-neighborhood are taken in turn and the maximum is found among the center pixel and these 8 pixels; if the interest value of the center pixel is the maximum, that point is a feature point.
During extraction, every pixel whose I exceeds a threshold T can be considered a feature point, and the selection of feature points can be further controlled by additional limiting factors.
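The procedure above can be sketched in a few lines of numpy (a minimal illustration, not the exact implementation of the invention: a small binomial kernel stands in for the Gaussian template G(s), and the threshold is chosen relative to the maximum response):

```python
import numpy as np

def harris_corners(img, k=0.04, rel_thresh=0.01):
    """Harris corners: gradient products, smoothed structure tensor,
    interest value I = det(M) - k*trace(M)^2, then a threshold plus
    8-neighbourhood non-maximum suppression."""
    gy, gx = np.gradient(img.astype(float))
    # Smooth the gradient products with a separable binomial kernel
    # (a small stand-in for the Gaussian template G(s)).
    kern = np.array([1.0, 2.0, 1.0]) / 4.0
    def smooth(a):
        a = np.apply_along_axis(np.convolve, 1, a, kern, 'same')
        return np.apply_along_axis(np.convolve, 0, a, kern, 'same')
    Ixx, Iyy, Ixy = smooth(gx * gx), smooth(gy * gy), smooth(gx * gy)
    R = (Ixx * Iyy - Ixy ** 2) - k * (Ixx + Iyy) ** 2
    T = rel_thresh * R.max()
    corners = []
    for r in range(1, R.shape[0] - 1):
        for c in range(1, R.shape[1] - 1):
            win = R[r - 1:r + 2, c - 1:c + 2]
            # keep the point only if it beats threshold T and is the
            # maximum of its 8-neighbourhood
            if R[r, c] > T and R[r, c] == win.max():
                corners.append((r, c, R[r, c]))
    return corners
```

On a synthetic L-shaped step image, the strongest response lands at the step corner, while pure edges yield negative interest values and are suppressed.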
The validity of feature extraction is roughly between 50% and 65%; the main causes of extraction failure are:
● Because the binocular image pair has parallax, occlusion occurs in imaging: a feature of the space target imaged from one viewpoint may have no image from the other viewpoint; or, because of differences in illumination, the same feature shows different gray-level characteristics across the image pair, so the extracted corner results differ.
● Because of limitations inherent in the corner extraction algorithm itself, which selects the corner location from eigenvalues computed over a gray-level neighborhood, the computed position deviates from the position a human observer would directly identify.
Failed corners must be rejected by matching the feature points across the binocular image pair, so as to guarantee valid feature point pairs.
3) Design of the feature point matching module for the binocular images
Feature point matching means: given a known point in one image of a binocular stereo pair, find the corresponding match point in the other image according to some matching criterion. Image matching methods mainly include area-based matching, feature-based matching, interpretation-based matching, and combinations of these. Corresponding match points are established on the basis of parallax; because of the influence of noise, illumination, occlusion, perspective distortion, and other factors, the features presented in the image pair may differ, and several similar candidate matches may exist in some region, so constraints are needed as auxiliary criteria to obtain accurate matches. Commonly used constraints include the epipolar constraint, the uniqueness constraint, the disparity continuity constraint, and the ordering constraint.
This experiment adopts a stereo matching method combining the epipolar constraint, feature matching, and area matching, in line with the characteristics of the binocular stereo vision system.
First, using the calibration result of the binocular stereo vision system, the epipolar constraint between the left and right views is established and the images are rectified according to this relation so that the epipolar lines become horizontal; in the rectified images, corresponding match points lie essentially on the same horizontal line. The corner extraction method described above is then applied to the rectified images. Taking the left image as the original and the right image as the reference, for each feature point in the left image, potential match points are searched for in the right image within a window centered on the left feature point position, extending about 100 pixels horizontally and 2 pixels vertically; the feature points of the right image within this range are classified as potential match points. The epipolar constraint is used here to limit the height of the search range to a very small band, embodying the constraint that matching feature points lie essentially on the same horizontal line. Each potential match point is matched against the original feature point, over the gray values of its eight-neighborhood, according to the normalized squared gray-level difference criterion. The normalized squared gray-level difference matching measure is:
S(x, y) = Σ_{y′=0}^{n−1} Σ_{x′=0}^{m−1} [T(x′, y′) − I(x + x′, y + y′)]² / sqrt( Σ_{y′=0}^{n−1} Σ_{x′=0}^{m−1} T(x′, y′)² · Σ_{y′=0}^{n−1} Σ_{x′=0}^{m−1} I(x + x′, y + y′)² )

where T(x′, y′) is the pixel gray level in the original image, I(x + x′, y + y′) is the pixel gray level in the reference image, and m, n are the neighborhood dimensions. The feature point with the minimum value of S(x, y) is taken as the best match.
By the same method, the right image is taken as the original and the left image as the reference to judge the match pairs, i.e., a symmetric calculation is performed between the left and right images. Match pairs that satisfy the matching conditions in only one direction, or in neither direction, are regarded as false matches; only pairs that remain match points in the symmetric calculation are correct matching feature point pairs. Once the correct match pairs are determined, the parallax can be computed from them, and the three-dimensional coordinates of the feature points in space can then be computed.
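A minimal sketch of this NSSD search with the epipolar band and the symmetric (left-right) consistency check might look as follows; the function names are ours, and the ±100/±2 pixel window follows the description above:

```python
import numpy as np

def nssd(T, I):
    """Normalized squared gray-level difference between two patches."""
    return np.sum((T - I) ** 2) / np.sqrt(np.sum(T ** 2) * np.sum(I ** 2))

def best_match(src, dst, r, c, dx=100, dy=2):
    """Search dst for the point best matching feature (r, c) of src,
    within +/-dx columns and +/-dy rows (the epipolar band), scoring
    each candidate by NSSD over its 3x3 (eight-)neighbourhood."""
    T = src[r - 1:r + 2, c - 1:c + 2]
    best, best_s = None, np.inf
    for rr in range(max(1, r - dy), min(dst.shape[0] - 1, r + dy + 1)):
        for cc in range(max(1, c - dx), min(dst.shape[1] - 1, c + dx + 1)):
            s = nssd(T, dst[rr - 1:rr + 2, cc - 1:cc + 2])
            if s < best_s:
                best, best_s = (rr, cc), s
    return best

def symmetric_match(left, right, r, c):
    """Keep a match only if left->right and right->left agree."""
    m = best_match(left, right, r, c)
    if m is None:
        return None
    back = best_match(right, left, *m)
    return m if back == (r, c) else None
```

In the real system the candidate set would be restricted to extracted corners rather than every pixel in the band; scanning the whole band keeps the sketch self-contained.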
Among the gray-level-based candidate matches that have been established, false matches inevitably exist because of the influence of image brightness, similar features in the surrounding environment, and so on.
Additional constraint conditions can be applied to the matching result to further eliminate false matches. The harsher the conditions, the fewer the false matches, but the fewer the correct match pairs obtained as well, which is unfavorable for modeling the object; the constraints can therefore be chosen according to the actual situation. After the matched feature point pairs have been obtained, the parallax can be computed from the left and right images, and the three-dimensional coordinates of the feature points can then be computed from the parameters of the binocular stereo vision system, for use in space target modeling and measurement in space.
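Once a correct pair and its parallax d are known, the 3D coordinates follow from the standard rectified-stereo relations Z = f·B/d, X = (u − c_u)·Z/f, Y = (v − c_v)·Z/f. This is a textbook sketch under the rectified pinhole model; the parameter names are ours, and the actual system would use the calibrated values:

```python
def feature_point_3d(u, v, disparity, f, baseline, cu, cv):
    """Recover the camera-frame 3D coordinates of a matched feature
    point from its disparity in a rectified binocular pair.
    (u, v): pixel position in the left image; (cu, cv): principal point;
    f: focal length in pixels; baseline: distance between the cameras."""
    Z = f * baseline / disparity        # depth from parallax
    X = (u - cu) * Z / f                # back-project through the pinhole model
    Y = (v - cv) * Z / f
    return X, Y, Z
```

For example, with f = 800 px, baseline 0.2 m, and disparity 8 px, the depth is 20 m, which matches the apparatus's stated 20 m working range.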
4) Design of the attitude parameter measurement module
The binocular measurement of the relative pose of the space target obtains the range information of feature points on the target surface by the binocular vision method, builds a geometric model of the space target from the spatial position relations of the feature points, and, by continuously tracking and measuring the feature points, uses the relations of the geometric model to calculate the relative change of the six-degree-of-freedom pose of the space target. In this problem, only the change of the pose is measured and calculated, not its absolute value.
To build a geometric model of the space target from feature points, at least three non-collinear feature points are needed.
First, three feature points M1 (x1, y1, z1), M2 (x2, y2, z2), M3 (x3, y3, z3) on the surface of the space target are extracted by image processing methods, and the spatial coordinates of the three feature points in the camera coordinate system are obtained by the binocular stereo vision principle.
Three non-collinear points in space determine a plane. Assuming the space target is a rigid body, this plane constitutes a simple model of the space target, and the relative pose change of the target is reflected by calculating the relative pose information of this plane. The plane determined by the three feature points M1, M2, M3 is called the characteristic plane.
Assuming the space target is a rigid body, no matter what changes its position and direction undergo, and no matter under which coordinate system the target is observed, the relative position relations among the feature points of the object remain unchanged. If a coordinate system is attached to the space target itself and the target is observed in the camera coordinate system, then the target coordinate system after a pose change can be obtained by rotating and translating the original target coordinate system.
The measurement of the relative pose of the space target is thus converted into solving for the transformation parameters of the target coordinate system within the camera coordinate system. The target coordinate system is established from the characteristic plane determined by the three feature points: the direction of the normal vector of the characteristic plane is taken as the positive Z′-axis direction, the characteristic plane is the X′O′Y′ plane, and the Z′-axis together with the X′- and Y′-axes forms a right-handed coordinate system. The origin is taken at the centroid of the triangle formed by the three feature points:

O′ = ( (1/3) Σ_{i=1}^{3} xi, (1/3) Σ_{i=1}^{3} yi, (1/3) Σ_{i=1}^{3} zi )

The O′Z′ axis is assumed to be the longitudinal axis about which the attitude change takes place; that is, the plane normal vector defined as the positive Z′ direction is assumed to be the longitudinal axis of the attitude change of the space target.
The attitude change of the characteristic plane with respect to the camera coordinate system is described by three angles: the pitch angle θ, the yaw angle ψ, and the spin angle φ; the relative attitude information is the change of these three angles before and after the attitude change. The camera coordinate system OXYZ is translated to the origin of the target coordinate system, which does not affect the attitude information of the characteristic plane in the camera coordinate system. The three angles and the relative attitude information are defined as follows:
● Pitch angle θ: the angle of rotation of the space target about the OX axis.
● Relative pitch angle θ′: the difference of the rotation angle about OX before and after the attitude change. By the assumption on the target coordinate system, the relative pitch angle is the change of the angle of the projection of the plane normal vector onto the YOZ plane of the camera coordinate system; when the space target rotates clockwise about the OX axis, the relative pitch angle takes a negative value, otherwise positive. As shown in Figure 6.
● Yaw angle ψ: the angle of rotation of the space target about the OY axis.
● Relative yaw angle ψ′: the difference of the rotation angle about OY before and after the attitude change. By the assumption on the target coordinate system, the relative yaw angle is the change of the angle of the projection of the plane normal vector onto the XOZ plane of the camera coordinate system; when the space target rotates clockwise about the OY axis, the relative yaw angle takes a negative value, otherwise positive. As shown in Figure 7.
● Spin angle φ: the angle of rotation of the space target about the OZ axis.
● Relative spin angle φ′: the difference of the rotation angle about OZ before and after the attitude change. By the assumption on the target coordinate system, the plane normal vector can only characterize the attitude of the plane about the OX and OY axes; it cannot characterize the attitude change of the plane about its own normal. A vector lying in the plane is therefore taken to characterize the spin, and the angle between the corresponding in-plane vectors before and after the rotation of the plane is the relative spin angle. When the rotation direction is along the positive OZ axis, the relative spin angle takes a negative value, otherwise positive. As shown in Figure 8.
The relative displacement of the characteristic plane is described by its X, Y, Z displacement components in the camera coordinate system. The components on the OX, OY, and OZ axes of the vector from O1′ to O2′, the previously defined centroids of the triangle of three feature points before and after the change, are the X, Y, and Z components of the relative displacement, as shown in Figure 9.
From the binocular vision measurement results, the coordinates of the three initial feature points in the camera coordinate system are known: M11 (x11, y11, z11), M12 (x12, y12, z12), M13 (x13, y13, z13), together with the corresponding three feature point coordinates after the attitude change: M21 (x21, y21, z21), M22 (x22, y22, z22), M23 (x23, y23, z23). The coordinates after the attitude change are obtained from the coordinates before the change by rotation and translation. The measured relative change of the space target attitude is expressed by six parameters: the relative displacements in the X, Y, and Z directions, the pitch angle θ of rotation about the X axis, the yaw angle ψ of rotation about the Y axis, and the spin angle φ of rotation about the Z axis.
(3) Compute the centroids of the triangles determined by the feature points:

O1′ = ( (1/3) Σ_{i=1}^{3} x1i, (1/3) Σ_{i=1}^{3} y1i, (1/3) Σ_{i=1}^{3} z1i )

O2′ = ( (1/3) Σ_{i=1}^{3} x2i, (1/3) Σ_{i=1}^{3} y2i, (1/3) Σ_{i=1}^{3} z2i )

The three components of the relative displacement, given by the displacement between the two centroids, are:

x_trans = xo2 − xo1
y_trans = yo2 − yo1
z_trans = zo2 − zo1
(4) Compute the projections onto each coordinate plane at the center points O1′, O2′ and the corresponding angles:
D) Project onto the YOZ plane: denote the projections of the direction vectors at O1′ and O2′ as P1 and P2 respectively, and compute their angles angle_original and angle_new with the Y axis; the relative pitch angle is then:

θ = angle_new − angle_original
E) Project onto the XOZ plane: denote the projections as P1 and P2 respectively, and compute their angles angle_original and angle_new with the Z axis; the relative yaw angle is then:

ψ = angle_new − angle_original
F) Project onto the XOY plane: denote the projections as P1 and P2 respectively, and compute their angles angle_original and angle_new with the X axis; the relative spin angle is then:

φ = angle_new − angle_original
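Steps (3) and (4) can be sketched end-to-end in numpy. This is a hedged illustration, not the exact implementation of the invention: the function name is ours, the pitch and yaw angles are taken from projections of the plane normal, the spin angle from a projected in-plane vector (O′ to M1), and the atan2 argument order for each projection is an assumption consistent with the definitions above.

```python
import numpy as np

def relative_pose(pts1, pts2):
    """Six relative pose parameters from the three feature points
    before (pts1) and after (pts2) the attitude change, per steps
    (3) and (4): centroid displacement plus the change of the
    projected normal-vector angles (pitch, yaw) and of a projected
    in-plane vector angle (spin)."""
    pts1, pts2 = np.asarray(pts1, float), np.asarray(pts2, float)
    o1, o2 = pts1.mean(axis=0), pts2.mean(axis=0)        # O1', O2'
    trans = o2 - o1                                      # x/y/z_trans
    n1 = np.cross(pts1[1] - pts1[0], pts1[2] - pts1[0])  # plane normals
    n2 = np.cross(pts2[1] - pts2[0], pts2[2] - pts2[0])
    v1, v2 = pts1[0] - o1, pts2[0] - o2                  # in-plane vectors
    # D) YOZ projection, angle with the Y axis -> relative pitch
    theta = np.arctan2(n2[2], n2[1]) - np.arctan2(n1[2], n1[1])
    # E) XOZ projection, angle with the Z axis -> relative yaw
    psi = np.arctan2(n2[0], n2[2]) - np.arctan2(n1[0], n1[2])
    # F) XOY projection, angle with the X axis -> relative spin
    phi = np.arctan2(v2[1], v2[0]) - np.arctan2(v1[1], v1[0])
    return trans, theta, psi, phi
```

For a rigid motion built from a pure rotation about OX plus a translation, the routine recovers the rotation angle as the relative pitch and the centroid shift as the displacement, with zero yaw and spin.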
The vision-based space target pose measuring method of the present invention consists mainly of four modules: the camera calibration module for the binocular images based on a 2D planar target, the feature point extraction module for the binocular images, the feature point matching module for the binocular images, and the attitude parameter measurement module.
The camera calibration module for the binocular images computes the distortion parameters of the cameras and performs the epipolar rectification of the images; the feature point extraction module extracts the Harris feature corners of the left and right binocular images; the feature point matching module matches the Harris feature corners between the left and right images; the attitude parameter measurement module measures the position and attitude parameters of the space target from the matched feature points.
These four modules together implement, in software, the acquisition of the space target's position and attitude parameters from the initial left and right binocular images.
Using the technology and apparatus of the present invention, the overall vision-based space target pose measuring system can reach the following technical specifications:
● recognition distance range: 20 m to 0.5 m
● camera field-of-view angle: 70–100°
● correct measurement probability: 95%
● baseline length range: 0.1–0.5 m
● position measurement accuracy: 0.1 m–0.5 m
● data update rate: 1–5 Hz

Claims (5)

1. A vision-based space target pose measuring apparatus, characterized in that: it is made up of the following three parts: an electronically controlled displacement platform, an image capture device, and an integrated information processing platform; the electronically controlled displacement platform consists of a six-degree-of-freedom high-precision digital servo displacement platform and a displacement platform control box, and the integrated information processing platform consists of an information interface, a servo control processing unit, and a high-speed computer; the image capture device mainly comprises two CCD (Charge-Coupled Device image sensor) cameras; the six-degree-of-freedom high-precision digital servo displacement platform is the supporting platform of the image capture device and can move with six degrees of freedom according to the control instructions of the displacement platform control box, enabling verification of the measurement results during the binocular measuring process; the high-speed computer receives the left and right image information from the two CCD cameras and performs a series of information processing steps on the two binocular images, including space target feature extraction, feature matching, feature tracking, motion parameter calculation, and 3D reconstruction; the servo control processing unit sends control commands to the electronically controlled displacement platform, so that the six-degree-of-freedom high-precision digital servo displacement platform moves according to the input instructions;
The relationship among the components of the vision-based space target pose measuring apparatus is: the image capture device (the two CCD cameras) is installed at the two ends of the support of the six-degree-of-freedom high-precision digital servo displacement platform of the electronically controlled displacement platform; its relative position can be adjusted, and it moves together with the servo displacement platform; both the image capture device and the electronically controlled displacement platform are connected to the integrated information processing platform, which realizes the transmission of the images acquired by the image capture device to the information processing platform and the transmission of control instructions from the information processing platform to the servo platform;
The workflow of the vision-based space target pose measuring apparatus is: first, the servo control processing unit of the integrated information processing platform sends instructions to the six-degree-of-freedom high-precision digital servo displacement platform of the electronically controlled displacement platform, making it approach the space target according to a preset program; meanwhile, during the motion, the image capture device installed on the six-degree-of-freedom high-precision digital servo displacement platform continuously acquires binocular sequence images of the space target and passes the collected images to the high-speed computer of the information processing platform over a serial port; the high-speed computer, by a series of information processing steps (mainly image processing), calculates the concrete position and attitude parameters of the space target during the approach; finally, the parameters of the space target obtained by the image processing computation are compared with the motion parameters of the six-degree-of-freedom high-precision digital servo displacement platform of the electronically controlled displacement platform, in order to verify the correctness of the measurement results and support subsequent measurement accuracy analysis.
2. A vision-based space target pose measuring method, characterized in that: the method comprises four modules: a camera calibration module for the binocular images based on a 2D planar target, a feature point extraction module for the binocular images, a feature point matching module for the binocular images, and an attitude parameter measurement module; wherein: the calibration algorithm adopted by the camera calibration module for the binocular images based on a 2D planar target is the camera calibration method based on a 2D planar target proposed by Zhang Zhengyou et al.; the feature points of the binocular images are extracted by the Harris corner extraction method; the feature points of the binocular images are matched by stereo matching combining the epipolar constraint and feature matching; the parameter measurement process is a calculation of the relative pose: the binocular measurement of the relative pose obtains the range information of feature points on the space target surface by the binocular vision method, builds a geometric model of the space target from the spatial position relations of the feature points, and, by continuously tracking and measuring the feature points, uses the relations of the geometric model to calculate the relative change of the six-degree-of-freedom pose of the space target.
3. The vision-based space target pose measuring method according to claim 2, wherein the matching of the feature points of the binocular images adopts the epipolar constraint, characterized in that: using the calibration result of the binocular stereo vision system, the epipolar constraint between the left and right views is established, and the images are rectified according to this relation so that the epipolar lines are adjusted to the horizontal; in the rectified images, corresponding match points lie essentially on the same horizontal line.
4. The vision-based space target pose measuring method according to claim 2, wherein the matching of the feature points of the binocular images adopts feature matching, characterized in that: each potential match point is matched with the original feature point according to the normalized squared gray-level difference criterion over the gray values of its eight-neighborhood; at the same time, a symmetric calculation is performed between the left and right images, and match pairs that satisfy the matching conditions in only one direction, or in neither direction, are regarded as false matches; only pairs that remain match points in the symmetric calculation are correct matching feature point pairs.
5. The vision-based space target pose measuring method according to claim 2, characterized in that: the parameters of the space target obtained by the image processing computation are compared with the motion parameters of the six-degree-of-freedom high-precision digital servo displacement platform of the electronically controlled displacement platform, in order to verify the correctness of the measurement results and support subsequent measurement accuracy analysis.
CN2008102253292A 2008-10-30 2008-10-30 Space target position and pose measuring device and method based on vision Expired - Fee Related CN101419055B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008102253292A CN101419055B (en) 2008-10-30 2008-10-30 Space target position and pose measuring device and method based on vision


Publications (2)

Publication Number Publication Date
CN101419055A true CN101419055A (en) 2009-04-29
CN101419055B CN101419055B (en) 2010-08-25

Family

ID=40629961

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008102253292A Expired - Fee Related CN101419055B (en) 2008-10-30 2008-10-30 Space target position and pose measuring device and method based on vision

Country Status (1)

Country Link
CN (1) CN101419055B (en)

Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102075686A (en) * 2011-02-10 2011-05-25 北京航空航天大学 Robust real-time on-line camera tracking method
CN102141376A (en) * 2011-01-06 2011-08-03 大连理工大学 Auxiliary reference-based machine vision detection system and method
CN102324198A (en) * 2011-09-28 2012-01-18 哈尔滨工业大学 Aircraft attitude tracking control teaching experimental device based on polar coordinate target and three-axle table
CN102353931A (en) * 2011-09-02 2012-02-15 北京邮电大学 Relative positioning method for spatial object
CN102506815A (en) * 2011-11-10 2012-06-20 河北汉光重工有限责任公司 Multi-target tracking and passive distance measuring device based on image recognition
CN102831407A (en) * 2012-08-22 2012-12-19 中科宇博(北京)文化有限公司 Method for realizing vision identification system of biomimetic mechanical dinosaur
CN102867073A (en) * 2011-07-08 2013-01-09 中国民航科学技术研究院 Flight program design system for performance-based navigation, verification platform and verification method
CN103220975A (en) * 2010-10-08 2013-07-24 泰莱伊奥斯有限责任公司 Apparatus and method for mapping a three-imensional space in medical applications for diagnostic, surgical or interventional medicine purposes
CN103262122A (en) * 2010-12-23 2013-08-21 阿尔卡特朗讯公司 Integrated method for camera planning and positioning
CN103635937A (en) * 2011-05-30 2014-03-12 原子能和辅助替代能源委员会 Method for locating a camera and for 3d reconstruction in a partially known environment
CN103753530A (en) * 2013-12-30 2014-04-30 西北工业大学 Extremely near visual servo control method for space tethered robot
CN104006803A (en) * 2014-06-20 2014-08-27 中国人民解放军国防科学技术大学 Camera shooting measurement method for rotation motion parameters of spinning stability spacecraft
CN104424382A (en) * 2013-08-21 2015-03-18 北京航天计量测试技术研究所 Multi-feature point position posture redundancy resolving method
CN105588581A (en) * 2015-12-16 2016-05-18 南京航空航天大学 On-orbit service relative navigation experiment platform and work method
CN105627819A (en) * 2016-02-25 2016-06-01 中国人民解放军武汉军械士官学校 Dynamic performance measuring method and device of automatic control system
CN106548173A (en) * 2016-11-24 2017-03-29 国网山东省电力公司电力科学研究院 A kind of improvement no-manned plane three-dimensional information getting method based on classification matching strategy
CN106643699A (en) * 2016-12-26 2017-05-10 影动(北京)科技有限公司 Space positioning device and positioning method in VR (virtual reality) system
CN106780607A (en) * 2016-11-24 2017-05-31 中国人民解放军国防科学技术大学 A kind of detection means of two moving ship with respect to six-freedom motion
CN106845515A (en) * 2016-12-06 2017-06-13 上海交通大学 Robot target identification and pose reconstructing method based on virtual sample deep learning
CN107067436A (en) * 2015-11-17 2017-08-18 株式会社东芝 Pose estimation unit and vacuum cleaning system
CN107179077A (en) * 2017-05-15 2017-09-19 北京航空航天大学 A kind of self-adaptive visual air navigation aid based on ELM LRF
CN104457761B (en) * 2014-11-18 2017-10-24 上海新跃仪表厂 The feature relay method of relative position and posture based on multi-vision visual
CN107462244A (en) * 2017-04-24 2017-12-12 北京航空航天大学 A kind of air remote sensing platform attitude angle high-precision measuring method matched based on GPS location and aerial map picture
CN107655664A (en) * 2017-08-29 2018-02-02 农业部南京农业机械化研究所 Spraying machine spray lance dynamical property test system and method based on binocular image collection
CN107782244A (en) * 2017-10-24 2018-03-09 南京航空航天大学 A kind of six degree of freedom thin tail sheep detection method of view-based access control model
WO2018095278A1 (en) * 2016-11-24 2018-05-31 腾讯科技(深圳)有限公司 Aircraft information acquisition method, apparatus and device
CN108195557A (en) * 2018-04-04 2018-06-22 绵阳浩微科技有限公司佰腾分公司 Model attitude non-contact measurement system based on binocular vision
CN108225316A (en) * 2016-12-22 2018-06-29 成都天府新区光启未来技术研究院 The acquisition methods and apparatus and system of attitude of carrier information
CN108280043A (en) * 2018-01-29 2018-07-13 北京润科通用技术有限公司 A kind of method and system of fast prediction flight path
CN108460804A (en) * 2018-03-20 2018-08-28 重庆大学 A kind of Three Degree Of Freedom position and posture detection method of transhipment docking mechanism and transhipment docking mechanism based on machine vision
WO2018161555A1 (en) * 2017-03-06 2018-09-13 广州视源电子科技股份有限公司 Object pose detection method and device
CN109035343A (en) * 2018-07-12 2018-12-18 福州大学 A kind of floor relative displacement measurement method based on monitoring camera
CN109154815A (en) * 2017-11-30 2019-01-04 深圳市大疆创新科技有限公司 Maximum temperature point-tracking method, device and unmanned plane
CN109166150A (en) * 2018-10-16 2019-01-08 青岛海信电器股份有限公司 Obtain the method, apparatus storage medium of pose
CN109272453A (en) * 2018-08-31 2019-01-25 盎锐(上海)信息科技有限公司 Model building device and localization method based on 3D video camera
CN109308722A (en) * 2018-11-26 2019-02-05 陕西远航光电有限责任公司 A kind of spatial pose measuring system and method based on active vision
CN109360245A (en) * 2018-10-26 2019-02-19 魔视智能科技(上海)有限公司 The external parameters calibration method of automatic driving vehicle multicamera system
CN109405635A (en) * 2018-09-17 2019-03-01 南京理工大学 Camera optical axis vertical orthogonal shadowgraph station calibration system and its adjustment method
CN109426835A (en) * 2017-08-31 2019-03-05 佳能株式会社 Information processing unit, the control method of information processing unit and storage medium
CN110057341A (en) * 2019-03-05 2019-07-26 西安工业大学 A kind of binocular stereo vision measurement pose refers to platform
CN110189375A (en) * 2019-06-26 2019-08-30 中国科学院光电技术研究所 A kind of images steganalysis method based on monocular vision measurement
CN110827337A (en) * 2018-08-08 2020-02-21 深圳地平线机器人科技有限公司 Method and device for determining posture of vehicle-mounted camera and electronic equipment
CN110849331A (en) * 2019-11-04 2020-02-28 上海航天控制技术研究所 Monocular vision measurement and ground test method based on three-dimensional point cloud database model
CN110958445A (en) * 2019-11-12 2020-04-03 中山大学 Calibration system for intelligently tracking camera module
CN111006591A (en) * 2019-10-29 2020-04-14 国网浙江省电力有限公司电力科学研究院 Method for non-contact measurement of displacement inversion stress of GIS (gas insulated switchgear)
CN111024047A (en) * 2019-12-26 2020-04-17 北京航空航天大学 Six-degree-of-freedom pose measurement device and method based on orthogonal binocular vision
CN111323048A (en) * 2020-02-28 2020-06-23 上海航天控制技术研究所 Performance test method and system for single relative attitude measurement machine
CN111540019A (en) * 2020-04-27 2020-08-14 深圳市瑞立视多媒体科技有限公司 Method, device and equipment for determining installation position of camera and storage medium
CN111595289A (en) * 2020-05-25 2020-08-28 湖北三江航天万峰科技发展有限公司 Three-dimensional angle measurement system and method based on image processing
CN111798466A (en) * 2020-07-01 2020-10-20 中国海洋石油集团有限公司 Method and system for measuring kinetic energy of drilling support platform in real time based on visual positioning
CN111882621A (en) * 2020-07-22 2020-11-03 武汉大学 Rice thickness parameter automatic measurement method based on binocular image
CN112200126A (en) * 2020-10-26 2021-01-08 上海盛奕数字科技有限公司 Method for identifying limb shielding gesture based on artificial intelligence running
CN112198884A (en) * 2020-07-27 2021-01-08 北京理工大学 Unmanned aerial vehicle mobile platform landing method based on visual guidance
CN112381880A (en) * 2020-11-27 2021-02-19 航天科工智能机器人有限责任公司 Binocular vision pose estimation method based on circle features
CN112486190A (en) * 2020-10-16 2021-03-12 北京电子工程总体研究所 Comprehensive test system for realizing attitude control
CN112639883A (en) * 2020-03-17 2021-04-09 华为技术有限公司 Relative attitude calibration method and related device
CN113494883A (en) * 2020-03-20 2021-10-12 湖南科天健光电技术有限公司 Turntable load pose measurement method and system based on external multi-view vision equipment
CN114715447A (en) * 2022-04-19 2022-07-08 北京航空航天大学 Cell spacecraft module docking device and visual alignment method
CN115420277A (en) * 2022-08-31 2022-12-02 北京航空航天大学 Object pose measuring method and electronic equipment
CN116539068A (en) * 2023-07-03 2023-08-04 国网山西省电力公司电力科学研究院 Flexible self-checking adjusting device and method for vision measurement system

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102175251A (en) * 2011-03-25 2011-09-07 江南大学 Binocular intelligent navigation system
CN108082539B (en) * 2017-12-08 2019-09-03 中国科学院光电研究院 A kind of high rail of optical measurement revolves the Satellite Formation Flying of unstability target with respect to racemization system and method slowly

Cited By (91)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103220975A (en) * 2010-10-08 2013-07-24 泰莱伊奥斯有限责任公司 Apparatus and method for mapping a three-imensional space in medical applications for diagnostic, surgical or interventional medicine purposes
CN103262122B (en) * 2010-12-23 2017-04-05 阿尔卡特朗讯公司 For the integrated approach of camera planning and positioning
CN103262122A (en) * 2010-12-23 2013-08-21 阿尔卡特朗讯公司 Integrated method for camera planning and positioning
CN102141376A (en) * 2011-01-06 2011-08-03 大连理工大学 Auxiliary reference-based machine vision detection system and method
CN102075686A (en) * 2011-02-10 2011-05-25 北京航空航天大学 Robust real-time on-line camera tracking method
CN102075686B (en) * 2011-02-10 2013-10-30 北京航空航天大学 Robust real-time on-line camera tracking method
CN103635937B (en) * 2011-05-30 2017-02-15 原子能和辅助替代能源委员会 Method for locating a camera and for 3d reconstruction in a partially known environment
CN103635937A (en) * 2011-05-30 2014-03-12 原子能和辅助替代能源委员会 Method for locating a camera and for 3d reconstruction in a partially known environment
CN102867073B (en) * 2011-07-08 2014-12-24 中国民航科学技术研究院 Flight program design system for performance-based navigation, verification platform and verification method
CN102867073A (en) * 2011-07-08 2013-01-09 中国民航科学技术研究院 Flight program design system for performance-based navigation, verification platform and verification method
CN102353931A (en) * 2011-09-02 2012-02-15 北京邮电大学 Relative positioning method for spatial object
CN102324198B (en) * 2011-09-28 2013-03-13 哈尔滨工业大学 Aircraft attitude tracking control teaching experimental device based on polar coordinate target and three-axle table
CN102324198A (en) * 2011-09-28 2012-01-18 哈尔滨工业大学 Aircraft attitude tracking control teaching experimental device based on polar coordinate target and three-axle table
CN102506815A (en) * 2011-11-10 2012-06-20 河北汉光重工有限责任公司 Multi-target tracking and passive distance measuring device based on image recognition
CN102831407A (en) * 2012-08-22 2012-12-19 中科宇博(北京)文化有限公司 Method for realizing vision identification system of biomimetic mechanical dinosaur
CN102831407B (en) * 2012-08-22 2014-10-29 中科宇博(北京)文化有限公司 Method for realizing vision identification system of biomimetic mechanical dinosaur
CN104424382A (en) * 2013-08-21 2015-03-18 北京航天计量测试技术研究所 Multi-feature point position posture redundancy resolving method
CN104424382B (en) * 2013-08-21 2017-09-29 北京航天计量测试技术研究所 A kind of multi-characteristic points position and attitude redundancy calculation method
CN103753530A (en) * 2013-12-30 2014-04-30 西北工业大学 Extremely near visual servo control method for space tethered robot
CN104006803A (en) * 2014-06-20 2014-08-27 中国人民解放军国防科学技术大学 Camera shooting measurement method for rotation motion parameters of spinning stability spacecraft
CN104006803B (en) * 2014-06-20 2016-02-03 中国人民解放军国防科学技术大学 The photographing measurement method of spin stabilization spacecraft rotational motion parameter
CN104457761B (en) * 2014-11-18 2017-10-24 上海新跃仪表厂 The feature relay method of relative position and posture based on multi-vision visual
CN107067436B (en) * 2015-11-17 2021-01-05 株式会社东芝 Pose estimation device and vacuum cleaner system
CN107067436A (en) * 2015-11-17 2017-08-18 株式会社东芝 Pose estimation unit and vacuum cleaning system
CN105588581A (en) * 2015-12-16 2016-05-18 南京航空航天大学 On-orbit service relative navigation experiment platform and work method
CN105588581B (en) * 2015-12-16 2019-04-09 南京航空航天大学 A kind of in-orbit service Relative Navigation experiment porch and working method
CN105627819B (en) * 2016-02-25 2017-09-19 中国人民解放军武汉军械士官学校 A kind of automatic control system dynamic performance detection method and device
CN105627819A (en) * 2016-02-25 2016-06-01 中国人民解放军武汉军械士官学校 Dynamic performance measuring method and device of automatic control system
WO2018095278A1 (en) * 2016-11-24 2018-05-31 腾讯科技(深圳)有限公司 Aircraft information acquisition method, apparatus and device
CN106780607A (en) * 2016-11-24 2017-05-31 中国人民解放军国防科学技术大学 A kind of detection means of two moving ship with respect to six-freedom motion
CN106548173B (en) * 2016-11-24 2019-04-09 国网山东省电力公司电力科学研究院 A kind of improvement no-manned plane three-dimensional information acquisition method based on classification matching strategy
CN106548173A (en) * 2016-11-24 2017-03-29 国网山东省电力公司电力科学研究院 A kind of improvement no-manned plane three-dimensional information getting method based on classification matching strategy
US10942529B2 (en) 2016-11-24 2021-03-09 Tencent Technology (Shenzhen) Company Limited Aircraft information acquisition method, apparatus and device
CN106845515B (en) * 2016-12-06 2020-07-28 上海交通大学 Robot target identification and pose reconstruction method based on virtual sample deep learning
CN106845515A (en) * 2016-12-06 2017-06-13 上海交通大学 Robot target identification and pose reconstructing method based on virtual sample deep learning
CN108225316B (en) * 2016-12-22 2023-12-29 成都天府新区光启未来技术研究院 Carrier attitude information acquisition method, device and system
CN108225316A (en) * 2016-12-22 2018-06-29 成都天府新区光启未来技术研究院 The acquisition methods and apparatus and system of attitude of carrier information
CN106643699B (en) * 2016-12-26 2023-08-04 北京互易科技有限公司 Space positioning device and positioning method in virtual reality system
CN106643699A (en) * 2016-12-26 2017-05-10 影动(北京)科技有限公司 Space positioning device and positioning method in VR (virtual reality) system
WO2018161555A1 (en) * 2017-03-06 2018-09-13 广州视源电子科技股份有限公司 Object pose detection method and device
CN107462244A (en) * 2017-04-24 2017-12-12 北京航空航天大学 A kind of air remote sensing platform attitude angle high-precision measuring method matched based on GPS location and aerial map picture
CN107179077A (en) * 2017-05-15 2017-09-19 北京航空航天大学 A kind of self-adaptive visual air navigation aid based on ELM LRF
CN107655664B (en) * 2017-08-29 2024-02-02 农业部南京农业机械化研究所 System and method for testing dynamic characteristics of spray boom of sprayer based on binocular image acquisition
CN107655664A (en) * 2017-08-29 2018-02-02 农业部南京农业机械化研究所 Spraying machine spray lance dynamical property test system and method based on binocular image collection
CN109426835B (en) * 2017-08-31 2022-08-30 佳能株式会社 Information processing apparatus, control method of information processing apparatus, and storage medium
CN109426835A (en) * 2017-08-31 2019-03-05 佳能株式会社 Information processing unit, the control method of information processing unit and storage medium
CN107782244A (en) * 2017-10-24 2018-03-09 南京航空航天大学 A kind of six degree of freedom thin tail sheep detection method of view-based access control model
CN107782244B (en) * 2017-10-24 2019-07-26 南京航空航天大学 A kind of six degree of freedom thin tail sheep detection method of view-based access control model
US11798172B2 (en) 2017-11-30 2023-10-24 SZ DJI Technology Co., Ltd. Maximum temperature point tracking method, device and unmanned aerial vehicle
US11153494B2 (en) 2017-11-30 2021-10-19 SZ DJI Technology Co., Ltd. Maximum temperature point tracking method, device and unmanned aerial vehicle
CN109154815A (en) * 2017-11-30 2019-01-04 深圳市大疆创新科技有限公司 Maximum temperature point-tracking method, device and unmanned plane
CN108280043B (en) * 2018-01-29 2021-11-23 北京润科通用技术有限公司 Method and system for quickly predicting flight trajectory
CN108280043A (en) * 2018-01-29 2018-07-13 北京润科通用技术有限公司 A kind of method and system of fast prediction flight path
CN108460804A (en) * 2018-03-20 2018-08-28 重庆大学 A kind of Three Degree Of Freedom position and posture detection method of transhipment docking mechanism and transhipment docking mechanism based on machine vision
CN108195557A (en) * 2018-04-04 2018-06-22 绵阳浩微科技有限公司佰腾分公司 Model attitude non-contact measurement system based on binocular vision
CN109035343A (en) * 2018-07-12 2018-12-18 福州大学 A kind of floor relative displacement measurement method based on monitoring camera
CN110827337A (en) * 2018-08-08 2020-02-21 深圳地平线机器人科技有限公司 Method and device for determining posture of vehicle-mounted camera and electronic equipment
CN109272453A (en) * 2018-08-31 2019-01-25 盎锐(上海)信息科技有限公司 Model building device and localization method based on 3D video camera
CN109272453B (en) * 2018-08-31 2023-02-10 上海盎维信息技术有限公司 Modeling device and positioning method based on 3D camera
CN109405635A (en) * 2018-09-17 2019-03-01 南京理工大学 Camera optical axis vertical orthogonal shadowgraph station calibration system and its adjustment method
CN109166150A (en) * 2018-10-16 2019-01-08 青岛海信电器股份有限公司 Obtain the method, apparatus storage medium of pose
CN109166150B (en) * 2018-10-16 2021-06-01 海信视像科技股份有限公司 Pose acquisition method and device storage medium
CN109360245A (en) * 2018-10-26 2019-02-19 魔视智能科技(上海)有限公司 The external parameters calibration method of automatic driving vehicle multicamera system
CN109360245B (en) * 2018-10-26 2021-07-06 魔视智能科技(上海)有限公司 External parameter calibration method for multi-camera system of unmanned vehicle
CN109308722A (en) * 2018-11-26 2019-02-05 陕西远航光电有限责任公司 A kind of spatial pose measuring system and method based on active vision
CN110057341A (en) * 2019-03-05 2019-07-26 西安工业大学 A kind of binocular stereo vision measurement pose refers to platform
CN110189375A (en) * 2019-06-26 2019-08-30 中国科学院光电技术研究所 A kind of images steganalysis method based on monocular vision measurement
CN110189375B (en) * 2019-06-26 2022-08-23 中国科学院光电技术研究所 Image target identification method based on monocular vision measurement
CN111006591A (en) * 2019-10-29 2020-04-14 国网浙江省电力有限公司电力科学研究院 Method for non-contact measurement of displacement inversion stress of GIS (gas insulated switchgear)
CN110849331A (en) * 2019-11-04 2020-02-28 上海航天控制技术研究所 Monocular vision measurement and ground test method based on three-dimensional point cloud database model
CN110849331B (en) * 2019-11-04 2021-10-29 上海航天控制技术研究所 Monocular vision measurement and ground test method based on three-dimensional point cloud database model
CN110958445A (en) * 2019-11-12 2020-04-03 中山大学 Calibration system for intelligently tracking camera module
CN111024047B (en) * 2019-12-26 2021-03-12 北京航空航天大学 Six-degree-of-freedom pose measurement device and method based on orthogonal binocular vision
CN111024047A (en) * 2019-12-26 2020-04-17 北京航空航天大学 Six-degree-of-freedom pose measurement device and method based on orthogonal binocular vision
CN111323048A (en) * 2020-02-28 2020-06-23 上海航天控制技术研究所 Performance test method and system for single relative attitude measurement machine
CN112639883A (en) * 2020-03-17 2021-04-09 华为技术有限公司 Relative attitude calibration method and related device
CN113494883A (en) * 2020-03-20 2021-10-12 湖南科天健光电技术有限公司 Turntable load pose measurement method and system based on external multi-view vision equipment
CN113494883B (en) * 2020-03-20 2022-08-05 湖南科天健光电技术有限公司 Turntable load pose measurement method and system based on external multi-view vision equipment
CN111540019A (en) * 2020-04-27 2020-08-14 深圳市瑞立视多媒体科技有限公司 Method, device and equipment for determining installation position of camera and storage medium
CN111595289A (en) * 2020-05-25 2020-08-28 湖北三江航天万峰科技发展有限公司 Three-dimensional angle measurement system and method based on image processing
CN111798466A (en) * 2020-07-01 2020-10-20 中国海洋石油集团有限公司 Method and system for measuring kinetic energy of drilling support platform in real time based on visual positioning
CN111882621A (en) * 2020-07-22 2020-11-03 武汉大学 Rice thickness parameter automatic measurement method based on binocular image
CN112198884A (en) * 2020-07-27 2021-01-08 北京理工大学 Unmanned aerial vehicle mobile platform landing method based on visual guidance
CN112486190A (en) * 2020-10-16 2021-03-12 北京电子工程总体研究所 Comprehensive test system for realizing attitude control
CN112200126A (en) * 2020-10-26 2021-01-08 上海盛奕数字科技有限公司 Method for identifying limb shielding gesture based on artificial intelligence running
CN112381880A (en) * 2020-11-27 2021-02-19 航天科工智能机器人有限责任公司 Binocular vision pose estimation method based on circle features
CN114715447A (en) * 2022-04-19 2022-07-08 北京航空航天大学 Cell spacecraft module docking device and visual alignment method
CN115420277A (en) * 2022-08-31 2022-12-02 北京航空航天大学 Object pose measuring method and electronic equipment
CN115420277B (en) * 2022-08-31 2024-04-12 北京航空航天大学 Object pose measurement method and electronic equipment
CN116539068A (en) * 2023-07-03 2023-08-04 国网山西省电力公司电力科学研究院 Flexible self-checking adjusting device and method for vision measurement system
CN116539068B (en) * 2023-07-03 2023-09-08 国网山西省电力公司电力科学研究院 Flexible self-checking adjusting device and method for vision measurement system

Also Published As

Publication number Publication date
CN101419055B (en) 2010-08-25

Similar Documents

Publication Publication Date Title
CN101419055B (en) Space target position and pose measuring device and method based on vision
Kim et al. SLAM-driven robotic mapping and registration of 3D point clouds
US11034026B2 (en) Utilizing optical data to dynamically control operation of a snake-arm robot
CN107392964B (en) The indoor SLAM method combined based on indoor characteristic point and structure lines
JP5671281B2 (en) Position / orientation measuring apparatus, control method and program for position / orientation measuring apparatus
JP5588812B2 (en) Image processing apparatus and imaging apparatus using the same
CN107741234A (en) The offline map structuring and localization method of a kind of view-based access control model
CN109472831A (en) Obstacle recognition range-measurement system and method towards road roller work progress
US11654571B2 (en) Three-dimensional data generation device and robot control system
Xiao et al. 3D point cloud registration based on planar surfaces
CN112184812B (en) Method for improving identification and positioning precision of unmanned aerial vehicle camera to april tag and positioning method and system
CN110334701B (en) Data acquisition method based on deep learning and multi-vision in digital twin environment
CN110260866A (en) A kind of robot localization and barrier-avoiding method of view-based access control model sensor
CN104463899A (en) Target object detecting and monitoring method and device
Momeni-k et al. Height estimation from a single camera view
US20130202212A1 (en) Information processing apparatus, information processing method, and computer program
Liao et al. Extrinsic calibration of lidar and camera with polygon
Yang et al. Infrared LEDs-based pose estimation with underground camera model for boom-type roadheader in coal mining
CN110044374A (en) A kind of method and odometer of the monocular vision measurement mileage based on characteristics of image
Thomas et al. Multi sensor fusion in robot assembly using particle filters
JP6410231B2 (en) Alignment apparatus, alignment method, and computer program for alignment
CN201355241Y (en) Visual-based space target pose measuring device
CN114581632A (en) Method, equipment and device for detecting assembly error of part based on augmented reality technology
JP5976089B2 (en) Position / orientation measuring apparatus, position / orientation measuring method, and program
Phan et al. Towards 3D human posture estimation using multiple kinects despite self-contacts

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20100825

Termination date: 20121030