CN108317953A - Binocular vision target surface 3D detection method and system based on an unmanned aerial vehicle - Google Patents

Binocular vision target surface 3D detection method and system based on an unmanned aerial vehicle (UAV) Download PDF

Info

Publication number
CN108317953A
CN108317953A (application CN201810081160.1A)
Authority
CN
China
Prior art keywords
point
binocular vision
unmanned plane
image
target surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810081160.1A
Other languages
Chinese (zh)
Inventor
侯民
侯一民
薛明镇
刘峻杭
伦向敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeast Electric Power University
Original Assignee
Northeast Dianli University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeast Dianli University filed Critical Northeast Dianli University
Priority to CN201810081160.1A priority Critical patent/CN108317953A/en
Publication of CN108317953A publication Critical patent/CN108317953A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Abstract

The invention discloses a binocular vision target surface 3D detection system based on an unmanned aerial vehicle (UAV), comprising a UAV, a binocular vision system, a computer system and a remote control device. The detection method is as follows: the binocular vision system carried by the UAV traverses the target surface and acquires image data; features are extracted from the collected left and right images and matched to obtain the disparity information of the image pair; combined with the intrinsic and extrinsic camera parameters, the three-dimensional coordinates of the image points are computed, giving a local 3D point cloud of the target; by extracting and matching features between the 3D point clouds of adjacent left-camera images, the transformation between the two point cloud coordinate systems is obtained, realizing pairwise registration of the point clouds; finally the 3D point cloud data of the entire target surface is obtained and compared with the original 3D data of the target surface to detect the target. The invention is applicable to the surface inspection of large targets such as large industrial equipment, and is easy to operate and efficient.

Description

Binocular vision target surface 3D detection method and system based on an unmanned aerial vehicle
Technical field
The present invention relates to the technical field of visual measurement, and in particular to a binocular vision target surface 3D detection method and system based on an unmanned aerial vehicle (UAV).
Background technology
For the surface 3D reconstruction of a large target under high-precision requirements, a fixed detector cannot perceive the whole 3D surface. Only a moving detector can be used to scan the target, detecting the 3D data of the target surface block by block in a dynamic coordinate system and stitching the blocks afterwards to obtain the final result. Image-based 3D reconstruction and the subsequent stitching of the reconstructed 3D views therefore occupy a very important position in this application field. At present, research on 3D visual reconstruction of large target surfaces is uncommon, and even fewer researchers address the surface inspection of more complex large equipment. Research results in this area would therefore greatly promote industrial informatization, production efficiency and production safety.
Since UAVs have the advantages of flexible flight, low cost, long endurance, large control range and large payload, it becomes possible to perform surface inspection of large targets such as boilers, reaction towers and distillation towers by mounting a binocular camera on a UAV.
Existing research on binocular vision 3D detection is as follows:
A retrieved patent with application No. CN105716530A, entitled "Vehicle dimension measurement method based on binocular stereo vision", uses cameras to collect vehicle scene images from five directions around the vehicle, computes the coordinate difference of the same pixel point in two scene images, then calculates the depth of the vehicle scene points and finally the dimensions of the vehicle. In that method the cameras are fixedly installed in five directions, so only relatively small, movable objects at a fixed location can be measured; for large outdoor plant at a fixed position a suitable installation site may not be found, the work is time-consuming and laborious, and the overall cost also rises.
The airborne 3D surveying methods currently applied to large fixed targets mainly carry a 3D laser radar that scans and collects local 3D point clouds, which are then combined with the UAV pose to complete the 3D surveying task. A retrieved patent with publication No. CN106443705A, entitled "Airborne laser radar measuring system and method", performs 3D surveying in this way; however, 3D laser radars are expensive and heavy, place high demands on the UAV's payload capacity, cannot be widely deployed on small UAVs, and raise the system cost.
Summary of the invention
To solve the above problems, the present invention provides a binocular vision target surface 3D detection method and system based on a UAV.
To achieve the above object, the present invention adopts the following technical solution:
A UAV-based binocular vision target surface 3D detection system, comprising:
a UAV, which carries a binocular vision system at its bottom and acquires the image data of the target surface sequentially from top to bottom and from left to right;
a computer system, which processes the images collected by the binocular vision system, including image feature extraction, feature matching, obtaining depth information, calculating 3D coordinates and 3D image stitching, and outputs the detection result;
a remote control device, which controls the flight path of the UAV and the on/off state of the binocular vision system through a radio transmission module.
The UAV comprises:
a battery pack, which supplies power to the UAV's communication equipment, image acquisition equipment and power equipment;
an onboard control unit, which receives instructions from the ground control centre;
an attitude measurement system, which measures the pose of the UAV in real time and records the attitude information at each moment;
a WiFi/radio transmission module, which communicates wirelessly with the remote control device and receives the control instructions it sends;
a binocular vision system, fixed to the UAV with the cameras facing the target surface, which acquires the target images.
The binocular vision system comprises:
a pose measuring device, which measures the pose of the binocular vision system;
left and right cameras, which acquire the image data of the target;
a memory card, which stores the image data collected by the left and right cameras separately.
The left and right cameras of the binocular vision system are placed in parallel and have identical parameters.
The present invention also provides a UAV-based binocular vision target surface 3D detection method, comprising the following steps:
Step 1: the UAV is controlled by the remote control device to acquire images along the desired trajectory; after acquisition, the images on the memory card are imported into the computer for processing;
Step 2: the images acquired by the airborne binocular vision system are numbered in chronological order to form binocular left/right image pairs;
Step 3: camera calibration yields the intrinsic and extrinsic camera parameters, including the focal length f and the baseline distance T;
Step 4: the collected images are denoised and their pose is adjusted;
Step 5: feature points are extracted from the left and right images taken at the same moment and matched to obtain disparity information, and the 3D coordinates of the image points are calculated, giving a local 3D point cloud;
Step 6: the 3D point clouds of adjacent images taken by the left camera are registered, unifying the two point clouds into one coordinate system and yielding the 3D point cloud model of the entire target surface;
Step 7: the obtained 3D point cloud model is compared with the surface factory data supplied with the large equipment to detect whether there is any difference.
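For illustration, a minimal sketch of the step-7 comparison, assuming the reconstructed cloud and the reference surface data (for example a cloud sampled from the factory model) are both available as N×3 NumPy arrays in a common coordinate system; the tolerance value and the random stand-in data are purely illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_deviation(measured_xyz, reference_xyz, tolerance=0.005):
    """Distance from every measured point to the nearest reference point.

    measured_xyz, reference_xyz: (N, 3) arrays in the same coordinate system.
    Returns per-point deviations and a boolean mask of out-of-tolerance points.
    """
    tree = cKDTree(reference_xyz)
    dist, _ = tree.query(measured_xyz, k=1)
    return dist, dist > tolerance

# Illustrative usage with random stand-in data
measured = np.random.rand(10000, 3)
reference = measured + np.random.normal(0, 0.001, measured.shape)
dev, defect_mask = surface_deviation(measured, reference)
print("max deviation:", dev.max(), "suspect points:", defect_mask.sum())
```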
In step 4, the image pose is corrected by an affine transformation with reference to the attitude information of the UAV.
Step 5 specifically comprises the following steps:
S51: SURF feature points are extracted. A 4*4 block of rectangular sub-regions is taken around each feature point, and the horizontal and vertical Haar wavelet responses of 25 pixels are counted in each sub-region. The Haar wavelet features are the sum of the horizontal responses, the sum of the vertical responses, the sum of the absolute horizontal responses and the sum of the absolute vertical responses; these 4 values form the feature vector of each sub-region. The matching degree is determined by computing the Euclidean distance between two feature points: the shorter the distance, the better the match. Disparity information is obtained from the matched feature points.
S52: from the disparity d = x_r - x_l and the intrinsic and extrinsic camera parameters, the 3D coordinates of a point are obtained as Z = fT/d, X = x_l·Z/f, Y = y_l·Z/f (with image coordinates taken relative to the principal point).
The 3D point cloud registration in step 6 can be divided into the following steps:
S1: a K neighborhood is established for each point of the adjacent 3D point cloud coordinate systems;
S2: feature points are extracted, a feature descriptor is built for each feature point, and the feature points of the adjacent 3D point clouds are matched;
S3: the transformation between the two point cloud coordinate systems is computed from the matched point pairs, completing the initial registration;
S4: starting from the initial position given by the initial registration, the adjacent point cloud coordinate systems are iterated with the iterative closest point (ICP) algorithm until the accurate transformation between them is obtained, unifying the adjacent point cloud coordinate systems into one coordinate system.
In step S1, a KD-tree is used to accelerate the search for the K neighborhood of each point.
In step S2, the normal vector of each point is estimated by principal component analysis, and feature points are extracted using the normal vector variation as the criterion. According to the geometric relationship between a feature point and its neighboring points, several features are chosen to describe the feature point; the statistical properties of a histogram are used to accumulate these features over the feature point and the points in its K neighborhood, giving a histogram descriptor and hence a feature vector. The distance between the feature vectors of two points is used as the matching criterion, and the pair with the minimum feature vector distance is a matched point pair.
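As an illustration of steps S1 and S2, a minimal NumPy/SciPy sketch of KD-tree neighborhood search and PCA normal estimation; the choice of K and the omission of the histogram descriptor (only the normal estimation is shown) are simplifying assumptions rather than the patent's exact feature design:

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=20):
    """PCA normal per point from its K neighborhood.

    points: (N, 3) array. Returns (N, 3) unit normals: the eigenvector of the
    neighborhood covariance with the smallest eigenvalue.
    """
    tree = cKDTree(points)                    # KD-tree accelerates the K-NN search (S1)
    _, idx = tree.query(points, k=k)          # K neighborhood of every point
    normals = np.empty_like(points)
    for i, nbr in enumerate(idx):
        cov = np.cov(points[nbr].T)           # 3x3 covariance of the neighborhood
        eigval, eigvec = np.linalg.eigh(cov)  # eigenvalues in ascending order
        normals[i] = eigvec[:, 0]             # smallest eigenvalue -> surface normal (S2)
    return normals
```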
In step S3, the transformation matrix between adjacent point cloud coordinate systems is obtained with the quaternion method, implemented as follows. S31: compute the centroids of the source point set P and the target point set Q: μ_P = (1/N_P)·Σ P_i, μ_Q = (1/N_Q)·Σ Q_i;
where μ_P and μ_Q are the 3D centroid coordinates of P and Q respectively, N_P and N_Q are the numbers of points in P and Q, and P_i and Q_i are the i-th points of P and Q.
S32: construct the covariance matrix from the point sets P and Q: Σ_{P,Q} = (1/N_P)·Σ (P_i - μ_P)(Q_i - μ_Q)^T;
S33: construct a 4×4 symmetric matrix from the covariance matrix: Q(Σ_{P,Q}) = [ tr(Σ_{P,Q}), Δ^T ; Δ, Σ_{P,Q} + Σ_{P,Q}^T - tr(Σ_{P,Q})·I_3 ];
where I_3 is the 3×3 identity matrix, tr(Σ_{P,Q}) is the trace of Σ_{P,Q},
Δ = [A_23 A_31 A_12]^T and A_ij = (Σ_{P,Q} - Σ_{P,Q}^T)_ij;
S34: compute the eigenvalues and eigenvectors of Q(Σ_{P,Q}); the eigenvector corresponding to the maximum eigenvalue is the desired rotation quaternion
q_R = [q_0 q_1 q_2 q_3]^T;
S35: the rotation matrix R is obtained from the rotation quaternion q_R, and the optimal translation vector is then q_T = μ_Q - R·μ_P.
In step S4, unifying two adjacent point cloud coordinate systems X_{i-1} and X_i into the same coordinate system is realized by:
X_{i-1} = X_i·R + T, where R is the rotation matrix and T is the translation vector.
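A minimal NumPy sketch of the quaternion solution S31-S35, assuming the matched point pairs are supplied as corresponding rows of P and Q:

```python
import numpy as np

def quaternion_registration(P, Q):
    """Rigid transform (R, T) with Q_i ≈ R @ P_i + T from matched point pairs.

    P, Q: (N, 3) arrays of corresponding source / target points.
    """
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)            # S31: centroids
    sigma = (P - mu_p).T @ (Q - mu_q) / len(P)              # S32: cross-covariance
    A = sigma - sigma.T
    delta = np.array([A[1, 2], A[2, 0], A[0, 1]])           # [A23, A31, A12]^T
    Qmat = np.zeros((4, 4))                                 # S33: 4x4 symmetric matrix
    Qmat[0, 0] = np.trace(sigma)
    Qmat[0, 1:] = delta
    Qmat[1:, 0] = delta
    Qmat[1:, 1:] = sigma + sigma.T - np.trace(sigma) * np.eye(3)
    eigval, eigvec = np.linalg.eigh(Qmat)                   # S34: eigendecomposition
    q0, q1, q2, q3 = eigvec[:, np.argmax(eigval)]           # unit quaternion q_R
    R = np.array([                                          # S35: rotation matrix from q_R
        [q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2-q0*q3),         2*(q1*q3+q0*q2)],
        [2*(q1*q2+q0*q3),         q0*q0+q2*q2-q1*q1-q3*q3, 2*(q2*q3-q0*q1)],
        [2*(q1*q3-q0*q2),         2*(q2*q3+q0*q1),         q0*q0+q3*q3-q1*q1-q2*q2]])
    T = mu_q - R @ mu_p                                     # optimal translation
    return R, T
```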
The present invention has the following advantages:
(1) The present invention uses a binocular camera as the 3D detection sensor, which is lighter than a laser-radar-based detection scheme, allows a small UAV to be chosen as the carrier, and reduces the cost of equipment purchase.
(2) Current cameras reach a sufficiently high accuracy to fully meet the detection requirements, and existing reconstruction techniques are mature enough that the technical requirements can be met completely.
(3) The airborne binocular camera system is flexible and convenient to operate and simple to use, and is suitable for 3D surface detection under various conditions.
Description of the drawings
Fig. 1 is the system structure diagram of the UAV-based binocular vision target surface 3D detection system according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of the flight path in an embodiment of the present invention.
Fig. 3 is the detection flow chart of the UAV-based binocular vision target surface 3D detection system according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of binocular camera depth extraction in an embodiment of the present invention.
In the figures: 1 - UAV; 2 - battery pack; 3 - onboard control unit; 4 - WiFi or radio transmission module; 5 - binocular camera; 6 - fixed connection structure; 7 - attitude detection system.
Detailed description of the embodiments
In order to make the objects and advantages of the present invention clearer, the present invention is described in further detail below with reference to embodiments. It should be understood that the specific embodiments described here are merely illustrative of the present invention and are not intended to limit it.
As shown in Fig. 1, an embodiment of the present invention provides a UAV-based binocular vision target surface 3D detection system. The system performs 3D reconstruction based on binocular stereo vision while estimating the UAV pose; the pose correction is incorporated into the 3D reconstruction by a software algorithm to obtain the depth information of the target and, in turn, the surface detection result. The system comprises a UAV, a battery pack, an onboard control unit, a WiFi or radio transmission module, a binocular camera with a memory card, and a fixed connection mechanism; the cameras are attached to the bottom of the UAV. When the system operates, the battery pack supplies power to the UAV's electronic equipment; via the remote control device and the WiFi or radio transmission module, the operator sends instructions to the onboard control unit to control the ascent, turning and flight path of the UAV, and to start and stop the cameras. In a specific run, the camera is kept at a suitable distance from the measured target and its field of view is tested. According to the lateral extent of the field of view, the measured target is divided into several parts along its base; starting from one of the parts, the rotorcraft ascends vertically, and according to the vertical extent of the field of view it is determined how far the camera should rise between shots. The flight trajectory is shown in Fig. 2: the aircraft flies from bottom to top, moves to the next part, then flies from top to bottom, and so on until the entire target surface is covered. Shooting then stops, the aircraft returns to the ground, the memory card is removed, and the pictures are processed and 3D reconstruction is performed to obtain the surface detection result.
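A minimal sketch of how such a column-by-column scan path could be generated; the facade dimensions, per-shot coverage and overlap fraction below are illustrative assumptions rather than values from the patent:

```python
def scan_waypoints(width, height, fov_w, fov_h, overlap=0.2):
    """Boustrophedon shot positions covering a width x height facade.

    fov_w / fov_h: lateral / vertical extent covered by one stereo shot at the
    chosen stand-off distance; overlap: fraction of overlap between shots.
    Returns a list of (x, z) positions, column by column, alternating the
    vertical direction as in Fig. 2.
    """
    step_x = fov_w * (1.0 - overlap)
    step_z = fov_h * (1.0 - overlap)
    xs = [min(i * step_x, width) for i in range(int(width // step_x) + 2)]
    zs = [min(j * step_z, height) for j in range(int(height // step_z) + 2)]
    waypoints = []
    for col, x in enumerate(xs):
        column = [(x, z) for z in zs]
        waypoints.extend(column if col % 2 == 0 else column[::-1])
    return waypoints

# Example: 12 m wide, 30 m tall facade, 4 m x 3 m covered per shot
print(scan_waypoints(width=12.0, height=30.0, fov_w=4.0, fov_h=3.0)[:6])
```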
As shown in Fig. 3, the UAV-based visual surface 3D detection method according to an embodiment of the present invention comprises the following steps:
Step 1: the UAV is controlled by the remote control device to acquire images along the desired trajectory. After acquisition, the pictures taken by the left and right cameras are imported into the computer and numbered, and the collected UAV pose information is associated with the picture acquired at each moment. The images are then corrected according to the affine transformation principle, mapping the left and right epipoles to infinity to achieve row alignment of the images.
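Row alignment of the left and right images is conventionally achieved by calibrated stereo rectification; a minimal OpenCV sketch is given below, assuming the intrinsic/extrinsic parameters come from the calibration of step 3 (the matrices, image size and file names are placeholders). The patent's additional affine correction from the UAV attitude is not shown here:

```python
import cv2
import numpy as np

# Calibration results from step 3 (illustrative placeholder values).
K1 = K2 = np.array([[1200.0, 0, 960], [0, 1200.0, 540], [0, 0, 1]])
D1 = D2 = np.zeros(5)                 # distortion coefficients
R = np.eye(3)                         # rotation between the two cameras
T = np.array([0.12, 0.0, 0.0])        # baseline vector (metres)
size = (1920, 1080)

# Rectify so that corresponding points lie on the same image row.
R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
map1x, map1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)

# Placeholder file names following the numbering of step 1.
left_rect = cv2.remap(cv2.imread("left_0001.png"), map1x, map1y, cv2.INTER_LINEAR)
right_rect = cv2.remap(cv2.imread("right_0001.png"), map2x, map2y, cv2.INTER_LINEAR)
```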
Step 2: SURF feature points are extracted from all images.
Step 3: the SURF feature points of each left/right image pair are matched; the specific matching process is as follows.
In the SURF algorithm, a 4*4 block of rectangular sub-regions is taken around the feature point, with the regions oriented along the dominant direction of the feature point. In each sub-region the horizontal and vertical Haar wavelet responses of 25 pixels are counted, where horizontal and vertical are relative to the dominant direction. The Haar wavelet features are the sum of the horizontal responses, the sum of the vertical responses, the sum of the absolute horizontal responses and the sum of the absolute vertical responses; these 4 values form the feature vector of each sub-region, so a 4*4*4 = 64-dimensional vector describes a SURF feature.
SURF determines the matching degree by computing the Euclidean distance between two feature vectors; the shorter the distance, the better the two feature points match. For a key point in the left image, the two key points in the right image with the smallest distances are found; if the ratio of the nearest distance to the second-nearest distance is below a threshold, the two points are accepted as a matched pair.
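A minimal OpenCV sketch of SURF extraction and ratio-test matching; SURF requires an opencv-contrib build, and the Hessian threshold, ratio threshold and file names are illustrative assumptions:

```python
import cv2

# SURF lives in opencv-contrib (cv2.xfeatures2d); availability depends on the build.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

left = cv2.imread("left_0001.png", cv2.IMREAD_GRAYSCALE)    # placeholder file names
right = cv2.imread("right_0001.png", cv2.IMREAD_GRAYSCALE)

kp_l, des_l = surf.detectAndCompute(left, None)              # 64-D descriptors
kp_r, des_r = surf.detectAndCompute(right, None)

# Euclidean-distance matching with the nearest / second-nearest ratio test.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = []
for pair in matcher.knnMatch(des_l, des_r, k=2):
    if len(pair) < 2:
        continue
    m, n = pair
    if m.distance < 0.75 * n.distance:                        # 0.75 is an illustrative threshold
        good.append((kp_l[m.queryIdx].pt, kp_r[m.trainIdx].pt))

# Disparity of each matched pair on row-aligned images
# (sign convention may differ from the patent's d = x_r - x_l depending on axis definitions).
disparities = [pl[0] - pr[0] for pl, pr in good]
```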
Step 4: depth calculation. As shown in Fig. 4, a simple triangle relation gives the depth as Z = fT/d, where the parallax d = x_r - x_l is defined for cameras placed in parallel; the depth Z is inversely proportional to the parallax, and fT is a fixed constant.
The 3D coordinates of a point can then be obtained as X = x_l·Z/f, Y = y_l·Z/f, Z = fT/d.
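A minimal sketch of the disparity-to-depth conversion described above, assuming row-aligned images and a focal length in pixel units; the numeric values in the example are illustrative:

```python
import numpy as np

def triangulate(x_l, y_l, x_r, f, T, cx=0.0, cy=0.0):
    """3D point from a matched pair on row-aligned images (Z = fT / d).

    x_l, y_l, x_r: pixel coordinates of the match in the left / right image;
    f: focal length in pixels; T: baseline; cx, cy: principal point.
    Coordinates are expressed in the left-camera frame.
    """
    d = abs(x_l - x_r)              # parallax magnitude (sign convention aside)
    Z = f * T / d
    X = (x_l - cx) * Z / f
    Y = (y_l - cy) * Z / f
    return np.array([X, Y, Z])

# Example: f = 1200 px, baseline T = 0.12 m, 30 px disparity -> Z = 4.8 m
print(triangulate(x_l=980.0, y_l=600.0, x_r=950.0, f=1200.0, T=0.12, cx=960, cy=540))
```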
Step 5: the 3D point clouds of adjacent images taken by the left camera are registered, unifying the two point clouds into one coordinate system and giving the 3D point cloud model of the entire target surface. The specific implementation steps are as follows:
(1) The K neighborhood of every point in the point cloud is found with the KD-tree method, establishing the local neighborhood of each point.
(2) The normal vector of each point is estimated by principal component analysis, and feature points are extracted using the normal vector variation as the criterion. According to the geometric relationship between a feature point and its neighboring points, several features are chosen to describe the feature point; the statistical properties of a histogram are used to accumulate the features of the feature point and of the points in its K neighborhood, building a histogram descriptor and thus a feature vector. The distance between the feature vectors of two points is used as the matching criterion, and the pair with the minimum feature vector distance is a matched pair.
(3) The transformation between the two point cloud coordinate systems is computed from the matched point pairs, completing the initial registration. The calculation process is as follows:
S31: compute the centroids of the source point set P and the target point set Q: μ_P = (1/N_P)·Σ P_i, μ_Q = (1/N_Q)·Σ Q_i;
where μ_P and μ_Q are the 3D centroid coordinates of P and Q respectively, N_P and N_Q are the numbers of points in P and Q, and P_i and Q_i are the i-th points of P and Q.
S32: construct the covariance matrix from the point sets P and Q: Σ_{P,Q} = (1/N_P)·Σ (P_i - μ_P)(Q_i - μ_Q)^T;
S33: construct a 4×4 symmetric matrix from the covariance matrix: Q(Σ_{P,Q}) = [ tr(Σ_{P,Q}), Δ^T ; Δ, Σ_{P,Q} + Σ_{P,Q}^T - tr(Σ_{P,Q})·I_3 ];
where I_3 is the 3×3 identity matrix, tr(Σ_{P,Q}) is the trace of Σ_{P,Q},
Δ = [A_23 A_31 A_12]^T and A_ij = (Σ_{P,Q} - Σ_{P,Q}^T)_ij;
S34: compute the eigenvalues and eigenvectors of Q(Σ_{P,Q}); the eigenvector corresponding to the maximum eigenvalue is the desired rotation quaternion q_R = [q_0 q_1 q_2 q_3]^T;
S35: the rotation matrix R is obtained from the rotation quaternion q_R, and the optimal translation vector is then q_T = μ_Q - R·μ_P.
(4) Starting from the initial position given by the initial registration, the adjacent point cloud coordinate systems are iterated with the iterative closest point algorithm until the accurate transformation between them is obtained, unifying the adjacent point cloud coordinate systems into one coordinate system.
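A compact point-to-point ICP sketch corresponding to step (4), assuming source and target are N×3 arrays already coarsely aligned by the initial registration; SciPy's cKDTree stands in for the KD-tree acceleration, and an SVD-based rigid fit is used in place of the quaternion solution (both give the same transform):

```python
import numpy as np
from scipy.spatial import cKDTree

def rigid_fit(P, Q):
    """Least-squares R, T with Q ≈ P @ R.T + T (equivalent to the quaternion
    solution of S31-S35)."""
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - mu_p).T @ (Q - mu_q))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_q - R @ mu_p

def icp(source, target, max_iter=50, tol=1e-6):
    """Refine the initial registration by iterating closest-point matching."""
    src = source.copy()
    tree = cKDTree(target)
    R_total, T_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(max_iter):
        dist, idx = tree.query(src)     # closest target point for every source point
        R, T = rigid_fit(src, target[idx])
        src = src @ R.T + T
        R_total, T_total = R @ R_total, R @ T_total + T
        err = dist.mean()
        if abs(prev_err - err) < tol:   # stop when the mean residual settles
            break
        prev_err = err
    return R_total, T_total, src
```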
The above is only a preferred embodiment of the present invention. It should be noted that, for those of ordinary skill in the art, several improvements and modifications can be made without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A UAV-based binocular vision target surface 3D detection system, characterized by comprising:
a UAV, which carries a binocular vision system at its bottom and acquires the image data of the target surface sequentially from top to bottom and from left to right;
a computer system, which processes the images collected by the binocular vision system and outputs the detection result;
a remote control device, which controls the flight path of the UAV and the on/off state of the binocular vision system through a radio transmission module;
wherein the UAV comprises:
a battery pack, which supplies power to the UAV's communication equipment, image acquisition equipment and power equipment;
an onboard control unit, which receives instructions from the ground control centre;
an attitude measurement system, which measures the pose of the UAV in real time and records the attitude information at each moment;
a WiFi/radio transmission module, which communicates wirelessly with the remote control device and receives the control instructions it sends; and
a binocular vision system, fixed to the UAV with the cameras facing the target surface, which acquires the target images.
2. The UAV-based binocular vision target surface 3D detection system according to claim 1, characterized in that the binocular vision system comprises:
a pose measuring device, which measures the pose of the binocular vision system;
left and right cameras, which acquire the image data of the target; and
a memory card, which stores the image data collected by the left and right cameras separately.
3. The UAV-based binocular vision target surface 3D detection system according to claim 1, characterized in that the left and right cameras of the binocular vision system are placed in parallel and have identical parameters.
4. A UAV-based binocular vision target surface 3D detection method, characterized by comprising the following steps:
Step 1: the UAV is controlled by a remote control device to acquire images along the desired trajectory; after acquisition, the images on the memory card are imported into a computer for processing;
Step 2: the images acquired by the airborne binocular vision system are numbered in chronological order to form binocular left/right image pairs;
Step 3: camera calibration yields the intrinsic and extrinsic camera parameters, including the focal length f and the baseline distance T;
Step 4: the collected images are denoised and their pose is adjusted;
Step 5: feature points are extracted from the left and right images taken at the same moment and matched to obtain disparity information, and the 3D coordinates of the image points are calculated, giving a local 3D point cloud;
Step 6: the 3D point clouds of adjacent images taken by the left camera are registered, unifying the two point clouds into one coordinate system and yielding the 3D point cloud model of the entire target surface;
Step 7: the obtained 3D point cloud model is compared with the surface factory data supplied with the large equipment to detect whether there is any difference.
5. The UAV-based binocular vision target surface 3D detection method according to claim 4, characterized in that in step 4 the image pose is corrected by an affine transformation with reference to the attitude information of the UAV.
6. The UAV-based binocular vision target surface 3D detection method according to claim 4, characterized in that
step 5 specifically comprises the following steps:
S51: SURF feature points are extracted. A 4*4 block of rectangular sub-regions is taken around each feature point, and the horizontal and vertical Haar wavelet responses of 25 pixels are counted in each sub-region. The Haar wavelet features are the sum of the horizontal responses, the sum of the vertical responses, the sum of the absolute horizontal responses and the sum of the absolute vertical responses; these 4 values form the feature vector of each sub-region. The matching degree is determined by computing the Euclidean distance between two feature points: the shorter the distance, the better the match. Disparity information is obtained from the matched feature points;
S52: from the disparity d = x_r - x_l and the intrinsic and extrinsic camera parameters, the 3D coordinates of a point are obtained as Z = fT/d, X = x_l·Z/f, Y = y_l·Z/f (with image coordinates taken relative to the principal point).
7. The UAV-based binocular vision target surface 3D detection method according to claim 4, characterized in that the 3D point cloud registration in step 6 can be divided into the following steps:
S1: a K neighborhood is established for each point of the adjacent 3D point cloud coordinate systems;
S2: feature points are extracted, a feature descriptor is built for each feature point, and the feature points of the adjacent 3D point clouds are matched;
S3: the transformation between the two point cloud coordinate systems is computed from the matched point pairs, completing the initial registration;
S4: starting from the initial position given by the initial registration, the adjacent point cloud coordinate systems are iterated with the iterative closest point algorithm until the accurate transformation between them is obtained, unifying the adjacent point cloud coordinate systems into one coordinate system.
8. The UAV-based binocular vision target surface 3D detection method according to claim 7, characterized in that in step S1 a KD-tree is used to accelerate the search for the K neighborhood of each point.
9. The UAV-based binocular vision target surface 3D detection method according to claim 7, characterized in that in step S2 the normal vector of each point is estimated by principal component analysis, feature points are extracted using the normal vector variation as the criterion, several features are chosen to describe each feature point according to the geometric relationship between the feature point and its neighboring points, the statistical properties of a histogram are used to accumulate the features of the feature point and of the points in its K neighborhood to build a histogram descriptor and thus a feature vector, the distance between the feature vectors of two points is used as the matching criterion, and the pair with the minimum feature vector distance is a matched point pair.
10. The UAV-based binocular vision target surface 3D detection method according to claim 7, characterized in that in step S3 the transformation matrix between adjacent point cloud coordinate systems is obtained with the quaternion method, implemented as follows:
S31: compute the centroids of the source point set P and the target point set Q: μ_P = (1/N_P)·Σ P_i, μ_Q = (1/N_Q)·Σ Q_i;
where μ_P and μ_Q are the 3D centroid coordinates of P and Q respectively, N_P and N_Q are the numbers of points in P and Q, and P_i and Q_i are the i-th points of P and Q;
S32: construct the covariance matrix from the point sets P and Q: Σ_{P,Q} = (1/N_P)·Σ (P_i - μ_P)(Q_i - μ_Q)^T;
S33: construct a 4×4 symmetric matrix from the covariance matrix: Q(Σ_{P,Q}) = [ tr(Σ_{P,Q}), Δ^T ; Δ, Σ_{P,Q} + Σ_{P,Q}^T - tr(Σ_{P,Q})·I_3 ];
where I_3 is the 3×3 identity matrix, tr(Σ_{P,Q}) is the trace of Σ_{P,Q},
Δ = [A_23 A_31 A_12]^T and A_ij = (Σ_{P,Q} - Σ_{P,Q}^T)_ij;
S34: compute the eigenvalues and eigenvectors of Q(Σ_{P,Q}); the eigenvector corresponding to the maximum eigenvalue is the desired rotation quaternion q_R = [q_0 q_1 q_2 q_3]^T;
S35: the rotation matrix R is obtained from the rotation quaternion q_R, and the optimal translation vector is then q_T = μ_Q - R·μ_P;
in step S4, unifying two adjacent point cloud coordinate systems X_{i-1} and X_i into the same coordinate system is realized by X_{i-1} = X_i·R + T;
where R is the rotation matrix and T is the translation vector.
CN201810081160.1A 2018-01-19 2018-01-19 Binocular vision target surface 3D detection method and system based on an unmanned aerial vehicle Pending CN108317953A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810081160.1A CN108317953A (en) 2018-01-19 2018-01-19 A kind of binocular vision target surface 3D detection methods and system based on unmanned plane

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810081160.1A CN108317953A (en) 2018-01-19 2018-01-19 A kind of binocular vision target surface 3D detection methods and system based on unmanned plane

Publications (1)

Publication Number Publication Date
CN108317953A true CN108317953A (en) 2018-07-24

Family

ID=62887290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810081160.1A Pending CN108317953A (en) 2018-01-19 2018-01-19 A kind of binocular vision target surface 3D detection methods and system based on unmanned plane

Country Status (1)

Country Link
CN (1) CN108317953A (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109145905A (en) * 2018-08-29 2019-01-04 河海大学常州校区 A kind of transmission line of electricity accessory detection method of view-based access control model conspicuousness
CN109242898A (en) * 2018-08-30 2019-01-18 华强方特(深圳)电影有限公司 A kind of three-dimensional modeling method and system based on image sequence
CN109341588A (en) * 2018-10-08 2019-02-15 西安交通大学 A kind of measuring three-dimensional profile method of three systems approach visual angle of binocular structure light weighting
CN109597087A (en) * 2018-11-15 2019-04-09 天津大学 A kind of 3D object detection method based on point cloud data
CN110058594A (en) * 2019-04-28 2019-07-26 东北大学 The localization for Mobile Robot navigation system and method for multisensor based on teaching
CN110120013A (en) * 2019-05-15 2019-08-13 深圳市凌云视迅科技有限责任公司 A kind of cloud method and device
CN110155369A (en) * 2019-05-29 2019-08-23 中国民航大学 A kind of aircraft skin face crack inspection method
CN110222382A (en) * 2019-05-22 2019-09-10 成都飞机工业(集团)有限责任公司 A kind of aircraft axes Optimal Fitting method
CN110717936A (en) * 2019-10-15 2020-01-21 哈尔滨工业大学 Image stitching method based on camera attitude estimation
CN110779933A (en) * 2019-11-12 2020-02-11 广东省智能机器人研究院 Surface point cloud data acquisition method and system based on 3D visual sensing array
CN110992291A (en) * 2019-12-09 2020-04-10 国网安徽省电力有限公司检修分公司 Distance measuring method, system and storage medium based on trinocular vision
CN111462213A (en) * 2020-03-16 2020-07-28 天目爱视(北京)科技有限公司 Equipment and method for acquiring 3D coordinates and dimensions of object in motion process
CN111784680A (en) * 2020-07-06 2020-10-16 天津大学 Detection method based on consistency of key points of left and right eye views of binocular camera
CN112268548A (en) * 2020-12-14 2021-01-26 成都飞机工业(集团)有限责任公司 Airplane local appearance measuring method based on binocular vision
WO2021046716A1 (en) * 2019-09-10 2021-03-18 深圳市大疆创新科技有限公司 Method, system and device for detecting target object and storage medium
CN113763562A (en) * 2021-08-31 2021-12-07 哈尔滨工业大学(威海) Binocular vision-based facade feature detection and facade feature processing method
CN114018158A (en) * 2021-11-02 2022-02-08 中国大唐集团科技工程有限公司 Non-contact three-dimensional thermal displacement detection system and application thereof
CN114396921A (en) * 2021-11-15 2022-04-26 中国计量大学 Qiantanjiang river tidal bore height and propagation speed measuring method based on unmanned aerial vehicle
CN114820777A (en) * 2021-03-24 2022-07-29 北京大成国测科技有限公司 Unmanned aerial vehicle three-dimensional data front-end processing method and device and unmanned aerial vehicle
CN115953605A (en) * 2023-03-14 2023-04-11 深圳中集智能科技有限公司 Machine vision multi-target image coordinate matching method
CN116045833A (en) * 2023-01-03 2023-05-02 中铁十九局集团有限公司 Bridge construction deformation monitoring system based on big data
CN116754039A (en) * 2023-08-16 2023-09-15 四川吉埃智能科技有限公司 Method for detecting earthwork of ground pit body
CN117491355A (en) * 2023-11-06 2024-02-02 广州航海学院 Visual detection method for abrasion loss of three-dimensional curved surface of rake teeth type large component

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105043350A (en) * 2015-06-25 2015-11-11 闽江学院 Binocular vision measuring method
CN105184863A (en) * 2015-07-23 2015-12-23 同济大学 Unmanned aerial vehicle aerial photography sequence image-based slope three-dimension reconstruction method
CN105928493A (en) * 2016-04-05 2016-09-07 王建立 Binocular vision three-dimensional mapping system and method based on UAV
CN106500669A (en) * 2016-09-22 2017-03-15 浙江工业大学 A kind of Aerial Images antidote based on four rotor IMU parameters
CN106796728A (en) * 2016-11-16 2017-05-31 深圳市大疆创新科技有限公司 Generate method, device, computer system and the mobile device of three-dimensional point cloud
US20170186164A1 (en) * 2015-12-29 2017-06-29 Government Of The United States As Represetned By The Secretary Of The Air Force Method for fast camera pose refinement for wide area motion imagery
CN107316325A (en) * 2017-06-07 2017-11-03 华南理工大学 A kind of airborne laser point cloud based on image registration and Image registration fusion method

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105043350A (en) * 2015-06-25 2015-11-11 闽江学院 Binocular vision measuring method
CN105184863A (en) * 2015-07-23 2015-12-23 同济大学 Unmanned aerial vehicle aerial photography sequence image-based slope three-dimension reconstruction method
US20170186164A1 (en) * 2015-12-29 2017-06-29 Government Of The United States As Represetned By The Secretary Of The Air Force Method for fast camera pose refinement for wide area motion imagery
CN105928493A (en) * 2016-04-05 2016-09-07 王建立 Binocular vision three-dimensional mapping system and method based on UAV
CN106500669A (en) * 2016-09-22 2017-03-15 浙江工业大学 A kind of Aerial Images antidote based on four rotor IMU parameters
CN106796728A (en) * 2016-11-16 2017-05-31 深圳市大疆创新科技有限公司 Generate method, device, computer system and the mobile device of three-dimensional point cloud
CN107316325A (en) * 2017-06-07 2017-11-03 华南理工大学 A kind of airborne laser point cloud based on image registration and Image registration fusion method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HERBERT BAY, ET AL.: "Speeded-Up Robust Features (SURF)", Computer Vision and Image Understanding *
XIANG GAO: "A Mosaic Method on Images Small of Unmanned Aerial Vehicle", 2nd Workshop on Advanced Research and Technology in Industry Applications *
张蒙: "Point Cloud Registration Technology Based on an Improved ICP Algorithm" (基于改进的ICP算法的点云配准技术), China Master's Theses Full-text Database, Engineering Science and Technology II *
陶海跻 et al.: "An Automatic Registration Method for Point Clouds Based on Normal Vectors" (一种基于法向量的点云自动配准方法), Chinese Journal of Lasers *

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109145905A (en) * 2018-08-29 2019-01-04 河海大学常州校区 A kind of transmission line of electricity accessory detection method of view-based access control model conspicuousness
CN109242898B (en) * 2018-08-30 2022-03-22 华强方特(深圳)电影有限公司 Three-dimensional modeling method and system based on image sequence
CN109242898A (en) * 2018-08-30 2019-01-18 华强方特(深圳)电影有限公司 A kind of three-dimensional modeling method and system based on image sequence
CN109341588A (en) * 2018-10-08 2019-02-15 西安交通大学 A kind of measuring three-dimensional profile method of three systems approach visual angle of binocular structure light weighting
CN109597087A (en) * 2018-11-15 2019-04-09 天津大学 A kind of 3D object detection method based on point cloud data
CN109597087B (en) * 2018-11-15 2022-07-01 天津大学 Point cloud data-based 3D target detection method
CN110058594A (en) * 2019-04-28 2019-07-26 东北大学 The localization for Mobile Robot navigation system and method for multisensor based on teaching
CN110120013A (en) * 2019-05-15 2019-08-13 深圳市凌云视迅科技有限责任公司 A kind of cloud method and device
CN110120013B (en) * 2019-05-15 2023-10-20 深圳市凌云视迅科技有限责任公司 Point cloud splicing method and device
CN110222382B (en) * 2019-05-22 2023-04-18 成都飞机工业(集团)有限责任公司 Aircraft coordinate system optimization fitting method
CN110222382A (en) * 2019-05-22 2019-09-10 成都飞机工业(集团)有限责任公司 A kind of aircraft axes Optimal Fitting method
CN110155369A (en) * 2019-05-29 2019-08-23 中国民航大学 A kind of aircraft skin face crack inspection method
CN110155369B (en) * 2019-05-29 2022-05-17 中国民航大学 Method for checking surface cracks of aircraft skin
WO2021046716A1 (en) * 2019-09-10 2021-03-18 深圳市大疆创新科技有限公司 Method, system and device for detecting target object and storage medium
CN110717936B (en) * 2019-10-15 2023-04-28 哈尔滨工业大学 Image stitching method based on camera attitude estimation
CN110717936A (en) * 2019-10-15 2020-01-21 哈尔滨工业大学 Image stitching method based on camera attitude estimation
CN110779933A (en) * 2019-11-12 2020-02-11 广东省智能机器人研究院 Surface point cloud data acquisition method and system based on 3D visual sensing array
CN110992291A (en) * 2019-12-09 2020-04-10 国网安徽省电力有限公司检修分公司 Distance measuring method, system and storage medium based on trinocular vision
CN110992291B (en) * 2019-12-09 2023-07-21 国网安徽省电力有限公司超高压分公司 Ranging method, system and storage medium based on three-eye vision
CN111462213A (en) * 2020-03-16 2020-07-28 天目爱视(北京)科技有限公司 Equipment and method for acquiring 3D coordinates and dimensions of object in motion process
CN111784680A (en) * 2020-07-06 2020-10-16 天津大学 Detection method based on consistency of key points of left and right eye views of binocular camera
CN111784680B (en) * 2020-07-06 2022-06-28 天津大学 Detection method based on consistency of key points of left and right eye views of binocular camera
CN112268548A (en) * 2020-12-14 2021-01-26 成都飞机工业(集团)有限责任公司 Airplane local appearance measuring method based on binocular vision
CN114820777A (en) * 2021-03-24 2022-07-29 北京大成国测科技有限公司 Unmanned aerial vehicle three-dimensional data front-end processing method and device and unmanned aerial vehicle
CN113763562A (en) * 2021-08-31 2021-12-07 哈尔滨工业大学(威海) Binocular vision-based facade feature detection and facade feature processing method
CN113763562B (en) * 2021-08-31 2023-08-29 哈尔滨工业大学(威海) Binocular vision-based vertical face feature detection and vertical face feature processing method
CN114018158A (en) * 2021-11-02 2022-02-08 中国大唐集团科技工程有限公司 Non-contact three-dimensional thermal displacement detection system and application thereof
CN114396921A (en) * 2021-11-15 2022-04-26 中国计量大学 Qiantanjiang river tidal bore height and propagation speed measuring method based on unmanned aerial vehicle
CN114396921B (en) * 2021-11-15 2023-12-08 中国计量大学 Method for measuring tidal height and propagation speed of Yangtze river on basis of unmanned aerial vehicle
CN116045833A (en) * 2023-01-03 2023-05-02 中铁十九局集团有限公司 Bridge construction deformation monitoring system based on big data
CN116045833B (en) * 2023-01-03 2023-12-22 中铁十九局集团有限公司 Bridge construction deformation monitoring system based on big data
CN115953605B (en) * 2023-03-14 2023-06-06 深圳中集智能科技有限公司 Machine vision multi-target image coordinate matching method
CN115953605A (en) * 2023-03-14 2023-04-11 深圳中集智能科技有限公司 Machine vision multi-target image coordinate matching method
CN116754039A (en) * 2023-08-16 2023-09-15 四川吉埃智能科技有限公司 Method for detecting earthwork of ground pit body
CN116754039B (en) * 2023-08-16 2023-10-20 四川吉埃智能科技有限公司 Method for detecting earthwork of ground pit body
CN117491355A (en) * 2023-11-06 2024-02-02 广州航海学院 Visual detection method for abrasion loss of three-dimensional curved surface of rake teeth type large component

Similar Documents

Publication Publication Date Title
CN108317953A (en) Binocular vision target surface 3D detection method and system based on an unmanned aerial vehicle
CN106356757B (en) A kind of power circuit unmanned plane method for inspecting based on human-eye visual characteristic
CN107657640A (en) Intelligent patrol inspection management method based on ORB SLAM
CN110446159A (en) A kind of system and method for interior unmanned plane accurate positioning and independent navigation
CN106485655A (en) A kind of taken photo by plane map generation system and method based on quadrotor
JP2017537484A (en) System and method for detecting and tracking movable objects
Gee et al. Lidar guided stereo simultaneous localization and mapping (SLAM) for UAV outdoor 3-D scene reconstruction
CN107560592B (en) Precise distance measurement method for photoelectric tracker linkage target
CN105719277B (en) A kind of substation's three-dimensional modeling method and system based on mapping with two dimensional image
CN106096207B (en) A kind of rotor wing unmanned aerial vehicle wind resistance appraisal procedure and system based on multi-vision visual
CN110319772A (en) Vision large span distance measuring method based on unmanned plane
Beekmans et al. Cloud photogrammetry with dense stereo for fisheye cameras
WO2019100219A1 (en) Output image generation method, device and unmanned aerial vehicle
CN111123962A (en) Rotor unmanned aerial vehicle repositioning photographing method for power tower inspection
CN107221006A (en) A kind of communication single pipe tower slant detection method based on unmanned plane imaging platform
CN115809986A (en) Multi-sensor fusion type intelligent external damage detection method for power transmission corridor
CN114812558B (en) Monocular vision unmanned aerial vehicle autonomous positioning method combining laser ranging
CN114022798A (en) Transformer substation inspection robot obstacle avoidance method based on digital twinning technology
CN116071424A (en) Fruit space coordinate positioning method based on monocular vision
CN112132900A (en) Visual repositioning method and system
CN109883400B (en) Automatic target detection and space positioning method for fixed station based on YOLO-SITCOL
Dutta et al. Real testbed for autonomous anomaly detection in power grid using low-cost unmanned aerial vehicles and aerial imaging
Teo Video-based point cloud generation using multiple action cameras
CN116539001A (en) Marine wind power tower verticality detection method and system based on unmanned aerial vehicle
CN111402324A (en) Target measuring method, electronic equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180724