CN113776505A - Method for realizing close-range photogrammetry and three-dimensional visualization - Google Patents

Method for realizing close-range photogrammetry and three-dimensional visualization

Info

Publication number
CN113776505A
CN113776505A (application CN202110751675.XA)
Authority
CN
China
Prior art keywords
photogrammetry
dimensional visualization
range
target object
close
Prior art date
Legal status
Granted
Application number
CN202110751675.XA
Other languages
Chinese (zh)
Other versions
CN113776505B (en)
Inventor
张文志
任筱芳
柳广春
邹友峰
薛永安
宋明伟
蔡来良
杨文府
杨森
杜梦豪
Current Assignee
Henan University of Technology
Original Assignee
Henan University of Technology
Priority date
Filing date
Publication date
Application filed by Henan University of Technology filed Critical Henan University of Technology
Priority to CN202110751675.XA priority Critical patent/CN113776505B/en
Publication of CN113776505A publication Critical patent/CN113776505A/en
Application granted granted Critical
Publication of CN113776505B publication Critical patent/CN113776505B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • G01C11/06Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/28Special adaptation for recording picture point data, e.g. for profiles

Abstract

The invention discloses a method for realizing close-range photogrammetry and three-dimensional visualization, comprising three steps: equipment verification, photogrammetry, and three-dimensional visualization processing. On one hand, the invention has good universality and is convenient to operate and implement; it can effectively meet the requirement of carrying out close-range photogrammetry with various types of ordinary digital cameras and can satisfy comprehensive close-range photogrammetry work in a variety of scenes. On the other hand, it effectively overcomes the poor measurement accuracy caused by unstable interior orientation elements and image-formation distortion coefficients, and realizes synchronous mapping and three-dimensional modeling, so that close-range photogrammetry accuracy is greatly improved and the convenience and intuitiveness of measurement data acquisition are effectively enhanced.

Description

Method for realizing close-range photogrammetry and three-dimensional visualization
Technical Field
The invention belongs to the field of visual mapping technology, and particularly relates to a method for realizing close-range photogrammetry and three-dimensional visualization.
Background
With the continuous development of photogrammetry, close-range photogrammetry has also developed fully and shows increasing advantages as an important supplement to aerial photogrammetry. Replacing the photo-theodolite with an ordinary digital camera, photographing in flexible configurations, and carrying out close-range photogrammetry on a digital photogrammetric workstation has become the inevitable direction of development of close-range photogrammetry.
Because the mechanical structure of an ordinary digital camera is unstable, its interior orientation elements and image-formation distortion coefficients are unstable as well, which makes camera calibration difficult and restricts the popularization and application of close-range photogrammetry.
Therefore, in view of the current situation, there is an urgent need to develop a close-range photogrammetry method that overcomes the defects of existing equipment and operations and effectively improves measurement efficiency and precision, so as to meet the needs of practical work.
Disclosure of Invention
The invention provides a method for realizing close-range photogrammetry and three-dimensional visualization, which aims to solve the problems in the background technology.
In order to achieve the technical purpose, the invention provides the following technical scheme:
a method for realizing close-range photogrammetry and three-dimensional visualization comprises the following steps:
s1, device verification, selecting the video capture device to participate in the photogrammetry operation, and then using the mathematical model based on space back intersection and collinearityThe equation is used as the data operation basis, the coordinate of the image point is used as an observed value, and the selected orientation element (x) in the video acquisition equipment is subjected to0,y0) Radial distortion parameter (k)1,k2,k3) Eccentric distortion parameter (p)1,p2) Distortion parameter in area array (b)1,b2) Calculating and measuring, adjusting and setting the photogrammetry operation video acquisition equipment according to the calculation result, and recovering the correct shape of the image light beam;
s2, photogrammetry, namely, according to the structural characteristics of the target object to be measured, mounting and positioning the video acquisition equipment selected in the step S1 and participating in photogrammetry operation according to any one of two basic modes, namely a straight photography mode and a rotary multi-baseline photography mode, wherein the distance between the video acquisition equipment for photogrammetry operation and the target object is 1/4-1/5 of the average photography depth;
s3, three-dimensional visualization processing, after the step S2 is completed, the measured internal orientation element (x) is calculated in the step S10,y0) Radial distortion parameter (k)1,k2,k3) Eccentric distortion parameter (p)1, p2) Distortion parameter in area array (b)1,b2) The parameters and the target object image data collected in the step S2 are transmitted to a three-dimensional visualization computer system together, target area control point data are generated according to the recorded target object image data collected in the step S2, then the mapping data are the collected target object image data to be matched, and point clouds are synchronously generated in the target object image through control point adjustment operation; and finally, editing the point cloud to obtain the three-dimensional mapping image data of the target object.
Further, in step S1:
the collinearity equation based on space resection is the standard collinearity condition:

x - x0 = -f[a1(X - Xs) + b1(Y - Ys) + c1(Z - Zs)] / [a3(X - Xs) + b3(Y - Ys) + c3(Z - Zs)]
y - y0 = -f[a2(X - Xs) + b2(Y - Ys) + c2(Z - Zs)] / [a3(X - Xs) + b3(Y - Ys) + c3(Z - Zs)]

where (x, y) are the image point coordinates, (X, Y, Z) the object point coordinates, (Xs, Ys, Zs) the projection center, f the principal distance, and ai, bi, ci (i = 1, 2, 3) the elements of the rotation matrix formed from the angular exterior orientation elements.
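As an illustrative sketch (not part of the original disclosure), the collinearity condition of space resection can be evaluated numerically. The function names and the omega-phi-kappa rotation convention below are assumptions chosen for illustration:

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Rotation matrix from the three angular orientation elements
    (omega-phi-kappa convention, an assumed but common choice)."""
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(omega), -np.sin(omega)],
                   [0, np.sin(omega),  np.cos(omega)]])
    Ry = np.array([[ np.cos(phi), 0, np.sin(phi)],
                   [0, 1, 0],
                   [-np.sin(phi), 0, np.cos(phi)]])
    Rz = np.array([[np.cos(kappa), -np.sin(kappa), 0],
                   [np.sin(kappa),  np.cos(kappa), 0],
                   [0, 0, 1]])
    return Rx @ Ry @ Rz

def collinearity_project(X, exterior, f, x0=0.0, y0=0.0):
    """Project an object point X = (X, Y, Z) into image coordinates via the
    collinearity condition, given the six exterior orientation elements
    (Xs, Ys, Zs, omega, phi, kappa) and the principal distance f."""
    Xs, Ys, Zs, omega, phi, kappa = exterior
    R = rotation_matrix(omega, phi, kappa)
    # u, v, w are the numerators/denominator terms of the collinearity equation
    u, v, w = R.T @ (np.asarray(X, dtype=float) - np.array([Xs, Ys, Zs]))
    x = x0 - f * u / w
    y = y0 - f * v / w
    return x, y
```

In a space resection, these projected coordinates would be compared with the measured image point coordinates (the observed values of step S1) and the orientation elements solved by least-squares adjustment.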
Further, in step S1, when the verification operation is performed:
first, a calibration field is established indoors and the coordinates of the indoor control points are measured and positioned; each control point is observed by the method of rounds, with one round observed and the distance measured only once;
then, the observation data are processed with Australis software to obtain the corresponding parameters of the video acquisition equipment selected to participate in the photogrammetry operation.
further, the control point pitch or arrangement structure type is: the space array distribution of the distance of 30cm multiplied by 20cm or 30cm multiplied by 30 cm.
Further, in step S2, when the photographic object is a vertical wall, the parallel photography mode is adopted; when the object has large differences in imaging depth, the rotating multi-baseline photography method is adopted.
Further, in step S3, the three-dimensional visualization computer application is LensPhoto V2.0.
Further, in step S3, the control point adjustment function is:
further, in the step S3, when editing the point cloud, specific editing rules are as follows:
1) processing the production point cloud according to automatic matching, and removing redundant points;
2) when the model range is defined, attention should not be paid to the overlarge range, multiple times of circle selection can be carried out, and a three-dimensional model is preferentially reconstructed;
3) when defining the range of different partitions, it should be noted that a certain degree of overlap is preserved between each.
On one hand, the invention has good universality and is convenient to operate and implement; it can effectively meet the requirement of carrying out close-range photogrammetry with various types of ordinary digital cameras and can satisfy comprehensive close-range photogrammetry work in a variety of scenes. On the other hand, it effectively overcomes the poor measurement accuracy caused by unstable interior orientation elements and image-formation distortion coefficients, and realizes synchronous mapping and three-dimensional modeling, so that close-range photogrammetry accuracy is greatly improved and the convenience and intuitiveness of measurement data acquisition are effectively enhanced.
Drawings
FIG. 1 is a schematic flow diagram of the process of the present invention;
FIG. 2 is a schematic diagram of a partial structure of a calibration field;
FIG. 3 is a data statistics table of camera calibration results;
FIG. 4 is a schematic diagram of a point cloud subsection of a target object;
FIG. 5 is a schematic structural diagram of the target object after three-dimensional visualization.
Detailed Description
In order to make the technical means, creative features, objectives and effects of the invention easy to understand, the invention is further described below in connection with specific embodiments.
As shown in FIGS. 1-5, a method for realizing close-range photogrammetry and three-dimensional visualization includes the following steps:
s1, checking the equipment, firstly selecting the video acquisition equipment participating in the photogrammetry operation, then taking a mathematical model and a collinear equation based on space back intersection as data operation bases, taking the coordinates of image points as observed values, and carrying out alignment on the selected internal orientation elements (x) of the video acquisition equipment0,y0) Radial distortion parameter (k)1,k2,k3) Eccentric distortion parameter (p)1,p2) Distortion parameter in area array (b)1,b2) Calculating and measuring, adjusting and setting the photogrammetry operation video acquisition equipment according to the calculation result, and recovering the correct shape of the image light beam;
s2, photogrammetry, namely, according to the structural characteristics of the target object to be measured, mounting and positioning the video acquisition equipment selected in the step S1 and participating in photogrammetry operation according to any one of two basic modes, namely a straight photography mode and a rotary multi-baseline photography mode, wherein the distance between the video acquisition equipment for photogrammetry operation and the target object is 1/4-1/5 of the average photography depth;
s3, three-dimensional visualization processing, after the step S2 is completed, the measured internal orientation element (x) is calculated in the step S10,y0) Radial distortion parameter (k)1,k2,k3) Eccentric distortion parameter (p)1, p2) Distortion parameter in area array (b)1,b2) The parameters and the target object image data collected in the step S2 are transmitted to a three-dimensional visualization computer system together, target area control point data are generated according to the recorded target object image data collected in the step S2, then the mapping data are the collected target object image data to be matched, and point clouds are synchronously generated in the target object image through control point adjustment operation; and finally, editing the point cloud to obtain the three-dimensional mapping image data of the target object.
It should be particularly noted that, in step S1:
the mathematical model of the collinearity equation based on space resection is the standard collinearity condition:

x - x0 = -f[a1(X - Xs) + b1(Y - Ys) + c1(Z - Zs)] / [a3(X - Xs) + b3(Y - Ys) + c3(Z - Zs)]
y - y0 = -f[a2(X - Xs) + b2(Y - Ys) + c2(Z - Zs)] / [a3(X - Xs) + b3(Y - Ys) + c3(Z - Zs)]

where (x, y) are the image point coordinates, (X, Y, Z) the object point coordinates, (Xs, Ys, Zs) the projection center, f the principal distance, and ai, bi, ci (i = 1, 2, 3) the elements of the rotation matrix formed from the angular exterior orientation elements.
Meanwhile, in step S1, when the verification operation is performed:
first, a calibration field is established indoors and the coordinates of the indoor control points are measured and positioned; each control point is observed by the method of rounds, with one round observed and the distance measured only once;
then, the observation data are processed with Australis software to obtain the corresponding parameters of the video acquisition equipment selected to participate in the photogrammetry operation.
Further preferably, the control point spacing or arrangement is: a spatial array at intervals of 30 cm × 20 cm or 30 cm × 30 cm.
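As a small illustration (not part of the original disclosure), a control point layout at the stated intervals can be generated as follows; the function name is an assumption, and only a planar grid is sketched, whereas the patent describes a spatial array:

```python
def calibration_grid(nx, ny, dx=0.30, dy=0.20):
    """Control-point coordinates (in meters) for a calibration field laid out
    on a 30 cm x 20 cm array, as described for step S1.
    nx, ny: number of points along each axis."""
    return [(i * dx, j * dy) for j in range(ny) for i in range(nx)]
```

Each tuple is one control point whose coordinates would then be measured with the total station during verification.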
In this embodiment, in step S2, when the object to be photographed is a vertical wall, the parallel photography mode is adopted; when the object has large differences in imaging depth, the rotating multi-baseline photography method is adopted.
In addition, in step S3, the three-dimensional visualization computer application is LensPhoto V2.0.
In step S3, the control point adjustment function incorporates radial (K1, K2) and decentering (P1, P2) distortion corrections, as set out in claim 7.
Specifically, in step S3, when editing the point cloud, the specific editing rules are as follows:
1) process the point cloud produced by automatic matching and remove redundant points;
2) when delimiting the model range, take care not to make the range too large; the selection can be made in several passes, reconstructing the three-dimensional model region by region;
3) when delimiting the ranges of different partitions, take care to preserve a certain degree of overlap between adjacent partitions.
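Editing rule 3) can be illustrated with a one-dimensional sketch of partition ranges that keep a fixed overlap on each shared boundary; the function name and the symmetric overlap scheme are illustrative assumptions, not taken from the patent:

```python
def partition_ranges(xmin, xmax, n_tiles, overlap):
    """Split the interval [xmin, xmax] into n_tiles partitions that overlap
    their neighbours by `overlap`, so adjacent point-cloud partitions share
    a band of common points (rule 3 above)."""
    width = (xmax - xmin) / n_tiles
    tiles = []
    for i in range(n_tiles):
        lo = xmin + i * width - (overlap if i > 0 else 0.0)
        hi = xmin + (i + 1) * width + (overlap if i < n_tiles - 1 else 0.0)
        tiles.append((lo, hi))
    return tiles
```

For example, splitting a 10 m extent into two partitions with a 0.5 m overlap yields ranges (0, 5.5) and (4.5, 10), giving a 1 m shared band in which the partial models can later be merged.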
In order to fully explain the technical content of the invention and to help those skilled in the relevant field understand and master it, the technical solution is described below using the EOS450D digital camera as a specific embodiment:
As shown in FIGS. 1-5, a method for realizing close-range photogrammetry and three-dimensional visualization includes the following steps:
and S1, equipment verification, namely, firstly, performing calibration on the EOS450D digital camera to recover the correct shape of the image light beam, namely, acquiring the internal orientation elements and the image formation distortion coefficient of the image through calibration. The digital camera calibration content comprises: principal point coordinates (x)0,y0) The measurement of (f), the measurement of the principal distance (f), the measurement of the optical distortion coefficient, and the measurement of the distortion coefficient in the CCD area array. The calibration is carried out by adopting a mathematical model based on space back intersection. Based on a collinear equation, the coordinate of an image point is used as an observed value, and internal and external orientation elements, distortion coefficients and other additional parameters of the camera are solved;
During actual verification, a calibration field was established indoors, and the coordinates of the indoor control points were measured with a TOPCON GTS-3100 total station, observing by the method of rounds, with one round observed and the distance measured only once. The total station is a 5″-class instrument whose measurement precision reaches the millimeter level, which fully meets the requirement.
The observation data are processed with Australis software to obtain the camera parameters of the EOS450D digital camera: interior orientation elements (x0, y0), radial distortion parameters (k1, k2, k3), decentering distortion parameters (p1, p2), and area-array in-plane distortion parameters (b1, b2).
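The patent does not reproduce the correction formulas in which these calibrated parameters are applied, so the following sketch uses the common Brown-style self-calibration form with radial (k1-k3), decentering (p1, p2) and in-plane/affinity (b1, b2) terms; the exact functional form, signs and parameter units are assumptions:

```python
def correct_image_point(x, y, x0, y0, k1, k2, k3, p1, p2, b1, b2):
    """Remove lens distortion from one measured image point (x, y).

    A common self-calibration formulation (an assumption, since the patent's
    own formulas are given only as images): radial terms k1..k3, decentering
    terms p1, p2, and in-plane (area-array) terms b1, b2.
    """
    xb, yb = x - x0, y - y0              # reduce to the principal point
    r2 = xb * xb + yb * yb               # squared radial distance
    radial = k1 * r2 + k2 * r2**2 + k3 * r2**3
    dx = xb * radial + p1 * (r2 + 2 * xb * xb) + 2 * p2 * xb * yb + b1 * xb + b2 * yb
    dy = yb * radial + p2 * (r2 + 2 * yb * yb) + 2 * p1 * xb * yb
    return x - dx, y - dy                # distortion-corrected coordinates
```

Applying this correction to every measured image point is one way of "recovering the correct shape of the image bundle" before the adjustment.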
S2, photogrammetry: according to the structural characteristics of the target object to be measured, mount and position the video acquisition equipment selected in step S1 in either of two basic modes, parallel photography or rotating multi-baseline photography, with the distance between the equipment and the target object set to 1/4-1/5 of the average photographic depth. When the photographic object is a vertical wall surface, the parallel photography mode is adopted; when the differences in imaging depth are large, the rotating multi-baseline photography method is adopted;
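The S2 rules (mode selection and the 1/4-1/5 standoff band) can be sketched as follows; the helper names and the default choice for plane-like objects are assumptions for illustration:

```python
def choose_photography_mode(vertical_wall: bool, large_depth_variation: bool) -> str:
    """Select the basic photography mode per step S2."""
    if vertical_wall:
        return "parallel"                  # vertical wall surface: parallel photography
    if large_depth_variation:
        return "rotating multi-baseline"   # large imaging-depth differences
    return "parallel"                      # assumed default for plane-like objects

def standoff_range(average_photo_depth: float):
    """Camera-to-object distance band stated in step S2:
    1/5 to 1/4 of the average photographic depth."""
    return average_photo_depth / 5.0, average_photo_depth / 4.0
```

For an average photographic depth of 20 m, for example, the camera would be positioned 4-5 m from the target.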
S3, three-dimensional visualization processing: after step S2 is completed, transmit the interior orientation elements (x0, y0), radial distortion parameters (k1, k2, k3), decentering distortion parameters (p1, p2) and area-array in-plane distortion parameters (b1, b2) measured in step S1, together with the target object image data collected in step S2, to the three-dimensional visualization computer system; generate target area control point data from the recorded image data collected in step S2; match the collected target object image data as mapping data and, through the control point adjustment operation, synchronously generate a point cloud in the target object imagery; finally, edit the point cloud to obtain the three-dimensional mapping image data of the target object.
The adjustment precision of the current data is obtained through the control point adjustment operation, specifically:
0.000880 (general reference value: data below 1/2 pixel are valid);
Errors in the three directions X, Y, Z: [rmsx]: 0.0009; [rmsy]: 0.0009; [rmsz]: 0.0004;
Error in plane: 0.0013;
Depth distance: 3.4280;
Relative accuracy of depth: 1/2607;
Relative accuracy of plane: 1/9581;
Relative accuracy of point location: 1/2516.
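The relative accuracies above have the form 1/N, i.e. an error expressed as a fraction of a reference distance. A sketch of that conversion follows; the patent does not state which reference distance underlies each of its ratios, so the numbers in the usage note are illustrative only:

```python
def relative_accuracy(rms_error: float, reference_distance: float) -> str:
    """Express an RMS error as a relative accuracy 1/N of a reference distance,
    the form used in the precision list above (N rounded to the nearest integer)."""
    n = round(reference_distance / rms_error)
    return f"1/{n}"
```

For instance, an error of 0.001 over a 3 m reference distance corresponds to a relative accuracy of 1/3000.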
On one hand, the method has good universality and is convenient to operate and implement; it can effectively meet the requirement of carrying out close-range photogrammetry with various types of ordinary digital cameras and can satisfy comprehensive close-range photogrammetry work in a variety of scenes. On the other hand, it effectively overcomes the poor measurement accuracy caused by unstable interior orientation elements and image-formation distortion coefficients, and realizes synchronous mapping and three-dimensional modeling, so that close-range photogrammetry accuracy is greatly improved and the convenience and intuitiveness of measurement data acquisition are effectively enhanced.
The foregoing is a more detailed description of the present invention and is not to be construed as limiting the invention. To those skilled in the art to which the invention relates, numerous changes, substitutions and alterations can be made without departing from the spirit of the invention, and these changes are deemed to be within the scope of the invention as defined by the appended claims.

Claims (8)

1. A method for realizing close-range photogrammetry and three-dimensional visualization is characterized by comprising the following steps:
s1, checking the equipment, firstly selecting the video acquisition equipment participating in the photogrammetry operation, then taking the collinear equation based on the space back intersection as the data operation basis, taking the coordinates of the image point as the observed value, and carrying out the orientation element (x) in the selected video acquisition equipment0,y0) Radial distortion parameter (k)1,k2,k3) Eccentric distortion parameter (p)1,p2) Distortion parameter in area array (b)1,b2) Calculating and measuring, adjusting and setting the photogrammetry operation video acquisition equipment according to the calculation result, and recovering the correct shape of the image light beam;
s2, photogrammetry, namely, according to the structural characteristics of the target object to be measured, mounting and positioning the video acquisition equipment selected in the step S1 and participating in photogrammetry operation according to any one of two basic modes, namely a straight photography mode and a rotary multi-baseline photography mode, wherein the distance between the video acquisition equipment for photogrammetry operation and the target object is 1/4-1/5 of the average photography depth;
s3, performing three-dimensional visualization processingAfter the step of S2, the measured inner orientation element (x) is calculated in the step of S10,y0) Radial distortion parameter (k)1,k2,k3) Eccentric distortion parameter (p)1,p2) Distortion parameter in area array (b)1,b2) The parameters and the target object image data collected in the step S2 are transmitted to a three-dimensional visual computer application program together, target area control point data are generated according to the recorded target object image data collected in the step S2, then the mapping data are the collected target object image data to be matched, and point clouds are synchronously generated in the target object image through control point adjustment operation; and finally, editing the point cloud to obtain the three-dimensional mapping image data of the target object.
2. The method for realizing close-range photogrammetry and three-dimensional visualization as claimed in claim 1, wherein in the step of S1:
the collinearity equation for the space resection is the standard collinearity condition:

x - x0 = -f[a1(X - Xs) + b1(Y - Ys) + c1(Z - Zs)] / [a3(X - Xs) + b3(Y - Ys) + c3(Z - Zs)]
y - y0 = -f[a2(X - Xs) + b2(Y - Ys) + c2(Z - Zs)] / [a3(X - Xs) + b3(Y - Ys) + c3(Z - Zs)]

where (x, y) are the image point coordinates, (X, Y, Z) the object point coordinates, (Xs, Ys, Zs) the projection center, f the principal distance, and ai, bi, ci (i = 1, 2, 3) the elements of the rotation matrix formed from the angular exterior orientation elements.
3. The method for realizing close-range photogrammetry and three-dimensional visualization as claimed in claim 1, wherein the verification operation of step S1 is implemented as follows:
first, a calibration field is established indoors and the coordinates of the indoor control points are measured and positioned; each control point is observed by the method of rounds, with one round observed and the distance measured only once;
then, the observation data are processed with Australis software to obtain the corresponding parameters of the video acquisition equipment selected to participate in the photogrammetry operation.
4. The method for realizing close-range photogrammetry and three-dimensional visualization as claimed in claim 1, wherein the control point spacing or arrangement is: a spatial array at intervals of 30 cm × 20 cm or 30 cm × 30 cm.
5. The method for realizing close-range photogrammetry and three-dimensional visualization as claimed in claim 1, wherein in step S2, when the photographic object is a vertical wall, the parallel photography mode is adopted; when the object has large differences in imaging depth, the rotating multi-baseline photography method is adopted.
6. The method for realizing close-range photogrammetry and three-dimensional visualization as claimed in claim 1, wherein in step S3, the three-dimensional visualization computer application is LensPhoto V2.0.
7. The method for realizing close-range photogrammetry and three-dimensional visualization as claimed in claim 1, wherein in the step S3, the control point adjustment function is:
(The six formulas of the control point adjustment function are given as images in the original publication and are not reproduced in this text version.)
In the formulas: K1, K2 are the radial distortion corrections; P1, P2 are the decentering distortion corrections.
8. The method for realizing close-range photogrammetry and three-dimensional visualization as claimed in claim 1, wherein in step S3, when editing the point cloud, the specific editing rules are as follows:
1) process the point cloud produced by automatic matching and remove redundant points;
2) when delimiting the model range, take care not to make the range too large; the selection can be made in several passes, reconstructing the three-dimensional model region by region;
3) when delimiting the ranges of different partitions, take care to preserve a certain degree of overlap between adjacent partitions.
CN202110751675.XA 2021-07-02 2021-07-02 Method for realizing close-range photogrammetry and three-dimensional visualization Active CN113776505B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110751675.XA CN113776505B (en) 2021-07-02 2021-07-02 Method for realizing close-range photogrammetry and three-dimensional visualization

Publications (2)

Publication Number Publication Date
CN113776505A true CN113776505A (en) 2021-12-10
CN113776505B CN113776505B (en) 2023-07-04

Family

ID=78835970

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110751675.XA Active CN113776505B (en) 2021-07-02 2021-07-02 Method for realizing close-range photogrammetry and three-dimensional visualization

Country Status (1)

Country Link
CN (1) CN113776505B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101226057A (en) * 2008-02-01 2008-07-23 武汉朗视软件有限公司 Digital close range photogrammetry method
CN106352855A (en) * 2016-09-26 2017-01-25 北京建筑大学 Photographing measurement method and device
US9857172B1 (en) * 2017-09-25 2018-01-02 Beijing Information Science And Technology University Method for implementing high-precision orientation and evaluating orientation precision of large-scale dynamic photogrammetry system
CN108458665A (en) * 2018-02-11 2018-08-28 中铁八局集团第二工程有限公司 The method for carrying out the quick distortion measurement in tunnel using up short
KR20200132065A (en) * 2019-05-15 2020-11-25 주식회사 씨에스아이비젼 System for Measuring Position of Subject
CN112270698A (en) * 2019-12-31 2021-01-26 山东理工大学 Non-rigid geometric registration method based on nearest curved surface


Also Published As

Publication number Publication date
CN113776505B (en) 2023-07-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant