CN106910238A - Color texture method for reconstructing based on high inclination-angle close-range image - Google Patents
- Publication number
- CN106910238A CN106910238A CN201710040294.4A CN201710040294A CN106910238A CN 106910238 A CN106910238 A CN 106910238A CN 201710040294 A CN201710040294 A CN 201710040294A CN 106910238 A CN106910238 A CN 106910238A
- Authority
- CN
- China
- Prior art keywords
- image
- point
- distance value
- point cloud
- coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T15/00—3D [Three Dimensional] image rendering
        - G06T15/04—Texture mapping
      - G06T2207/00—Indexing scheme for image analysis or image enhancement
        - G06T2207/10—Image acquisition modality
          - G06T2207/10024—Color image
          - G06T2207/10028—Range image; Depth image; 3D point clouds
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Generation (AREA)
Abstract
The present invention provides a color texture reconstruction method based on large-tilt-angle close-range images, comprising: step one, obtaining point cloud data of the object to be reconstructed, selecting four control points and obtaining their three-dimensional point cloud coordinates; step two, obtaining the image space coordinates of the image points in the image space coordinate system of the image photo; step three, using the average of the control point distance values D_i divided by the image point distance values d_i as the scale factor m, and calculating the exterior orientation line elements X_S, Y_S and Z_S of the photography center; step four, obtaining the rotation matrix R and the translation vector that represent the correspondence between the three-dimensional point cloud coordinate system and the image space coordinate system; step five, determining the color information of each point cloud datum, completing the color texture reconstruction of the object. By combining the pyramid method with redundant control points and traditional space resection, the method applies to texture mapping of close-range images at any tilt angle, with higher precision and fewer iterations than the traditional space resection solution.
Description
Technical Field
The invention belongs to the field of texture reconstruction, and particularly relates to a color texture reconstruction method based on a large-dip-angle close-range image.
Background
Texture mapping is a technology for extracting the correct color information from a two-dimensional image and assigning it to a point cloud or triangulated network model. It is widely used in digital city modeling, historic building and heritage protection, face recognition and other industries, and how to accurately map two-dimensional texture information onto the surface of a three-dimensional model has always been a challenging problem.
The close-range image obtained by a CCD digital camera carries only two-dimensional pixel coordinate information, while the triangulated network (point cloud) obtained by terrestrial laser scanning (TLS) carries three-dimensional coordinate information. Texture mapping requires the pixel coordinates on the image to be computed back from the three-dimensional coordinates, and the corresponding pixel values to be extracted and assigned to the point cloud or triangulated network. How to determine the conversion relationship between the two-dimensional and three-dimensional coordinates is therefore the critical issue in texture mapping (Samuel R., 2001).
The conversion between the coordinates of a photo in the image space auxiliary coordinate system and the coordinates of the point cloud in the object space coordinate system (world coordinate system) can be represented by a rotation matrix and three translations, and the rotation matrix is determined by three rotation angles. These three translations and three rotation angles are called the exterior orientation elements. The key to texture mapping is how to determine the exterior orientation elements accurately. In historic building heritage protection, close-range images differ in nature from aerial images: the shooting tilt angle is usually large and the object surface information is complex, so the exterior orientation elements cannot be determined by the traditional space resection method (Yanglijun, et al, 2012).
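For reference, this relation is the standard photogrammetric one (a textbook identity, not a formula specific to this patent): a point with object space coordinates (X, Y, Z) and image space coordinates (x, y, -f) satisfies

$$\begin{bmatrix} X - X_S \\ Y - Y_S \\ Z - Z_S \end{bmatrix} = \lambda\, R\, \begin{bmatrix} x \\ y \\ -f \end{bmatrix},$$

where (X_S, Y_S, Z_S) are the three translations (the object space coordinates of the projection center), R = R(φ)R(ω)R(κ) is the rotation matrix built from the three rotation angles, and λ is a point-dependent scale.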
Wang Cuo et al. and Otero J et al. (Wangcuo, et al, 1998; Otero J, et al, 2015) propose a Direct Linear Transformation (DLT) solution, which can overcome the large-tilt-angle problem of close-range photogrammetry, but the accuracy of the computed interior and exterior orientation elements is low, the result is highly sensitive to the control points, and too many control points (more than 6) must be selected.
Hartley and Pateraki MN et al. describe a direct camera transformation method that solves 12 nonlinear transformation parameters by singular value decomposition to determine the transformation between the two sets of coordinates (Hartley, et al, 2003; Pateraki N, et al, 2001). The method can overcome the large tilt angle of a close-range image, but more than 6 control points must be selected to obtain accurate parameters.
Al-Manasir K, Fraser C S and Cabrelles M propose erecting several camera stations around a building and obtaining accurate initial values of the exterior orientation elements from images with a certain degree of overlap (Al-Manasir K, 2006; Cabrelles M, et al, 2009).
Habib et al. determine the exterior orientation elements of a single aerial image using a modified Hough transform and the collinearity equations (Habib, et al, 2001), but do not show that the method applies equally to large-tilt-angle images.
Yao Jili introduces the Rodrigues matrix into three-dimensional coordinate conversion and derives a seven-parameter model formula for arbitrary tilt angles (Yaojili, 2006), but the method treats the scale factor as a fixed value, ignoring the fact that in large-tilt-angle close-range photogrammetry the scale factor differs from point to point.
Chen et al. first obtain initial values of the exterior orientation elements of an aerial image using a DLT (Direct Linear Transformation) solution, and then determine the exact values by single-image and multi-image space resection (Chen et al, 2010), but the method is not shown to be applicable to large-tilt-angle close-range images.
Liu Y et al. use a DLT solution to build a human-body-model point cloud from multiple images, demonstrating that the DLT solution is suitable for close-range images (Liu Y, et al, 2010), but the number of control points used (33) is too large; when it is reduced, the accuracy of the exterior orientation elements drops, the stability weakens, and the result is strongly affected by the distribution of the control points.
Werner et al. propose first detecting line features in the two-dimensional image and the three-dimensional model and then completing the texture mapping of a single close-range image by matching these line features (Werner, et al, 2002), but this approach no longer applies when numerous line features cannot be extracted from the close-range image (e.g., murals, irregular roofs).
Guan Yunlan et al. propose a pyramid-based method for determining initial values of the exterior orientation elements with redundant control points, but the article does not show that the method applies to texture mapping of large-tilt-angle close-range images (Guan Yunlan et al., 2007). Moreover, the rotation matrix R obtained by that method is not orthogonal, so accurate exterior orientation angle elements cannot be obtained.
Therefore, how to complete color texture reconstruction quickly from a large-tilt-angle close-range image is a technical problem urgently awaiting a solution.
Disclosure of Invention
An object of the present invention is to solve at least the above problems or disadvantages and to provide at least the advantages described hereinafter.
It is still another object of the present invention to provide a color texture reconstruction method based on large-tilt-angle close-range images that can obtain precise values of the exterior orientation elements and the scale factor m of each control point when the shooting focal length f and the pixel scale factor (mm/pixel) are known. The method is suitable for texture mapping of close-range images at any tilt angle; the iteration does not fail to converge; initial values of the interior and exterior orientation elements need not be supplied manually; only a few (4) control points are required; and both convergence and precision are higher than those of the traditional space resection method. The texture visualization effect is good.
To achieve these objects and other advantages in accordance with the purpose of the invention, there is provided a color texture reconstruction method based on a large-tilt-angle close-range image, comprising:
Step one, obtaining point cloud data of the target object to be reconstructed, selecting four control points from the point cloud data, and obtaining the three-dimensional point cloud coordinates (X_i, Y_i, Z_i) of the control points; calculating the distance value D_i between every two control points;
Step two, shooting an image photo of the target object from a photography center S, and determining on the image photo the four image points corresponding to the four control points; obtaining the image space coordinates (x, y, -f) of the image points in the image space coordinate system of the image photo; calculating the distance value d_i between every two image points;
Step three, using the average of the distance values D_i of the control points divided by the distance values d_i of the image points as the scale factor m, and calculating the exterior orientation line elements X_S, Y_S and Z_S of the photography center according to the following formula;
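The formula image is not reproduced in this text. From the surrounding definitions (four control points yield C(4,2) = 6 point pairs, matching n = 6 below), the scale factor is presumably the mean of the pairwise distance ratios:

$$m = \frac{1}{n}\sum_{i=1}^{n}\frac{D_i}{d_i},\qquad n = 6.$$

The formula for X_S, Y_S and Z_S is likewise missing here; its derivation from the distances S_i appears in the detailed description.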
Step four, using the exterior orientation line elements X_S, Y_S and Z_S of the photography center, obtaining the rotation matrix R and the translation vector that represent the correspondence between the three-dimensional point cloud coordinate system and the image space coordinate system;
Step five, establishing the correspondence between all point cloud data and the image points on the image photo according to the rotation matrix R and the translation vector, and determining the color information of each point cloud datum; assigning the color information to the point cloud data of the target object according to the one-to-one correspondence, completing the color texture reconstruction of the target object.
Preferably, in the color texture reconstruction method based on a large-tilt-angle close-range image, the scale factor m is calculated by the following formula:
where D_i is the distance value between every two control points, d_i is the distance value between every two image points, and n = 6.
Preferably, in the color texture reconstruction method based on a large-tilt-angle close-range image, the specific process of obtaining the rotation matrix in step four is as follows:
4.1 Using the three-dimensional point cloud coordinates (X_i, Y_i, Z_i) of the four control points and the exterior orientation line element values X_S, Y_S and Z_S of the photography center, calculate the distance value S_i between the photography center and each control point, where the distance value S_i is calculated by the following formula:
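The formula image is missing; given the definitions above it can only be the Euclidean distance between the photography center and the control point:

$$S_i = \sqrt{(X_i - X_S)^2 + (Y_i - Y_S)^2 + (Z_i - Z_S)^2}.$$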
4.2 According to the distance value S_i between the photography center and each control point, obtain the angular orientation (φ, ω, κ) of the image photo in the object space coordinate system by a direct solution method;
4.3 Using the angles (φ, ω, κ), obtain initial values of the exterior orientation angle elements of the rotation matrix;
4.4 Substitute the distance values S_i between the photography center and each control point and the initial values into the following formula to obtain the rotation matrix R;
Preferably, in the color texture reconstruction method based on a large-tilt-angle close-range image, the translation vector is obtained in step four according to the following formula;
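The formula image is missing. Since the exterior orientation line elements are by definition the object space coordinates of the photography center, a consistent reconstruction (an assumption, not the patent's own notation) is simply

$$T = \begin{bmatrix} X_S & Y_S & Z_S \end{bmatrix}^{\mathsf T},$$

so that an object space point P and its image space ray p satisfy P = T + λRp.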
Preferably, in the color texture reconstruction method based on a large-tilt-angle close-range image, step two includes converting the coordinates of each photo control point, expressed with the upper left corner as the origin, into coordinates with the center of the photo as the origin, where the conversion formula is:
the invention has the following beneficial effects: the exact value of the exterior orientation element and the scale m for each control point can be determined with the known focal length f and pixel scale factor. The method is suitable for texture mapping of any dip angle close-range image, the situation of iteration unconvergence does not occur in the calculation process, the initial values of internal and external orientation elements do not need to be given artificially, fewer (4) control points need to be selected, and the convergence and the precision are higher than those of the traditional back intersection method. The method has good effect of texture visualization.
Detailed Description
The present invention is described in further detail below to enable those skilled in the art to practice the invention with reference to the description.
The invention discloses a color texture reconstruction method based on large-tilt-angle close-range images, which comprises at least the following steps:
Step one, obtaining point cloud data of the target object to be reconstructed, selecting four control points from the point cloud data, and obtaining the three-dimensional point cloud coordinates (X_i, Y_i, Z_i) of the control points; calculating the distance value D_i between every two control points;
Step two, shooting an image photo of the target object from a photography center S, and determining on the image photo the four image points corresponding to the four control points; obtaining the image space coordinates (x, y, -f) of the image points in the image space coordinate system of the image photo; calculating the distance value d_i between every two image points;
S is the photography center and A, B, C, D are object space points; in most close-range photogrammetry cases these points are not distributed on the same plane. a, b, c and d are the corresponding image points on the image plane. Taking the upper left corner of the photo as (0, 0) and the lower right corner as (width, height), each pixel coordinate can be expressed as (oldX, oldY).
When the focal length f at the moment of shooting of the CCD digital camera and the pixel scale factors Xpixel (mm/pixel) and Ypixel (mm/pixel) are known, the coordinates of each photo control point, expressed with the upper left corner as the origin, are converted into coordinates with the center of the photo as the origin, giving the coordinates of each image point in the image space coordinate system (interior orientation):
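The conversion formula image is missing; a reconstruction consistent with the definitions of (oldX, oldY), width, height and the pixel scale factors (the sign flip on y is an assumption based on the usual photogrammetric convention that y increases upward) is:

$$x = \left(\mathrm{oldX} - \frac{\mathrm{width}}{2}\right) X_{\mathrm{pixel}},\qquad y = \left(\frac{\mathrm{height}}{2} - \mathrm{oldY}\right) Y_{\mathrm{pixel}},\qquad z = -f.$$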
any two object space camera centers can form a triangle, and taking Δ ABS as an example, the following equations can be listed according to the cosine law:
this was linearized to give:
corresponding error equation is
Wherein:
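The formula images are missing; from the law of cosines in ΔABS, a hedged reconstruction is:

$$D_{AB}^2 = S_A^2 + S_B^2 - 2\,S_A S_B \cos\angle ASB.$$

Setting $F(S_A, S_B) = S_A^2 + S_B^2 - 2 S_A S_B \cos\angle ASB - D_{AB}^2$ and linearizing at approximate values $S_A^0, S_B^0$ gives the error equation

$$v = \left(2S_A^0 - 2S_B^0\cos\angle ASB\right)\Delta S_A + \left(2S_B^0 - 2S_A^0\cos\angle ASB\right)\Delta S_B - l,\qquad l = -F\!\left(S_A^0, S_B^0\right).$$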
the initial distance from the camera center to the object control point needs to be knownAnd an angle cos ∠ ASB formed by the two object control points and the camera center, whereinAndcan be obtained by the following method
Wherein,is the distance from the center of the image to the image point.
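The formula images are missing; in the pyramid method these initial values are conventionally computed in image space (a reconstruction consistent with the definitions here, not a quotation of the patent):

$$S_A^0 = m\, s_a,\qquad s_a = \sqrt{x_a^2 + y_a^2 + f^2},$$

$$\cos\angle ASB = \cos\angle asb = \frac{x_a x_b + y_a y_b + f^2}{s_a\, s_b},$$

where s_a is the distance from the projection center S to the image point a, and the second identity uses the fact that the rays Sa, SA (and Sb, SB) coincide.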
Step three, using the average of the distance values D_i of the control points divided by the distance values d_i of the image points as the scale factor m, and calculating the exterior orientation line elements X_S, Y_S and Z_S of the photography center according to the following formula;
wherein D isiIs the distance between two object control points, diN is the distance between the two corresponding image point coordinates, and n is 6, which is the number of distances used.
Step four, using the exterior orientation line elements X_S, Y_S and Z_S of the photography center, obtaining the rotation matrix R and the translation vector that represent the correspondence between the three-dimensional point cloud coordinate system and the image space coordinate system;
4.1 Using the three-dimensional point cloud coordinates (X_i, Y_i, Z_i) of the four control points and the exterior orientation line element values X_S, Y_S and Z_S of the photography center, calculate the distance value S_i between the photography center and each control point, where the distance value S_i is calculated by the following formula:
assuming that the coordinates of the control points have no error, the equations (5-14) are linearly expanded into
Wherein
The error equation can be listed according to equation as
When there are n ground control points, several error equations in the form of formulas are listed and written in the form of matrix
V(n×1) = B(n×3) X(3×1) − L(n×1)
where the least squares solution is X = (BᵀB)⁻¹BᵀL.
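The expanded form and the contents of B and L are missing from this text; a reconstruction consistent with the three unknowns being the corrections to the photography center coordinates is: the condition equation per control point is

$$F_i = \sqrt{(X_i - X_S)^2 + (Y_i - Y_S)^2 + (Z_i - Z_S)^2} - S_i = 0,$$

and linearizing at the approximate values $(X_S^0, Y_S^0, Z_S^0)$ gives the error equation

$$v_i = -\frac{X_i - X_S^0}{S_i^0}\,\Delta X_S - \frac{Y_i - Y_S^0}{S_i^0}\,\Delta Y_S - \frac{Z_i - Z_S^0}{S_i^0}\,\Delta Z_S - l_i,\qquad l_i = S_i - S_i^0,$$

so each row of B holds the three partial derivatives, L stacks the l_i, and X = (ΔX_S, ΔY_S, ΔZ_S)ᵀ.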
Since higher-order terms are omitted in the Taylor series expansion, the corrections are computed and added to the initial values of the exterior orientation line elements, the error equations are listed again using the formula, and the exact solution of the exterior orientation line elements is obtained iteratively.
The initial values of the exterior orientation line elements can be obtained by the formula
where m is the scale factor; solving the formula completes the determination of the initial values of the exterior orientation elements.
According to indirect adjustment theory, the precision of the exterior orientation elements can be obtained:
the error in unit weight is:
the error in the exterior orientation line element is
4.2 According to the distance value S_i between the photography center and each control point, obtain the angular orientation (φ, ω, κ) of the image photo in the object space coordinate system by a direct solution method;
4.3 Using the angles (φ, ω, κ), obtain initial values of the exterior orientation angle elements of the rotation matrix;
4.4 Substitute the distance values S_i between the photography center and each control point and the initial values into the following formula to obtain the rotation matrix R;
Given the exact coordinates (X_S, Y_S, Z_S) of the photography center S in the object space coordinate system and the distances S_i from the photography center S to each object space point, the angular orientation (φ, ω, κ) of each photo in the object space coordinate system can be calculated by a direct solution using the known points, i.e. the orientation matrix of the bundle can be determined:
if the configuration of the object point A is a, then there is
Dividing both sides of the equation by S_A gives:
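The equation images are missing; the relation implied by the surrounding text (with $s_a = \sqrt{x_a^2 + y_a^2 + f^2}$ and R orthogonal, so $R^{-1} = R^{\mathsf T}$) is the unit-ray identity

$$\begin{bmatrix} (X_A - X_S)/S_A \\ (Y_A - Y_S)/S_A \\ (Z_A - Z_S)/S_A \end{bmatrix} = R \begin{bmatrix} x_a/s_a \\ y_a/s_a \\ -f/s_a \end{bmatrix},\qquad R = \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix},$$

whose transposed form gives, for each control point, the scalar equation used to solve the first column of R:

$$\frac{x_a}{s_a} = a_1\,\frac{X_A - X_S}{S_A} + b_1\,\frac{Y_A - Y_S}{S_A} + c_1\,\frac{Z_A - Z_S}{S_A},$$

with analogous equations in $y_a/s_a$ for $(a_2, b_2, c_2)$ and $-f/s_a$ for $(a_3, b_3, c_3)$.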
If a_i, b_i, c_i (i = 1, 2, 3) are approximately regarded as independent unknowns, they can be solved with three or more control points. For example, with the four control points used here, the n error equations for solving a_1, b_1, c_1 are:
written in matrix form as:
V = AX − L
where X = (a_1, b_1, c_1)ᵀ = (AᵀA)⁻¹AᵀL
The initial values of the exterior orientation angle elements can then be recovered from the rotation matrix R:
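The formula images are missing; assuming the φ-ω-κ angle system common in photogrammetry texts, the angles are recovered from the elements of R as

$$\varphi = \arctan\!\left(-\frac{a_3}{c_3}\right),\qquad \omega = \arcsin(-b_3),\qquad \kappa = \arctan\!\left(\frac{b_1}{b_2}\right).$$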
Step five, establishing the correspondence between all point cloud data and the image points on the image photo according to the rotation matrix R and the translation vector, and determining the color information of each point cloud datum; assigning the color information to the point cloud data of the target object according to the one-to-one correspondence, completing the color texture reconstruction of the target object.
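As an illustration of step five (a minimal sketch, not code from the patent; the function name, array conventions and nearest-pixel sampling are assumptions), the back-projection combines the recovered R, the photography center and the interior orientation:

```python
import numpy as np

def colorize_point_cloud(points, image, R, camera_center, f, x_pixel, y_pixel):
    """Assign an RGB color from the photo to each 3D point via the collinearity equations.

    points        : (N, 3) object space coordinates
    image         : (H, W, 3) photo
    R             : (3, 3) rotation matrix (image space -> object space)
    camera_center : (3,) exterior orientation line elements (X_S, Y_S, Z_S)
    f             : focal length in mm
    x_pixel, y_pixel : pixel scale factors in mm/pixel
    """
    h, w = image.shape[:2]
    rays = points - camera_center          # object space rays from S to each point
    u = rays @ R                           # R^T applied row-wise: rays in image space
    # Collinearity: image plane coordinates in mm, principal point at the photo center.
    with np.errstate(divide="ignore", invalid="ignore"):
        x = -f * u[:, 0] / u[:, 2]
        y = -f * u[:, 1] / u[:, 2]
    # mm -> pixel, moving the origin back to the upper-left corner.
    col = np.round(x / x_pixel + w / 2).astype(int)
    row = np.round(h / 2 - y / y_pixel).astype(int)
    # Keep only points in front of the camera (u_z < 0) projecting inside the photo.
    ok = (u[:, 2] < 0) & (col >= 0) & (col < w) & (row >= 0) & (row < h)
    colors = np.zeros((len(points), 3), dtype=image.dtype)
    colors[ok] = image[row[ok], col[ok]]
    return colors
```

Points that fall outside the photo or behind the camera keep a zero color here; in practice they would be textured from another photo.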
While embodiments of the invention have been described above, the invention is not limited to the applications set forth in the description and the embodiments; it is fully applicable in the various fields to which it pertains, and further modifications may readily be made by those skilled in the art. The invention is therefore not limited to the details shown, without departing from the general concept defined by the appended claims and their equivalents.
Claims (5)
1. A color texture reconstruction method based on a large-tilt-angle close-range image, characterized by comprising the following steps:
Step one, obtaining point cloud data of the target object to be reconstructed, selecting four control points from the point cloud data, and obtaining the three-dimensional point cloud coordinates (X_i, Y_i, Z_i) of the control points; calculating the distance value D_i between every two control points;
Step two, shooting an image photo of the target object from a photography center S, and determining on the image photo the four image points corresponding to the four control points; obtaining the image space coordinates (x, y, -f) of the image points in the image space coordinate system of the image photo; calculating the distance value d_i between every two image points;
Step three, using the average of the distance values D_i of the control points divided by the distance values d_i of the image points as the scale factor m, and calculating the exterior orientation line elements X_S, Y_S and Z_S of the photography center according to the following formula;
Step four, using the exterior orientation line elements X_S, Y_S and Z_S of the photography center, obtaining the rotation matrix R and the translation vector that represent the correspondence between the three-dimensional point cloud coordinate system and the image space coordinate system;
Step five, establishing the correspondence between all point cloud data and the image points on the image photo according to the rotation matrix R and the translation vector, and determining the color information of each point cloud datum; assigning the color information to the point cloud data of the target object according to the one-to-one correspondence, completing the color texture reconstruction of the target object.
2. The color texture reconstruction method based on a large-tilt-angle close-range image according to claim 1, wherein the scale factor m is calculated by the following formula:
where D_i is the distance value between every two control points, d_i is the distance value between every two image points, and n = 6.
3. The color texture reconstruction method based on a large-tilt-angle close-range image according to claim 2, wherein the specific process of obtaining the rotation matrix in step four is:
4.1 Using the three-dimensional point cloud coordinates (X_i, Y_i, Z_i) of the four control points and the exterior orientation line element values X_S, Y_S and Z_S of the photography center, calculate the distance value S_i between the photography center and each control point, where the distance value S_i is calculated by the following formula:
4.2 According to the distance value S_i between the photography center and each control point, obtain the angular orientation (φ, ω, κ) of the image photo in the object space coordinate system by a direct solution method;
4.3 Using the angles (φ, ω, κ), obtain initial values of the exterior orientation angle elements of the rotation matrix;
4.4 Substitute the distance values S_i between the photography center and each control point and the initial values into the following formula to obtain the rotation matrix R;
4. The method according to claim 3, wherein in step four the translation vector is obtained according to the following formula;
5. The method according to claim 4, wherein step two comprises converting the coordinates of each photo control point, expressed with the upper left corner as the origin, into coordinates with the center of the photo as the origin, by the following formula:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710040294.4A CN106910238A (en) | 2017-01-18 | 2017-01-18 | Color texture method for reconstructing based on high inclination-angle close-range image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106910238A true CN106910238A (en) | 2017-06-30 |
Family
ID=59206553
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710040294.4A Pending CN106910238A (en) | 2017-01-18 | 2017-01-18 | Color texture method for reconstructing based on high inclination-angle close-range image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106910238A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102314674A (en) * | 2011-08-29 | 2012-01-11 | 北京建筑工程学院 | Registering method for data texture image of ground laser radar |
WO2016081722A1 (en) * | 2014-11-20 | 2016-05-26 | Cappasity Inc. | Systems and methods for 3d capture of objects using multiple range cameras and multiple rgb cameras |
CN105931222A (en) * | 2016-04-13 | 2016-09-07 | 成都信息工程大学 | High-precision camera calibration method via low-precision 2D planar target |
Non-Patent Citations (3)
Title |
---|
Tang Yan: "Research on Three-Dimensional Reconstruction of Cultural Relics Based on Close-Range Photogrammetry", China Master's Theses Full-text Database, Basic Sciences *
Guan Yunlan et al.: "Calculation of Exterior Orientation Elements of Pyramid Images with Redundant Control Points", Journal of Guilin University of Technology *
Gao Ying et al. (eds.): "Virtual Reality Visual Simulation Technology", 31 March 2014 *
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108470370A (en) * | 2018-03-27 | 2018-08-31 | 北京建筑大学 | The method that three-dimensional laser scanner external camera joint obtains three-dimensional colour point clouds |
CN108470370B (en) * | 2018-03-27 | 2021-10-15 | 北京建筑大学 | Method for jointly acquiring three-dimensional color point cloud by external camera of three-dimensional laser scanner |
CN109725340A (en) * | 2018-12-31 | 2019-05-07 | 成都纵横大鹏无人机科技有限公司 | Direct geographic positioning and device |
CN110163903A (en) * | 2019-05-27 | 2019-08-23 | 百度在线网络技术(北京)有限公司 | The acquisition of 3-D image and image position method, device, equipment and storage medium |
CN110163903B (en) * | 2019-05-27 | 2022-02-25 | 百度在线网络技术(北京)有限公司 | Three-dimensional image acquisition and image positioning method, device, equipment and storage medium |
CN112179320A (en) * | 2019-07-02 | 2021-01-05 | 北京林业大学 | Method for creating 3D model by using Mini unmanned aerial vehicle in cooperation with common digital camera for shooting |
CN113487746A (en) * | 2021-05-25 | 2021-10-08 | 武汉海达数云技术有限公司 | Optimal associated image selection method and system in vehicle-mounted point cloud coloring |
CN113487746B (en) * | 2021-05-25 | 2023-02-24 | 武汉海达数云技术有限公司 | Optimal associated image selection method and system in vehicle-mounted point cloud coloring |
CN114863030A (en) * | 2022-05-23 | 2022-08-05 | 广州数舜数字化科技有限公司 | Method for generating user-defined 3D model based on face recognition and image processing technology |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20170630 |