CN111222586B - Oblique image matching method and device based on three-dimensional oblique model view angle - Google Patents

Oblique image matching method and device based on three-dimensional oblique model view angle

Info

Publication number
CN111222586B
CN111222586B
Authority
CN
China
Prior art keywords
image
oblique
screening
model
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010312367.2A
Other languages
Chinese (zh)
Other versions
CN111222586A (en)
Inventor
陈李胜
黄飞
林华军
王久玲
陈其孜
樊星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Ocn Network Technology Co ltd
Original Assignee
Guangzhou Ocn Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Ocn Network Technology Co ltd filed Critical Guangzhou Ocn Network Technology Co ltd
Priority to CN202010312367.2A priority Critical patent/CN111222586B/en
Publication of CN111222586A publication Critical patent/CN111222586A/en
Application granted granted Critical
Publication of CN111222586B publication Critical patent/CN111222586B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an oblique image matching method and device based on a three-dimensional oblique model view angle. The method comprises the following steps: obtaining query parameters, the query parameters comprising the name of the layer where the oblique model is located, the position coordinates of the oblique model, the azimuth angle of the oblique model view angle and the tilt angle of the oblique model view angle; and screening an oblique image database according to the query parameters so as to obtain the oblique images matched with the query parameters. The screening operation comprises a tilt angle screening step, an azimuth angle screening step, a distance screening step and a relative azimuth angle screening step. The method can quickly match the corresponding oblique images according to the relevant parameters of the three-dimensional oblique model view angle, realizing dynamic linked display of the scene's oblique model and the original oblique aerial images.

Description

Oblique image matching method and device based on three-dimensional oblique model view angle
Technical Field
The invention relates to the technical field of three-dimensional oblique models, and in particular to an oblique image matching method and device based on a three-dimensional oblique model view angle.
Background
Oblique photography is a new technology developed in the international photogrammetry field over roughly the last decade. By synchronously acquiring images from five different view angles (one vertical and four oblique), it captures rich, high-resolution textures of building roofs and facades. It can faithfully reflect ground and object conditions, acquire object texture information with high precision, and, through positioning, fusion, modeling and related techniques, generate a realistic three-dimensional city model. With image recognition technology, the group of photographs that the aircraft shot from the position and view angle corresponding to a chosen viewpoint on the three-dimensional model can be found, so that the surroundings of the model can be mapped.
Although searching for the corresponding oblique image by image recognition can accurately match the target photograph, the number of oblique photographs generated during aerial photography is massive, so the image recognition process is long and real-time requirements are difficult to meet.
Disclosure of Invention
In order to overcome the defects of the prior art, one object of the present invention is to provide an oblique image matching method based on a three-dimensional oblique model view angle, which can quickly match the corresponding oblique images according to the relevant parameters of the three-dimensional oblique model view angle and realize dynamic linked display of the scene's oblique model and the original oblique aerial images.
The second object of the present invention is to provide an electronic device that can quickly match the corresponding oblique images according to the relevant parameters of the three-dimensional oblique model view angle, realizing dynamic linked display of the scene's oblique model and the original oblique aerial images.
The third object of the present invention is to provide a computer-readable storage medium whose stored program, when run, can quickly match the corresponding oblique images according to the relevant parameters of a three-dimensional oblique model view angle, realizing dynamic linked display of the scene's oblique model and the original oblique aerial images.
The first object of the invention is achieved by the following technical solution:
a tilt image matching method based on a three-dimensional tilt model view angle comprises the following steps:
obtaining query parameters, wherein the query parameters comprise: the name of the layer where the inclined model is located, the position coordinate of the inclined model, the azimuth angle of the inclined model view angle and the inclination angle of the inclined model view angle;
screening an oblique image database according to the query parameters so as to obtain oblique image information matched with the query parameters through screening;
the screening operation comprises the following steps:
and (3) screening the inclined angle:
judging whether the inclination angle of the inclination model visual angle is larger than or equal to n, wherein the range of n is [75 degrees ], 90 degrees ];
if so, acquiring an orthoscopic oblique image from an oblique image library according to the name of the layer where the oblique model is located and performing a distance screening step on the orthoscopic oblique image;
if not, acquiring other oblique images except the orthoscopic oblique image from an oblique image library according to the name of the layer where the oblique model is located, and performing an azimuth angle screening step on the acquired oblique images;
azimuth screening:
screening out an inclined image of which the azimuth angle meets the azimuth angle screening condition from the inclined images acquired in the inclined angle screening step; the azimuth screening conditions are as follows: the azimuth angle of the oblique image is less than or equal to Yaw-Tolerancel and less than or equal to Yaw + Tolerancel; wherein, Yaw is the azimuth angle of the three-dimensional model visual angle, and Tolerance1 is the Tolerance;
performing a distance screening step on the oblique images meeting the azimuth angle screening formula;
distance screening:
calculating the distance between the center point of each inclined image group and the inclined model, which is obtained by screening in the inclined angle screening step or the azimuth angle screening step;
calculating the radius of an outer circle of each inclined image group obtained by screening in the inclined angle screening step or the azimuth angle screening step;
screening an inclined image group of which the distance between the center point of the inclined image group and the inclined model is less than or equal to the radius of an outer circle of the inclined image group, and performing a relative azimuth angle screening step on the screened inclined image group;
screening relative azimuth angles:
calculating the relative azimuth angle between the central point of the inclined image group obtained by screening in the distance screening step and the position of the inclined model;
screening an oblique image of which the azimuth angle of the oblique image meets the relative azimuth angle screening condition from an oblique image group obtained by screening in the distance screening step, wherein the relative azimuth angle screening condition is as follows:
yaw 2-Tolerance 2 is less than or equal to the azimuth angle of the inclined image is less than or equal to Yaw2+ Tolerance2,
wherein, Yaw2 is the relative azimuth angle between the center point of the tilted image set and the position of the tilted model, and Tolerance2 is the Tolerance.
Further, the distance between the center point of the oblique image group and the oblique model is calculated by the following formula:
Distance = sqrt((BulidingPositionX - ImageGroupX)^2 + (BulidingPositionY - ImageGroupY)^2),
where the position coordinates of the oblique model are (BulidingPositionX, BulidingPositionY) and the coordinates of the center point of the oblique image group are (ImageGroupX, ImageGroupY).
Further, the radius of the circumscribed (outer) circle of the oblique image group is calculated by a corresponding formula (given only as an embedded image in the original publication).
further, the relative azimuth is calculated by the following formula:
Yaw2=arctan((BulidingPositionY-ImageGroupY)/(BulidingPositionX-ImageG roupX)),
wherein, the position coordinates of the tilt model are: (BulidingPositionX, BulidingPositionY), the coordinates of the center point of the tilted image group are: (ImageGroupX, ImageGroupY).
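For illustration only, the two quantities above can be computed as in the following Python sketch; the function and argument names are hypothetical and not taken from the patent, and atan2 is substituted for the quadrant-ambiguous arctan:

import math

def group_distance(building_x, building_y, group_x, group_y):
    # Euclidean distance between the oblique model position
    # (BulidingPositionX, BulidingPositionY) and the image group
    # center point (ImageGroupX, ImageGroupY).
    return math.hypot(building_x - group_x, building_y - group_y)

def relative_azimuth(building_x, building_y, group_x, group_y):
    # Relative azimuth Yaw2 from the image group center point to the
    # oblique model position.  The patent writes arctan(dY / dX); atan2
    # is used here so the angle falls in the correct quadrant, and the
    # result is normalized to [0, 360) degrees.
    angle = math.degrees(math.atan2(building_y - group_y, building_x - group_x))
    return angle % 360.0

If the azimuth convention is clockwise from north rather than counter-clockwise from the X axis, the two atan2 arguments would be swapped; the patent does not state which convention is used.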
Further, the data information carried by each oblique image in the oblique image library includes: the unique identifier of the oblique image, the file path of the oblique image, the image group to which the oblique image belongs, the number of the oblique camera, and the center point coordinates of the image group.
The second object of the invention is achieved by the following technical solution:
an electronic device comprising a processor and a memory, the memory storing an executable computer program, the processor reading and running the computer program in the memory to implement the oblique image matching method based on a three-dimensional oblique model view angle described above.
The third object of the invention is achieved by the following technical solution:
a computer-readable storage medium storing an executable computer program which, when run, implements the oblique image matching method based on a three-dimensional oblique model view angle described above.
Compared with the prior art, the invention has the following beneficial effects:
the oblique image matching method based on a three-dimensional oblique model view angle can quickly match the corresponding oblique images according to the relevant parameters of the three-dimensional oblique model view angle (the name of the layer where the oblique model is located, the position coordinates of the oblique model, the azimuth angle of the oblique model view angle and the tilt angle of the oblique model view angle), screening the oblique images rapidly. Compared with matching by image recognition, the method can quickly match the oblique images corresponding to the view angle of the oblique model selected in the current scene, with lower latency; it can display the scene's oblique model and the original oblique aerial images in dynamic linkage and meet real-time requirements.
Drawings
Fig. 1 is a schematic flow chart of the oblique image matching method based on a three-dimensional oblique model view angle provided by the present invention;
Fig. 2 is a schematic diagram of the oblique photography imaging rule in the oblique image matching method based on a three-dimensional oblique model view angle provided by the present invention;
Fig. 3 is a schematic diagram of the tilt angle of the three-dimensional model view angle in the oblique image matching method based on a three-dimensional oblique model view angle provided by the present invention;
Fig. 4 is a schematic diagram of the azimuth angle of the three-dimensional model view angle in the oblique image matching method based on a three-dimensional oblique model view angle provided by the present invention, in which the azimuth angle of the scene view is equal to the azimuth angle of the three-dimensional model view angle;
Fig. 5 is a schematic diagram of distance screening in the oblique image matching method based on a three-dimensional oblique model view angle provided by the present invention, in which the radius of the circle is the buffer distance, i.e. the radius of the circumscribed circle of the oblique image group;
Fig. 6 is a schematic diagram of the relative azimuth angle in the oblique image matching method based on a three-dimensional oblique model view angle provided by the present invention.
Detailed Description
The present invention is further described below with reference to the accompanying drawings and specific embodiments. It should be noted that, provided there is no conflict, the embodiments or technical features described below can be combined to form new embodiments.
Referring to Figs. 1 to 6, an oblique image matching method based on a three-dimensional oblique model view angle includes the following steps:
obtaining query parameters, the query parameters comprising: the name of the layer where the oblique model is located, the position coordinates of the oblique model, the azimuth angle of the oblique model view angle and the tilt angle of the oblique model view angle;
screening an oblique image database according to the query parameters so as to obtain the oblique image information matched with the query parameters;
the screening operation comprises the following steps:
a tilt angle screening step:
judging whether the tilt angle of the oblique model view angle is greater than or equal to n, where n is in the range [75°, 90°];
if so, acquiring the ortho (nadir-view) oblique images from the oblique image library according to the name of the layer where the oblique model is located, and performing the distance screening step on them;
if not, acquiring the oblique images other than the ortho (nadir-view) images from the oblique image library according to the name of the layer where the oblique model is located, and performing the azimuth angle screening step on the acquired oblique images;
an azimuth angle screening step:
screening out, from the oblique images acquired in the tilt angle screening step, the oblique images whose azimuth angles satisfy the azimuth angle screening condition, the condition being: Yaw - Tolerance1 ≤ azimuth angle of the oblique image ≤ Yaw + Tolerance1,
where Yaw is the azimuth angle of the three-dimensional model view angle and Tolerance1 is the tolerance;
performing the distance screening step on the oblique images that satisfy the azimuth angle screening condition;
a distance screening step:
calculating, for each oblique image group obtained in the tilt angle screening step or the azimuth angle screening step, the distance between the center point of the image group and the oblique model;
calculating the radius of the circumscribed (outer) circle of each oblique image group obtained in the tilt angle screening step or the azimuth angle screening step;
selecting the oblique image groups for which the distance between the image group center point and the oblique model is less than or equal to the radius of the group's circumscribed circle, and performing the relative azimuth angle screening step on the selected image groups;
a relative azimuth angle screening step:
calculating the relative azimuth angle between the center point of each oblique image group obtained in the distance screening step and the position of the oblique model;
screening out, from the oblique image groups obtained in the distance screening step, the oblique images whose azimuth angles satisfy the relative azimuth angle screening condition, the condition being:
Yaw2 - Tolerance2 ≤ azimuth angle of the oblique image ≤ Yaw2 + Tolerance2,
where Yaw2 is the relative azimuth angle between the center point of the oblique image group and the position of the oblique model, and Tolerance2 is the tolerance.
The oblique image matching method based on a three-dimensional oblique model view angle can quickly match the corresponding oblique images according to the relevant parameters of the three-dimensional oblique model view angle (the name of the layer where the oblique model is located, the position coordinates of the oblique model, the azimuth angle of the oblique model view angle and the tilt angle of the oblique model view angle), screening the oblique images rapidly. With this method, the oblique images corresponding to the view angle of the oblique model selected in the current scene can be matched quickly, the scene's oblique model and the original oblique aerial images can be displayed in dynamic linkage, and real-time requirements can be met.
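Purely as a non-authoritative illustration, the four screening steps can be chained into a single pass over the image records of the relevant layer. The Python sketch below assumes an in-memory list of records with the fields of the data structure model described later (Id, ImagePath, ImageGroup, PhotoGroup, ImageGroupX, ImageGroupY, ImageYaw); the record class, the nadir test via camera number 5, the tolerance values and the precomputed circumscribed-circle radius field are assumptions of the sketch, not details given by the patent.

import math
from dataclasses import dataclass

NADIR_CAMERA = 5        # camera 5 looks straight down (see the imaging rule below)
TILT_THRESHOLD = 75.0   # n, taken from the stated range [75°, 90°]
TOLERANCE1 = 15.0       # azimuth tolerance (illustrative value)
TOLERANCE2 = 45.0       # relative-azimuth tolerance (illustrative value)

@dataclass
class ObliqueImage:          # hypothetical record mirroring the data structure model
    id: str
    image_path: str
    image_group: int
    photo_group: int         # camera number
    group_x: float           # ImageGroupX
    group_y: float           # ImageGroupY
    image_yaw: float         # ImageYaw, in degrees
    group_radius: float      # circumscribed-circle radius of the group (precomputed)

def match_oblique_images(layer_images, pos_x, pos_y, view_yaw, view_tilt):
    # Tilt angle screening: near-vertical view angles are matched against the
    # nadir images only; otherwise the four oblique cameras are considered.
    if view_tilt >= TILT_THRESHOLD:
        candidates = [im for im in layer_images if im.photo_group == NADIR_CAMERA]
    else:
        candidates = [im for im in layer_images if im.photo_group != NADIR_CAMERA]
        # Azimuth angle screening: Yaw - Tolerance1 <= ImageYaw <= Yaw + Tolerance1.
        candidates = [im for im in candidates
                      if view_yaw - TOLERANCE1 <= im.image_yaw <= view_yaw + TOLERANCE1]

    # Distance screening: the oblique model must lie within the circumscribed
    # circle of the image group.
    candidates = [im for im in candidates
                  if math.hypot(pos_x - im.group_x, pos_y - im.group_y) <= im.group_radius]

    # Relative azimuth angle screening: Yaw2 - Tolerance2 <= ImageYaw <= Yaw2 + Tolerance2.
    matched = []
    for im in candidates:
        yaw2 = math.degrees(math.atan2(pos_y - im.group_y, pos_x - im.group_x)) % 360.0
        if yaw2 - TOLERANCE2 <= im.image_yaw <= yaw2 + TOLERANCE2:
            matched.append(im)
    return matched

A real implementation would more likely express these conditions as query predicates against the oblique image library and would handle angle wrap-around near 0°/360°, which the plain inequalities above do not.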
Specifically, the distance between the center point of the oblique image group and the oblique model is calculated by the following formula:
Distance = sqrt((BulidingPositionX - ImageGroupX)^2 + (BulidingPositionY - ImageGroupY)^2),
where the position coordinates of the oblique model are (BulidingPositionX, BulidingPositionY) and the coordinates of the center point of the oblique image group are (ImageGroupX, ImageGroupY).
The radius of the circumscribed (outer) circle of the oblique image group is calculated by a corresponding formula (given only as an embedded image in the original publication).
the relative azimuth is calculated by the following formula:
yaw2 ═ arctan ((building position y-imagegroup y)/(building position x-ImageG roupX)), where the position coordinates of the tilt model are: (BulidingPositionX, BulidingPositionY), the coordinates of the center point of the tilted image group are: (ImageGroupX, ImageGroupY).
Specifically, the data information carried by each oblique image in the oblique image library includes: the unique identifier of the oblique image, the file path of the oblique image, the image group to which the oblique image belongs, the number of the oblique camera, and the center point coordinates of the image group.
The oblique image database is constructed as follows:
the oblique images are stored according to a preset oblique image data structure model to form the oblique image library; the data structure model is as follows:
name of field Type of field Remarks for note
Id NVARchar2(255) Unique identifier for inclined image
ImagePath NVARchar2(255) Tilting image file path
ImageGroup NUMBER Oblique image group
PhotoGroup NUMBER Camera number
ImageGroupX NUMBER Center point X coordinate of inclined image group
ImageGroupY NUMBER Y coordinate of central point of inclined image group
ImageYaw NUMBER Azimuth angle of oblique image
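For illustration, the data structure model above can be realized as a relational table. The following Python/SQLite sketch is only an approximation: the original model uses Oracle-style types (NVARCHAR2, NUMBER), mapped here to TEXT/INTEGER/REAL, and the file path and coordinate values in the sample row are invented.

import sqlite3

conn = sqlite3.connect("oblique_images.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS oblique_image (
        Id          TEXT PRIMARY KEY,  -- unique identifier of the oblique image
        ImagePath   TEXT,              -- oblique image file path
        ImageGroup  INTEGER,           -- oblique image group
        PhotoGroup  INTEGER,           -- camera number
        ImageGroupX REAL,              -- X coordinate of the image group center point
        ImageGroupY REAL,              -- Y coordinate of the image group center point
        ImageYaw    REAL               -- azimuth angle of the oblique image, degrees
    )
""")
conn.execute(
    "INSERT OR REPLACE INTO oblique_image VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("IMG_000123", "/data/oblique/group001/cam1.jpg", 1, 1, 431250.0, 2654310.0, 87.5),
)
conn.commit()
conn.close()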
Specifically, the following information for each oblique image can be obtained from the original oblique image data file:
the unique identifier of the oblique image (corresponding to the Id attribute of the data model);
the oblique image file path (corresponding to the ImagePath attribute of the data model);
the oblique image group (corresponding to the ImageGroup attribute of the data model) (each exposure during the flight of the unmanned aerial vehicle generates one group of oblique images, and each oblique image in the group corresponds to the image shot by one camera);
the number of the associated oblique photography camera (corresponding to the PhotoGroup attribute of the data model) (each number corresponds to one camera on the unmanned aerial vehicle, and the shooting direction of that camera is fixed relative to the flight direction of the unmanned aerial vehicle);
the X and Y coordinates of the center point of the oblique image group (corresponding to the ImageGroupX and ImageGroupY attributes of the data model).
The azimuth angle of each oblique image (corresponding to the ImageYaw attribute of the oblique image data model) is calculated as follows:
since the oblique image groups are numbered consecutively (for example, the image group following group 001 is group 002), the flight direction of the unmanned aerial vehicle when shooting the nth image group can be expressed as: Angle (flight direction) = arctan((Y(n+1) - Y(n)) / (X(n+1) - X(n))), where (X(n), Y(n)) and (X(n+1), Y(n+1)) are the center points of image groups n and n+1;
according to the oblique photography imaging rule shown in Fig. 2, the imaging sight direction of each oblique image is fixed relative to the flight direction of the unmanned aerial vehicle. Camera 1 shoots directly ahead of the flight direction, so the azimuth angle (ImageYaw) of its oblique image equals the flight direction (Angle); camera 2 shoots to the right of the flight direction, so ImageYaw = Angle + 90°; camera 3 shoots directly behind the flight direction, so ImageYaw = Angle + 180°; camera 4 shoots to the left of the flight direction, so ImageYaw = Angle + 270°; camera 5 shoots straight down, and the azimuth angle of its oblique image is 0°. When group n is the last image group, the directions of its images are taken from those of group n-1.
The invention also provides an electronic device comprising a processor and a memory, the memory storing an executable computer program; the processor reads and runs the computer program in the memory to implement the oblique image matching method based on a three-dimensional oblique model view angle described above.
In addition, the invention also provides a computer-readable storage medium storing an executable computer program which, when run, implements the oblique image matching method based on a three-dimensional oblique model view angle described above.
The above embodiments are only preferred embodiments of the present invention, and the protection scope of the present invention is not limited thereto; any insubstantial change or substitution made by those skilled in the art on the basis of the present invention falls within the protection scope of the present invention.

Claims (7)

1. An oblique image matching method based on a three-dimensional oblique model view angle, characterized by comprising the following steps:
obtaining query parameters, the query parameters comprising: the name of the layer where the oblique model is located, the position coordinates of the oblique model, the azimuth angle of the oblique model view angle and the tilt angle of the oblique model view angle;
screening an oblique image database according to the query parameters so as to obtain the oblique images matched with the query parameters;
the screening operation comprises the following steps:
a tilt angle screening step:
judging whether the tilt angle of the oblique model view angle is greater than or equal to n, where n is in the range [75°, 90°];
if so, acquiring the ortho (nadir-view) oblique images from the oblique image library according to the name of the layer where the oblique model is located, and performing the distance screening step on them;
if not, acquiring the oblique images other than the ortho (nadir-view) images from the oblique image library according to the name of the layer where the oblique model is located, and performing the azimuth angle screening step on the acquired oblique images;
an azimuth angle screening step:
screening out, from the oblique images acquired in the tilt angle screening step, the oblique images whose azimuth angles satisfy the azimuth angle screening condition, the condition being: Yaw - Tolerance1 ≤ azimuth angle of the oblique image ≤ Yaw + Tolerance1,
where Yaw is the azimuth angle of the three-dimensional model view angle and Tolerance1 is the tolerance;
performing the distance screening step on the oblique images that satisfy the azimuth angle screening condition;
a distance screening step:
calculating, for each oblique image group obtained in the tilt angle screening step or the azimuth angle screening step, the distance between the center point of the image group and the oblique model;
calculating the radius of the circumscribed (outer) circle of each oblique image group obtained in the tilt angle screening step or the azimuth angle screening step;
selecting the oblique image groups for which the distance between the image group center point and the oblique model is less than or equal to the radius of the group's circumscribed circle, and performing the relative azimuth angle screening step on the selected image groups;
a relative azimuth angle screening step:
calculating the relative azimuth angle between the center point of each oblique image group obtained in the distance screening step and the position of the oblique model;
screening out, from the oblique image groups obtained in the distance screening step, the oblique images whose azimuth angles satisfy the relative azimuth angle screening condition, the condition being:
Yaw2 - Tolerance2 ≤ azimuth angle of the oblique image ≤ Yaw2 + Tolerance2,
where Yaw2 is the relative azimuth angle between the center point of the oblique image group and the position of the oblique model, and Tolerance2 is the tolerance.
2. The oblique image matching method based on a three-dimensional oblique model view angle according to claim 1, wherein the distance between the oblique image group center point and the oblique model is calculated by the following formula:
Distance = sqrt((BulidingPositionX - ImageGroupX)^2 + (BulidingPositionY - ImageGroupY)^2),
where the position coordinates of the oblique model are (BulidingPositionX, BulidingPositionY) and the coordinates of the center point of the oblique image group are (ImageGroupX, ImageGroupY).
3. The oblique image matching method based on a three-dimensional oblique model view angle according to claim 1, wherein the radius of the circumscribed (outer) circle is calculated by a corresponding formula (given only as an embedded image in the original publication).
4. The oblique image matching method based on a three-dimensional oblique model view angle according to claim 1, wherein the relative azimuth angle is calculated by the following formula:
Yaw2 = arctan((BulidingPositionY - ImageGroupY) / (BulidingPositionX - ImageGroupX)),
where the position coordinates of the oblique model are (BulidingPositionX, BulidingPositionY) and the coordinates of the center point of the oblique image group are (ImageGroupX, ImageGroupY).
5. The oblique image matching method based on a three-dimensional oblique model view angle according to any one of claims 1 to 4, wherein the data information carried by each oblique image in the oblique image library comprises: the unique identifier of the oblique image, the file path of the oblique image, the image group to which the oblique image belongs, the number of the oblique camera, and the center point coordinates of the image group.
6. An electronic device, comprising a processor and a memory, wherein the memory stores an executable computer program, and the processor reads and runs the computer program in the memory to implement the oblique image matching method based on a three-dimensional oblique model view angle according to any one of claims 1 to 5.
7. A computer-readable storage medium, wherein the computer-readable storage medium stores an executable computer program which, when run, implements the oblique image matching method based on a three-dimensional oblique model view angle according to any one of claims 1 to 5.
CN202010312367.2A 2020-04-20 2020-04-20 Oblique image matching method and device based on three-dimensional oblique model view angle Active CN111222586B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010312367.2A CN111222586B (en) Oblique image matching method and device based on three-dimensional oblique model view angle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010312367.2A CN111222586B (en) Oblique image matching method and device based on three-dimensional oblique model view angle

Publications (2)

Publication Number Publication Date
CN111222586A CN111222586A (en) 2020-06-02
CN111222586B (en) 2020-09-18

Family

ID=70830105

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010312367.2A Active CN111222586B (en) Oblique image matching method and device based on three-dimensional oblique model view angle

Country Status (1)

Country Link
CN (1) CN111222586B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112504237B (en) * 2020-11-30 2023-03-24 贵州北斗空间信息技术有限公司 Lightweight rapid generation method for inclination data
CN112698661B (en) * 2021-03-22 2021-08-24 成都睿铂科技有限责任公司 Aerial survey data acquisition method, device and system for aircraft and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794490A (en) * 2015-04-28 2015-07-22 中测新图(北京)遥感技术有限责任公司 Slanted image homonymy point acquisition method and slanted image homonymy point acquisition device for aerial multi-view images
CN106289188A (en) * 2016-08-05 2017-01-04 航天恒星科技有限公司 A kind of measuring method based on multi-vision aviation image and system
CN106898047A (en) * 2017-02-24 2017-06-27 朱庆 The adaptive network method for visualizing of oblique model and multivariate model dynamic fusion
CN108399631A (en) * 2018-03-01 2018-08-14 北京中测智绘科技有限公司 A kind of inclination image of scale invariability regards dense Stereo Matching method more
CN109238239A (en) * 2018-09-12 2019-01-18 成都坤舆空间科技有限公司 Digital measurement three-dimensional modeling method based on aeroplane photography

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7424133B2 (en) * 2002-11-08 2008-09-09 Pictometry International Corporation Method and apparatus for capturing, geolocating and measuring oblique images
US8531472B2 (en) * 2007-12-03 2013-09-10 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
US8422825B1 (en) * 2008-11-05 2013-04-16 Hover Inc. Method and system for geometry extraction, 3D visualization and analysis using arbitrary oblique imagery
US9025810B1 (en) * 2010-04-05 2015-05-05 Google Inc. Interactive geo-referenced source imagery viewing system and method
CN106327573B (en) * 2016-08-25 2019-03-12 成都慧途科技有限公司 A kind of outdoor scene three-dimensional modeling method for urban architecture
CN107025449B (en) * 2017-04-14 2020-04-07 西南交通大学 Oblique image straight line feature matching method constrained by local area with unchanged visual angle
CN107833273B (en) * 2017-11-02 2021-03-02 重庆市勘测院 Oblique photography three-dimensional model objectification application method based on three-dimensional simulation model
CN108665536B (en) * 2018-05-14 2021-07-09 广州市城市规划勘测设计研究院 Three-dimensional and live-action data visualization method and device and computer readable storage medium
CN108981700B (en) * 2018-06-13 2022-02-15 江苏实景信息科技有限公司 Positioning and attitude determining method and device
CN110458945B (en) * 2019-08-09 2022-11-11 中科宇图科技股份有限公司 Automatic modeling method and system by combining aerial oblique photography with video data

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794490A (en) * 2015-04-28 2015-07-22 中测新图(北京)遥感技术有限责任公司 Slanted image homonymy point acquisition method and slanted image homonymy point acquisition device for aerial multi-view images
CN106289188A (en) * 2016-08-05 2017-01-04 航天恒星科技有限公司 A kind of measuring method based on multi-vision aviation image and system
CN106898047A (en) * 2017-02-24 2017-06-27 朱庆 The adaptive network method for visualizing of oblique model and multivariate model dynamic fusion
CN108399631A (en) * 2018-03-01 2018-08-14 北京中测智绘科技有限公司 A kind of inclination image of scale invariability regards dense Stereo Matching method more
CN109238239A (en) * 2018-09-12 2019-01-18 成都坤舆空间科技有限公司 Digital measurement three-dimensional modeling method based on aeroplane photography

Also Published As

Publication number Publication date
CN111222586A (en) 2020-06-02

Similar Documents

Publication Publication Date Title
CN110853075B (en) Visual tracking positioning method based on dense point cloud and synthetic view
JP4854819B2 (en) Image information output method
US7233691B2 (en) Any aspect passive volumetric image processing method
AU2007355942B2 (en) Arrangement and method for providing a three dimensional map representation of an area
TW202036480A (en) Image positioning method and system thereof
CN113850126A (en) Target detection and three-dimensional positioning method and system based on unmanned aerial vehicle
CN110617821A (en) Positioning method, positioning device and storage medium
CN111222586B (en) Inclined image matching method and device based on three-dimensional inclined model visual angle
JP2015114954A (en) Photographing image analysis method
CN114359476A (en) Dynamic 3D urban model construction method for urban canyon environment navigation
CN114004977A (en) Aerial photography data target positioning method and system based on deep learning
CN113496503B (en) Point cloud data generation and real-time display method, device, equipment and medium
CN117078756A (en) Airborne ground target accurate positioning method based on scene retrieval matching
CN107784666B (en) Three-dimensional change detection and updating method for terrain and ground features based on three-dimensional images
US20220276046A1 (en) System and method for providing improved geocoded reference data to a 3d map representation
CN113011212B (en) Image recognition method and device and vehicle
CN114187344A (en) Map construction method, device and equipment
CN112767477A (en) Positioning method, positioning device, storage medium and electronic equipment
CN116824068B (en) Real-time reconstruction method, device and equipment for point cloud stream in complex dynamic scene
CN117274472B (en) Aviation true projection image generation method and system based on implicit three-dimensional expression
CN114067071B (en) High-precision map making system based on big data
CN114663596A (en) Large scene mapping method based on real-time ground-imitating flight method of unmanned aerial vehicle
CN117726758A (en) Rapid large-scale three-dimensional reconstruction method based on nerve radiation field
CN117570974A (en) Unmanned aerial vehicle positioning method and device based on visual inertial interaction
CN117893600A (en) Unmanned aerial vehicle image visual positioning method supporting scene apparent difference

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant