CN111693028A - Method for obtaining digital water depth model based on projection image

Method for obtaining digital water depth model based on projection image

Info

Publication number
CN111693028A
CN111693028A (application CN202010580001.3A)
Authority
CN
China
Prior art keywords
water depth
image
water
projection
point
Prior art date
Legal status
Pending
Application number
CN202010580001.3A
Other languages
Chinese (zh)
Inventor
汪佳丽
徐笑
陈艳楠
胡莉停
Current Assignee
Shanghai Ocean University
Original Assignee
Shanghai Ocean University
Priority date
Filing date
Publication date
Application filed by Shanghai Ocean University filed Critical Shanghai Ocean University
Priority to CN202010580001.3A priority Critical patent/CN111693028A/en
Publication of CN111693028A publication Critical patent/CN111693028A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 13/00 Surveying specially adapted to open water, e.g. sea, lake, river or canal
    • G01C 13/008 Surveying specially adapted to open water, e.g. sea, lake, river or canal, measuring depth of open water
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/2433 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures for measuring outlines by shadow casting
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 5/00 Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 7/00 Tracing profiles
    • G01C 7/02 Tracing profiles of land surfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Hydrology & Water Resources (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for acquiring a digital water depth model based on projection images, which addresses the problems that shallow-sea water depths are currently acquired with low accuracy and that the resulting models cannot be used in practice.

Description

Method for obtaining digital water depth model based on projection image
Technical Field
The invention relates to a measurement technology, in particular to a method for acquiring a digital water depth model based on a projection image.
Background
In recent years, advances in science and technology have driven a great deal of marine research, and acquiring high-accuracy submarine topography is a goal consistently pursued by those skilled in the art. Aerial photogrammetry is now mature: digital surface models and orthoimages can be generated automatically from aerial images, and over land, where textures are clear, the automatically generated products have high accuracy. For water areas, however, the digital water depth model (i.e. the underwater digital surface model) obtained by automatic matching has low accuracy and cannot be used in practice. The errors are usually caused by several factors:
firstly, because of the refraction of light by water, the conventional collinearity equation is no longer applicable when a digital water depth model of a water area is obtained;
secondly, because the surface of a water area usually lacks texture, matching errors occur;
and thirdly, the water surface is dynamic under windy conditions.
Because optical acquisition of a digital water depth model overcomes the problem that acoustic methods cannot collect data in situ in shallow sea areas, optical methods are now widely applied in such areas. How to acquire shallow-sea water depths more efficiently on an optical basis is therefore an active research topic in oceanography.
Disclosure of Invention
The invention aims to provide a method for acquiring a digital water depth model based on a projection image, which can acquire shallow sea water depth more efficiently and accurately.
The technical purpose of the invention is realized by the following technical scheme:
a method for obtaining a digital water depth model based on projection images comprises the following steps:
generating projection images from acquired original remote sensing images of a shallow water area and a digital surface model whose elevation values are all the water surface elevation;
acquiring an original water depth model of the textured area from the projection images and the water-surface digital surface model by an object-space image matching method;
performing accuracy verification on the obtained original water depth model, performing regional water depth correction for any mismatched regions, and performing single-point water depth correction for any mismatched points;
and generating a digital water depth model of the textured area by mosaicking.
Preferably, the projection image is generated by the following specific steps:
obtaining an original remote sensing image of a water area, preprocessing it, and determining its exterior orientation elements;
acquiring a surface model of the water area corresponding to the original remote sensing image;
acquiring the water surface elevation at the moment of shooting, and setting the elevation values of the surface model to that water surface elevation to obtain a digital surface model whose elevation values are all the water surface elevation;
and projecting the original remote sensing image onto the digital surface model whose elevation values are all the water surface elevation, and generating the projection image corresponding to the original remote sensing image by differential correction.
Preferably, the correspondence between the feature point coordinates of the obtained projection image and the image points on the original remote sensing image is as follows:

x = x_0 - f * [a_1(X_P - X_S) + b_1(Y_P - Y_S) + c_1(Z_0 - Z_S)] / [a_3(X_P - X_S) + b_3(Y_P - Y_S) + c_3(Z_0 - Z_S)]

y = y_0 - f * [a_2(X_P - X_S) + b_2(Y_P - Y_S) + c_2(Z_0 - Z_S)] / [a_3(X_P - X_S) + b_3(Y_P - Y_S) + c_3(Z_0 - Z_S)]

where (X_P, Y_P) are the feature point coordinates of the projection image and (x, y) are the coordinates of the corresponding image point on the original remote sensing image; a_1, b_1, c_1, a_2, b_2, c_2, a_3, b_3, c_3 are the nine elements of the rotation matrix of the original remote sensing image; (X_S, Y_S, Z_S) are the object-space coordinates of the projection center of the original remote sensing image; x_0, y_0, f are the interior orientation parameters of the camera; and Z_0 is the water surface elevation at the moment the original remote sensing image was acquired.
Preferably, the specific steps of obtaining the original water depth model of the textured area by the object-space image matching method are as follows:
determining the position of the textured area on the digital surface model;
estimating an approximate water depth value Z_u of the textured area, and setting a search range and a search interval ΔZ to obtain the water depth search values Z_i:
Z_i = Z_u + i * ΔZ
where i = 1, 2, 3, ..., n;
for each water depth search value Z_i, searching the corresponding projection point pair P_li', P_ri' on the left and right projection images; taking the refractive index n of water into account, the relationship between the projection point P_i'(X_Pi', Y_Pi'), the search point P(X_P, Y_P, Z_Pi) and the projection center S(X_S, Y_S, Z_S) is as follows:
Figure BDA0002552834310000031
By searching for water depth value Zi(i 1,2,3, …, n), the corresponding projected point coordinates (X) can be found on the left and right projected imagesPi’,YPi’);
calculating the matching measure between the projection point pairs, and taking the water depth search value corresponding to the projection point pair with the highest matching measure as the optimal water depth value of the feature point;
and obtaining the original water depth model of the textured area by a multipoint image matching method.
Preferably, the existing mismatched points are corrected as follows:
the single-point water depth values of the textured area are corrected with a light refraction correction method based on the projection images;
searching the mismatched points to be corrected on the left and right projection images;
setting a lowest water depth value Z_min that is lower than the lowest water depth value of the current area;
and computing the plane coordinates at which a mismatched point on the projection image corresponds to the lowest water depth value:

X_pi = X_pi' + (Z_0 - Z_min) * tan β * (X_pi' - X_S) / D

Y_pi = Y_pi' + (Z_0 - Z_min) * tan β * (Y_pi' - Y_S) / D

Z_pi = Z_min

where (X_pi, Y_pi, Z_pi) are the sought coordinates, (X_pi', Y_pi') are the coordinates of the mismatched point, Z_min is the lowest water depth value, D = sqrt((X_pi' - X_S)^2 + (Y_pi' - Y_S)^2), and

sin α = n * sin β, with tan α = D / (Z_S - Z_0)

where β is the refraction angle of the light ray in water and α is the incidence angle of the light ray in air;
determining the corresponding points of the left and right mismatched points of the projection images on the lowest horizontal plane to obtain the refracted rays of the left and right mismatched points, and judging the two refracted rays to determine the true water depth value:
if the two refracted rays intersect, the intersection point gives the true water depth value of the mismatched point; if the two refracted rays do not intersect, the point with the shortest distance to both refracted rays is taken as the true water depth value of the mismatched point.
Preferably, for an existing mismatched region, the regional water depth correction is performed by reducing or enlarging the matching window or by substituting a different image matching method.
In conclusion, the invention has the following beneficial effects:
the object space image matching method is combined with the projection image concept, so that the matching efficiency is improved to a great extent, a large-area digital water depth model can be obtained efficiently and accurately, the problem that data cannot be acquired on site in a shallow sea area by traditional acoustics can be solved, and the shallow sea water depth can be obtained more efficiently.
Drawings
FIG. 1 is a schematic flow diagram of the process;
FIG. 2 is a schematic diagram of a relationship between an original remote sensing image and a projected image;
FIG. 3 is a schematic view of object-based depth correction;
FIG. 4 is a diagram illustrating the relationship between projection points, search points, and projection centers;
FIG. 5 is a schematic diagram of a light refraction correction method based on a projected image.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
According to one or more embodiments, a method for acquiring a digital water depth model based on projection images is disclosed. As shown in fig. 1, it comprises the following steps:
generating projection images from acquired original remote sensing images of a shallow water area and a digital surface model whose elevation values are all the water surface elevation;
acquiring an original water depth model of the textured area from the projection images and the water-surface digital surface model by an object-space image matching method;
performing accuracy verification on the obtained original water depth model, performing regional water depth correction for any mismatched regions, and performing single-point water depth correction for any mismatched points;
and generating a digital water depth model of the textured area by mosaicking.
The photographed shallow water area needs to contain both a textured region and an untextured region; the more distinct the texture of the textured region, the greater the advantage for water depth correction.
According to the interior and exterior orientation elements of the left and right original remote sensing images, the original remote sensing images are projected onto a digital surface model whose elevation values are all the water surface elevation, and an orthoimage corresponding to each original remote sensing image is generated by differential correction. To distinguish it from a conventional orthoimage, the orthoimage generated on the water-surface digital surface model is called a "projection image". As shown in fig. 2, the projection image is generated by the following steps:
obtaining an original remote sensing image of the water area, preprocessing it, and determining its exterior orientation elements; the image quality is improved by geometric correction, atmospheric correction, flare elimination and similar techniques; the accuracy of the exterior orientation elements affects the subsequent image matching accuracy, and the more accurate the exterior orientation elements, the more accurate the obtained water depths;
acquiring a digital surface model of the water area corresponding to the original remote sensing image, either generated automatically by photogrammetry or obtained from an open-source website;
acquiring the water surface elevation at the moment of shooting, from tide data or by photogrammetric forward intersection of fixed feature points on the water surface;
setting the elevation values of the digital surface model to the water surface elevation at the moment of shooting, to obtain a digital surface model whose elevation values are all the water surface elevation;
and projecting the original remote sensing image onto this digital surface model, and generating the projection image corresponding to the original remote sensing image by differential correction.
At this time, the correspondence between the feature point coordinates of the obtained projection image and the image points on the original remote sensing image is as follows:

x = x_0 - f * [a_1(X_P - X_S) + b_1(Y_P - Y_S) + c_1(Z_0 - Z_S)] / [a_3(X_P - X_S) + b_3(Y_P - Y_S) + c_3(Z_0 - Z_S)]

y = y_0 - f * [a_2(X_P - X_S) + b_2(Y_P - Y_S) + c_2(Z_0 - Z_S)] / [a_3(X_P - X_S) + b_3(Y_P - Y_S) + c_3(Z_0 - Z_S)]

where (X_P, Y_P) are the feature point coordinates of the projection image and (x, y) are the coordinates of the corresponding image point on the original remote sensing image; a_1, b_1, c_1, a_2, b_2, c_2, a_3, b_3, c_3 are the nine elements of the rotation matrix of the original remote sensing image; (X_S, Y_S, Z_S) are the object-space coordinates of the projection center of the original remote sensing image; x_0, y_0, f are the interior orientation parameters of the camera; and Z_0 is the water surface elevation at the moment the original remote sensing image was acquired.
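For illustration only, a minimal Python sketch of this collinearity relation is given below; the camera parameters, rotation matrix and coordinates are assumed example values and the function name is illustrative, not part of the patent.

```python
import numpy as np

def project_to_image(Xp, Yp, Z0, R, Xs, Ys, Zs, x0, y0, f):
    """Collinearity equation: map a point (Xp, Yp, Z0) on the water-surface
    digital surface model to image coordinates (x, y) on the original
    remote sensing image. R is the 3x3 rotation matrix
    [[a1, b1, c1], [a2, b2, c2], [a3, b3, c3]]."""
    dX, dY, dZ = Xp - Xs, Yp - Ys, Z0 - Zs
    num_x = R[0, 0] * dX + R[0, 1] * dY + R[0, 2] * dZ
    num_y = R[1, 0] * dX + R[1, 1] * dY + R[1, 2] * dZ
    den = R[2, 0] * dX + R[2, 1] * dY + R[2, 2] * dZ
    return x0 - f * num_x / den, y0 - f * num_y / den

# Assumed example: a near-nadir camera 500 m above a water surface at elevation 0 m.
R = np.eye(3)                                   # rotation matrix of the original image
x, y = project_to_image(Xp=105.0, Yp=230.0, Z0=0.0,
                        R=R, Xs=100.0, Ys=200.0, Zs=500.0,
                        x0=0.0, y0=0.0, f=0.1)
print(x, y)                                     # image coordinates of the feature point
```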
Specifically, as shown in fig. 3 and 4, the specific steps of obtaining the original water depth model by object-space image matching are as follows:
determining the position of the textured area on the digital surface model;
estimating an approximate water depth value Z_u of the textured area, and setting a search range and a search interval ΔZ to obtain the water depth search values Z_i:
Z_i = Z_u + i * ΔZ
where i = 1, 2, 3, ..., n;
for each water depth search value Z_i, searching the corresponding projection point pair P_li', P_ri' on the left and right projection images; taking the refractive index n of water into account, the relationship between the projection point P_i'(X_Pi', Y_Pi'), the search point P(X_P, Y_P, Z_Pi) and the projection center S(X_S, Y_S, Z_S) is as follows:
Let D = sqrt((X_P - X_S)^2 + (Y_P - Y_S)^2) be the horizontal distance between the search point and the projection center, let α be the incidence angle of the ray in air and β its refraction angle in water; then

sin α = n * sin β

(Z_S - Z_0) * tan α + (Z_0 - Z_Pi) * tan β = D

X_Pi' = X_S + (Z_S - Z_0) * tan α * (X_P - X_S) / D

Y_Pi' = Y_S + (Z_S - Z_0) * tan α * (Y_P - Y_S) / D

where Z_0 is the water surface elevation. Thus, for each water depth search value Z_i (i = 1, 2, 3, ..., n), the corresponding projection point coordinates (X_Pi', Y_Pi') can be found on the left and right projection images, as illustrated in the sketch following these steps;
calculating the matching measure between the projection point pairs, and taking the water depth search value corresponding to the projection point pair with the highest matching measure as the optimal water depth value of the feature point;
and acquiring the digital water depth model of the textured area by a multipoint image matching method.
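As a rough, non-authoritative sketch of the object-space depth search described above (not the patent's own implementation), the following Python code projects a candidate underwater point to its water-surface crossing for each camera using Snell's law, compares the two projection images at those points, and keeps the candidate depth with the highest matching measure. The refractive index, the geometry and the `sample_window` callback are assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import brentq   # root finder used to solve the refraction geometry

N_WATER = 1.34                      # assumed refractive index of sea water

def surface_crossing(P, S, z_water, n=N_WATER):
    """Water-surface point where the ray between underwater point P and camera
    centre S crosses the surface, obeying Snell's law: solve for the air
    incidence angle alpha such that
    (Zs - Z0)*tan(alpha) + (Z0 - Zp)*tan(beta) = horizontal distance,
    with sin(alpha) = n * sin(beta). Assumes S lies above the water surface."""
    D = np.hypot(P[0] - S[0], P[1] - S[1])
    if D < 1e-9:
        return np.array([S[0], S[1], z_water])
    def residual(alpha):
        beta = np.arcsin(np.sin(alpha) / n)
        return (S[2] - z_water) * np.tan(alpha) + (z_water - P[2]) * np.tan(beta) - D
    alpha = brentq(residual, 1e-9, np.arctan(D / (S[2] - z_water)))
    d_air = (S[2] - z_water) * np.tan(alpha)
    u = np.array([P[0] - S[0], P[1] - S[1]]) / D      # horizontal unit direction S -> P
    return np.array([S[0] + d_air * u[0], S[1] + d_air * u[1], z_water])

def ncc(a, b):
    """Normalized cross-correlation between two equally sized image windows."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def best_depth(XP, YP, Zu, dZ, n_steps, z_water, S_left, S_right, sample_window):
    """Try candidate depths Zi = Zu + i*dZ (i = 1..n_steps) and return the depth
    whose left/right projection-image windows agree best (highest matching measure).
    `sample_window(side, point)` is an assumed callback returning the grey-value
    window of the given projection image around the water-surface point."""
    best_z, best_score = None, -np.inf
    for i in range(1, n_steps + 1):
        Zi = Zu + i * dZ
        P = np.array([XP, YP, Zi])
        pl = surface_crossing(P, S_left, z_water)     # projection point, left image
        pr = surface_crossing(P, S_right, z_water)    # projection point, right image
        score = ncc(sample_window("left", pl), sample_window("right", pr))
        if score > best_score:
            best_z, best_score = Zi, score
    return best_z

# Dummy usage (synthetic 5x5 windows stand in for real projection-image samples):
dummy = lambda side, p: np.outer(np.arange(5.0), np.arange(5.0)) + p[0] + p[1]
print(best_depth(XP=10.0, YP=20.0, Zu=-5.0, dZ=-0.5, n_steps=10, z_water=0.0,
                 S_left=np.array([0.0, 0.0, 500.0]),
                 S_right=np.array([50.0, 0.0, 500.0]),
                 sample_window=dummy))
```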
The matching measure between the projection point pairs may be computed with a multipoint image matching method, such as multipoint least squares matching, dynamic programming matching, the relaxation method or semi-global matching; here the relaxation method is taken as an example, as shown in the following formulas.
P^(n+1)(i, j) = P^(n)(i, j) * [1 + q^(n)(i, j)] / Σ_j { P^(n)(i, j) * [1 + q^(n)(i, j)] }

where:

q^(n)(i, j) = Σ_(h,k) C(i, j; h, k) * P^(n)(h, k)

Here (i, j) denotes a target point i and a candidate conjugate point j on the other image, and (h, k) denotes a neighbouring target point h and its candidate point k. P(i, j) is the match probability (correlation) between target point i and candidate conjugate point j, and C(i, j; h, k) is the compatibility measure between the assignment (i, j) and the assignment (h, k). In each iteration of the relaxation method, the probabilities P(i, j) between the candidate points and the target points are updated, yielding P^(1)(i, j), P^(2)(i, j), ...; q^(n)(i, j) is the probability increment between the matching point and the candidate point at the n-th iteration.
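As an illustration only (the patent gives no implementation), the following Python sketch performs the relaxation update above on a matrix of match probabilities P(i, j) with an assumed compatibility array C; the sizes and values are example assumptions.

```python
import numpy as np

def relaxation_step(P, compat):
    """One iteration of relaxation labelling for image matching.

    P[i, j]            -- current probability that target point i matches candidate j
    compat[i, j, h, k] -- compatibility C(i, j; h, k) between assignment (i, j)
                          and a neighbouring assignment (h, k)
    Returns the updated, row-normalised probability matrix."""
    # q(n)(i, j) = sum over neighbouring assignments (h, k) of C(i,j;h,k) * P(h,k)
    q = np.einsum('ijhk,hk->ij', compat, P)
    P_new = P * (1.0 + q)
    P_new /= P_new.sum(axis=1, keepdims=True)   # renormalise per target point
    return P_new

# Tiny assumed example: 3 target points, 3 candidates each, random compatibilities.
rng = np.random.default_rng(0)
P = np.full((3, 3), 1.0 / 3.0)                  # start from uniform probabilities
compat = rng.uniform(0.0, 1.0, size=(3, 3, 3, 3))
for _ in range(5):                              # iterate P1, P2, ... as in the text
    P = relaxation_step(P, compat)
print(P)                                        # rows sum to 1; peaks mark best matches
```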
Accuracy verification is performed on the original water depth model acquired for the textured area to judge whether it meets the accuracy standard. If it does, the digital water depth model of the textured area is output; if not, the mismatched points and mismatched regions on the original water depth model are located and corrected:
specifically, the accuracy verification of the obtained original water depth model and the search for mismatched regions or mismatched points can be carried out with software such as ER Mapper or ArcGIS; for clarity, the operation steps are as follows:
obtaining several sets of original water depth model data for a target area from the same target area appearing in several stereo pairs;
and overlaying the obtained sets of original water depth model data for display and checking them: if the several original water depth models place the same target at the same position, the obtained original water depth models are correct and the water depth of the target area is correct; if they place the same target at different positions, the obtained original water depth models are wrong and the water depth of the target area is wrong, as in the overlay sketch below.
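The overlay check can be illustrated with the following Python sketch, which is not the ER Mapper or ArcGIS workflow itself: it stacks several gridded original water depth models of the same target area and flags grid cells where they disagree by more than an assumed tolerance as candidate mismatched regions.

```python
import numpy as np

def flag_mismatches(depth_models, tol=0.5):
    """depth_models: list of equally sized 2-D arrays (one original water depth
    model per stereo pair, in metres). Returns a boolean mask that is True where
    the models disagree by more than `tol`, i.e. candidate mismatched cells."""
    stack = np.stack(depth_models)                  # shape: (n_models, rows, cols)
    spread = stack.max(axis=0) - stack.min(axis=0)  # disagreement between the models
    return spread > tol

# Assumed example: three 4x4 depth grids that agree except in one corner cell.
base = -np.ones((4, 4)) * 3.0
models = [base.copy(), base.copy(), base.copy()]
models[2][0, 0] = -6.0                              # one model disagrees here
mask = flag_mismatches(models, tol=0.5)
print(mask)     # True at (0, 0): that cell needs regional or single-point correction
```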
The existing mismatched points are corrected as follows:
as shown in fig. 5, the single-point water depth values of the textured area are corrected with a light refraction correction method based on the projection images;
searching the mismatched points to be corrected on the left and right projection images;
setting a lowest water depth value Z_min that is lower than the lowest water depth value of the current area;
and computing the plane coordinates at which a mismatched point on the projection image corresponds to the lowest water depth value:

X_pi = X_pi' + (Z_0 - Z_min) * tan β * (X_pi' - X_S) / D

Y_pi = Y_pi' + (Z_0 - Z_min) * tan β * (Y_pi' - Y_S) / D

Z_pi = Z_min

where (X_pi, Y_pi, Z_pi) are the sought coordinates, (X_pi', Y_pi') are the coordinates of the mismatched point, Z_min is the lowest water depth value, D = sqrt((X_pi' - X_S)^2 + (Y_pi' - Y_S)^2), and

sin α = n * sin β, with tan α = D / (Z_S - Z_0)

where β is the refraction angle of the light ray in water and α is the incidence angle of the light ray in air;
determining the corresponding points of the left and right mismatched points of the projection images on the lowest horizontal plane to obtain the refracted rays of the left and right mismatched points, and judging the two refracted rays to determine the true water depth value:
if the two refracted rays intersect, the intersection point gives the true water depth value of the mismatched point; if the two refracted rays do not intersect, the point with the shortest distance to both refracted rays is taken as the true water depth value of the mismatched point.
If there are more than two rays, the point with the shortest total distance to all of the rays is taken as the target point, as in the sketch below.
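The intersection test of the two refracted rays can be illustrated with the following Python sketch (assumed example geometry, not data from the embodiment): it returns the intersection of two rays if they meet, and otherwise the midpoint of the shortest segment between them, whose elevation is then taken as the true water depth value.

```python
import numpy as np

def intersect_or_closest_point(p1, d1, p2, d2):
    """Point determined by two rays r_i(t) = p_i + t * d_i.
    If the rays intersect, the intersection is returned; otherwise the midpoint
    of the shortest segment between them is returned (taken here as the true
    water depth point of a mismatched projection point)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:            # rays are parallel: fix t1 = 0, project onto ray 2
        t1, t2 = 0.0, e / c
    else:
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom
    q1, q2 = p1 + t1 * d1, p2 + t2 * d2   # closest points on each ray
    return (q1 + q2) / 2.0                # equals the intersection when the rays meet

# Assumed example: two refracted rays entering the water at different surface points.
p_left, d_left = np.array([0.0, 0.0, 0.0]), np.array([0.3, 0.0, -1.0])
p_right, d_right = np.array([3.0, 0.0, 0.0]), np.array([-0.3, 0.0, -1.0])
point = intersect_or_closest_point(p_left, d_left, p_right, d_right)
print(point)   # the z component is the corrected (true) water depth value
```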
For a mismatched region, the regional water depth correction can be performed by reducing or enlarging the matching window or by switching to an alternative image matching method.
The above embodiment is only intended to explain the present invention and does not limit it. After reading this specification, those skilled in the art may modify the embodiment as needed without making an inventive contribution, and such modifications remain protected by patent law within the scope of the claims of the present invention.

Claims (6)

1. A method for obtaining a digital water depth model based on projection images, characterized by comprising the following steps:
generating projection images from acquired original remote sensing images of a shallow water area and a digital surface model whose elevation values are all the water surface elevation;
acquiring an original water depth model of the textured area from the projection images and the water-surface digital surface model by an object-space image matching method;
performing accuracy verification on the obtained original water depth model, performing regional water depth correction for any mismatched regions, and performing single-point water depth correction for any mismatched points;
and generating a digital water depth model of the textured area by mosaicking.
2. The method of claim 1, wherein the projection image is generated by the steps of:
obtaining an original remote sensing image of a water area, preprocessing it, and determining its exterior orientation elements;
acquiring a surface model of the water area corresponding to the original remote sensing image;
acquiring the water surface elevation at the moment of shooting, and setting the elevation values of the surface model to that water surface elevation to obtain a digital surface model whose elevation values are all the water surface elevation;
and projecting the original remote sensing image onto the digital surface model whose elevation values are all the water surface elevation, and generating the projection image corresponding to the original remote sensing image by differential correction.
3. The method for acquiring a digital water depth model based on projection images as claimed in claim 2, wherein: the correspondence between the feature point coordinates of the obtained projection image and the image points on the original remote sensing image is as follows:

x = x_0 - f * [a_1(X_P - X_S) + b_1(Y_P - Y_S) + c_1(Z_0 - Z_S)] / [a_3(X_P - X_S) + b_3(Y_P - Y_S) + c_3(Z_0 - Z_S)]

y = y_0 - f * [a_2(X_P - X_S) + b_2(Y_P - Y_S) + c_2(Z_0 - Z_S)] / [a_3(X_P - X_S) + b_3(Y_P - Y_S) + c_3(Z_0 - Z_S)]

where (X_P, Y_P) are the feature point coordinates of the projection image and (x, y) are the coordinates of the corresponding image point on the original remote sensing image; a_1, b_1, c_1, a_2, b_2, c_2, a_3, b_3, c_3 are the nine elements of the rotation matrix of the original remote sensing image; (X_S, Y_S, Z_S) are the object-space coordinates of the projection center of the original remote sensing image; x_0, y_0, f are the interior orientation parameters of the camera; and Z_0 is the water surface elevation at the moment the original remote sensing image was acquired.
4. The method for obtaining a digital water depth model based on projection images as claimed in claim 3, wherein the object-space image matching method for obtaining the original water depth model of the textured area comprises the following steps:
determining the position of the textured area on the digital surface model;
estimating an approximate water depth value Z_u of the textured area, and setting a search range and a search interval ΔZ to obtain the water depth search values Z_i:
Z_i = Z_u + i * ΔZ
where i = 1, 2, 3, ..., n;
for each water depth search value Z_i, searching the corresponding projection point pair P_li', P_ri' on the left and right projection images; taking the refractive index n of water into account, the relationship between the projection point P_i'(X_Pi', Y_Pi'), the search point P(X_P, Y_P, Z_Pi) and the projection center S(X_S, Y_S, Z_S) is as follows:

sin α = n * sin β

(Z_S - Z_0) * tan α + (Z_0 - Z_Pi) * tan β = D, with D = sqrt((X_P - X_S)^2 + (Y_P - Y_S)^2)

X_Pi' = X_S + (Z_S - Z_0) * tan α * (X_P - X_S) / D

Y_Pi' = Y_S + (Z_S - Z_0) * tan α * (Y_P - Y_S) / D

where α is the incidence angle of the ray in air, β is its refraction angle in water and Z_0 is the water surface elevation; for each water depth search value Z_i (i = 1, 2, 3, ..., n), the corresponding projection point coordinates (X_Pi', Y_Pi') can thus be found on the left and right projection images;
calculating the matching measure between the projection point pairs, and taking the water depth search value corresponding to the projection point pair with the highest matching measure as the optimal water depth value of the feature point;
and obtaining the original water depth model of the textured area by a multipoint image matching method.
5. The method for obtaining a digital water depth model based on projection images of claim 4, wherein the mismatched points are corrected as follows:
the single-point water depth values of the textured area are corrected with a light refraction correction method based on the projection images;
searching the mismatched points to be corrected on the left and right projection images;
setting a lowest water depth value Z_min that is lower than the lowest water depth value of the current area;
and computing the plane coordinates at which a mismatched point on the projection image corresponds to the lowest water depth value:

X_pi = X_pi' + (Z_0 - Z_min) * tan β * (X_pi' - X_S) / D

Y_pi = Y_pi' + (Z_0 - Z_min) * tan β * (Y_pi' - Y_S) / D

Z_pi = Z_min

where (X_pi, Y_pi, Z_pi) are the sought coordinates, (X_pi', Y_pi') are the coordinates of the mismatched point, Z_min is the lowest water depth value, D = sqrt((X_pi' - X_S)^2 + (Y_pi' - Y_S)^2), and

sin α = n * sin β, with tan α = D / (Z_S - Z_0)

where β is the refraction angle of the light ray in water and α is the incidence angle of the light ray in air;
determining the corresponding points of the left and right mismatched points of the projection images on the lowest horizontal plane to obtain the refracted rays of the left and right mismatched points, and judging the two refracted rays to determine the true water depth value:
if the two refracted rays intersect, the intersection point gives the true water depth value of the mismatched point; if the two refracted rays do not intersect, the point with the shortest distance to both refracted rays is taken as the true water depth value of the mismatched point.
6. The method for acquiring a digital water depth model based on projection images of claim 5, wherein: for an existing mismatched region, the regional water depth correction is performed by reducing or enlarging the matching window or by substituting a different image matching method.
CN202010580001.3A 2020-06-23 2020-06-23 Method for obtaining digital water depth model based on projection image Pending CN111693028A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010580001.3A CN111693028A (en) 2020-06-23 2020-06-23 Method for obtaining digital water depth model based on projection image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010580001.3A CN111693028A (en) 2020-06-23 2020-06-23 Method for obtaining digital water depth model based on projection image

Publications (1)

Publication Number Publication Date
CN111693028A true CN111693028A (en) 2020-09-22

Family

ID=72483376

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010580001.3A Pending CN111693028A (en) 2020-06-23 2020-06-23 Method for obtaining digital water depth model based on projection image

Country Status (1)

Country Link
CN (1) CN111693028A (en)


Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1412524A (en) * 2002-11-28 2003-04-23 武汉大学 Method for measuring formation of seamless space stereomodel
CN101604018A (en) * 2009-07-24 2009-12-16 中国测绘科学研究院 High-definition remote sensing image data disposal route and system thereof
CN102073874A (en) * 2010-12-29 2011-05-25 中国资源卫星应用中心 Geometric constraint-attached spaceflight three-line-array charged coupled device (CCD) camera multi-image stereo matching method
CN103630120A (en) * 2013-07-16 2014-03-12 中国人民解放军信息工程大学 Mars surface linear array image epipolar ray resampling method based on strict geometric model
CN103530904A (en) * 2013-11-04 2014-01-22 东南大学 Method for establishing underwater landform digital elevation based on Kriging method
CN104764445A (en) * 2015-04-20 2015-07-08 中测新图(北京)遥感技术有限责任公司 Method and device for determining coordinates of underwater object point
CN105445751A (en) * 2015-11-18 2016-03-30 国家海洋局第一海洋研究所 Shallow water area water depth ratio remote sensing inversion method
CN105716581A (en) * 2016-02-15 2016-06-29 中测新图(北京)遥感技术有限责任公司 Underwater object point coordinate determination method and device based on double-medium photography technology
CN107705272A (en) * 2017-11-21 2018-02-16 桂林航天工业学院 A kind of high-precision geometric correction method of aerial image
CN108919319A (en) * 2018-05-15 2018-11-30 中国人民解放军战略支援部队信息工程大学 Sea island reef satellite image Pillarless caving localization method and system
CN110675450A (en) * 2019-09-06 2020-01-10 武汉九州位讯科技有限公司 Method and system for generating orthoimage in real time based on SLAM technology
CN110779498A (en) * 2019-09-19 2020-02-11 中国科学院测量与地球物理研究所 Shallow river water depth mapping method and system based on unmanned aerial vehicle multi-viewpoint photography
CN111121729A (en) * 2020-01-11 2020-05-08 黄文超 Optical remote sensing satellite internal calibration method and system based on flat terrain

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
徐笑 et al.: "Shallow-water bathymetry method based on overlapping orthoimages" (基于重叠正射影像的浅水水深测量方法), 《海洋测绘》 (Hydrographic Surveying and Charting) *
曹彬才 et al.: "Refraction correction method for two-medium photogrammetry" (双介质摄影测量的折射改正方法), 《哈尔滨工程大学学报》 (Journal of Harbin Engineering University) *
王炜杰 et al.: "Accuracy analysis of underwater intersection from multi-overlap images with ultra-short baselines" (超短基线多度重叠影像的水下交会精度分析), 《海洋测绘》 (Hydrographic Surveying and Charting) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112348951A (en) * 2020-11-30 2021-02-09 长春工程学院 Digital elevation data reconstruction method for heterogeneous remote sensing image content
CN112348951B (en) * 2020-11-30 2022-08-26 长春工程学院 Digital elevation data reconstruction method for heterogeneous remote sensing image content
SE2100063A1 (en) * 2021-04-15 2022-10-16 Saab Ab A method, software product, and system for determining a position and orientation in a 3D reconstruction of the Earth´s surface
WO2022220729A1 (en) * 2021-04-15 2022-10-20 Saab Ab A method, software product, and system for determining a position and orientation in a 3d reconstruction of the earth's surface
SE544823C2 (en) * 2021-04-15 2022-12-06 Saab Ab A method, software product, and system for determining a position and orientation in a 3D reconstruction of the Earth´s surface
US12000703B2 (en) 2021-04-15 2024-06-04 Saab Ab Method, software product, and system for determining a position and orientation in a 3D reconstruction of the earth's surface
CN113175917A (en) * 2021-04-27 2021-07-27 天津水运工程勘察设计院有限公司 Method for measuring topography of coastal shallow water area by using low-altitude unmanned machine

Similar Documents

Publication Publication Date Title
CN111693028A (en) Method for obtaining digital water depth model based on projection image
CN110807809B (en) Light-weight monocular vision positioning method based on point-line characteristics and depth filter
CN101887589B (en) Stereoscopic vision-based real low-texture image reconstruction method
CN109341668B (en) Multi-camera measuring method based on refraction projection model and light beam tracking method
CN111709985B (en) Underwater target ranging method based on binocular vision
CN113658337B (en) Multi-mode odometer method based on rut lines
CN113137920A (en) Underwater measurement equipment and underwater measurement method
CN104574387A (en) Image processing method in underwater vision SLAM system
Bleier et al. Low-cost 3d laser scanning in air orwater using self-calibrating structured light
CN110766669A (en) Pipeline measuring method based on multi-view vision
CN110966956A (en) Binocular vision-based three-dimensional detection device and method
CN112509065B (en) Visual guidance method applied to deep sea mechanical arm operation
CN113008158B (en) Multi-line laser tire pattern depth measuring method
Fan et al. Underwater optical 3-D reconstruction of photometric stereo considering light refraction and attenuation
CN114608522B (en) Obstacle recognition and distance measurement method based on vision
CN115359127A (en) Polarization camera array calibration method suitable for multilayer medium environment
CN110487254B (en) Rapid underwater target size measuring method for ROV
CN113393413B (en) Water area measuring method and system based on monocular and binocular vision cooperation
CN113096047A (en) Geometric fine correction method and system for generalized cloud driving and radiation cooperative remote sensing image
CN115790539B (en) Cooperative target underwater photogrammetry method
CN110533702B (en) Underwater binocular vision stereo matching method based on optical field multilayer refraction model
CN115200505A (en) Muddy water three-dimensional point cloud measuring method based on infrared diffraction light spots and binocular vision
CN103411535B (en) A kind of Changeable weight picture point localization method for retro-reflective target
Ferreira et al. Using stereo image reconstruction to survey scale models of rubble-mound structures
CN117765048B (en) Cross-modal fusion-based underwater target three-dimensional registration method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200922