CN113124835A - Multi-lens photogrammetric data processing device for unmanned aerial vehicle - Google Patents

Multi-lens photogrammetric data processing device for unmanned aerial vehicle

Info

Publication number
CN113124835A
Authority
CN
China
Prior art keywords
image data
module
image
information
aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110437022.4A
Other languages
Chinese (zh)
Inventor
黄海锋
李宁
丁永祥
闫少霞
文述生
王江林
陈婉
周光海
肖浩威
马原
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South GNSS Navigation Co Ltd
Original Assignee
South GNSS Navigation Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South GNSS Navigation Co Ltd filed Critical South GNSS Navigation Co Ltd
Priority to CN202110437022.4A
Publication of CN113124835A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04: Interpretation of pictures
    • G01C 11/30: Interpretation of pictures by triangulation
    • G01C 11/34: Aerial triangulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a multi-lens photogrammetric data processing device for an unmanned aerial vehicle. By scanning the storage space of each lens, determining from the POS information contained in the image data the elevation at which the unmanned aerial vehicle captured the image data and the relative position between the unmanned aerial vehicle and the measurement target, and automatically and quickly screening out and removing useless image data according to that elevation and relative position, the device improves the efficiency and reliability of image data sorting and reduces the personnel effort and error probability in measurement work. It thereby solves the technical problem that invalid data obtained by existing photogrammetry must be screened and sorted manually from a large amount of useless data, which limits the measurement efficiency of photogrammetry.

Description

Multi-lens photogrammetric data processing device for unmanned aerial vehicle
Technical Field
The present application relates to the technical field of unmanned aerial vehicle surveying, and in particular to an unmanned aerial vehicle image data processing device.
Background
Photogrammetry is a technique for measuring the shape, size and spatial position of a target object from images captured by a camera module, and aerial triangulation is one of its core methods. In current aerial triangulation, an unmanned aerial vehicle typically carries a digital camera to acquire digital images, and the computing power of a computer is used to generate a DOM (digital orthophoto map), a DEM (digital elevation model) and a DSM (digital surface model). Since around 2015, oblique five-lens cameras have additionally been used to capture the side textures of ground objects: a computer extracts homonymous (tie) points from the photographs and performs feature-point matching, a triangulated network is constructed after dense point clouds are generated, texture mapping is then performed, and finally a 3D real-scene model is generated.
The general working logic of current survey-grade oblique multi-lens photography is as follows. The unmanned aerial vehicle carries a multi-lens oblique photography device and photographs ground objects along a designated flight route; the flight control system of the unmanned aerial vehicle triggers the shutters of the multiple lenses through a signal line, and, exploiting the fact that the camera hot shoe responds simultaneously with the camera shutter, a shooting-success feedback signal line is connected from the hot shoe to the flight control system, which records the time point at which the shutter fires. Each lens then stores its successfully captured pictures in an automatically incrementing sequence; when a folder reaches a certain number of pictures, a new folder is created automatically and the picture names are numbered again from 1.
In practice, the pictures taken during one measurement task often contain a large number of invalid images, and to ensure measurement accuracy a large amount of useless data must be screened and sorted manually, which severely limits the measurement efficiency of photogrammetry.
Disclosure of Invention
The present application provides a multi-lens photogrammetric data processing device for an unmanned aerial vehicle, to solve the technical problem that invalid data obtained by existing photogrammetry must be screened and sorted manually from a large amount of useless data, which limits the measurement efficiency of photogrammetry.
In view of this, the present application provides a multi-lens photogrammetric data processing device for an unmanned aerial vehicle, comprising:
a storage space scanning module, configured to acquire image data stored in an unmanned aerial vehicle storage space, wherein the unmanned aerial vehicle storage space comprises a plurality of storage subspaces and each storage subspace stores the image data captured by one corresponding lens module;
a POS information processing module, configured to extract POS information contained in the image data;
an image elevation information determining module, configured to determine, from the POS information, the image elevation information corresponding to the image data;
a first invalid image identification module, configured to compare the image elevation information with a terrain-following flight elevation threshold and to judge image data whose image elevation information is lower than the terrain-following flight elevation threshold as invalid image data;
a second invalid image recognition module, configured to compare the POS information with preset position information of a measurement target, determine the relative orientation between the unmanned aerial vehicle and the measurement target at the time the image data was captured, and, according to that relative orientation and the real-time shooting range of each lens module in the unmanned aerial vehicle, judge image data that does not contain the measurement target as invalid image data;
and an invalid image eliminating module, configured to eliminate the invalid image data.
Preferably, the device further comprises:
an image data grouping module, configured to compare the shooting time of the image data with the sortie time and to group image data belonging to the same sortie into the same group.
Preferably, the device further comprises:
an image data classification module, configured to determine, from the storage path identifier of the image data and the correspondence between storage subspaces and lens modules, the lens module corresponding to the image data, so that image data captured by the same lens module is classified into the same category.
Preferably, the device further comprises:
a missed-shot detection module, configured to compare the shooting times of the individual image data laterally and to obtain a missed-shot detection result by comparing the number of image data sharing the same shooting time with the number of lens modules.
Preferably, the device further comprises:
a coordinate offset determination module, configured to sort the image data by shooting time, determine from the sorting result the POS information to be evaluated and its adjacent POS information, generate a flight path from the flight attitude information of the POS information to be evaluated and the adjacent POS information, and calculate the deviation between the coordinate of the POS information to be evaluated and the flight path, so as to obtain the coordinate offset determination result for the POS information to be evaluated from the deviation value.
Preferably, the device further comprises:
a POS information filling module, configured to, when missing POS information is detected, fill in the missing POS information based on the POS information adjacent to it, in combination with the flight direction of the unmanned aerial vehicle, the photographing time interval and the average flight speed of the unmanned aerial vehicle.
Preferably, the device further comprises:
an image tilt angle adjusting module, configured to rotate the image data into a forward-oriented image according to image feature points extracted from the image data and the perspective principle.
Preferably, the device further comprises:
a thumbnail generation module, configured to resample the image data and generate a thumbnail corresponding to the image data.
According to the technical solutions above, the present application has the following advantages:
By scanning the storage space of each lens, determining from the POS information contained in the image data the elevation at which the unmanned aerial vehicle captured the image data and the relative position between the unmanned aerial vehicle and the measurement target, and automatically and quickly screening out and removing useless image data on that basis, the present application improves the efficiency and reliability of image data sorting, reduces the personnel effort and error probability in measurement work, and solves the technical problem that invalid data obtained by existing photogrammetry must be screened and sorted manually from a large amount of useless data, which limits the measurement efficiency of photogrammetry.
Drawings
To illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of a first embodiment of an apparatus for processing data of multiple-lens photogrammetry of an unmanned aerial vehicle according to the present application;
fig. 2 is a schematic structural diagram of a second embodiment of an apparatus for processing multiple-lens photogrammetric data of an unmanned aerial vehicle according to the present application.
Detailed Description
The embodiments of the present application provide a multi-lens photogrammetric data processing device for an unmanned aerial vehicle, to solve the technical problem that invalid data obtained by existing photogrammetry must be screened and sorted manually from a large amount of useless data, which limits the measurement efficiency of photogrammetry.
In order to make the objects, features and advantages of the present invention more apparent and understandable, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the embodiments described below are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to Fig. 1, a first embodiment of the present application provides a multi-lens photogrammetric data processing device for an unmanned aerial vehicle, comprising:
A storage space scanning module 101, configured to acquire the image data stored in the unmanned aerial vehicle storage space, wherein the unmanned aerial vehicle storage space comprises a plurality of storage subspaces and each storage subspace stores the image data captured by one corresponding lens module.
A POS information processing module 102 for extracting POS information contained in the image data.
In general, POS data mainly comprise GPS data, usually denoted X, Y and Z, which give the geographical position of the aircraft at the moment of exposure during flight, and IMU data, which mainly comprise the heading angle, pitch angle and roll angle (commonly denoted Kappa, Phi and Omega, respectively).
It should be noted that, in the present application, the photographing time point of each image output by the hardware system of the unmanned aerial vehicle, the high-precision position information (POS data) recorded by the GNSS, and the images stored according to the camera's own logic are used to quickly separate the image data of each sortie, align the images with their corresponding photographing time points, and write the position information of each photographing point into the image data, so that the image data can be used directly in the subsequent aerial triangulation.
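For illustration only, the following Python sketch shows one way this time alignment could be implemented, assuming the exposure events (POS records) and the per-image shooting times have already been parsed; the PosRecord and ImageRecord structures, their field names and the 0.5 s matching tolerance are assumptions made for the sketch, not details disclosed by the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PosRecord:
    """One exposure event from the flight controller / GNSS log (hypothetical layout)."""
    timestamp: float   # exposure time in seconds
    x: float           # easting / longitude
    y: float           # northing / latitude
    z: float           # elevation
    kappa: float       # heading angle
    phi: float         # pitch angle
    omega: float       # roll angle

@dataclass
class ImageRecord:
    path: str
    shot_time: float                 # shooting time read from the image file
    pos: Optional[PosRecord] = None  # filled in after alignment

def attach_pos(images: List[ImageRecord], pos_list: List[PosRecord],
               tolerance: float = 0.5) -> None:
    """Attach to each image the POS record whose timestamp is closest to the
    image's shooting time, provided the gap is within `tolerance` seconds."""
    if not pos_list:
        return
    for img in images:
        best = min(pos_list, key=lambda p: abs(p.timestamp - img.shot_time))
        if abs(best.timestamp - img.shot_time) <= tolerance:
            img.pos = best
```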
And the image elevation information determining module 103 is configured to determine image elevation information corresponding to the image data according to the POS information.
And the first invalid image identification module 104 is configured to compare the image elevation information with a terrain-following flight elevation threshold, and to judge image data whose image elevation information is lower than the terrain-following flight elevation threshold as invalid image data.
It should be noted that, to verify that the cameras are working normally, trial shots are taken on the ground before take-off and again after landing, before shutdown. The software automatically calculates the terrain-following flight elevation threshold from the average elevation of the POS points, and image data whose POS elevation is lower than this threshold is judged to be invalid image data and removed automatically. POS data whose elevation varies during terrain-following flight is therefore unaffected.
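A minimal sketch of the elevation screening, assuming the ground trial shots form the low-elevation portion of the POS data; the rule of subtracting a fixed margin from the average POS elevation is a hypothetical stand-in for the patent's unspecified threshold calculation.

```python
def terrain_following_threshold(elevations, ground_margin=15.0):
    """Derive an elevation threshold from the average POS elevation
    (hypothetical rule: mean minus a margin; the patent only states that
    the threshold is computed from the average elevation of the POS points)."""
    mean_z = sum(elevations) / len(elevations)
    return mean_z - ground_margin

def split_by_elevation(images, threshold):
    """Images whose POS elevation lies below the threshold (ground trial
    shots) are treated as invalid; everything else is kept."""
    valid, invalid = [], []
    for img in images:
        if img.pos is not None and img.pos.z < threshold:
            invalid.append(img)
        else:
            valid.append(img)
    return valid, invalid
```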
And the second invalid image recognition module 105 is used for comparing the POS information with preset position information of the measurement target, determining the relative orientation between the unmanned aerial vehicle and the measurement target when the image data is shot, and judging the image data which does not contain the measurement target as invalid image data according to the relative orientation and the real-time shooting range of each lens module in the unmanned aerial vehicle.
Meanwhile, since the oblique lenses in the multi-lens camera are used to photograph the side details of all ground objects within the survey area, when ground objects are photographed from outside the survey area towards the survey area, only the image data of one lens is valid. The second invalid image recognition module of the present application therefore compares the POS information with the preset position information of the measurement target, determines the relative orientation between the unmanned aerial vehicle and the measurement target at the time the image data was captured, and, according to that relative orientation combined with the real-time shooting range of each lens module in the unmanned aerial vehicle, judges image data that does not contain the measurement target to be invalid image data.
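The orientation check could look roughly like the following sketch, which reduces the problem to a 2-D test: the bearing from the exposure position to the measurement target is compared with each lens's pointing direction and horizontal field of view. The lens mounting offsets, the field-of-view value and the `OFFSETS`/`TARGET` names in the usage comment are placeholders; the patent does not disclose the concrete geometry.

```python
import math

def bearing_deg(x1, y1, x2, y2):
    """Bearing from point 1 to point 2 in degrees, clockwise from north
    (x is assumed to be easting, y northing)."""
    return math.degrees(math.atan2(x2 - x1, y2 - y1)) % 360.0

def target_in_view(pos, lens_heading_offset_deg, target_xy,
                   horizontal_fov_deg=75.0):
    """Rough test of whether the measurement target can appear in a lens's
    image, given the exposure position/heading and the lens's fixed mounting
    offset; all numeric values here are illustrative assumptions."""
    to_target = bearing_deg(pos.x, pos.y, target_xy[0], target_xy[1])
    lens_heading = (pos.kappa + lens_heading_offset_deg) % 360.0
    diff = abs((to_target - lens_heading + 180.0) % 360.0 - 180.0)
    return diff <= horizontal_fov_deg / 2.0

# Usage sketch: images from a lens that cannot see the target are flagged invalid.
# invalid = [img for img in images
#            if img.pos and not target_in_view(img.pos, OFFSETS[img.lens], TARGET)]
```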
And an invalid image eliminating module 106, configured to eliminate the invalid image data.
The image data judged to be invalid is then removed automatically. Screening out and removing useless image data automatically and quickly improves the efficiency and reliability of image data sorting, reduces the personnel effort and error probability in the measurement work, and reduces the amount of image data that subsequently participates in aerial triangulation, which improves computation efficiency. This solves the technical problem that invalid data obtained by conventional photogrammetry must be screened and sorted manually from a large amount of useless data, which limits the measurement efficiency of photogrammetry.
The above is a detailed description of the first embodiment of the multi-lens photogrammetric data processing device for an unmanned aerial vehicle provided by the present application; a detailed description of a second embodiment, built on the first embodiment, follows.
Referring to Fig. 2, a second embodiment of the present application provides a multi-lens photogrammetric data processing device for an unmanned aerial vehicle.
On the basis of the first embodiment, the device further comprises:
An image data grouping module 107, configured to compare the shooting time of the image data with the sortie time and to group image data belonging to the same sortie into the same group.
Further comprising:
An image data classification module 108, configured to determine, from the storage path identifier of the image data and the correspondence between storage subspaces and lens modules, the lens module corresponding to the image data, so that image data captured by the same lens module is classified into the same category.
It should be noted that the device provided by this embodiment also classifies and groups image data automatically: through the processing of the image data grouping module and the image data classification module, the image data is grouped by unmanned aerial vehicle sortie, and the image data within each group can be further classified by the lens module that captured it, which further improves the efficiency of image data sorting.
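A rough sketch of both steps, assuming each lens writes into its own sub-directory beneath the drone storage root and that the sortie time windows are available from the flight log; the directory layout, the lens identifier derivation and the window format are assumptions.

```python
from collections import defaultdict
from pathlib import Path

def classify_by_lens(image_paths, storage_root):
    """Map each image to its lens module via its storage sub-directory
    (assumption: one sub-directory per lens, e.g. <root>/CAM1/...)."""
    by_lens = defaultdict(list)
    root = Path(storage_root)
    for p in map(Path, image_paths):
        lens_id = p.relative_to(root).parts[0]
        by_lens[lens_id].append(p)
    return by_lens

def group_by_sortie(images, sortie_windows):
    """Assign each image to the sortie whose [start, end] time window
    contains its shooting time; `sortie_windows` would come from the
    flight log of each take-off."""
    groups = {i: [] for i in range(len(sortie_windows))}
    for img in images:
        for i, (start, end) in enumerate(sortie_windows):
            if start <= img.shot_time <= end:
                groups[i].append(img)
                break
    return groups
```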
Further comprising:
A missed-shot detection module 109, configured to compare the shooting times of the individual image data laterally and to obtain a missed-shot detection result by comparing the number of image data sharing the same shooting time with the number of lens modules.
It should be noted that, exploiting the synchronism of the photographing times, the photographing times of the photographs from each lens are compared laterally to identify whether any lens missed a shot; when the missed-shot detection result indicates that an image is missing, a missed-image marker can be inserted.
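A minimal sketch of the missed-shot check: images are bucketed by shooting time and every exposure event that produced fewer images than there are lenses is reported. Rounding to one-second slots and the optional `lens` attribute are assumptions of the sketch.

```python
from collections import defaultdict

def detect_missed_shots(images, lens_count, time_slot_s=1.0):
    """Bucket images by rounded shooting time and flag every exposure event
    that yielded fewer images than the number of lens modules."""
    buckets = defaultdict(list)
    for img in images:
        buckets[round(img.shot_time / time_slot_s)].append(img)
    missed = []
    for slot, shots in sorted(buckets.items()):
        if len(shots) < lens_count:
            fired = {getattr(s, "lens", None) for s in shots}  # lenses that did fire
            missed.append((slot * time_slot_s, lens_count - len(shots), fired))
    return missed  # list of (time, number of missing images, lenses present)
```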
Further comprising:
A coordinate offset determination module 110, configured to sort the image data by shooting time, determine from the sorting result the POS information to be evaluated and its adjacent POS information, generate a flight path from the flight attitude information of the POS information to be evaluated and the adjacent POS information, and calculate the deviation between the coordinate of the POS information to be evaluated and the flight path, so as to obtain the coordinate offset determination result for the POS information to be evaluated from the deviation value.
It should be noted that the device of this embodiment can further identify "flying spots" (outlier positions caused by interference with the GNSS device) from the aircraft attitude and the positions of the preceding and following POS points. Specifically, the image data is sorted by shooting time; the POS information to be evaluated and its adjacent POS information are determined from the sorting result; a flight path is generated from the flight attitude information of those POS records; and the deviation between the coordinate of the POS information to be evaluated and the flight path is calculated, so that the coordinate offset determination result is obtained from the deviation value.
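As a simple geometric stand-in for the deviation test, the sketch below measures how far an exposure position lies from the straight line through its two neighbours; the attitude information is ignored here for brevity, and the 10 m threshold is an assumed value, since the patent does not specify one.

```python
import math

def offset_from_path(prev_pos, pos, next_pos, max_offset_m=10.0):
    """Perpendicular distance of `pos` from the line through its neighbouring
    POS points, used as a crude 'flying spot' test."""
    ax, ay = prev_pos.x, prev_pos.y
    bx, by = next_pos.x, next_pos.y
    px, py = pos.x, pos.y
    dx, dy = bx - ax, by - ay
    seg_len = math.hypot(dx, dy)
    if seg_len == 0.0:
        dist = math.hypot(px - ax, py - ay)  # degenerate case: neighbours coincide
    else:
        # cross-product formula for point-to-line distance
        dist = abs(dx * (py - ay) - dy * (px - ax)) / seg_len
    return dist, dist > max_offset_m
```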
Further comprising:
A POS information filling module 111, configured to, when missing POS information is detected, fill in the missing POS information based on the POS information adjacent to it, in combination with the flight direction of the unmanned aerial vehicle, the photographing time interval and the average flight speed of the unmanned aerial vehicle.
It should be noted that the device of this embodiment can also automatically identify missing POS point records caused by a failure of the camera's shooting feedback, based on the flight direction and the photographing interval or photographing time, and then smoothly insert POS point data using a POS-point missing-value interpolation algorithm.
The adjacent POS information mentioned in this embodiment refers to the points immediately before and after the reference point (the coordinate point corresponding to the POS information to be evaluated, or the coordinate point corresponding to the missing POS information) along the flight path.
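A simple sketch of how the missing record might be filled in, reusing the hypothetical PosRecord from the earlier sketch; straight linear interpolation between the neighbouring records at the missed shooting time is equivalent to advancing along the flight direction at the average speed for one photographing interval, but it is only an assumption about the patent's unspecified interpolation algorithm.

```python
def fill_missing_pos(prev_pos, next_pos, shot_time):
    """Linearly interpolate a missing POS record between its two neighbours.
    Angles are interpolated naively; wrap-around at 360 degrees is ignored
    for brevity."""
    span = next_pos.timestamp - prev_pos.timestamp
    t = 0.5 if span == 0 else (shot_time - prev_pos.timestamp) / span
    lerp = lambda a, b: a + t * (b - a)
    return PosRecord(
        timestamp=shot_time,
        x=lerp(prev_pos.x, next_pos.x),
        y=lerp(prev_pos.y, next_pos.y),
        z=lerp(prev_pos.z, next_pos.z),
        kappa=lerp(prev_pos.kappa, next_pos.kappa),
        phi=lerp(prev_pos.phi, next_pos.phi),
        omega=lerp(prev_pos.omega, next_pos.omega),
    )
```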
Further comprising:
An image tilt angle adjusting module 112, configured to rotate the image data into a forward-oriented image according to image feature points extracted from the image data and the perspective principle.
It should be noted that the device of this embodiment can also, by extracting feature points from the shots, automatically rotate images that were stored rotated because of the tilt angle back to the forward orientation without loss, according to the perspective principle and a set threshold value.
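Once the tilt angle has been estimated (the feature-point and perspective estimation itself is not shown), the rotation step could be as simple as the Pillow-based sketch below; the angle threshold is an assumed parameter, and note that re-encoding a JPEG is not strictly lossless, so a truly lossless rotation would have to operate on the compressed data directly.

```python
from PIL import Image  # Pillow

def rotate_to_forward(image_path, tilt_angle_deg, out_path, min_angle_deg=1.0):
    """Rotate an image back to the forward orientation given an already
    estimated tilt angle; angles below the threshold are left untouched."""
    if abs(tilt_angle_deg) < min_angle_deg:
        return
    with Image.open(image_path) as im:
        # expand=True keeps the whole frame; the new corners are filled with black
        rotated = im.rotate(-tilt_angle_deg, expand=True)
        rotated.save(out_path, quality=95)
```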
Further comprising:
A thumbnail generation module 113, configured to resample the image data and generate a thumbnail corresponding to the image data.
It should be noted that the device of this embodiment can further extract an image thumbnail using a fast resampling algorithm; writing the thumbnail into the photo's EXIF information is supported, which makes it convenient for the user to browse and review the images.
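A minimal thumbnail sketch using Pillow's built-in resampling; the thumbnail is written as a separate file here, and embedding it into the photo's EXIF data as the patent describes would require an EXIF-writing library and is omitted.

```python
from PIL import Image  # Pillow

def make_thumbnail(image_path, thumb_path, max_size=(256, 256)):
    """Resample the image down to a small thumbnail for quick browsing."""
    with Image.open(image_path) as im:
        im.thumbnail(max_size)  # in-place downsample, preserves aspect ratio
        im.convert("RGB").save(thumb_path, "JPEG", quality=85)
```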
Furthermore, the terms "first," "second," "third," "fourth," and the like (if any) in the description of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (8)

1. A multi-lens photogrammetric data processing device for an unmanned aerial vehicle, characterized by comprising:
a storage space scanning module, configured to acquire image data stored in an unmanned aerial vehicle storage space, wherein the unmanned aerial vehicle storage space comprises a plurality of storage subspaces and each storage subspace stores the image data captured by one corresponding lens module;
a POS information processing module, configured to extract POS information contained in the image data;
an image elevation information determining module, configured to determine, from the POS information, the image elevation information corresponding to the image data;
a first invalid image identification module, configured to compare the image elevation information with a terrain-following flight elevation threshold and to judge image data whose image elevation information is lower than the terrain-following flight elevation threshold as invalid image data;
a second invalid image recognition module, configured to compare the POS information with preset position information of a measurement target, determine the relative orientation between the unmanned aerial vehicle and the measurement target at the time the image data was captured, and, according to that relative orientation and the real-time shooting range of each lens module in the unmanned aerial vehicle, judge image data that does not contain the measurement target as invalid image data;
and an invalid image eliminating module, configured to eliminate the invalid image data.
2. The device of claim 1, further comprising:
an image data grouping module, configured to compare the shooting time of the image data with the sortie time and to group image data belonging to the same sortie into the same group.
3. The device of claim 1, further comprising:
an image data classification module, configured to determine, from the storage path identifier of the image data and the correspondence between storage subspaces and lens modules, the lens module corresponding to the image data, so that image data captured by the same lens module is classified into the same category.
4. The device of claim 1, further comprising:
a missed-shot detection module, configured to compare the shooting times of the individual image data laterally and to obtain a missed-shot detection result by comparing the number of image data sharing the same shooting time with the number of lens modules.
5. The device of claim 1, further comprising:
a coordinate offset determination module, configured to sort the image data by shooting time, determine from the sorting result the POS information to be evaluated and its adjacent POS information, generate a flight path from the flight attitude information of the POS information to be evaluated and the adjacent POS information, and calculate the deviation between the coordinate of the POS information to be evaluated and the flight path, so as to obtain the coordinate offset determination result for the POS information to be evaluated from the deviation value.
6. The device of claim 5, further comprising:
a POS information filling module, configured to, when missing POS information is detected, fill in the missing POS information based on the POS information adjacent to it, in combination with the flight direction of the unmanned aerial vehicle, the photographing time interval and the average flight speed of the unmanned aerial vehicle.
7. The device of claim 1, further comprising:
an image tilt angle adjusting module, configured to rotate the image data into a forward-oriented image according to image feature points extracted from the image data and the perspective principle.
8. The device of claim 1, further comprising:
a thumbnail generation module, configured to resample the image data and generate a thumbnail corresponding to the image data.
CN202110437022.4A 2021-04-22 2021-04-22 Multi-lens photogrammetric data processing device for unmanned aerial vehicle Pending CN113124835A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110437022.4A CN113124835A (en) 2021-04-22 2021-04-22 Multi-lens photogrammetric data processing device for unmanned aerial vehicle


Publications (1)

Publication Number Publication Date
CN113124835A true CN113124835A (en) 2021-07-16

Family

ID=76779230

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110437022.4A Pending CN113124835A (en) 2021-04-22 2021-04-22 Multi-lens photogrammetric data processing device for unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN113124835A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109034376A (en) * 2018-07-18 2018-12-18 东北大学 A kind of unmanned plane during flying trend prediction method and system based on LSTM
CN110186433A (en) * 2019-03-27 2019-08-30 成都睿铂科技有限责任公司 A kind of airborne survey method and device for rejecting extra aerophotograph
CN110345925A (en) * 2019-08-06 2019-10-18 陕西土豆数据科技有限公司 One kind is for five mesh aerial photograph quality testings and empty three processing methods
CN112362033A (en) * 2020-10-29 2021-02-12 中国自然资源航空物探遥感中心 Quality inspection method for aerial remote sensing camera image
CN112514363A (en) * 2019-12-17 2021-03-16 深圳市大疆创新科技有限公司 Image transmission system and method, control device and movable platform



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210716