CN118245453B - Unmanned aerial vehicle data acquisition processing method and computer equipment


Info

Publication number
CN118245453B
CN118245453B
Authority
CN
China
Prior art keywords
data
photo
sequence
time
pos
Prior art date
Legal status
Active
Application number
CN202410666280.3A
Other languages
Chinese (zh)
Other versions
CN118245453A (en)
Inventor
木林
王月恒
袁克飞
倪大银
苏文松
张金伟
杨正春
阮玉玲
李剑修
桂宇娟
刘锋
薛亚锋
朱袁杰
查亮
黄江
耿志盼
吴思诚
杨锋
余小明
于飞
黄亚林
Current Assignee
Zhongshui Huaihe Planning And Design Research Co ltd
Original Assignee
Zhongshui Huaihe Planning And Design Research Co ltd
Priority date
Filing date
Publication date
Application filed by Zhongshui Huaihe Planning And Design Research Co ltd
Priority to CN202410666280.3A
Publication of CN118245453A
Application granted
Publication of CN118245453B
Legal status: Active

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a method and computer equipment for processing data collected by an unmanned aerial vehicle. The method comprises: acquiring the unmanned aerial vehicle's collected data, the collected data comprising POS data and photo data; normalizing the recording time of each photo in the photo sequence into the POS time system, based on the recording time of each piece of recorded data in the data sequence and the recording time of the first photo in the photo sequence, to obtain the normalized time of each photo; and determining the correspondence between the recorded data in the data sequence and the photos in the photo sequence based on each photo's normalized time, its recording time, and an error threshold. The invention accurately and automatically establishes a one-to-one correspondence between each piece of recorded data in the POS data collected by the unmanned aerial vehicle and the photos in the photo data, reducing the labor intensity and operation time of technicians, rapidly realizing the inspection and matching of POS and photo data, and providing reliable basic data for subsequent aerial photography data processing.

Description

Unmanned aerial vehicle data acquisition processing method and computer equipment
Technical Field
The invention relates to the technical field of processing data collected by unmanned aerial vehicles, and in particular to a processing method for unmanned aerial vehicle collected data and to computer equipment.
Background
During unmanned aerial vehicle aerial photography, the POS data recorded by the positioning module and the inertial navigation module are not stored and recorded together with the photos. The POS data and the photo data are each recorded sequentially at fixed time intervals by the flight control software; after the aerial photography work is finished, a POS file recording the flight path of that operation is placed under the project folder. In addition, if the aerial camera is detachable equipment, it can operate independently of the positioning module and the inertial navigation module: at each aerial shooting point the camera receives a shooting signal and takes a photo, then stores it in the aerial photo folder. Each photo filename consists of a fixed letter part and a number that automatically increments in shooting order, in the general JPG format.
In practice, during field aerial survey, accidental and uncertain causes such as interference from external electromagnetic signals or internal instrument connections can cause individual POS points or photos to be lost from the record. The recorded POS data and photo data then no longer correspond: either the amounts of recorded data differ, or the amounts match but the recorded contents do not, and later data processing fails.
At present, technicians must check the POS files and photo files side by side to ensure that their quantities agree. When the POS data and photo data are inconsistent, the lost POS records or photos are identified one by one from the position information of the POS data and the photos, and the unmatched counterparts are deleted, so as to restore a one-to-one correspondence. When there is a large amount of aerial photography data, manually inspecting thousands or tens of thousands of records is extremely difficult: manual inspection relies mainly on the shooting distance and shooting direction of the photos, visual inspection of the photos one by one takes a great deal of time, and because such checks have low reliability, are time-consuming and error-prone, and depend on rich experience and quick logical reasoning, the problem cannot be fundamentally solved this way.
Disclosure of Invention
The invention solves the problem of how to automatically realize the accurate correspondence of the data collected by the unmanned aerial vehicle.
In order to solve the above problems, in a first aspect, the present invention provides a processing method for collecting data by an unmanned aerial vehicle, where the method includes:
Acquiring unmanned aerial vehicle acquisition data, wherein the acquisition data comprise POS data and photo data, the POS data comprise recorded data sequences and recording time of each piece of recorded data in the data sequences, and the photo data comprise acquired photo sequences and recording time of each photo in the photo sequences;
Normalizing the recording time of each photo in the photo sequence to a POS time system based on the recording time of each piece of recorded data in the data sequence and the recording time of the first photo in the photo sequence to obtain the normalized time of each photo in the photo sequence;
and determining the correspondence between the recorded data in the data sequence and the photos in the photo sequence based on the normalized time of each photo, the recording time of each photo, and an error threshold of the POS time system and the camera time system.
Optionally, in the processing method for collecting data by an unmanned aerial vehicle provided by the present invention, normalizing the recording time of each photo in the photo sequence to a POS time system based on the recording time of each piece of recording data in the data sequence and the recording time of the first photo in the photo sequence, and obtaining the normalized time of each photo in the photo sequence includes:
Determining the difference between the recording time of the first piece of recorded data in the data sequence and the recording time of the first photo in the photo sequence as the system time difference between the POS time system and the camera time system;
and determining the recording time of each photo in the photo sequence in a POS time system based on the system time difference as the normalization time of each photo.
Optionally, in the processing method for collecting data by an unmanned aerial vehicle according to the present invention, the determining, based on the normalized time of each photo, the recording time of each photo, and an error threshold of a POS time system and the camera time system, a correspondence between the recorded data in the data sequence and the photo in the photo sequence includes:
determining whether missing data exists in the data sequence or the photo sequence according to the absolute value of the difference value between the normalized time and the recording time of each photo;
when it is determined that lost data exists, according to the determined position and category of the lost data in the data sequence or the photo sequence, the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence is determined.
Optionally, in the processing method for collecting data by an unmanned aerial vehicle provided by the present invention, the determining whether missing data exists in the data sequence or the photo sequence according to an absolute value of a difference value between the normalized time of each photo and the recording time of each photo includes:
And when the absolute value of the difference value between the normalized time and the recording time of the photo is larger than the error threshold value, determining that lost data exists in the POS data or the photo data.
Optionally, in the processing method for collecting data by an unmanned aerial vehicle provided by the present invention, the determining, according to the determined position and category of the lost data, a correspondence between recorded data in the data sequence and a photo in the photo sequence includes:
Determining whether the lost data is a data record or a photo by comparing, at the position of the lost data, the recording time of the recorded data with the recording time of the photo;
and according to the determined position of the lost data in the data sequence or the photo sequence, determining the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence.
Optionally, in the processing method for collecting data by an unmanned aerial vehicle provided by the present invention, determining that the lost data is a data record or a photo according to the record time of the record data or the photo corresponding to the record time of the lost data includes:
when the recording time of the recorded data at the position of the lost data is earlier than the recording time of the photo there, the lost data is a photo;
otherwise, the lost data is a data record.
Optionally, in the processing method for collecting data by an unmanned aerial vehicle provided by the present invention, the determining, according to the determined position of the lost data in the data sequence or the photo sequence, a correspondence between recorded data in the data sequence and a photo in the photo sequence includes:
deleting the recorded data or photo corresponding to the position of the lost data to obtain a data sequence or photo sequence after deleting operation;
And determining the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence after the deleting operation according to the data record or the photo sequence after the deleting operation.
Optionally, in the processing method for collecting data by an unmanned aerial vehicle provided by the present invention, when the recording time of the recorded data at the lost-data position is earlier than the recording time of the photo, the method further includes:
extracting feature points of adjacent photos in the photo sequence;
Determining the overlapping area of the adjacent photos according to the extracted characteristic points;
and determining the overlapping rate of the adjacent photos based on the determined overlapping area.
Optionally, in the processing method for collecting data by an unmanned aerial vehicle provided by the present invention, when the overlap rate is not greater than a set threshold, a suspected photo loss is indicated.
In a second aspect, the present invention provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the unmanned aerial vehicle acquisition data processing method according to the first aspect when the program is executed.
According to the unmanned aerial vehicle data processing method and the computer equipment of the invention, the recording times in the POS data and in the photo data are exploited when processing the collected data: taking the recording times in the POS data and the recording time of the first photo in the photo sequence as references, the recording time of each photo in the photo sequence is normalized into the POS time system. The normalized time, the photo's original recording time, and the set error threshold of the POS time system and the camera time system then serve as the judgment basis for detecting data loss. Once the position of the lost data is determined, a one-to-one correspondence between each piece of recorded data in the POS data and the photos in the photo data can be established accurately and automatically. This reduces the labor intensity and operation time of technicians, rapidly realizes the inspection and matching of POS and photo data, and provides reliable basic data for subsequent aerial photography data processing.
Drawings
FIG. 1 is a schematic diagram of data loss of POS data and photo data according to some embodiments of the present invention;
fig. 2 is a flow chart of a processing method of data collected by an unmanned aerial vehicle according to some embodiments of the present invention;
FIG. 3 is a diagram illustrating recording times of photos in photo data according to some embodiments of the present invention;
FIG. 4 is a logic diagram of a method of processing data collected by a drone in accordance with some embodiments of the present invention;
Fig. 5 is a flow chart of a method for processing data collected by a drone according to further embodiments of the present invention;
FIG. 6 is a schematic diagram of a photo overlapping ratio calculation process according to some embodiments of the present invention;
FIG. 7 is a diagram illustrating data processing results according to some embodiments of the present invention;
FIG. 8 is a schematic diagram of a computer system according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be noted that, for convenience of description, only the portions related to the invention are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other. The invention will be described in detail below with reference to the drawings in connection with embodiments.
It can be appreciated that, with the development and iteration of the technology, unmanned aerial vehicle low-altitude aerial survey has developed rapidly and is widely applied in mapping production. Current mainstream unmanned aerial vehicle aerial survey systems carry RTK and PPK positioning modules and high-precision inertial navigation modules; the high-precision position and attitude provide great convenience for later calculation of the photos' exterior orientation elements, greatly reduce the required density of image control points, and can even enable measurement without image control points.
In the actual mapping operation of an unmanned aerial vehicle, the POS data recorded by the positioning module and the inertial navigation module are not stored and recorded together with the photos. The POS data and the photo data are each recorded sequentially at fixed time intervals by the flight control software; after the aerial photography work is finished, a POS file recording the flight path of that operation is placed under the project folder, its main contents including the aerial shooting point times, coordinates, flight attitude and other information.
In general, the aerial camera is detachable equipment, independent of the positioning module and the inertial navigation module. At each aerial shooting point it receives a shooting signal and takes a photo, then stores it in the aerial photo folder; each photo filename consists of a fixed letter part and a number automatically incremented in shooting order, in the general JPG format.
It can be further understood that, under normal conditions, the POS point data and the photo data are equal in quantity and correspond one to one; in use, the two are matched by sequence number.
For example, as shown in the basic logic of POS data and photo data in FIG. 1, if the POS data are in the order POS001, POS002, POS003, POS004, …, the photo data are in the order photo 001, photo 002, photo 003, photo 004, …, and the two are matched by sequence number. However, during field aerial survey, accidental causes such as external electromagnetic interference or internal instrument connections can cause individual POS points or photos to be lost from the record, so that the recorded POS data and photo data no longer correspond: either the amounts of recorded data differ, or the amounts match but the contents do not, and later data processing fails.
That is, the POS data and photo data are generated separately in sequence. In subsequent photo data processing, the POS file and the photo files are matched only by sequence number, without any other check or confirmation; once POS data or photo data are lost, the two can no longer be put into correspondence, and photo processing cannot proceed.
In the embodiment of the invention, in order to overcome this defect, achieve accurate correspondence of POS data and photo data, and reduce the labor intensity and operation time of technicians, a processing method for unmanned aerial vehicle collected data is provided. Using the recording-time characteristics of the POS data and the photo data, and in an automated manner by computer, the method rapidly realizes the inspection and matching of POS data and photo data and provides reliable basic data for subsequent aerial photography data processing.
The processing method of the embodiment of the invention can be executed by a computer processing device with a data processing function.
In order to better understand the processing method of the unmanned aerial vehicle acquired data provided by the embodiment of the invention, the following detailed description is given by the accompanying drawings.
Fig. 2 is a flow chart of a processing method for collecting data by an unmanned aerial vehicle according to an embodiment of the present invention, as shown in fig. 2, the method specifically includes:
S110, acquiring unmanned aerial vehicle acquisition data, wherein the acquisition data comprise POS data and photo data, the POS data comprise recorded data sequences and recording time of each piece of recorded data in the data sequences, and the photo data comprise acquired photo sequences and recording time of each photo in the photo sequences.
S120, normalizing the recording time of each photo in the photo sequence to a POS time system based on the recording time of each piece of recording data in the data sequence and the recording time of the first photo in the photo sequence, and obtaining the normalized time of each photo in the photo sequence.
S130, based on the normalized time of each photo and the recording time of each photo, and the error threshold values of the POS time system and the camera time system, determining the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence.
Specifically, in the embodiment of the invention, the acquisition data of the unmanned aerial vehicle, such as the acquisition data in the research area to be measured, can be acquired first.
The acquisition data may include POS data and photo data.
The POS data may specifically include a data sequence and a recording time of each piece of recorded data. The data sequence may in particular comprise position data, gesture data.
The photo data may specifically include a photo sequence and the recording time of each photo, i.e. the creation time of the photo file, as shown in fig. 3.
It will be appreciated that in practice, both the POS data and the photo data are recorded sequentially at fixed time intervals by the flight control software.
Then, after the aerial photography operation ends, a POS file recording the flight path of that operation is placed under the project folder, as shown in fig. 3. As for the photo data: since the aerial camera is detachable equipment, independent of the positioning module and the inertial navigation module, at each aerial shooting point the camera receives a shooting signal and takes a photo, then stores it in the aerial photo folder. Each photo filename consists of a fixed letter part and a number automatically incremented in shooting order, in the general JPG format.
That is, in practice the POS data may comprise a sequence of parameters recorded over time, such as GPS and IMU sequences recorded at set time intervals: position data, attitude data, and the corresponding recording times.
The photo data may likewise comprise a photo sequence acquired at set time intervals.
Further, after the POS data and photo data are obtained, the recording time of each piece of recorded data in the data sequence and the recording time of the first photo in the photo sequence can be taken as references, and the photo sequence time-normalized into the POS time system, to obtain each photo's recording time in the POS time system as its normalized time.
Finally, after the normalized time of each photo in the photo sequence is obtained, it is compared with the original recording time, i.e. the recording time in the camera time system, and the correspondence between each data record in the POS data and the photos in the photo data is determined according to the error threshold of the camera time system and the POS time system.
The correspondence indicates the correct one-to-one pairing between each piece of recorded data in the POS data and the photos in the photo data after the lost data has been eliminated.
It can be understood that, in the embodiment of the invention, the recording times in the POS data and in the photo data are exploited when processing the collected data: taking the recording times in the POS data and the recording time of the first photo in the photo sequence as references, the recording time of each photo is normalized into the POS time system. The normalized time, the photo's original recording time, and the set error threshold of the POS time system and the camera time system then serve as the judgment basis for detecting data loss. After the position of the lost data is determined, a one-to-one correspondence between each piece of recorded data in the POS data and the photos in the photo data can be established accurately and automatically, which reduces the labor intensity and operation time of technicians, rapidly realizes the inspection and matching of POS and photo data, and provides reliable basic data for subsequent aerial photography data processing.
Optionally, in some embodiments of the present invention, in S120, the recording time of each photo in the photo sequence is normalized to the POS time system according to the recording time of each piece of recorded data in the data sequence and the recording time of the first photo in the photo sequence, so as to obtain the normalized time of each photo in the photo sequence, which may be implemented specifically by the following steps.
S121, determining the recording time of the first piece of recorded data in the data sequence and the difference value of the recording time of the first piece of photo in the photo sequence as the system time difference of the POS time system and the camera time system.
S122, based on the system time difference, determining the recording time of each photo in the photo sequence in the POS time system as the normalization time of each photo.
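As a concrete illustration, steps S121–S122 reduce to a single offset computation. The following is a minimal Python sketch; the patent specifies no code, so the function name, data layout, and example values are illustrative assumptions only:

```python
# Minimal sketch of steps S121-S122 (hypothetical data layout; the patent
# specifies no code, so names and values here are illustrative).

def normalize_photo_times(pos_times, photo_times):
    """Shift camera-system photo recording times into the POS time system.

    pos_times   -- recording times of the POS data sequence, in seconds
    photo_times -- recording times of the photo sequence, in seconds
    Returns the normalized time of every photo in the POS time system.
    """
    # S121: system time difference = first POS record time - first photo time
    t_d = pos_times[0] - photo_times[0]
    # S122: add the offset to every photo's camera-system recording time
    return [t + t_d for t in photo_times]

pos = [100.0, 102.0, 104.0, 106.0]         # POS recording times
photos = [40, 42, 44, 46]                  # camera recording times (whole seconds)
print(normalize_photo_times(pos, photos))  # [100.0, 102.0, 104.0, 106.0]
```

With a perfectly synchronized shooting interval, the normalized photo times coincide exactly with the POS times; the rest of the method deals with the error introduced by the camera's one-second resolution.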
Specifically, the time in the POS data is accurate to a thousandth of a second and is recorded in UTC; the recording time in the photo data is divided into creation time and file time, recorded in Beijing time and accurate only to the second, so an error exists between the normalized photo time and the POS time.
The recording time of the photo in the embodiment of the invention can be understood as the time of generating the photo data file when the photo data is collected, namely the creation time of the photo.
According to the principle of aerial photography, POS data and photo data are recorded synchronously at the same exposure point, and the time interval between adjacent POS records is consistent with that between adjacent photos. The time record of the POS data is accurate; if the start time of the photo data is unified with that of the POS data, the shooting time of each photo in the POS time system can be calculated accurately, and the photos can then be matched to the POS records by time so that the two correspond exactly. Normalizing the photo times into the POS time system, the following equations hold:
T_pos_i - T_pic_i_true = T_pos_first - T_pic_first_true = T_d (1)

T_pic_i_pos = T_pic_i_true + T_d (2)

T_pic_i_pos = T_pos_i (3)

wherein T_pic_first_true is the true camera-system time of the 1st aerial shooting point, i.e. the recording time of the first photo in the photo data; T_pos_first is the POS-system time of the 1st aerial shooting point, i.e. the recording time of the first piece of recorded data in the POS data; T_d is the system time difference between the POS time system and the camera time system; T_pic_i_true is the true camera-system time of the i-th aerial shooting point; T_pos_i is the POS-system time of the i-th aerial shooting point; and T_pic_i_pos is the normalized camera-system time of the i-th aerial shooting point.
It will be appreciated that, according to equation (1), the camera-system time and the POS-system time of the i-th aerial shooting point have the same time difference as those of the first point.
According to equation (2), all camera-system times can be normalized into the POS time system using the system time difference from equation (1).
According to equation (3), the normalized camera-system time of the i-th aerial shooting point equals its POS-system time.
In practice, the recording time of the photo data is accurate only to the second; the photo creation time obtained from the photo is therefore only an approximation of the true photo time. As shown in fig. 3, a photo's creation time reads, e.g., 2023-04-18 6:44:15.
It will be appreciated that the above formulas hold in theory, whereas in practice the camera time system records only whole seconds. In actual calculation, equation (3) is therefore transformed: derived from equations (1) and (2), equation (4) follows:

T_pic_i_pos = T_pic_i_int + (T_pos_first - T_pic_first_int) (4)

where T_pic_i_int is the integer value of the camera-system time of the i-th shot and T_pic_first_int is the integer value of the camera-system time of the first shot.
As can be seen from the above formulas, since the camera's time system is accurate only to the second, there is a certain error between a photo's normalized time and the corresponding POS time, and the threshold of this error can serve as a parameter for establishing the correspondence between photos and the recorded data in the POS data.
That is, in the embodiment of the present invention, the error may be used as a basis for determining whether there is missing data in the POS data and the photo data.
In addition, the integer recording time of each photo described in the above embodiment may be produced by rounding down, rounding up, rounding to the nearest second, and so on.
For example, assuming the photo time system rounds to the nearest whole second, the following hold:

-0.5 <= T_pic_i_int - T_pic_i_true < 0.5 (5)

-0.5 <= T_pic_first_int - T_pic_first_true < 0.5 (6)

It can be determined from this that equation (4) yields:

|T_pic_i_pos - T_pos_i| = |(T_pic_i_int - T_pic_i_true) - (T_pic_first_int - T_pic_first_true)| < 1 (7)
For another example, assuming the photo time system truncates to a whole second (rounds down), the following hold:

-1 < T_pic_i_int - T_pic_i_true <= 0 (8)

-1 < T_pic_first_int - T_pic_first_true <= 0 (9)

It follows that the range of the error, i.e. the error threshold, is still 1:

|T_pic_i_pos - T_pos_i| < 1 (10)
For another example, assuming the photo time system rounds up to a whole second, the following hold:

0 <= T_pic_i_int - T_pic_i_true < 1 (11)

0 <= T_pic_first_int - T_pic_first_true < 1 (12)

It follows that the range of the error is still:

|T_pic_i_pos - T_pos_i| < 1 (13)
As can be seen from the above three rounding methods, the error threshold between the POS time system and the camera time system in the embodiment of the invention may be set to 1.0.
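The bound above can be checked numerically. The following sketch (illustrative only; the patent specifies no code) applies equation (4) to randomly sampled shooting times and confirms that, for each of the three rounding modes, the normalized photo time never deviates from the POS time by more than 1.0 s:

```python
import math
import random

def normalized_error(t_pos_first, t_pic_first_true, t_pos_i, t_pic_i_true, rnd):
    """|T_pic_i_pos - T_pos_i| when camera times are whole seconds via `rnd`."""
    # Equation (4): normalized time computed from integer camera times
    t_pic_i_pos = rnd(t_pic_i_true) + (t_pos_first - rnd(t_pic_first_true))
    return abs(t_pic_i_pos - t_pos_i)

random.seed(0)
for rnd in (round, math.floor, math.ceil):
    worst = 0.0
    for _ in range(10000):
        t0_pos = random.uniform(0.0, 1e5)   # POS time of the first shooting point
        t0_pic = random.uniform(0.0, 1e5)   # true camera time of the first point
        dt = random.uniform(0.0, 1e4)       # same elapsed time in both systems
        worst = max(worst,
                    normalized_error(t0_pos, t0_pic, t0_pos + dt, t0_pic + dt, rnd))
    print(rnd.__name__, "worst error <= 1.0:", worst <= 1.0)
```

Under the patent's synchronization assumption (the same interval elapses in both time systems), the residual error is the difference of two rounding errors, each at most one second in magnitude, so the 1.0 s threshold holds for all three modes.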
It will be appreciated that the error threshold may be specifically determined according to practical situations, and the present invention is not limited thereto.
Optionally, in the embodiment of the present invention, as shown in fig. 4 and 5, in S130, when determining the correspondence between the recorded data in the data sequence and the shots in the shot sequence based on the normalized time of each shot, the recorded time of each shot, and the error threshold of the POS time system and the camera time system, the following steps may be specifically adopted:
S131, determining whether missing data exists in the data sequence or the photo sequence according to the absolute value of the difference value between the normalized time and the recording time of each photo;
and S132, when the existence of the lost data is determined, according to the determined position and category of the lost data in the data sequence or the photo sequence, determining the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence.
Specifically, as shown in fig. 4 and fig. 5, after the recording time of each photo in the photo sequence is normalized, the difference between the normalized time and the recording time of each photo is determined first, and whether lost data exists is then judged from the relation between the absolute value of this difference and the error threshold.
That is, when the absolute value of the difference between the normalized time and the recording time of a photo is greater than the error threshold, it may be determined that lost data exists in the POS data or the photo data.
For example, when the absolute value of the difference between the normalized time and the recording time of a photo is greater than 1.0, it may be determined that there is missing data in the POS data or the photo data.
Further, after it is determined that lost data exists, the category of the lost data is determined, i.e. whether it is a piece of recorded data or a photo; once the category is determined, the position of the lost data in the data sequence or the photo sequence can be located.
That is, whether the lost data is a data record or a photo is determined from the recording times of the recorded data and the photo at the position of the loss; then, according to the determined position of the lost data in the data sequence or the photo sequence, the correspondence between the recorded data in the data sequence and the photos in the photo sequence can be determined.
Specifically, when the recording time of the recorded data at the position of the loss is earlier than the recording time of the photo, the lost data is a photo; otherwise, the lost data is a piece of recorded data.
That is, as shown in fig. 4 and 5, when the data sequence and the photo sequence are traversed in time order and a time interval greater than the error threshold is found, it is determined that a POS record or a photo is missing at that position, i.e. that lost data exists.
Further, the recording times at the position of the loss are compared, namely the POS time in the recorded data and the shooting time of the photo: if the shooting time of the photo is later, a photo has been lost; if the POS time is later, a POS record has been lost.
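The decision rule above — attribute the loss to whichever side has the earlier timestamp at the gap — can be sketched as follows (the function name and return labels are illustrative assumptions, not part of the embodiment):

```python
def classify_loss(pos_time: float, photo_time: float,
                  threshold: float = 1.0):
    """Classify a mismatch between a POS record and a photo.

    Returns None when the pair matches within the error threshold.
    Otherwise the later timestamp belongs to the surviving item, so
    the partner on the earlier side is the one that was lost.
    """
    if abs(photo_time - pos_time) <= threshold:
        return None            # pair matches, nothing lost
    if photo_time > pos_time:
        return "photo lost"    # this POS record has no matching photo
    return "pos lost"          # this photo has no matching POS record
```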
Further, after the specific position and the category of the lost data are determined, the recorded data or photo corresponding to the position of the loss is deleted to obtain a data sequence or photo sequence after the deleting operation; then, according to the data sequence and photo sequence after the deleting operation, the correspondence between the remaining recorded data and photos is determined.
It can be appreciated that, in the embodiment of the present invention, the determination of lost data may be implemented by the computer device traversing every data object in both the data sequence and the photo sequence, so that the two sets of data are processed completely.
Optionally, in some embodiments of the present invention, to better determine whether a photo has been lost, the overlap rate of adjacent photos may be used as an auxiliary check.
That is, in some embodiments, when it is determined that the recording time of the recorded data at the position of the loss is earlier than the recording time of the photo, the method further includes calculating the overlap rate of adjacent photos and using it to further confirm whether a photo has been lost.
The overlapping rate of the adjacent photos can be calculated by the following steps:
S01, extracting feature points of adjacent photos in the photo sequence;
S02, determining the overlapping area of the adjacent photos according to the extracted characteristic points;
S03, determining the overlapping rate of the adjacent photos based on the determined overlapping area.
Specifically, as shown in fig. 6, during unmanned aerial vehicle aerial photography the course (along-track) overlap rate of the photos is an important parameter: it represents the size of the overlap region between the photos shot at two adjacent exposure points, and in general the course overlap rate is not lower than 70%. The course overlap rate can therefore serve as an index to help judge whether photo data has been lost within a flight strip.
If the designed course overlap rate is 80%, the overlap rate of two adjacent photos is about 80%; if one photo in between is lost, the overlap rate between the remaining neighbors drops to about 80% × 80% = 64%. Taking 64% as the judgment threshold for the course overlap rate, when the measured overlap rate falls below this threshold it can be preliminarily predicted that a photo may have been lost in that area, and the loss can then be confirmed by combining the normalized-time processing described above.
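Assuming matched feature points are already available (e.g. from an ORB or SIFT matcher, which the embodiment does not specify), the overlap rate can be approximated from the median horizontal displacement of the matches. A minimal sketch with hypothetical names:

```python
import statistics

def overlap_ratio(matches, image_width: float) -> float:
    """Estimate the along-track overlap of two adjacent photos.

    `matches` is a list of (x_in_photo_a, x_in_photo_b) column
    coordinates of the same feature point in the two photos, assumed
    to come from a feature matcher.  The median horizontal
    displacement approximates the shift between exposures; the
    overlap is the fraction of the frame the shift leaves in common.
    """
    shifts = [xa - xb for xa, xb in matches]
    shift = abs(statistics.median(shifts))   # median is robust to bad matches
    return max(0.0, 1.0 - shift / image_width)
```

With an 80% design overlap the measured value should stay near 0.8; a value below the 64% threshold derived in the text flags a possible missing photo in that strip.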
Further, in order to better understand the accurate establishment of the correspondence between POS data and photo data in the embodiment of the present invention, the following details are described with reference to fig. 4 and 5.
Specifically, in conjunction with fig. 4 and 5:
s1, acquiring POS file information, and extracting a unique mark number of a POS file and time corresponding to POS acquisition;
S2, reading the internal information of the photo, wherein the internal information mainly comprises the photo name and the photo creation time;
S3, normalization processing: the POS time system and the photo time system are not unified, so time normalization is needed. Before takeoff the aircraft records a POS entry and takes a photo, mainly to test whether the unmanned aerial vehicle works normally; this test data cannot be lost, so the time of the first POS record and the shooting time of the first photo are used to establish a unified format, and the photo times are normalized to the POS time system;
S4, creating a POS storage space posList and a photo storage space xpList, traversing POS data by taking the first POS time as a reference, traversing photo files at the same time, performing one-to-one correspondence through the normalized time sequence, and respectively storing the data which are in one-to-one correspondence in the posList and xpList;
S5, when, while traversing the POS data and the photos in time order, a time interval greater than 1.0 second is found, it indicates that a POS record or a photo has been lost at that position;
S6, the POS time and the photo shooting time are further checked: if the photo shooting time is later, a photo has been lost, so posList stores the current POS data and the corresponding xpList slot is left empty; if the POS time is later, a POS record has been lost, so the posList slot is left empty and xpList stores the current photo data;
S7, if the POS data are lost, traversing the next POS data without changing the current photo data; if the photo data is lost, the current POS data is unchanged, the next photo data is traversed, and the operations of S5-S7 are continued.
S8, after the data processing is finished, posList and xpList are traversed and displayed correspondingly; a null value indicates lost data, and the corresponding deleting operation is performed.
S9, while traversing the photo data, calculating the overlapping rate between adjacent photos to be used as reference data for judging whether the photo data are lost.
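The traversal of steps S3–S8 can be sketched end-to-end as follows. This is a simplified reading of the procedure: pos_list and xp_list mirror the posList and xpList of the text, and all names are illustrative rather than from the embodiment.

```python
def align(pos_times, photo_times, threshold=1.0):
    """Pair POS records with photos by normalized time (steps S3-S8).

    Photo times are first normalized to the POS time system using the
    first record of each sequence (step S3).  A gap larger than the
    threshold marks a lost item on whichever side has the earlier
    time (S5/S6); lost slots are stored as None (S7) and then dropped
    (S8), leaving only matched (pos_time, photo_time) pairs.
    """
    offset = pos_times[0] - photo_times[0]      # system time difference
    photos = [t + offset for t in photo_times]  # normalized photo times
    pos_list, xp_list = [], []
    i = j = 0
    while i < len(pos_times) and j < len(photos):
        if abs(photos[j] - pos_times[i]) <= threshold:
            pos_list.append(pos_times[i])
            xp_list.append(photos[j])
            i += 1
            j += 1
        elif photos[j] > pos_times[i]:          # photo lost (S6/S7)
            pos_list.append(pos_times[i])
            xp_list.append(None)
            i += 1
        else:                                   # POS record lost (S6/S7)
            pos_list.append(None)
            xp_list.append(photos[j])
            j += 1
    # step S8: drop positions where either side is a null value
    return [(p, x) for p, x in zip(pos_list, xp_list)
            if p is not None and x is not None]
```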
Further, the following embodiment illustrates a processing method for acquiring data by using the unmanned aerial vehicle provided by the embodiment of the invention.
Namely, to verify the aerial photography data processing method, POS data and photo data from several flight strips of the same equipment were selected: the POS data were merged into one POS file and the photos of the several strips were copied into the same folder. Arbitrary POS records and photos were then deleted, and the deletions were recorded. The method accurately identified the deleted POS data and the deleted photos and processed them correspondingly, as shown in fig. 7.
It can be understood that the processing method is simple, convenient and quick. It can help technicians determine whether the original aerial survey data has lost photos or POS data and accurately locate the position of the loss. Through the above operations, the POS data are matched to the photo data and unmatched photos and POS records are removed, so that the mismatch problem between the two is fundamentally solved. Performing aerial triangulation (space-three) computation on this basis can effectively improve its success rate.
On the other hand, the embodiment of the invention provides a processing device for collecting data by an unmanned aerial vehicle, which specifically comprises:
The acquisition module is used for acquiring unmanned aerial vehicle acquisition data, wherein the acquisition data comprise POS data and photo data, the POS data comprise recorded data sequences and recording time of each piece of recorded data in the data sequences, and the photo data comprise acquired photo sequences and recording time of each photo in the photo sequences;
The normalization module is used for normalizing the recording time of each photo in the photo sequence to a POS time system based on the recording time of each piece of recorded data in the data sequence and the recording time of the first photo in the photo sequence to obtain the normalization time of each photo in the photo sequence;
and the determining module is used for determining the corresponding relation between the recorded data in the data sequence and the photos in the photo sequence based on the normalized time of each photo, the recorded time of each photo and the error threshold values of the POS time system and the camera time system.
Optionally, the processing device for acquiring data by using an unmanned aerial vehicle provided by the embodiment of the present invention, the normalization module is specifically configured to:
Determining the recording time of the first piece of recorded data in the data sequence and the difference value of the recording time of the first piece of photo in the photo sequence as the system time difference of the POS time system and the camera time system;
and determining the recording time of each photo in the photo sequence in a POS time system based on the system time difference as the normalization time of each photo.
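A minimal sketch of the normalization performed by this module, under the assumption stated above that the first POS record and the first photo are both guaranteed to exist (the function name is illustrative):

```python
def normalize(pos_times, photo_times):
    """Map photo recording times into the POS time system.

    The system time difference is the gap between the recording time
    of the first POS record and the recording time of the first
    photo; adding it to every photo time yields the normalized times.
    """
    offset = pos_times[0] - photo_times[0]  # system time difference
    return [t + offset for t in photo_times]
```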
Optionally, the processing device for acquiring data by using an unmanned aerial vehicle provided by the embodiment of the present invention, the determining module is specifically configured to:
determining whether missing data exists in the data sequence or the photo sequence according to the absolute value of the difference value between the normalized time and the recording time of each photo;
when it is determined that lost data exists, according to the determined position and category of the lost data in the data sequence or the photo sequence, the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence is determined.
Optionally, the processing device for acquiring data by using an unmanned aerial vehicle provided by the embodiment of the present invention, the determining module is specifically configured to:
And when the absolute value of the difference value between the normalized time and the recording time of the photo is larger than the error threshold value, determining that lost data exists in the POS data or the photo data.
Optionally, the processing device for acquiring data by using an unmanned aerial vehicle provided by the embodiment of the present invention, the determining module is specifically configured to:
Determining the lost data as data record or photo according to the record time of the record data or photo corresponding to the record time of the lost data;
and according to the determined position of the lost data in the data sequence or the photo sequence, determining the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence.
Optionally, the processing device for acquiring data by using an unmanned aerial vehicle provided by the embodiment of the present invention, the determining module is specifically configured to:
when the recording time of the recorded data corresponding to the recording time of the lost data is earlier than the recording time of the photo, the lost data is indicated to be the photo;
otherwise, the lost data is recorded data.
Optionally, the processing device for acquiring data by using an unmanned aerial vehicle provided by the embodiment of the present invention, the determining module is specifically configured to:
deleting the recorded data or photo corresponding to the position of the lost data to obtain a data sequence or photo sequence after deleting operation;
And determining the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence after the deleting operation according to the data record or the photo sequence after the deleting operation.
Optionally, the processing device for collecting data by using the unmanned aerial vehicle provided by the embodiment of the invention further includes a calculation module, where the calculation module is specifically configured to:
extracting feature points of adjacent photos in the photo sequence;
Determining the overlapping area of the adjacent photos according to the extracted characteristic points;
and determining the overlapping rate of the adjacent photos based on the determined overlapping area.
Optionally, in the processing device for unmanned aerial vehicle collected data provided by the embodiment of the present invention, when the overlap rate is not greater than a set threshold, it indicates that a photo is suspected to be lost.
On the other hand, the embodiment of the invention provides a computer device, comprising a memory, a processor and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the program, implements the above processing method for unmanned aerial vehicle collected data.
Referring now to fig. 8, fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
As shown in fig. 8, the computer device includes a Central Processing Unit (CPU) 301 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage section 308 into a Random Access Memory (RAM) 303. The RAM 303 also stores the various programs and data required for the operation of the electronic apparatus 300. The CPU 301, ROM 302, and RAM 303 are connected to each other through a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.

In some embodiments, the following components are connected to the I/O interface 305: an input section 306 including a keyboard, a mouse, and the like; an output section 307 including a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD), a speaker, and the like; a storage section 308 including a hard disk or the like; and a communication section 309 including a network interface card such as a LAN card or a modem. The communication section 309 performs communication processing via a network such as the internet. A drive 310 is also connected to the I/O interface 305 as needed. A removable medium 311 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on the drive 310 as needed, so that a computer program read from it can be installed into the storage section 308.

In particular, according to embodiments of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the invention include a computer program product comprising a computer program embodied on a machine-readable medium, the computer program comprising program code for performing the method shown in the flowcharts.
In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 309, and/or installed from the removable medium 311. The above-described functions defined in the electronic device of the present invention are performed when the computer program is executed by the Central Processing Unit (CPU) 301.
The computer readable medium shown in the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic device, apparatus, or device of electronic, magnetic, optical, electromagnetic, infrared, or semiconductor, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution electronic device, apparatus, or device. In the present invention, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution electronic device, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of electronic devices, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based electronic devices which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules involved in the embodiments of the present invention may be implemented in software or in hardware. The described units or modules may also be provided in a processor, for example, as: a processor, comprising: the device comprises an acquisition module, a normalization module and a determination module. The names of these units or modules do not in any way limit the units or modules themselves, and the determining module may be described as "determining the correspondence between the recorded data in the data sequence and the shots in the shot sequence based on the normalized time of each shot and the recorded time of each shot, and the POS time system and the error threshold of the camera time system".
As another aspect, the present invention also provides a computer-readable storage medium that may be contained in the computer device described in the above embodiment; or may exist alone without being incorporated into the computer device. The computer readable storage medium stores one or more computer programs which, when used by one or more processors, perform the methods described in the present invention:
Acquiring unmanned aerial vehicle acquisition data, wherein the acquisition data comprise POS data and photo data, the POS data comprise recorded data sequences and recording time of each piece of recorded data in the data sequences, and the photo data comprise acquired photo sequences and recording time of each photo in the photo sequences;
Normalizing the recording time of each photo in the photo sequence to a POS time system based on the recording time of each piece of recorded data in the data sequence and the recording time of the first photo in the photo sequence to obtain the normalized time of each photo in the photo sequence;
and determining the corresponding relation between the recorded data in the data sequence and the photos in the photo sequence based on the normalized time of each photo and the recorded time of each photo and the error threshold values of the POS time system and the camera time system.
In summary, according to the unmanned aerial vehicle data acquisition processing method and the computer device provided by the invention, when the data collected by the unmanned aerial vehicle are processed, the recording times in the POS data and in the photo data are exploited: taking the recording time of the first POS record and the recording time of the first photo in the photo sequence as references, the recording time of each photo is normalized into the POS time system. The normalized time, the original photo recording time, and the set error threshold of the POS time system and the camera time system then serve as the judgment basis for detecting data loss. In this way, the one-to-one correspondence between each piece of recorded data in the POS data and the photos in the photo data can be established accurately and automatically, which reduces the labor intensity and operation time of technicians, enables rapid inspection and matching of POS and photo data, and provides reliable basic data for subsequent aerial survey data processing.
The above description is only illustrative of the preferred embodiments of the present invention and of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of technical features described above, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the spirit of the disclosure, for example solutions in which the above features are replaced with technical features of similar function disclosed in (but not limited to) the present invention.

Claims (9)

1. A method for processing data collected by an unmanned aerial vehicle, the method comprising:
Acquiring unmanned aerial vehicle acquisition data, wherein the acquisition data comprise POS data and photo data, the POS data comprise recorded data sequences and recording time of each piece of recorded data in the data sequences, and the photo data comprise acquired photo sequences and recording time of each photo in the photo sequences;
Normalizing the recording time of each photo in the photo sequence to a POS time system based on the recording time of each piece of recorded data in the data sequence and the recording time of the first photo in the photo sequence to obtain the normalized time of each photo in the photo sequence;
Determining a correspondence between the recorded data in the data sequence and the photos in the photo sequence based on the normalized time of each photo, the recording time of each photo, and an error threshold of a POS time system and a camera time system;
The normalizing the recording time of each photo in the photo sequence to a POS time system based on the recording time of each recorded data in the data sequence and the recording time of the first photo in the photo sequence, and obtaining the normalized time of each photo in the photo sequence includes:
determining the recording time of the first piece of recorded data in the data sequence and the difference value of the recording time of the first piece of photo in the photo sequence as the system time difference of the POS time system and the camera time system;
and determining the recording time of each photo in the photo sequence in a POS time system based on the system time difference as the normalization time of each photo.
2. The method according to claim 1, wherein the determining the correspondence between the recorded data in the data sequence and the photos in the photo sequence based on the normalized time of each photo, the recording time of each photo, and the error threshold of the POS time system and the camera time system comprises:
determining whether missing data exists in the data sequence or the photo sequence according to the absolute value of the difference value between the normalized time and the recording time of each photo;
when it is determined that lost data exists, according to the determined position and category of the lost data in the data sequence or the photo sequence, the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence is determined.
3. The method according to claim 2, wherein determining whether missing data exists in the data sequence or the shot sequence according to an absolute value of a difference between the normalized time of each shot and a recording time of each shot comprises:
and when the absolute value of the difference value between the normalized time and the recording time of the photo is larger than the error threshold value, determining that lost data exists in the POS data or the photo data.
4. The method for processing the collected data of the unmanned aerial vehicle according to claim 3, wherein the determining the correspondence between the recorded data in the data sequence and the shots in the shot sequence according to the determined position and the determined category of the lost data comprises:
Determining the lost data as data record or photo according to the record time of the record data or photo corresponding to the record time of the lost data;
and according to the determined position of the lost data in the data sequence or the photo sequence, determining the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence.
5. The method according to claim 4, wherein determining that the lost data is a data record or a photo according to the record time of the record data or the photo corresponding to the record time of the lost data comprises:
when the recording time of the recorded data corresponding to the recording time of the lost data is earlier than the recording time of the photo, the lost data is indicated to be the photo;
otherwise, the lost data is recorded data.
6. The method according to claim 5, wherein determining the correspondence between the recorded data in the data sequence and the shots in the shot sequence according to the determined position of the lost data in the data sequence or the shot sequence comprises:
deleting the recorded data or photo corresponding to the position of the lost data to obtain a data sequence or photo sequence after deleting operation;
And determining the corresponding relation between the recorded data in the data sequence and the photo in the photo sequence after the deleting operation according to the data record or the photo sequence after the deleting operation.
7. The method of processing data collected by a drone according to claim 3, wherein, when the recording time of the recorded data corresponding to the position of the lost data is earlier than the recording time of the photo, the method further comprises:
extracting feature points of adjacent photos in the photo sequence;
Determining the overlapping area of the adjacent photos according to the extracted characteristic points;
and determining the overlapping rate of the adjacent photos based on the determined overlapping area.
8. The method of claim 7, wherein a photo is suspected to be lost when the overlap rate is not greater than a set threshold.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of processing the data collected by the drone according to any one of claims 1-8 when the program is executed by the processor.
CN202410666280.3A 2024-05-28 2024-05-28 Unmanned aerial vehicle data acquisition processing method and computer equipment Active CN118245453B (en)

Publications (2)

Publication Number Publication Date
CN118245453A CN118245453A (en) 2024-06-25
CN118245453B true CN118245453B (en) 2024-08-13


Also Published As

Publication number Publication date
CN118245453A (en) 2024-06-25

Similar Documents

Publication Publication Date Title
US9185289B2 (en) Generating a composite field of view using a plurality of oblique panoramic images of a geographic area
Baboud et al. Automatic photo-to-terrain alignment for the annotation of mountain pictures
CN108230379A (en) For merging the method and apparatus of point cloud data
CN110826549A (en) Inspection robot instrument image identification method and system based on computer vision
CN109461208A (en) Three-dimensional map processing method, device, medium and calculating equipment
CN112465969A (en) Real-time three-dimensional modeling method and system based on unmanned aerial vehicle aerial image data
CN111461981A (en) Error estimation method and device for point cloud splicing algorithm
US11875524B2 (en) Unmanned aerial vehicle platform based vision measurement method for static rigid object
CN110717861A (en) Image splicing method and device, electronic equipment and computer readable storage medium
CN110533766B (en) Oblique photography image intelligent writing method based on image control-free PPK data
CN114494388B (en) Three-dimensional image reconstruction method, device, equipment and medium in large-view-field environment
CN108759788A (en) Unmanned aerial vehicle image positioning and orientation method and unmanned aerial vehicle
CN115330876B (en) Target template graph matching and positioning method based on twin network and central position estimation
CN110378174A (en) Road extracting method and device
CN112799430A (en) Programmable unmanned aerial vehicle-based road surface image intelligent acquisition method
CN115358486A (en) Port freight volume prediction method, system and application based on three-dimensional satellite image
CN112270748B (en) Three-dimensional reconstruction method and device based on image
CN118245453B (en) Unmanned aerial vehicle data acquisition processing method and computer equipment
CN116704037B (en) Satellite lock-losing repositioning method and system based on image processing technology
CN113763466B (en) Loop detection method and device, electronic equipment and storage medium
CN117274338A (en) Unmanned aerial vehicle hyperspectral image alignment method, device, terminal and storage medium
EP4407563A1 (en) Multi-type map-based fusion positioning method and electronic device
CN115620264A (en) Vehicle positioning method and device, electronic equipment and computer readable medium
CN116295466A (en) Map generation method, map generation device, electronic device, storage medium and vehicle
CN114937200A (en) Pond dead fish detection method, device, equipment and medium based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant