CN110599412A - Remote sensing data processing method and system based on unmanned aerial vehicle - Google Patents

Remote sensing data processing method and system based on unmanned aerial vehicle

Info

Publication number
CN110599412A
CN110599412A
Authority
CN
China
Prior art keywords
remote sensing
sensing data
thermal infrared
optical
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910753662.9A
Other languages
Chinese (zh)
Inventor
陈富龙
李儒
檀畅
邓飚
时丕龙
周伟
朱海涛
徐进勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Remote Sensing and Digital Earth of CAS
Satellite Application Center for Ecology and Environment of MEE
Original Assignee
Institute of Remote Sensing and Digital Earth of CAS
Satellite Application Center for Ecology and Environment of MEE
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Remote Sensing and Digital Earth of CAS, Satellite Application Center for Ecology and Environment of MEE filed Critical Institute of Remote Sensing and Digital Earth of CAS
Priority to CN201910753662.9A priority Critical patent/CN110599412A/en
Publication of CN110599412A publication Critical patent/CN110599412A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/80
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Abstract

The embodiment of the invention provides a remote sensing data processing method, which comprises the following steps: receiving optical remote sensing data and thermal infrared remote sensing data transmitted by an unmanned aerial vehicle end; and fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fusion data. A portable unmanned aerial vehicle carries a high-resolution thermal infrared device, so the thermal infrared data obtained has high spatial resolution; a professional optical camera is carried at the same time, so optical data acquisition of the archaeological target is completed synchronously. The two types of data are combined through information fusion, realizing the complementary advantages of optical and thermal infrared data: the spatial resolution of the data is improved, and the internal detail features of the data are expressed. This solves the need in remote sensing archaeology for rapid, mobile, low-cost acquisition of thermal infrared data with high spatial resolution and rich texture information.

Description

Remote sensing data processing method and system based on unmanned aerial vehicle
Technical Field
The invention relates to remote sensing data processing, in particular to a remote sensing data processing method and system based on an unmanned aerial vehicle.
Background
Remote sensing archaeology is research carried out with remote sensing data as its basic data. It is mainly used for distribution survey and prediction of archaeological relics, detection of the existence and distribution pattern of above-ground and underground archaeological relics, spatial mapping of the relics, and virtual restoration of the environments in which the relics are located. Remote sensing technology can quickly and effectively find distribution information of above-ground and underground ancient sites, plays a significant role in modern archaeology, and has gradually become an essential step in archaeological exploration, particularly large-scale ancient site exploration.
Principle of the technology
Remote sensing archaeology theory is based on the close relation among the physical properties of a relic or phenomenon, its electromagnetic spectrum characteristics, and its image characteristics. Interpretation of a remote sensing image determines the nature and spatial position of a (suspected) relic or phenomenon according to the tone, texture, shape, size and spatial distribution characteristics of patterns in the image, together with the spectral characteristics of the relic or phenomenon.
The remote sensing archaeology can be simply divided into optical remote sensing archaeology, thermal infrared remote sensing archaeology and microwave remote sensing archaeology.
1. Optical remote sensing archaeology
Optical remote sensing records the spectral characteristics of ground features and captures, at large scale and in a visible, readily usable form, the spatial distribution pattern of ground features (the large-scale distribution of relics). It records and interprets archaeological targets and their environment from texture differences in crops and bare soil, shadows produced by topographic relief, and the like. Optical data has the characteristics of what-you-see-is-what-you-get acquisition, high spatial resolution and good visual effect, and is the preferred data type for remote sensing archaeology.
2. Radar (microwave) remote sensing archaeology
Radar, represented by synthetic aperture radar, has penetrating imaging capability, which makes it an important remote sensing data source for observing cultural heritage in tropical rainforest and desert regions. Radar is sensitive to terrain with linear geometric characteristics, and radar waves of different wavelengths have different sensitivities to different vegetation and environmental factors, so radar is well suited to identifying archaeological objects such as temples, buildings and roads.
3. Thermal infrared remote sensing archaeology
All substances whose temperature exceeds absolute zero continuously emit infrared energy. Thermal infrared remote sensing collects and records the thermal infrared information of ground objects using satellite-borne or airborne sensors, and uses this information to identify ground objects and invert surface parameters such as temperature, humidity and thermal inertia. With this archaeological technique, an archaeologist can trace an ancient road network or the like hidden under dense vegetation. A large prehistoric site including prehistoric roads, city walls, buildings and farmland was found in an archaeological survey of Chaco Canyon in New Mexico; the site can be traced back to around 900 AD. These archaeological features could not be obtained from ground survey or aerial photograph analysis, and were found only by thermal infrared remote sensing technology.
Therefore, thermal infrared remote sensing archaeology is irreplaceable, but the data is limited by equipment, has low spatial resolution, and has limited capability to express details of ground relics.
The existing methods for acquiring thermal infrared data comprise satellite-borne thermal infrared systems and airborne thermal infrared systems.
The ground resolution of a thermal infrared sensor is proportional to the distance between the target and the sensor. If the thermal infrared sensor is carried on a Landsat satellite, the resolution is 60 m (Landsat-7), 100 m (Landsat-8) or 120 m (Landsat-5), and this resolution limits archaeological detection capability. If the sensor is mounted on a manned aircraft, its corresponding ground resolution improves to the meter to sub-meter level (depending on flying height). Meanwhile, due to sensor performance limitations, under the same spatial resolution and observation conditions, the internal detail characteristics of thermal infrared data are weaker and less well expressed than those of optical data.
Although the spatial resolution of thermal infrared data can be improved to the meter to sub-meter level using a manned aircraft, a manned aircraft is expensive, has a long operating cycle and is constrained by flying height, so the spatial resolution is still limited, and the archaeological detection requirement cannot be met in a low-cost, convenient, rapid and flexible way.
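The proportionality between ground resolution and sensor-to-target distance noted above can be illustrated with a simple ground-sample-distance (GSD) calculation under a pinhole camera model. The sensor parameters below (17 µm pixel pitch, 19 mm focal length) are hypothetical values chosen only for illustration.

```python
def ground_sample_distance(flight_height_m, pixel_pitch_m, focal_length_m):
    """Ground footprint of a single pixel under a pinhole camera model."""
    return flight_height_m * pixel_pitch_m / focal_length_m

# Hypothetical thermal sensor: 17 um pixel pitch behind a 19 mm lens.
gsd_uav = ground_sample_distance(100.0, 17e-6, 19e-3)      # UAV at 100 m: ~9 cm
gsd_manned = ground_sample_distance(2000.0, 17e-6, 19e-3)  # manned aircraft at 2 km: ~1.8 m
```

For a fixed sensor, flying 20 times lower improves the GSD by the same factor, which is the advantage a portable unmanned aerial vehicle exploits over manned platforms.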
Disclosure of Invention
The invention aims to provide a remote sensing data processing method and system based on an unmanned aerial vehicle. With this method, a portable unmanned aerial vehicle carries high-resolution thermal infrared equipment, so the thermal infrared data obtained has high spatial resolution; a professional optical camera is carried at the same time, and optical data acquisition of the archaeological target is completed synchronously. The two types of data are combined through information fusion, realizing the advantages of both optical and thermal infrared data: the spatial resolution of the data is improved, and the internal detail characteristics of the data are expressed. This solves the need in remote sensing archaeology for rapid, mobile, low-cost acquisition of thermal infrared data with high spatial resolution and abundant texture information.
In order to achieve the above object, an embodiment of the present invention provides a method for processing remote sensing data, where the method for processing remote sensing data includes:
receiving optical remote sensing data and thermal infrared remote sensing data transmitted by an unmanned aerial vehicle end; and
and fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fusion data.
Optionally, the fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fused data includes:
fusing the low-frequency part of the optical remote sensing data and the thermal infrared remote sensing data after NSST transformation;
fusing the high-frequency part of the optical remote sensing data and the thermal infrared remote sensing data after NSST transformation; and
and fusing the high-frequency part and the low-frequency part of the optical remote sensing data and the thermal infrared remote sensing data in a one-to-one correspondence mode according to corresponding rules, and performing NSST inverse transformation on the fused high-frequency part and the fused low-frequency part to obtain a final fusion result.
Optionally, the fusion rule for the low-frequency part is a gray-level-jump weighted average of the form:

C_F^L(m,n) = α·C_I^L(m,n) + β·C_V^L(m,n)

wherein α and β denote the weight coefficients of the low-frequency part; (m,n) denotes the m-th row and n-th column; I denotes the thermal infrared remote sensing data; V denotes the optical remote sensing data; K₀ denotes the neighbourhood of the pixel K over which the weights are computed; and C_F^L(m,n) denotes the fused low-frequency component of the NSST-transformed thermal infrared and optical remote sensing data at the m-th row and n-th column. The weighted-average part reflects the main low-frequency energy of the images, while the weights, computed over K₀, reflect the detail differences between the images;
the fusion rule for the high-frequency part includes: when much residual background information remains in the high-frequency parts of the two source images (H(m,n) ≥ T), a weighted average is used:

C_F^{j,k}(m,n) = ω₁·C_I^{j,k}(m,n) + ω₂·C_V^{j,k}(m,n)

wherein ω₁ and ω₂ are the weight coefficients of the high-frequency part; j denotes the number of image decomposition levels; k denotes the image translation amount; C_I^{j,k}(m,n) and C_V^{j,k}(m,n) denote, at the m-th row and n-th column, the residual background information of the high-frequency parts of the NSST-transformed thermal infrared and optical remote sensing data respectively; and C_F^{j,k}(m,n) denotes the fused high-frequency component of the NSST-transformed thermal infrared and optical remote sensing data at the m-th row and n-th column. The weight coefficients of the high-frequency part satisfy:

ω₂ = 1 − ω₁

wherein H(m,n) is the region texture smoothness and T is the region texture smoothness threshold;
in the case of H(m,n) < T, the high-frequency parts of the optical remote sensing data and the thermal infrared remote sensing data are fused by selecting, at each position, the coefficient with the larger sum of Laplacian energies:

C_F^{j,k}(m,n) = C_I^{j,k}(m,n) if SML_I^{j,k}(m,n) ≥ SML_V^{j,k}(m,n), otherwise C_V^{j,k}(m,n)

wherein SML_{j,k} denotes the sum of Laplacian energies, C_I^{j,k}(m,n) denotes the NSST-transformed high-frequency component of the thermal infrared remote sensing data at the m-th row and n-th column, and C_V^{j,k}(m,n) denotes the NSST-transformed high-frequency component of the optical remote sensing data at the m-th row and n-th column.
Optionally, the method for processing remote sensing data further includes:
performing lens distortion correction on the photographing lens at the unmanned aerial vehicle end before photographing.
Optionally, the method for processing remote sensing data further includes: acquiring the optical remote sensing data and the thermal infrared remote sensing data;
wherein the acquiring the optical remote sensing data and the thermal infrared remote sensing data comprises:
acquiring optical remote sensing data of a ground target at a preset shooting position on a planned route of the unmanned aerial vehicle;
acquiring thermal infrared remote sensing data of a ground target at a preset shooting position on a planned route of the unmanned aerial vehicle;
and recording the obtained optical remote sensing data and the obtained thermal infrared remote sensing data in a memory, and transmitting the obtained thermal infrared video to a control terminal.
Optionally, the method for processing remote sensing data further includes:
receiving a planned route of the unmanned aerial vehicle before the unmanned aerial vehicle flies;
and receiving lens exposure information when the unmanned aerial vehicle reaches the preset shooting position.
The present invention also provides a remote sensing data processing apparatus, comprising:
the first information transmission device is used for receiving optical remote sensing data and thermal infrared remote sensing data transmitted by the unmanned aerial vehicle end; and
and the fusion device is used for fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fusion data.
Optionally, the fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fused data includes:
fusing the low-frequency part of the optical remote sensing data and the thermal infrared remote sensing data after NSST transformation;
fusing the high-frequency part of the optical remote sensing data and the thermal infrared remote sensing data after NSST transformation; and
and fusing the high-frequency part and the low-frequency part of the optical remote sensing data and the thermal infrared remote sensing data in a one-to-one correspondence mode according to corresponding rules, and performing NSST inverse transformation on the fused high-frequency part and the fused low-frequency part to obtain a final fusion result.
Optionally, the fusion rule for the low-frequency part is a gray-level-jump weighted average of the form:

C_F^L(m,n) = α·C_I^L(m,n) + β·C_V^L(m,n)

wherein α and β denote the weight coefficients of the low-frequency part; (m,n) denotes the m-th row and n-th column; I denotes the thermal infrared remote sensing data; V denotes the optical remote sensing data; K₀ denotes the neighbourhood of the pixel K over which the weights are computed; and C_F^L(m,n) denotes the fused low-frequency component of the NSST-transformed thermal infrared and optical remote sensing data at the m-th row and n-th column. The weighted-average part reflects the main low-frequency energy of the images, while the weights, computed over K₀, reflect the detail differences between the images;
the fusion rule for the high-frequency part includes: when much residual background information remains in the high-frequency parts of the two source images (H(m,n) ≥ T), a weighted average is used:

C_F^{j,k}(m,n) = ω₁·C_I^{j,k}(m,n) + ω₂·C_V^{j,k}(m,n)

wherein ω₁ and ω₂ are the weight coefficients of the high-frequency part; j denotes the number of image decomposition levels; k denotes the image translation amount; C_I^{j,k}(m,n) and C_V^{j,k}(m,n) denote, at the m-th row and n-th column, the residual background information of the high-frequency parts of the NSST-transformed thermal infrared and optical remote sensing data respectively; and C_F^{j,k}(m,n) denotes the fused high-frequency component of the NSST-transformed thermal infrared and optical remote sensing data at the m-th row and n-th column. The weight coefficients of the high-frequency part satisfy:

ω₂ = 1 − ω₁

wherein H(m,n) is the region texture smoothness and T is the region texture smoothness threshold;
in the case of H(m,n) < T, the high-frequency parts of the optical remote sensing data and the thermal infrared remote sensing data are fused by selecting, at each position, the coefficient with the larger sum of Laplacian energies:

C_F^{j,k}(m,n) = C_I^{j,k}(m,n) if SML_I^{j,k}(m,n) ≥ SML_V^{j,k}(m,n), otherwise C_V^{j,k}(m,n)

wherein SML_{j,k} denotes the sum of Laplacian energies, C_I^{j,k}(m,n) denotes the NSST-transformed high-frequency component of the thermal infrared remote sensing data at the m-th row and n-th column, and C_V^{j,k}(m,n) denotes the NSST-transformed high-frequency component of the optical remote sensing data at the m-th row and n-th column.
Optionally, the remote sensing data processing device further includes:
and the correcting device is used for carrying out lens distortion correction on the shooting lens on the unmanned aerial vehicle end before shooting.
The invention also provides a remote sensing data processing control system based on the unmanned aerial vehicle, which comprises:
remote sensing data acquisition equipment based on an unmanned aerial vehicle; and
the remote sensing data processing device;
wherein the remote sensing data acquisition device includes:
the optical camera is arranged on the unmanned aerial vehicle and used for acquiring optical remote sensing data of a ground target at a preset shooting position on a planned route of the unmanned aerial vehicle;
the thermal infrared imager is arranged on the unmanned aerial vehicle and used for acquiring thermal infrared remote sensing data of a ground target at a preset shooting position on a planned route of the unmanned aerial vehicle;
and the second information transmission device is arranged on the unmanned aerial vehicle and used for transmitting the acquired optical remote sensing data and the acquired thermal infrared remote sensing data to remote sensing data processing equipment.
Optionally, the remote sensing data obtaining device further includes:
the first information transmission device is arranged on the unmanned aerial vehicle and used for executing the following operations:
recording remote sensing data obtained by an optical camera and a thermal infrared imager in real time in the flight process of the unmanned aerial vehicle;
and recording the thermal infrared video data shot by the thermal infrared imager in real time, and simultaneously transmitting the thermal infrared video data to the remote sensing data acquisition equipment based on the unmanned aerial vehicle in real time.
In another aspect, the present invention provides a machine-readable storage medium having stored thereon instructions for causing a machine to execute the method for processing remote sensing data described above.
Through the above technical scheme, optical remote sensing data and thermal infrared remote sensing data transmitted by the unmanned aerial vehicle end are received and fused to obtain optical thermal infrared fused data. The portable unmanned aerial vehicle carries high-resolution thermal infrared equipment, so the acquired thermal infrared data has high spatial resolution; a professional optical camera is carried, and optical data acquisition of the archaeological target is completed synchronously. The two types of data are combined through information fusion, realizing the advantages of both optical and thermal infrared data: the spatial resolution of the data is improved, and the internal detail characteristics of the data are expressed. This solves the need in remote sensing archaeology for rapid, mobile, low-cost acquisition of thermal infrared data with high spatial resolution and abundant texture information.
Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the embodiments of the invention without limiting the embodiments of the invention. In the drawings:
FIG. 1 is a basic flow chart of a method for processing remote sensing data according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for processing remote sensing data according to an embodiment of the present invention;
FIG. 3 is a flow chart of a method for obtaining remote sensing data according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a camera imaging model provided by an embodiment of the invention;
FIG. 5 is a flowchart of a lens distortion correction technique provided by an embodiment of the present invention;
fig. 6 is a schematic diagram of a black-and-white grid nickel-aluminum-plated target block provided by an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a remote sensing data processing device according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a data fusion process provided by an embodiment of the invention;
FIG. 9 is a schematic structural diagram of a remote sensing data processing control system according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of operation of a long-wave thermal infrared imager provided by an embodiment of the invention;
fig. 11 is a schematic diagram of NSST transformation flow provided by the embodiment of the present invention.
Description of the reference numerals
40 remote sensing data processing device
50 remote sensing data acquisition device
401 first information transmission device
402 fusion device
501 optical camera
502 thermal infrared camera
503 second information transmission device
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating embodiments of the invention, are given by way of illustration and explanation only, not limitation.
Fig. 1 shows a basic flowchart of a method for processing remote sensing data according to an embodiment of the present invention, and as shown in fig. 1, the method for processing remote sensing data includes:
receiving optical remote sensing data and thermal infrared remote sensing data transmitted by an unmanned aerial vehicle end; and
and fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fusion data.
According to user requirements, the unmanned aerial vehicle flies to a set position and shoots a set target through the optical camera and the thermal infrared camera, providing corresponding optical remote sensing data and thermal infrared remote sensing data; optical and thermal infrared data acquisition of the set target is completed synchronously. The acquired optical remote sensing data and thermal infrared remote sensing data are fused to obtain optical thermal infrared fusion data, combining the advantages of optical data and thermal infrared data: the spatial resolution of the data is improved, and the internal detail characteristics of the data are expressed. This can solve the need for texture-rich, high-spatial-resolution data in work such as archaeology.
In the invention, the optical lens and the thermal infrared lens are rigidly connected and mutually fixed, which benefits image registration: because the pixels of the two data types are related by a fixed mathematical relation, the thermal infrared data and the optical data are registered pixel to pixel using the geometric relation between pixels, with the optical image as the reference, rather than by the traditional registration method that uses texture information and the like.
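As a sketch of this fixed-relation registration, suppose the rigid mounting has been calibrated once into an affine relation p_thermal = A·p_optical + t; the matrix A and offset t below are placeholders that a real calibration would supply. Each thermal pixel can then be resampled onto the optical grid without any texture matching:

```python
import numpy as np

def register_thermal_to_optical(thermal, A, t, optical_shape):
    """Resample the thermal image onto the optical pixel grid using the
    fixed affine relation p_thermal = A @ p_optical + t (nearest neighbour).
    Pixels that map outside the thermal frame are left at zero."""
    H, W = optical_shape
    out = np.zeros((H, W), dtype=thermal.dtype)
    ys, xs = np.mgrid[0:H, 0:W]
    p_opt = np.stack([xs.ravel(), ys.ravel()]).astype(float)  # (2, H*W), (x, y)
    p_th = np.rint(A @ p_opt + t[:, None]).astype(int)        # thermal coords
    valid = ((p_th[0] >= 0) & (p_th[0] < thermal.shape[1]) &
             (p_th[1] >= 0) & (p_th[1] < thermal.shape[0]))
    out.ravel()[valid] = thermal[p_th[1, valid], p_th[0, valid]]
    return out

# With an identity calibration the optical and thermal grids coincide.
thermal = np.arange(12.0).reshape(3, 4)
aligned = register_thermal_to_optical(thermal, np.eye(2), np.zeros(2), (3, 4))
```

In practice A and t come from a one-time calibration of the rigid mount, which is what makes this cheaper and more robust than texture-based registration of low-detail thermal imagery.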
Because the flying height of the unmanned aerial vehicle is limited, distortion at the edges of a photo remains large even after camera lens distortion correction; if fused directly, the fusion effect at the edge portions is poor. Before fusion, the data is therefore cropped according to the flight-path overlap design parameters (generally about 10% of the data is cut, while ensuring the photos retain enough overlap for subsequent image mosaicking), and the cropped data is used for fusion.
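A minimal sketch of this pre-fusion edge cropping, here interpreted as removing about 10% of each dimension split evenly between the two sides (the exact split is an assumption):

```python
import numpy as np

def crop_border(image, fraction=0.10):
    """Discard the distortion-prone border: remove `fraction` of each
    dimension, half from each side, before fusion."""
    h, w = image.shape[:2]
    dy, dx = int(h * fraction / 2), int(w * fraction / 2)
    return image[dy:h - dy, dx:w - dx]

photo = np.zeros((1000, 1500))
cropped = crop_border(photo)  # central 90% in each dimension
```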
For the fusion of the optical remote sensing data and the thermal infrared remote sensing data, the NSST (Non-Subsampled Shearlet Transform) may be used. The method performs multi-level, multi-directional decomposition to obtain low-frequency sub-band coefficients (components) and band-pass directional sub-band coefficients (components).
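No standard Python library ships NSST, so the following stand-in uses an undecimated two-band split (box-blur low-pass plus residual) purely to illustrate the decompose, fuse-per-band, inverse-transform pipeline the method relies on; it is not a shearlet transform, and the window size is arbitrary.

```python
import numpy as np

def box_blur(img, win=5):
    """Edge-padded box low-pass filter (undecimated, so sizes are preserved)."""
    r = win // 2
    p = np.pad(img, r, mode='edge')
    h, w = img.shape
    acc = np.zeros((h, w))
    for i in range(win):
        for j in range(win):
            acc += p[i:i + h, j:j + w]
    return acc / (win * win)

def decompose(img, win=5):
    """Two-band split: low-frequency approximation + high-frequency residual."""
    low = box_blur(img, win)
    return low, img - low

def reconstruct(low, high):
    """Exact inverse of decompose."""
    return low + high

img = np.random.default_rng(0).random((16, 16))
low, high = decompose(img)
```

Like NSST, this split is shift-invariant (no subsampling) and perfectly invertible, which is what allows the low- and high-frequency parts to be fused with separate rules and then inverse-transformed.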
Fig. 2 shows a flowchart of a method for processing remote sensing data according to an embodiment of the present invention. As shown in Fig. 2, the low-frequency part is fused based on a weighted-average fusion rule driven by gray-level jump, and the high-frequency part is fused based on a fusion rule combining region texture smoothness and Laplacian energy. The fused image generated by this fusion algorithm has more and clearer details and a shorter execution time.
The key to the NSST fusion method is the determination of the fusion rule:
a) The low-frequency image fusion rule is a gray-level-jump weighted average of the form:

C_F^L(m,n) = α·C_I^L(m,n) + β·C_V^L(m,n)

wherein α and β denote the weight coefficients of the low-frequency part; (m,n) denotes the m-th row and n-th column; I denotes the thermal infrared remote sensing data; V denotes the optical remote sensing data; K₀ denotes the neighbourhood of the pixel K over which the weights are computed; and C_F^L(m,n) denotes the fused low-frequency component of the NSST-transformed thermal infrared and optical remote sensing data at the m-th row and n-th column. The weighted-average part reflects the main low-frequency energy of the images, while the weights, computed over K₀, reflect the detail differences between the images.
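A hedged sketch of the low-frequency rule: the exact expression for the weights α and β is not reproduced in the text above, so this version derives them pointwise from local coefficient energy, which preserves the weighted-average form but is only a stand-in for the patent's gray-level-jump weights.

```python
import numpy as np

def fuse_low_frequency(c_ir, c_vis, eps=1e-12):
    """Weighted average of low-frequency coefficients; alpha/beta are
    energy-based stand-ins for the patent's gray-level-jump weights."""
    alpha = c_ir ** 2 / (c_ir ** 2 + c_vis ** 2 + eps)
    beta = 1.0 - alpha
    return alpha * c_ir + beta * c_vis

# The stronger coefficient dominates at each position.
c_ir = np.array([[2.0, 0.0]])
c_vis = np.array([[0.0, 2.0]])
fused = fuse_low_frequency(c_ir, c_vis)
```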
b) High-frequency image fusion rule:
When much residual background information remains in the high-frequency parts of the two source images (H(m,n) ≥ T), the fusion rule is a weighted average:

C_F^{j,k}(m,n) = ω₁·C_I^{j,k}(m,n) + ω₂·C_V^{j,k}(m,n)

wherein ω₁ and ω₂ are the weight coefficients of the high-frequency part; j denotes the number of image decomposition levels; k denotes the image translation amount; C_I^{j,k}(m,n) and C_V^{j,k}(m,n) denote, at the m-th row and n-th column, the residual background information of the high-frequency parts of the NSST-transformed thermal infrared and optical remote sensing data respectively; and C_F^{j,k}(m,n) denotes the fused high-frequency component of the NSST-transformed thermal infrared and optical remote sensing data at the m-th row and n-th column. The weight coefficients of the high-frequency part satisfy:

ω₂ = 1 − ω₁

wherein H(m,n) is the region texture smoothness and T is the region texture smoothness threshold.
In the case of H(m,n) < T, the high-frequency parts of the optical remote sensing data and the thermal infrared remote sensing data are fused by selecting, at each position, the coefficient with the larger sum of Laplacian energies:

C_F^{j,k}(m,n) = C_I^{j,k}(m,n) if SML_I^{j,k}(m,n) ≥ SML_V^{j,k}(m,n), otherwise C_V^{j,k}(m,n)

wherein SML_{j,k} denotes the sum of Laplacian energies, C_I^{j,k}(m,n) denotes the NSST-transformed high-frequency component of the thermal infrared remote sensing data at the m-th row and n-th column, and C_V^{j,k}(m,n) denotes the NSST-transformed high-frequency component of the optical remote sensing data at the m-th row and n-th column.
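The two-branch high-frequency rule can be sketched as follows. The modified-Laplacian and SML definitions follow the common literature form; the texture-smoothness measure H is an assumption (here the inverse of summed local Laplacian activity), since its exact formula is not reproduced above, and the threshold T and weight ω₁ are arbitrary illustrative values.

```python
import numpy as np

def modified_laplacian(c):
    """|2c - left - right| + |2c - up - down| with edge padding."""
    p = np.pad(c, 1, mode='edge')
    return (np.abs(2 * c - p[:-2, 1:-1] - p[2:, 1:-1]) +
            np.abs(2 * c - p[1:-1, :-2] - p[1:-1, 2:]))

def box_sum3(a):
    """Sum over each pixel's 3x3 neighbourhood (zero padding)."""
    p = np.pad(a, 1, mode='constant')
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def fuse_high_frequency(c_ir, c_vis, T=0.5, w1=0.6):
    """Weighted average where the region is smooth (H >= T), otherwise pick
    the coefficient with the larger sum-modified-Laplacian (SML)."""
    sml_ir = box_sum3(modified_laplacian(c_ir))
    sml_vis = box_sum3(modified_laplacian(c_vis))
    H = 1.0 / (1.0 + sml_ir + sml_vis)          # assumed smoothness measure
    averaged = w1 * c_ir + (1.0 - w1) * c_vis   # omega2 = 1 - omega1
    selected = np.where(sml_ir >= sml_vis, c_ir, c_vis)
    return np.where(H >= T, averaged, selected)

# Smooth inputs take the averaged branch; a sharp feature is selected intact.
fused_flat = fuse_high_frequency(np.ones((4, 4)), np.zeros((4, 4)))
imp_ir = np.zeros((4, 4)); imp_ir[2, 2] = 10.0
fused_imp = fuse_high_frequency(imp_ir, np.zeros((4, 4)))
```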
Acquisition of optical remote sensing data and thermal infrared remote sensing data
Based on the unmanned aerial vehicle, optical camera, thermal infrared camera and information transmission device, a basic unmanned-aerial-vehicle-based high-spatial-resolution thermal infrared data acquisition device can be constructed. According to user requirements, the unmanned aerial vehicle can be controlled to fly to a set place and shoot a set target, thereby obtaining optical remote sensing data and thermal infrared remote sensing data of the set target.
Before the unmanned aerial vehicle goes to a set place to shoot, the following operations are executed:
1) device assembly and power-up debugging
Power up the UAV-side equipment and confirm that each component works normally, including: the UAV system, the optical camera, the thermal infrared camera, the information transmission device (data transmission system) and the storage device. Each system is powered by the UAV; if required, the data transmission system may instead use its own dedicated battery supply.
The power-up debugging sequence is as follows: a) turn on the system main power; b) power up and test each UAV system, and once it tests normal, c) power up the optical camera; d) test camera exposure and data storage; e) power up the thermal infrared system; f) test thermal infrared exposure (photographing mode) and data storage, recording the number of exposures; g) test thermal infrared exposure (video mode) and data storage, recording the shooting duration; h) test the data transmission system via the data display on the ground monitoring screen; i) take out the data storage cards (digital camera and thermal infrared system data storage module) and confirm the data records, including: whether data were recorded, whether the data quantity is correct, whether the lens was in focus, and whether the data have other quality problems; j) if the data are normal, the device enters the standby stage; otherwise, the faults of each part are checked one by one.
2) Flight plan planning
The method mainly comprises the steps of determining the type of acquired data (image data, video data and the like), the data coverage, the data acquisition time, the minimum spatial resolution of the data, the data acquisition mode and the like according to the requirements of an archaeological scheme. The planning scheme is the main basis of the design of the air route.
3) Flight zone determination and route planning
Determine the flight area, the flight time and the number of sorties according to the archaeological flight plan. At the same time, determine the maximum flight-line spacing and the exposure interval (distance) from the ground footprint widths of the optical camera and the thermal infrared system and from the required side and forward overlap.
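The overlap geometry above can be sketched as follows; this is an illustrative computation only, and the footprint sizes and overlap fractions in the example are assumed values, not figures from the patent:

```python
def flight_line_spacing(footprint_width_m: float, side_overlap: float) -> float:
    """Spacing between adjacent flight lines for a given side-overlap fraction."""
    return footprint_width_m * (1.0 - side_overlap)

def exposure_interval(footprint_length_m: float, forward_overlap: float) -> float:
    """Along-track distance between successive exposures for a given forward overlap."""
    return footprint_length_m * (1.0 - forward_overlap)

# Hypothetical footprints: 60 m across track, 40 m along track,
# with 30 % side overlap and 70 % forward overlap.
spacing = flight_line_spacing(60.0, 0.30)   # maximum distance between flight lines
interval = exposure_interval(40.0, 0.70)    # distance between exposure points
```

The same two numbers fix the route plan: line spacing bounds the distance between adjacent legs, and the exposure interval, divided by ground speed, gives the shutter cadence.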
Solar heating strongly affects the thermal radiation received from the ground surface. To avoid or reduce this influence as much as possible, the flight time must be chosen specially, preferably around sunrise or sunset; for example, if there is no special requirement (or no requirement at all) for the optical data, flying around midnight is more suitable.
According to the archaeological needs, select the observation mode of the thermal infrared device: photographing (still) mode or photography (video) mode.
Whether the optical camera operates is decided by the aerial photographing time; where lighting permits, it can remain on throughout the flight to observe the ground.
4) Lens distortion calibration
Remote sensing archaeology places only limited accuracy requirements on flight data, but because the flight height is limited, image distortion at the lens edges is large. Lens distortion correction is therefore needed for proper fusion of the flight data and related processing, and in principle lens calibration should be completed before each flight to obtain the current lens distortion.
The basic principle of lens distortion calibration is as follows: a collinearity equation containing the camera distortion parameters (the unknowns to be solved) is established from the accurate (relative) positions of target points and the coordinates of the corresponding image points.
Fig. 4 shows a schematic view of the camera imaging model according to an embodiment of the present invention. As shown in Fig. 4, let the plane rectangular coordinates of a control point's corresponding image point be (x, y), the ground coordinates (position on the target) of the control point be (X, Y, Z), the focal length of the digital camera to be determined be f, the photographing centre coordinates be (Xs, Ys, Zs), the principal point in two-dimensional image coordinates be (x0, y0), and the image point error caused by camera distortion be (Δx, Δy). From these parameters the following collinearity equation can be established:
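The displayed equation (1) did not survive extraction. The standard photogrammetric collinearity equations, with the distortion terms (Δx, Δy) and the direction cosines ai, bi, ci described in the surrounding text, take the following form (a reconstruction consistent with that text, not the patent's verbatim formula):

```latex
x - x_0 + \Delta x = -f\,\frac{a_1(X - X_s) + b_1(Y - Y_s) + c_1(Z - Z_s)}
                              {a_3(X - X_s) + b_3(Y - Y_s) + c_3(Z - Z_s)},
\qquad
y - y_0 + \Delta y = -f\,\frac{a_2(X - X_s) + b_2(Y - Y_s) + c_2(Z - Z_s)}
                              {a_3(X - X_s) + b_3(Y - Y_s) + c_3(Z - Z_s)} \tag{1}
```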
In the formula, ai, bi, ci (i = 1, 2, 3) are the 9 direction cosines formed from the 3 exterior orientation angle elements φ, ω, κ of the image.
Fig. 5 shows a flowchart of the lens distortion correction technique provided by an embodiment of the present invention. Specifically, the lens distortion data comprise radial distortion (K1-K3), tangential distortion (P1, P2) and thin-prism distortion (s1, s2). After linearization, the coordinates of the control-point pairs (each pair: photo coordinates and the coordinates of the corresponding position on the target) are substituted into equation (1) to compute the lens distortion parameters, from which each individual distortion correction is derived as follows:
radial distortion:
Δxr = (x - x0)(K1·r² + K2·r⁴ + K3·r⁶ + O[r⁸])
Δyr = (y - y0)(K1·r² + K2·r⁴ + K3·r⁶ + O[r⁸])    (2)
In the formula, K1, K2, …, Kn denote the radial distortion coefficients; for a common lens the series is generally truncated at the K3 term. r denotes the radial distance, with r² = x² + y² (the same below).
Tangential distortion:
Δxd = P1(r² + 2(x - x0)²) + 2P2(x - x0)(y - y0) + O[(x - x0, y - y0)⁴]
Δyd = P2(r² + 2(y - y0)²) + 2P1(x - x0)(y - y0) + O[(x - x0, y - y0)⁴]    (3)
For a common lens, the tangential distortion series is generally truncated at the P2 term, i.e. two terms are used.
Prism distortion:
Δxp = s1((x - x0)² + (y - y0)²) + O[(x - x0, y - y0)⁴]
Δyp = s2((x - x0)² + (y - y0)²) + O[(x - x0, y - y0)⁴]    (4)
For a common lens, the thin-prism distortion is generally truncated at the first-order terms s1 and s2.
Using a large number of control points, the distortion parameters to be solved (K1, K2, K3, P1, P2, s1, s2) can be obtained by least-squares solution. The position coordinates (x', y') after lens distortion correction are then:
x' = x + Δxr + Δxd + Δxp    (5)
y' = y + Δyr + Δyd + Δyp    (6)
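Equations (2)-(6) can be applied per point as in the following sketch; the function and parameter names are illustrative, and the radius is taken about the principal point, matching the (x - x0, y - y0) terms of the formulas:

```python
def correct_point(x, y, x0, y0, K, P, S):
    """Apply the radial (K1..K3), tangential (P1, P2) and thin-prism (s1, s2)
    corrections of equations (2)-(4) and return the corrected point (x', y')
    of equations (5)-(6). Coordinates are offset from the principal point
    (x0, y0) before computing the radius."""
    dx, dy = x - x0, y - y0
    r2 = dx * dx + dy * dy
    radial = K[0] * r2 + K[1] * r2**2 + K[2] * r2**3
    dxr, dyr = dx * radial, dy * radial                    # equation (2)
    dxd = P[0] * (r2 + 2 * dx * dx) + 2 * P[1] * dx * dy   # equation (3)
    dyd = P[1] * (r2 + 2 * dy * dy) + 2 * P[0] * dx * dy
    dxp, dyp = S[0] * r2, S[1] * r2                        # equation (4)
    return x + dxr + dxd + dxp, y + dyr + dyd + dyp        # equations (5)-(6)
```

With all coefficients zero the point is returned unchanged, which makes the sketch easy to sanity-check against the formulas.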
Provided the lens is not disturbed after installation (no violent vibration, no re-mounting and the like) and the operating environment does not change drastically, the lens need not be re-calibrated within a certain accuracy range, and the distortion parameters can be reused.
The lens distortion correction workflow (see the flowchart in Fig. 5) is implemented as a program in the MATLAB development-language environment.
Fig. 6 shows a schematic diagram of the black-and-white grid nickel-plated aluminium block target according to an embodiment of the present invention. As shown in Fig. 6:
a) When no lens distortion parameters exist, a rapid lens distortion correction is performed first. A black-and-white grid nickel-plated aluminium block target (suitable for both the optical and the thermal infrared equipment) is allowed to reach thermal equilibrium under sunlight; the rigidly connected data acquisition equipment then exposes the target with both cameras simultaneously, acquiring a large amount of coordinate-pair data. A collinearity equation is established, the lens distortion correction parameters are solved by least squares, and the lens calibration is completed.
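A minimal least-squares solve of the radial coefficients alone, from matched coordinate pairs, might look like the sketch below (NumPy-based and radial-only for brevity; the full calibration would also estimate P1, P2, s1, s2 and the collinearity unknowns):

```python
import numpy as np

def solve_radial_k(ideal_xy, observed_xy, order=3):
    """Least-squares estimate of the radial coefficients K1..K_order from
    matched (ideal, observed) principal-point-centred coordinates, using the
    radial model of equation (2): delta = coord * (K1*r^2 + K2*r^4 + ...)."""
    ideal_xy = np.asarray(ideal_xy, dtype=float)
    observed_xy = np.asarray(observed_xy, dtype=float)
    r2 = np.sum(ideal_xy ** 2, axis=1)
    powers = np.stack([r2 ** (i + 1) for i in range(order)], axis=1)
    # One design-matrix row per x-residual and per y-residual.
    A = np.vstack([ideal_xy[:, 0:1] * powers, ideal_xy[:, 1:2] * powers])
    b = np.concatenate([observed_xy[:, 0] - ideal_xy[:, 0],
                        observed_xy[:, 1] - ideal_xy[:, 1]])
    k, *_ = np.linalg.lstsq(A, b, rcond=None)
    return k
```

Because the radial model is linear in K1..K3, a plain `lstsq` suffices here; the tangential and prism terms are equally linear, so extending the design matrix is mechanical.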
b) When lens distortion parameters already exist, the image data are read one by one in a while loop (using MATLAB's imread command) and processed according to equations (2) to (6), completing the lens distortion correction of the images.
According to user requirements, the UAV side obtains the planned flight route before take-off, and the UAV flies along the planned route under a ground control station. After arriving at the preset shooting location, it receives lens exposure commands from the aircraft flight control system and exposes at the set exposure points. It is then determined whether the camera on the UAV side is in video mode: when the thermal infrared camera is in video mode, the captured video is transmitted to the ground control station and played in real time on its display screen; when the thermal infrared camera is in still (photographing) mode, the captured optical data and thermal infrared data are sent to the ground control station and stored.
After the UAV completes the shooting task, the rapid lens distortion correction is performed again to determine whether the lens distortion in the current state deviates significantly from the pre-flight rapid correction result. Specifically, the two rapid correction results are compared, and the optical data and thermal infrared data captured during the flight are used only if the difference between the two lies within a set range.
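The pre-/post-flight comparison could be as simple as the following sketch; the parameter ordering and the tolerance values are assumptions, not figures from the patent:

```python
def calibration_drifted(params_before, params_after, tolerances):
    """Return True if any distortion parameter differs between the pre-flight
    and post-flight rapid calibrations by more than its tolerance, i.e. the
    flight's data should not be used without re-processing."""
    return any(abs(a - b) > t
               for a, b, t in zip(params_before, params_after, tolerances))

# Hypothetical parameter vectors (K1, K2) and per-parameter tolerances:
ok = not calibration_drifted([1.0e-2, -1.0e-4], [1.02e-2, -1.0e-4], [1e-3, 1e-5])
```

A per-parameter tolerance is used because the coefficients differ by orders of magnitude; a single global threshold would mask drift in the small-valued terms.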
Fig. 7 shows a schematic structural diagram of a remote sensing data processing device according to an embodiment of the present invention. As shown in Fig. 7, the remote sensing data processing device 40 may include a first information transmission device 401 and a fusion device 402. Specifically, the first information transmission device 401 is configured to receive the optical remote sensing data and thermal infrared remote sensing data transmitted by the unmanned aerial vehicle, and the fusion device 402 is configured to fuse the received data into optical-thermal-infrared fusion data. The fusion combines the respective advantages of the optical and thermal infrared data, improving the spatial resolution of the data and expressing its internal detail features, and thereby addresses the need for high spatial resolution and rich texture information in work such as archaeology.
Data fusion
Fig. 8 shows the data fusion flowchart. As shown in Fig. 8, the optical lens and the thermal infrared lens are rigidly connected in the invention, so their relative exterior geometry is fixed, which facilitates image registration. Taking the optical image as the reference, the thermal infrared data are registered to the optical data pixel-to-pixel through the fixed mathematical relation between the pixels of the two data types, rather than by traditional registration methods based on texture information and the like.
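Pixel-to-pixel registration through a fixed geometric relation can be sketched as a pre-calibrated affine mapping; the transform values below are hypothetical, not the device's calibrated parameters:

```python
import numpy as np

def thermal_to_optical(px_thermal, A, t):
    """Map thermal-image pixel coordinates into the optical image frame with a
    fixed affine transform (matrix A, offset t). Because the two cameras are
    rigidly connected, A and t are calibrated once and reused for every frame."""
    return np.asarray(px_thermal, dtype=float) @ np.asarray(A).T + np.asarray(t)

# Hypothetical calibration: optical pixels twice as fine as thermal ones,
# with a (10, 20)-pixel offset between the two image frames.
A = np.array([[2.0, 0.0], [0.0, 2.0]])
t = np.array([10.0, 20.0])
mapped = thermal_to_optical([[0.0, 0.0], [1.0, 1.0]], A, t)
```

This is why the rigid mount matters: once A and t are known, registration is a constant-time coordinate transform with no per-image feature matching.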
Because the flight height of the UAV is limited, distortion at the photo edges remains large even after the camera lens distortion has been corrected; if the images were fused directly, the fusion quality at the edges would be poor. Before fusion, the data are therefore cropped according to the flight-path overlap design parameters (generally about 10% of the data is cut, while keeping enough overlap between photos for subsequent image mosaicking), and the cropped data are used for fusion.
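The edge cropping step can be sketched as follows; whether the roughly 10% figure applies per border or in total is not specified in the text, so the fraction here is simply a parameter:

```python
def crop_margins(image, frac=0.10):
    """Drop `frac` of the rows and columns on each border, discarding the
    strongly distorted photo edges before fusion (`image` is a list of rows)."""
    h, w = len(image), len(image[0])
    dh, dw = int(h * frac), int(w * frac)
    return [row[dw:w - dw] for row in image[dh:h - dh]]

# 10x10 toy image whose pixel value equals its column index.
cropped = crop_margins([[c for c in range(10)] for _ in range(10)], frac=0.10)
```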
For the fusion of the optical remote sensing data and the thermal infrared remote sensing data, the fusion device 402 may use the localized non-subsampled shearlet transform (NSST). The data are decomposed at multiple levels and in multiple directions, yielding the low-frequency sub-band coefficients (components) and the band-pass directional sub-band coefficients (components).
Specifically, the low-frequency part is fused using a weighted-average rule based on gray-level mutation, while the high-frequency part of the optical and thermal infrared remote sensing data is fused using a rule that combines regional texture smoothness with the sum-modified Laplacian energy. The fused image produced by this algorithm contains more, and clearer, detail, and the algorithm executes in less time.
As shown in fig. 11, the NSST fusion method is characterized by the following steps:
a) the low-frequency partial image fusion rule is shown in the following formula:
where α and β are the weight coefficients of the low-frequency part; (m, n) denotes the m-th row and n-th column; I denotes the thermal infrared remote sensing data and V the optical remote sensing data; K0 denotes the domain of K; and the fused value denotes the low-frequency component of the thermal infrared and optical remote sensing data at row m, column n after NSST transformation. The first part of the rule reflects the main low-frequency energy of the images; the second part reflects the detail difference between the images.
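The displayed low-frequency rule was lost in extraction; the sketch below models only its weighted-average ("main low-frequency energy") part, omitting the gray-level-mutation detail term, and the 50/50 default weights are assumptions:

```python
import numpy as np

def fuse_lowpass(low_infrared, low_optical, alpha=0.5, beta=0.5):
    """Weighted average of the thermal infrared (I) and optical (V)
    low-frequency NSST sub-bands. Only the 'main low-frequency energy' term
    of the patent's rule is modelled; the gray-level-mutation detail term is
    omitted because its displayed formula is not recoverable."""
    return alpha * np.asarray(low_infrared) + beta * np.asarray(low_optical)
```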
b) High-frequency partial image fusion rule:
When considerable background information remains in the high-frequency parts of the two source images, the fusion rule is as shown in the following formula:
where ω1 and ω2 are the weight coefficients of the high-frequency part; j denotes the number of decomposition levels of the image and k the amount of image translation; the two weighted terms denote the residual background information of the high-frequency parts of the thermal infrared and of the optical remote sensing data at row m, column n after NSST transformation; and the fused value denotes the high-frequency component of the thermal infrared and optical remote sensing data at row m, column n after NSST transformation. The weight coefficients of the high-frequency part are adjusted by the following formula:
ω2=1-ω1
wherein H (m, n) is the region texture smoothness, and T is the region texture smoothness threshold;
in the case of H (m, n) < T, the high frequency part of the optical remote sensing data and the thermal infrared remote sensing data is fused according to a fusion rule shown in the following formula:
where SMLj,k denotes the sum-modified Laplacian energy, and the two compared quantities are the high-frequency components of the thermal infrared and of the optical remote sensing data at row m, column n after NSST transformation.
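A sketch of sum-modified-Laplacian (SML) choose-max fusion for the high-frequency sub-bands is given below; the 3 × 3 window, unit step and choose-max decision are common choices assumed here, since the patent's displayed formulas are not recoverable:

```python
import numpy as np

def sml(band):
    """Sum-modified-Laplacian focus measure of a high-frequency sub-band:
    per-pixel modified Laplacian (unit step), summed over a 3x3 window.
    The exact window/step of the patent is not recoverable from the text."""
    p = np.pad(band, 1, mode="edge")
    ml = (np.abs(2 * p[1:-1, 1:-1] - p[1:-1, :-2] - p[1:-1, 2:])
          + np.abs(2 * p[1:-1, 1:-1] - p[:-2, 1:-1] - p[2:, 1:-1]))
    s = np.pad(ml, 1, mode="edge")
    return (s[:-2, :-2] + s[:-2, 1:-1] + s[:-2, 2:]
            + s[1:-1, :-2] + s[1:-1, 1:-1] + s[1:-1, 2:]
            + s[2:, :-2] + s[2:, 1:-1] + s[2:, 2:])

def fuse_highpass(hi_infrared, hi_optical):
    """Pick, per pixel, the high-frequency coefficient with the larger SML."""
    return np.where(sml(hi_infrared) >= sml(hi_optical), hi_infrared, hi_optical)
```

The choose-max decision favours whichever source carries more local high-frequency activity at each pixel, which is the intent of an SML-driven rule.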
Fig. 9 shows a remote sensing data processing control system provided by an embodiment of the present invention. As shown in Fig. 9, the system includes a UAV-based remote sensing data acquisition device 50 and the remote sensing data processing device 40. Specifically, the remote sensing data acquisition device 50 includes an optical camera 501, a thermal infrared camera 502 and a second information transmission device 503. The optical camera 501 and the thermal infrared camera 502 are mounted side by side on a rigid plate, with the camera bodies perpendicular to the flight direction and the lenses pointing vertically at the ground. After this rigid connection, the relative positions of the two cameras are fixed and their relative position parameters are known; the relative position relationship of the two cameras can be expressed as a geometric mathematical relation, and when the position of one camera is known, the position of the other can be computed from that relation. The rigid plate carrying the two cameras can be suspended under the belly of the UAV and is connected to the UAV payload platform, preferably with a damping device between the plate and the platform.
With respect to system composition
The remote sensing data processing control system based on the unmanned aerial vehicle can be composed of three parts:
remote sensing data acquisition equipment, remote sensing data processing equipment and unmanned aerial vehicle flight control equipment based on unmanned aerial vehicle.
Any unmanned aerial vehicle that satisfies the payload mounting conditions, can supply power, can trigger the sensor cores to shoot, and has no electromagnetic compatibility problem with the image/data transmission system can serve as the carrying platform.
The endurance of the UAV platform is determined by the user according to mission requirements; requirements are placed only on the UAV power supply equipment, flight control equipment and mounting.
Power supply equipment: not less than 3 V, more than 6 W.
Flight control equipment: able to output flight attitude data (spatial attitude and position data) and to control sensor exposure via flight control signals.
Mounting: and a mounting point is arranged, so that a load system (without the weight of an unmanned plane and an airplane battery) of not less than 2kg can be mounted.
Because the thermal infrared imager core is strongly affected by temperature, a battery-driven UAV should be chosen rather than fuel-driven equipment, to avoid the influence of engine heat on the thermal infrared equipment as well as the influence of fuel-engine vibration.
Unmanned aerial vehicle flight platform performance and reliability are ensured by the unmanned aerial vehicle manufacturer.
On a general-purpose UAV flight platform, flight vibration may prevent the thermal infrared sensor lens from focusing. Users with higher requirements can fit a separate gimbal on the UAV platform to provide vibration damping while ensuring that the sensor lens always images vertically, perpendicular to the horizontal plane.
A FLIR Tau 640 long-wave infrared uncooled thermal imager core submodule with a resolution of 640 × 480 (effective pixels) is adopted; together with a data storage and read-write submodule and a graphic image data transmission submodule (with its follow-on circuitry and control system), it realizes the acquisition, storage and transmission (video data) of target thermal infrared data. Fig. 10 shows the operating schematic of the long-wave thermal infrared imager.
Different focal-length lenses are available for the long-wave infrared uncooled thermal imager core. Considering the practical requirements of remote sensing archaeology (data ground resolution, edge distortion, image mosaicking and the like), a 25 mm lens is used in this device; by calculation, at a UAV flight height of 100 m, one pixel of the long-wave thermal imager corresponds to about 10 cm × 7 cm on the ground, i.e. a spatial resolution better than 10 cm.
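The quoted ground pixel size follows from simple pinhole geometry. In the sketch below the 17 µm detector pitch is an assumption (typical for this detector class but not stated in the patent); it reproduces roughly the finer (7 cm) of the two quoted dimensions:

```python
def ground_sample_distance(pixel_pitch_m, altitude_m, focal_length_m):
    """Ground footprint of one detector pixel for a nadir-pointing camera:
    GSD = pitch * H / f (simple pinhole geometry)."""
    return pixel_pitch_m * altitude_m / focal_length_m

# Assumed 17 um detector pitch (not stated in the patent), 25 mm lens,
# 100 m flight height -> about 0.068 m on the ground per pixel.
gsd = ground_sample_distance(17e-6, 100.0, 0.025)
```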
The basic performance parameters of the FLIR Tau 640 long-wave thermal infrared imager core are shown in Table 1.
TABLE 1
Because the UAV flies at low altitude, and to reduce equipment weight, a card-type (compact) digital camera that satisfies the data acquisition requirements is chosen. Preferably, a digital camera with the following specifications may be selected: 20.10 million effective pixels, 8× optical zoom, 25 mm lens. When shooting vertically, the image size is 1920 × 4912 pixels (standard size 1920 × 3424 pixels); at a flight height of 100 m, the image spatial resolution achieves centimetre-level performance.
The optical camera may use its own independent data storage function.
Regarding the data storage and read-write submodule
An industrial-grade serial-port data recorder can be used as the data storage and read-write submodule; the module is encapsulated and the memory card plugs in directly. Housing size: 90 mm × 80 mm × 24 mm. It has a reverse-polarity protection circuit on the power supply, accepts 5-24 VDC input, is connected to the UAV platform, and is driven by the 9 V DC provided by the UAV platform.
The on-board Micro SD card seat takes a directly inserted memory card (Micro SD) that records and stores the flight data, and the module is connected to the corresponding interface of the FLIR Tau 640 long-wave thermal imager core through a dedicated interface.
The module board uses a high-speed SD card as the data storage device and is linked to the thermal infrared system core and the graphic image data transmission submodule through standard interfaces. The device's Micro SD card is Class 10 (minimum write speed 10 MB/s), with 32 GB capacity and FAT32 format.
The submodule only provides service for the long-wave thermal infrared imager core submodule.
Regarding the graphic image data transmission submodule
The graphic image data transmission submodule can be selected and configured according to actual conditions. Considering the practical requirements (flight range) of UAV thermal infrared archaeology, the flight height is kept low to guarantee the spatial resolution of the acquired data (the relative height is generally not more than 300 meters), so the performance requirements on the module are modest (bounded by acceptable cost). For flight safety, the UAV and its payload must stay within visual range, so the transmission distance need not be large. The data transmission submodule adopted in the invention therefore achieves an open-ground range of about 1100 meters at a rate of 250 K, about 750 meters at 1 M, and about 520 meters at 2 M.
The graphic image data transmission submodule is connected with the long-wave thermal infrared imager core submodule and the data storage and reading-writing submodule through a special interface.
The graphic image data transmission likewise serves only the long-wave thermal infrared imager core submodule.
Relating to pre-flight device commissioning
Based on unmanned aerial vehicle, optical camera, thermal infrared camera and information transmission device can construct basic high spatial resolution ratio thermal infrared data integration acquisition equipment based on unmanned aerial vehicle. The unmanned aerial vehicle can be controlled to fly to a set place according to user requirements to shoot a set target, so that optical remote sensing data and thermal infrared remote sensing data about the set target are obtained.
Before the unmanned aerial vehicle goes to a set place to shoot, the following operations are executed:
1) device assembly and power-up debugging
The method for confirming whether each component works normally by powering up the unmanned aerial vehicle end equipment comprises the following steps: unmanned aerial vehicle system, optical camera, thermal infrared camera, information transmission device (data transmission system), storage device. Each system is supplied power by unmanned aerial vehicle, but if needs data transmission system exclusive use battery power supply.
The power-up debugging sequence is as follows: a) turning on a system main power supply; b) each system of the unmanned aerial vehicle is powered up for testing; after being tested to be normal; c) powering up the optical camera; d) testing camera exposure and data storage; e) powering up the thermal infrared system; f) testing the exposure (photographing mode) of the thermal infrared system and storing data, and recording the exposure times; g) testing the exposure (camera shooting mode) and data storage of the thermal infrared system, and recording the shooting duration; h) the data display test of the ground monitoring screen is carried out by the test data system; i) the data storage card (digital camera, thermal infrared system data memory module) takes out and confirms the data recording condition, including: whether data recording exists or not, whether the data quantity is correct or not, whether a lens is focused or not and whether other quality problems exist in the data or not; j) and if the data is normal, the device enters a standby stage, otherwise, the problems of all parts are checked at one time.
2) Flight plan planning
The method mainly comprises the steps of determining the type of acquired data (image data, video data and the like), the data coverage, the data acquisition time, the minimum spatial resolution of the data, the data acquisition mode and the like according to the requirements of an archaeological scheme. The planning scheme is the main basis of the design of the air route.
3) Flight zone determination and route planning
And determining the range of a flight area, the flight time and the number of the aerial flyers according to the archaeological flight scheme. And meanwhile, determining the maximum course distance and the exposure time (distance) according to the ground width of the photo of the optical camera and the thermal infrared system and the required lateral and heading overlapping degrees.
The thermal infrared device is large in heat radiation influence on the receiving surface of the ground, in order to avoid or reduce the radiation influence as much as possible, the flight time selection needs to be specially set, and the selection is carried out in the time before and after the day as much as possible, for example, no special requirement or no requirement is provided for optical data, and the flying around midnight is more suitable.
Aiming at archaeological needs, selecting an observation mode of a thermal infrared device: a photographing mode or a photography mode.
The optical camera determines whether to work according to the aerial photographing time, and can be always on in flight to observe the ground under the condition of light permission.
4) Lens distortion calibration
The precision requirement of remote sensing archaeology on flight data is limited, but the precision requirement is limited by flight height, the distortion of a lens edge image is large, lens distortion correction is needed for better fusion of flight data and the like, and in principle, lens calibration should be completed before each flight to obtain the distortion of the lens.
The basic principle for lens distortion calibration is as follows: and establishing a collinear equation containing camera distortion parameters (unknowns) by using the accurate position (relative position) of the target point and the coordinates of the corresponding image point.
Fig. 4 shows a schematic view of a camera imaging model according to an embodiment of the present invention, and as shown in fig. 4, it is assumed that the plane rectangular coordinates of the control point in the image corresponding to the image point are (X, Y), the ground (position on the target) coordinates of the control point are (X, Y, Z), the focal length of the digital camera to be obtained is f, and the shooting center coordinate is (X, Y, Z)s,Ys,Zs) Or in two-dimensional coordinates (x)0,y0) If the image point error caused by camera distortion is (Δ x, Δ y), then the following collinearity equation can be established according to the above parameters:
in the formula, ai、bi、ci(i ═ 1, 2, 3) is the 9 orientation cosines consisting of the 3 external orientation angle elements phi, omega, kappa of the image.
Fig. 5 shows a flowchart of a lens distortion correction technique provided by an embodiment of the present invention, specifically, the lens distortion data includes radial distortion (K), tangential distortion (P1, P2), and thin prism distortion (S1, S2). After linearization, the formula (1) substitutes the coordinates of the control point pair (coordinate pair: photo coordinate and coordinate of corresponding position on the target) to calculate the lens distortion parameter, and then deduces each single distortion correction number as follows:
radial distortion:
Δxr=(x-x0)(K1·r2+K2·r4+K3·r6+O[r8])
Δyr=(y-y0)(K1·r2+K2·r4+K3·r6+O[r8]) (2)
in the formula, K1、K2、…、KnRepresenting the radial distortion coefficient, which is generally taken to be K for a common lens3An item. r represents radial direction, r2=x2+y2(the same applies below).
Tangential distortion:
Δxd=P1(r2+2·(x-x0)2)+2·P2(x-x0)·(y-y0)+O[(x-x0,y-y0)4]
Δyd=P2(r2+2·(y-y0)2)+2·P1(x-x0)·(y-y0)+O[(x-x0,y-y0)4] (3)
for ordinary lens, the tangential distortion coefficient is generally taken to be P2Two steps are taken.
Prism distortion:
Δxp=s1((x-x0)2+(y-y0)2)+O[(x-x0,y-y0)4]
Δyp=s2((x-x0)2+(y-y0)2)+O[(x-x0,y-y0)4] (4)
the prism distortion coefficient is generally taken to be s to the first order for a common lens.
By utilizing a large number of control points, the distortion parameter (K) to be solved can be obtained by solving through a least square method1,K2,K3,P1,P2,s1,s2). Further, the position coordinates (x ', y') after the lens distortion correction is completed are obtained as:
x'=x+Δxr+Δxd+Δxp (5)
y'=y+Δyr+Δyd+Δyp (6)
the lens is not changed after being installed, such as violent vibration, re-assembly and disassembly and the like, and can not be re-calibrated within a certain precision range on the premise that the use environment is not changed violently, and distortion parameters can be reused.
Lens distortion correction technique flow chart
The process is realized by a program in a matlab program development language environment.
Fig. 6 shows a schematic diagram of a target of a nickel-aluminum plated block on a black-and-white grid according to an embodiment of the present invention, as shown in fig. 6:
a) when there is no lens distortion parameter, firstly, the lens distortion is corrected quickly. The method comprises the steps of enabling a black-and-white grid nickel-plated aluminum block target (applicable to optical and thermal infrared equipment at the same time) to reach a temperature balance state under the irradiation of sunlight, using data acquisition equipment with good rigid connection to simultaneously expose and photograph the target, acquiring a large amount of coordinate pair data, establishing a collinear equation, solving a lens distortion correction parameter through a least square method, and completing lens calibration.
b) When the lens distortion parameters exist, the image data are read one by using an immediate command through a while loop program, and the operation is carried out according to the formulas (2) to (6), so that the distortion correction of the image lens is completed.
According to user requirements, the unmanned aerial vehicle end obtains a planned air route of the unmanned aerial vehicle before flying, and the unmanned aerial vehicle flies according to the planned air route based on a storefront control station. And after the aircraft arrives at a preset shooting place, receiving the exposure information of the lens of the aircraft flight control system, and exposing according to a set exposure point. And determining whether the camera at the unmanned aerial vehicle end is in a camera shooting mode (video shooting mode), wherein under the condition that the thermal infrared camera is in the camera shooting mode, the shot video is transmitted to the ground control end and is played in real time through a display screen of the ground control end. And under the condition that the thermal infrared camera is in a shooting mode (photographing mode), sending the shot optical data and thermal infrared data to the ground control terminal and storing the optical data and the thermal infrared data.
After the unmanned aerial vehicle completes the shooting task, rapid lens distortion correction is performed again to determine whether the lens distortion in the current state deviates significantly from the pre-flight rapid correction result. Specifically, the two rapid correction results are compared, and when their difference lies within a set range, the optical data and thermal infrared data captured during the flight can be used.
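The pre-flight/post-flight comparison amounts to a simple tolerance check on the two parameter sets. The tolerance value below is illustrative, since the patent does not specify the "set difference range":

```python
def calibration_stable(params_before, params_after, tol=1e-3):
    """Compare pre-flight and post-flight distortion parameters.

    Returns True when every coefficient differs by less than `tol`,
    i.e. the flight's optical and thermal infrared data may be used.
    The tolerance value is illustrative, not taken from the patent.
    """
    return all(abs(a - b) < tol for a, b in zip(params_before, params_after))
```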
Although the embodiments of the present invention have been described in detail with reference to the accompanying drawings, the embodiments are not limited to the details described above; various simple modifications may be made to the technical solutions within the technical concept of the embodiments of the present invention, and such simple modifications all fall within the protection scope of the embodiments of the present invention.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the invention. In order to avoid unnecessary repetition, the embodiments of the present invention do not describe every possible combination.
Those skilled in the art will understand that all or part of the steps in the method according to the above embodiments may be implemented by a program stored in a storage medium, the program including several instructions enabling a single-chip microcomputer, a chip, or a processor to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In addition, any combination of various different implementation manners of the embodiments of the present invention is also possible, and the embodiments of the present invention should be considered as disclosed in the embodiments of the present invention as long as the combination does not depart from the spirit of the embodiments of the present invention.

Claims (13)

1. A remote sensing data processing method is characterized by comprising the following steps:
receiving optical remote sensing data and thermal infrared remote sensing data transmitted by an unmanned aerial vehicle end; and
and fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fusion data.
2. The method of processing remote sensing data according to claim 1, wherein said fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fused data comprises:
fusing the low-frequency part of the optical remote sensing data and the thermal infrared remote sensing data after NSST transformation;
fusing a high-frequency part which is transformed by NSST in the optical remote sensing data and the thermal infrared remote sensing data; and
and fusing the high-frequency part and the low-frequency part of the optical remote sensing data and the thermal infrared remote sensing data in a one-to-one correspondence mode according to corresponding rules, and performing NSST inverse transformation on the fused high-frequency part and the fused low-frequency part to obtain a final fusion result.
3. The remote sensing data processing method of claim 2,
the fusion rule for the low frequency part is shown in the following formula:
wherein α and β represent the weight coefficients of the low-frequency part, (m, n) denotes the mth row and nth column, I represents the thermal infrared remote sensing data, V represents the optical remote sensing data, K0 represents the neighborhood of K, and the fused coefficient represents the low-frequency component fusion value of the thermal infrared remote sensing data and the optical remote sensing data after NSST transformation at the mth row and nth column, wherein the first term reflects the main low-frequency energy of the images and the second term reflects the detail difference between the images;
the blending rule for the high frequencies includes:
wherein, ω is1,ω2Is a weight coefficient of a high frequency part, j represents the number of image hierarchical layers, k represents the amount of image translation,representing the residual background information of the high-frequency part at the mth row and the nth column of the thermal infrared remote sensing data after NSST conversion,representing residual background information of high-frequency parts of the optical remote sensing data after NSST transformation,and (4) representing a high-frequency component fusion value obtained after NSST transformation of the thermal infrared remote sensing data and the optical remote sensing data at the mth row and the nth column. The weight coefficient of the high-frequency part is adjusted by the following formula:
ω2=1-ω1
wherein H (m, n) is the region texture smoothness, and T is the region texture smoothness threshold;
in the case of H (m, n) < T, the high frequency part of the optical remote sensing data and the thermal infrared remote sensing data is fused according to a fusion rule shown in the following formula:
wherein SMLj,k represents the sum of modified Laplacian energies, the first term represents the high-frequency component of the thermal infrared remote sensing data after NSST transformation at the mth row and nth column, and the second term represents the high-frequency component of the optical remote sensing data after NSST transformation at the mth row and nth column.
4. The method of processing remote sensing data according to claim 1, further comprising:
before shooting, lens distortion correction is performed on a shooting lens on the unmanned aerial vehicle end.
5. The method of processing remote sensing data according to claim 1, further comprising: acquiring the optical remote sensing data and the thermal infrared remote sensing data;
wherein acquiring the optical remote sensing data and the thermal infrared remote sensing data comprises:
acquiring optical remote sensing data of a ground target at a preset shooting position on a planned route of the unmanned aerial vehicle;
acquiring thermal infrared remote sensing data of a ground target at a preset shooting position on a planned route of the unmanned aerial vehicle;
and recording the obtained optical remote sensing data and the obtained thermal infrared remote sensing data in a memory, and transmitting the obtained thermal infrared video to a control terminal.
6. The method of processing remote sensing data according to claim 5, further comprising:
receiving a planned route of the unmanned aerial vehicle before the unmanned aerial vehicle flies;
and receiving lens exposure information when the unmanned aerial vehicle reaches the preset shooting position.
7. A remote sensing data processing apparatus, characterized in that the remote sensing data processing apparatus comprises:
the first information transmission device is used for receiving optical remote sensing data and thermal infrared remote sensing data transmitted by the unmanned aerial vehicle end; and
and the fusion device is used for fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fusion data.
8. The remote sensing data processing device of claim 7, wherein said fusing the optical remote sensing data and the thermal infrared remote sensing data to obtain optical thermal infrared fused data comprises:
fusing the low-frequency part of the optical remote sensing data and the thermal infrared remote sensing data after NSST transformation;
fusing a high-frequency part which is transformed by NSST in the optical remote sensing data and the thermal infrared remote sensing data; and
and fusing the high-frequency part and the low-frequency part of the optical remote sensing data and the thermal infrared remote sensing data in a one-to-one correspondence mode according to corresponding rules, and performing NSST inverse transformation on the fused high-frequency part and the fused low-frequency part to obtain a final fusion result.
9. The remote sensing data processing device of claim 8,
the fusion rule for the low frequency part is shown in the following formula:
wherein α and β represent the weight coefficients of the low-frequency part, (m, n) denotes the mth row and nth column, I represents the thermal infrared remote sensing data, V represents the optical remote sensing data, K0 represents the neighborhood of K, and the fused coefficient represents the low-frequency component fusion value of the thermal infrared remote sensing data and the optical remote sensing data after NSST transformation at the mth row and nth column, wherein the first term reflects the main low-frequency energy of the images and the second term reflects the detail difference between the images;
the fusion rule for the high frequency part includes:
wherein ω1 and ω2 are the weight coefficients of the high-frequency part, j represents the number of image decomposition levels, k represents the amount of image translation, the first two terms represent the residual background information of the high-frequency part of the thermal infrared remote sensing data and of the optical remote sensing data after NSST transformation at the mth row and nth column, and the fused coefficient represents the high-frequency component fusion value of the thermal infrared remote sensing data and the optical remote sensing data after NSST transformation at the mth row and nth column; the weight coefficients of the high-frequency part are adjusted by the following formula:
ω2 = 1 − ω1
wherein H(m, n) is the region texture smoothness and T is the region texture smoothness threshold;
in the case of H (m, n) < T, the high frequency part of the optical remote sensing data and the thermal infrared remote sensing data is fused according to a fusion rule shown in the following formula:
wherein SMLj,k represents the sum of modified Laplacian energies, the first term represents the high-frequency component of the thermal infrared remote sensing data after NSST transformation at the mth row and nth column, and the second term represents the high-frequency component of the optical remote sensing data after NSST transformation at the mth row and nth column.
10. The remote sensing data processing device of claim 7, further comprising:
and the correcting device is used for carrying out lens distortion correction on the shooting lens on the unmanned aerial vehicle end before shooting.
11. An unmanned aerial vehicle-based remote sensing data processing control system, characterized in that the unmanned aerial vehicle-based remote sensing data processing control system comprises:
remote sensing data acquisition equipment based on an unmanned aerial vehicle; and
the remote sensing data processing apparatus of any one of claims 7-10;
wherein the remote sensing data acquisition device includes:
the optical camera is arranged on the unmanned aerial vehicle and used for acquiring optical remote sensing data of a ground target at a preset shooting position on a planned route of the unmanned aerial vehicle;
the thermal infrared imager is arranged on the unmanned aerial vehicle and used for acquiring thermal infrared remote sensing data of a ground target at a preset shooting position on a planned route of the unmanned aerial vehicle;
and the second information transmission device is arranged on the unmanned aerial vehicle and used for transmitting the acquired optical remote sensing data and the acquired thermal infrared remote sensing data to remote sensing data processing equipment.
12. The unmanned aerial vehicle-based remote sensing data processing control system of claim 11, wherein the remote sensing data acquisition device further comprises:
the first information transmission device is arranged on the unmanned aerial vehicle and used for executing the following operations:
recording remote sensing data obtained by an optical camera and a thermal infrared imager in real time in the flight process of the unmanned aerial vehicle;
and recording the thermal infrared video data shot by the thermal infrared imager in real time, and simultaneously transmitting the thermal infrared video data to the remote sensing data acquisition equipment based on the unmanned aerial vehicle in real time.
13. A machine-readable storage medium having stored thereon instructions for causing a machine to execute the method of processing remote sensing data according to any one of claims 1-6.
CN201910753662.9A 2019-08-15 2019-08-15 Remote sensing data processing method and system based on unmanned aerial vehicle Pending CN110599412A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910753662.9A CN110599412A (en) 2019-08-15 2019-08-15 Remote sensing data processing method and system based on unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910753662.9A CN110599412A (en) 2019-08-15 2019-08-15 Remote sensing data processing method and system based on unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN110599412A true CN110599412A (en) 2019-12-20

Family

ID=68854300

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910753662.9A Pending CN110599412A (en) 2019-08-15 2019-08-15 Remote sensing data processing method and system based on unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN110599412A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111429531A (en) * 2020-04-24 2020-07-17 Oppo广东移动通信有限公司 Calibration method, calibration device and non-volatile computer-readable storage medium
CN113324656A (en) * 2021-05-28 2021-08-31 中国地质科学院 Unmanned aerial vehicle-mounted infrared remote sensing earth surface heat anomaly detection method and system
CN114441595A (en) * 2022-02-09 2022-05-06 四川省安全科学技术研究院 Detection method for coal seam outcrop spontaneous combustion and influence range thereof

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102637297A (en) * 2012-03-21 2012-08-15 武汉大学 Visible light and infrared image fusion method based on Curvelet transformation
CN102789640A (en) * 2012-07-16 2012-11-21 中国科学院自动化研究所 Method for fusing visible light full-color image and infrared remote sensing image
CN103544707A (en) * 2013-10-31 2014-01-29 王浩然 Method for detecting change of optical remote sensing images based on contourlet transformation
CN106846385A (en) * 2016-12-30 2017-06-13 广州地理研究所 Many sensing Remote Sensing Images Matching Methods, device and system based on unmanned plane
CN108346143A (en) * 2018-01-30 2018-07-31 浙江大学 A kind of crop disease monitoring method and system based on the fusion of unmanned plane multi-source image
CN108364264A (en) * 2018-02-07 2018-08-03 大连航天北斗科技有限公司 A kind of ocean temperature monitoring method and system based on unmanned plane infrared remote sensing technology
CN108765359A (en) * 2018-05-31 2018-11-06 安徽大学 A kind of fusion method of target in hyperspectral remotely sensed image and full-colour image based on JSKF models and NSCT technologies
CN109345495A (en) * 2018-09-11 2019-02-15 中国科学院长春光学精密机械与物理研究所 Image interfusion method and device based on energy minimum and gradient regularisation

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102637297A (en) * 2012-03-21 2012-08-15 武汉大学 Visible light and infrared image fusion method based on Curvelet transformation
CN102789640A (en) * 2012-07-16 2012-11-21 中国科学院自动化研究所 Method for fusing visible light full-color image and infrared remote sensing image
CN103544707A (en) * 2013-10-31 2014-01-29 王浩然 Method for detecting change of optical remote sensing images based on contourlet transformation
CN106846385A (en) * 2016-12-30 2017-06-13 广州地理研究所 Many sensing Remote Sensing Images Matching Methods, device and system based on unmanned plane
CN108346143A (en) * 2018-01-30 2018-07-31 浙江大学 A kind of crop disease monitoring method and system based on the fusion of unmanned plane multi-source image
CN108364264A (en) * 2018-02-07 2018-08-03 大连航天北斗科技有限公司 A kind of ocean temperature monitoring method and system based on unmanned plane infrared remote sensing technology
CN108765359A (en) * 2018-05-31 2018-11-06 安徽大学 A kind of fusion method of target in hyperspectral remotely sensed image and full-colour image based on JSKF models and NSCT technologies
CN109345495A (en) * 2018-09-11 2019-02-15 中国科学院长春光学精密机械与物理研究所 Image interfusion method and device based on energy minimum and gradient regularisation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Hu Wen (胡文) et al., "Infrared and Visible Image Fusion Based on Gray-Level Abrupt Change Degree in the LNSST Domain", Infrared Technology (《红外技术》) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111429531A (en) * 2020-04-24 2020-07-17 Oppo广东移动通信有限公司 Calibration method, calibration device and non-volatile computer-readable storage medium
CN113324656A (en) * 2021-05-28 2021-08-31 中国地质科学院 Unmanned aerial vehicle-mounted infrared remote sensing earth surface heat anomaly detection method and system
CN114441595A (en) * 2022-02-09 2022-05-06 四川省安全科学技术研究院 Detection method for coal seam outcrop spontaneous combustion and influence range thereof
CN114441595B (en) * 2022-02-09 2022-07-29 四川省安全科学技术研究院 Detection method for coal seam outcrop spontaneous combustion and influence range thereof

Similar Documents

Publication Publication Date Title
Mikhail et al. Introduction to modern photogrammetry
Roth et al. PhenoFly Planning Tool: flight planning for high-resolution optical remote sensing with unmanned areal systems
CN111436208B (en) Planning method and device for mapping sampling points, control terminal and storage medium
CN110599412A (en) Remote sensing data processing method and system based on unmanned aerial vehicle
CN111415409B (en) Modeling method, system, equipment and storage medium based on oblique photography
CN111540048A (en) Refined real scene three-dimensional modeling method based on air-ground fusion
JP2008186145A (en) Aerial image processing apparatus and aerial image processing method
WO2008152740A1 (en) Digital aerial photographing three-dimensional measurement system
CN106408601A (en) GPS-based binocular fusion positioning method and device
CN104704424A (en) Infrastructure mapping system and method
Raczynski Accuracy analysis of products obtained from UAV-borne photogrammetry influenced by various flight parameters
Bevilacqua et al. Digital technology and mechatronic systems for the architectural 3D metric survey
CN114495416A (en) Fire monitoring method and device based on unmanned aerial vehicle and terminal equipment
CN111527375B (en) Planning method and device for surveying and mapping sampling point, control terminal and storage medium
CN116883604A (en) Three-dimensional modeling technical method based on space, air and ground images
CN113415433B (en) Pod attitude correction method and device based on three-dimensional scene model and unmanned aerial vehicle
CN100498246C (en) Machine-carried broom pushing type multidimension imaging device
CN110986888A (en) Aerial photography integrated method
CN107421503A (en) Simple detector three-linear array stereo mapping imaging method and system
King et al. Airborne digital frame camera imaging for elevation determination
CN108195359A (en) The acquisition method and system of spatial data
CN114440836B (en) Unmanned aerial vehicle photogrammetry modeling method attached with glass curtain wall building
KR20230047734A (en) Method for monitoring solar panels using video streams from uav
Hsu Geocoded terrestrial mosaics using pose sensors and video registration
Whitley Unmanned aerial vehicles (UAVs) for documenting and interpreting historical archaeological Sites: Part II—return of the drones

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination