CN115993609A - Sensing and storing integrated linear array remote sensing system and data processing method thereof - Google Patents

Sensing and storing integrated linear array remote sensing system and data processing method thereof Download PDF

Info

Publication number
CN115993609A
CN115993609A CN202211584407.4A CN202211584407A CN115993609A CN 115993609 A CN115993609 A CN 115993609A CN 202211584407 A CN202211584407 A CN 202211584407A CN 115993609 A CN115993609 A CN 115993609A
Authority
CN
China
Prior art keywords
linear array
remote sensing
sensing data
pixel
slave
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211584407.4A
Other languages
Chinese (zh)
Inventor
周志艳
姜锐
欧媛珍
黄俊浩
刘梓博
万欢
林键沁
罗锡文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Agricultural University
Original Assignee
South China Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Agricultural University filed Critical South China Agricultural University
Priority to CN202211584407.4A priority Critical patent/CN115993609A/en
Publication of CN115993609A publication Critical patent/CN115993609A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a linear array remote sensing system integrating sensing and memory and a data processing method thereof; the linear array remote sensing system with integrated sensing and storage functions comprises a host system, a slave system, a cabin box and a standard reflecting plate; the host system comprises a central processing unit, a data storage device, a high-precision pose module and an incident light calibration module; the high-precision pose module is used for acquiring real-time geographic position information and pose information of the linear array remote sensing system when the linear array remote sensing system acquires data in the process of external carrier movement; the host system is arranged on the back of the cabin box; the front surface of the cabin box comprises a plurality of cabins; the slave system comprises a linear array imaging sensor and a coprocessor, is arranged in the cabin and is connected with the host system through an electric bus in the cabin box; the host system is used for controlling the slave system to acquire and process the linear remote sensing data; the standard reflecting plate is used for correcting the linear array imaging sensor; the data processing method is used for correcting and mapping the linear array remote sensing data in real time.

Description

Sensing and storing integrated linear array remote sensing system and data processing method thereof
Technical Field
The invention relates to the technical field of crop remote sensing monitoring, in particular to a sensing and storing integrated linear array remote sensing system and a data processing method thereof.
Background
The method can rapidly, nondestructively and accurately monitor the growth vigor of crops, accurately know the growth state of the crops, is a premise and a foundation for developing accurate agriculture, is favorable for improving the management strategy of water, fertilizer and medicine, improves the management accuracy, and has important significance for preventing farmland environment pollution caused by excessive use of fertilizer and pesticide and further developing research and application of accurate agriculture.
The traditional aerospace and aviation remote sensing technology has the defects of multiple weather influencing factors, long period, high cost and the like. The method adopts ground tractors, overhead vehicles, unmanned aerial vehicles and other modes to remotely obtain the crop growth information near the ground, so that the defects of the existing aerospace remote sensing technology can be overcome. The unmanned aerial vehicle technology has rapid development in recent years, better operability, lower cost, better coverage area of single remote sensing images than ground machinery and higher efficiency, so that the application range of the unmanned aerial vehicle low-altitude remote sensing information acquisition technology in the agricultural field is rapidly expanding, and the unmanned aerial vehicle low-altitude remote sensing information acquisition technology becomes one of the main methods of farmland crop information remote sensing at present and is also an important direction of accurate agricultural development.
The remote sensing of crop growth information is mainly divided into three types of points, lines and planes according to the arrangement mode of single observation data dot matrix.
Currently, the unmanned aerial vehicle low-altitude remote sensing monitoring mainly adopts an area array imaging sensor, and the representative model comprises a Rededge-M series multispectral camera of Micase company, an ADC series multispectral camera of tetra cam company, an MS600 multispectral camera of long-light Yuchen company, a fairy 4 multispectral version of DJI company, a 9-channel multispectral camera of SILIOS company, a SURVEY series multispectral camera of MAPIR company and the like. The remote sensing unmanned aerial vehicle adopting the area array imaging sensor is used for remote sensing, and the quantity of pixels is huge, so that the resource consumption and the time cost of data transmission, storage and processing during data processing are relatively high. It is reported that the common planar array imaging unmanned aerial vehicle low-altitude remote sensing is set to 75% according to the flying height of 200 m, the lateral overlapping degree and the heading overlapping degree are both set to 75%, the planar array multispectral camera which is common in the market at present is used for remote sensing monitoring of 1000 mu, the image acquisition quantity before splicing is about 5000 photos, if the resolution of a single photo is 1280×960, a storage space of 12Gb is required, the post-processing work such as correction, splicing and analysis is required by a common graphic workstation, the data analysis speed is only about 3.3 mu/min, and the requirements of agriculture time tightness in modern agricultural production and the real-time image and high-frequency use of remote sensing data are difficult to be met.
The information acquired by the spot sensor at a time is planar comprehensive information of ground object reflection spectrum in the view field range, and has the advantages of small pixel number, low resource consumption during data post-processing and data utilization, high information acquisition efficiency, and common application in short-range canopy scale nitrogen measurement, but has the disadvantage of difficulty in directly acquiring a high-resolution two-dimensional image.
Compared with a dot-shaped sensor, the linear array sensor has more one-dimensional pixel numbers, the total pixel numbers are less than those of a planar array type, the frame number is more, the processing efficiency is high, a push-broom imaging mode can be adopted to obtain a two-dimensional image with higher resolution, and the linear array sensor is mainly applied to the aspects of static object (such as image-text) scanning, moving object transient detection, quality detection and the like at present. In the remote sensing field, the linear array sensor has more applications on satellite remote sensing platforms, such as IKONOS, quickBird, SPOT 1-4, china CBERS-02B satellite HR cameras, resource No. three satellite panchromatic cameras and the like. However, the remote sensing technology based on the linear array sensor has few researches and applications on unmanned plane platforms, and the main reasons are that factors such as unmanned plane flight stability, fuselage vibration, ultra-low altitude rapid movement, instantaneous illumination condition change and the like have adverse effects on the aspects such as linear array image quality, projection registration and data precision.
Disclosure of Invention
The invention aims to overcome at least one defect (deficiency) of the prior art, and provides a linear array remote sensing system with integrated sensing and storing and a data processing method thereof, which have the beneficial effects of accurate and rapid response to ground object multispectral information, can effectively reduce the resource consumption and time cost of remote sensing data transmission, storage and processing, shorten the data post-processing time, and improve the remote sensing efficiency, image quality, projection registration and data precision.
In order to achieve the purpose of the invention, the technical scheme adopted by the invention is as follows:
the technical scheme provides a linear array remote sensing system integrating sensing and storage, which adopts a linear array imaging technology to combine the motion of an external carrier to carry out large-breadth scanning acquisition, thereby improving the measurement efficiency, providing data processing speed better than the surface imaging remote sensing technology and data dimension better than the in-situ single-point measurement of multispectral data, and being more in line with the requirements of users. In addition, the technical scheme adopts real-time data calculation, and does not need the later image correction and splicing process, so that the data post-processing time can be greatly shortened.
Specifically, a linear array remote sensing system integrating sensing and storage is arranged on an external carrier and is characterized by comprising a host system, a plurality of slave systems, a cabin box and a standard reflecting plate;
Further, the host system comprises a central processing unit, a data storage device, a high-precision pose module and an incident light calibration module;
further, the pod is mounted on an external carrier; the back of the cabin box is used for installing a host system; the front of the cabin box comprises a plurality of cabins, and each cabin can accommodate a slave machine system; the normal line of the geometric center of the slave system is coaxial with the normal line of the geometric center of the cabin;
further, the high-precision pose module is used for acquiring geographic position information and pose information when the linear array remote sensing system acquires remote sensing data in the external carrier movement process and transmitting the geographic position information and the pose information to the data storage equipment;
further, the slave system comprises a linear array imaging sensor and a coprocessor, and is connected with the host system through an electrical bus in the cabin box; the coprocessor communicates with a central processing unit of the host system; the central processing unit is used for sending instructions to the coprocessor to control the linear array imaging sensor to acquire ground object linear array remote sensing data and transmitting the data to the data storage equipment;
further, a standard reflecting plate is used for correcting the linear array imaging sensor;
further, the host system synchronously triggers the incident light calibration module to collect the ambient light information and the slave system to collect the linear array remote sensing data, and the host system automatically adjusts the shooting parameters of the slave system in real time according to the ambient light information;
Further, the central processing unit is also used for analyzing the linear array remote sensing data in the data storage device and constructing an orthographic remote sensing image by combining the geographic position information and the attitude information.
Further, the linear array imaging sensor comprises an imaging lens with a focal length f and a p×1 linear array arranged photodiode group with the same spectral response curve. Wherein p is the number of photodiodes, namely the total pixels of the linear array imaging sensor; the geometric center of the photodiode group is set as the centremost pixel of the linear array imaging sensor; the center normal line of the photodiode group coincides with the optical axis of the imaging lens; the angle of view of the linear array imaging sensor is gamma. The linear array imaging sensor is provided with a narrow-band optical filter; the data acquired by the linear array imaging sensor at one time is a frame of linear array remote sensing data, and the data comprises p pixels; and the arrangement directions of the photodiode groups in the slave systems of all the bilges in the bilge are consistent. The normal line of the geometric center of the slave system is coaxial with the normal line of the geometric center of the cabin;
further, the incident light calibration module comprises a plurality of photoelectric sensors, an auxiliary processor and a multi-channel optical module; the multi-channel optical module is provided with a plurality of narrow-band optical filters; the spectral response curves of the photoelectric sensor and the photodiode group are consistent; the linear array imaging sensors of the plurality of slave systems are consistent with the optical characteristics of the narrow-band optical filters of the multi-channel optical module. The geographic position information comprises latitude coordinates X, longitude coordinates Y, an altitude H and a course angle theta; the attitude information comprises a Yaw angle Yaw, a Pitch angle Pitch and a Roll angle Roll;
Further, the heading angle theta is increased by 0 degrees in the north and 0-360 degrees in the clockwise direction; the Yaw angle Yaw is gradually increased by 0-360 degrees clockwise by taking the machine head direction of the external carrier as 0 degrees; the Pitch angle Pitch and the Roll angle Roll are both referenced to the ground level; the concrete value of the Pitch angle Pitch is eta, positive number is given when the Pitch angle is the elevation angle, and negative number is given when the Pitch angle is the depression angle; the Roll angle Roll is specifically ζ, positive in the case of left Roll, and negative in the case of right Roll.
Further, the data processing method comprises a correction stage and a remote sensing operation stage;
the correction phase comprises the steps of:
j1 initial exposure time calculation: calculating initial exposure time in a sensor correction stage, and ensuring that overflow saturation condition of the linear array remote sensing data does not occur;
correcting a J2 linear array remote sensing system: based on the initial exposure time, analyzing the optical attenuation characteristic of the linear array imaging sensor, and determining the attenuation coefficient of each pixel bit of the linear array remote sensing data;
the remote sensing operation stage comprises the following steps:
s1, obtaining initial pose information: acquiring initial geographic position information and attitude information of a linear array remote sensing system;
s2, calculating real-time exposure time: calculating the exposure time for collecting the current frame of linear array remote sensing data;
S3, acquiring current frame array remote sensing data and synchronously acquiring real-time geographic position information and attitude information;
s4, calculating the relative height: namely, the difference value between the height of the geographical position information corresponding to the current frame of linear array remote sensing data in S3 and the height of the initial geographical position information in S1;
s5, calculating the ground resolution: the ground resolution of the current frame linear array remote sensing data is calculated according to the relative height in the step S4 and the real-time geographic position information and the attitude information acquired in the step S3;
s6, performing primary processing on the current frame linear array remote sensing data: correcting and calculating the reflectivity of the current frame linear array remote sensing data acquired in the step S3, and constructing data after preliminary processing;
s7, calculating projection coordinates of the current frame linear array remote sensing data: calculating projection coordinates corresponding to all pixels of the current frame of linear array remote sensing data, and further constructing a linear array remote sensing image;
s8, storing the processed linear array remote sensing data;
the steps S1-S8 are processes for completing the processing of the remote sensing data of the frame of linear array, and the whole process needs to be circulated until the operation task is completed in practical application. Further, the same exposure time calculation method is adopted in the steps J1 and S2, specifically, the host system automatically adjusts shooting parameters of each slave system in the process of acquiring the linear array remote sensing data, wherein the shooting parameters are exposure time t int_in The method comprises the following specific steps of:
in the process of acquiring linear remote sensing data of a certain slave system, determining the exposure time of each slave system according to the exposure time of a channel (a narrow-band optical filter with the same optical characteristic) corresponding to an incident light calibration module; the exposure time of the slave system is consistent with that of the incident light calibration module;
the exposure time calculation step of any channel of the incident light calibration module comprises the following steps:
s01: setting the power supply voltage of the linear array remote sensing system as U;
s02: incidence ofOutput voltage V of any spectrum channel of light calibration module out_in The calculation formula of (V) is:
V out_in =V drk_in +(R e )×(E e_in )×(t int_in ) (1)
wherein V is drk_in Is dark voltage (V), R e For the sensitivity of the photosensitive element to a specific wavelength (V/(/ cm) 2 )),E e_in For the incident light irradiation illuminance (μW +) 2 ),t int_in Exposure time(s);
S03:V drk_in dark processing and advanced sampling are carried out before the acquisition of the ground object linear array remote sensing data, V out_in Subtracting the dark voltage V drk_in After that, the formula (1) can be simplified as:
V out_in =(R e )×(E e_in )×(t int_in ) (2)
S04:t int_in at V out_in When the power supply voltage U is reached, the channel is determined to correspond to the exposure time and t of the slave computer system int_in Consistent;
s05: since the spectrum response curves of the photodiode group and the incident light calibration module are consistent, the output voltage V of any photodiode of the corresponding channel of the slave system out_o (V) calculating as:
V out_o =(R e )×(E e_o )×(t int_in ) (3)
Wherein E is e_o For the reflected light radiation illuminance (μW +) 2 )。
Further, step J2 is to calibrate the linear remote sensing system, taking a slave system as an example, and the specific steps are as follows:
j21, firstly acquiring and locking the exposure time of a certain slave system determined in the step S04;
j22: horizontally placing a standard reflecting plate under a clear sky, vertically aligning the slave system with the standard reflecting plate, and adjusting the position and the height of the slave system to ensure that the view field range of the slave system is completely covered by the standard reflecting plate;
j23: using slave systemsContinuously acquiring linear array remote sensing data at fixed time intervals, wherein a frame of linear array remote sensing data acquired once comprises p V out_o Data;
j24: according to peak output voltage V of photodiode corresponding to the centermost pixel out_o_max Output voltage V of photodiode corresponding to other pixel out_o [i]Is calculated for each pixel position i The mathematical relationship is as follows:
Figure BDA0003991018600000051
where i is the pixel number and the range is [0, p-1]The attenuation coefficient alpha corresponding to the pixel sequence number i can be sequentially calculated by the formula (4) i
J25 output voltage V after correction of all pixels out_o_cal [i]It can be calculated as:
Figure BDA0003991018600000052
j26. Alpha. When correcting the actually collected one-frame linear array remote sensing data i F (i) can be obtained through a table look-up method or through model training, and the following relation exists:
α i =F(i) (6)
After model training of F (i) is finished, the attenuation coefficient alpha of all pixels in one frame of linear array remote sensing data can be calculated through the pixel sequence number i i
J27 other slave systems use the steps J21-J26 to perform similar calculations to determine the respective attenuation coefficients.
Further, the ground resolution calculation method in step S5 is as follows:
when the height of the acquired data is H, the actual distance D of the unit pixel, namely the single photodiode, corresponding to the ground object W The method comprises the following steps:
Figure BDA0003991018600000061
further, the specific method for data preprocessing in step S6 is as follows:
the data collected by the linear array remote sensing system comprises reflectivity data, and when a certain slave system collects the linear array remote sensing data, the incident light calibration module records that the incident light radiation illuminance of a channel corresponding to the slave system is E e_in According to formula (2); the ground object reflected light radiation illuminance collected by the slave machine system is an array E e_o Comprising p elements, E e_o []For the ground object reflected light radiation illuminance of the ith pixel serial number, according to formulas (3) and (5), the linear array reflectivity array R, which contains p elements, can be calculated as:
Figure BDA0003991018600000062
where i is the pixel number and the range is [0, p-1]Setting the serial number of the current slave system linear array remote sensing data acquisition as N (namely, frame number, starting from 0), and setting the result array REF of the data primary processing N The method comprises the following steps:
REF N ={N,Lon N ,Lat N ,R,T} (9)
in Lon N Acquiring longitude coordinates of the centremost pixel of the linear array remote sensing data with the sequence number of N; lat N Acquiring the latitude coordinate of the centremost pixel of the linear array remote sensing data with the sequence number of N; r is the reflectivity array calculated according to the formula (8) when the acquisition sequence number is N, and T is the world time when the acquisition sequence number is N; the method is that the data of a single slave system is initially processed, and other slave systems (other wave bands) adopt the same method.
Further, calculating the projection coordinates of the current frame line remote sensing data in step S7 includes calculating the projection coordinates of the most central pixel, the left central pixel, the right central pixel, the left side pixel and the right side pixel of the current frame line remote sensing data (as shown in fig. 8); the left center pixel and the right center pixel are respectively a first pixel at the left side and a first pixel at the right side of the centremost pixel of the linear array remote sensing data; the left side pixels and the right side pixels are all pixels left of a left center pixel and right of a right center pixel of the linear array remote sensing data respectively; the linear array remote sensing data takes an external carrier machine head as a zero axis, establishes a local coordinate system clockwise by 0-360 degrees, takes the direction of 90 degrees as the right, and takes the direction of 270 degrees as the left; the calculation method of the projection coordinates of the centremost pixel of the linear array remote sensing data of all the slave computer systems comprises the following steps:
S71, carrying out one-to-one correspondence labeling on all slave systems and the cabin positions, setting a cabin position No. 1, namely, enabling the normal line of the slave system No. 1 to coincide with the normal line of the geometric center of an antenna of the high-precision pose module, and setting the projection coordinate of the high-precision pose module at the current frame linear array remote sensing data acquisition moment as (X) cor_ref_N ,Y cor_ref_N );
S72, setting the central projection coordinate of the cabin number 1 as an origin,
Figure BDA0003991018600000077
for the included angle between any other cabin and cabin number 1 (origin), M is the number of other cabin (namely slave system);
s73, setting the projection coordinate of any slave system outside the slave system No. 1 in the local coordinate system as (X) csa_M ,Y csa_M );
S74.1 Slave computer System current frame Linear array remote sensing data center-most pixel projection coordinate (X) cor_cen_1 ,Y cor_cen_1 ) The calculation method of (1) is as follows:
X cor_cen_1_N =X cor_ref_N +H×tanη×sinθ-H×tanζ×cosθ (10)
Y cor_cen_1_N =Y cor_ref_N +H×tanη×cosθ-H×tanζ×sinθ (11)
the centremost pixel projection coordinate (X) of the linear array remote sensing data of the current frame of any other slave system cor_csa_M_N ,Y xor_csa_M_n ) The calculation method of (1) is as follows:
Figure BDA0003991018600000071
Figure BDA0003991018600000072
further, under any motion state of the external carrier, the left pixel projection coordinate (X) of the linear array remote sensing data of the current frame measured by any slave system Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure BDA0003991018600000073
Figure BDA0003991018600000074
/>
the pixel projection coordinate (X) on the right side of the linear array remote sensing data of the current frame measured by any slave system Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure BDA0003991018600000075
Figure BDA0003991018600000076
linear array remote sensing data left center pixel projection coordinates (X) of current frame measured by any slave system Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure BDA0003991018600000081
Figure BDA0003991018600000082
linear array remote sensing of current frame measured by any slave systemData right center pixel projection coordinates (X Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure BDA0003991018600000083
Figure BDA0003991018600000084
according to formulas (12) - (13), the projection coordinates of the centremost pixels of the current frame linear array remote sensing data set obtained by the slave computer systems in all the bilges can be calculated, then according to formulas (14) - (21), the projection coordinates of other pixels of the linear array remote sensing data set obtained by all the slave computer systems can be calculated, and finally the linear array remote sensing data collected by all the slave computer systems are integrated and converted into a two-dimensional orthographic image.
Further, when the course angle theta of the high-precision pose module is abnormal, the local calculation of the course angle theta can be performed according to adjacent longitude and latitude coordinates; let the center projection coordinate of the previous sampling point of the 1 st bin be (x 1, y 1), and the plane projection coordinate of the next sampling point be (x 2, y 2), then the azimuth θ of the previous sampling point is calculated as follows:
dx=x2-x1 (22)
dy=y2-y1 (23)
when dx=0 and dy >0, θ=0°;
when dx=0 and dy <0, θ=180°;
when dy=0 and dx >0, θ=90°;
when dy=0 and dx <0, θ=270°;
when dx >0 and dy >0, θ=arctan (dx/dy);
when dx <0 and dy >0, θ=360° + arctan (dx/dy);
θ=180++arctan (dx/dy) when dx <0 and dy <0 or when dx >0 and dy < 0.
Compared with the prior art, the invention has the beneficial effects that:
compared with an area array imaging sensor, the invention greatly reduces the data volume and can perform rapid spectrum data acquisition and plane mapping; compared with a point-shaped sensor, the sensor has more one-dimensional pixel numbers, the scanning area can fully cover the area to be measured, omission of the measuring area is effectively avoided, resolution meeting actual production needs can be obtained in the multispectral data acquisition operation process, the sensor has higher acquisition efficiency, and rapid acquisition is realized.
The standard reflector calibration method adopted by the invention can correct cosine-like effects possibly caused by ageing and pollution of the photosensitive semiconductor device or optical defects of the lens, and realizes accurate response of each pixel (photosensitive semiconductor photodiode device) to actual illumination.
The invention mainly synchronously triggers the slave computer system to acquire accurate linear array remote sensing data through the host computer system, and simultaneously, the user-defined wave band (narrow-band optical filter) of the slave computer system can support inversion and calculation of various vegetation indexes and also support single-wave-band reflectivity data output. In addition, the linear array remote sensing data correction and projection coordinate calculation method can effectively reduce the resource consumption and time cost of remote sensing data transmission, storage and processing, realize on-line calculation on a machine, realize integration of sensing and storage, shorten the data post-processing time and improve the remote sensing efficiency, image quality, projection registration and data precision.
Drawings
Fig. 1 is a schematic diagram of a linear array remote sensing system integrated with sensing and calculation, which is mounted on a double-rotor unmanned aerial vehicle.
Fig. 2 is a schematic diagram of the bottom of the five-band sensing and storing integrated linear array remote sensing system.
Fig. 3 is a schematic diagram of the top of the five-band sensing and storing integrated linear array remote sensing system.
Fig. 4 is a flow chart of the processing of the linear array remote sensing data.
Fig. 5 is a schematic view of pitch and roll angles of a slave system.
Fig. 6 is a schematic diagram of linear array remote sensing data correction.
FIG. 7 is a flow chart of an exposure time determination for an incident light calibration module and a slave system.
Fig. 8 is a schematic diagram showing the relationship among the left pixel, the left center pixel, the right pixel, and the right center pixel (the number of pixels is an even number as an example).
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the invention. For better illustration of the following embodiments, some parts of the drawings may be omitted, enlarged or reduced, and do not represent the actual product dimensions; it will be appreciated by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
Examples
As shown in fig. 1, 2 and 3, the embodiment discloses a five-band sensing and calculating integrated linear array remote sensing system which is mounted on a double-rotor unmanned aerial vehicle and comprises a host system 13, slave systems (1, 6, 7, 8 and 9), a cabin box 5 and a standard reflecting plate;
The host system comprises a central processing unit 12, a data storage device 10, a high-precision pose module 2 and an incident light calibration module 11, and is used for controlling the slave system to acquire linear remote sensing data;
the capsule is mounted on an external carrier; the back of the cabin box is used for installing a host system; the front of the cabin box comprises 6 cabins, each cabin can be filled with and connected with a slave computer system, and the cabins connect the master computer system 13 and the slave computer systems (1, 6, 7, 8 and 9) through an electric bus SPI. Specifically, each bin is marked with a specific serial number, in this embodiment, the bin corresponding to the red band slave system is set to be the number 1 bin, and the coordinate center of the number 1 bin (i.e. the optical axis of the undistorted imaging lens) coincides with the normal line of the geometric center of the antenna of the high-precision pose module 2.
The high-precision pose module is used for acquiring the geographical position information (latitude coordinate X, longitude coordinate Y, altitude H and heading angle theta) of the slave system and the pose information (Yaw angle Yaw, pitch angle Pitch and Roll angle Roll) of the capsule when the linear array imaging sensor acquires remote sensing data in the external carrier movement process, and transmitting the geographical position information to the data storage device;
The slave system comprises a linear array imaging sensor and a coprocessor, and is connected with the host system through an electrical bus in the cabin box; the coprocessor communicates with a central processing unit of the host system; the central processing unit is used for sending instructions to the coprocessor to control the linear array imaging sensor to acquire ground object linear array remote sensing data and transmitting the data to the data storage equipment;
as shown in fig. 6, the slave system corrects the linear imaging sensor by using a standard reflecting plate, calculates linear array reflectivity data by combining the incident light calibration module, transmits the calculated linear array reflectivity data to the host system, and finally transmits the calculated linear array reflectivity data to the data storage device;
the host system synchronously triggers the incident light calibration module to collect the ambient light information and the linear array imaging sensor to collect the linear array remote sensing data, and the host system automatically adjusts the shooting parameters of the slave system in real time according to the ambient light information;
the central processing unit is also used for analyzing the linear array remote sensing data in the data storage device and constructing an orthographic remote sensing image by combining the geographic position information and the attitude information.
As shown in fig. 2, in particular, the line array imaging sensor is equipped with 5 bands of red, green, blue, near infrared and red, which are mainly related to vegetation indexes; the supply voltage U for the whole process was set to 3.3V.
Further, in the present embodiment, the line array imaging sensor includes a distortion-free imaging lens having a focal length f of 6.00mm and a photodiode group of 128×1 line array arrangement having high sensitivity, ultra-low dark current and the same spectral response curve. Wherein p is the number of photodiodes, namely the total pixels of the linear array imaging sensor; the geometric center of the photodiode group is set as the centremost pixel of the linear array imaging sensor; the center normal line of the photodiode group coincides with the optical axis of the undistorted imaging lens; the angle of view of the linear array imaging sensor is 60 degrees; the linear array imaging sensor is provided with a narrow-band optical filter; the data collected by the linear array imaging sensor at one time is a frame of linear array remote sensing data, and the data comprises 128 pixels; and the arrangement directions of the photodiode groups in the slave systems of all the bilges in the bilge are consistent.
The incident light calibration module comprises a plurality of photoelectric sensors, an auxiliary processor and a multichannel optical module; the multi-channel optical module is provided with a plurality of narrow-band optical filters; the spectral response curves of the photoelectric sensor and the photodiode group are consistent; the linear array imaging sensors of the plurality of slave systems are consistent with the optical characteristics of the narrow-band optical filters of the multi-channel optical module; each channel of the linear array imaging sensor and each channel of the incident light calibration module are provided with a narrow-band optical filter with the same wave band; the arrangement directions of the photodiode groups of the slave systems in all the cabin positions in the cabin box are consistent;
Specifically, the working mode of the acquisition system is a line scanning mode, and the data sampling mode of the acquisition system can be set to be timing acquisition and equidistant acquisition through time information and longitude and latitude information of the high-precision pose module 2 in the host system, and the embodiment is set to be 20Hz timing acquisition.
Further, in the present embodiment, the heading angle θ is incremented by 0 ° in true north, 0 ° to 360 ° in clockwise; the Yaw angle Yaw is gradually increased by 0-360 degrees clockwise by taking the machine head direction of the external carrier as 0 degrees; the Pitch angle Pitch and the Roll angle Roll are both referenced to the ground level; the concrete value of the Pitch angle Pitch is eta, positive number is given when the Pitch angle is the elevation angle, and negative number is given when the Pitch angle is the depression angle; the Roll angle Roll is specifically ζ, positive in the case of left Roll, and negative in the case of right Roll (see fig. 5).
Further, the embodiment also discloses a linear array remote sensing data processing method (shown in fig. 4) integrating sensing and memory. The data processing method comprises a correction stage and a remote sensing operation stage;
the correction phase comprises the steps of:
j1 initial exposure time calculation: calculating initial exposure time in a sensor correction stage, and ensuring that overflow saturation condition of the linear array remote sensing data does not occur;
Correcting a J2 linear array remote sensing system: based on the initial exposure time, analyzing the optical attenuation characteristic of the linear array imaging sensor, and determining the attenuation coefficient of each pixel bit of the linear array remote sensing data;
the remote sensing operation stage comprises the following steps:
s1, obtaining initial pose information: acquiring initial geographic position information and attitude information of a linear array remote sensing system;
s2, calculating real-time exposure time: calculating the exposure time for collecting the current frame of linear array remote sensing data;
s3, acquiring current frame array remote sensing data and synchronously acquiring real-time geographic position information and attitude information;
s4, calculating the relative height: namely, the difference value between the height of the geographical position information corresponding to the current frame of linear array remote sensing data in S3 and the height of the initial geographical position information in S1;
s5, calculating the ground resolution: the ground resolution of the current frame linear array remote sensing data is calculated according to the relative height in the step S4 and the real-time geographic position information and the attitude information acquired in the step S3;
s6, performing primary processing on the current frame linear array remote sensing data: correcting and calculating the reflectivity of the current frame linear array remote sensing data acquired in the step S3, and constructing data after preliminary processing;
s7, calculating projection coordinates of the current frame linear array remote sensing data: calculating projection coordinates corresponding to all pixels of the current frame of linear array remote sensing data, and further constructing a linear array remote sensing image;
S8, storing the processed linear array remote sensing data;
the steps S1-S8 are processes for completing the processing of the remote sensing data of the frame of linear array, and the whole process needs to be circulated until the operation task is completed in practical application.
The same exposure time calculation method is adopted in the steps J1 and S2, specifically, the host system automatically adjusts shooting parameters of each slave system in the process of acquiring the linear array remote sensing data, wherein the shooting parameters are exposure time t int_in The method comprises the following specific steps of:
in the process of acquiring linear remote sensing data of a certain slave system, determining the exposure time of the slave system according to a channel corresponding to an incident light calibration module;
the exposure time calculation step (as shown in fig. 7) of any channel of the incident light calibration module is as follows:
s01: setting the power supply voltage of the linear array remote sensing system as U;
s02: output voltage V of any spectrum channel of incident light calibration module out_in The calculation formula of (V) is:
V out_in =V drk_in +(R e )×(E e_in )×(t int_in ) (1)
wherein V is drk_in Is dark voltage (V), R e For the sensitivity of the photosensitive element to a specific wavelength (V/(/ cm) 2 )),E e_in For the incident light irradiation illuminance (μW +) 2 ),t int_in Exposure time(s);
S03:V drk_in dark processing and advanced sampling are carried out before the acquisition of the ground object linear array remote sensing data, V out_in Subtracting the dark voltage V drk_in After that, the formula (1) can be simplified as:
V out_in =(R e )×(E e_in )×(t int_in ) (2)
S04:t int_in at V out_in When the power supply voltage U is reached, the channel is determined to correspond to the exposure time and t of the slave computer system int_in Consistent;
s05: since the photoelectric diode group of the slave system is consistent with the spectral response curve of the photoelectric sensor in the incident light calibration module, the output voltage V of any photoelectric diode of the corresponding channel of the slave system out_o (V) calculating as:
V out_o =(R e )×(E e_o )×(t int_in ) (3)
wherein E is e_o For the reflected light radiation illuminance (μW +) 2 )。
Further, step J2 is to calibrate the linear remote sensing system, taking a slave system as an example, and the specific steps are as follows:
j21: firstly, acquiring and locking the exposure time of a certain slave system determined in the step S04;
j22: horizontally placing a standard reflecting plate under a clear sky, vertically aligning the slave system with the standard reflecting plate, and adjusting the position and the height of the slave system to ensure that the view field range of the slave system is completely covered by the standard reflecting plate;
j23: the slave system is used for continuously collecting linear array remote sensing data at fixed time intervals, and one frame of linear array remote sensing data collected once comprises p V out_o Data;
j24: according to peak output voltage V of photodiode corresponding to the centermost pixel out_o_max Output voltage V of photodiode corresponding to other pixel out_o [i]Is calculated for each pixel position i The mathematical relationship is as follows:
Figure BDA0003991018600000131
where i is the pixel number and the range is [0, p-1]The attenuation coefficient alpha corresponding to the pixel sequence number i can be sequentially calculated by the formula (4) i
J25 output voltage V after correction of all pixels out_o_cal [i]It can be calculated as:
Figure BDA0003991018600000132
j26. Alpha. When correcting the actually collected one-frame linear array remote sensing data i F (i) can be obtained through a table look-up method or through model training, and the following relation exists:
α i =F(i) (6)
after model training of F (i) is finished, the attenuation coefficient alpha of all pixels in one frame of linear array remote sensing data can be calculated through the pixel sequence number i i
J27 other slave systems use the steps J21-J26 to perform similar calculations to determine the respective attenuation coefficients.
Further, in step S5, the ground resolution is calculated by the following method:
when collectingWhen the height of the data is H, the actual distance D of the corresponding ground feature of the unit pixel (namely the single photodiode) W The method comprises the following steps:
Figure BDA0003991018600000133
further, the data pre-processing in step S6 specifically includes:
the data collected by the linear array remote sensing system comprises reflectivity data, and when a certain slave system collects the linear array remote sensing data, the incident light calibration module records that the incident light radiation illuminance of a channel corresponding to the slave system is E e_in According to formula (2); the ground object reflected light radiation illuminance collected by the slave machine system is an array E e_o Comprising 128 elements, E e_o []For the ground object reflected light radiation illuminance of the ith pixel serial number, according to formulas (3) and (5), the linear array reflectivity array R, which contains 128 elements, can be calculated as:
Figure BDA0003991018600000134
Where i is the pixel number and the range is 0, 127]Setting the serial number of the current slave system linear array remote sensing data acquisition as N (namely, frame number, starting from 0), and setting the result array REF of the data primary processing N The method comprises the following steps:
REF N ={N,Lon N ,Lat N ,R,T} (9)
in the formula, kon N Acquiring longitude coordinates of the centremost pixel of the linear array remote sensing data with the sequence number of N; lat N Acquiring the latitude coordinate of the centremost pixel of the linear array remote sensing data with the sequence number of N; r is the reflectivity array calculated according to the formula (8) when the acquisition sequence number is N, and T is the world time when the acquisition sequence number is N; the method is that the data of a single slave system is initially processed, and other slave systems (other wave bands) adopt the same method.
Further, in step S7, projection coordinates of the current frame of remote sensing data are calculated, including projection coordinates of a center-most pixel, a left center pixel, a right center pixel, a left side pixel and a right side pixel of the current frame of remote sensing data are calculated; the left center pixel and the right center pixel are respectively a first pixel at the left side and a first pixel at the right side of the centremost pixel of the linear array remote sensing data; the left side pixels and the right side pixels are all pixels left of a left center pixel and right of a right center pixel of the linear array remote sensing data respectively; the linear array remote sensing data takes an external carrier machine head as a zero axis, establishes a local coordinate system clockwise by 0-360 degrees, takes the direction of 90 degrees as the right, and takes the direction of 270 degrees as the left; the calculation method of the projection coordinates of the centremost pixel of the linear array remote sensing data of all the slave computer systems comprises the following steps:
S71, carrying out one-to-one correspondence labeling on all slave systems and the cabin positions, setting a cabin position No. 1, namely, enabling the normal line of the slave system No. 1 to coincide with the normal line of the geometric center of an antenna of the high-precision pose module, and setting the projection coordinate of the high-precision pose module at the current frame linear array remote sensing data acquisition moment as (X) cor_ref_N ,Y cor_ref_N );
S72, setting the central projection coordinate of the cabin number 1 as an origin,
Figure BDA0003991018600000143
for the included angle between any other cabin and cabin number 1 (origin), M is the number of other cabin (namely slave system);
s73, setting the projection coordinate of any slave system outside the slave system No. 1 in the local coordinate system as (X) csa_M ,Y csa_M );
S74.1 Slave computer System current frame Linear array remote sensing data center-most pixel projection coordinate (X) cor_cen_1 ,Y cor_cen_1 ) The calculation method of (1) is as follows:
X cor_cen_1_N =X cor_ref_N +H×tanη×sinθ-H×tanζ×cosθ (10)
Y cor_cen_1_N =Y cor_ref_N +H×tanη×cosθ-H×tanζ×sinθ (11)
the centremost pixel projection coordinate (X) of the linear array remote sensing data of the current frame of any other slave system cor_csa_M_N ,Y cor_csa_M_N ) The calculation method of (1) is as follows:
Figure BDA0003991018600000141
Figure BDA0003991018600000142
further, under any motion state of the external carrier, the left pixel projection coordinate (X) of the linear array remote sensing data of the current frame measured by any slave system Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure BDA0003991018600000151
Figure BDA0003991018600000152
the pixel projection coordinate (X) on the right side of the linear array remote sensing data of the current frame measured by any slave system Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure BDA0003991018600000153
Figure BDA0003991018600000154
linear array remote sensing data left center pixel projection coordinates (X) of current frame measured by any slave system Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure BDA0003991018600000155
Figure BDA0003991018600000156
linear array remote sensing data right center pixel projection coordinates (X) of current frame measured by any slave system cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure BDA0003991018600000157
Figure BDA0003991018600000158
according to formulas (12) - (13), the projection coordinates of the centremost pixels of the current frame linear array remote sensing data set obtained by the slave computer systems in all the bilges can be calculated, then according to formulas (14) - (21), the projection coordinates of other pixels of the linear array remote sensing data set obtained by all the slave computer systems can be calculated, and finally the linear array remote sensing data collected by all the slave computer systems are integrated and converted into a two-dimensional orthographic image.
Further, when the course angle θ of the high-precision pose module is abnormal, the local calculation of the course angle θ is performed according to the adjacent longitude and latitude coordinates, the central projection coordinate of the previous sampling point of the 1 st cabin is set as (x 1, y 1), the plane projection coordinate of the next sampling point is set as (x 2, y 2), and then the azimuth angle θ of the previous sampling point is calculated as follows:
dx=x2-x1 (22)
dy=y2-y1 (23)
when dx=0 and dy >0, θ=0°;
when dx=0 and dy <0, θ=180°;
when dy=0 and dx >0, θ=90°;
when dy=0 and dx <0, θ=270°;
when dx >0 and dy >0, θ=arctan (dx/dy);
when dx <0 and dy >0, θ=360° + arctan (dx/dy);
θ=180++arctan (dx/dy) when dx <0 and dy <0 or when dx >0 and dy < 0.
It should be understood that the foregoing examples of the present invention are provided for the purpose of clearly illustrating the technical aspects of the present invention and are not intended to limit the specific embodiments of the present invention. Any modification, equivalent replacement, improvement, etc. that comes within the spirit and principle of the claims of the present invention should be included in the protection scope of the claims of the present invention.

Claims (10)

1. The linear array remote sensing system is arranged on an external carrier and is characterized by comprising a host system, a plurality of slave systems, a cabin box and a standard reflecting plate;
the host system comprises a central processing unit, a data storage device, a high-precision pose module and an incident light calibration module;
the capsule is mounted on an external carrier; the back of the cabin box is used for installing a host system; the front of the cabin box comprises a plurality of cabins, and each cabin can accommodate a slave machine system;
the high-precision pose module is used for acquiring geographic position information and pose information when the linear array remote sensing system acquires data in the external carrier movement process and transmitting the geographic position information and the pose information to the data storage equipment;
the slave system comprises a linear array imaging sensor and a coprocessor, and is connected with the host system through an electrical bus in the cabin box; the coprocessor communicates with a central processing unit of the host system; the central processing unit is used for sending instructions to the coprocessor to control the linear array imaging sensor to acquire ground object linear array remote sensing data and transmitting the data to the data storage equipment;
The standard reflecting plate is used for correcting the linear array imaging sensor;
the host system synchronously triggers the incident light calibration module to collect the ambient light information and the slave system to collect the linear array remote sensing data, and the host system automatically adjusts the shooting parameters of the slave system in real time according to the ambient light information;
the central processing unit is also used for analyzing the linear array remote sensing data in the data storage device and constructing an orthographic remote sensing image by combining the geographic position information and the attitude information.
2. The integrated linear array remote sensing system of claim 1, wherein the linear array imaging sensor comprises an imaging lens with a focal length f and a p x 1 linear array arranged photodiode group with the same spectral response curve; wherein p is the number of photodiodes, namely the total pixels of the linear array imaging sensor; the geometric center of the photodiode group is set as the centremost pixel of the linear array imaging sensor; the center normal line of the photodiode group coincides with the optical axis of the imaging lens; the angle of view of the linear array imaging sensor is gamma; the linear array imaging sensor is provided with a narrow-band optical filter; the data acquired by the linear array imaging sensor at one time is a frame of linear array remote sensing data, and the data comprises p pixels; the arrangement directions of the photodiode groups in the slave systems of all the cabin positions in the cabin box are consistent; and the geometric center normal line of the slave system is coaxial with the geometric center normal line of the cabin.
3. The integrated linear array remote sensing system of claim 2, wherein the incident light calibration module comprises a plurality of photosensors, an auxiliary processor and a multi-channel optical module; the multi-channel optical module is provided with a plurality of narrow-band optical filters; the spectral response curves of the photoelectric sensor and the photodiode group are consistent; the linear array imaging sensors of the plurality of slave systems are consistent with the optical characteristics of the narrow-band optical filters of the multi-channel optical module; the geographic position information comprises latitude coordinates X, longitude coordinates Y, an altitude H and a course angle theta; the attitude information comprises a Yaw angle Yaw, a Pitch angle Pitch and a Roll angle Roll; the heading angle theta is increased by 0 degrees in the north and 0-360 degrees in the clockwise direction; the Yaw angle Yaw is gradually increased by 0-360 degrees clockwise by taking the machine head direction of the external carrier as 0 degrees; the Pitch angle Pitch and the Roll angle Roll are both referenced to the ground level; the concrete value of the Pitch angle Pitch is eta, positive number is given when the Pitch angle is the elevation angle, and negative number is given when the Pitch angle is the depression angle; the Roll angle Roll is specifically ζ, positive in the case of left Roll, and negative in the case of right Roll.
4. A linear array remote sensing data processing method of a linear array remote sensing system according to claims 1 to 3, characterized in that the data processing method comprises a correction phase and a remote sensing operation phase;
The correction phase comprises the steps of:
j1 initial exposure time calculation: calculating initial exposure time in a sensor correction stage, and ensuring that overflow saturation condition of the linear array remote sensing data does not occur;
correcting a J2 linear array remote sensing system: based on the initial exposure time, analyzing the optical attenuation characteristic of the linear array imaging sensor, and determining the attenuation coefficient of each pixel bit of the linear array remote sensing data;
the remote sensing operation stage comprises the following steps:
s1, obtaining initial pose information: acquiring initial geographic position information and attitude information of a linear array remote sensing system;
s2, calculating real-time exposure time: calculating the exposure time for collecting the current frame of linear array remote sensing data;
s3, acquiring current frame array remote sensing data and synchronously acquiring real-time geographic position information and attitude information;
s4, calculating the relative height: namely, the difference value between the height of the geographical position information corresponding to the current frame of linear array remote sensing data in S3 and the height of the initial geographical position information in S1;
s5, calculating the ground resolution: the ground resolution of the current frame linear array remote sensing data is calculated according to the relative height in the step S4 and the real-time geographic position information and the attitude information acquired in the step S3;
s6, performing primary processing on the current frame linear array remote sensing data: correcting and calculating the reflectivity of the current frame linear array remote sensing data acquired in the step S3, and constructing data after preliminary processing;
S7, calculating projection coordinates of the current frame linear array remote sensing data: calculating projection coordinates corresponding to all pixels of the current frame of linear array remote sensing data, and further constructing a linear array remote sensing image;
s8, storing the processed linear array remote sensing data;
the steps S1-S8 are processes for completing the processing of the remote sensing data of the frame of linear array, and the whole process needs to be circulated until the operation task is completed in practical application.
5. The method of claim 4, wherein steps J1 and S2 use the same exposure time calculation method, specifically, the host system automatically adjusts the exposure time t of each slave system in the process of collecting the linear array remote sensing data int_in The method comprises the following specific steps of:
in the process of acquiring linear remote sensing data of a certain slave system, determining the exposure time of the slave system according to a channel corresponding to an incident light calibration module;
the exposure time calculation step of any channel of the incident light calibration module comprises the following steps:
s01: setting the power supply voltage of the linear array remote sensing system as U;
s02: output voltage V of any spectrum channel of incident light calibration module out_in The calculation formula of (V) is:
V out_in =V drk_in +(R e )×(E e_in )×(t int_in ) (1)
wherein V is drk_in Is dark voltage (V), R e For the sensitivity of the photosensitive element to a specific wavelength (V/(μJ/cm) 2 )),E e_in For incident light irradiance (. Mu.W/cm) 2 ),t int_in Exposure time(s);
S03:V drk_in dark processing and advanced sampling are carried out before the acquisition of the ground object linear array remote sensing data, V out_in Subtracting the dark voltage V drk_in After that, the formula (1) can be simplified as:
V out_in =(R e )×(E e_in )×(t int_in ) (2)
S04:t int_in at V out_in When the power supply voltage U is reached, the channel is determined to correspond to the exposure time and t of the slave computer system int_in Consistent;
s05: since the photoelectric diode group of the slave system is consistent with the spectral response curve of the photoelectric sensor in the incident light calibration module, the output voltage V of any photoelectric diode of the corresponding channel of the slave system out_o (V) calculating as:
V out_o =(R e )×(E e_o )×(t int_in ) (3)
wherein E is e_o For reflecting light radiation illuminance (mu W/cm) 2 )。
6. The method for processing linear array remote sensing data according to claim 4, wherein the step J2 of correcting the linear array remote sensing system comprises the following specific steps:
j21: firstly, acquiring and locking the exposure time of a certain slave system determined in the step S04;
j22: horizontally placing a standard reflecting plate under a clear sky, vertically aligning the slave system with the standard reflecting plate, and adjusting the position and the height of the slave system to ensure that the view field range of the slave system is completely covered by the standard reflecting plate;
j23: the slave system is used for continuously collecting linear array remote sensing data at fixed time intervals, and one frame of linear array remote sensing data collected once comprises p V out_o Data;
j24: according to peak output voltage V of photodiode corresponding to the centermost pixel out_o_max Output voltage V of photodiode corresponding to other pixel out_o [i]Is calculated for each pixel position i The mathematical relationship is as follows:
Figure FDA0003991018590000041
where i is the pixel number and the range is [0, p-1]The attenuation coefficient alpha corresponding to the pixel sequence number i can be sequentially calculated by the formula (4) i
J25: output voltage V after correction of all pixels out_o_cal [i]It can be calculated as:
Figure FDA0003991018590000042
j26: when correcting actually acquired one-frame linear array remote sensing data, alpha i F (i) can be obtained through a table look-up method or through model training, and the following relation exists:
α i =F(i) (6)
after model training of F (i) is finished, the attenuation coefficient alpha of all pixels in one frame of linear array remote sensing data can be calculated through the pixel sequence number i i
J27: other slave systems perform similar calculations using the J2l-J26 steps to determine the respective attenuation coefficients.
7. The method for processing linear array remote sensing data according to claim 4, wherein the ground resolution calculation in step S5 is performed by:
when the height of the acquired data is H, the actual distance D of the unit pixel, namely the single photodiode, corresponding to the ground object W The method comprises the following steps:
Figure FDA0003991018590000043
8. the method for processing linear array remote sensing data according to claim 4, wherein the data pre-processing in step S6 comprises the following specific steps:
The data collected by the linear array remote sensing system comprises reflectivity data, and when a certain slave system collects the linear array remote sensing data, the incident light calibration module records that the incident light radiation illuminance of a channel corresponding to the slave system is E e_in According to formula (2); the ground object reflected light radiation illuminance collected by the slave machine system is an array E e_o Comprising p elements, E e_o [i]For the ground object reflected light radiation illuminance of the ith pixel serial number, according to formulas (3) and (5), the linear array reflectivity array R, which contains p elements, can be calculated as:
Figure FDA0003991018590000044
where i is the pixel number and the range is [0, p-1]The serial number of the current slave system linear array remote sensing data acquisition is set as N, namely the frame number, and the data initial processing result array REF is started from 0 N The method comprises the following steps:
REF N ={N,Lon N ,Lat N ,R,T} (9)
in Lon N Acquiring longitude coordinates of the centremost pixel of the linear array remote sensing data with the sequence number of N; lat N Acquiring the latitude coordinate of the centremost pixel of the linear array remote sensing data with the sequence number of N; r is the reflectivity array calculated according to the formula (8) when the acquisition sequence number is N, and T is the world time when the acquisition sequence number is N; the method is that the data of a single slave system is initially processed, and other slave systems (other wave bands) adopt the same method.
9. The method for processing linear array remote sensing data according to claim 4, wherein the step S7 of calculating the projection coordinates of the current frame of linear array remote sensing data includes calculating the projection coordinates of the most central pixel, the left central pixel, the right central pixel, the left pixel and the right pixel of the current frame of linear array remote sensing data; the left center pixel and the right center pixel are respectively a first pixel at the left side and a first pixel at the right side of the centremost pixel of the linear array remote sensing data; the left side pixels and the right side pixels are all pixels left of a left center pixel and right of a right center pixel of the linear array remote sensing data respectively; the linear array remote sensing data takes an external carrier machine head as a zero axis, establishes a local coordinate system clockwise by 0-360 degrees, takes the direction of 90 degrees as the right, and takes the direction of 270 degrees as the left; the calculation method of the projection coordinates of the centremost pixel of the linear array remote sensing data of all the slave computer systems comprises the following steps:
S71, carrying out one-to-one correspondence labeling on all slave systems and the cabin positions, setting a cabin position No. 1, namely, overlapping the normal line of the slave system with the I number with the normal line of the geometric center of the antenna of the high-precision pose module, and setting the cabin position asThe projection coordinate of the high-precision pose module at the time of the acquisition of the front frame linear array remote sensing data is (X) cor_ref_N ,Y cor_ref_N );
S72, setting the central projection coordinate of the cabin number 1 as an origin,
Figure FDA0003991018590000051
for the included angle between any other cabin and the I cabin (origin), M is the number of the other cabin (namely the slave system);
s73, setting the projection coordinate of any slave system outside the I-number slave system in the local coordinate system as (X) csa_M ,Y csa_M );
S74.1 Slave computer System current frame Linear array remote sensing data center-most pixel projection coordinate (X) cor_cen_1 ,Y cor_cen_1 ) The calculation method of (1) is as follows:
X cor_cen_N =X cor_ref_N +H×tanη×sinθ-H×tanζ×cosθ (10)
Y cor_cen_N =Y cor_ref_N +H×tanη×cosθ-H×tanζ×sinθ (11)
the centremost pixel projection coordinate (X) of the linear array remote sensing data of the current frame of any other slave system cor_csa_M_N ,Y cor_csa_M_N ) The calculation method of (1) is as follows:
Figure FDA0003991018590000052
Figure FDA0003991018590000053
further, under any motion state of the external carrier, the left pixel projection coordinate (X) of the linear array remote sensing data of the current frame measured by any slave system Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure FDA0003991018590000054
Figure FDA0003991018590000061
the pixel projection coordinate (X) on the right side of the linear array remote sensing data of the current frame measured by any slave system Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure FDA0003991018590000062
Figure FDA0003991018590000063
linear array remote sensing data left center pixel projection coordinate of current frame measured by any slave system
(X Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure FDA0003991018590000064
Figure FDA0003991018590000065
linear array remote sensing data right center pixel projection coordinate of current frame measured by any slave system
(X Cor[N][i][M] ,Y Cor[N][i][M] ) The calculation formula of (2) is as follows:
Figure FDA0003991018590000066
Figure FDA0003991018590000067
according to formulas (12) - (13), the projection coordinates of the centremost pixels of the current frame linear array remote sensing data set obtained by the slave computer systems in all the bilges can be calculated, then according to formulas (14) - (21), the projection coordinates of other pixels of the linear array remote sensing data set obtained by all the slave computer systems can be calculated, and finally the linear array remote sensing data collected by all the slave computer systems are integrated and converted into a two-dimensional orthographic image.
10. According to the linear array remote sensing data processing method, when the heading angle theta of the high-precision pose module is obtained abnormally, the local calculation of the heading angle theta can be performed according to adjacent longitude and latitude coordinates; let the center projection coordinate of the previous sampling point of the 1 st bin be (x 1, y 1), and the plane projection coordinate of the next sampling point be (x 2, y 2), then the azimuth θ of the previous sampling point is calculated as follows:
dx=x2-x1 (22)
dy=y2-y1 (23)
when dx=0 and dy > 0, θ=0°;
when dx=0 and dy < 0, θ=180°;
when dy=0 and dx > 0, θ=90°;
when dy=0 and dx < 0, θ=270°;
θ=arctan (dx/dy) when dx > 0 and dy > 0;
When dx < 0 and dy > 0, θ=360° +arctan (dx/dy);
θ=180++aΩ ctan (dx/dy) when dx < 0 and dy < 0 or when dx > 0 and dy < 0.
CN202211584407.4A 2022-12-09 2022-12-09 Sensing and storing integrated linear array remote sensing system and data processing method thereof Pending CN115993609A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211584407.4A CN115993609A (en) 2022-12-09 2022-12-09 Sensing and storing integrated linear array remote sensing system and data processing method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211584407.4A CN115993609A (en) 2022-12-09 2022-12-09 Sensing and storing integrated linear array remote sensing system and data processing method thereof

Publications (1)

Publication Number Publication Date
CN115993609A true CN115993609A (en) 2023-04-21

Family

ID=85989731

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211584407.4A Pending CN115993609A (en) 2022-12-09 2022-12-09 Sensing and storing integrated linear array remote sensing system and data processing method thereof

Country Status (1)

Country Link
CN (1) CN115993609A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118050328A (en) * 2024-03-26 2024-05-17 华南农业大学 Monocular high-resolution multispectral imaging system and control method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118050328A (en) * 2024-03-26 2024-05-17 华南农业大学 Monocular high-resolution multispectral imaging system and control method

Similar Documents

Publication Publication Date Title
CN107807125B (en) Plant information calculation system and method based on unmanned aerial vehicle-mounted multispectral sensor
US20240053477A1 (en) System and method for measuring image distance of power transmission lines with unmanned aerial vehicle (uav)
CN110567891B (en) Winter wheat canopy chlorophyll estimation system and method
CN110244766B (en) Planning method and system for unmanned aerial vehicle routing inspection route of photovoltaic power station
US20160232650A1 (en) Method and system of calibrating a multispectral camera on an aerial vehicle
JP2003009664A (en) Crop growth level measuring system, crop growth level measuring method, crop growth level measuring program, and computer-readable recording medium recorded with the program
CN206832361U (en) A kind of unmanned plane snap formula hyperspectral remote sensing system
JP7074126B2 (en) Image processing equipment, growth survey image creation system and program
CN112489130A (en) Distance measuring method and device for power transmission line and target object and electronic equipment
CN110006452B (en) Relative geometric calibration method and system for high-resolution six-size wide-view-field camera
JP2020515809A (en) Apparatus and method relating to multi-sensor irradiance estimation
CN110987183A (en) Multispectral imaging system and method
CN115993609A (en) Sensing and storing integrated linear array remote sensing system and data processing method thereof
CN110689505B (en) Scene-based satellite-borne remote sensing instrument self-adaptive correction method and system
CN114544006B (en) Low-altitude remote sensing image correction system and method based on ambient illumination condition
CN107798668A (en) The method and system of unmanned plane imaging EO-1 hyperion geometric correction based on RGB images
Elbahnasawy et al. Multi-sensor integration onboard a UAV-based mobile mapping system for agricultural management
CN103776426A (en) Geometric correction method for rotary platform farmland image
CN112949411A (en) Spectral image correction method and device
CN111598937A (en) Farmland land area measurement method and system based on calibration block target correction
CN210242985U (en) Airborne radiation correction device and system
CN115598071A (en) Plant growth distribution state detection method and device
CN102636267A (en) Sky brightness instrument
CN112257630A (en) Unmanned aerial vehicle detection imaging method and device of power system
CN113970753B (en) Unmanned aerial vehicle positioning control method and system based on laser radar and vision detection

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination