CN112379373B - Space-borne SAR real-time imaging device - Google Patents
- Publication number
- CN112379373B (application CN202011053099.3A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
- G01S13/9004—SAR image acquisition techniques
- G01S13/9005—SAR image acquisition techniques with optical processing of the SAR signals
Abstract
The invention discloses a spaceborne SAR real-time imaging device. SAR imaging is organized in two modes: a wide-area, wide-swath, low-resolution general survey and a high-resolution detailed survey. By combining on-board SAR real-time imaging processing with deep-learning-based target detection and target classification/recognition, the device achieves fast and effective reconnaissance of sensitive areas and acquires information such as the number, positions, types, confidence levels and image slices of targets of interest. This avoids the long ground-processing delays and poor timeliness caused by directly downlinking massive high-resolution SAR data, breaks through the data-transmission bottleneck, raises the utilization of on-board solid-state storage and of the payload, and markedly improves the operational effectiveness of the satellite. The input to the deep-learning forward inference model is floating-point complex data, which carries both target feature information and target phase information; the added information dimension and high numerical precision help improve classification accuracy.
Description
Technical Field
The invention belongs to the technical field of satellite payload data processing and platform control, and in particular relates to a spaceborne SAR real-time imaging device.
Background
Spaceborne SAR offers high spatial resolution and wide coverage, enabling accurate positioning of ship targets, and high-resolution SAR images make ship type recognition possible. However, ship extraction results are difficult to verify effectively, so the performance of target extraction and type recognition cannot be improved effectively. Deep learning, which has developed rapidly in recent years, is increasingly applied to target classification and recognition, video analysis, natural language processing and other areas. With deep learning, on-orbit target detection and recognition based on remote sensing images can markedly improve the utilization efficiency of satellites and promptly discover significant targets on land and at sea.
The paper "Deramp Chirp Scaling imaging algorithm for high-resolution spaceborne spotlight SAR" by Wang Guodong of Beijing Jiaotong University proposes a Deramp Chirp Scaling (DCS) algorithm for high-resolution spaceborne spotlight synthetic aperture radar (SAR). The algorithm combines the advantages of the spectral analysis (SPECAN) algorithm and the Chirp Scaling algorithm: deramp processing with a fixed Doppler rate achieves coarse azimuth focusing and removes the azimuth spectrum aliasing peculiar to spaceborne spotlight SAR; the Chirp Scaling principle then achieves accurate range focusing; finally, the azimuth phase error introduced by the deramp processing is compensated to achieve fine azimuth focusing. The algorithm is suitable for precise imaging of wide-swath, high-resolution spaceborne spotlight SAR.
CN103323828A/B of the Institute of Electronics, Chinese Academy of Sciences provides an ultra-high-resolution spaceborne SAR imaging processing method and device. The method establishes a two-dimensional spectrum model of the spaceborne SAR echo signal; multiplies it by a corresponding reference function to eliminate the range migration phase term and the higher-order coupling phase term at the reference slant range; applies an inverse range Fourier transform to the resulting signal; corrects the residual range migration by interpolation; and finally removes the azimuth modulation phase term by azimuth compression to obtain a focused spaceborne SAR image. With this scheme, high-quality ultra-high-resolution spaceborne SAR images with good positioning capability can be obtained.
CN201811280751.8 of the Beijing Institute of Spacecraft System Engineering discloses an on-board autonomous imaging method guided by real-time on-board AIS information. Its main steps are: (1) capture and confirm the ship target to be observed; (2) determine the satellite observation time and area; (3) calculate the satellite attitude for observing the vessel; (4) adjust the satellite attitude and command the SAR payload to perform imaging. The method uses on-board AIS information to match and screen targets of interest in real time, determines the area where a target is located, computes in real time the timing, attitude and imaging parameters needed for SAR payload imaging, automatically generates the satellite attitude adjustment command and the SAR imaging start command, and completes the imaging task for the selected area and targets. The combined use of AIS information and SAR imaging information improves the efficiency of recognizing and confirming maritime ship targets.
However, the above methods have the following problems:
(1) Existing approaches all use AIS information as a verification or guidance aid to complete SAR imaging or to detect and recognize ship targets in SAR images, and their core algorithms are target detection and recognition methods based on traditional image segmentation and morphology. Their processing precision, accuracy and room for improvement are limited, and they cannot meet the target extraction and type recognition requirements of subsequent high-resolution SAR images.
(2) Detecting, positioning and recognizing ship targets by AIS alone is a single technical means: it cannot effectively monitor behaviors such as missing AIS reports, non-cooperative targets (e.g. warships) or deliberate evasion (AIS switched off). A mature spaceborne SAR ship target detection and recognition algorithm can make up for these shortcomings of AIS in ship target monitoring.
(3) Existing methods mainly rely on ground applications or ground equipment to close the functional loop. They must therefore depend on the space-to-ground data transmission system, occupying a large amount of satellite-ground communication bandwidth and consuming considerable processing time, which lowers the efficiency of the satellite-ground network and, in turn, of the remote sensing satellite; the processing efficiency and real-time performance cannot meet the requirements of modern remote sensing satellites.
Disclosure of Invention
In view of the above, the invention aims to provide a spaceborne SAR real-time imaging device capable of rapidly acquiring intelligence over a region of interest and conducting detailed reconnaissance of important targets.
A spaceborne SAR real-time imaging device comprises a satellite attitude and orbit control unit, a central processing unit, an SAR payload, an SAR imaging processing unit, an intelligent information processing unit, a data processing unit and a payload control unit;
the central processing unit determines control parameters, including imaging resolution, swath width, payload pointing and satellite attitude, according to the target information, and sends them to the satellite attitude and orbit control unit and the payload control unit;
the satellite attitude and orbit control unit adjusts the satellite attitude to the state specified by the task according to the control parameters; the payload control unit computes payload imaging parameters from the control parameters to control SAR payload operation; the SAR payload then scans the designated area according to the imaging parameters, down-converts the received radio-frequency echo signal to an intermediate-frequency analog signal and sends it to the SAR imaging processing unit;
the SAR imaging processing unit processes the echo signal into an SAR image using the radar parameters sent by the payload control unit, and then sends the imaging data to the intelligent information processing unit over optical fiber;
the intelligent information processing unit first quantizes the SAR image data, converting the original single-precision floating-point complex data into 8-bit fixed-point data, and then performs target detection: if no target is detected, the task ends; if targets are detected, their coordinate positions are computed and the position of greatest target density is chosen as the central imaging point for detailed-survey imaging; detailed-survey parameters, including imaging resolution, swath width, imaging look-down angle, and satellite orbit and attitude, are computed from the central imaging point and the target distribution and sent to the central processing unit, which computes the SAR imaging parameters for the detailed-survey mode and distributes the parameters to the payload control unit, the attitude and orbit control unit and the SAR imaging processing unit;
the attitude and orbit control unit moves the satellite to the specified position within the specified time according to the orbit and attitude parameters and adjusts the attitude to the specified state, facilitating subsequent SAR payload target detection; the payload control unit steers the SAR payload beam according to the beam control signal to scan for targets of interest; the SAR imaging processing unit performs SAR imaging on the acquired SAR payload echo signal using the radar parameters to obtain high-resolution SAR image data, which is sent to the intelligent information processing unit through the optical fiber module;
the intelligent information processing unit first quantizes the high-resolution SAR image data sent by the SAR imaging processing unit, converting the original single-precision floating-point complex data into 8-bit fixed-point data, and performs target detection; after detection is completed, target slices are extracted from the original floating-point complex image data according to the detection results and sent to the deep-learning forward inference model for target recognition, yielding the type and confidence of each target; in parallel, spatial geometric positioning is performed from the coordinate positions in the detection results together with the SAR payload beam pointing and the satellite orbit and attitude, determining the longitude and latitude of each target; the result information, including target position, target slice, target type and target confidence, is sent to the data processing unit over a SpaceWire bus;
the data processing unit receives the result information, compresses and encodes it, and rapidly downlinks it to form a high-quality intelligence product. In addition, according to task requirements and under command control, the data processing unit can downlink the raw echo data or the processed SAR image data received from the SAR imaging processing unit over optical fiber, for data analysis on the ground.
Preferably, the central imaging point of the targets is computed as follows: let target i have coordinates (X_i, Y_i) and let k be the total number of targets; for each target i, compute δ_i, the sum of squared coordinate differences between target i and the other k−1 targets; the coordinates of the target with the minimum δ_i give the central imaging point.
Preferably, the image data is quantized as follows. The 32-bit unsigned fixed-point data range is divided into six intervals: the first interval is [0, 256), the second is [256, 4096), the third is [4096, 16384), the fourth is [16384, 65536), the fifth is [65536, 262144), and the sixth is [262144, 4294967296). A gray mapping table of 256 entries is generated for each interval, and the six tables are concatenated into a single table of size 1536. The tables are generated as follows:
for the first interval, with input data n = 0, 1, 2, 3, …, 255, the gray table value after gray mapping is 10×log(n) and the corresponding address index is n;
for the second interval, with input data n = 256, 257, …, 4095, the gray table value is 10×log(round(n/16)×16) and the address index is round(n/16);
for the third interval, with input data n = 4096, 4097, …, 16383, the gray table value is 10×log(round(n/64)×64) and the address index is round(n/64);
for the fourth interval, with input data n = 16384, 16385, …, 65535, the gray table value is 10×log(round(n/256)×256) and the address index is round(n/256);
for the fifth interval, with input data n = 65536, 65537, …, 262143, the gray table value is 10×log(round(n/1024)×1024) and the address index is round(n/1024);
for the sixth interval, with input data n = 262144, 262145, …, 4294967295, the gray table value is 10×log(round(n/4096)×4096) and the address index is round(n/4096).
To quantize a value, first determine which interval it belongs to: if it lies in the 1st interval, the input value itself is used directly as the address index for the table lookup;
if it lies in the 2nd interval, divide the input value by 16 and add 256 to the result to obtain the address index; in the 3rd interval, divide by 64 and add 512; in the 4th interval, divide by 256 and add 768; in the 5th interval, divide by 1024 and add 1024; in the 6th interval, divide by 4096 and add 1280;
the 8-bit gray value obtained from the gray mapping table is the quantized image data.
The invention has the following beneficial effects:
1) The invention organizes SAR imaging in two modes, a wide-area, wide-swath, low-resolution general survey and a high-resolution detailed survey, and combines on-board SAR real-time imaging processing with deep-learning-based target detection and classification/recognition to achieve fast and effective reconnaissance of sensitive areas, acquiring the number, positions, types, confidence levels, image slices and other information of targets of interest. This avoids the long ground-processing delays and poor timeliness caused by directly downlinking massive high-resolution SAR data, breaks through the data-transmission bottleneck, raises the utilization of on-board solid-state storage and of the payload, and markedly improves the operational effectiveness of the satellite.
2) The spaceborne SAR real-time imaging method adopts an on-orbit autonomous closed loop that actively discovers targets and autonomously guides high-resolution imaging. This greatly raises the intelligence level of the satellite, improves the user experience, simplifies satellite operation, and increases the flexibility and timeliness of imaging sensitive targets, meeting users' application needs.
3) The input to the deep-learning forward inference model is floating-point complex data, which carries both target feature information and target phase information; the added information dimension and high numerical precision help improve classification accuracy.
Drawings
FIG. 1 is the imaging task timeline;
FIG. 2 is the broadside stripmap observation model;
FIG. 3 is the backward-squint sliding-spotlight observation model.
Detailed Description
The invention will now be described in detail by way of example with reference to the accompanying drawings.
A spaceborne SAR real-time imaging device comprises a satellite attitude and orbit control unit, a central processing unit, an SAR payload, an SAR imaging processing unit, an intelligent information processing unit, a data processing unit and a payload control unit; these modules cooperate to complete the satellite's SAR imaging tasks. The task flow is as follows: first, broadside stripmap low-resolution wide-area general-survey SAR imaging is performed and the resulting SAR image is searched for an initial detection of targets of interest; then, organized according to the detection results and the relevant satellite information, squint spotlight high-resolution detailed-survey SAR imaging is performed, and deep-learning-based target detection, slice extraction and target recognition are applied to the high-resolution SAR image data; after processing, the target positions, target slices, target types, confidence levels and other information are packaged and downlinked to the ground receiving station, rapidly producing a high-quality intelligence product.
As shown in fig. 1, SAR imaging is divided into two working modes, general survey (census) and detailed survey. In the imaging process, census imaging is performed first: a wide-area imaging search for targets of interest is carried out and their positions are obtained. Detailed-survey imaging parameters are then derived from the wide-area search results, and the satellite platform and payload are organized to image the targets of interest in detail, determining each target's type and longitude/latitude position. The general survey uses a broadside stripmap low-resolution imaging mode, and the detailed survey uses backward-squint spotlight high-resolution imaging. Imaging parameters and modes, such as swath width and resolution, can be chosen flexibly according to task requirements and target characteristics.
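The two-mode flow just described can be sketched as a census-to-detailed-survey closed loop. This is an illustrative orchestration only: all function names and signatures below are hypothetical placeholders, not interfaces defined by the patent.

```python
def run_imaging_task(census_image_fn, detect_fn, detail_image_fn, classify_fn):
    """Closed-loop census -> detailed-survey flow (hypothetical orchestration).

    census_image_fn : produces a low-resolution wide-area SAR image
    detect_fn       : returns a list of (x, y) target coordinates in an image
    detail_image_fn : produces a high-resolution image around a center point
    classify_fn     : returns (type, confidence) for a target in an image
    """
    census_image = census_image_fn()
    targets = detect_fn(census_image)
    if not targets:
        return []  # no target detected: the task ends
    # Densest target: minimum sum of squared coordinate differences to the rest.
    center = min(targets, key=lambda p: sum(
        (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 for q in targets))
    detail_image = detail_image_fn(center)
    results = []
    for x, y in detect_fn(detail_image):
        results.append(((x, y),) + classify_fn(detail_image, (x, y)))
    return results
```

The stubs passed in stand for the payload, imaging and inference units; on the real device each step runs on its own hardware unit and exchanges data over optical fiber and SpaceWire.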
As shown in fig. 2, the broadside stripmap low-resolution imaging procedure is as follows: the central processing unit determines control parameters such as imaging resolution, swath width, payload pointing and satellite attitude from information such as the target size and target area, and sends them to the satellite attitude and orbit control unit and the payload control unit; the attitude and orbit control unit adjusts the satellite attitude to the state specified by the task; the payload control unit computes payload imaging parameters from the control parameters and controls SAR payload operation; the SAR payload scans the designated area according to the imaging parameters, down-converts the received radio-frequency echo signal to an intermediate-frequency analog signal and sends it to the SAR imaging processing unit, which processes the echo signal into an SAR image using the radar parameters sent by the payload control unit.
The target detection flow for the low-resolution SAR image is as follows: the SAR imaging processing unit sends the low-resolution imaging data to the intelligent information processing unit over optical fiber; the intelligent information processing unit first quantizes the SAR image data, converting the original single-precision floating-point complex data into 8-bit fixed-point data, and performs target detection. If no target is detected, the task ends; if targets are detected, their coordinate positions are computed and the position of greatest target density is chosen as the central imaging point for detailed-survey imaging.
The position of the central imaging point for detailed-survey imaging is computed as follows: let target i have coordinates (X_i, Y_i) and let k be the total number of targets. For each target i, compute the sum of squared coordinate differences to the other k−1 targets, δ_i = Σ_{j≠i} [(X_i − X_j)² + (Y_i − Y_j)²]. The coordinates of the target attaining the minimum value δ = min(δ_i), i = 1, 2, …, k, are taken as the central imaging point.
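A minimal sketch of this selection rule, with the squared-difference sum written out explicitly (the function name is illustrative; integer or floating-point image coordinates both work):

```python
def central_imaging_point(coords):
    """Select the center imaging point among k detected targets.

    For target i at (X_i, Y_i), delta_i is the sum of squared coordinate
    differences to the other k-1 targets; the target minimizing delta_i
    lies where targets are densest and becomes the detail-survey center.
    """
    best, best_delta = None, float("inf")
    for i, (xi, yi) in enumerate(coords):
        delta = sum((xi - xj) ** 2 + (yi - yj) ** 2
                    for j, (xj, yj) in enumerate(coords) if j != i)
        if delta < best_delta:
            best, best_delta = (xi, yi), delta
    return best
```

For example, among targets at (0,0), (1,1), (2,2) and an outlier at (10,10), the rule picks (2,2), the point closest to the cluster as a whole.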
As shown in fig. 3, the parameter computation and configuration flow for backward-squint spotlight high-resolution detailed-survey SAR imaging is as follows: from the central imaging point and the target distribution, parameters including detailed-survey imaging resolution, swath width, imaging look-down angle, and satellite orbit and attitude are computed and sent to the central processing unit, which computes the SAR imaging parameters for the detailed-survey mode and distributes the parameters to the payload control unit, the attitude and orbit control unit and the SAR imaging processing unit.
The high-resolution SAR imaging process is as follows: after the imaging parameters are computed and configured, the attitude and orbit control unit moves the satellite to the specified position within the specified time according to the orbit and attitude parameters and adjusts the attitude to the specified state, facilitating subsequent SAR payload target detection; the payload control unit steers the SAR payload beam according to the beam control signal to scan for targets of interest; the SAR imaging processing unit performs SAR imaging on the acquired SAR payload echo signal using the radar parameters to obtain high-resolution SAR image data, which is sent to the intelligent information processing unit through the optical fiber module.
The intelligent information processing unit quantizes the high-resolution SAR image data sent by the SAR imaging processing unit, converting the original single-precision floating-point complex data into 8-bit fixed-point data, and performs target detection. After detection is completed, target slices are extracted from the original floating-point complex image data according to the detection results and sent to the deep-learning forward inference model for target recognition, yielding the type and confidence of each target; in parallel, spatial geometric positioning is performed from the coordinate positions in the detection results together with the SAR payload beam pointing and the satellite orbit and attitude, determining the longitude and latitude of each target. The result information, including target position (longitude and latitude), target slice, target type and target confidence, is sent to data processing units A/B over a SpaceWire bus (a new-generation on-board high-speed data bus standard).
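The result record assembled here can be sketched as a simple structure. The field names and the toy serialization are assumptions for illustration only; the patent does not specify the SpaceWire packet layout.

```python
from dataclasses import dataclass

@dataclass
class TargetResult:
    """One detected target as handed to the data processing unit.

    Fields mirror the result information listed in the text; the concrete
    on-bus encoding is not given in the patent and is assumed here.
    """
    latitude: float     # deg, from space geometric positioning
    longitude: float    # deg
    target_type: str    # class label from the forward inference model
    confidence: float   # classification confidence in [0, 1]
    slice_data: bytes   # image chip around the target, after compression

def encode_result(r: TargetResult) -> bytes:
    """Toy serialization: text header followed by the slice payload."""
    head = f"{r.latitude:.6f},{r.longitude:.6f},{r.target_type},{r.confidence:.3f};"
    return head.encode("utf-8") + r.slice_data
```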
The data processing unit receives the result information, compresses and encodes it, and rapidly downlinks it to form a high-quality intelligence product. In addition, according to task requirements and under command control, the data processing unit can downlink the raw echo data or the processed SAR image data received from the SAR imaging processing unit over optical fiber, for data analysis on the ground.
The floating-point complex data sent to the intelligent information processing unit is quantized for morphology-based target detection as follows: the magnitude (absolute value) of each received floating-point complex sample is computed and converted from single-precision floating point to 32-bit unsigned fixed point; this result is then used as the input to a gray mapping table lookup, yielding 8-bit gray data, i.e. the quantized image data. The gray mapping table is generated as follows:
The 32-bit unsigned fixed-point data range is divided into six intervals: the first interval is [0, 256), the second is [256, 4096), the third is [4096, 16384), the fourth is [16384, 65536), the fifth is [65536, 262144), and the sixth is [262144, 4294967296). A gray mapping table of 256 entries is generated for each interval, and the six tables are concatenated into a single table of size 1536. The tables are generated as follows:
for the first interval, with input data n = 0, 1, 2, 3, …, 255, the gray table value after gray mapping is 10×log(n) and the corresponding address index is n;
for the second interval, with input data n = 256, 257, …, 4095, the gray table value is 10×log(round(n/16)×16) and the address index is round(n/16);
for the third interval, with input data n = 4096, 4097, …, 16383, the gray table value is 10×log(round(n/64)×64) and the address index is round(n/64);
for the fourth interval, with input data n = 16384, 16385, …, 65535, the gray table value is 10×log(round(n/256)×256) and the address index is round(n/256);
for the fifth interval, with input data n = 65536, 65537, …, 262143, the gray table value is 10×log(round(n/1024)×1024) and the address index is round(n/1024);
for the sixth interval, with input data n = 262144, 262145, …, 4294967295, the gray table value is 10×log(round(n/4096)×4096) and the address index is round(n/4096).
To quantize an input value, first determine which interval it belongs to: if it lies in the 1st interval, the input value itself is used directly as the address index for the table lookup;
if it belongs to the 2nd interval, the input value is divided by 16 and 256 is added to the result to form the address index; if it belongs to the 3rd interval, the input value is divided by 64 and 512 is added; if it belongs to the 4th interval, the input value is divided by 256 and 768 is added; if it belongs to the 5th interval, the input value is divided by 1024 and 1024 is added; if it belongs to the 6th interval, the input value is divided by 4096 and 1280 is added to form the address index for the table lookup.
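The table construction and lookup described above can be sketched in Python as follows. This is a minimal illustration, not the patent's implementation: all names are invented, `round()` is used to match the generation rule, and storing 0 for the undefined value 10×log(0) is an assumption the patent does not spell out.

```python
import math

# Interval lower bounds, per-interval divisors, and sub-table offsets,
# mirroring the six intervals described above.
INTERVALS = [
    (0,      1,    0),     # [0, 256)
    (256,    16,   256),   # [256, 4096)
    (4096,   64,   512),   # [4096, 16384)
    (16384,  256,  768),   # [16384, 65536)
    (65536,  1024, 1024),  # [65536, 262144)
    (262144, 4096, 1280),  # [262144, 2**32)
]

def build_gray_table():
    """Build the combined 1536-entry gray mapping table.

    Entry offset + q stores 10*log(q*div), the mapped gray value for inputs
    that quantize to q in that interval.  Since 10*log10(2**32) < 97, the
    values fit in 8 bits.  10*log(0) is undefined, so 0.0 is stored there
    (an assumption; the patent does not say how n = 0 is handled).
    """
    table = [0.0] * 1536
    for _, div, offset in INTERVALS:
        for q in range(256):
            n_rep = q * div  # representative input value for this entry
            table[offset + q] = 10 * math.log10(n_rep) if n_rep > 0 else 0.0
    return table

def quantize(n, table):
    """Map a 32-bit unsigned input to its gray value via the lookup table."""
    # Search from the largest lower bound down to find the containing interval.
    for lower, div, offset in reversed(INTERVALS):
        if n >= lower:
            # round() follows the generation rule; clamping to 255 guards the
            # interval edges where round(n/div) would reach 256.
            q = min(round(n / div), 255)
            return table[offset + q]
```

Because each sub-table is addressed by a coarser quotient in higher intervals, the scheme spends more gray resolution on low amplitudes, which is where most SAR clutter energy lies.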
The target slice data sent to the deep learning model undergoes target classification and recognition. Specifically: according to the target coordinate position reported by the target detection module, the target slice generation module instructs the floating-point complex image data storage module to extract a target slice of the configured size centered on that coordinate, and the extracted floating-point complex slice data is sent to the deep learning model for forward inference, yielding the target type and confidence information.
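The slice-extraction step amounts to a center crop on the complex image. The sketch below is illustrative only: the function name, the default slice size, and the border-clamping behavior are assumptions, since the patent specifies only that the slice is centered on the detected coordinate with a configured size.

```python
import numpy as np

def extract_target_slice(image, center_rc, size=64):
    """Crop a size-by-size patch centered on a detected target.

    image     : 2-D complex SAR image (e.g. np.complex64)
    center_rc : (row, col) target coordinate from the detector
    size      : slice side length, set by configuration
    """
    half = size // 2
    r, c = center_rc
    # Clamp the window so it stays inside the image near the borders.
    r0 = min(max(r - half, 0), image.shape[0] - size)
    c0 = min(max(c - half, 0), image.shape[1] - size)
    return image[r0:r0 + size, c0:c0 + size]
```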
In summary, the above embodiments are only preferred embodiments of the present invention, and are not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (3)
1. The satellite-borne SAR real-time imaging device is characterized by comprising a satellite attitude orbit control unit, a central processing unit, an SAR load, an SAR imaging processing unit, an intelligent information processing unit, a data processing unit and a load control unit;
the central processing unit determines control parameters including imaging resolution, breadth, load direction and satellite attitude according to the target information, and sends the control parameters to the satellite attitude orbit control unit and the load control unit;
the satellite attitude orbit control unit adjusts the satellite attitude to a task specified state according to the control parameters, the load control unit calculates a load imaging parameter according to the control parameters to control SAR load operation, the SAR load starts to detect and scan a specified area according to the imaging parameter, and the received radio frequency echo signal is down-converted into an intermediate frequency analog signal and is sent to the SAR imaging processing unit;
the SAR imaging processing unit processes the echo signals into SAR images by combining the radar parameters sent by the load control unit, and then sends imaging data to the intelligent information processing unit through optical fibers;
the intelligent information processing unit first performs image quantization on the SAR image data, i.e. quantizes the original single-precision floating-point complex data into 8-bit fixed-point data, and then performs target detection: if no target is detected, the task ends; if targets are detected, their coordinate positions are computed and the center of the densest target cluster is determined as the central imaging point for detailed-investigation imaging; detailed-investigation parameter information, including imaging resolution, swath, imaging down-look angle, and satellite orbit and attitude, is computed from the central imaging point and the target distribution and sent to the central processing unit, which calculates the SAR imaging parameters for the detailed-investigation mode and distributes the parameter information to the load control unit, the attitude orbit control unit, and the SAR imaging processing unit;
the attitude orbit control unit moves the satellite to the specified position within the specified time according to the orbit and attitude parameter information and adjusts the attitude to the specified state, facilitating subsequent SAR target detection; the load control unit steers the SAR load beam according to the beam control signal to perform detection scanning of the targets of interest; the SAR imaging processing unit performs SAR imaging processing on the acquired SAR load echo signal using the radar parameters to obtain high-resolution SAR image data, which is sent to the intelligent information processing unit through the optical fiber module;
the intelligent information processing unit firstly carries out image quantization on the high-resolution SAR image data sent by the SAR imaging processing unit, and quantizes the original single-precision floating point complex data into 8-bit fixed point data for target detection processing; after the target detection is completed, on one hand, a target slice is obtained from the original floating point complex image data according to the detection result, the target slice is sent to a deep learning forward reasoning model for target identification processing, and the type and the confidence information of the target are obtained; on the other hand, space geometric positioning processing is carried out according to the coordinate position of the target detection result and SAR load beam pointing, satellite orbit and attitude information, longitude and latitude information of the position of the target is determined, and the target position, the target slice, the target type and target confidence result information are sent to a data processing unit through a SpaceWire bus;
the data processing unit receives the result information, compresses and encodes it, and rapidly downlinks it as high-quality product information; in addition, according to task requirements and under instruction control, the data processing unit downlinks the raw echo data or the processed SAR image data transferred from the SAR imaging processing unit over the optical fiber, facilitating data analysis on the ground.
2. The on-board SAR real-time imaging device according to claim 1, wherein the method for calculating the position of the central imaging point of the targets comprises: the coordinates of target i are (Xi, Yi) and the total number of targets is k; for each target i, the sum of squared coordinate differences Δi between target i and the other k−1 target coordinates is calculated; the position coordinate corresponding to the minimum Δi is the central imaging point.
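The selection rule of claim 2 (pick the target minimizing the sum of squared coordinate differences to the other k−1 targets) can be sketched as follows; the function name is illustrative and ties are broken by taking the first minimal target, which the claim does not specify.

```python
def central_imaging_point(coords):
    """Return the target coordinate whose summed squared distance to the
    other k-1 target coordinates is minimal (the claim-2 rule).

    coords: list of (x, y) target positions.
    """
    def delta(i):
        xi, yi = coords[i]
        # Sum of squared coordinate differences to every other target.
        return sum((xi - x) ** 2 + (yi - y) ** 2
                   for j, (x, y) in enumerate(coords) if j != i)

    best = min(range(len(coords)), key=delta)
    return coords[best]
```

This picks the detected target closest to the centroid of the cluster in a least-squares sense, which matches the stated goal of centering detailed-investigation imaging on the densest group of targets.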
3. The on-board SAR real-time imaging apparatus according to claim 1, wherein the process of quantizing the image data is: the 32-bit unsigned fixed-point range is divided into 6 intervals: the first interval is [0, 256), the second is [256, 4096), the third is [4096, 16384), the fourth is [16384, 65536), the fifth is [65536, 262144), and the sixth is [262144, 4294967296); one gray mapping table of 256 entries is generated for each interval, and the 6 tables are concatenated into a single table of 1536 entries; the first-interval table is generated as follows: assuming the input data is n, where n = 0, 1, 2, 3, …, 255, the gray value after mapping is 10×log(n), and the corresponding address index is n;
the second-interval table is generated as follows: assuming the input data is n, where n = 256, 257, 258, 259, …, 4094, 4095, the gray value after mapping is 10×log(round(n/16)×16), and the corresponding address index is round(n/16);
the third-interval table is generated as follows: assuming the input data is n, where n = 4096, 4097, 4098, 4099, …, 16382, 16383, the gray value after mapping is 10×log(round(n/64)×64), and the corresponding address index is round(n/64);
the fourth-interval table is generated as follows: assuming the input data is n, where n = 16384, 16385, 16386, …, 65534, 65535, the gray value after mapping is 10×log(round(n/256)×256), and the corresponding address index is round(n/256);
the fifth-interval table is generated as follows: assuming the input data is n, where n = 65536, 65537, 65538, 65539, …, 262142, 262143, the gray value after mapping is 10×log(round(n/1024)×1024), and the corresponding address index is round(n/1024);
the sixth-interval table is generated as follows: assuming the input data is n, where n = 262144, 262145, 262146, …, 4294967294, 4294967295, the gray value after mapping is 10×log(round(n/4096)×4096), and the corresponding address index is round(n/4096);
to quantize an input value, first determine which interval it belongs to: if it lies in the 1st interval, the input value itself is used directly as the address index for the table lookup;
if it belongs to the 2nd interval, the input value is divided by 16 and 256 is added to the result to form the address index; if it belongs to the 3rd interval, the input value is divided by 64 and 512 is added; if it belongs to the 4th interval, the input value is divided by 256 and 768 is added; if it belongs to the 5th interval, the input value is divided by 1024 and 1024 is added; if it belongs to the 6th interval, the input value is divided by 4096 and 1280 is added to form the address index for the table lookup;
and the 8-bit gray data obtained from the gray mapping table lookup is the quantized image data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011053099.3A CN112379373B (en) | 2020-09-29 | 2020-09-29 | Space-borne SAR real-time imaging device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112379373A CN112379373A (en) | 2021-02-19 |
CN112379373B true CN112379373B (en) | 2023-09-01 |
Family
ID=74580898
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011053099.3A Active CN112379373B (en) | 2020-09-29 | 2020-09-29 | Space-borne SAR real-time imaging device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112379373B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2613152B (en) * | 2021-11-24 | 2024-10-23 | Iceye Oy | Satellite operations |
CN115097456A (en) * | 2022-08-02 | 2022-09-23 | 北京卫星信息工程研究所 | SAR satellite remote sensing data on-orbit detection method and device and readable storage medium |
CN115685205B (en) * | 2022-12-29 | 2023-04-11 | 北京九天微星科技发展有限公司 | Low-delay target tracking method, device and system |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109507665A (en) * | 2018-10-30 | 2019-03-22 | 北京空间飞行器总体设计部 | It is a kind of based on spaceborne AIS real time information guidance star on autonomous imaging method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5011817B2 (en) * | 2006-05-18 | 2012-08-29 | 日本電気株式会社 | Synthetic aperture radar image processing apparatus and method |
- 2020-09-29: CN202011053099.3A patent/CN112379373B/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109507665A (en) * | 2018-10-30 | 2019-03-22 | 北京空间飞行器总体设计部 | It is a kind of based on spaceborne AIS real time information guidance star on autonomous imaging method |
Non-Patent Citations (1)
Title |
---|
Wang Pengbo; Zhou Yinqing; Chen Jie; Li Chunsheng. High-resolution spotlight SAR imaging algorithm based on two-dimensional deramp processing. Journal of Beijing University of Aeronautics and Astronautics, 2007, (No. 01), full text. *
Also Published As
Publication number | Publication date |
---|---|
CN112379373A (en) | 2021-02-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112379373B (en) | Space-borne SAR real-time imaging device | |
Pu et al. | Motion errors and compensation for bistatic forward-looking SAR with cubic-order processing | |
EP0097490B1 (en) | Range/azimuth/elevation angle ship imaging for ordnance control | |
EP0100141B1 | Range/Doppler ship imaging for ordnance control
Capraro et al. | Implementing digital terrain data in knowledge-aided space-time adaptive processing | |
KR102028324B1 (en) | Synthetic Aperture Radar Image Enhancement Method and Calculating Coordinates Method | |
CN106204629A (en) | Space based radar and infrared data merge moving target detection method in-orbit | |
CN111989588A (en) | Symmetric multi-base radar constellation for earth observation | |
Wang et al. | Resolution calculation and analysis in bistatic SAR with geostationary illuminator | |
KR102151362B1 (en) | Image decoding apparatus based on airborn using polar coordinates transformation and method of decoding image using the same | |
WO2020151213A1 (en) | Air and ground combined intertidal zone integrated mapping method | |
CN106569206B (en) | A kind of object detection method compound based on Microwave Optics | |
US11249185B2 (en) | Signal processing device and radar apparatus | |
Santi et al. | Passive multistatic SAR with GNSS transmitters: Preliminary experimental study | |
Saeedi | Feasibility study and conceptual design of missile-borne synthetic aperture radar | |
CN117075076B (en) | Sport ship positioning method using detection imaging synthetic aperture radar | |
CN107728144B (en) | Interference SAR imaging method based on forward-looking double-basis mode | |
CN117538873A (en) | SAR (synthetic aperture radar) offshore target positioning method and system based on Doppler displacement estimation | |
CN112596037A (en) | Distributed SAR anti-interference efficiency evaluation method and system | |
KR102028323B1 (en) | Synthetic Aperture Radar Image Enhancement Apparatus and System | |
Zhou et al. | High-squint SAR imaging for noncooperative moving ship target based on high velocity motion platform | |
CN113030943B (en) | Multi-target tracking algorithm based on monopulse radar signal acquisition azimuth range profile | |
CN113989659A (en) | Ship target rapid detection method facing GEO remote sensing satellite | |
KR102053845B1 (en) | Method and apparatus for tracking target based on phase gradient autofocus | |
Wojaczek et al. | First results of polarimetric passive SAR imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||