CN111476705A - Active and passive three-dimensional imaging real-time processing system and method - Google Patents

Info

Publication number
CN111476705A
CN111476705A (application CN202010471907.1A)
Authority
CN
China
Prior art keywords
data
image
module
laser
passive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010471907.1A
Other languages
Chinese (zh)
Inventor
李传荣
贺文静
潘苗苗
胡坚
李子扬
黎荆梅
周春城
韩雅兰
姚强强
陈林生
朱运维
何锐斌
Current Assignee
Aerospace Information Research Institute of CAS
Original Assignee
Aerospace Information Research Institute of CAS
Priority date
Filing date
Publication date
Application filed by Aerospace Information Research Institute of CAS
Priority to CN202010471907.1A
Publication of CN111476705A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G06T 1/20 Processor architectures; Processor configuration, e.g. pipelining
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to group G01S17/00
    • G01S 7/497 Means for monitoring or calibrating

Abstract

The invention provides an active and passive three-dimensional imaging real-time processing system and method. The system comprises a PS (Processing System) end, connected to the peripheral end through an external interface controller, which uses a serial processor to realize operations with low parallelism; the PS end comprises a system initialization module, a data preprocessing module, a laser ranging data error correction module and an optical image data processing module. A PL (Programmable Logic) end is connected to the PS end through an AXI (Advanced eXtensible Interface) bus and uses programmable logic to realize highly parallelized operations; the PL end comprises two DMA controllers, namely a first DMA controller and a second DMA controller, a geodetic coordinate solving module connected with the first DMA controller, a Gaussian projection module with one end connected to the geodetic coordinate solving module, and a collinearity equation solving module connected with the second DMA controller.

Description

Active and passive three-dimensional imaging real-time processing system and method
Technical Field
The disclosure relates to the technical field of remote sensing data high-speed processing, in particular to a system and a method for active and passive three-dimensional imaging real-time processing.
Background
In recent years, using remote sensing technology to quickly and accurately acquire multi-dimensional attribute information of an observation target, such as space, texture, spectrum and material, has become a research hotspot. The active and passive integrated imaging technology, a novel remote sensing detection technology, integrates an active laser radar and a passive optical imager in a common-light-path mode: the two sensors share the light path and adopt the same observation scanning mode, and, matched with a three-dimensional imaging processing system, realize the integrated acquisition of space-dimension and spectrum-dimension information of an observation area together with the real-time display of three-dimensional imaging. The active and passive integrated imaging technology can therefore be widely applied in remote sensing fields such as urban planning, land utilization, environment monitoring and disaster assessment.
However, with the increasing load data acquisition capability, the amount of remote sensing data has grown explosively, which puts an urgent demand on high-speed real-time processing capability. Moreover, active and passive three-dimensional imaging processing must simultaneously handle optical load image data, laser radar ranging data and the corresponding position and attitude data; the processing involves matrix operations, transcendental operations and the like on a large amount of double-precision floating-point data, and is both data-intensive and compute-intensive. In addition, system constraints such as volume and power consumption must be considered comprehensively. These factors all present significant challenges to active and passive three-dimensional imaging system design.
The development and maturation of SoC (System on Chip) design technology opens up new space for low-power, high-speed data processing. Integrating high-performance processing cores, accelerator modules, special-purpose modules, memories and other computing and storage devices into one chip effectively reduces the off-chip data transmission bandwidth requirement, so higher system performance can be obtained. Software and hardware co-design technology fully considers the inseparable relationship between software and hardware in the SoC, so that software design and hardware design proceed in parallel as an organic whole, the respective advantages of software and hardware are fully exploited, and the system obtains highly efficient working capability.
However, implementing the active and passive three-dimensional imaging processing SoC design with a software and hardware co-design method requires breakthroughs on two problems. On one hand, for the given hardware resources, how to divide the software and hardware tasks reasonably so that the system performs well in time consumption, hardware resource consumption, power consumption and the like. On the other hand, how to realize efficient interaction between software and hardware, which is the key to improving the overall performance of the system.
BRIEF SUMMARY OF THE PRESENT DISCLOSURE
Technical problem to be solved
Based on the above problems, the present disclosure provides an active and passive three-dimensional imaging real-time processing system and method, so as to alleviate the technical problems of poor performance and high power consumption of active and passive three-dimensional imaging processing systems and methods in the prior art.
(II) technical scheme
In one aspect of the present disclosure, there is provided an active-passive three-dimensional imaging real-time processing system, comprising:
the PS end, connected with the peripheral end through an external interface controller, which uses a serial processor to realize operations with lower parallelism, wherein the PS end comprises:
the system initialization module is used for completing initialization configuration after the system is powered on;
the data preprocessing module, used for completing accurate time registration of the optical image data, laser ranging data and position and attitude data, and for completing, using the position and attitude data, the calculation of the rotation matrix and translation parameters in the laser point cloud solving model and the calculation of the exterior orientation elements of the optical image;
the laser ranging data error correction module is used for carrying out laser ranging data error correction processing on the ranging value;
the optical image data processing module is used for carrying out data calculation by utilizing the optical image data and the corresponding external orientation elements and the laser point cloud data;
the PL end, connected with the PS end through an AXI bus, which uses programmable logic to realize highly parallelized operations, wherein the PL end comprises:
the two DMA controllers, namely a first DMA controller and a second DMA controller, used for high-speed transmission of data between the PS end and the PL end;
the geodetic coordinate solving module, connected with the first DMA controller, used for solving the geodetic coordinates of each laser point based on the array push-broom airborne laser radar three-dimensional point cloud solving model;
the Gaussian projection module, one end of which is connected with the geodetic coordinate solving module, used for performing the Gauss plane rectangular coordinate solution of the laser points and transmitting the result to the PS-end memory through the first DMA (Direct Memory Access) controller;
and the collinearity equation solving module, connected with the second DMA controller, used for realizing the ground coordinate calculation of the ground point corresponding to each optical image point.
In an embodiment of the present disclosure, the operation with low parallelism includes: data preprocessing, laser ranging data error correction and elevation interpolation.
In an embodiment of the present disclosure, the highly parallelized operations include: geodetic coordinate solving, Gaussian projection and collinearity equation solving.
In an embodiment of the present disclosure, the laser ranging data error correction processing includes: correcting system errors and filtering out gross error points.
In an embodiment of the present disclosure, the collinearity equation solving module comprises: a controller, an arithmetic unit, an input data cache BRAM, an output data cache BRAM and a parameter cache register.
In another aspect of the present disclosure, there is provided an active and passive three-dimensional imaging real-time processing method, which performs active and passive three-dimensional imaging real-time processing based on the active and passive three-dimensional imaging real-time processing system described in any one of the above, the active and passive three-dimensional imaging real-time processing method including:
step S1: accurately registering the observed load original data through a time code to complete the matching of the multi-load data in a time dimension; the observation load raw data comprises: optical image data, laser ranging data, position and attitude data;
step S2: using the position and attitude data, respectively calculating the rotation matrix and translation parameters for coordinate transformation from the laser sensor coordinate system to the geodetic coordinate system, and the exterior orientation elements at the imaging time of the optical linear array image;
step S3: completing system error correction of the laser ranging data by using the calibration parameters and performing rapid gross error correction;
step S4: solving the geodetic coordinates of each laser point based on an array push-broom type airborne laser radar three-dimensional point cloud resolving model by using laser ranging data, a rotation matrix of coordinate conversion and translation parameters;
step S5: solving the Gaussian plane rectangular coordinate of each laser point;
step S6: calculating the elevation value of the image point according to the corresponding relation between the image point of the optical camera and the laser point in the optical image data and by combining the laser point cloud result;
step S7: according to the coordinates and corresponding elevations of the image points of the optical image and the interior and exterior orientation elements of the optical image, solving the ground coordinates of the ground point corresponding to each image point according to the collinearity equation; and
step S8: and giving a gray value to each image point of the optical image to obtain a three-dimensional image, thereby completing the real-time processing of the active and passive three-dimensional imaging.
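The point-by-point solve of step S4 reduces, for each laser return, to a rigid transform of the range vector. The patent text does not spell out the array push-broom model itself, so the sketch below only shows the assumed core form, ground point = rotation · range vector + translation; the function and parameter names are illustrative, not taken from the patent.

```python
def laser_point_to_geodetic_frame(rho, unit_dir, R, T):
    """Transform one laser return into the geodetic frame.

    rho      -- corrected range measurement (metres)
    unit_dir -- laser pointing unit vector in the sensor frame
    R        -- 3x3 rotation matrix, sensor frame -> geodetic frame (from POS data)
    T        -- sensor origin in the geodetic frame (the translation parameters)
    """
    # Range vector in the sensor frame
    r = [rho * c for c in unit_dir]
    # Rigid transform: P = R @ r + T
    return [sum(R[i][j] * r[j] for j in range(3)) + T[i] for i in range(3)]
```

Because each return depends only on its own (rho, unit_dir) and the shared (R, T), the loop over returns has no cross-point data dependence, which is why the patent maps it onto pipelined programmable logic.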
In the embodiment of the present disclosure, in the step S3, the gross error points in the ranging values are filtered according to the distance threshold, and the missing points are supplemented by using an interpolation method, so as to implement fast gross error correction.
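A minimal sketch of the fast gross error correction described above, assuming a one-dimensional ranging profile, a simple jump-against-last-accepted-value threshold test, and linear interpolation for the rejected points; the function name and the exact rejection rule are illustrative assumptions, not the patent's specification.

```python
def correct_ranging_profile(ranges, threshold):
    """Flag gross-error points whose jump from the last accepted value exceeds
    `threshold`, then fill each flagged point by linear interpolation between its
    nearest valid neighbours (clamped to the nearest valid value at the ends)."""
    out = list(ranges)
    valid = [True] * len(out)
    last = out[0]
    for i in range(1, len(out)):
        if abs(out[i] - last) > threshold:
            valid[i] = False          # gross-error point: reject
        else:
            last = out[i]             # accept and advance the reference value
    good = [i for i, v in enumerate(valid) if v]
    for i in range(len(out)):
        if valid[i]:
            continue
        left = max((g for g in good if g < i), default=None)
        right = min((g for g in good if g > i), default=None)
        if left is None:
            out[i] = out[right]
        elif right is None:
            out[i] = out[left]
        else:                         # linear interpolation between valid neighbours
            t = (i - left) / (right - left)
            out[i] = out[left] + t * (out[right] - out[left])
    return out
```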
In the embodiment of the present disclosure, the step S5 obtains a laser point cloud result after gaussian projection.
In the embodiment of the present disclosure, in step S6, the elevation value of the image point is calculated by a linear interpolation method according to the corresponding relationship between the image point of the optical camera in the optical image data and the laser point in the laser ranging data and by combining the result of the laser point cloud.
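The linear elevation interpolation of step S6 can be sketched as follows, assuming (hypothetically) that the laser footprints around an image point can be indexed by a single along-track coordinate; the real module works from the image-point/laser-point correspondence, which this text does not detail.

```python
def interpolate_elevation(x_img, laser_points):
    """laser_points: list of (x, z) pairs sorted by x -- the along-track position
    of each laser footprint and its solved elevation. Returns the elevation at
    x_img by linear interpolation between bracketing laser points (clamped at
    the ends of the profile)."""
    if x_img <= laser_points[0][0]:
        return laser_points[0][1]
    if x_img >= laser_points[-1][0]:
        return laser_points[-1][1]
    for (x0, z0), (x1, z1) in zip(laser_points, laser_points[1:]):
        if x0 <= x_img <= x1:
            t = (x_img - x0) / (x1 - x0)   # fractional position between footprints
            return z0 + t * (z1 - z0)
```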
In an embodiment of the present disclosure, the step S7 includes:
substep S71: implementing, by the controller, message communication with a second DMA controller; enabling control of the arithmetic unit is realized; the read-write control of an input data cache BRAM, an output data cache BRAM and a parameter cache register is realized;
substep S72: receiving elevation data corresponding to the optical image point and external orientation elements of the image through a second DMA controller, and caching by utilizing an input data cache BRAM;
substep S73: reading the external orientation element data of the image from an input data cache BRAM, and writing the external orientation element data into a parameter cache register for caching;
substep S74: by means of the operation unit and with the help of the FPGA, the XY coordinates of the image points are solved according to a collinear condition equation by utilizing the coordinates and the elevation values of the image points and the elements of the external orientation at the image imaging time and combining the elements of the internal orientation, and the calculation formula is as follows:
X_A = X_S + (Z_A - Z_S)·(a1·x + a2·y - a3·f)/(c1·x + c2·y - c3·f)
Y_A = Y_S + (Z_A - Z_S)·(b1·x + b2·y - b3·f)/(c1·x + c2·y - c3·f)
wherein X_A, Y_A, Z_A are the object-space three-dimensional coordinates, corresponding to the image point, that are to be solved; X_S, Y_S, Z_S is the position of the optical centre in the object-space coordinate system at imaging time; a1, a2, a3, b1, b2, b3, c1, c2, c3 are the elements of the rotation matrix of the optical centre relative to the object-space coordinate system at the imaging moment, generated from the three exterior orientation angle elements; x and y are the image coordinates of the image point; and f is the principal distance of the image. And
substep S75: and enabling the output data cache BRAM to cache the ground coordinate data of the ground point corresponding to the optical image, and finally sending the ground coordinate data to the PS end through the second DMA controller.
(III) advantageous effects
According to the technical scheme, the active and passive three-dimensional imaging real-time processing system and the method have at least one or part of the following beneficial effects:
(1) the system operation performance can be greatly improved, and low-power consumption and high-speed data processing can be realized;
(2) the defects of high system complexity, poor real-time performance and the like caused by algorithm complexity, resource limitation and the like under a single processor architecture are overcome.
Drawings
Fig. 1 is a schematic diagram of a composition architecture of an active and passive three-dimensional imaging processing system according to an embodiment of the disclosure.
Fig. 2 is a schematic flow chart of an active and passive three-dimensional imaging processing method according to an embodiment of the disclosure.
Fig. 3 is a schematic flow architecture diagram of an active and passive three-dimensional imaging processing method according to an embodiment of the disclosure.
Fig. 4 is a schematic diagram of the working principle and the processing flow of the collinearity equation resolving module in the embodiment of the disclosure.
Detailed Description
Based on the ZYNQ heterogeneous multi-core processor, the present disclosure uses the ARM and the FPGA for software and hardware co-design, reasonably partitions the active and passive three-dimensional imaging processing algorithm and flow, and gives full play to the respective advantages of software and hardware, so that the system obtains the best performance.
For the purpose of promoting a better understanding of the objects, aspects and advantages of the present disclosure, reference is made to the following detailed description taken in conjunction with the accompanying drawings.
In an embodiment of the present disclosure, there is provided an active and passive three-dimensional imaging real-time processing system, which is shown in fig. 1, 3, and 4, and includes:
a PS (Processing System) end, connected to the peripheral end through an external interface controller, which uses a serial processor to implement operations with lower parallelism such as data preprocessing, laser ranging data error correction and elevation interpolation, wherein the PS end comprises:
the system initialization module is used for completing initialization configuration after the system is powered on;
the data preprocessing module, used for completing accurate time registration of the optical image data, laser ranging data and position and attitude data, and for completing, using the position and attitude data, the calculation of the rotation matrix and translation parameters in the laser point cloud solving model and the calculation of the exterior orientation elements of the optical image;
the laser ranging data error correction module, used for performing laser ranging data error correction processing on the ranging values, such as systematic error correction and gross-error point filtering, by using the calibration parameters, the distance threshold and the like;
the optical image data processing module, used for performing calculations such as elevation interpolation and data fusion by using the optical image data together with the corresponding exterior orientation elements and laser point cloud data;
a PL (Programmable Logic) end, connected with the PS end through an AXI bus, which uses programmable logic to implement highly parallelized operations such as geodetic coordinate solving, Gaussian projection and collinearity equation solving, wherein the PL end comprises:
two DMA (Direct Memory Access) controllers, namely a first DMA controller (AXI_DMA1) and a second DMA controller (AXI_DMA2), used for high-speed transmission of data between the PS end and the PL end;
the geodetic coordinate solving module, connected with the first DMA controller, used for solving the geodetic coordinates of each laser point based on the array push-broom airborne laser radar three-dimensional point cloud solving model;
the Gaussian projection module, one end of which is connected with the geodetic coordinate solving module, used for performing the Gauss plane rectangular coordinate solution of the laser points and transmitting the result to the PS-end memory through the first DMA (Direct Memory Access) controller;
and the collinearity equation solving module, connected with the second DMA controller, used for realizing the ground coordinate calculation of the ground point corresponding to each optical image point.
The peripheral end is used for acquiring the original data of the observed load and displaying the three-dimensional image in real time, and comprises the following components:
a position and orientation measurement device for providing position and orientation data;
a laser radar for providing laser ranging data;
an optical camera, for providing optical image data; and
an image display device, for displaying the three-dimensional image in real time.
the system initialization module firstly realizes initialization configuration of an external interface, an interrupt, a timer and the like, and then completes calibration parameter configuration of a PS end and a P L end.
The data preprocessing module first completes accurate time registration of the optical image data, laser ranging data and position and attitude data using interpolation, and uses the position and attitude data to calculate the rotation matrix and translation parameters for coordinate transformation from the laser sensor coordinate system to the WGS84 geodetic coordinate system as well as the exterior orientation elements of the optical image. The preprocessed data is sent to the DDR3 for caching. Meanwhile, to support the calculation of the geodetic coordinate solving module and the collinearity equation solving module in the PL end, the PS end transmits the rotation matrix and translation parameters, and the exterior orientation elements of the optical image, to the two modules respectively.
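The rotation-matrix computation from the position and attitude data can be illustrated as below. The angle convention and rotation order of the POS unit are not stated in this text, so the common aerospace yaw-pitch-roll (Z-Y-X) order is assumed here; the function name is illustrative.

```python
import math

def attitude_to_rotation(roll, pitch, yaw):
    """Rotation matrix from three attitude angles (radians), assuming the
    Z-Y-X (yaw-pitch-roll) rotation order: R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
```

The translation parameters are simply the interpolated sensor position at the same timestamp, so the (R, T) pair handed to the PL-end geodetic coordinate solving module is fully determined per scan line.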
The laser ranging data error correction module completes ranging data calibration and systematic error correction of the ranging values using the calibration parameters, filters gross-error points in the ranging data according to the distance threshold, and supplements missing points by interpolation, thereby realizing rapid gross error correction of the laser ranging data. Meanwhile, to realize cooperative calculation with the geodetic coordinate solving module and the Gaussian projection module in the PL end, the module realizes the interaction and control of the laser ranging data, the laser point coordinate data and the like.
The optical image data processing module performs calculations such as elevation interpolation and data fusion on the optical image data together with the corresponding exterior orientation elements and laser point cloud data. Meanwhile, to realize cooperative calculation with the PL-end collinearity equation solving module, the module realizes the interaction and control of data such as the optical image geometric data and the exterior orientation elements of the optical image.
The DMA controllers comprise a first DMA controller (AXI_DMA1) and a second DMA controller (AXI_DMA2). Data interaction and control between the PS end and the PL end adopt a DMA high-speed transmission mode based on the AXI4 protocol, implemented with the Xilinx official IP core AXI DMA (AXI Direct Memory Access).
The geodetic coordinate solving module mainly performs the geodetic coordinate solution of each laser point based on the array push-broom airborne laser radar three-dimensional point cloud solving model. This solution has strong parallelism, so implementing it with programmable logic resources is beneficial to computing performance. The module effectively optimizes the running performance of its loops using pipelined parallel processing. It exchanges data with the PS-end laser ranging data error correction module and data preprocessing module through AXI_DMA1 to obtain the laser ranging data, rotation matrix and translation parameters, and transmits the processed result to the Gaussian projection module through a FIFO (First In First Out) buffer.
The Gaussian projection module mainly realizes the conversion of the laser points from the WGS84 geodetic coordinate system to the Gauss plane rectangular coordinate system; each laser point is solved point by point to obtain the laser point cloud result after Gaussian projection. The module realizes parallel processing of multi-point data in a pipelined manner. It acquires the geodetic coordinate values through the FIFO and, after further data processing, transmits the solution result to the PS-end memory through AXI_DMA1.
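The point-by-point mapping performed by this module can be illustrated with a deliberately simplified model. The production module would use the full Gauss-Krüger ellipsoidal series for WGS84; the spherical transverse-Mercator form below is only a sketch of the per-point computation and is not the patent's formula.

```python
import math

def gauss_projection_spherical(lat, lon, lon0, R=6378137.0):
    """Spherical transverse-Mercator sketch of the Gauss plane coordinates.
    lat, lon, lon0 in radians; lon0 is the central meridian; R approximates the
    Earth radius. Returns (x_east, y_north) in metres."""
    dlon = lon - lon0
    x = R * math.atanh(math.cos(lat) * math.sin(dlon))   # easting from central meridian
    y = R * math.atan2(math.tan(lat), math.cos(dlon))    # northing along the meridian
    return x, y
```

Each laser point is independent of the others, which is what lets the module process multi-point data in a pipelined manner.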
The collinearity equation solving module realizes the ground coordinate calculation of the ground point corresponding to each image point of the optical image. The module uses programmable logic resources in a pipelined parallel manner and caches key data in registers to improve the data access rate. It exchanges data with the PS-end optical image data processing module and data preprocessing module through AXI_DMA2, obtains the optical image data and the exterior orientation elements of the optical image, and returns the processing result to the optical image data processing module.
The external interface controller comprises an RS-232 interface, an RS-422 interface, a Camera Link interface and an Ethernet interface. The first three realize data communication with the external observation load equipment, which includes the position/attitude measurement equipment, the laser radar, the optical camera and other devices; the Ethernet interface realizes communication with the image display device and supports real-time display and storage of three-dimensional images.
The present disclosure further provides an active and passive three-dimensional imaging real-time processing method. The software and hardware partition design considers not only the computational characteristics and the processor characteristics but also the cohesion and coupling characteristics of the processing algorithm: the active and passive three-dimensional imaging processing algorithm is analyzed and decomposed into modules with relatively independent and complete functions, comprising a time registration module, a position and attitude data processing module, a geodetic coordinate solving module, a Gaussian projection module, an elevation interpolation module, a collinearity equation solving module and a data fusion module. Referring to figs. 1-4, the active and passive three-dimensional imaging real-time processing method comprises:
step S1: the time configuration module is used for accurately registering the observed load original data through time codes to complete the matching of the multi-load data in the time dimension; the observation load raw data comprises: optical image data, laser ranging data, position and attitude data;
step S2: the external orientation element resolving module respectively calculates rotation matrix and translation parameters for coordinate transformation from a laser sensor coordinate system to a WGS84 geodetic coordinate system and external orientation elements at the time of imaging of the optical linear array image by using the position posture data;
step S3: the ranging data error correction module completes the system error correction of the laser ranging data by using the calibration parameters and performs quick gross error correction;
step S4: the geodetic coordinate calculation module is used for calculating the geodetic coordinate of each laser point based on an array push-broom type airborne laser radar three-dimensional point cloud calculation model by using laser ranging data, a rotation matrix of coordinate conversion and translation parameters;
step S5: solving the Gaussian plane rectangular coordinate of each laser point by a Gaussian projection module;
step S6: the elevation interpolation module calculates the elevation value of the image point according to the corresponding relation between the image point of the optical camera and the laser point in the optical image data and by combining the laser point cloud result;
step S7: the collinearity equation resolving module resolves and calculates the ground coordinates of the ground points corresponding to each image point according to the collinearity equation according to the coordinates and corresponding elevations of the image points of the optical images and the internal and external orientation elements of the optical images;
step S8: the data fusion module endows a gray value to each image point of the optical image to obtain a three-dimensional image, and the real-time processing of the active and passive three-dimensional imaging is completed.
The operation in step S1 is simple, but it requires load data reading, time code extraction, and data registration and packing, so the data access pressure is high. The data access efficiency is improved by the multi-level cache design combining the SRAM (Static Random-Access Memory) and the Cache at the PS end, so this step is allocated to the PS end for implementation.
In steps S2 and S3, the exterior orientation element calculation and the laser ranging data error correction calculation are relatively simple, but involve many judgment operations, transcendental operations and the like, which are suitable for software implementation; they are therefore allocated to the PS end.
In step S3, the gross error points in the distance measurement values are filtered according to the distance threshold, and the missing points are supplemented by an interpolation method, thereby implementing fast gross error correction.
In step S5, a laser point cloud result after gaussian projection is obtained.
In step S6, according to the correspondence between the image point of the optical camera in the optical image data and the laser point in the laser ranging data, the elevation value of the image point is calculated by a linear interpolation method in combination with the laser point cloud result;
in step S6, the elevation interpolation process needs to determine the index of each laser spot, and a large number of determination operations are suitable for being implemented in software, and therefore, are allocated to the PS end.
In the step S8, the data fusion process reads, assigns, and stores the optical image point data point by point, so that the data access pressure is high, and the data access pressure is suitable for software implementation and is distributed to the PS end for implementation;
in the steps S4, S5 and S7, coordinate calculation is carried out point by point for laser points or optical image points, no data dependence exists between different laser points and image points, the parallelism degree is high, and the method accords with the characteristic that programmable logic is suitable for repeated operation, and therefore the method is distributed to a P L end to be realized.
In the disclosed embodiment, the geodetic coordinates of each optical image point are solved in step S7 by the collinearity equation solving module using the collinearity equations. As shown in fig. 4, the collinearity equation solving module comprises: a controller, an arithmetic unit, an input data cache BRAM (Block Random Access Memory), an output data cache BRAM, and a parameter cache register. The step S7 comprises:
Substep S71: the controller handles message communication with the second DMA controller, such as task control and state feedback; it provides enable control for the arithmetic unit; and it provides read-write control for the input data cache BRAM, the output data cache BRAM, and the parameter cache register.
Substep S72: the elevation data corresponding to the optical image points and the external orientation elements of the image are received through the second DMA controller and cached in the input data cache BRAM.
Substep S73: the external orientation element data of the image are read from the input data cache BRAM and written into the parameter cache register. The external orientation elements are small in data volume but are called frequently during the solution, so using the parameter cache register improves access efficiency.
Substep S74: using the DSP and LUT (Look-Up Table) resources in the FPGA, the arithmetic unit solves the X and Y ground coordinates corresponding to each image point according to the collinearity condition equations, from the image point coordinates and elevation values and the external orientation elements at the image imaging time, in combination with the internal orientation elements. The calculation formula is as follows:
\[
X_A = X_S + (Z_A - Z_S)\,\frac{a_1 x + a_2 y - a_3 f}{c_1 x + c_2 y - c_3 f},
\qquad
Y_A = Y_S + (Z_A - Z_S)\,\frac{b_1 x + b_2 y - b_3 f}{c_1 x + c_2 y - c_3 f}
\]
wherein (X_A, Y_A, Z_A) are the object-space three-dimensional coordinates corresponding to the image point to be solved; (X_S, Y_S, Z_S) is the position of the optical center in the object-space coordinate system at the imaging time; a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3 are the elements of the rotation matrix of the optical center relative to the object-space coordinate system at the imaging time, generated from the three angular elements of external orientation; x and y are the image coordinates of the image point; f is the principal distance of the image.
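A numerical sketch of the inverse collinearity solution of substep S74, using the symbols defined above; the function name and argument layout are illustrative assumptions.

```python
def image_to_ground(x, y, f, Z_A, eo, R):
    """Inverse collinearity equation: project image point (x, y) with
    principal distance f onto the plane Z = Z_A, given the optical-centre
    position eo = (X_S, Y_S, Z_S) and the rotation matrix
    R = [[a1, a2, a3], [b1, b2, b3], [c1, c2, c3]]."""
    X_S, Y_S, Z_S = eo
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) = R
    denom = c1 * x + c2 * y - c3 * f
    X_A = X_S + (Z_A - Z_S) * (a1 * x + a2 * y - a3 * f) / denom
    Y_A = Y_S + (Z_A - Z_S) * (b1 * x + b2 * y - b3 * f) / denom
    return X_A, Y_A
```

With an identity rotation matrix and the optical centre 1000 m above the plane Z_A = 0, the image point (0.01, 0.02) with f = 0.1 maps to ground coordinates near (100, 200), as the similar-triangle geometry predicts.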
Substep S75: the output data cache BRAM caches the ground coordinate data of the ground points corresponding to the optical image points, and the data are finally sent to the PS end through the second DMA controller.
Data communication between the PL end and the PS end is a key component of software-hardware cooperative processing. The PL end has small local storage with fast access, while the PS end has large storage with slower access. To obtain maximum transfer performance between the PS end and the PL end, a direct memory access high-speed transfer mode based on the AXI-4 protocol is adopted, supporting fast movement of data blocks without CPU (Central Processing Unit) intervention. The DMA (Direct Memory Access) controller is controlled jointly by the PL-end hardware accelerator and the PS-end CPU, realizing data exchange between the BRAM on the PL end and the memory on the PS end.
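The descriptor-driven block transfer can be pictured with a toy software model: the CPU's only action is writing one transfer descriptor, after which the "engine" moves the whole block, mimicking burst DMA without per-word CPU involvement. The class and method names here are invented for illustration and do not correspond to any real AXI DMA driver API.

```python
class ToyDMA:
    """Toy model of the descriptor-driven block transfer on the
    PS<->PL link: the CPU writes one descriptor, then the engine moves
    the whole block without per-word CPU involvement."""

    def __init__(self):
        self.descriptor = None

    def program(self, src, src_off, dst, dst_off, length):
        # The only CPU action: write a transfer descriptor.
        self.descriptor = (src, src_off, dst, dst_off, length)

    def run(self):
        src, so, dst, do, n = self.descriptor
        dst[do:do + n] = src[so:so + n]   # burst copy of the whole block

ddr = list(range(16))        # stand-in for PS-side DDR memory
bram = [0] * 8               # stand-in for PL-side BRAM
dma = ToyDMA()
dma.program(ddr, 4, bram, 0, 8)
dma.run()                    # bram now holds ddr[4:12]
```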
So far, the embodiments of the present disclosure have been described in detail with reference to the accompanying drawings. It should be noted that implementations not shown or described in the drawings or the specification are forms known to those of ordinary skill in the art and are not described in detail. Further, the above definitions of the various elements and methods are not limited to the specific structures, shapes, or arrangements mentioned in the embodiments, which may be easily modified or substituted by those of ordinary skill in the art.
From the above description, those skilled in the art should have a clear understanding of the active and passive three-dimensional imaging real-time processing system and method of the present disclosure.
In summary, the present disclosure provides a real-time processing system and method for active and passive three-dimensional imaging, which divide the functional modules according to the respective processing characteristics and strengths of the ARM and the FPGA, and then design the data processing, caching, interfaces, and task control of the PS end and the PL end, thereby achieving efficient cooperative work of software and hardware.
It should also be noted that directional terms, such as "upper", "lower", "front", "rear", "left", "right", and the like, used in the embodiments are only directions referring to the drawings, and are not intended to limit the scope of the present disclosure. Throughout the drawings, like elements are represented by like or similar reference numerals. Conventional structures or constructions will be omitted when they may obscure the understanding of the present disclosure.
And the shapes and sizes of the respective components in the drawings do not reflect actual sizes and proportions, but merely illustrate the contents of the embodiments of the present disclosure. Furthermore, in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim.
Unless otherwise indicated, the numerical parameters set forth in the specification and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by the present disclosure. In particular, all numbers expressing quantities of ingredients, reaction conditions, and so forth used in the specification and claims are to be understood as being modified in all instances by the term "about". Generally, the expression is meant to encompass variations of ±10% in some embodiments, ±5% in some embodiments, ±1% in some embodiments, and ±0.5% in some embodiments of the specified amount.
Furthermore, the word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements.
The use of ordinal numbers such as "first," "second," "third," etc., in the specification and claims to modify a corresponding element does not by itself connote any ordinal number of the element or any ordering of one element from another or the order of manufacture, and the use of the ordinal numbers is only used to distinguish one element having a certain name from another element having a same name.
In addition, unless steps are specifically described or must occur in sequence, the order of the steps is not limited to that listed above and may be changed or rearranged as desired by the desired design. The embodiments described above may be mixed and matched with each other or with other embodiments based on design and reliability considerations, i.e., technical features in different embodiments may be freely combined to form further embodiments.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Also in the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the disclosure, various features of the disclosure are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various disclosed aspects. However, this method of disclosure should not be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, as the following claims reflect, disclosed aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this disclosure.
The above-mentioned embodiments are intended to illustrate the objects, aspects and advantages of the present disclosure in further detail, and it should be understood that the above-mentioned embodiments are only illustrative of the present disclosure and are not intended to limit the present disclosure, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (10)

1. An active passive three-dimensional imaging real-time processing system, comprising:
the PS end, which is connected with peripherals through an external interface controller and realizes operations with lower parallelism by using a serial processor, the PS end comprising:
the system initialization module is used for completing initialization configuration after the system is powered on;
the data preprocessing module is used for completing accurate time registration of the optical image data, laser ranging data, and position attitude data, and for completing, by using the position attitude data, calculation of the rotation matrix and translation parameters in the laser point cloud resolving model and calculation of the external orientation elements of the optical image;
the laser ranging data error correction module is used for carrying out laser ranging data error correction processing on the ranging value;
the optical image data processing module is used for carrying out data calculation by utilizing the optical image data and the corresponding external orientation elements and the laser point cloud data;
the PL end, which is connected with the PS end through an AXI bus and realizes highly parallelized operations by using programmable logic, the PL end comprising:
two DMA controllers, namely a first DMA controller and a second DMA controller, used for high-speed mutual transmission of data between the PS end and the PL end;
the geodetic coordinate calculation module is connected with the first DMA controller and is used for calculating the geodetic coordinate of each laser point based on the array push-scan airborne laser radar three-dimensional point cloud calculation model;
the Gaussian projection module, one end of which is connected with the geodetic coordinate resolving module, and which is used for solving the Gaussian plane rectangular coordinates of the laser points and transmitting them to the PS-end memory through the first DMA (Direct Memory Access) controller;
and the collinear equation calculating module is connected with the second DMA controller and is used for realizing the ground coordinate calculation of the ground point corresponding to the optical image point.
2. The active passive three-dimensional imaging real-time processing system according to claim 1, the less-parallel operations comprising: data preprocessing, laser ranging data error correction and elevation interpolation.
3. The active passive three-dimensional imaging real-time processing system according to claim 1, the highly parallelized operations comprising: geodetic coordinate calculation, Gaussian projection and collinear equation calculation.
4. The active passive three dimensional imaging real time processing system as claimed in claim 1, said laser range data error correction process comprising: correcting system errors and filtering out gross error points.
5. The active passive three-dimensional imaging real-time processing system according to claim 1, the collinearity equation solution module comprising: the device comprises a controller, an arithmetic unit, an input data cache BRAM, an output data cache BRAM and a parameter cache register.
6. An active and passive three-dimensional imaging real-time processing method, performed on the active and passive three-dimensional imaging real-time processing system of any one of claims 1 to 5, the method comprising the following steps:
step S1: accurately registering the observed load original data through a time code to complete the matching of the multi-load data in a time dimension; the observation load raw data comprises: optical image data, laser ranging data, position and attitude data;
step S2: respectively calculating, by using the position attitude data, the rotation matrix and translation parameters for coordinate conversion from the laser sensor coordinate system to the geodetic coordinate system, and the external orientation elements at the imaging time of the optical linear-array image;
step S3: completing system error correction of the laser ranging data by using the calibration parameters and performing rapid gross error correction;
step S4: solving the geodetic coordinates of each laser point based on an array push-broom type airborne laser radar three-dimensional point cloud resolving model by using laser ranging data, a rotation matrix of coordinate conversion and translation parameters;
step S5: solving the Gaussian plane rectangular coordinate of each laser point;
step S6: calculating the elevation value of the image point according to the corresponding relation between the image point of the optical camera and the laser point in the optical image data and by combining the laser point cloud result;
step S7: according to the coordinates and corresponding elevations of the image points of the optical images and the elements of the inner direction and the outer direction of the optical images, solving the ground coordinates of the ground points corresponding to each image point according to a collinear equation; and
step S8: and giving a gray value to each image point of the optical image to obtain a three-dimensional image, thereby completing the real-time processing of the active and passive three-dimensional imaging.
7. The active-passive three-dimensional imaging real-time processing method according to claim 6, wherein in step S3, the gross error points in the range values are filtered according to the distance threshold, and interpolation is used to supplement the missing points, so as to achieve fast gross error correction.
8. The active and passive three-dimensional imaging real-time processing method according to claim 6, wherein in step S5, the laser point cloud result after Gaussian projection is obtained.
9. The active-passive three-dimensional imaging real-time processing method according to claim 6, wherein in step S6, the elevation value of the image point is calculated by linear interpolation method according to the corresponding relationship between the image point of the optical camera in the optical image data and the laser point in the laser ranging data and by combining the laser point cloud result.
10. The active passive three-dimensional imaging real-time processing method according to claim 6, the step S7 comprising:
substep S71: implementing, by the controller, message communication with a second DMA controller; enabling control of the arithmetic unit is realized; the read-write control of an input data cache BRAM, an output data cache BRAM and a parameter cache register is realized;
substep S72: receiving elevation data corresponding to the optical image point and external orientation elements of the image through a second DMA controller, and caching by utilizing an input data cache BRAM;
substep S73: reading the external orientation element data of the image from an input data cache BRAM, and writing the external orientation element data into a parameter cache register for caching;
substep S74: by means of the operation unit and with the help of the FPGA, the XY coordinates of the image points are solved according to a collinear condition equation by utilizing the coordinates and the elevation values of the image points and the elements of the external orientation at the image imaging time and combining the elements of the internal orientation, and the calculation formula is as follows:
\[
X_A = X_S + (Z_A - Z_S)\,\frac{a_1 x + a_2 y - a_3 f}{c_1 x + c_2 y - c_3 f},
\qquad
Y_A = Y_S + (Z_A - Z_S)\,\frac{b_1 x + b_2 y - b_3 f}{c_1 x + c_2 y - c_3 f}
\]
wherein (X_A, Y_A, Z_A) are the object-space three-dimensional coordinates corresponding to the image point to be solved; (X_S, Y_S, Z_S) is the position of the optical center in the object-space coordinate system at the imaging time; a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3 are the elements of the rotation matrix of the optical center relative to the object-space coordinate system at the imaging time, generated from the three angular elements of external orientation; x and y are the image coordinates of the image point; f is the principal distance of the image; and
substep S75: and enabling the output data cache BRAM to cache the ground coordinate data of the ground point corresponding to the optical image, and finally sending the ground coordinate data to the PS end through the second DMA controller.
CN202010471907.1A 2020-05-28 2020-05-28 Active and passive three-dimensional imaging real-time processing system and method Pending CN111476705A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010471907.1A CN111476705A (en) 2020-05-28 2020-05-28 Active and passive three-dimensional imaging real-time processing system and method


Publications (1)

Publication Number Publication Date
CN111476705A true CN111476705A (en) 2020-07-31

Family

ID=71765093

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010471907.1A Pending CN111476705A (en) 2020-05-28 2020-05-28 Active and passive three-dimensional imaging real-time processing system and method

Country Status (1)

Country Link
CN (1) CN111476705A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112051139A (en) * 2020-09-09 2020-12-08 中山大学 Segment joint shear rigidity measuring method, system, equipment and storage medium
CN114275160A (en) * 2021-12-28 2022-04-05 中国科学院空天信息创新研究院 Aviation platform with multi-dimensional information detection capability and cooperative operation method
CN114275160B (en) * 2021-12-28 2022-08-23 中国科学院空天信息创新研究院 Aviation platform with multi-dimensional information detection capability and cooperative operation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination