CN116952391A - Non-uniformity correction method and system for unmanned aerial vehicle acquired image - Google Patents


Info

Publication number
CN116952391A
Authority
CN
China
Prior art keywords: image, correction, area, temperature, calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311158339.XA
Other languages
Chinese (zh)
Inventor
郝树奇
叶成海
任航
高文文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shaanxi Dexin Intelligent Technology Co ltd
Original Assignee
Shaanxi Dexin Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shaanxi Dexin Intelligent Technology Co ltd filed Critical Shaanxi Dexin Intelligent Technology Co ltd
Priority to CN202311158339.XA
Publication of CN116952391A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J 5/90 Testing, inspecting or checking operation of radiation pyrometers
    • G01J 5/48 Thermography; Techniques using wholly visual means
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4053 Scaling of whole images or parts thereof based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • G06T 3/4076 Scaling of whole images or parts thereof based on super-resolution, using the original low-resolution images to iteratively correct the high-resolution images
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10048 Infrared image
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

A non-uniformity correction method and system for unmanned aerial vehicle acquired images, and an electronic device, relating to the field of unmanned aerial vehicle image acquisition. The method is applied to an optoelectronic pod device and comprises: acquiring a plurality of correction areas in an image to be processed, wherein the plurality of correction areas comprise a calibration area and a non-calibration area; performing non-uniformity correction on the plurality of correction areas based on a preset non-uniformity correction coefficient table to generate a first image and a second image, the first image being the image corrected for the calibration area and the second image being the image corrected for the non-calibration area; and fusing the first image and the second image to generate a high-resolution infrared image. The correction coefficient of the non-calibration area is obtained according to the correction coefficient of the calibration area, thereby enabling correction of infrared images under more temperature conditions and improving the imaging effect of the infrared image.

Description

Non-uniformity correction method and system for unmanned aerial vehicle acquired image
Technical Field
The application relates to the technical field of unmanned aerial vehicle image acquisition, in particular to a non-uniformity correction method and system for unmanned aerial vehicle acquired images.
Background
An unmanned aerial vehicle optoelectronic pod provides both visible-light imaging and infrared imaging. Infrared imaging is a technology with functions such as temperature measurement and night vision; its basic principle is to detect and identify a target by using the infrared radiation characteristic image formed by the temperature difference or radiation difference between the target and its background, or between different parts of the target.
At present, the imaging effect is generally more accurate after the infrared radiation characteristic image has been calibrated and corrected. The main correction method is non-uniformity correction based on reference radiation sources. Its basic principle is to introduce several reference radiation sources at different temperatures to calibrate shooting environments at different temperatures; the infrared camera then photographs and measures the reference radiation sources to obtain measurement results. Finally, the measurement results are compared with the true emissivity to obtain the non-uniformity correction coefficients of the infrared camera, and these coefficients are applied to the scene image to correct for non-uniformities.
However, the temperature of an actual shooting environment is not controllable. When the ambient temperature is not within the temperature range pre-calibrated by the infrared camera, non-uniformity deviations arise in the infrared image, the imaging effect of the infrared image deteriorates, and the execution of tasks such as unmanned aerial vehicle inspection and reconnaissance is affected.
Therefore, a method and a system for correcting the non-uniformity of the image acquired by the unmanned aerial vehicle are needed.
Disclosure of Invention
The application provides a non-uniformity correction method and system for images acquired by an unmanned aerial vehicle, which obtain the correction coefficient of a non-calibration area according to the correction coefficient of the calibration area, so that infrared images can be corrected under more temperature conditions and the imaging effect of the infrared image is improved.
In a first aspect, the present application provides a method for correcting non-uniformity of an image acquired by an unmanned aerial vehicle, applied to an optoelectronic pod device, the method comprising: acquiring a plurality of correction areas in an image to be processed, wherein the plurality of correction areas comprise a calibration area and a non-calibration area, the calibration area is an image area with the ambient temperature within a temperature range pre-calibrated by the optoelectronic pod equipment, and the non-calibration area is an image area with the ambient temperature not within the temperature range pre-calibrated by the optoelectronic pod equipment; based on a preset non-uniformity correction coefficient table, non-uniformity correction is carried out on a plurality of correction areas, and a first image and a second image are generated; the first image is the corrected image of the calibration area, the second image is the corrected image of the non-calibration area, and the preset non-uniform correction coefficient table comprises the corresponding relation between temperature and non-uniform correction coefficients; and fusing the first image and the second image to generate a high-resolution infrared image.
By adopting the above technical scheme, the optoelectronic pod device divides the image to be processed into a calibration area and a non-calibration area. For the calibration area, the optoelectronic pod device obtains the correction coefficient of the calibration area by querying the preset non-uniformity correction coefficient table, applies the correction coefficient to the calibration area and generates the first image. For the non-calibration area, the optoelectronic pod device queries the preset non-uniformity correction coefficient table for the correction coefficients of the temperatures closest to that of the non-calibration area, corrects the non-calibration area according to those coefficients and generates the second image. Finally, the first image and the second image are fused into one image. In this way, when the ambient temperature captured in the image to be processed is not within the pre-calibrated temperature range, a predicted correction coefficient can still be obtained from the preset non-uniformity correction coefficient table, so that the quality and resolution of the infrared image are improved.
In a second aspect, the present application provides a non-uniformity correction system for an image acquired by an unmanned aerial vehicle, the system being an optoelectronic pod device comprising an acquisition module and a processing module, wherein:
the acquisition module is used for acquiring a plurality of correction areas in an image to be processed, wherein the plurality of correction areas comprise a calibration area and a non-calibration area, the calibration area is an image area with the ambient temperature in a temperature range calibrated in advance by the optoelectronic pod equipment, and the non-calibration area is an image area with the ambient temperature not in the temperature range calibrated in advance by the optoelectronic pod equipment;
The processing module is used for carrying out non-uniformity correction on a plurality of correction areas based on a preset non-uniformity correction coefficient table to generate a first image and a second image; the first image is the corrected image of the calibration area, the second image is the corrected image of the non-calibration area, and the preset non-uniform correction coefficient table comprises the corresponding relation between temperature and non-uniform correction coefficients; and fusing the first image and the second image to generate a high-resolution infrared image.
By adopting the above technical scheme, the acquisition module divides the image to be processed into a calibration area and a non-calibration area. For the calibration area, the processing module obtains the correction coefficient of the calibration area by querying the preset non-uniformity correction coefficient table, applies the correction coefficient to the calibration area and generates the first image. For the non-calibration area, the processing module queries the preset non-uniformity correction coefficient table for the correction coefficients of the temperatures closest to that of the non-calibration area, corrects the non-calibration area according to those coefficients and generates the second image. Finally, the processing module fuses the first image and the second image into one image. In this way, when the ambient temperature captured in the image to be processed is not within the pre-calibrated temperature range, a predicted correction coefficient can still be obtained from the preset non-uniformity correction coefficient table, so that the quality and resolution of the infrared image are improved.
Optionally, the acquisition module is configured to acquire a first correction area and a first temperature corresponding to the first correction area, where the first correction area is any one of the calibration areas; the processing module is used for matching the first temperature with the preset non-uniformity correction coefficient table to obtain a first correction coefficient; and correcting the first correction area by adopting the first correction coefficient to generate the first image.
By adopting the above technical scheme, the temperature of the calibration area is matched with the preset non-uniformity correction coefficient table to obtain an accurate correction coefficient, so that the processing module corrects the calibration area more accurately and the quality and resolution of the infrared image are improved.
Optionally, the acquisition module is configured to acquire a second correction area and a second temperature corresponding to the second correction area, where the second correction area is any one of the plurality of non-calibration areas;
the processing module is used for matching the second temperature with the preset non-uniform correction coefficient table to obtain adjacent correction data corresponding to the second temperature, wherein the adjacent correction data comprises two adjacent temperatures of the second temperature and correction coefficients corresponding to the two adjacent temperatures; according to the adjacent correction data, a second correction coefficient is calculated by adopting a secondary difference algorithm; and correcting the second correction area by adopting the second correction coefficient to generate a second image.
By adopting the above technical scheme, the temperature of the non-calibration area is matched with the preset non-uniformity correction coefficient table to obtain the two temperatures closest to it and the correction coefficients corresponding to those two temperatures; the processing module predicts the correction coefficient of the non-calibration area through a secondary difference algorithm and corrects the non-calibration area according to the predicted correction coefficient, which improves the accuracy with which the processing module corrects the non-calibration area and further improves the quality and resolution of the infrared image.
Optionally, the obtaining module is configured to perform feature point marking on the image to be processed to obtain a plurality of image feature points;
the processing module is used for marking a plurality of calibration areas in the first image based on a plurality of image feature points to obtain a first marking result; marking a plurality of non-calibration areas in the second image to obtain a second marking result, wherein the first marking result comprises the positions of the marking points corresponding to the plurality of calibration areas in the first image, and the second marking result comprises the positions of the marking points corresponding to the plurality of non-calibration areas in the second image; and based on the first marking result and the second marking result, splicing the plurality of calibration areas with the plurality of non-calibration areas to generate the high-resolution infrared image.
By adopting the above technical scheme, the position of each correction area is determined by marking feature points on the image to be processed; the processing module then marks the same feature points on the first image and the second image, which ensures that no splicing errors occur when the calibration areas and the non-calibration areas are stitched and reduces image deformity and distortion; the information of the calibration areas is also fully utilized, so that the quality and resolution of the whole image are improved.
Optionally, the acquiring module is configured to segment the image to be processed by using an image segmentation algorithm; acquiring a temperature value corresponding to each pixel point in a third correction area, wherein the third correction area is any one of a plurality of correction areas after being segmented in the image to be processed;
the processing module is used for determining the temperature proportion of the third correction area based on the temperature value corresponding to each pixel point in the third correction area, wherein the temperature proportion is the ratio of the number of the pixel points with the temperature value within a preset calibration temperature range to the number of the pixel points in the third correction area; and determining the calibration area and the non-calibration area based on the temperature proportion.
By adopting the above technical scheme, since the optoelectronic pod device has a certain measurement error, pixel points whose temperature is not within the pre-calibrated temperature range may still exist in a calibration area; the processing module determines the calibration areas and the non-calibration areas by counting the ratio of the number of pixels whose temperature is within the preset calibration range to the total number of pixels in the correction area, so that the subsequent processing module handles the calibration areas and the non-calibration areas with the corresponding processing modes, and the quality and resolution of the infrared image are improved.
Optionally, when the temperature proportion is greater than or equal to a preset proportion threshold, the processing module determines that the third correction area is a calibration area; and when the temperature ratio is smaller than a preset ratio threshold, the processing module determines that the third correction area is a non-calibration area.
By adopting the above technical scheme, setting the preset proportion threshold makes it convenient to determine the calibration areas and the non-calibration areas and reduces the influence of measurement errors of the optoelectronic pod device.
Optionally, the acquiring module is configured to acquire image information of the image to be processed, where the image information includes an image size and a shooting time;
The processing module is used for determining the segmentation size of the image to be processed based on the image information; and dividing the image to be processed into a plurality of correction areas according to the dividing size.
By adopting the above technical scheme, because shooting conditions differ, the segmentation of the image to be processed must still satisfy the image quality and resolution requirements; the acquisition module determines the most suitable segmentation size according to the image information of the image to be processed, and the processing module then segments the image accordingly, thereby ensuring that the image quality meets the requirements.
In a third aspect, the present application provides an electronic device comprising a processor, a memory for storing instructions, a user interface and a network interface for communicating to other devices, the processor for executing the instructions stored in the memory to cause the electronic device to perform the method of any of the first aspects.
In a fourth aspect, the present application provides a computer readable storage medium storing instructions which, when executed, perform the method of any one of the first aspects.
In summary, one or more technical solutions provided in the embodiments of the present application at least have the following technical effects or advantages:
1. The optoelectronic pod device divides the image to be processed into a calibration area and a non-calibration area. For the calibration area, the optoelectronic pod device obtains the correction coefficient of the calibration area by querying the preset non-uniformity correction coefficient table, applies the correction coefficient to the calibration area and generates the first image. For the non-calibration area, the optoelectronic pod device queries the preset non-uniformity correction coefficient table for the correction coefficients of the temperatures closest to that of the non-calibration area, corrects the non-calibration area according to those coefficients and generates the second image. Finally, the first image and the second image are fused into one image. In this way, when the ambient temperature captured in the image to be processed is not within the pre-calibrated temperature range, a predicted correction coefficient can still be obtained from the preset non-uniformity correction coefficient table, so that the quality and resolution of the infrared image are improved.
2. The temperature of the non-calibration area is matched with the preset non-uniformity correction coefficient table to obtain the two temperatures closest to it and the correction coefficients corresponding to those two temperatures; the processing module predicts the correction coefficient of the non-calibration area through a secondary difference algorithm and corrects the non-calibration area according to the predicted correction coefficient, which improves the accuracy with which the processing module corrects the non-calibration area and further improves the quality and resolution of the infrared image.
Drawings
Fig. 1 is a schematic flow chart of a non-uniformity correction method for an image acquired by an unmanned aerial vehicle according to an embodiment of the present application.
Fig. 2 is a schematic diagram of a calibration area non-uniformity correction method according to an embodiment of the present application.
FIG. 3 is a schematic diagram of a non-calibration area non-uniformity correction method according to an embodiment of the present application.
Fig. 4 is a schematic diagram of image fusion according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a non-uniformity correction system for an image acquired by an unmanned aerial vehicle according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Reference numerals illustrate: 1. an acquisition module; 2. a processing module; 600. an electronic device; 601. a processor; 602. a communication bus; 603. a user interface; 604. a network interface; 605. a memory.
Description of the embodiments
In order that those skilled in the art will better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments.
In describing embodiments of the present application, words such as "such as" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "such as" or "for example" in the embodiments of the application should not be construed as preferred or more advantageous than other embodiments or designs. Rather, the use of words such as "such as" or "for example" is intended to present related concepts in a concrete fashion.
In the description of embodiments of the application, the term "plurality" means two or more. For example, a plurality of systems means two or more systems, and a plurality of screen terminals means two or more screen terminals. Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating an indicated technical feature. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
With the development of infrared imaging technology, the infrared imaging device has the functions of temperature measurement, night vision and the like, and is often used in the safety fields of target detection, identification, tracking and the like, or the environmental protection fields of atmosphere pollution, water body temperature, forest fire and the like. The basic principle is to find and identify the target by using an infrared radiation characteristic image formed by the background of the target or the temperature difference or the radiation difference between the parts of the target.
At present, the imaging effect is generally more accurate after the infrared radiation characteristic image has been calibrated and corrected. The main correction method is non-uniformity correction based on reference radiation sources, which combines multi-temperature-point calibration with a nonlinear fitting correction method to realize non-uniformity correction of the infrared image. The basic principle is to first introduce several reference radiation sources at different temperatures to calibrate shooting environments at different temperatures. A reference radiation source can be understood as an object or device with known radiation characteristics, such as ceramics, aluminum, or a natural radiation source; the reference radiation sources are usually selected according to the specific application scenario to obtain a better image correction effect. For example, when an infrared image of a forest fire is captured, natural radiation sources such as the sun or the ground need to be selected as the reference radiation sources. The infrared camera then photographs and measures the reference radiation sources to obtain measurement results. Finally, the measurement results are compared with the true emissivity to obtain the non-uniformity correction coefficients of the infrared camera, and these coefficients are applied to the scene image to correct for non-uniformities in the infrared image.
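For readers who want a concrete picture of this reference-source approach, the following Python sketch shows one generic way to estimate and apply per-pixel correction coefficients from frames of known reference sources. It is only an illustration under assumed data shapes; the function names and the least-squares gain/offset model are assumptions and are not the calibration procedure claimed by this application.

```python
import numpy as np

def reference_nuc_coefficients(frames_per_source, known_radiance):
    # frames_per_source: list of HxW averaged raw frames, one per reference source
    # known_radiance: list of the known (true) radiance values of those sources
    y = np.stack(frames_per_source).astype(np.float64)      # (N, H, W) responses
    x = np.asarray(known_radiance, dtype=np.float64)        # (N,) true values
    x_mean, y_mean = x.mean(), y.mean(axis=0)
    # Per-pixel least-squares fit of response = gain * radiance + offset
    gain = ((x - x_mean)[:, None, None] * (y - y_mean)).sum(axis=0) / ((x - x_mean) ** 2).sum()
    offset = y_mean - gain * x_mean
    return gain, offset

def apply_reference_nuc(raw_frame, gain, offset):
    # Map each pixel's raw response back onto the common radiance scale
    safe_gain = np.where(gain == 0, 1.0, gain)
    return (raw_frame - offset) / safe_gain
```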
However, in an actual shooting environment the ambient temperature changes from moment to moment. If the ambient temperature is not within the temperature range pre-calibrated for the infrared camera, then when the infrared image is corrected the system has not stored a non-uniformity correction coefficient corresponding to the current ambient temperature; as a result, the infrared image cannot be corrected, or the corrected image is distorted, the imaging effect of the infrared image is poor, and the execution of tasks such as unmanned aerial vehicle inspection and reconnaissance is affected.
In order to solve this problem, the application provides a non-uniformity correction method for images acquired by an unmanned aerial vehicle, applied to an optoelectronic pod device comprising a visible-light imaging device and an infrared imaging device for capturing infrared images. The method comprises steps S101 to S103, as shown in fig. 1.
S101, acquiring a plurality of correction areas in an image to be processed, wherein the plurality of correction areas comprise a calibration area and a non-calibration area, the calibration area is an image area with the ambient temperature within a temperature range pre-calibrated by the optoelectronic pod device, and the non-calibration area is an image area with the ambient temperature not within the temperature range pre-calibrated by the optoelectronic pod device.
In the above step, after the infrared camera captures an infrared image of the target position, it sends the image to be processed to the central processing circuit of the optoelectronic pod device. The central processing circuit then segments the image to be processed into a plurality of correction areas using a segmentation algorithm, which may be an edge-detection algorithm or a graph-theory-based segmentation algorithm. When the optoelectronic pod device of the present application segments the image to be processed, it first needs to acquire the image information of the image to be processed, including the image size, the shooting time and the shooting precision. The image size can be understood as the dimensions of the image, for example 1024×1024; the shooting time can be understood as the time the infrared camera takes to capture a frame; the shooting precision can be understood as the number of pixels in the image, where for the same image size, more pixels means higher shooting precision. The image to be processed is then divided into a plurality of correction areas according to this image information, which ensures that the quality and resolution of infrared images captured under different shooting conditions meet the requirements. For example, if the unmanned aerial vehicle carries the optoelectronic pod device to capture infrared pictures of the ground, the infrared camera is moving at high speed and the frame capture time is very short; to save computation time in the correction process, the area of each correction area is increased, which saves computing power and allows more images to be processed. If the infrared camera is stationary and is recording the temperature change of an object, the frame capture time is longer; the area of each correction area is then reduced and more system computing power is allocated to the image correction processing, so as to obtain an image with higher quality and resolution. The segmentation size of the image to be processed therefore depends on the actual situation. It should be explained that, because the optoelectronic pod device has measurement errors, the temperature value corresponding to some pixel point in the image to be processed may be abnormal, so each correction area contains at least 2 pixel points. In addition, since the computing capability of the optoelectronic pod device is limited, making the correction areas too large easily leads to redundant data stored in the optoelectronic pod device and a poor infrared imaging effect; the number of pixels in a correction area should therefore not be too large, and the upper limit on the number of pixels per correction area is determined by the performance of the infrared processing chip.
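As a rough illustration of the segmentation trade-off described above, the sketch below chooses a larger correction-area size for short frame-capture times (fast-moving pod) and a smaller one for long capture times (static pod), then splits the image into those areas. The threshold and tile sizes are illustrative assumptions, not values from this application.

```python
import numpy as np

def choose_area_size(capture_time_s, fast_moving_size=128, static_size=32):
    # Short capture time -> larger correction areas (save computation);
    # long capture time -> smaller areas (finer correction). Values are assumed.
    return fast_moving_size if capture_time_s < 0.1 else static_size

def split_into_correction_areas(image, area_size):
    # Split an HxW image into a grid of correction areas (edge areas may be smaller)
    h, w = image.shape[:2]
    return [((r, c), image[r:r + area_size, c:c + area_size])
            for r in range(0, h, area_size)
            for c in range(0, w, area_size)]

frame = np.zeros((480, 640), dtype=np.float32)
areas = split_into_correction_areas(frame, choose_area_size(capture_time_s=0.05))
print(len(areas))   # 4 x 5 = 20 areas of 128 x 128
```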
In one possible implementation, when the image to be processed includes an area whose ambient temperature is not within the temperature range pre-calibrated for the infrared camera, the plurality of correction areas need to be divided into calibration areas and non-calibration areas. A calibration area is understood to be an image area whose ambient temperature is within the temperature range for which the optoelectronic pod device is pre-calibrated, and a non-calibration area is an image area whose ambient temperature is not within that range. The optoelectronic pod device acquires the temperature values corresponding to all the pixel points in any correction area and then calculates the ratio of the number of pixel points whose temperature value is within the preset calibration temperature range to the total number of pixel points in the correction area, obtaining the temperature proportion of the correction area. When the temperature proportion of the correction area is greater than or equal to a preset proportion threshold, the correction area is determined to be a calibration area; when it is smaller than the preset proportion threshold, the correction area is determined to be a non-calibration area. The preset proportion threshold is preferably 0.5.
For example, if a correction area contains 800×600 pixels, the temperature value corresponding to each pixel is obtained by measurement with the infrared camera. If 300,000 of these pixel points fall within the preset temperature range, the temperature proportion of the correction area is 0.625. According to the preset proportion threshold of 0.5, the correction area is determined to be a calibration area; otherwise it would be considered a non-calibration area.
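A minimal sketch of this classification step, assuming a per-area temperature map and the 0.5 threshold preferred above; the numbers reproduce the 800×600 example, where 300,000 in-range pixels give a temperature proportion of 0.625. The calibrated range [20, 28] degrees Celsius is an assumption for illustration only.

```python
import numpy as np

def classify_correction_area(temp_map, t_low, t_high, ratio_threshold=0.5):
    # Fraction of pixels whose temperature lies inside the pre-calibrated range
    in_range = (temp_map >= t_low) & (temp_map <= t_high)
    ratio = float(in_range.mean())
    label = "calibration" if ratio >= ratio_threshold else "non-calibration"
    return label, ratio

temps = np.full((600, 800), 30.0)          # 480,000 pixels, all out of range
temps.ravel()[:300_000] = 25.0             # 300,000 pixels inside the calibrated range
print(classify_correction_area(temps, 20.0, 28.0))   # ('calibration', 0.625)
```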
S102, carrying out non-uniformity correction on a plurality of correction areas based on a preset non-uniformity correction coefficient table to generate a first image and a second image; the first image is an image corrected by the calibration area, the second image is an image corrected by the non-calibration area, and the preset non-uniform correction coefficient table comprises the corresponding relation between temperature and non-uniform correction coefficients.
Specifically, before non-uniformity correction is performed on the plurality of correction areas, a non-uniformity correction coefficient table needs to be constructed in advance. The construction is as follows. First, an area within the imaging field of view of the camera is determined as the construction area; this area should contain all the object types and temperature ranges that may occur during infrared camera imaging. After the construction area is determined, a set of images covering the entire temperature range is captured within the area; the temperature may be increased or decreased stepwise, or the images may cover a random temperature distribution. The captured images are input into infrared image processing software for operations such as background removal and noise filtering to improve image quality and accuracy. The temperature value of each pixel point is then collected and analysed statistically to obtain the average temperature value of the area at each temperature. Comparing the average temperature value at each temperature with the theoretical temperature value at that temperature gives the correction coefficient at that temperature, which is a fraction between 0 and 1 used to correct possible non-uniformities at that temperature. The obtained correction coefficients are arranged from low to high temperature to construct the non-uniformity correction coefficient table, which then contains the correspondence between temperatures and non-uniformity correction coefficients.
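The sketch below mirrors this table-building procedure under stated assumptions: one averaged temperature map per calibration temperature, and a coefficient formed as the ratio of the smaller to the larger of the theoretical and measured mean temperatures. The text only says the two values are compared to give a coefficient between 0 and 1, so this exact formula is an assumption, as are the names used.

```python
import numpy as np

def build_correction_table(frames_by_temperature):
    # frames_by_temperature: {theoretical_temperature: HxW measured temperature map}
    table = {}
    for t_theoretical, frame in frames_by_temperature.items():
        t_mean = float(np.mean(frame))
        # Assumed comparison rule: a value in (0, 1], closer to 1 when measurement matches theory
        coeff = min(t_theoretical, t_mean) / max(t_theoretical, t_mean)
        table[float(t_theoretical)] = coeff
    # Arrange entries from low to high temperature
    return dict(sorted(table.items()))

table = build_correction_table({24.0: np.full((4, 4), 24.5), 26.0: np.full((4, 4), 25.0)})
print(table)   # {24.0: ~0.98, 26.0: ~0.96}
```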
In one possible implementation manner, when correcting the calibration area, firstly, acquiring the temperature of a first correction area, wherein the first correction area is any one of a plurality of calibration areas, and the temperature of the first correction area can be understood as the average temperature of a plurality of pixel points in the area; then, matching the temperature of the first correction area with a preset non-uniformity correction coefficient table, so as to obtain a correction coefficient corresponding to the first correction area; and correcting the first correction region according to the correction coefficient of the first correction region, namely multiplying the value of each pixel point in the first correction region by the correction coefficient to eliminate errors and deviations caused by non-uniformity, and finally generating a first image.
For example, as shown in fig. 2, diagram A shows a portion of the preset non-uniformity correction coefficient table, containing 10 temperatures and the correction coefficients corresponding to those 10 temperatures. Diagram B is the image to be processed, in which region No. 1 and region No. 2 are calibration regions; region No. 1 contains 4 pixel points with a corresponding temperature of 24 ℃, and region No. 2 contains 4 pixel points with a corresponding temperature of 26 ℃. The optoelectronic pod device queries the preset non-uniformity correction coefficient table for the correction coefficients corresponding to region No. 1 and region No. 2: region No. 1 corresponds to 0.65 and region No. 2 to 0.55. Each pixel in region No. 1 is then multiplied by 0.65 and each pixel in region No. 2 by 0.55, finally generating the corrected image, i.e. diagram C.
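For the calibration areas, the correction is a table lookup followed by a per-pixel multiplication. The sketch below reuses the coefficients of the example above (24 ℃ gives 0.65, 26 ℃ gives 0.55); the array contents and function names are hypothetical.

```python
import numpy as np

# Portion of a preset non-uniformity correction coefficient table (from the example above)
COEFF_TABLE = {24.0: 0.65, 26.0: 0.55}

def correct_calibration_area(area, area_temperature, table=COEFF_TABLE):
    # Calibration areas match a table temperature exactly, so look up and scale
    coefficient = table[area_temperature]
    return area.astype(np.float64) * coefficient

region_1 = np.array([[100.0, 102.0], [101.0, 99.0]])   # 4 pixels at 24 degrees Celsius
print(correct_calibration_area(region_1, 24.0))         # every value multiplied by 0.65
```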
In one possible embodiment, when the non-calibration area is corrected, the temperature of a second correction area is first acquired, the second correction area being any one of the plurality of non-calibration areas. The temperature of the second correction area is then matched against the preset non-uniformity correction coefficient table to obtain the two temperatures adjacent to it, and the correction coefficients corresponding to those two temperatures are queried. The two adjacent temperatures and their corresponding correction coefficients are then used to calculate the correction coefficient of the second correction area through a secondary difference algorithm. The secondary difference algorithm is specifically as follows: the two temperatures adjacent to the temperature of the second correction area, namely a first temperature and a second temperature, are acquired; the distances between the temperature of the second correction area and the first and second temperatures, namely a first distance and a second distance, are calculated; the weights of the correction coefficients corresponding to the first temperature and the second temperature are assigned based on the first distance and the second distance; and finally the correction coefficients corresponding to the first and second temperatures are combined with their weights to obtain the correction coefficient of the second correction area. The calculation formula is as follows:
S = a × S1 + b × S2
wherein S is the correction coefficient of the second correction area, a is the weight of the correction coefficient corresponding to the first temperature, b is the weight of the correction coefficient corresponding to the second temperature, S1 is the correction coefficient corresponding to the first temperature, and S2 is the correction coefficient corresponding to the second temperature. Finally, the value of each pixel point in the second correction area is multiplied by this correction coefficient to generate the second image.
For example, as shown in fig. 3, diagram A shows a portion of the preset non-uniformity correction coefficient table, containing 10 temperatures and the correction coefficients corresponding to those 10 temperatures. Diagram B is the image to be processed, in which region No. 3 is a non-calibration region containing 4 pixel points with a corresponding temperature of 25.5 ℃. The two temperatures adjacent to the temperature of region No. 3 are 24 ℃ and 26 ℃, so the first distance is 1.5 and the second distance is 0.5; the correction coefficient weight for the first temperature is therefore 0.75 and that for the second temperature is 0.25. From the preset non-uniformity correction coefficient table, the correction coefficient corresponding to the first temperature is 0.65 and that corresponding to the second temperature is 0.55. According to the secondary difference algorithm, the correction coefficient for region No. 3 is calculated as 0.625. Each pixel in region No. 3 is then multiplied by 0.625, finally generating the corrected image, i.e. diagram D.
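The secondary difference step can be written compactly. The sketch below assigns the weights exactly as in the worked example (each adjacent temperature is weighted by its distance from the area temperature divided by the total distance) and reproduces the 0.625 result for 25.5 ℃; function and variable names are assumptions, and the sketch assumes the temperature lies within the table's range.

```python
def secondary_difference_coefficient(area_temperature, table):
    temps = sorted(table)
    t1 = max(t for t in temps if t <= area_temperature)   # lower adjacent temperature
    t2 = min(t for t in temps if t >= area_temperature)   # upper adjacent temperature
    if t1 == t2:                                          # exact match, no interpolation needed
        return table[t1]
    d1, d2 = area_temperature - t1, t2 - area_temperature # first and second distances
    a, b = d1 / (d1 + d2), d2 / (d1 + d2)                 # weights as in the worked example
    return a * table[t1] + b * table[t2]                  # S = a*S1 + b*S2

table = {24.0: 0.65, 26.0: 0.55}
print(secondary_difference_coefficient(25.5, table))      # 0.75*0.65 + 0.25*0.55 = 0.625
```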
S103, fusing the first image and the second image to generate a high-resolution infrared image.
Specifically, firstly, marking characteristic points of an image to be processed to obtain a plurality of image characteristic points; wherein one image feature point corresponds to one correction area. Marking a plurality of calibration areas in a first image according to a plurality of image feature points in the image to be processed to obtain a first marking result; the positions of the plurality of calibration areas correspond to the positions of the calibration areas in the image to be processed; marking a plurality of non-calibrated areas in the second image to obtain a second marking result; the positions of the plurality of non-calibration areas correspond to the positions of the non-calibration areas in the image to be processed. And finally, splicing and fusing the plurality of calibration areas and the plurality of non-calibration areas according to the first marking result and the second marking result to generate a high-resolution infrared image.
For example, as shown in fig. 4, diagram A is the first image and diagram B is the second image; areas No. 1 and No. 2 in diagram A are calibration areas, and areas No. 3 and No. 4 in diagram B are non-calibration areas. Points a, b, c and d are marker points in the image to be processed; point a and point b correspond to point e and point f in the first image, and point c and point d correspond to point g and point h in the second image. After the 4 correction areas are corrected, areas No. 1 and No. 2 are extracted from the first image and areas No. 3 and No. 4 from the second image according to the corresponding points of the image to be processed, the first image and the second image. Finally, areas No. 1, No. 2, No. 3 and No. 4 are stitched into diagram C, i.e. the high-resolution infrared image.
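A minimal fusion sketch under assumed data structures: each corrected area keeps the top-left position it had in the image to be processed (recovered from the marker points), so the calibration areas taken from the first image and the non-calibration areas taken from the second image can be pasted back into one output image.

```python
import numpy as np

def fuse_corrected_areas(output_shape, placed_areas):
    # placed_areas: list of (row, col, corrected_area) tuples, positions taken from the
    # marker points of the image to be processed
    fused = np.zeros(output_shape, dtype=np.float64)
    for row, col, area in placed_areas:
        h, w = area.shape[:2]
        fused[row:row + h, col:col + w] = area
    return fused

# Four 2x2 corrected areas stitched into a 4x4 result
areas = [(0, 0, np.full((2, 2), 1.0)), (0, 2, np.full((2, 2), 2.0)),
         (2, 0, np.full((2, 2), 3.0)), (2, 2, np.full((2, 2), 4.0))]
print(fuse_corrected_areas((4, 4), areas))
```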
The application also provides a non-uniformity correction system for images acquired by an unmanned aerial vehicle. The system is an optoelectronic pod device which, as shown in fig. 5, comprises an acquisition module 1 and a processing module 2, wherein:
the acquisition module 1 is used for acquiring a plurality of correction areas in an image to be processed, wherein the plurality of correction areas comprise a calibration area and a non-calibration area, the calibration area is an image area with the ambient temperature in a temperature range pre-calibrated by the optoelectronic pod equipment, and the non-calibration area is an image area with the ambient temperature not in the temperature range pre-calibrated by the optoelectronic pod equipment;
a processing module 2, configured to perform non-uniformity correction on the plurality of correction areas based on a preset non-uniformity correction coefficient table, and generate a first image and a second image; the first image is an image corrected by a calibration area, the second image is an image corrected by a non-calibration area, and the preset non-uniform correction coefficient table comprises the corresponding relation between temperature and non-uniform correction coefficients; and fusing the first image and the second image to generate a high-resolution infrared image.
In one possible implementation manner, the acquisition module 1 is configured to acquire a first correction area and a first temperature corresponding to the first correction area, where the first correction area is any one of the plurality of calibration areas; the processing module 2 is used for matching the first temperature with the preset non-uniformity correction coefficient table to obtain a first correction coefficient; and correcting the first correction area by adopting the first correction coefficient to generate the first image.
In one possible implementation manner, the acquisition module 1 is configured to acquire a second correction area and a second temperature corresponding to the second correction area, where the second correction area is any one of the plurality of non-calibration areas;
the processing module 2 is configured to match the second temperature with a preset non-uniform correction coefficient table, so as to obtain adjacent correction data corresponding to the second temperature, where the adjacent correction data includes two temperatures adjacent to the second temperature and correction coefficients corresponding to the two adjacent temperatures; according to the adjacent correction data, a second correction coefficient is obtained by adopting a secondary difference algorithm; and correcting the second correction area by adopting the second correction coefficient to generate a second image.
In one possible implementation manner, the obtaining module 1 is configured to perform feature point marking on an image to be processed to obtain a plurality of image feature points;
the processing module 2 is used for marking a plurality of calibration areas in the first image based on a plurality of image feature points to obtain a first marking result; marking a plurality of non-calibration areas in the second image to obtain a second marking result, wherein the first marking result comprises the positions of the marking points corresponding to the plurality of calibration areas in the first image, and the second marking result comprises the positions of the marking points corresponding to the plurality of non-calibration areas in the second image; and based on the first marking result and the second marking result, splicing the plurality of calibration areas with the plurality of non-calibration areas to generate a high-resolution infrared image.
In a possible implementation manner, the acquiring module 1 is configured to segment an image to be processed by using an image segmentation algorithm; acquiring a temperature value corresponding to each pixel point in a third correction area, wherein the third correction area is any one of a plurality of correction areas after being divided in an image to be processed;
the processing module 2 is configured to determine a temperature proportion of the third correction area based on a temperature value corresponding to each pixel point in the third correction area, where the temperature proportion is a ratio of the number of the pixels with the temperature value within a preset calibration temperature range to the number of the pixels in the third correction area; and determining a calibration area and a non-calibration area based on the temperature proportion.
In one possible embodiment, when the temperature ratio is greater than or equal to the preset ratio threshold, the processing module 2 determines the third correction zone as the calibration zone; when the temperature ratio is smaller than the preset ratio threshold, the processing module 2 determines that the third correction area is a non-calibration area.
In one possible implementation manner, the acquiring module 1 is configured to acquire image information of an image to be processed, where the image information includes an image size and a shooting time;
the processing module 2 is used for determining the segmentation size of the image to be processed based on the image information; the image to be processed is divided into a plurality of correction areas according to the division size.
It should be noted that: in the device provided in the above embodiment, when implementing the functions thereof, only the division of the above functional modules is used as an example, in practical application, the above functional allocation may be implemented by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to implement all or part of the functions described above. In addition, the embodiments of the apparatus and the method provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the embodiments of the method are detailed in the method embodiments, which are not repeated herein.
The application further provides electronic equipment. Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device 600 may include: at least one processor 601, at least one network interface 604, a user interface 603, a memory 605, at least one communication bus 602.
Wherein the communication bus 602 is used to enable connected communications between these components.
The user interface 603 may include a Display screen (Display), a Camera (Camera), and the optional user interface 603 may further include a standard wired interface, a wireless interface.
The network interface 604 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
Wherein the processor 601 may include one or more processing cores. The processor 601 connects various portions of the overall server using various interfaces and lines, performs various functions of the server and processes data by executing or executing instructions, programs, code sets, or instruction sets stored in the memory 605, and invoking data stored in the memory 605. Alternatively, the processor 601 may be implemented in hardware in at least one of digital signal processing (Digital Signal Processing, DSP), field programmable gate array (Field-Programmable Gate Array, FPGA), programmable logic array (Programmable Logic Array, PLA). The processor 601 may integrate one or a combination of several of a central processing unit (Central Processing Unit, CPU), an image processor (Graphics Processing Unit, GPU), and a modem, etc. The CPU mainly processes an operating system, a user interface, an application program and the like; the GPU is used for rendering and drawing the content required to be displayed by the display screen; the modem is used to handle wireless communications. It will be appreciated that the modem may not be integrated into the processor 601 and may be implemented by a single chip.
The Memory 605 may include a random access Memory (Random Access Memory, RAM) or a Read-Only Memory (Read-Only Memory). Optionally, the memory 605 includes a non-transitory computer readable medium (non-transitory computer-readable storage medium). Memory 605 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 605 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, etc.; the storage data area may store data or the like involved in the above respective method embodiments. The memory 605 may also optionally be at least one storage device located remotely from the processor 601. Referring to fig. 6, an operating system, a network communication module, a user interface module, and an application program of a non-uniformity correction method for an image acquired by an unmanned aerial vehicle may be included in a memory 605 as a computer storage medium.
In the electronic device 600 shown in fig. 6, the user interface 603 is mainly used for providing an input interface for a user, and acquiring data input by the user; and the processor 601 may be configured to invoke an application in the memory 605 that stores a method for non-uniformity correction of images acquired by a drone, which when executed by the one or more processors 601, causes the electronic device 600 to perform the method as described in one or more of the embodiments above. It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all of the preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, such as a division of units, merely a division of logic functions, and there may be additional divisions in actual implementation, such as multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some service interface, device or unit indirect coupling or communication connection, electrical or otherwise.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable memory. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in whole or in part in the form of a software product stored in a memory, comprising several instructions for causing a computer device (which may be a personal computer, a server or a network device, etc.) to perform all or part of the steps of the method of the various embodiments of the present application. And the aforementioned memory includes: various media capable of storing program codes, such as a U disk, a mobile hard disk, a magnetic disk or an optical disk.
The foregoing is merely exemplary embodiments of the present disclosure and is not intended to limit the scope of the present disclosure. That is, equivalent changes and modifications are contemplated by the teachings of this disclosure, which fall within the scope of the present disclosure. Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure.
This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a scope and spirit of the disclosure being indicated by the claims.

Claims (10)

1. A method of non-uniformity correction of an image acquired by an unmanned aerial vehicle, applied to an optoelectronic pod apparatus, the method comprising:
acquiring a plurality of correction areas in an image to be processed, wherein the plurality of correction areas comprise a calibration area and a non-calibration area, the calibration area is an image area with the ambient temperature within a temperature range pre-calibrated by the optoelectronic pod equipment, and the non-calibration area is an image area with the ambient temperature not within the temperature range pre-calibrated by the optoelectronic pod equipment;
based on a preset non-uniformity correction coefficient table, performing non-uniformity correction on the plurality of correction areas to generate a first image and a second image, wherein the first image is the corrected image of the calibration area, the second image is the corrected image of the non-calibration area, and the preset non-uniformity correction coefficient table comprises the correspondence between temperature and non-uniformity correction coefficients;
and fusing the first image and the second image to generate a high-resolution infrared image.
2. The method according to claim 1, wherein the performing of non-uniformity correction on the plurality of correction areas based on the preset non-uniformity correction coefficient table to generate the first image is specifically:
acquiring a first correction area and a first temperature corresponding to the first correction area, wherein the first correction area is any one of the plurality of calibration areas;
matching the first temperature with the preset non-uniformity correction coefficient table to obtain a first correction coefficient;
and correcting the first correction area by using the first correction coefficient to generate the first image.
3. The method according to claim 1, wherein the performing of non-uniformity correction on the plurality of correction areas based on the preset non-uniformity correction coefficient table to generate the second image is specifically:
acquiring a second correction area and a second temperature corresponding to the second correction area, wherein the second correction area is any one of the plurality of non-calibration areas;
matching the second temperature with the preset non-uniformity correction coefficient table to obtain adjacent correction data corresponding to the second temperature, wherein the adjacent correction data comprises two temperatures adjacent to the second temperature and the correction coefficients corresponding to the two adjacent temperatures;
according to the adjacent correction data, calculating a second correction coefficient by using a secondary difference algorithm;
and correcting the second correction area by using the second correction coefficient to generate the second image.
4. The method according to claim 1, wherein the fusing of the first image and the second image to generate the high-resolution infrared image is specifically:
marking the feature points of the image to be processed to obtain a plurality of image feature points;
marking a plurality of calibration areas in the first image based on a plurality of image feature points to obtain a first marking result; marking a plurality of non-calibration areas in a second image to obtain a second marking result, wherein the first marking result comprises the positions of marking points corresponding to the plurality of calibration areas in the first image, and the second marking result comprises the positions of marking points corresponding to the plurality of non-calibration areas in the second image;
and based on the first marking result and the second marking result, splicing the plurality of calibration areas with the plurality of non-calibration areas to generate the high-resolution infrared image.
5. The method according to claim 1, wherein the acquiring of the plurality of correction areas in the image to be processed is specifically:
dividing the image to be processed by using an image segmentation algorithm;
acquiring a temperature value corresponding to each pixel point in a third correction area, wherein the third correction area is any one of the plurality of correction areas obtained by segmenting the image to be processed;
determining a temperature proportion of the third correction area based on temperature values corresponding to all pixel points in the third correction area, wherein the temperature proportion is a ratio of the number of the pixel points with the temperature values within a preset calibration temperature range to the number of the pixel points in the third correction area;
and determining the calibration area and the non-calibration area based on the temperature proportion.
6. The method according to claim 5, wherein the determining of the calibration area and the non-calibration area based on the temperature proportion is specifically:
when the temperature proportion is greater than or equal to a preset proportion threshold, determining the third correction area as a calibration area;
and when the temperature proportion is smaller than the preset proportion threshold, determining the third correction area as a non-calibration area.
7. The method according to claim 5, wherein the segmenting of the image to be processed by using the image segmentation algorithm is specifically:
acquiring image information of the image to be processed, wherein the image information comprises an image size and a shooting time;
determining the segmentation size of the image to be processed based on the image information;
and dividing the image to be processed into a plurality of correction areas according to the dividing size.
8. A non-uniformity correction system for an unmanned aerial vehicle acquired image, characterized in that the system is an optoelectronic pod device comprising an acquisition module (1) and a processing module (2), wherein:
the acquisition module (1) is used for acquiring a plurality of correction areas in an image to be processed, wherein the correction areas comprise a calibration area and a non-calibration area, the calibration area is an image area with the ambient temperature in a temperature range calibrated in advance by the optoelectronic pod equipment, and the non-calibration area is an image area with the ambient temperature not in the temperature range calibrated in advance by the optoelectronic pod equipment;
the processing module (2) is used for performing non-uniformity correction on the plurality of correction areas based on a preset non-uniformity correction coefficient table to generate a first image and a second image, wherein the first image is the corrected image of the calibration area, the second image is the corrected image of the non-calibration area, and the preset non-uniformity correction coefficient table comprises the correspondence between temperature and non-uniformity correction coefficients; and for fusing the first image and the second image to generate a high-resolution infrared image.
9. An electronic device comprising a processor (601), a memory (605), a user interface (603) and a network interface (604), the memory (605) being for storing instructions, the user interface (603) and the network interface (604) being for communicating with other devices, and the processor (601) being for executing the instructions stored in the memory (605) to cause the electronic device (600) to perform the method of any one of claims 1 to 7.
10. A computer readable storage medium storing instructions which, when executed, perform the method of any one of claims 1 to 7.
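To make the claimed procedure easier to follow, the Python sketch below walks through claims 1 to 7 under stated assumptions; it is an illustration, not the applicant's implementation. The block size, the ratio threshold, the layout of the coefficient table and every function name (split_into_blocks, classify_block, nearest_coefficients, interpolated_coefficients, correct_frame) are hypothetical; linear interpolation between the two tabulated temperatures adjacent to a block's mean temperature stands in for the "secondary difference" computation of claim 3, and the feature-point fusion of claim 4 is simplified to positional re-assembly of the corrected blocks.

import numpy as np


def split_into_blocks(image, block_size):
    """Divide the frame into square correction areas (claims 5 and 7)."""
    h, w = image.shape
    for y in range(0, h, block_size):
        for x in range(0, w, block_size):
            yield y, x, image[y:y + block_size, x:x + block_size]


def classify_block(temp_block, calib_range, ratio_threshold=0.8):
    """Return True for a calibration area: the fraction of pixels whose
    temperature lies inside the pre-calibrated range meets the threshold
    (claims 5 and 6). The 0.8 threshold is an assumed value."""
    lo, hi = calib_range
    in_range = (temp_block >= lo) & (temp_block <= hi)
    return in_range.mean() >= ratio_threshold


def nearest_coefficients(mean_temp, coeff_table):
    """Calibration area: match the block temperature against the table and
    take the coefficients of the closest tabulated temperature (claim 2)."""
    t = min(coeff_table, key=lambda tt: abs(tt - mean_temp))
    return coeff_table[t]


def interpolated_coefficients(mean_temp, coeff_table):
    """Non-calibration area: take the two tabulated temperatures adjacent to
    the block temperature and blend their coefficients (claim 3); assumes the
    table holds at least two distinct calibrated temperatures."""
    t0, t1 = sorted(sorted(coeff_table, key=lambda tt: abs(tt - mean_temp))[:2])
    w = (mean_temp - t0) / (t1 - t0)          # may fall outside [0, 1]
    g0, b0 = coeff_table[t0]
    g1, b1 = coeff_table[t1]
    return (1 - w) * g0 + w * g1, (1 - w) * b0 + w * b1


def correct_frame(raw, temp_map, coeff_table, calib_range, block_size=64):
    """Correct each block with gain/offset coefficients chosen by its
    temperature class, then re-assemble the blocks into one frame
    (claims 1 and 4, with fusion simplified to positional stitching)."""
    out = np.zeros_like(raw, dtype=np.float32)
    for y, x, block in split_into_blocks(raw, block_size):
        tblock = temp_map[y:y + block.shape[0], x:x + block.shape[1]]
        mean_t = float(tblock.mean())
        if classify_block(tblock, calib_range):
            gain, offset = nearest_coefficients(mean_t, coeff_table)
        else:
            gain, offset = interpolated_coefficients(mean_t, coeff_table)
        out[y:y + block.shape[0], x:x + block.shape[1]] = gain * block + offset
    return out


# Hypothetical usage: one (gain, offset) pair per calibrated temperature
# (scalars here; per-pixel arrays broadcastable over a block work the same way).
# table = {10.0: (1.02, -3.0), 20.0: (1.00, 0.0), 30.0: (0.97, 2.5)}
# corrected = correct_frame(raw_frame, temp_map, table, calib_range=(15.0, 25.0))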
CN202311158339.XA 2023-09-08 2023-09-08 Non-uniformity correction method and system for unmanned aerial vehicle acquired image Pending CN116952391A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311158339.XA CN116952391A (en) 2023-09-08 2023-09-08 Non-uniformity correction method and system for unmanned aerial vehicle acquired image

Publications (1)

Publication Number Publication Date
CN116952391A true CN116952391A (en) 2023-10-27

Family

ID=88458580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311158339.XA Pending CN116952391A (en) 2023-09-08 2023-09-08 Non-uniformity correction method and system for unmanned aerial vehicle acquired image

Country Status (1)

Country Link
CN (1) CN116952391A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030234796A1 (en) * 2002-06-24 2003-12-25 Koninklijke Philips Electronics N.V. Color non-uniformity correction method and apparatus
US20120038660A1 (en) * 2010-08-12 2012-02-16 Samsung Electronics Co., Ltd. Display apparatus and image correction method of the same
CN103308178A (en) * 2013-06-04 2013-09-18 电子科技大学 Non-uniformity correction method for non-refrigeration infrared focal plane array
CN105841821A (en) * 2016-06-08 2016-08-10 南京理工大学 Calibration-based barrier sheet-free non-uniformity correction device and method thereof

Similar Documents

Publication Publication Date Title
US10645364B2 (en) Dynamic calibration of multi-camera systems using multiple multi-view image frames
CN110517209B (en) Data processing method, device, system and computer readable storage medium
WO2019171984A1 (en) Signal processing device, signal processing method, and program
CN111693147A (en) Method and device for temperature compensation, electronic equipment and computer readable storage medium
WO2013052600A1 (en) Using videogrammetry to fabricate parts
CN112529807B (en) Relative radiation correction method and device for satellite image
US20180063454A1 (en) Method and apparatus for correcting fixed pattern noise of an infrared image
CN110858872A (en) Optical axis offset compensation method and device
CN107403410B (en) Splicing method of thermal infrared images
CN114494388B (en) Three-dimensional image reconstruction method, device, equipment and medium in large-view-field environment
CN111383254A (en) Depth information acquisition method and system and terminal equipment
CN104636743A (en) Character image correction method and device
CN113962876B (en) Pixel distortion correction method, correction device and terminal
CN113766203B (en) Image white balance processing method
CN113272855B (en) Response normalization for overlapping multiple image applications
CN110673114B (en) Method and device for calibrating depth of three-dimensional camera, computer device and storage medium
CN112989872B (en) Target detection method and related device
CN116952391A (en) Non-uniformity correction method and system for unmanned aerial vehicle acquired image
CN116708756A (en) Sensor accuracy detection method, detection device, electronic device, and storage medium
CN114693807B (en) Method and system for reconstructing mapping data of power transmission line image and point cloud
WO2018139297A1 (en) Camera device
CN113496505B (en) Image registration method and device, multispectral camera, unmanned equipment and storage medium
CN114359425A (en) Method and device for generating ortho image, and method and device for generating ortho exponential graph
KR102240892B1 (en) Method and apparatus for correcting wavefront distortions of video signal
Wegner et al. Image based performance analysis of thermal imagers

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination