WO2021179223A1 - Infrared image processing method, processing device, unmanned aerial vehicle, and storage medium - Google Patents

Infrared image processing method, processing device, unmanned aerial vehicle, and storage medium

Info

Publication number
WO2021179223A1
Authority
WO
WIPO (PCT)
Prior art keywords
infrared image
image
fusion weight
shooting mode
temperature measurement
Prior art date
Application number
PCT/CN2020/078873
Other languages
English (en)
French (fr)
Inventor
张青涛
庹伟
王黎
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202080032171.6A priority Critical patent/CN113785559A/zh
Priority to PCT/CN2020/078873 priority patent/WO2021179223A1/zh
Publication of WO2021179223A1 publication Critical patent/WO2021179223A1/zh

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene

Definitions

  • This application relates to the field of image processing technology, and specifically to an infrared image processing method, an infrared image processing device, an unmanned aerial vehicle, and a computer-readable storage medium.
  • Infrared thermal imaging systems are typically designed with two modes, a high-gain mode and a low-gain mode. The high-gain mode has a narrow temperature measurement range but high temperature measurement accuracy, while the low-gain mode has a wide temperature measurement range but low temperature measurement accuracy. Therefore, whether an infrared image is acquired in the high-gain mode or in the low-gain mode, it is difficult to satisfy both the temperature measurement range and the temperature measurement accuracy of the image at the same time. How to reasonably fuse a high-gain-mode infrared image and a low-gain-mode infrared image so as to obtain an image that takes both temperature measurement range and temperature measurement accuracy into account has therefore become an urgent technical problem.
  • This application aims to solve at least one of the technical problems existing in the prior art or related technologies.
  • The first aspect of this application proposes an infrared image processing method.
  • The second aspect of this application proposes an infrared image processing device.
  • The third aspect of this application proposes an unmanned aerial vehicle.
  • The fourth aspect of this application proposes a computer-readable storage medium.
  • In view of this, the first aspect of the present application proposes an infrared image processing method, including: collecting a first infrared image of the object to be measured in a first shooting mode, and collecting a second infrared image of the object to be measured in a second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode; and performing image fusion on the first infrared image and the second infrared image to generate a third infrared image.
  • the second aspect of the present application proposes an infrared image processing device, including: an infrared image acquisition device that collects a first infrared image of an object to be measured in a first shooting mode, and collects an infrared image of the object to be measured in a second shooting mode The second infrared image, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode; a memory, which stores a computer program; a processor, connected to the infrared image acquisition device and the memory, when the processor executes the computer program: Image fusion is performed on the first infrared image and the second infrared image to generate a third infrared image.
  • The third aspect of the present application provides an unmanned aerial vehicle, including: a body; and an infrared image processing device, the infrared image processing device including: an infrared image acquisition device that collects a first infrared image of an object to be measured in a first shooting mode and collects a second infrared image of the object to be measured in a second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode; a memory that stores a computer program; and a processor connected to the infrared image acquisition device and the memory, which, when executing the computer program, implements: performing image fusion on the first infrared image and the second infrared image to generate a third infrared image.
  • The fourth aspect of the present application provides a computer-readable storage medium on which a computer program is stored.
  • When the computer program is executed by a processor, the infrared image processing method according to any one of the technical solutions of the first aspect is implemented. Therefore, the computer-readable storage medium has all the beneficial effects of the infrared image processing method of any one of the technical solutions of the first aspect described above.
  • This application proposes an infrared image processing method, an infrared image processing device, an unmanned aerial vehicle, and a computer-readable storage medium.
  • A first infrared image of the object to be measured is captured in the first shooting mode, and a second infrared image of the object to be measured is captured in the second shooting mode.
  • The image gain of the first shooting mode is greater than the image gain of the second shooting mode; that is, the first shooting mode is the high-gain mode and the second shooting mode is the low-gain mode.
  • The first infrared image collected in the high-gain mode has a narrow temperature measurement range and high temperature measurement accuracy, while the second infrared image collected in the low-gain mode has a wide temperature measurement range and low temperature measurement accuracy.
  • The first infrared image and the second infrared image are fused to generate a third infrared image with a high dynamic range, which satisfies both a high temperature measurement range and high temperature measurement accuracy, so that the resulting high-dynamic-range image can be used directly for temperature measurement to obtain the temperature of the object to be measured.
  • Fig. 1 shows a schematic flowchart of an infrared image processing method according to a first embodiment of the present application
  • Fig. 2 shows a schematic flowchart of an infrared image processing method according to a second embodiment of the present application
  • Fig. 3 shows a schematic flowchart of an infrared image processing method according to a third embodiment of the present application
  • Fig. 4 shows a schematic flowchart of an infrared image processing method according to a fourth embodiment of the present application
  • Fig. 5 shows a schematic diagram of an infrared image processing method according to a fifth embodiment of the present application
  • Fig. 6 shows a schematic block diagram of an infrared image processing device according to a sixth embodiment of the present application
  • Fig. 7 shows a schematic block diagram of an unmanned aerial vehicle according to a seventh embodiment of the present application.
  • Infrared imaging technology is a promising high technology that forms images by reflecting the surface temperature of objects.
  • At present, this technology has been widely used in various application fields, such as power inspection, search and rescue, fire rescue, and urban space modeling.
  • This application proposes an infrared image processing method, an infrared image processing device, an unmanned aerial vehicle, and a computer-readable storage medium, in which the first infrared image of the object to be measured collected in the first shooting mode and the second infrared image of the object to be measured collected in the second shooting mode are fused to generate a third infrared image with a high dynamic range, thereby ensuring both a high temperature measurement range and high temperature measurement accuracy of the infrared image.
  • In the embodiments of this application, an infrared camera is mounted on an unmanned aerial vehicle to photograph the scene and obtain infrared images, enabling aerial photography or temperature measurement, target tracking, monitoring, and similar tasks in special scenes, thereby realizing the applications in the above-mentioned fields.
  • In some embodiments, the infrared camera is installed on the gimbal (pan/tilt) device of the UAV.
  • As operating scenarios become more diverse, users have varied requirements for the gimbal camera: besides a single infrared camera, it may be a dual-lens camera that includes an infrared camera and a visible light camera, or a three-lens (three-light) camera that includes an infrared camera, a visible light zoom camera, and a visible light fixed-focus camera. The three-light camera integrates these three cameras into one unit and optimizes the internal module layout, so that it meets the needs of operations in a variety of scenarios while keeping the overall module smaller and lighter, further enhancing its practicability.
  • In some embodiments, the platform is not limited to unmanned aerial vehicles; it may also be surveillance, detection, or aerial photography equipment that can carry an infrared camera or has an infrared imaging function, such as helicopters, surveillance cameras, robots, fire trucks, and detectors.
  • Fig. 1 shows a schematic flowchart of an infrared image processing method according to a first embodiment of the present application.
  • the infrared image processing method of the first embodiment includes:
  • Step 102 Collect a first infrared image of the object to be measured in the first shooting mode, and collect a second infrared image of the object to be measured in the second shooting mode, wherein the image gain of the first shooting mode is greater than that of the second shooting mode.
  • Step 104 Perform image fusion on the first infrared image and the second infrared image to generate a third infrared image.
  • In the infrared image processing method provided by this application, a first infrared image of the object under test is captured in the first shooting mode, and a second infrared image of the object under test is captured in the second shooting mode.
  • The image gain of the first shooting mode is greater than the image gain of the second shooting mode; that is, the first shooting mode is a high-gain mode and the second shooting mode is a low-gain mode.
  • The first infrared image and the second infrared image are then fused to generate a third infrared image with a high dynamic range, which satisfies both a high temperature measurement range and high temperature measurement accuracy, so that the resulting high-dynamic-range image can be used directly for temperature measurement to obtain the temperature of the object to be measured.
  • The infrared image shooting device may be an infrared camera; besides a single infrared camera, it may be a dual-lens camera that includes an infrared camera and a visible light camera, or a three-lens camera that includes an infrared camera, a visible light zoom camera, and a visible light fixed-focus camera.
  • In the above embodiment, that the image gain of the first shooting mode is greater than the image gain of the second shooting mode means that the temperature measurement range of the first shooting mode is less than or equal to the temperature measurement range of the second shooting mode, and the temperature measurement accuracy of the first shooting mode is greater than or equal to the temperature measurement accuracy of the second shooting mode.
  • In this embodiment, the temperature measurement range of the infrared image collected in the high-gain mode is narrow, for example -40°C to 120°C, and the temperature measurement accuracy is high, for example ±2°C; the temperature measurement range of the infrared image collected in the low-gain mode is wide, for example -40°C to 550°C, and the temperature measurement accuracy is low, for example ±5°C.
  • In some embodiments, the difference between the upper limit and the lower limit of the temperature measurement range in the first shooting mode is less than or equal to the difference between the upper limit and the lower limit of the temperature measurement range in the second shooting mode.
  • The temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode may be two non-overlapping temperature measurement intervals; for example, the temperature measurement range of the first shooting mode is -40°C to 50°C and that of the second shooting mode is 60°C to 550°C.
  • The two temperature measurement ranges may also overlap; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -100°C to 550°C.
  • It should be noted that a temperature measurement range can be a closed interval or an open interval.
  • In any of the above embodiments, the temperature measurement range of the first shooting mode is a subset of the temperature measurement range of the second shooting mode; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -40°C to 550°C, or the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -50°C to 550°C.
  • In some embodiments, the two modes need not be a high-gain mode and a low-gain mode; they may instead be modes dedicated to measuring particular temperature segments, such as high-precision measurement of a low-temperature segment and high-precision measurement of a high-temperature segment, whose images are then combined into one image with a high dynamic range and high temperature measurement accuracy.
  • Fig. 2 shows a schematic flow chart of an infrared image processing method according to a second embodiment of the present application.
  • the infrared image processing method of the second embodiment includes:
  • Step 202 Collect a first infrared image of the object to be measured in the first shooting mode, and collect a second infrared image of the object to be measured in the second shooting mode, wherein the image gain of the first shooting mode is greater than that of the second shooting mode.
  • Step 204 Perform image fusion on the pixels of the first infrared image according to the first fusion weight and the pixels of the second infrared image according to the second fusion weight to generate a third infrared image; or perform image fusion on the designated area of the first infrared image according to the third fusion weight and the designated area of the second infrared image according to the fourth fusion weight to generate the third infrared image.
  • In this embodiment, corresponding pixels in the first infrared image and the second infrared image are found and fused, or corresponding areas of the two images are fused, to generate a high-dynamic-range infrared image that satisfies both a high temperature measurement range and high temperature measurement accuracy.
  • the weight of each pixel on the same infrared image may be the same or different, and the weight of each pixel in the designated area on the same infrared image may be the same or different.
  • the sum of the fusion weight of the first infrared image and the fusion weight of the second infrared image is 1 or a specified value.
  • In some embodiments, the pixel value of the third infrared image is calculated from the pixel values of the first infrared image and the second infrared image using formula (1):
  • P3 = P1 × a + P2 × (1 - a)          (1)
  • where P3 is the pixel value of the third infrared image, P1 is the pixel value of the first infrared image, P2 is the pixel value of the second infrared image, and a is the fusion weight of the first infrared image.
  • the sum of the first fusion weight and the second fusion weight is 1; or the sum of the third fusion weight and the fourth fusion weight is 1.
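  • The weighted fusion of formula (1) can be illustrated with a short sketch. This is a minimal illustration assuming the two raw infrared images are already registered, equally sized floating-point arrays; the array names and the example weight value are illustrative, not taken from the patent.

```python
import numpy as np

def fuse_infrared(img_high_gain: np.ndarray, img_low_gain: np.ndarray, a: float = 0.7) -> np.ndarray:
    """Fuse two registered raw infrared images with formula (1): P3 = P1*a + P2*(1 - a).

    img_high_gain: first infrared image (high-gain mode), float array.
    img_low_gain:  second infrared image (low-gain mode), float array, same shape.
    a:             fusion weight of the first infrared image, in [0, 1].
    """
    assert img_high_gain.shape == img_low_gain.shape
    return img_high_gain * a + img_low_gain * (1.0 - a)

# Example: fuse two 512x640 raw frames with a fixed global weight of 0.7.
p1 = np.random.rand(512, 640).astype(np.float32)  # stand-in for the high-gain raw image
p2 = np.random.rand(512, 640).astype(np.float32)  # stand-in for the low-gain raw image
p3 = fuse_infrared(p1, p2, a=0.7)                 # high-dynamic-range result
```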
  • the first fusion weight of each pixel of the first infrared image is the same, and the second fusion weight of each pixel of the second infrared image is the same.
  • the weight of each pixel on the same infrared image is the same.
  • the weight can be a preset value or set according to shooting parameters, and the shooting parameters can be values related to high gain and low gain.
  • the method further includes: setting the first fusion weight and the second fusion weight according to the ratio relationship between the image gain of the first shooting mode and the image gain of the second shooting mode.
  • In this embodiment, if the weight of each pixel on the same infrared image is the same, the fusion weights can be set according to the high-gain and low-gain values; for example, the first fusion weight and the second fusion weight can be set according to the ratio of the high gain to the low gain.
  • It should be noted that the high-gain and low-gain values can also be adjusted by the user; after a gain value is changed, the corresponding weight changes accordingly.
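  • One plausible reading of setting the weights from the gain ratio is sketched below; the normalization used here (each weight proportional to its mode's gain) is an assumption for illustration, since the patent does not fix a specific mapping.

```python
def weights_from_gains(high_gain: float, low_gain: float) -> tuple[float, float]:
    """Derive the first/second fusion weights from the two gain values.

    Assumed mapping: each weight is proportional to its mode's gain,
    so the two weights sum to 1 as required.
    """
    total = high_gain + low_gain
    w_first = high_gain / total   # first fusion weight (high-gain image)
    w_second = low_gain / total   # second fusion weight (low-gain image)
    return w_first, w_second

# Example: a high gain of 8x and a low gain of 2x give weights 0.8 and 0.2.
w1, w2 = weights_from_gains(8.0, 2.0)
```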
  • the method further includes: setting the first fusion weight and the second fusion weight according to the relative relationship between the temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode.
  • In this embodiment, if the weight of each pixel on the same infrared image is the same, the first fusion weight and the second fusion weight can be determined according to the relative relationship between the temperature measurement range corresponding to the high gain and the temperature measurement range corresponding to the low gain.
  • Different gain values may correspond to different temperature measurement ranges; for example, the high gain may have five levels whose temperature measurement ranges are not exactly the same, and the low gain is similar.
  • For example, suppose the high-gain temperature measurement range of an infrared camera is 0°C to 100°C and the low-gain temperature measurement range is 0°C to 500°C.
  • If the maximum temperature of the current scene is 100°C, the high-gain weight can be 1 and the low-gain weight can be 0; that is, the high-gain mode alone yields an infrared image with high accuracy and a temperature measurement range that covers the actual scene.
  • If the known maximum temperature of the current scene is 300°C, the actual temperature measurement range cannot be covered by the high-gain mode alone, and using only the low-gain mode gives poor temperature measurement accuracy, so image fusion can be performed; the high-gain weight may be, for example, 0.7 and the low-gain weight 0.3, which yields an image with high accuracy (compared with photographing pixels in the 0°C to 100°C range using only the low gain) and a large temperature measurement range (compared with photographing pixels in the 0°C to 100°C range using only the high gain).
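  • The worked example above (high-gain range 0°C to 100°C, low-gain range 0°C to 500°C) can be expressed as a simple rule; the specific thresholds and the 0.7/0.3 blend below are a hypothetical illustration of the described behavior, not the patented algorithm.

```python
def weights_from_scene(scene_max_c: float,
                       high_gain_range=(0.0, 100.0),
                       low_gain_range=(0.0, 500.0)) -> tuple[float, float]:
    """Pick fusion weights from the relative relationship between the two
    temperature measurement ranges and the known maximum scene temperature."""
    if scene_max_c <= high_gain_range[1]:
        # High-gain mode alone covers the scene: weights 1 and 0.
        return 1.0, 0.0
    if scene_max_c > low_gain_range[1]:
        # Even the low-gain mode cannot cover the scene: rely on it fully.
        return 0.0, 1.0
    # Scene exceeds the high-gain range but fits the low-gain range:
    # blend, favoring the more accurate high-gain image (values are illustrative).
    return 0.7, 0.3

print(weights_from_scene(100.0))  # (1.0, 0.0)
print(weights_from_scene(300.0))  # (0.7, 0.3)
```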
  • the method further includes: receiving the temperature measurement range required by the user; and setting the first fusion weight and the second fusion weight according to the temperature measurement range required by the user.
  • In this embodiment, the weight of each pixel on the same infrared image is the same, and the weights can be set according to the user's requirement for the temperature measurement range.
  • The weights can be related to the temperature measurement range that the user needs to photograph; in some cases, the high-gain and low-gain values used by the infrared camera are fixed, that is, the corresponding temperature measurement ranges are fixed, so as long as the user specifies the required temperature measurement range, the first fusion weight and the second fusion weight can be calculated according to the user's needs.
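  • If the camera's two mode ranges are fixed and the user supplies a required measurement range, the weights can be derived from how much of that range each mode covers; the coverage-based rule below is an assumption for illustration, as the patent does not prescribe the exact calculation.

```python
def weights_from_user_range(user_range, high_range=(-40.0, 120.0), low_range=(-40.0, 550.0)):
    """Set fusion weights from the user's required temperature range.

    Assumed rule: weight each mode by the fraction of the user's range it covers,
    then normalize so the two weights sum to 1.
    """
    def coverage(mode_range):
        lo = max(user_range[0], mode_range[0])
        hi = min(user_range[1], mode_range[1])
        return max(0.0, hi - lo) / (user_range[1] - user_range[0])

    c_high, c_low = coverage(high_range), coverage(low_range)
    total = c_high + c_low
    return (c_high / total, c_low / total) if total > 0 else (0.0, 1.0)

# A user range of -40C to 300C is fully covered by the low-gain mode but only
# partly by the high-gain mode, so the low-gain image gets the larger weight.
print(weights_from_user_range((-40.0, 300.0)))
```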
  • the first fusion weight of each pixel of the first infrared image is not completely the same, and the second fusion weight of each pixel of the second infrared image is not completely the same.
  • In any of the above embodiments, the method further includes: setting the first fusion weight and the second fusion weight according to image pixel point information, where the image pixel point information includes one or a combination of the following: pixel texture, pixel signal-to-noise ratio, pixel information amount, and pixel temperature value.
  • In this embodiment, the fusion weights of the pixels of the same infrared image are not completely the same; the weight of each pixel can be determined according to its texture, signal-to-noise ratio, or amount of information, or according to the interval in which its temperature value falls.
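  • A per-pixel weight map can be built from local image statistics; the sketch below uses local variance as a stand-in for "pixel information amount" (an assumption on my part; the patent lists texture, signal-to-noise ratio, information amount, or temperature interval as possible criteria without fixing a formula).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def per_pixel_weights(img1: np.ndarray, img2: np.ndarray, win: int = 5) -> np.ndarray:
    """Return a per-pixel first-fusion-weight map in [0, 1].

    Assumed criterion: local variance within a win x win window as a proxy for
    the amount of information at each pixel; the image with more local detail
    at a pixel receives the larger weight there.
    """
    def local_var(img):
        mean = uniform_filter(img, size=win)
        mean_sq = uniform_filter(img * img, size=win)
        return np.clip(mean_sq - mean * mean, 0.0, None)

    v1, v2 = local_var(img1), local_var(img2)
    return v1 / (v1 + v2 + 1e-12)  # first fusion weight per pixel

# Fusion then applies formula (1) pixel-wise:
# fused = w * img1 + (1 - w) * img2, where w = per_pixel_weights(img1, img2)
```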
  • the third fusion weight of pixels in the same designated area of the first infrared image is the same, and the fourth fusion weight of pixels in the same designated area of the second infrared image is the same.
  • In any of the above embodiments, the method further includes: setting the third fusion weight and the fourth fusion weight according to designated area information, where the designated area information includes one or a combination of the following: the amount of area pixel information, the area temperature change range, the area temperature change gradient value, and the area signal-to-noise ratio.
  • In this embodiment, the weight of every pixel within a designated area of the same infrared image is the same; the amount of information, temperature change range, gradient value, signal-to-noise ratio, and the like of the pixels in each area are computed to determine the weight of the pixels in that area.
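  • For area-level fusion, the statistics listed above can be computed per designated area (for example on a block grid) and turned into one weight per area; the block partition and the variance criterion below are illustrative assumptions rather than the patent's specific procedure.

```python
import numpy as np

def region_weights(img1: np.ndarray, img2: np.ndarray, block: int = 32) -> np.ndarray:
    """Compute one third-fusion-weight per block x block region (the same weight
    for every pixel inside a region), based on which image carries more
    information (here: larger variance) in that region."""
    h, w = img1.shape
    weights = np.zeros_like(img1, dtype=np.float32)
    for y in range(0, h, block):
        for x in range(0, w, block):
            r1 = img1[y:y + block, x:x + block]
            r2 = img2[y:y + block, x:x + block]
            v1, v2 = r1.var(), r2.var()
            weights[y:y + block, x:x + block] = v1 / (v1 + v2 + 1e-12)
    return weights  # the fourth fusion weight is 1 - weights, region by region
```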
  • Fig. 3 shows a schematic flowchart of an infrared image processing method according to a third embodiment of the present application.
  • the infrared image processing method of the third embodiment includes:
  • Step 302 Collect a first infrared image of the object to be measured in the first shooting mode, and collect a second infrared image of the object to be measured in the second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode;
  • Step 304 Perform image fusion on the first infrared image and the second infrared image to generate a third infrared image
  • Step 306 Perform designated processing on the third infrared image, where the designated processing includes one or a combination of the following: global stretch enhancement processing, local stretch enhancement processing, detail enhancement processing, and pseudo-color mapping processing.
  • In this embodiment, the third infrared image is subjected to operations such as global stretch enhancement processing, local stretch enhancement processing, detail enhancement processing, and pseudo-color mapping processing to obtain a high-dynamic-range viewing image, thereby achieving high-precision image observation.
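  • A minimal sketch of the listed post-processing steps is shown below: a global contrast stretch of the high bit-width fused image down to 8 bits, followed by pseudo-color mapping. OpenCV's built-in JET colormap is used here as one possible mapping; the patent does not prescribe a specific stretch or palette.

```python
import numpy as np
import cv2

def stretch_and_colorize(fused: np.ndarray) -> np.ndarray:
    """Globally stretch a high bit-width fused infrared image to 8 bits and
    apply a pseudo-color mapping to produce a viewing image."""
    lo, hi = np.percentile(fused, 1), np.percentile(fused, 99)   # robust global stretch
    stretched = np.clip((fused - lo) / (hi - lo + 1e-12), 0.0, 1.0)
    img8 = (stretched * 255).astype(np.uint8)
    return cv2.applyColorMap(img8, cv2.COLORMAP_JET)             # BGR viewing image
```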
  • Fig. 4 shows a schematic flowchart of an infrared image processing method according to a fourth embodiment of the present application.
  • the infrared image processing method of the fourth embodiment includes:
  • Step 402 Collect a first infrared image of the object to be measured in the first shooting mode, and collect a second infrared image of the object to be measured in the second shooting mode, wherein the image gain of the first shooting mode is greater than that of the second shooting mode.
  • Step 404 Perform image preprocessing on the first infrared image and the second infrared image, where the image preprocessing includes one or a combination of the following: image correction, dead pixel removal, and noise removal;
  • Step 406 Perform image fusion on the first infrared image and the second infrared image to generate a third infrared image.
  • In this embodiment, the infrared signals of the two images are preprocessed (mainly correction of the data output directly by the image sensor and removal of defects, such as image sensor responsivity correction, offset correction, dead pixel removal, and noise removal) to obtain an unstretched, clean raw infrared image for each of the two modes. The two raw infrared images are then combined to obtain a high bit-width image.
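  • The preprocessing chain can be sketched as follows; the gain/offset correction, median-based dead-pixel replacement, and light Gaussian denoising shown here are common infrared preprocessing steps used as illustrative stand-ins for the correction, dead-pixel-removal, and noise-removal operations the text lists, and the calibration maps are assumed inputs.

```python
import numpy as np
from scipy.ndimage import median_filter, gaussian_filter

def preprocess_raw(raw: np.ndarray, gain_map: np.ndarray, offset_map: np.ndarray,
                   dead_mask: np.ndarray) -> np.ndarray:
    """Correct, de-badpixel, and denoise one raw infrared frame."""
    corrected = raw * gain_map + offset_map            # responsivity / offset correction
    med = median_filter(corrected, size=3)             # neighborhood estimate
    cleaned = np.where(dead_mask, med, corrected)      # replace flagged dead pixels
    return gaussian_filter(cleaned, sigma=0.7)         # light noise removal
```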
  • In some embodiments, the generated third infrared image may also be subjected to specified processing, where the specified processing includes one or a combination of the following: global stretch enhancement processing, local stretch enhancement processing, detail enhancement processing, and pseudo-color mapping processing.
  • Fig. 5 shows a schematic diagram of an infrared image processing method according to a fifth embodiment of the present application.
  • the infrared image processing method of the fifth embodiment includes:
  • The high-gain-mode image has a narrow temperature measurement range and high temperature measurement accuracy, while the low-gain-mode image has a wide temperature measurement range and low temperature measurement accuracy.
  • The two raw infrared images are fused to obtain a high bit-width image.
  • The resulting high-dynamic-range temperature measurement map achieves both a high temperature measurement range and high temperature measurement accuracy, and the high bit-width image can be used directly for temperature measurement.
  • Fig. 6 shows a schematic block diagram of an infrared image processing device according to a sixth embodiment of the present application.
  • the infrared image processing device 600 of the sixth embodiment includes:
  • An infrared image acquisition device 602, which collects a first infrared image of the object to be measured in the first shooting mode and a second infrared image of the object to be measured in the second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode;
  • a memory 604, which stores a computer program;
  • a processor 606, connected to the infrared image acquisition device 602 and the memory 604; when the processor 606 executes the computer program, it implements: performing image fusion on the first infrared image and the second infrared image to generate a third infrared image.
  • In the infrared image processing device provided by this application, the first infrared image of the object under test is captured in the first shooting mode, and the second infrared image of the object under test is captured in the second shooting mode.
  • The image gain of the first shooting mode is greater than the image gain of the second shooting mode; that is, the first shooting mode is a high-gain mode and the second shooting mode is a low-gain mode.
  • The processor 606 fuses the first infrared image and the second infrared image to generate a third infrared image with a high dynamic range, which satisfies both a high temperature measurement range and high temperature measurement accuracy, so that the resulting high-dynamic-range image can be used directly for temperature measurement to obtain the temperature of the object to be measured.
  • the infrared image acquisition device 602 is an infrared camera, and the infrared image acquisition device 602 includes an infrared thermal imaging lens and an image sensor.
  • In addition to an infrared camera, the camera may be a dual-lens camera, that is, one including an infrared camera and a visible light camera, or a three-light camera including an infrared camera, a visible light zoom camera, and a visible light fixed-focus camera; the processor 606 is an image processor.
  • In the above embodiment, that the image gain of the first shooting mode is greater than the image gain of the second shooting mode means that the temperature measurement range of the first shooting mode is less than or equal to the temperature measurement range of the second shooting mode, and the temperature measurement accuracy of the first shooting mode is greater than or equal to the temperature measurement accuracy of the second shooting mode.
  • In this embodiment, the temperature measurement range of the infrared image collected in the high-gain mode is narrow, such as -40°C to 120°C, and the temperature measurement accuracy is high, such as ±2°C; the temperature measurement range of the infrared image collected in the low-gain mode is wide, such as -40°C to 550°C, and the temperature measurement accuracy is low, such as ±5°C.
  • In some embodiments, the difference between the upper limit and the lower limit of the temperature measurement range in the first shooting mode is less than or equal to the difference between the upper limit and the lower limit of the temperature measurement range in the second shooting mode.
  • The temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode may be two non-overlapping temperature measurement intervals; for example, the temperature measurement range of the first shooting mode is -40°C to 50°C and that of the second shooting mode is 60°C to 550°C.
  • The two temperature measurement ranges may also overlap; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -100°C to 550°C.
  • It should be noted that a temperature measurement range can be a closed interval or an open interval.
  • In any of the above embodiments, the temperature measurement range of the first shooting mode is a subset of the temperature measurement range of the second shooting mode; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -40°C to 550°C, or the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -50°C to 550°C.
  • In some embodiments, when the processor 606 executes the computer program, it also implements: performing specified processing on the third infrared image, where the specified processing includes one or a combination of the following: global stretch enhancement processing, local stretch enhancement processing, detail enhancement processing, and pseudo-color mapping processing.
  • In this embodiment, the third infrared image is subjected to operations such as global stretch enhancement processing, local stretch enhancement processing, detail enhancement processing, and pseudo-color mapping processing to obtain a high-dynamic-range viewing image, thereby achieving high-precision image observation.
  • In some embodiments, the processor 606 performing image fusion of the first infrared image and the second infrared image to generate the third infrared image specifically includes: performing image fusion on the pixels of the first infrared image according to the first fusion weight and the pixels of the second infrared image according to the second fusion weight to generate the third infrared image; or performing image fusion on the designated area of the first infrared image according to the third fusion weight and the designated area of the second infrared image according to the fourth fusion weight to generate the third infrared image.
  • In this embodiment, corresponding pixels or corresponding areas of the two infrared images are fused, so that the generated image satisfies both a high temperature measurement range and high temperature measurement accuracy.
  • the weight of each pixel on the same infrared image may be the same or different, and the weight of each pixel in the designated area on the same infrared image may be the same or different.
  • the sum of the fusion weight of the first infrared image and the fusion weight of the second infrared image is 1 or a specified value.
  • In some embodiments, the pixel value of the third infrared image is calculated from the pixel values of the first infrared image and the second infrared image using formula (1):
  • P3 = P1 × a + P2 × (1 - a)          (1)
  • where P3 is the pixel value of the third infrared image, P1 is the pixel value of the first infrared image, P2 is the pixel value of the second infrared image, and a is the fusion weight of the first infrared image.
  • the sum of the first fusion weight and the second fusion weight is 1; or the sum of the third fusion weight and the fourth fusion weight is 1.
  • the first fusion weight of each pixel of the first infrared image is the same, and the second fusion weight of each pixel of the second infrared image is the same.
  • the weight of each pixel on the same infrared image is the same.
  • the weight can be a preset value or set according to shooting parameters, and the shooting parameters can be values related to high gain and low gain.
  • In some embodiments, when the processor 606 executes the computer program, it further implements: setting the first fusion weight and the second fusion weight according to the ratio relationship between the image gain of the first shooting mode and the image gain of the second shooting mode. In this embodiment, if the weight of each pixel on the same infrared image is the same, the fusion weights can be set according to the high-gain and low-gain values, for example according to the ratio of the high gain to the low gain.
  • the values of the high gain and the low gain can also be adjusted by the user. After the value of the gain is changed, the corresponding weight will also be changed accordingly.
  • In some embodiments, when the processor 606 executes the computer program, it also implements: setting the first fusion weight and the second fusion weight according to the relative relationship between the temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode.
  • In this embodiment, if the weight of each pixel on the same infrared image is the same, the first fusion weight and the second fusion weight can be determined according to the relative relationship between the temperature measurement range corresponding to the high gain and the temperature measurement range corresponding to the low gain.
  • Different gain values may correspond to different temperature measurement ranges; for example, the high gain may have five levels whose temperature measurement ranges are not exactly the same, and the low gain is similar.
  • For example, suppose the high-gain temperature measurement range of an infrared camera is 0°C to 100°C and the low-gain temperature measurement range is 0°C to 500°C.
  • If the maximum temperature of the current scene is 100°C, the high-gain weight can be 1 and the low-gain weight can be 0; that is, the high-gain mode alone yields an infrared image with high accuracy and a temperature measurement range that covers the actual scene.
  • If the known maximum temperature of the current scene is 300°C, the actual temperature measurement range cannot be covered by the high-gain mode alone, and using only the low-gain mode gives poor temperature measurement accuracy, so image fusion can be performed; the high-gain weight may be, for example, 0.7 and the low-gain weight 0.3, which yields an image with high accuracy (compared with photographing pixels in the 0°C to 100°C range using only the low gain) and a large temperature measurement range (compared with photographing pixels in the 0°C to 100°C range using only the high gain).
  • In some embodiments, when the processor 606 executes the computer program, it also implements: receiving the temperature measurement range required by the user, and setting the first fusion weight and the second fusion weight according to that required range.
  • In this embodiment, the weight of each pixel on the same infrared image is the same, and the weights can be set according to the user's requirement for the temperature measurement range.
  • The weights can be related to the temperature measurement range that the user needs to photograph; in some cases, the high-gain and low-gain values used by the infrared camera are fixed, that is, the corresponding temperature measurement ranges are fixed, so as long as the user specifies the required temperature measurement range, the first fusion weight and the second fusion weight can be calculated according to the user's needs.
  • the first fusion weight of each pixel of the first infrared image is not completely the same, and the second fusion weight of each pixel of the second infrared image is not completely the same.
  • In some embodiments, when the processor 606 executes the computer program, it also implements: setting the first fusion weight and the second fusion weight according to image pixel point information, where the image pixel point information includes one or a combination of the following: pixel texture, pixel signal-to-noise ratio, pixel information amount, and pixel temperature value.
  • In this embodiment, the fusion weights of the pixels of the same infrared image are not completely the same; the weight of each pixel can be determined according to its texture, signal-to-noise ratio, or amount of information, or according to the interval in which its temperature value falls.
  • the third fusion weight of pixels in the same designated area of the first infrared image is the same, and the fourth fusion weight of pixels in the same designated area of the second infrared image is the same.
  • In some embodiments, when the processor 606 executes the computer program, it also implements: setting the third fusion weight and the fourth fusion weight according to designated area information, where the designated area information includes one or a combination of the following: the amount of area pixel information, the area temperature change range, the area temperature change gradient value, and the area signal-to-noise ratio.
  • In this embodiment, the weight of every pixel within a designated area of the same infrared image is the same; the amount of information, temperature change range, gradient value, signal-to-noise ratio, and the like of the pixels in each area are computed to determine the weight of the pixels in that area.
  • In some embodiments, when the processor 606 executes the computer program, it also implements: performing image preprocessing on the first infrared image and the second infrared image, where the image preprocessing includes one or a combination of the following: image correction, dead pixel removal, and noise removal.
  • In this embodiment, the infrared signals of the two images are preprocessed (mainly correction of the data output directly by the image sensor and removal of defects, such as image sensor responsivity correction, offset correction, dead pixel removal, and noise removal) to obtain an unstretched, clean raw infrared image for each of the two modes. The two raw infrared images are then combined to obtain a high bit-width image.
  • Fig. 7 shows a schematic block diagram of an unmanned aerial vehicle according to a seventh embodiment of the present application.
  • the unmanned aerial vehicle 700 of the seventh embodiment includes:
  • an infrared image processing device 704, where the infrared image processing device 704 includes:
  • an infrared image acquisition device 7042, which collects a first infrared image of the object under test in the first shooting mode and a second infrared image of the object under test in the second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode;
  • a memory 7044, in which a computer program is stored;
  • the processor 7046 is connected to the infrared image acquisition device and the memory, and when the processor executes the computer program, it realizes: image fusion of the first infrared image and the second infrared image to generate a third infrared image.
  • In the unmanned aerial vehicle provided by this application, the infrared image acquisition device 7042 captures a first infrared image of the object under test in the first shooting mode and a second infrared image of the object under test in the second shooting mode.
  • The image gain of the first shooting mode is greater than the image gain of the second shooting mode; that is, the first shooting mode is the high-gain mode and the second shooting mode is the low-gain mode.
  • The processor 7046 fuses the first infrared image and the second infrared image to generate a third infrared image with a high dynamic range, which satisfies both a high temperature measurement range and high temperature measurement accuracy, so that the resulting high-dynamic-range image can be used directly for temperature measurement to obtain the temperature of the object to be measured.
  • the infrared image acquisition device 7042 may be a pan/tilt camera, which is installed on the pan/tilt device of an unmanned aerial vehicle.
  • the infrared image acquisition device 7042 includes an infrared thermal imaging lens and an image sensor.
  • In addition to an infrared camera, it may be a dual-lens camera, which includes an infrared camera and a visible light camera, or a three-light camera, which includes an infrared camera, a visible light zoom camera, and a visible light fixed-focus camera; the processor 7046 is an image processor.
  • In the above embodiment, that the image gain of the first shooting mode is greater than the image gain of the second shooting mode means that the temperature measurement range of the first shooting mode is less than or equal to the temperature measurement range of the second shooting mode, and the temperature measurement accuracy of the first shooting mode is greater than or equal to the temperature measurement accuracy of the second shooting mode.
  • In this embodiment, the temperature measurement range of the infrared image collected in the high-gain mode is narrow, such as -40°C to 120°C, and the temperature measurement accuracy is high, such as ±2°C; the temperature measurement range of the infrared image collected in the low-gain mode is wide, such as -40°C to 550°C, and the temperature measurement accuracy is low, such as ±5°C.
  • In some embodiments, the difference between the upper limit and the lower limit of the temperature measurement range in the first shooting mode is less than or equal to the difference between the upper limit and the lower limit of the temperature measurement range in the second shooting mode.
  • The temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode may be two non-overlapping temperature measurement intervals; for example, the temperature measurement range of the first shooting mode is -40°C to 50°C and that of the second shooting mode is 60°C to 550°C.
  • The two temperature measurement ranges may also overlap; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -100°C to 550°C.
  • It should be noted that a temperature measurement range can be a closed interval or an open interval.
  • In any of the above embodiments, the temperature measurement range of the first shooting mode is a subset of the temperature measurement range of the second shooting mode; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -40°C to 550°C, or the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -50°C to 550°C.
  • In some embodiments, when the processor 7046 executes the computer program, it also implements: performing specified processing on the third infrared image, where the specified processing includes one or a combination of the following: global stretch enhancement processing, local stretch enhancement processing, detail enhancement processing, and pseudo-color mapping processing.
  • In this embodiment, the third infrared image is subjected to operations such as global stretch enhancement processing, local stretch enhancement processing, detail enhancement processing, and pseudo-color mapping processing to obtain a high-dynamic-range viewing image, thereby achieving high-precision image observation.
  • In some embodiments, the processor 7046 performing image fusion of the first infrared image and the second infrared image to generate the third infrared image specifically includes: performing image fusion on the pixels of the first infrared image according to the first fusion weight and the pixels of the second infrared image according to the second fusion weight to generate the third infrared image; or performing image fusion on the designated area of the first infrared image according to the third fusion weight and the designated area of the second infrared image according to the fourth fusion weight to generate the third infrared image.
  • In this embodiment, corresponding pixels or corresponding areas of the two infrared images are fused, so that the generated image satisfies both a high temperature measurement range and high temperature measurement accuracy.
  • the weight of each pixel on the same infrared image may be the same or different, and the weight of each pixel in the designated area on the same infrared image may be the same or different.
  • the sum of the fusion weight of the first infrared image and the fusion weight of the second infrared image is 1 or a specified value.
  • In some embodiments, the pixel value of the third infrared image is calculated from the pixel values of the first infrared image and the second infrared image using formula (1):
  • P3 = P1 × a + P2 × (1 - a)          (1)
  • where P3 is the pixel value of the third infrared image, P1 is the pixel value of the first infrared image, P2 is the pixel value of the second infrared image, and a is the fusion weight of the first infrared image.
  • the sum of the first fusion weight and the second fusion weight is 1; or the sum of the third fusion weight and the fourth fusion weight is 1.
  • the first fusion weight of each pixel of the first infrared image is the same, and the second fusion weight of each pixel of the second infrared image is the same.
  • the weight of each pixel on the same infrared image is the same.
  • the weight can be a preset value or set according to shooting parameters, and the shooting parameters can be values related to high gain and low gain.
  • In some embodiments, when the processor 7046 executes the computer program, it also implements: setting the first fusion weight and the second fusion weight according to the ratio relationship between the image gain of the first shooting mode and the image gain of the second shooting mode. In this embodiment, if the weight of each pixel on the same infrared image is the same, the fusion weights can be set according to the high-gain and low-gain values, for example according to the ratio of the high gain to the low gain.
  • the values of the high gain and the low gain can also be adjusted by the user. After the value of the gain is changed, the corresponding weight will also be changed accordingly.
  • In some embodiments, when the processor 7046 executes the computer program, it also implements: setting the first fusion weight and the second fusion weight according to the relative relationship between the temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode.
  • In this embodiment, if the weight of each pixel on the same infrared image is the same, the first fusion weight and the second fusion weight can be determined according to the relative relationship between the temperature measurement range corresponding to the high gain and the temperature measurement range corresponding to the low gain.
  • Different gain values may correspond to different temperature measurement ranges; for example, the high gain may have five levels whose temperature measurement ranges are not exactly the same, and the low gain is similar.
  • For example, suppose the high-gain temperature measurement range of an infrared camera is 0°C to 100°C and the low-gain temperature measurement range is 0°C to 500°C.
  • If the maximum temperature of the current scene is 100°C, the high-gain weight can be 1 and the low-gain weight can be 0; that is, the high-gain mode alone yields an infrared image with high accuracy and a temperature measurement range that covers the actual scene.
  • If the known maximum temperature of the current scene is 300°C, the actual temperature measurement range cannot be covered by the high-gain mode alone, and using only the low-gain mode gives poor temperature measurement accuracy, so image fusion can be performed; the high-gain weight may be, for example, 0.7 and the low-gain weight 0.3, which yields an image with high accuracy (compared with photographing pixels in the 0°C to 100°C range using only the low gain) and a large temperature measurement range (compared with photographing pixels in the 0°C to 100°C range using only the high gain).
  • In some embodiments, when the processor 7046 executes the computer program, it also implements: receiving the temperature measurement range required by the user, and setting the first fusion weight and the second fusion weight according to that required range.
  • In this embodiment, the weight of each pixel on the same infrared image is the same, and the weights can be set according to the user's requirement for the temperature measurement range.
  • The weights can be related to the temperature measurement range that the user needs to photograph; in some cases, the high-gain and low-gain values used by the infrared camera are fixed, that is, the corresponding temperature measurement ranges are fixed, so as long as the user specifies the required temperature measurement range, the first fusion weight and the second fusion weight can be calculated according to the user's needs.
  • the first fusion weight of each pixel of the first infrared image is not completely the same, and the second fusion weight of each pixel of the second infrared image is not completely the same.
  • In some embodiments, when the processor 7046 executes the computer program, it also implements: setting the first fusion weight and the second fusion weight according to image pixel point information, where the image pixel point information includes one or a combination of the following: pixel texture, pixel signal-to-noise ratio, pixel information amount, and pixel temperature value.
  • In this embodiment, the fusion weights of the pixels of the same infrared image are not completely the same; the weight of each pixel can be determined according to its texture, signal-to-noise ratio, or amount of information, or according to the interval in which its temperature value falls.
  • the third fusion weight of pixels in the same designated area of the first infrared image is the same, and the fourth fusion weight of pixels in the same designated area of the second infrared image is the same.
  • In some embodiments, when the processor 7046 executes the computer program, it also implements: setting the third fusion weight and the fourth fusion weight according to designated area information, where the designated area information includes one or a combination of the following: the amount of area pixel information, the area temperature change range, the area temperature change gradient value, and the area signal-to-noise ratio.
  • In this embodiment, the weight of every pixel within a designated area of the same infrared image is the same; the amount of information, temperature change range, gradient value, signal-to-noise ratio, and the like of the pixels in each area are computed to determine the weight of the pixels in that area.
  • In some embodiments, when the processor 7046 executes the computer program, it also implements: performing image preprocessing on the first infrared image and the second infrared image, where the image preprocessing includes one or a combination of the following: image correction, dead pixel removal, and noise removal.
  • In this embodiment, the infrared signals of the two images are preprocessed (mainly correction of the data output directly by the image sensor and removal of defects, such as image sensor responsivity correction, offset correction, dead pixel removal, and noise removal) to obtain an unstretched, clean raw infrared image for each of the two modes. The two raw infrared images are then combined to obtain a high bit-width image.
  • the eighth embodiment of the present application is a computer-readable storage medium on which a computer program is stored.
  • When the computer program is executed by a processor, the infrared image processing method of any one of the above embodiments is implemented. Therefore, the computer-readable storage medium has all the beneficial effects of the infrared image processing method of any of the above-mentioned embodiments.
  • In this application, the term "plurality" refers to two or more, unless specifically defined otherwise.
  • The terms "installed", "connected", "coupled", "fixed", and similar terms should be understood in a broad sense; for example, "connected" can be a fixed connection, a detachable connection, or an integral connection, and it can be a direct connection or an indirect connection through an intermediary.
  • The specific meanings of the above-mentioned terms in this application can be understood according to the specific circumstances.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

An infrared image processing method, a processing device, an unmanned aerial vehicle, and a storage medium. The infrared image processing method includes: collecting a first infrared image of an object to be measured in a first shooting mode, and collecting a second infrared image of the object to be measured in a second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode (102); and performing image fusion on the first infrared image and the second infrared image to generate a third infrared image (104). The method fuses the first infrared image and the second infrared image to generate a third infrared image with a high dynamic range, which satisfies both a high temperature measurement range and high temperature measurement accuracy, so that the resulting high-dynamic-range image can be used directly for temperature measurement.

Description

红外图像处理方法、处理设备、无人飞行器和存储介质 技术领域
本申请涉及图像处理技术领域,具体而言,涉及一种红外图像处理方法、红外图像处理设备、无人飞行器和计算机可读存储介质。
背景技术
红外热成像系统往往设计为两种模式,高增益模式和低增益模式;高增益模式测温范围窄,测温精度高;低增益模式测温范围宽,但测温精度低。因此,无论采用高增益模式采集红外图像还是采用低增益模式采集红外图像,均难以同时兼顾图像的测温范围和测温精度。
因此,如何合理地将高增益模式红外图像和低增益模式红外图像进行融合,以得到同时兼顾测温范围和测温精度的图像成为亟待解决的技术问题。
发明内容
本申请旨在至少解决现有技术或相关技术中存在的技术问题之一。
为此,本申请的第一方面提出了一种红外图像处理方法。
本申请的第二方面提出了一种红外图像处理设备。
本申请的第三方面提出了一种无人飞行器。
本申请的第四方面提出了一种计算机可读存储介质。
有鉴于此,本申请的第一方面提出了一种红外图像处理方法,包括:在第一拍摄模式下采集待测对象的第一红外图像,以及在第二拍摄模式下采集待测对象的第二红外图像,其中第一拍摄模式的图像增益大于第二拍摄模式的图像增益;将第一红外图像和第二红外图像进行图像融合,生成第三红外图像。
本申请的第二方面提出了一种红外图像处理设备,包括:红外图像采集装置,在第一拍摄模式下采集待测对象的第一红外图像,以及在第二拍摄模式下采集待测对象的第二红外图像,其中第一拍摄模式的图像增益大 于第二拍摄模式的图像增益;存储器,存储器存储有计算机程序;处理器,与红外图像采集装置和存储器连接,处理器执行计算机程序时实现:将第一红外图像和第二红外图像进行图像融合,生成第三红外图像。
本申请的第三方面提供了一种无人飞行器,包括:机体;红外图像处理设备,红外图像处理设备包括:红外图像采集装置,在第一拍摄模式下采集待测对象的第一红外图像,以及在第二拍摄模式下采集待测对象的第二红外图像,其中第一拍摄模式的图像增益大于第二拍摄模式的图像增益;存储器,存储器存储有计算机程序;处理器,与红外图像采集装置和存储器连接,处理器执行计算机程序时实现:将第一红外图像和第二红外图像进行图像融合,生成第三红外图像。
本申请的第四方面提供了一种计算机可读存储介质,其上存储有计算机程序,计算机程序被处理器执行时实现如第一方面中任一技术方案的红外图像处理方法,因此,该计算机可读存储介质具有如上述第一方面中任一技术方案的红外图像处理方法的全部有益效果。
本申请提出了一种红外图像处理方法、红外图像处理设备、无人飞行器和计算机可读存储介质。在第一拍摄模式下对待测对象拍摄第一红外图像,以及在第二拍摄模式下对待测对象拍摄第二红外图像。其中,第一拍摄模式的图像增益大于第二拍摄模式的图像增益,第一拍摄模式即高增益模式,第二拍摄模式即低增益模式,在高增益模式采集的第一红外图像的图像测温范围窄、测温精度高,在低增益模式采集的第二红外图像的图像测温范围宽、测温精度低。进一步地,将第一红外图像和第二红外图像进行图像融合,生成高动态范围的第三红外图像,能够同时满足高测温范围和高测温精度,使得得到的高动态范围图像能够直接用于测温,从而获取待测对象的温度。
本申请的附加方面和优点将在下面的描述部分中变得明显,或通过本申请的实践了解到。
附图说明
本申请的上述和/或附加的方面和优点从结合下面附图对实施例的描 述中将变得明显和容易理解,其中:
图1示出了根据本申请的第一个实施例的红外图像处理方法的示意流程图;
图2示出了根据本申请的第二个实施例的红外图像处理方法的示意流程图;
图3示出了根据本申请的第三个实施例的红外图像处理方法的示意流程图;
图4示出了根据本申请的第四个实施例的红外图像处理方法的示意流程图;
图5示出了根据本申请的第五个实施例的红外图像处理方法的示意图;
图6示出了根据本申请的第六个实施例的红外图像处理设备的示意框图;
图7示出了根据本申请的第七个实施例的无人飞行器的示意框图。
具体实施方式
为了能够更清楚地理解本申请的上述目的、特征和优点,下面结合附图和具体实施方式对本申请进行进一步的详细描述。需要说明的是,在不冲突的情况下,本申请的实施例及实施例中的特征可以相互组合。
在下面的描述中阐述了很多具体细节以便于充分理解本申请,但是,本申请还可以采用其他不同于在此描述的其他方式来实施,因此,本申请的保护范围并不受下面公开的具体实施例的限制。
红外成像技术是一种前景广阔的高新技术,通过反映物体表面温度而成像,目前该技术已被广泛应用于各个应用领域中,例如,电力巡检、搜捕搜救、火灾救援、城市空间建模等。本申请提出一种红外图像处理方法、红外图像处理设备、无人飞行器和计算机可读存储介质,将在第一拍摄模式下采集的待测对象的第一红外图像和在第二拍摄模式下采集的待测对象的第二红外图像进行图像融合,生成高动态范围的第三红外图像,以确保红外图像的高测温范围和高测温精度。
本申请实施例中,利用无人飞行器安装红外相机,对场景进行拍摄以获取红外图像,实现无人飞行器航拍或特殊场景下测温、目标追踪、监控等,进而实现在上述应用领域中的应用。
在一些实施例中,在无人飞行器的云台装置上安装红外相机。随着作业场合的多样性,用户对云台相机也具有多样的使用需求,除了红外相机以外,可以是双光相机,即包括红外相机和可见光相机,还可以是三光相机,即包括红外相机、可见光变焦相机和可见光定焦相机,其中三光相机通过将红外相机、可见光变焦相机和可见光定焦相机整合为一体,通过优化内部模组布置,在满足多种场合中的作业需求的同时,三光相机可具有更为小而轻巧的整体模组,进一步提升了三光相机的实用性。
在一些实施例中,不限于无人飞行器,还可以为可安装红外相机或具有红外拍摄功能的监控、探测、航拍等设备,例如直升机、监控摄像头、机器人、消防车、探测器等。
下面参照图1至图7描述根据本申请一些实施例的红外图像处理方法、红外图像处理设备、无人飞行器和计算机可读存储介质。
图1示出了根据本申请的第一个实施例的红外图像处理方法的示意流程图。
如图1所示,第一个实施例的红外图像处理方法包括:
步骤102,在第一拍摄模式下采集待测对象的第一红外图像,以及在第二拍摄模式下采集待测对象的第二红外图像,其中第一拍摄模式的图像增益大于第二拍摄模式的图像增益;
步骤104,将第一红外图像和第二红外图像进行图像融合,生成第三红外图像。
本申请提供的红外图像处理方法,在第一拍摄模式下对待测对象拍摄第一红外图像,以及在第二拍摄模式下对待测对象拍摄第二红外图像。其中,第一拍摄模式的图像增益大于第二拍摄模式的图像增益,第一拍摄模式即高增益模式,第二拍摄模式即低增益模式。进一步地,将第一红外图像和第二红外图像进行图像融合,生成高动态范围的第三红外图像,能够同时满足高测温范围和高测温精度,使得得到的高动态范围图像能够直接 用于测温,从而获取待测对象的温度。
The device that captures the infrared images may be an infrared camera; besides an infrared camera, it may also be a dual-light camera, that is, one including an infrared camera and a visible-light camera, or a triple-light camera, that is, one including an infrared camera, a visible-light zoom camera, and a visible-light fixed-focus camera.
In the above embodiment, the image gain of the first shooting mode being greater than the image gain of the second shooting mode means that the temperature measurement range of the first shooting mode is smaller than or equal to the temperature measurement range of the second shooting mode, and the temperature measurement accuracy of the first shooting mode is greater than or equal to the temperature measurement accuracy of the second shooting mode.
In this embodiment, the temperature measurement range of the infrared image collected in the high-gain mode is narrow, for example the high-gain mode supports a range of -40°C to 120°C, and the temperature measurement accuracy is high, for example ±2°C; the temperature measurement range of the infrared image collected in the low-gain mode is wide, for example the low-gain mode supports a range of -40°C to 550°C, and the temperature measurement accuracy is low, for example ±5°C.
In some embodiments, the difference between the upper limit and the lower limit of the temperature measurement range of the first shooting mode is smaller than or equal to the difference between the upper limit and the lower limit of the temperature measurement range of the second shooting mode. The temperature measurement range of the first shooting mode and that of the second shooting mode may be two non-overlapping intervals; for example, the temperature measurement range of the first shooting mode is -40°C to 50°C and that of the second shooting mode is 60°C to 550°C. They may also be two overlapping intervals; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -100°C to 550°C.
It should be noted that a temperature measurement range may be a closed interval or an open interval.
In any of the above embodiments, the temperature measurement range of the first shooting mode is a subset of the temperature measurement range of the second shooting mode.
In this embodiment, the temperature measurement range of the first shooting mode is a subset of that of the second shooting mode; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -40°C to 550°C, or the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -50°C to 550°C.
In some embodiments, the two modes do not necessarily have to be a high-gain mode and a low-gain mode; they may also be modes dedicated to measuring a particular temperature band, such as high-accuracy measurement of a low-temperature band and high-accuracy measurement of a high-temperature band, which are then combined into one image to obtain an image with a high dynamic range and high temperature measurement accuracy.
FIG. 2 shows a schematic flowchart of the infrared image processing method according to the second embodiment of this application.
As shown in FIG. 2, the infrared image processing method of the second embodiment includes:
Step 202: collecting a first infrared image of an object to be measured in a first shooting mode, and collecting a second infrared image of the object to be measured in a second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode;
Step 204: fusing the pixels of the first infrared image according to a first fusion weight and the pixels of the second infrared image according to a second fusion weight to generate a third infrared image; or fusing a designated region of the first infrared image according to a third fusion weight and a designated region of the second infrared image according to a fourth fusion weight to generate the third infrared image.
In this embodiment, the corresponding pixels of the first infrared image and the second infrared image need to be found for image fusion, or image fusion is performed according to the corresponding regions of the first infrared image and the second infrared image, so as to generate a high-dynamic-range infrared image that satisfies both a wide temperature measurement range and high temperature measurement accuracy.
It should be noted that the weight of each pixel on the same infrared image may be the same or different, and the weight of each pixel within a designated region of the same infrared image may be the same or different. The sum of the fusion weight of the first infrared image and the fusion weight of the second infrared image is 1 or a specified value.
In some embodiments, the pixel value of the third infrared image is calculated from the pixel value of the first infrared image and the pixel value of the second infrared image using formula (1):
P3 = P1 × a + P2 × (1 - a)          (1)
where P3 is the pixel value of the third infrared image, P1 is the pixel value of the first infrared image, P2 is the pixel value of the second infrared image, and a is the fusion weight of the first infrared image.
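As an illustrative, non-limiting sketch of formula (1), the following Python snippet fuses two pixel-aligned raw infrared frames with a single global weight a; the array names, the 14-bit depth, and the example weight are assumptions made for the example and are not values fixed by this application.

```python
import numpy as np

def fuse_infrared(p1: np.ndarray, p2: np.ndarray, a: float) -> np.ndarray:
    """Fuse a high-gain frame p1 and a low-gain frame p2 per formula (1).

    p1, p2: raw infrared frames of identical shape (e.g. 14-bit data stored in uint16).
    a:      global fusion weight of the high-gain image, 0 <= a <= 1.
    """
    if p1.shape != p2.shape:
        raise ValueError("the two frames must be pixel-aligned and equally sized")
    p1f = p1.astype(np.float32)
    p2f = p2.astype(np.float32)
    return p1f * a + p2f * (1.0 - a)   # formula (1), applied per pixel

# Example usage with synthetic 14-bit frames and an assumed weight a = 0.7
high_gain = np.random.randint(0, 2**14, (512, 640), dtype=np.uint16)
low_gain = np.random.randint(0, 2**14, (512, 640), dtype=np.uint16)
fused = fuse_infrared(high_gain, low_gain, a=0.7)
```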
In the above embodiments, the sum of the first fusion weight and the second fusion weight is 1; or the sum of the third fusion weight and the fourth fusion weight is 1.
In any of the above embodiments, the first fusion weight is the same for every pixel of the first infrared image, and the second fusion weight is the same for every pixel of the second infrared image.
In this embodiment, the weight of each pixel on the same infrared image is the same. The weight may be a preset value or may be set according to the shooting parameters, which may be values related to the high gain and the low gain.
In any of the above embodiments, the method further includes: setting the first fusion weight and the second fusion weight according to the ratio relationship between the image gain of the first shooting mode and the image gain of the second shooting mode.
In this embodiment, the weight of each pixel on the same infrared image is the same, so the fusion weights can be set according to the values of the high gain and the low gain, for example according to the ratio relationship between the high gain and the low gain.
It should be noted that the values of the high gain and the low gain can also be adjusted by the user; after a gain value changes, the corresponding weight changes accordingly.
In any of the above embodiments, the method further includes: setting the first fusion weight and the second fusion weight according to the relative relationship between the temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode.
In this embodiment, the weight of each pixel on the same infrared image is the same, so the first fusion weight and the second fusion weight can be determined according to the relative relationship between the temperature measurement range corresponding to the high gain and the temperature measurement range corresponding to the low gain. Different gain values may correspond to different temperature measurement ranges; for example, the high gain may have five levels whose temperature measurement ranges are not exactly the same, and similarly for the low gain.
For example, suppose the high-gain temperature measurement range of the infrared camera is 0°C to 100°C and the low-gain range is 0°C to 500°C. If the maximum temperature of the current scene is 100°C, the high-gain weight can be 1 and the low-gain weight 0; that is, the high-gain mode alone is enough to obtain an infrared image with high accuracy whose temperature measurement range satisfies the actual scene. If the known maximum temperature of the current scene is 300°C, the high-gain mode alone cannot cover the actual temperature measurement range, while the low-gain mode alone gives poor temperature measurement accuracy. Image fusion can therefore be performed, with the high-gain weight set to, for example, 0.7 and the low-gain weight to 0.3, yielding an image with high accuracy (compared with shooting the pixels in the 0°C to 100°C range with the low gain only) and a large temperature measurement range (compared with shooting the pixels in the 0°C to 100°C range with the high gain only).
In any of the above embodiments, the method further includes: receiving a temperature measurement range required by the user; and setting the first fusion weight and the second fusion weight according to the temperature measurement range required by the user.
In this embodiment, the weight of each pixel on the same infrared image is the same, so the weights can be set according to the user's requirement for the temperature measurement range. The weights may be related to the temperature measurement interval in which the user needs to shoot. In some cases, the high-gain and low-gain values adopted by the infrared camera are fixed, that is, the corresponding temperature measurement ranges are fixed; as long as the user specifies the required temperature measurement range, the first fusion weight and the second fusion weight can be calculated according to the user's requirement.
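One possible way to turn the range relationships described above (or a user-specified range) into a concrete global weight is sketched below; the linear interpolation rule, the default range values, and the function name are illustrative assumptions rather than a scheme mandated by this application.

```python
def choose_high_gain_weight(scene_max_c: float,
                            high_range=(0.0, 100.0),
                            low_range=(0.0, 500.0)) -> float:
    """Pick the global fusion weight a of the high-gain image.

    scene_max_c: maximum temperature expected in the scene, or the upper end of
                 the temperature measurement range requested by the user, in °C.
    Returns a in [0, 1]; the low-gain weight is then 1 - a.
    """
    high_top = high_range[1]
    low_top = low_range[1]
    if scene_max_c <= high_top:
        return 1.0        # the high-gain mode alone covers the scene
    if scene_max_c >= low_top:
        return 0.0        # only the low-gain mode covers the scene
    # In between, reduce the high-gain weight as the scene exceeds its range
    # (a simple linear rule chosen purely for illustration).
    return 1.0 - (scene_max_c - high_top) / (low_top - high_top)

# For a scene whose maximum temperature is 300 °C this illustrative rule gives
# a = 0.5; the 0.7 / 0.3 split mentioned above is an equally valid design choice.
print(choose_high_gain_weight(300.0))
```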
In any of the above embodiments, the first fusion weights of the pixels of the first infrared image are not all the same, and the second fusion weights of the pixels of the second infrared image are not all the same.
In any of the above embodiments, the method further includes: setting the first fusion weight and the second fusion weight according to image pixel information, where the image pixel information includes one or a combination of the following: pixel texture, pixel signal-to-noise ratio, pixel information content, and pixel temperature value.
In this embodiment, the fusion weights of the pixels of the same infrared image are not all the same; the weight of each pixel can be determined according to its pixel texture, pixel signal-to-noise ratio, or pixel information content, or according to the interval in which the temperature value of the pixel falls.
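Purely as an illustration of pixel-information-driven weights, the sketch below derives a per-pixel weight map from a simple gradient (texture) measure of the high-gain frame; the normalization and the choice of texture as the driving statistic are assumptions, and any of the other listed cues (signal-to-noise ratio, information content, temperature interval) could be substituted.

```python
import numpy as np

def texture_weight_map(p1: np.ndarray) -> np.ndarray:
    """Per-pixel fusion weight for the high-gain frame from a gradient measure."""
    g = p1.astype(np.float32)
    gy, gx = np.gradient(g)
    grad = np.abs(gx) + np.abs(gy)
    # Map stronger local texture to a larger high-gain weight, normalized to [0, 1].
    return grad / (grad.max() + 1e-6)

# The map can then be used pixel-wise in formula (1):
# p3 = p1 * w + p2 * (1 - w), with w = texture_weight_map(p1)
```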
In any of the above embodiments, the third fusion weight is the same for the pixels within the same designated region of the first infrared image, and the fourth fusion weight is the same for the pixels within the same designated region of the second infrared image.
In any of the above embodiments, the method further includes: setting the third fusion weight and the fourth fusion weight according to designated-region information, where the designated-region information includes one or a combination of the following: the information content of the pixels in the region, the temperature variation range of the region, the temperature gradient value of the region, and the signal-to-noise ratio of the region.
In this embodiment, the weight of each pixel within a designated region of the same infrared image is the same; the information content, temperature variation range, gradient value, signal-to-noise ratio, and the like of the pixels in each region are gathered statistically to determine the weight of the pixels in that region.
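The following sketch illustrates one possible region-wise variant: the image is split into tiles, a per-tile weight is derived from a simple region statistic (here, the local value spread of the high-gain frame), and each tile is fused with its own weight. The tile size, the statistic, and the mapping from statistic to weight are assumptions made for illustration only.

```python
import numpy as np

def fuse_by_regions(p1: np.ndarray, p2: np.ndarray, tile: int = 64) -> np.ndarray:
    """Fuse two aligned frames tile by tile with per-region weights."""
    out = np.zeros(p1.shape, dtype=np.float32)
    h, w = p1.shape
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            r1 = p1[y:y + tile, x:x + tile].astype(np.float32)
            r2 = p2[y:y + tile, x:x + tile].astype(np.float32)
            # Illustrative statistic: value spread of the high-gain tile,
            # normalized by an assumed 14-bit full scale.
            spread = (r1.max() - r1.min()) / float(2**14)
            a = np.clip(1.0 - spread, 0.0, 1.0)   # flatter regions lean on the high gain
            out[y:y + tile, x:x + tile] = r1 * a + r2 * (1.0 - a)
    return out
```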
FIG. 3 shows a schematic flowchart of the infrared image processing method according to the third embodiment of this application.
As shown in FIG. 3, the infrared image processing method of the third embodiment includes:
Step 302: collecting a first infrared image of an object to be measured in a first shooting mode, and collecting a second infrared image of the object to be measured in a second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode;
Step 304: performing image fusion on the first infrared image and the second infrared image to generate a third infrared image;
Step 306: performing designated processing on the third infrared image, where the designated processing includes one or a combination of the following: global stretch enhancement, local stretch enhancement, detail enhancement, and pseudo-color mapping.
In this embodiment, operations such as global stretch enhancement, local stretch enhancement, detail enhancement, and pseudo-color mapping are performed on the third infrared image to obtain a high-dynamic-range viewing image, so as to realize high-precision image viewing.
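As one hedged example of such designated processing, the snippet below globally stretches the fused high-bit-width image to 8 bits and applies a pseudo-color map using OpenCV; the percentile clipping and the choice of colormap are assumptions made for illustration, not steps required by this application.

```python
import cv2
import numpy as np

def stretch_and_colorize(fused: np.ndarray,
                         low_pct: float = 1.0,
                         high_pct: float = 99.0) -> np.ndarray:
    """Globally stretch a fused infrared frame and map it to pseudo-color."""
    lo, hi = np.percentile(fused, [low_pct, high_pct])
    stretched = np.clip((fused - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    gray8 = (stretched * 255.0).astype(np.uint8)        # global stretch to 8 bits
    return cv2.applyColorMap(gray8, cv2.COLORMAP_JET)   # pseudo-color mapping
```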
FIG. 4 shows a schematic flowchart of the infrared image processing method according to the fourth embodiment of this application.
As shown in FIG. 4, the infrared image processing method of the fourth embodiment includes:
Step 402: collecting a first infrared image of an object to be measured in a first shooting mode, and collecting a second infrared image of the object to be measured in a second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode;
Step 404: performing image preprocessing on the first infrared image and the second infrared image, where the image preprocessing includes one or a combination of the following: image correction, bad-pixel removal, and noise removal;
Step 406: performing image fusion on the first infrared image and the second infrared image to generate a third infrared image.
In this embodiment, the infrared signals of the two images are preprocessed (mainly correction of the data read directly from the image sensor and removal of defects, for example image-sensor responsivity correction, offset correction, bad-pixel removal, and noise removal) to obtain clean, unstretched raw infrared images, one for each of the two modes. Further, the two raw infrared images are combined to obtain a high-bit-width image.
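A minimal sketch of such a preprocessing chain is given below, assuming a gain/offset non-uniformity correction and a median-based bad-pixel replacement; the calibration arrays, the mask, and the filter size are placeholders introduced for the example, not parameters defined by this application.

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess_raw(frame: np.ndarray,
                   gain_map: np.ndarray,
                   offset_map: np.ndarray,
                   bad_pixel_mask: np.ndarray) -> np.ndarray:
    """Correct one raw infrared frame before fusion."""
    corrected = frame.astype(np.float32) * gain_map + offset_map  # responsivity / offset correction
    smoothed = median_filter(corrected, size=3)                   # neighborhood estimate
    corrected[bad_pixel_mask] = smoothed[bad_pixel_mask]          # bad-pixel replacement
    return corrected                                              # further denoising could follow here
```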
In some embodiments, designated processing may also be performed on the generated third infrared image, where the designated processing includes one or a combination of the following: global stretch enhancement, local stretch enhancement, detail enhancement, and pseudo-color mapping.
FIG. 5 shows a schematic diagram of the infrared image processing method according to the fifth embodiment of this application.
As shown in FIG. 5, the infrared image processing method of the fifth embodiment includes:
An infrared lens and an image sensor are used to photograph external objects; specifically, one shot is taken in the high-gain mode and one in the low-gain mode, yielding two infrared images. The high-gain-mode image has a narrow temperature measurement range and high temperature measurement accuracy; the low-gain image has a wide temperature measurement range and low temperature measurement accuracy.
The infrared signals of the two infrared images are preprocessed, mainly by correcting the data read directly from the sensor and removing defects, for example sensor responsivity correction, offset correction, bad-pixel removal, and noise removal, to obtain clean, unstretched raw infrared images, one for each mode, both with a bit depth of N bits, for example N = 14.
The two raw infrared images are fused to obtain a high-bit-width image; the combined image is N + M bits, with M > 0, for example N = 14 and M = 6. There are various fusion methods, for example P3 = P1 × a + P2 × (1 - a), where P1 is a high-gain pixel, P2 is a low-gain pixel, and a is the fusion coefficient. The coefficient a can be a globally fixed value or can be adapted according to the range in which a pixel lies; for example, when a high-gain pixel value is saturated, a = 0 and the low-gain pixel can be used entirely.
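A hedged sketch of this adaptive variant is shown below: the weight drops to 0 wherever the high-gain pixel is saturated, and the result is scaled into an N + M bit container. The saturation threshold, the scaling, and the default a follow the example figures above and are otherwise assumptions for illustration.

```python
import numpy as np

N, M = 14, 6                      # example bit depths from the description
SAT = 2**N - 1                    # assumed saturation level of the high-gain frame

def fuse_adaptive(p1: np.ndarray, p2: np.ndarray, a_global: float = 0.7) -> np.ndarray:
    """Per-pixel fusion: set the high-gain weight to 0 where p1 is saturated."""
    a = np.full(p1.shape, a_global, dtype=np.float32)
    a[p1 >= SAT] = 0.0                                    # saturated high-gain pixels use low gain only
    fused = p1.astype(np.float32) * a + p2.astype(np.float32) * (1.0 - a)
    # Scale into an N + M bit integer container for the high-bit-width output.
    scaled = fused * (2**M)
    return np.clip(scaled, 0, 2**(N + M) - 1).astype(np.uint32)
```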
A high-dynamic-range temperature measurement image is output, achieving a wide temperature range and high temperature measurement accuracy; the high-bit-width image can be used directly for temperature measurement.
Global stretch enhancement, local stretch enhancement, detail enhancement, pseudo-color mapping, and other operations (or only some of them) are performed on the high-bit-width image, and a high-dynamic-range viewing image is output to realize image viewing.
FIG. 6 shows a schematic block diagram of the infrared image processing device according to the sixth embodiment of this application.
As shown in FIG. 6, the infrared image processing device 600 of the sixth embodiment includes:
an infrared image acquisition device 602, which collects a first infrared image of an object to be measured in a first shooting mode and collects a second infrared image of the object to be measured in a second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode;
a memory 604, which stores a computer program;
a processor 606, connected to the infrared image acquisition device 602 and the memory 604, where the processor 606, when executing the computer program, performs image fusion on the first infrared image and the second infrared image to generate a third infrared image.
In the infrared image processing device 600 provided by this application, the infrared image acquisition device 602 captures a first infrared image of the object to be measured in the first shooting mode and a second infrared image of the object to be measured in the second shooting mode. The image gain of the first shooting mode is greater than the image gain of the second shooting mode; that is, the first shooting mode is a high-gain mode and the second shooting mode is a low-gain mode. Further, the processor 606 performs image fusion on the first infrared image and the second infrared image to generate a high-dynamic-range third infrared image, which satisfies both a wide temperature measurement range and high temperature measurement accuracy, so that the obtained high-dynamic-range image can be used directly for temperature measurement to obtain the temperature of the object to be measured.
In some embodiments, the infrared image acquisition device 602 is an infrared camera, and the infrared image acquisition device 602 includes an infrared thermal imaging lens and an image sensor. Besides an infrared camera, it may also be a dual-light camera, that is, one including an infrared camera and a visible-light camera, or a triple-light camera, that is, one including an infrared camera, a visible-light zoom camera, and a visible-light fixed-focus camera. The processor 606 is an image processor.
In the above embodiment, the image gain of the first shooting mode being greater than the image gain of the second shooting mode means that the temperature measurement range of the first shooting mode is smaller than or equal to the temperature measurement range of the second shooting mode, and the temperature measurement accuracy of the first shooting mode is greater than or equal to the temperature measurement accuracy of the second shooting mode. In this embodiment, the temperature measurement range of the infrared image collected in the high-gain mode is narrow, for example -40°C to 120°C, and the temperature measurement accuracy is high, for example ±2°C; the temperature measurement range of the infrared image collected in the low-gain mode is wide, for example -40°C to 550°C, and the temperature measurement accuracy is low, for example ±5°C.
In some embodiments, the difference between the upper limit and the lower limit of the temperature measurement range of the first shooting mode is smaller than or equal to the difference between the upper limit and the lower limit of the temperature measurement range of the second shooting mode. The two ranges may be non-overlapping intervals, for example -40°C to 50°C for the first shooting mode and 60°C to 550°C for the second shooting mode, or overlapping intervals, for example -40°C to 120°C for the first shooting mode and -100°C to 550°C for the second shooting mode.
It should be noted that a temperature measurement range may be a closed interval or an open interval.
In any of the above embodiments, the temperature measurement range of the first shooting mode is a subset of the temperature measurement range of the second shooting mode; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -40°C to 550°C, or the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -50°C to 550°C.
In any of the above embodiments, the processor 606, when executing the computer program, further performs designated processing on the third infrared image, where the designated processing includes one or a combination of the following: global stretch enhancement, local stretch enhancement, detail enhancement, and pseudo-color mapping. In this embodiment, these operations are performed on the third infrared image to obtain a high-dynamic-range viewing image, so as to realize high-precision image viewing.
In any of the above embodiments, the processor 606 performing image fusion on the first infrared image and the second infrared image to generate the third infrared image specifically includes: fusing the pixels of the first infrared image according to a first fusion weight and the pixels of the second infrared image according to a second fusion weight to generate the third infrared image; or fusing a designated region of the first infrared image according to a third fusion weight and a designated region of the second infrared image according to a fourth fusion weight to generate the third infrared image. In this embodiment, the corresponding pixels of the first infrared image and the second infrared image need to be found for image fusion, or image fusion is performed according to the corresponding regions of the two images, so as to generate a high-dynamic-range infrared image that satisfies both a wide temperature measurement range and high temperature measurement accuracy.
It should be noted that the weight of each pixel on the same infrared image may be the same or different, and the weight of each pixel within a designated region of the same infrared image may be the same or different. The sum of the fusion weight of the first infrared image and the fusion weight of the second infrared image is 1 or a specified value.
In some embodiments, the pixel value of the third infrared image is calculated from the pixel value of the first infrared image and the pixel value of the second infrared image using formula (1):
P3 = P1 × a + P2 × (1 - a)         (1)
where P3 is the pixel value of the third infrared image, P1 is the pixel value of the first infrared image, P2 is the pixel value of the second infrared image, and a is the fusion weight of the first infrared image.
In the above embodiments, the sum of the first fusion weight and the second fusion weight is 1; or the sum of the third fusion weight and the fourth fusion weight is 1.
In any of the above embodiments, the first fusion weight is the same for every pixel of the first infrared image, and the second fusion weight is the same for every pixel of the second infrared image. In this embodiment, the weight of each pixel on the same infrared image is the same; the weight may be a preset value or may be set according to the shooting parameters, which may be values related to the high gain and the low gain.
In any of the above embodiments, the processor 606, when executing the computer program, further sets the first fusion weight and the second fusion weight according to the ratio relationship between the image gain of the first shooting mode and the image gain of the second shooting mode. In this embodiment, the weight of each pixel on the same infrared image is the same, so the fusion weights can be set according to the values of the high gain and the low gain, for example according to their ratio relationship.
It should be noted that the values of the high gain and the low gain can also be adjusted by the user; after a gain value changes, the corresponding weight changes accordingly.
In any of the above embodiments, the processor 606, when executing the computer program, further sets the first fusion weight and the second fusion weight according to the relative relationship between the temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode. In this embodiment, the weight of each pixel on the same infrared image is the same, so the weights can be determined according to the relative relationship between the temperature measurement range corresponding to the high gain and that corresponding to the low gain. Different gain values may correspond to different temperature measurement ranges; for example, the high gain may have five levels whose temperature measurement ranges are not exactly the same, and similarly for the low gain.
For example, suppose the high-gain temperature measurement range of the infrared camera is 0°C to 100°C and the low-gain range is 0°C to 500°C. If the maximum temperature of the current scene is 100°C, the high-gain weight can be 1 and the low-gain weight 0; that is, the high-gain mode alone is enough to obtain an infrared image with high accuracy whose temperature measurement range satisfies the actual scene. If the known maximum temperature of the current scene is 300°C, the high-gain mode alone cannot cover the actual temperature measurement range, while the low-gain mode alone gives poor temperature measurement accuracy. Image fusion can therefore be performed, with the high-gain weight set to, for example, 0.7 and the low-gain weight to 0.3, yielding an image with high accuracy (compared with shooting the pixels in the 0°C to 100°C range with the low gain only) and a large temperature measurement range (compared with shooting the pixels in the 0°C to 100°C range with the high gain only).
In any of the above embodiments, the processor 606, when executing the computer program, further receives a temperature measurement range required by the user and sets the first fusion weight and the second fusion weight according to the temperature measurement range required by the user. In this embodiment, the weight of each pixel on the same infrared image is the same, so the weights can be set according to the user's requirement for the temperature measurement range. The weights may be related to the temperature measurement interval in which the user needs to shoot. In some cases, the high-gain and low-gain values adopted by the infrared camera are fixed, that is, the corresponding temperature measurement ranges are fixed; as long as the user specifies the required temperature measurement range, the first fusion weight and the second fusion weight can be calculated according to the user's requirement.
In any of the above embodiments, the first fusion weights of the pixels of the first infrared image are not all the same, and the second fusion weights of the pixels of the second infrared image are not all the same.
In any of the above embodiments, the processor 606, when executing the computer program, further sets the first fusion weight and the second fusion weight according to image pixel information, where the image pixel information includes one or a combination of the following: pixel texture, pixel signal-to-noise ratio, pixel information content, and pixel temperature value. In this embodiment, the fusion weights of the pixels of the same infrared image are not all the same; the weight of each pixel can be determined according to its texture, signal-to-noise ratio, or information content, or according to the interval in which its temperature value falls.
In any of the above embodiments, the third fusion weight is the same for the pixels within the same designated region of the first infrared image, and the fourth fusion weight is the same for the pixels within the same designated region of the second infrared image.
In any of the above embodiments, the processor 606, when executing the computer program, further sets the third fusion weight and the fourth fusion weight according to designated-region information, where the designated-region information includes one or a combination of the following: the information content of the pixels in the region, the temperature variation range of the region, the temperature gradient value of the region, and the signal-to-noise ratio of the region. In this embodiment, the weight of each pixel within a designated region of the same infrared image is the same; the information content, temperature variation range, gradient value, signal-to-noise ratio, and the like of the pixels in each region are gathered statistically to determine the weight of the pixels in that region.
In any of the above embodiments, the processor 606, when executing the computer program, further performs image preprocessing on the first infrared image and the second infrared image, where the image preprocessing includes one or a combination of the following: image correction, bad-pixel removal, and noise removal. In this embodiment, the infrared signals of the two images are preprocessed (mainly correction of the data read directly from the image sensor and removal of defects, for example image-sensor responsivity correction, offset correction, bad-pixel removal, and noise removal) to obtain clean, unstretched raw infrared images, one for each of the two modes. Further, the two raw infrared images are combined to obtain a high-bit-width image.
FIG. 7 shows a schematic block diagram of the unmanned aerial vehicle according to the seventh embodiment of this application.
As shown in FIG. 7, the unmanned aerial vehicle 700 of the seventh embodiment includes:
a body 702;
an infrared image processing device 704, where the infrared image processing device 704 includes:
an infrared image acquisition device 7042, which collects a first infrared image of an object to be measured in a first shooting mode and collects a second infrared image of the object to be measured in a second shooting mode, where the image gain of the first shooting mode is greater than the image gain of the second shooting mode;
a memory 7044, which stores a computer program;
a processor 7046, connected to the infrared image acquisition device and the memory, where the processor, when executing the computer program, performs image fusion on the first infrared image and the second infrared image to generate a third infrared image.
In the unmanned aerial vehicle 700 provided by this application, the infrared image acquisition device 7042 captures a first infrared image of the object to be measured in the first shooting mode and a second infrared image of the object to be measured in the second shooting mode. The image gain of the first shooting mode is greater than the image gain of the second shooting mode; that is, the first shooting mode is a high-gain mode and the second shooting mode is a low-gain mode. Further, the processor 7046 performs image fusion on the first infrared image and the second infrared image to generate a high-dynamic-range third infrared image, which satisfies both a wide temperature measurement range and high temperature measurement accuracy, so that the obtained high-dynamic-range image can be used directly for temperature measurement to obtain the temperature of the object to be measured.
The infrared image acquisition device 7042 may be a gimbal camera mounted on a gimbal apparatus of the unmanned aerial vehicle, and includes an infrared thermal imaging lens and an image sensor. Besides an infrared camera, it may also be a dual-light camera, that is, one including an infrared camera and a visible-light camera, or a triple-light camera, that is, one including an infrared camera, a visible-light zoom camera, and a visible-light fixed-focus camera. The processor 7046 is an image processor.
In the above embodiment, the image gain of the first shooting mode being greater than the image gain of the second shooting mode means that the temperature measurement range of the first shooting mode is smaller than or equal to the temperature measurement range of the second shooting mode, and the temperature measurement accuracy of the first shooting mode is greater than or equal to the temperature measurement accuracy of the second shooting mode. In this embodiment, the temperature measurement range of the infrared image collected in the high-gain mode is narrow, for example -40°C to 120°C, and the temperature measurement accuracy is high, for example ±2°C; the temperature measurement range of the infrared image collected in the low-gain mode is wide, for example -40°C to 550°C, and the temperature measurement accuracy is low, for example ±5°C.
In some embodiments, the difference between the upper limit and the lower limit of the temperature measurement range of the first shooting mode is smaller than or equal to the difference between the upper limit and the lower limit of the temperature measurement range of the second shooting mode. The two ranges may be non-overlapping intervals, for example -40°C to 50°C for the first shooting mode and 60°C to 550°C for the second shooting mode, or overlapping intervals, for example -40°C to 120°C for the first shooting mode and -100°C to 550°C for the second shooting mode.
It should be noted that a temperature measurement range may be a closed interval or an open interval.
In any of the above embodiments, the temperature measurement range of the first shooting mode is a subset of the temperature measurement range of the second shooting mode; for example, the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -40°C to 550°C, or the temperature measurement range of the first shooting mode is -40°C to 120°C and that of the second shooting mode is -50°C to 550°C.
In any of the above embodiments, the processor 7046, when executing the computer program, further performs designated processing on the third infrared image, where the designated processing includes one or a combination of the following: global stretch enhancement, local stretch enhancement, detail enhancement, and pseudo-color mapping. In this embodiment, these operations are performed on the third infrared image to obtain a high-dynamic-range viewing image, so as to realize high-precision image viewing.
In any of the above embodiments, the processor 7046 performing image fusion on the first infrared image and the second infrared image to generate the third infrared image specifically includes: fusing the pixels of the first infrared image according to a first fusion weight and the pixels of the second infrared image according to a second fusion weight to generate the third infrared image; or fusing a designated region of the first infrared image according to a third fusion weight and a designated region of the second infrared image according to a fourth fusion weight to generate the third infrared image. In this embodiment, the corresponding pixels of the first infrared image and the second infrared image need to be found for image fusion, or image fusion is performed according to the corresponding regions of the two images, so as to generate a high-dynamic-range infrared image that satisfies both a wide temperature measurement range and high temperature measurement accuracy.
It should be noted that the weight of each pixel on the same infrared image may be the same or different, and the weight of each pixel within a designated region of the same infrared image may be the same or different. The sum of the fusion weight of the first infrared image and the fusion weight of the second infrared image is 1 or a specified value.
In some embodiments, the pixel value of the third infrared image is calculated from the pixel value of the first infrared image and the pixel value of the second infrared image using formula (1):
P3 = P1 × a + P2 × (1 - a)         (1)
where P3 is the pixel value of the third infrared image, P1 is the pixel value of the first infrared image, P2 is the pixel value of the second infrared image, and a is the fusion weight of the first infrared image.
In the above embodiments, the sum of the first fusion weight and the second fusion weight is 1; or the sum of the third fusion weight and the fourth fusion weight is 1.
In any of the above embodiments, the first fusion weight is the same for every pixel of the first infrared image, and the second fusion weight is the same for every pixel of the second infrared image. In this embodiment, the weight of each pixel on the same infrared image is the same; the weight may be a preset value or may be set according to the shooting parameters, which may be values related to the high gain and the low gain.
In any of the above embodiments, the processor 7046, when executing the computer program, further sets the first fusion weight and the second fusion weight according to the ratio relationship between the image gain of the first shooting mode and the image gain of the second shooting mode. In this embodiment, the weight of each pixel on the same infrared image is the same, so the fusion weights can be set according to the values of the high gain and the low gain, for example according to their ratio relationship.
It should be noted that the values of the high gain and the low gain can also be adjusted by the user; after a gain value changes, the corresponding weight changes accordingly.
In any of the above embodiments, the processor 7046, when executing the computer program, further sets the first fusion weight and the second fusion weight according to the relative relationship between the temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode. In this embodiment, the weight of each pixel on the same infrared image is the same, so the weights can be determined according to the relative relationship between the temperature measurement range corresponding to the high gain and that corresponding to the low gain. Different gain values may correspond to different temperature measurement ranges; for example, the high gain may have five levels whose temperature measurement ranges are not exactly the same, and similarly for the low gain.
For example, suppose the high-gain temperature measurement range of the infrared camera is 0°C to 100°C and the low-gain range is 0°C to 500°C. If the maximum temperature of the current scene is 100°C, the high-gain weight can be 1 and the low-gain weight 0; that is, the high-gain mode alone is enough to obtain an infrared image with high accuracy whose temperature measurement range satisfies the actual scene. If the known maximum temperature of the current scene is 300°C, the high-gain mode alone cannot cover the actual temperature measurement range, while the low-gain mode alone gives poor temperature measurement accuracy. Image fusion can therefore be performed, with the high-gain weight set to, for example, 0.7 and the low-gain weight to 0.3, yielding an image with high accuracy (compared with shooting the pixels in the 0°C to 100°C range with the low gain only) and a large temperature measurement range (compared with shooting the pixels in the 0°C to 100°C range with the high gain only).
In any of the above embodiments, the processor 7046, when executing the computer program, further receives a temperature measurement range required by the user and sets the first fusion weight and the second fusion weight according to the temperature measurement range required by the user. In this embodiment, the weight of each pixel on the same infrared image is the same, so the weights can be set according to the user's requirement for the temperature measurement range. The weights may be related to the temperature measurement interval in which the user needs to shoot. In some cases, the high-gain and low-gain values adopted by the infrared camera are fixed, that is, the corresponding temperature measurement ranges are fixed; as long as the user specifies the required temperature measurement range, the first fusion weight and the second fusion weight can be calculated according to the user's requirement.
In any of the above embodiments, the first fusion weights of the pixels of the first infrared image are not all the same, and the second fusion weights of the pixels of the second infrared image are not all the same.
In any of the above embodiments, the processor 7046, when executing the computer program, further sets the first fusion weight and the second fusion weight according to image pixel information, where the image pixel information includes one or a combination of the following: pixel texture, pixel signal-to-noise ratio, pixel information content, and pixel temperature value. In this embodiment, the fusion weights of the pixels of the same infrared image are not all the same; the weight of each pixel can be determined according to its texture, signal-to-noise ratio, or information content, or according to the interval in which its temperature value falls.
In any of the above embodiments, the third fusion weight is the same for the pixels within the same designated region of the first infrared image, and the fourth fusion weight is the same for the pixels within the same designated region of the second infrared image.
In any of the above embodiments, the processor 7046, when executing the computer program, further sets the third fusion weight and the fourth fusion weight according to designated-region information, where the designated-region information includes one or a combination of the following: the information content of the pixels in the region, the temperature variation range of the region, the temperature gradient value of the region, and the signal-to-noise ratio of the region. In this embodiment, the weight of each pixel within a designated region of the same infrared image is the same; the information content, temperature variation range, gradient value, signal-to-noise ratio, and the like of the pixels in each region are gathered statistically to determine the weight of the pixels in that region.
In any of the above embodiments, the processor 7046, when executing the computer program, further performs image preprocessing on the first infrared image and the second infrared image, where the image preprocessing includes one or a combination of the following: image correction, bad-pixel removal, and noise removal. In this embodiment, the infrared signals of the two images are preprocessed (mainly correction of the data read directly from the image sensor and removal of defects, for example image-sensor responsivity correction, offset correction, bad-pixel removal, and noise removal) to obtain clean, unstretched raw infrared images, one for each of the two modes. Further, the two raw infrared images are combined to obtain a high-bit-width image.
The eighth embodiment of this application is a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the infrared image processing method of any of the above embodiments is implemented. Therefore, the computer-readable storage medium has all the beneficial effects of the infrared image processing method of any of the above embodiments.
In this application, the term "plurality" means two or more, unless otherwise expressly defined. Terms such as "mount", "interconnect", "connect", and "fix" should be understood in a broad sense; for example, "connect" may be a fixed connection, a detachable connection, or an integral connection, and "interconnect" may be a direct connection or an indirect connection through an intermediate medium. For those of ordinary skill in the art, the specific meanings of the above terms in this application can be understood according to the specific circumstances.
In the description of this specification, the terms "one embodiment", "some embodiments", "specific embodiments", and the like mean that specific features, structures, materials, or characteristics described in conjunction with the embodiment or example are included in at least one embodiment or example of this application. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
The above descriptions are only preferred embodiments of this application and are not intended to limit this application; for those skilled in the art, this application may have various modifications and changes. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall be included in the protection scope of this application.

Claims (46)

  1. An infrared image processing method, comprising:
    collecting a first infrared image of an object to be measured in a first shooting mode, and collecting a second infrared image of the object to be measured in a second shooting mode, wherein an image gain of the first shooting mode is greater than an image gain of the second shooting mode; and
    performing image fusion on the first infrared image and the second infrared image to generate a third infrared image.
  2. The infrared image processing method according to claim 1, wherein
    the image gain of the first shooting mode being greater than the image gain of the second shooting mode means that a temperature measurement range of the first shooting mode is smaller than or equal to a temperature measurement range of the second shooting mode, and a temperature measurement accuracy of the first shooting mode is greater than or equal to a temperature measurement accuracy of the second shooting mode.
  3. The infrared image processing method according to claim 2, wherein
    the temperature measurement range of the first shooting mode is a subset of the temperature measurement range of the second shooting mode.
  4. The infrared image processing method according to claim 1, further comprising:
    performing designated processing on the third infrared image,
    wherein the designated processing comprises one or a combination of the following: global stretch enhancement, local stretch enhancement, detail enhancement, and pseudo-color mapping.
  5. The infrared image processing method according to any one of claims 1 to 4, wherein the step of performing image fusion on the first infrared image and the second infrared image to generate a third infrared image specifically comprises:
    fusing pixels of the first infrared image according to a first fusion weight and pixels of the second infrared image according to a second fusion weight to generate the third infrared image; or
    fusing a designated region of the first infrared image according to a third fusion weight and a designated region of the second infrared image according to a fourth fusion weight to generate the third infrared image.
  6. The infrared image processing method according to claim 5, wherein
    a sum of the first fusion weight and the second fusion weight is 1; or
    a sum of the third fusion weight and the fourth fusion weight is 1.
  7. The infrared image processing method according to claim 5, wherein
    the first fusion weight is the same for every pixel of the first infrared image, and the second fusion weight is the same for every pixel of the second infrared image.
  8. The infrared image processing method according to claim 7, further comprising:
    setting the first fusion weight and the second fusion weight according to a ratio relationship between the image gain of the first shooting mode and the image gain of the second shooting mode.
  9. The infrared image processing method according to claim 7, further comprising:
    setting the first fusion weight and the second fusion weight according to a relative relationship between the temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode.
  10. The infrared image processing method according to claim 7, further comprising:
    receiving a temperature measurement range required by a user; and
    setting the first fusion weight and the second fusion weight according to the temperature measurement range required by the user.
  11. The infrared image processing method according to claim 5, wherein
    the first fusion weights of the pixels of the first infrared image are not all the same, and the second fusion weights of the pixels of the second infrared image are not all the same.
  12. The infrared image processing method according to claim 11, further comprising:
    setting the first fusion weight and the second fusion weight according to image pixel information,
    wherein the image pixel information comprises one or a combination of the following: pixel texture, pixel signal-to-noise ratio, pixel information content, and pixel temperature value.
  13. The infrared image processing method according to claim 5, wherein
    the third fusion weight is the same for pixels within a same designated region of the first infrared image, and the fourth fusion weight is the same for pixels within a same designated region of the second infrared image.
  14. The infrared image processing method according to claim 13, further comprising:
    setting the third fusion weight and the fourth fusion weight according to designated-region information,
    wherein the designated-region information comprises one or a combination of the following: information content of pixels in the region, temperature variation range of the region, temperature gradient value of the region, and signal-to-noise ratio of the region.
  15. The infrared image processing method according to any one of claims 1 to 4, further comprising, before the step of performing image fusion on the first infrared image and the second infrared image to generate a third infrared image:
    performing image preprocessing on the first infrared image and the second infrared image,
    wherein the image preprocessing comprises one or a combination of the following: image correction, bad-pixel removal, and noise removal.
  16. An infrared image processing device, comprising:
    an infrared image acquisition device configured to collect a first infrared image of an object to be measured in a first shooting mode and collect a second infrared image of the object to be measured in a second shooting mode, wherein an image gain of the first shooting mode is greater than an image gain of the second shooting mode;
    a memory storing a computer program; and
    a processor connected to the infrared image acquisition device and the memory, wherein the processor, when executing the computer program, implements:
    performing image fusion on the first infrared image and the second infrared image to generate a third infrared image.
  17. The infrared image processing device according to claim 16, wherein
    the image gain of the first shooting mode being greater than the image gain of the second shooting mode means that a temperature measurement range of the first shooting mode is smaller than or equal to a temperature measurement range of the second shooting mode, and a temperature measurement accuracy of the first shooting mode is greater than or equal to a temperature measurement accuracy of the second shooting mode.
  18. The infrared image processing device according to claim 17, wherein
    the temperature measurement range of the first shooting mode is a subset of the temperature measurement range of the second shooting mode.
  19. The infrared image processing device according to claim 16, wherein the processor, when executing the computer program, further implements:
    performing designated processing on the third infrared image,
    wherein the designated processing comprises one or a combination of the following: global stretch enhancement, local stretch enhancement, detail enhancement, and pseudo-color mapping.
  20. The infrared image processing device according to any one of claims 16 to 19, wherein the processor performing image fusion on the first infrared image and the second infrared image to generate a third infrared image specifically comprises:
    fusing pixels of the first infrared image according to a first fusion weight and pixels of the second infrared image according to a second fusion weight to generate the third infrared image; or
    fusing a designated region of the first infrared image according to a third fusion weight and a designated region of the second infrared image according to a fourth fusion weight to generate the third infrared image.
  21. The infrared image processing device according to claim 20, wherein
    a sum of the first fusion weight and the second fusion weight is 1; or
    a sum of the third fusion weight and the fourth fusion weight is 1.
  22. The infrared image processing device according to claim 20, wherein
    the first fusion weight is the same for every pixel of the first infrared image, and the second fusion weight is the same for every pixel of the second infrared image.
  23. The infrared image processing device according to claim 22, wherein the processor, when executing the computer program, further implements:
    setting the first fusion weight and the second fusion weight according to a ratio relationship between the image gain of the first shooting mode and the image gain of the second shooting mode.
  24. The infrared image processing device according to claim 22, wherein the processor, when executing the computer program, further implements:
    setting the first fusion weight and the second fusion weight according to a relative relationship between the temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode.
  25. The infrared image processing device according to claim 22, wherein the processor, when executing the computer program, further implements:
    receiving a temperature measurement range required by a user; and
    setting the first fusion weight and the second fusion weight according to the temperature measurement range required by the user.
  26. The infrared image processing device according to claim 20, wherein
    the first fusion weights of the pixels of the first infrared image are not all the same, and the second fusion weights of the pixels of the second infrared image are not all the same.
  27. The infrared image processing device according to claim 26, wherein the processor, when executing the computer program, further implements:
    setting the first fusion weight and the second fusion weight according to image pixel information,
    wherein the image pixel information comprises one or a combination of the following: pixel texture, pixel signal-to-noise ratio, pixel information content, and pixel temperature value.
  28. The infrared image processing device according to claim 20, wherein
    the third fusion weight is the same for pixels within a same designated region of the first infrared image, and the fourth fusion weight is the same for pixels within a same designated region of the second infrared image.
  29. The infrared image processing device according to claim 28, wherein the processor, when executing the computer program, further implements:
    setting the third fusion weight and the fourth fusion weight according to designated-region information,
    wherein the designated-region information comprises one or a combination of the following: information content of pixels in the region, temperature variation range of the region, temperature gradient value of the region, and signal-to-noise ratio of the region.
  30. The infrared image processing device according to any one of claims 16 to 19, wherein the processor, when executing the computer program, further implements:
    performing image preprocessing on the first infrared image and the second infrared image,
    wherein the image preprocessing comprises one or a combination of the following: image correction, bad-pixel removal, and noise removal.
  31. An unmanned aerial vehicle, comprising:
    a body; and
    an infrared image processing device, wherein the infrared image processing device comprises:
    an infrared image acquisition device configured to collect a first infrared image of an object to be measured in a first shooting mode and collect a second infrared image of the object to be measured in a second shooting mode, wherein an image gain of the first shooting mode is greater than an image gain of the second shooting mode;
    a memory storing a computer program; and
    a processor connected to the infrared image acquisition device and the memory, wherein the processor, when executing the computer program, implements:
    performing image fusion on the first infrared image and the second infrared image to generate a third infrared image.
  32. The unmanned aerial vehicle according to claim 31, wherein
    the image gain of the first shooting mode being greater than the image gain of the second shooting mode means that a temperature measurement range of the first shooting mode is smaller than or equal to a temperature measurement range of the second shooting mode, and a temperature measurement accuracy of the first shooting mode is greater than or equal to a temperature measurement accuracy of the second shooting mode.
  33. The unmanned aerial vehicle according to claim 32, wherein
    the temperature measurement range of the first shooting mode is a subset of the temperature measurement range of the second shooting mode.
  34. The unmanned aerial vehicle according to claim 31, wherein the processor, when executing the computer program, further implements:
    performing designated processing on the third infrared image,
    wherein the designated processing comprises one or a combination of the following: global stretch enhancement, local stretch enhancement, detail enhancement, and pseudo-color mapping.
  35. The unmanned aerial vehicle according to any one of claims 31 to 34, wherein the processor performing image fusion on the first infrared image and the second infrared image to generate a third infrared image specifically comprises:
    fusing pixels of the first infrared image according to a first fusion weight and pixels of the second infrared image according to a second fusion weight to generate the third infrared image; or
    fusing a designated region of the first infrared image according to a third fusion weight and a designated region of the second infrared image according to a fourth fusion weight to generate the third infrared image.
  36. The unmanned aerial vehicle according to claim 35, wherein
    a sum of the first fusion weight and the second fusion weight is 1; or
    a sum of the third fusion weight and the fourth fusion weight is 1.
  37. The unmanned aerial vehicle according to claim 35, wherein
    the first fusion weight is the same for every pixel of the first infrared image, and the second fusion weight is the same for every pixel of the second infrared image.
  38. The unmanned aerial vehicle according to claim 37, wherein the processor, when executing the computer program, further implements:
    setting the first fusion weight and the second fusion weight according to a ratio relationship between the image gain of the first shooting mode and the image gain of the second shooting mode.
  39. The unmanned aerial vehicle according to claim 37, wherein the processor, when executing the computer program, further implements:
    setting the first fusion weight and the second fusion weight according to a relative relationship between the temperature measurement range of the first shooting mode and the temperature measurement range of the second shooting mode.
  40. The unmanned aerial vehicle according to claim 37, wherein the processor, when executing the computer program, further implements:
    receiving a temperature measurement range required by a user; and
    setting the first fusion weight and the second fusion weight according to the temperature measurement range required by the user.
  41. The unmanned aerial vehicle according to claim 35, wherein
    the first fusion weights of the pixels of the first infrared image are not all the same, and the second fusion weights of the pixels of the second infrared image are not all the same.
  42. The unmanned aerial vehicle according to claim 41, wherein the processor, when executing the computer program, further implements:
    setting the first fusion weight and the second fusion weight according to image pixel information,
    wherein the image pixel information comprises one or a combination of the following: pixel texture, pixel signal-to-noise ratio, pixel information content, and pixel temperature value.
  43. The unmanned aerial vehicle according to claim 35, wherein
    the third fusion weight is the same for pixels within a same designated region of the first infrared image, and the fourth fusion weight is the same for pixels within a same designated region of the second infrared image.
  44. The unmanned aerial vehicle according to claim 43, wherein the processor, when executing the computer program, further implements:
    setting the third fusion weight and the fourth fusion weight according to designated-region information,
    wherein the designated-region information comprises one or a combination of the following: information content of pixels in the region, temperature variation range of the region, temperature gradient value of the region, and signal-to-noise ratio of the region.
  45. The unmanned aerial vehicle according to any one of claims 31 to 34, wherein the processor, when executing the computer program, further implements:
    performing image preprocessing on the first infrared image and the second infrared image,
    wherein the image preprocessing comprises one or a combination of the following: image correction, bad-pixel removal, and noise removal.
  46. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the infrared image processing method according to any one of claims 1 to 15.
PCT/CN2020/078873 2020-03-11 2020-03-11 红外图像处理方法、处理设备、无人飞行器和存储介质 WO2021179223A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080032171.6A CN113785559A (zh) 2020-03-11 2020-03-11 红外图像处理方法、处理设备、无人飞行器和存储介质
PCT/CN2020/078873 WO2021179223A1 (zh) 2020-03-11 2020-03-11 红外图像处理方法、处理设备、无人飞行器和存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/078873 WO2021179223A1 (zh) 2020-03-11 2020-03-11 红外图像处理方法、处理设备、无人飞行器和存储介质

Publications (1)

Publication Number Publication Date
WO2021179223A1 true WO2021179223A1 (zh) 2021-09-16

Family

ID=77670891

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/078873 WO2021179223A1 (zh) 2020-03-11 2020-03-11 红外图像处理方法、处理设备、无人飞行器和存储介质

Country Status (2)

Country Link
CN (1) CN113785559A (zh)
WO (1) WO2021179223A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107369146A (zh) * 2017-06-28 2017-11-21 深圳源广安智能科技有限公司 一种高性能红外图像处理系统
CN108234984A (zh) * 2018-03-15 2018-06-29 百度在线网络技术(北京)有限公司 双目深度相机系统和深度图像生成方法
CN109242815A (zh) * 2018-09-28 2019-01-18 合肥英睿系统技术有限公司 一种红外光图像和可见光图像融合方法及系统
WO2019071613A1 (zh) * 2017-10-13 2019-04-18 华为技术有限公司 一种图像处理方法及装置
WO2019183759A1 (zh) * 2018-03-26 2019-10-03 深圳市大疆创新科技有限公司 图像融合方法、拍摄设备和可移动平台系统
CN110476416A (zh) * 2017-01-26 2019-11-19 菲力尔系统公司 多个成像模式下红外成像的系统和方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110796628B (zh) * 2019-10-17 2022-06-07 浙江大华技术股份有限公司 图像融合方法、装置、拍摄装置及存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110476416A (zh) * 2017-01-26 2019-11-19 菲力尔系统公司 多个成像模式下红外成像的系统和方法
CN107369146A (zh) * 2017-06-28 2017-11-21 深圳源广安智能科技有限公司 一种高性能红外图像处理系统
WO2019071613A1 (zh) * 2017-10-13 2019-04-18 华为技术有限公司 一种图像处理方法及装置
CN108234984A (zh) * 2018-03-15 2018-06-29 百度在线网络技术(北京)有限公司 双目深度相机系统和深度图像生成方法
WO2019183759A1 (zh) * 2018-03-26 2019-10-03 深圳市大疆创新科技有限公司 图像融合方法、拍摄设备和可移动平台系统
CN109242815A (zh) * 2018-09-28 2019-01-18 合肥英睿系统技术有限公司 一种红外光图像和可见光图像融合方法及系统

Also Published As

Publication number Publication date
CN113785559A (zh) 2021-12-10

Similar Documents

Publication Publication Date Title
KR100802525B1 (ko) 실시간 멀티밴드 카메라
CN105678742B (zh) 一种水下相机标定方法
US8390696B2 (en) Apparatus for detecting direction of image pickup device and moving body comprising same
US7974460B2 (en) Method and system for three-dimensional obstacle mapping for navigation of autonomous vehicles
CN109922251A (zh) 快速抓拍的方法、装置及系统
CN105741379A (zh) 一种变电站全景巡检方法
TW201926240A (zh) 無人飛行機之全景拍照方法與使用其之無人飛行機
CN107122770A (zh) 多目相机系统、智能驾驶系统、汽车、方法和存储介质
Fryskowska et al. Calibration of low cost RGB and NIR UAV cameras
CN104412298B (zh) 用于变换图像的方法和设备
WO2013150830A1 (ja) グレア測定システム
Peng et al. Unmanned Aerial Vehicle for infrastructure inspection with image processing for quantification of measurement and formation of facade map
KR101770745B1 (ko) 이종의 위성영상 융합가능성 평가방법 및 그 장치
Knyaz et al. Joint geometric calibration of color and thermal cameras for synchronized multimodal dataset creating
WO2021179223A1 (zh) 红外图像处理方法、处理设备、无人飞行器和存储介质
Kurkela et al. Camera preparation and performance for 3D luminance mapping of road environments
TW201249188A (en) Image searching, capturing system and control method thereof
KR101996169B1 (ko) 카메라 변위를 고려한 가시광 통신 기반의 차량 위치 추정 방법 및 장치
JP2022057784A (ja) 撮像装置、撮像システムおよび撮像方法
RU2692970C2 (ru) Способ калибровки видеодатчиков многоспектральной системы технического зрения
JP5409451B2 (ja) 3次元変化検出装置
CN105894500B (zh) 一种基于图像处理的可视距离检测方法
Chen et al. A new algorithm for calculating the daytime visibility based on the color digital camera
CN105300377B (zh) 基于三象限偏振片的天空偏振模式探测方法与系统
Chong et al. Night-time surveillance system for forensic 3D mapping

Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20924776; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 20924776; Country of ref document: EP; Kind code of ref document: A1)