WO2021212319A1 - Infrared image processing method, device, system, and movable platform - Google Patents

Infrared image processing method, device, system, and movable platform

Info

Publication number
WO2021212319A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
target area
object corresponding
infrared
temperature
Prior art date
Application number
PCT/CN2020/085914
Other languages
English (en)
French (fr)
Inventor
邹文
张青涛
朱传杰
赵新涛
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/085914 priority Critical patent/WO2021212319A1/zh
Publication of WO2021212319A1 publication Critical patent/WO2021212319A1/zh

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/02 Constructional details
    • G01J5/06 Arrangements for eliminating effects of disturbing radiation; Arrangements for compensating changes in sensitivity
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/10 Radiation pyrometry, e.g. infrared or optical thermometry using electric radiation detectors
    • G01J5/12 Radiation pyrometry, e.g. infrared or optical thermometry using electric radiation detectors using thermoelectric elements, e.g. thermocouples
    • G01J5/14 Electrical features thereof
    • G01J5/16 Arrangements with respect to the cold junction; Compensating influence of ambient temperature or other variables
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01K MEASURING TEMPERATURE; MEASURING QUANTITY OF HEAT; THERMALLY-SENSITIVE ELEMENTS NOT OTHERWISE PROVIDED FOR
    • G01K1/00 Details of thermometers not specially adapted for particular types of thermometer
    • G01K1/20 Compensating for effects of temperature changes other than those to be measured, e.g. changes in ambient temperature

Definitions

  • This application relates to the field of image processing technology, and specifically, to an infrared image processing method, device, system, and movable platform.
  • the infrared image collected by the infrared sensor is a thermal characteristic image, which can reflect the temperature information of each object in the image.
  • the temperature of the object can usually be measured with the help of an infrared sensor.
  • the infrared image can also be further processed based on the temperature information to facilitate subsequent observation.
  • when an infrared sensor is used to measure the temperature of an object, characteristic parameters of the object itself, such as emissivity and reflectivity, as well as environmental parameters present when the infrared sensor collects the infrared image, will affect the measurement result. It is therefore necessary to accurately determine these parameters so that the temperature measured by the infrared sensor can be corrected according to them, improving the accuracy of the measured temperature.
  • this application provides an infrared image processing method, device, system and movable platform.
  • an infrared image processing method including:
  • the value of the temperature compensation parameter is determined based on the data collected by one or more other sensors and/or the pre-stored calibration data of the temperature compensation parameter;
  • the target temperature of the object corresponding to the target area is determined according to the value of the temperature compensation parameter.
  • an infrared image processing device comprising: a processor, a memory, and a computer program stored on the memory, wherein when the processor executes the computer program, the following steps are implemented:
  • the value of the temperature compensation parameter is determined based on the data collected by one or more other sensors and/or the pre-stored calibration data of the temperature compensation parameter;
  • the target temperature of the object corresponding to the target area is determined according to the value of the temperature compensation parameter.
  • an infrared image processing system including an infrared sensor and an infrared image processing device.
  • the infrared image processing device includes a processor, a memory, and computer instructions stored in the memory and executable by the processor.
  • the processor executes the computer instructions, the following steps are implemented:
  • determining the value of the temperature compensation parameter of the target area in the infrared image collected by the infrared sensor, the value being determined based on the data collected by one or more other sensors and/or the pre-stored calibration data of the temperature compensation parameter;
  • the target temperature of the object corresponding to the target area is determined according to the value of the temperature compensation parameter.
  • a movable platform including the infrared image processing system described in the third aspect.
  • a computer-readable storage medium which stores a computer program, which when executed by a processor implements the infrared image processing method described in the first aspect.
  • the value of the temperature compensation parameter is used to compensate the temperature measurement result.
  • the value of the temperature compensation parameter of the target area can be determined according to one or more of the pre-stored calibration data of the temperature compensation parameter or the data collected by other sensors used to assist in determining the temperature compensation parameter, and the temperature measurement result can then be corrected according to the automatically determined value to obtain the target temperature of the object corresponding to the target area.
  • the determined value of the temperature compensation parameter can be made more accurate, and thus the measured temperature is also more accurate.
  • Fig. 1 is a flowchart of an infrared image processing method according to an embodiment of the present application.
  • Fig. 2 is a schematic diagram of determining the object type in the target area of the infrared image according to the result of semantic segmentation of the visible light image according to an embodiment of the present application.
  • Fig. 3 is a schematic diagram of an application scenario of an embodiment of the present application.
  • Fig. 4 is a schematic diagram of an infrared image processing method according to an embodiment of the present application.
  • Fig. 5 is a schematic diagram of the logical structure of an infrared image processing device according to an embodiment of the present application.
  • 6a-6d are schematic diagrams of the logical structure of an infrared image processing system according to an embodiment of the present application.
  • the infrared image collected by the infrared sensor is a thermal characteristic image, which can reflect the temperature information of each object in the image.
  • the temperature of the object can usually be measured with the help of an infrared sensor.
  • the infrared image can also be further processed based on the temperature information to facilitate subsequent observation.
  • the infrared sensor can collect the gray value of each pixel in the infrared image, and the temperature corresponding to each pixel can be determined according to the gray value of the pixel and the conversion relationship between temperature and gray.
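The gray-to-temperature conversion described above can be sketched as follows. This is a minimal illustration assuming a simple linear conversion curve; the names GAIN and OFFSET and their values are hypothetical stand-ins, since a real infrared sensor provides its own calibration curve.

```python
# Hedged sketch: converting the gray value of an infrared pixel into a raw
# (uncompensated) temperature. The linear coefficients are illustrative
# placeholders, not calibration data from any real sensor.

GAIN = 0.04       # degrees Celsius per gray level (assumed)
OFFSET = -273.15  # offset of the assumed linear conversion curve

def gray_to_temperature(gray_value: int) -> float:
    """Map a pixel gray value to a raw temperature via a linear model."""
    return GAIN * gray_value + OFFSET

raw_temp = gray_to_temperature(7500)
```

In practice the conversion relationship may be a lookup table or a nonlinear radiometric curve rather than a single linear fit.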
  • when the infrared sensor is used to measure the temperature of an object, characteristic parameters of the object itself, such as emissivity and reflectivity, and environmental parameters present when the infrared sensor collects the infrared image, such as atmospheric transmittance and the distance from the object to the infrared sensor, will affect the measurement results. Therefore, after determining the temperature corresponding to each pixel according to the gray value and the conversion relationship between temperature and gray level, it is also necessary to determine the characteristic parameters of the object itself (such as emissivity and reflectivity) and the environmental parameters (such as atmospheric transmittance and the imaging distance from the object to the infrared sensor) in order to compensate the results measured by the infrared sensor and make the measured temperature more accurate.
  • in related solutions, parameters such as object emissivity, reflectivity, atmospheric transmittance, and imaging distance used to compensate the temperature measurement results are either manually input by the user, which offers a low degree of automation, or set to preset fixed values. Because different objects have different emissivity, reflectivity, imaging distance, and so on, using fixed values causes the measurement results to be inaccurate. It is therefore necessary to provide a solution that can automatically achieve accurate temperature measurement with an infrared sensor.
  • this application provides an infrared image processing method, which can automatically determine the temperature compensation parameters used to compensate and calculate the temperature measurement results of the infrared sensor, and then use these temperature compensation parameters to correct the temperature of each object in the infrared image to Get accurate temperature measurement results.
  • the method is shown in Figure 1 and includes the following steps:
  • S104 Determine the target temperature of the object corresponding to the target area according to the value of the temperature compensation parameter.
  • the infrared image processing method of the present application can be executed by an infrared image processing device, and the infrared image processing device can be integrated with an infrared sensor on a device.
  • the device can be an infrared camera integrating an infrared sensor and an infrared image processing device.
  • the infrared image can be collected by an infrared sensor and then sent to the infrared image processing device for later image processing.
  • the temperature compensation parameter in this application can be any parameter that has an effect on the temperature measurement result during the temperature measurement process using the infrared sensor.
  • it can be some parameters related to the characteristics of the object, such as the emissivity and reflectivity of the object.
  • it can also be parameters related to the environment, such as ambient temperature, atmospheric transmittance, and environmental humidity, or parameters such as the imaging distance of the object.
  • the other sensor may be any sensor used to assist in determining the temperature compensation parameter; for example, it may be one or more of a temperature sensor, a distance sensor, a humidity sensor, and a visible light sensor. These sensors and the infrared sensor can be integrated on one device or can be independent devices; this application is not limited in this respect.
  • the calibration data of the temperature compensation parameter in this application is the value of the temperature compensation parameter under different conditions determined in advance through calibration, measurement or other methods.
  • it can be calibration data such as the emissivity and reflectivity of different types of objects, the atmospheric transmittance under different temperature and humidity, and different weather types.
  • the target area of the present application can be any area in the infrared image whose temperature is to be determined; it can be a part of the infrared image or the entire image.
  • one or more of the pre-stored calibration data of temperature compensation parameters or data collected by other sensors can be used to determine the value of the temperature compensation parameter of the target area, and the temperature of the object corresponding to the target area measured by the infrared sensor is then compensated according to that value to obtain the final target temperature.
  • by determining the value of the temperature compensation parameter of the target area through one or more of the pre-stored calibration data of the temperature compensation parameter or the data collected by other sensors, the value of the temperature compensation parameter of the target area of the infrared image can be obtained automatically and accurately. The determined value is then used to compensate the temperature of the object corresponding to the target area measured by the infrared sensor, making the temperature measurement result more accurate.
  • the principle of infrared sensor temperature measurement is that the infrared energy radiated by an object is positively correlated with the temperature of the object, that is, the higher the temperature of the object, the higher the radiated infrared energy.
  • the temperature of the object can be determined by measuring the infrared energy emitted by the object through an infrared sensor.
  • however, the infrared energy emitted by an object is not determined by its temperature alone, and the emitted infrared energy may not be completely received by the infrared sensor.
  • the infrared energy emitted by an object is not only related to the temperature of the object, but also related to some characteristic parameters of the object itself, such as the emissivity and reflectivity of the object.
  • the emissivity refers to the ratio of the infrared energy emitted by an object at a specific temperature to the theoretical value when there is no loss.
  • the reflectivity refers to the ratio of the reflected energy to the received energy when an object is illuminated by a heat source.
  • at the same temperature, the higher the emissivity of an object, the greater the infrared energy it radiates, and therefore the higher the temperature measured by the infrared sensor.
  • when the infrared energy radiated by an object travels to the infrared sensor, it is also affected by the distance between the object and the sensor, the transmittance of the atmosphere, temperature, humidity, and so on, so that the radiated infrared energy cannot be completely received by the infrared sensor. These parameters therefore affect the temperature measurement, and temperature compensation calculations need to be performed based on them.
  • the temperature compensation parameters may include the atmospheric transmittance of the environment where the infrared sensor is located, the distance from the object corresponding to the target area to the infrared sensor (the imaging distance), and characteristic parameters related to the characteristics of the object corresponding to the target area.
  • the characteristic parameter may be one or more of the emissivity and the reflectivity of the object.
  • the pre-stored calibration data may include the correspondence between the type of the object and the value of the characteristic parameter. For example, it may include one or more of the emissivity corresponding to different types of objects or the reflectivity corresponding to different types of objects.
  • the corresponding relationship between the object category and the characteristic parameter can be used to determine the value of the characteristic parameter of the object corresponding to the target area, and the temperature of the object can then be corrected according to that value.
  • the emissivity and/or reflectivity of the object corresponding to the target area can be determined according to the emissivity and/or reflectivity corresponding to different types of objects, and the temperature of the object can then be corrected according to the emissivity and/or reflectivity of the object to obtain the final temperature.
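The pre-stored correspondence between object category and characteristic parameters could be sketched as a simple lookup table. The categories and all numeric values below are illustrative assumptions, not the calibration data of this application.

```python
# Hedged sketch of a calibration table mapping object categories to
# characteristic parameters (emissivity / reflectivity). Values are
# illustrative placeholders only.

CHARACTERISTIC_PARAMS = {
    "person":  {"emissivity": 0.98, "reflectivity": 0.02},
    "tree":    {"emissivity": 0.96, "reflectivity": 0.04},
    "vehicle": {"emissivity": 0.90, "reflectivity": 0.10},
}

def lookup_characteristics(category: str) -> dict:
    """Return the stored emissivity/reflectivity for a recognized category."""
    return CHARACTERISTIC_PARAMS[category]

params = lookup_characteristics("tree")
```

Once the object category of the target area is recognized, the returned values feed the emissivity and reflectivity correction steps described later.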
  • the category of the object corresponding to the target area may be determined first, and the corresponding relationship between object category and characteristic parameter stored in advance is then used to determine the value of the characteristic parameter of the object corresponding to the target area. Determining the object category can be realized by a general target detection algorithm.
  • when determining the category of the object corresponding to the target area, the infrared image can be used directly: target detection can be performed on the infrared image to determine whether the object corresponding to the target area is, for example, a person, a tree, or a vehicle.
  • infrared images can also be semantically segmented so that the category of the object corresponding to each pixel is accurately determined; for example, the infrared image is divided into different areas, each representing a type of object.
  • images collected by other image sensors may also be used to assist the identification of the object category corresponding to the target area.
  • other sensors may be visible light sensors, and the visible light sensor and the infrared sensor may have a certain angle of view overlap.
  • the visible light image collected by the visible light sensor can be semantically segmented to determine the category of the object corresponding to each pixel of the visible light image; the category of the object corresponding to the target area in the infrared image is then determined based on the per-pixel categories of the visible light image and the pixel point mapping relationship between the visible light image and the infrared image.
  • the left image is the visible light image collected by the visible light sensor
  • the right image is the infrared image collected by the infrared sensor.
  • the visible light image can be semantically segmented to obtain the object category corresponding to each pixel.
  • the image is divided into different areas, such as "sky", "road", "tree", "house", "vehicle", and "person". The coordinates of the pixel points of the target area A1 in the infrared image can then be determined, and the pixel point mapping relationship between the infrared image and the visible light image determines the corresponding area A2 of the target area A1 in the visible light image. According to the result of semantic segmentation of the visible light image, the object in the target area A1 can be determined to be a tree.
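The mapping-and-classify step can be sketched as follows. The homography H is assumed here to map infrared pixel coordinates to visible-light pixel coordinates, and the segmentation label map, the majority vote, and all values in the example are illustrative stand-ins rather than the patent's data.

```python
# Hedged sketch: map the pixels of infrared target area A1 into the
# visible-light image via a 3x3 homography, then take the majority
# semantic-segmentation label as the object category.

def apply_homography(H, u, v):
    """Apply 3x3 homography H (list of three rows) to pixel (u, v)."""
    x, y, w = (H[i][0] * u + H[i][1] * v + H[i][2] for i in range(3))
    return round(x / w), round(y / w)

def map_area_category(area_pixels, H, seg_labels):
    """Majority-vote the segmentation label over the mapped pixels.
    seg_labels is a 2D list of category ids, indexed seg_labels[y][x]."""
    votes = {}
    for u, v in area_pixels:
        x, y = apply_homography(H, u, v)
        label = seg_labels[y][x]
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)
```

A majority vote makes the category assignment robust to a few mis-segmented pixels at the boundary of the area.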
  • the infrared sensor and the visible light sensor can be located in one device, for example a dual-light camera, or they can be located in separate devices.
  • the positions of the two may be relatively fixed or not, as long as the relative positional relationship can be determined.
  • the two sensors can be fixed to a gimbal (pan-tilt) and can each rotate with it; during rotation, the relative positional relationship between the two sensors can be calculated according to the rotation angle of the gimbal.
  • the pixel point mapping relationship of the visible light image and the infrared image may be determined according to the internal parameters of the infrared sensor, the internal parameters of the visible light sensor, and the positional relationship between the infrared sensor and the visible light sensor.
  • assuming the pixel mapping relationship between the visible light image and the infrared image is represented by a matrix H, P1 represents the pixel coordinates of a pixel on the visible light image, and P2 represents the pixel coordinates of the corresponding pixel on the infrared image, H can be determined by formula (1), and the relationship between P1 and P2 is as in formula (2):
  • K is a matrix representing the positional relationship between the visible light sensor and the infrared sensor, which can be determined according to the external parameter matrix of the two sensors
  • R1 represents the internal parameter matrix of the visible light sensor
  • R2 represents the internal parameter matrix of the infrared sensor.
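Formulas (1) and (2) themselves are not reproduced in this extract. Given the definitions of K, R1, and R2 above, a plausible reconstruction, stated here as an assumption rather than the patent's exact expressions, is:

```latex
H = R_2 \, K \, R_1^{-1} \qquad \text{(1)}
P_2 = H \, P_1 \qquad \text{(2)}
```

where $P_1$ and $P_2$ are homogeneous pixel coordinates; the mapping composes the visible-light back-projection $R_1^{-1}$, the inter-sensor positional relationship $K$, and the infrared projection $R_2$.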
  • the internal and external parameter matrices of the visible light sensor and of the infrared sensor can be calibrated by Zhang Zhengyou's calibration method. If the two sensors are integrated into one device, they can be calibrated in advance before the device leaves the factory; they can also be calibrated on the fly during infrared image processing. This application is not limited in this respect.
  • the infrared image can be used directly to recognize the object in the target area, or the visible light image can be used to assist the recognition, which can be determined according to actual application scenarios.
  • the resolution of the image collected by the visible light sensor is usually higher than that of the image collected by the infrared sensor. Therefore, the visible light image can be used to assist identification.
  • in some scenarios, the resolution of the image collected by the infrared sensor is higher than that of the image collected by the visible light sensor, and the infrared image can then be used directly to identify the object in the target area.
  • in some embodiments, the infrared image may be preprocessed before it is subjected to target detection or semantic segmentation. The preprocessing operations include one or more of image correction, noise removal, temperature drift compensation, contrast stretching, and detail enhancement.
  • for example, the infrared image can be corrected to remove various kinds of noise, such as dead pixels, random noise, or fixed pattern noise; temperature drift compensation can be performed on the infrared image; the infrared image can be contrast-stretched to improve its contrast ratio; and detail enhancement can be applied to highlight the area of interest.
  • the pre-stored calibration data may include the corresponding relationship between the weather type and the value of the atmospheric transmittance.
  • the weather type may include different types such as cloudy, sunny, rainy, foggy, etc. Of course, it may also be more finely divided according to actual needs, which is not limited in this application.
  • the reference value of atmospheric transmittance under different weather types can be measured or calibrated in advance, and the corresponding relationship between the two can be obtained and stored.
  • the value of the atmospheric transmittance in the environment can be determined according to the pre-stored correspondence between weather type and atmospheric transmittance together with the weather type of the environment where the infrared sensor is located, and the temperature measurement result is then corrected according to the determined atmospheric transmittance value.
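The pre-stored correspondence between weather type and atmospheric transmittance could be sketched as a lookup table. The weather categories and transmittance values below are illustrative assumptions, not the calibration data of this application.

```python
# Hedged sketch of the weather-type -> reference atmospheric transmittance
# correspondence. All values are illustrative placeholders.

TRANSMITTANCE_BY_WEATHER = {
    "sunny":  0.90,
    "cloudy": 0.80,
    "rainy":  0.60,
    "foggy":  0.40,
}

def atmospheric_transmittance(weather_type: str) -> float:
    """Look up the pre-stored reference transmittance for a weather type."""
    return TRANSMITTANCE_BY_WEATHER[weather_type]
```

The table could be refined (e.g., by humidity band or visibility range) without changing the lookup structure.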
  • the weather type of the environment where the infrared sensor is located can be determined with the help of other sensors; for example, weather analysis can be performed on the visible light image collected by the visible light sensor to obtain the weather type of the environment where the infrared sensor is located.
  • weather analysis on visible light images can be implemented using machine learning algorithms. For example, a large number of weather-labeled images can be used to train a preset model to obtain a weather classification model, and then use the model for weather analysis.
  • a temperature sensor, a humidity sensor, or a combination of the two can also be used to determine the weather type.
  • the temperature and humidity of the current environment are also indicative of the weather type; for example, humidity on rainy days is often high and the temperature low. A temperature sensor can therefore be used to measure the ambient temperature and a humidity sensor the ambient humidity, and the weather type can then be determined from the measured values.
  • the visible light image, ambient temperature, and ambient humidity can also be combined to jointly determine the weather type, so as to obtain more accurate results.
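A coarse temperature/humidity-based weather inference, as the text describes, could be sketched as follows. The thresholds and category names are illustrative assumptions, not calibrated values from this application; a real system would likely also fuse the visible-light classification result.

```python
# Hedged sketch: infer a coarse weather type from ambient temperature and
# humidity readings. Thresholds are illustrative assumptions only.

def infer_weather(temp_c: float, humidity_pct: float) -> str:
    """Very coarse weather classification from temperature and humidity."""
    if humidity_pct > 85.0:
        # High humidity: cool suggests rain, warm suggests fog/haze.
        return "rainy" if temp_c < 25.0 else "foggy"
    if humidity_pct < 40.0:
        return "sunny"
    return "cloudy"
```

The returned weather type would then index the pre-stored transmittance table described above.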
  • the distance between the object and the infrared sensor also affects the amount of infrared energy received by the infrared sensor: the greater the distance, the more the infrared energy radiated by the object is attenuated by the atmosphere, and the less energy the infrared sensor receives. Therefore, in some embodiments, a distance sensor can be used to measure the distance from the object corresponding to the target area to the infrared sensor, and the temperature measurement result is then corrected according to the measured distance.
  • the distance sensor may be one or more of a laser distance sensor, a TOF (Time of Flight) sensor, a lidar, an ultrasonic distance sensor, and a terahertz distance sensor.
  • the temperature measurement result can be corrected according to the value of the temperature compensation parameter to obtain the corrected target temperature.
  • the first temperature of the object corresponding to the target area can be determined according to the gray value of the target area and the conversion relationship between gray level and temperature. The value of each temperature compensation parameter is then input into the corresponding correction function to obtain the correction coefficient for that parameter, and the correction coefficients are used to correct the first temperature to obtain the target temperature of the object corresponding to the target area. Since the final target temperature takes the influence of multiple temperature compensation parameters into account, the result is more accurate.
  • the target temperature of the object corresponding to the target area is T
  • the gray value of the target area in the infrared image is X
  • the distance between the object corresponding to the target area and the infrared sensor is D measured by the distance sensor
  • the emissivity of the object corresponding to the target area, as determined from the calibration data, is a, and the reflectivity is b.
  • the atmospheric transmittance is determined from the visible light sensor, the temperature and humidity sensors, and the calibration data. The target temperature T can therefore be determined according to formula (3):
  • F0 is the mapping function from gray value to temperature value
  • F1 is the distance correction function
  • F2 is the emissivity correction function
  • F3 is the reflectivity correction function
  • F4 is the atmospheric transmittance correction function
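Formula (3) itself is not reproduced in this extract. The sketch below assumes, as the surrounding text suggests, that the raw temperature F0(X) is corrected by per-parameter coefficients F1(D), F2(a), F3(b), and F4(c) (c denoting the atmospheric transmittance here); the multiplicative combination and every stand-in function are assumptions for illustration, not the patent's actual correction functions.

```python
# Hedged sketch of a formula-(3)-style correction: the raw temperature F0(X)
# is adjusted by correction coefficients from F1..F4. All functional forms
# below are illustrative assumptions.

def target_temperature(X, D, a, b, c, F0, F1, F2, F3, F4):
    """T = F0(X) corrected by coefficients F1(D), F2(a), F3(b), F4(c)."""
    return F0(X) * F1(D) * F2(a) * F3(b) * F4(c)

# Example with simple stand-in functions (all assumed, not calibrated):
T = target_temperature(
    X=7500, D=50.0, a=0.96, b=0.04, c=0.85,
    F0=lambda x: 0.04 * x - 273.15,  # gray -> raw temperature (assumed)
    F1=lambda d: 1.0 + 0.0002 * d,   # distance correction (assumed)
    F2=lambda e: 1.0 / e,            # emissivity correction (assumed)
    F3=lambda r: 1.0 - r,            # reflectivity correction (assumed)
    F4=lambda t: 1.0 / t,            # transmittance correction (assumed)
)
```

Passing the correction functions in as parameters mirrors the text's structure: each temperature compensation parameter has its own correction function, and the functions can be swapped without changing the combination step.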
  • the correction coefficient corresponding to the emissivity may be determined according to the emissivity of the object, the environmental humidity, the distance from the object to the infrared sensor, and/or the response difference parameter of the infrared sensor.
  • the emissivity correction coefficient can be determined according to the influence of the emissivity on the temperature measurement result.
  • the influence coefficient of the emissivity on the temperature measurement result can be expressed by formula (4):
  • mEmiss, mRelHum, mDist, and mDebugPara respectively represent the emissivity of the object to be measured, the environmental humidity, the distance from the object to the infrared sensor, and the response difference parameter of the infrared sensor. It can be seen from formula (4) that when the emissivity is set lower than the actual emissivity of the object, the calculated influence coefficient is too large and the measured temperature will be greater than the actual object temperature; conversely, when the emissivity is set higher than the actual emissivity, the calculated influence coefficient is too small and the measured temperature will be lower than the actual object temperature. The correction coefficient can therefore be determined according to the influence coefficient of the emissivity on the temperature measurement result, and the temperature measurement result corrected accordingly.
  • in one application scenario, the infrared image processing method provided by this application can be used on a drone. The drone is equipped with an infrared sensor 31, a visible light sensor 32, a distance sensor 33, a temperature sensor 34, and a humidity sensor 35.
  • after the infrared image collected by the infrared sensor 31 is acquired, it can be preprocessed by one or more of image correction, denoising, temperature drift compensation, contrast stretching, detail enhancement, etc. The preprocessed infrared image is then subjected to target detection or semantic segmentation to obtain the category of the object corresponding to the area to be measured on the infrared image.
  • the visible light image can be semantically segmented to determine the object category corresponding to each pixel of the visible light image, and the category of the object corresponding to the area to be measured is then determined based on the pixel mapping relationship between the visible light image and the infrared image.
  • the pixel point mapping relationship between the visible light image and the infrared image can be determined according to the internal parameters of the infrared sensor 31, the internal parameters of the visible light sensor 32, and the positional relationship between the infrared sensor 31 and the visible light sensor 32.
  • the corresponding relationship between object category and emissivity/reflectivity can be obtained from the pre-stored calibration data, and the emissivity and reflectivity of the object corresponding to the temperature measurement area can be determined according to this corresponding relationship.
  • the weather type of the current environment can be determined according to the visible light image collected by the visible light sensor 32, the ambient temperature collected by the temperature sensor 34, and the ambient humidity collected by the humidity sensor 35.
  • the weather type can be divided into sunny, rainy, cloudy, foggy, etc.
  • other classification methods can also be used, and this application is not limited.
  • the corresponding relationship between weather type and atmospheric transmittance is obtained from the pre-stored calibration data, and the atmospheric transmittance of the current environment is determined according to the corresponding relationship.
  • the distance between the object corresponding to the temperature measurement area and the infrared sensor 31 can be determined on the basis of the data collected by the distance sensor 33, so that the temperature compensation parameters that affect the measurement result — the emissivity and reflectivity of the object corresponding to the temperature measurement area, its distance from the infrared sensor, the atmospheric transmittance, and so on — can all be determined.
  • the gray value of the object corresponding to the temperature measurement area can then be determined from the infrared image, its temperature is determined according to the conversion relationship between temperature and gray level, and that temperature is then corrected according to the determined emissivity, reflectivity, distance from the infrared sensor, atmospheric transmittance, and other parameters to obtain the final high-precision temperature value.
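The end-to-end correction described above can be sketched as follows. Only the multiplicative structure T = F0(X)·F1(D)·F2(a)·F3(b)·F4(c) is taken from the description; every correction function below is a toy stand-in (real F0…F4 would come from sensor and scene calibration):

```python
def f0_gray_to_temp(gray):
    """Toy gray-to-temperature mapping (assumed linear; real sensors are calibrated)."""
    return 0.1 * gray  # e.g. gray 300 -> 30.0 degC

def f1_distance(d_m):
    """Toy distance correction: farther objects read slightly cold."""
    return 1.0 + 0.001 * d_m

def f2_emissivity(e):
    """Toy emissivity correction: a low-emissivity object radiates less,
    so the raw reading is scaled up."""
    return 1.0 / e

def f3_reflectivity(r):
    """Toy reflectivity correction: reflected ambient energy inflates the raw reading."""
    return 1.0 - r

def f4_transmittance(tau):
    """Toy atmospheric correction: compensate energy lost to the atmosphere."""
    return 1.0 / tau

def target_temperature(gray, d_m, e, r, tau):
    # T = F0(X) * F1(D) * F2(a) * F3(b) * F4(c)
    return (f0_gray_to_temp(gray) * f1_distance(d_m) * f2_emissivity(e)
            * f3_reflectivity(r) * f4_transmittance(tau))
```

With identity corrections (zero distance, emissivity 1, reflectivity 0, transmittance 1) the result reduces to the raw gray-to-temperature mapping, which is a quick sanity check on such a pipeline.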
  • the UAV may include one or several of the above-mentioned sensors according to the requirements of the parameters to be determined, but may not include all of them.
  • the present application also provides an infrared image processing device.
  • the infrared image processing device 50 includes a processor 51, a memory 52, and computer instructions stored in the memory and executable by the processor.
  • when the processor 51 executes the computer instructions, the following steps are implemented:
  • determining the value of the temperature compensation parameter of the target area in the infrared image collected by the infrared sensor, the value of the temperature compensation parameter being determined on the basis of the data collected by one or more other sensors and/or the pre-stored calibration data of the temperature compensation parameter;
  • the target temperature of the object corresponding to the target area is determined according to the value of the temperature compensation parameter.
  • the temperature compensation parameter includes one or more of the following: the atmospheric transmittance of the environment in which the infrared sensor is located, the distance from the object corresponding to the target area to the infrared sensor, and a characteristic parameter related to the characteristics of the object corresponding to the target area.
  • the calibration data includes the corresponding relationship between the object category and the value of the characteristic parameter.
  • the temperature compensation parameter includes a characteristic parameter related to the characteristic of the object corresponding to the target area, and the value of the characteristic parameter of the object corresponding to the target area is determined based on the correspondence relationship.
  • when the processor is configured to determine the value of the characteristic parameter of the object corresponding to the target area based on the correspondence, it is specifically configured to:
  • the category of the object corresponding to the target area is determined, and the value of the characteristic parameter of the object corresponding to the target area is determined according to the correspondence relationship.
  • when the processor is used to determine the category of the object corresponding to the target area, it is specifically used to:
  • the other sensors include a visible light sensor; the category of the object corresponding to each pixel of the visible light image collected by the visible light sensor is determined, and the category of the object corresponding to the target area is determined based on the category of the object corresponding to each pixel in the visible light image and the pixel mapping relationship between the visible light image and the infrared image.
  • the pixel point mapping relationship is determined based on the internal parameters of the infrared sensor, the internal parameters of the visible light sensor, and the positional relationship between the infrared sensor and the visible light sensor.
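The mapping described above can be sketched in pure Python. The 3×3 matrices below are illustrative assumptions only (real values of the relative-pose matrix K and the intrinsic matrices R1 and R2 would come from camera calibration, e.g. Zhang's method as the description later notes); the structure follows the patent's formulas H = K·R1·R2 and P2 = H·P1:

```python
def matmul3(a, b):
    """Product of two 3x3 matrices given as nested lists (no external deps)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def map_pixel(h, p1):
    """P2 = H * P1 on homogeneous pixel coordinates (u, v, 1), dehomogenized."""
    x, y, w = (sum(h[i][k] * p1[k] for k in range(3)) for i in range(3))
    return (x / w, y / w)

# Illustrative matrices only -- not calibration data from this patent.
K  = [[1.0, 0.0, 5.0], [0.0, 1.0, -3.0], [0.0, 0.0, 1.0]]  # sensor-to-sensor pose
R1 = [[1.2, 0.0, 0.0], [0.0, 1.2, 0.0], [0.0, 0.0, 1.0]]   # visible-light intrinsics
R2 = [[0.8, 0.0, 0.0], [0.0, 0.8, 0.0], [0.0, 0.0, 1.0]]   # infrared intrinsics

H = matmul3(matmul3(K, R1), R2)          # H = K * R1 * R2
u, v = map_pixel(H, (100.0, 50.0, 1.0))  # map a visible-light pixel to the infrared image
```

Once H is fixed, each infrared-image pixel of the target area can be mapped into the visible-light segmentation map (or vice versa) to read off the object category.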
  • before the processor performs target detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area, the processor is further configured to:
  • the infrared image is preprocessed, and the preprocessing includes one or more of the following: image correction, noise removal, temperature drift compensation, contrast stretching, and detail enhancement.
  • the characteristic parameter includes emissivity and/or reflectivity.
  • the calibration data includes the corresponding relationship between the weather type and the value of atmospheric transmittance.
  • the temperature compensation parameter includes the atmospheric transmittance of the environment in which the infrared sensor is located, and the value of the atmospheric transmittance is determined based on the correspondence and the weather type of the environment in which the infrared sensor is located.
  • the other sensors include a visible light sensor, and the weather type is determined based on the visible light image collected by the visible light sensor.
  • the other sensors include a temperature sensor and/or a humidity sensor, and the weather type is determined based on the ambient temperature collected by the temperature sensor and/or the ambient humidity collected by the humidity sensor.
  • the weather type includes one or more of the following: cloudy, sunny, rainy, and foggy.
  • the other sensors include a distance sensor, the temperature compensation parameter includes the distance from the object corresponding to the target area to the infrared sensor, and that distance is determined based on the data collected by the distance sensor.
  • the distance sensor includes one or more of the following: a laser distance sensor, a TOF sensor, a lidar, an ultrasonic distance sensor, and a terahertz distance sensor.
  • when the processor is configured to determine the temperature of the object corresponding to the target area according to the value of the temperature compensation parameter, it is specifically configured to:
  • determine a first temperature of the object corresponding to the target area according to the gray value of the target area; input the value of the temperature compensation parameter into the corresponding correction function to obtain a correction coefficient; and correct the first temperature using the correction coefficient to obtain the target temperature of the object corresponding to the target area.
  • the temperature compensation parameter includes the emissivity of the object corresponding to the target area, and the correction coefficient is determined based on the emissivity of the object, the environmental humidity, the distance from the object to the infrared sensor, and/or the response difference parameter of the infrared sensor.
  • the present application also provides an infrared image processing system.
  • the infrared image processing system includes an infrared sensor 61 and an infrared image processing device 62.
  • the infrared image processing device includes a processor 621, a memory 622, and computer instructions stored on the memory and executable by the processor; when the processor executes the computer instructions, the following steps are implemented:
  • determining the value of the temperature compensation parameter of the target area in the infrared image collected by the infrared sensor, the value of the temperature compensation parameter being determined on the basis of the data collected by one or more sensors and/or the pre-stored calibration data of the temperature compensation parameter;
  • the target temperature of the object corresponding to the target area is determined according to the value of the temperature compensation parameter.
  • the temperature compensation parameter includes one or more of the following: the atmospheric transmittance of the environment in which the infrared sensor is located, the distance from the object corresponding to the target area to the infrared sensor, and a characteristic parameter related to the characteristics of the object corresponding to the target area.
  • the calibration data includes the corresponding relationship between the weather type and the value of atmospheric transmittance.
  • the temperature compensation parameter includes the atmospheric transmittance of the environment where the infrared sensor is located, and the value of the atmospheric transmittance is determined based on the correspondence relationship and the weather type of the environment where the infrared sensor is located.
  • the infrared image processing system further includes a visible light sensor 63 for collecting visible light images, and the weather type is determined based on the visible light images.
  • the infrared image processing system further includes a temperature sensor and/or humidity sensor 64, the temperature sensor is used to collect environmental temperature, and the humidity sensor is used to collect environmental humidity, The weather type is determined based on the environmental temperature and/or environmental humidity.
  • the calibration data includes the corresponding relationship between the object category and the value of the characteristic parameter.
  • the temperature compensation parameter includes a characteristic parameter of the object corresponding to the target area, and the value of the characteristic parameter of the object corresponding to the target area is determined based on the correspondence relationship.
  • when the processor is configured to determine the value of the characteristic parameter of the object corresponding to the target area based on the correspondence, it is specifically configured to:
  • the category of the object corresponding to the target area is determined, and the value of the characteristic parameter of the object corresponding to the target area is determined according to the correspondence relationship.
  • the infrared image processing system further includes a visible light sensor 63 for collecting visible light images; when the processor is used to determine the category of the object corresponding to the target area, it is specifically used to: determine the category of the object corresponding to each pixel of the visible light image, and determine the category of the object corresponding to the target area based on the per-pixel object categories of the visible light image and the pixel mapping relationship between the visible light image and the infrared image; or
  • perform target detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area.
  • the infrared image processing system further includes a distance sensor 65, the temperature compensation parameter includes the distance from the object corresponding to the target area to the infrared sensor, and the distance sensor is used to determine the distance.
  • the distance sensor includes one or more of the following: a laser distance sensor, a TOF sensor, a lidar, an ultrasonic distance sensor, and a terahertz distance sensor.
  • when the processor is configured to determine the temperature of the object corresponding to the target area according to the value of the temperature compensation parameter, it is specifically configured to:
  • determine a first temperature of the object corresponding to the target area according to the gray value of the target area; input the value of the temperature compensation parameter into the corresponding correction function to obtain a correction coefficient; and correct the first temperature using the correction coefficient to obtain the target temperature of the object corresponding to the target area.
  • the temperature compensation parameter includes the emissivity of the object corresponding to the target area, and the correction coefficient is determined based on the emissivity of the object, the environmental humidity, the distance from the object to the infrared sensor, and/or the response difference parameter of the infrared sensor.
  • the present application also provides a movable platform, which includes any one of the infrared image processing systems in the foregoing embodiments.
  • the movable platform may be an unmanned aerial vehicle, an unmanned vehicle, an unmanned ship, or the like.
  • an embodiment of the present specification also provides a computer storage medium in which a program is stored; when the program is executed by a processor, the infrared image processing method in any of the foregoing embodiments is implemented.
  • the embodiments of this specification may take the form of a computer program product implemented on one or more storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing program code.
  • computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage can be realized by any method or technology.
  • the information can be computer-readable instructions, data structures, program modules, or other data.
  • examples of computer storage media include, but are not limited to: phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, CD-ROM, digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by computing devices.
  • since the device embodiments substantially correspond to the method embodiments, the relevant parts may refer to the description of the method embodiments.
  • the device embodiments described above are merely illustrative.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units.
  • some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments; those of ordinary skill in the art can understand and implement them without creative effort.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Radiation Pyrometers (AREA)

Abstract

An infrared image processing method, apparatus, and system, and a movable platform, comprising: determining the value of a temperature compensation parameter of a target area in an infrared image collected by an infrared sensor, the value of the temperature compensation parameter being determined on the basis of data collected by one or more other sensors and/or pre-stored calibration data of the temperature compensation parameter; and determining the target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter. By automatically determining the temperature compensation parameter according to the actual temperature measurement scenario, the determined temperature compensation parameter can be made more accurate, and the measured temperature is therefore also more accurate.

Description

Infrared image processing method, apparatus, and system, and movable platform — Technical Field
The present application relates to the technical field of image processing, and in particular to an infrared image processing method, apparatus, and system, and a movable platform.
Background Art
An infrared image collected by an infrared sensor is a thermal-characteristic image and can reflect the temperature information of each object in the image. The temperature of an object can usually be measured with the aid of an infrared sensor, and after the temperature information of each object in the infrared image has been determined, the infrared image can also be further processed on the basis of the temperature information to facilitate subsequent observation and aiming. When an infrared sensor is used to measure the temperature of an object, characteristic parameters of the object itself, such as emissivity and reflectivity, as well as certain environmental parameters at the time the infrared sensor collects the infrared image, affect the measurement result. It is therefore necessary to determine these parameters accurately, so that the temperature measured by the infrared sensor can be corrected according to them and the accuracy of the measured temperature improved.
Summary of the Invention
In view of this, the present application provides an infrared image processing method, apparatus, and system, and a movable platform.
According to a first aspect of the present application, an infrared image processing method is provided, the method comprising:
determining the value of a temperature compensation parameter of a target area in an infrared image collected by an infrared sensor, the value of the temperature compensation parameter being determined on the basis of data collected by one or more other sensors and/or pre-stored calibration data of the temperature compensation parameter; and
determining the target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter.
According to a second aspect of the present application, an infrared image processing apparatus is provided, the infrared image processing apparatus comprising a processor, a memory, and a computer program stored on the memory, wherein when the processor executes the computer program, the following steps are implemented:
determining the value of a temperature compensation parameter of a target area in an infrared image collected by an infrared sensor, the value of the temperature compensation parameter being determined on the basis of data collected by one or more other sensors and/or pre-stored calibration data of the temperature compensation parameter; and
determining the target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter.
According to a third aspect of the present application, an infrared image processing system is provided, comprising an infrared sensor and an infrared image processing apparatus, the infrared image processing apparatus comprising a processor, a memory, and computer instructions stored on the memory and executable by the processor, wherein when the processor executes the computer instructions, the following steps are implemented:
determining the value of a temperature compensation parameter of a target area in an infrared image collected by the infrared sensor, the value of the temperature compensation parameter being determined on the basis of data collected by one or more sensors and/or pre-stored calibration data of the temperature compensation parameter; and
determining the target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter.
According to a fourth aspect of the present application, a movable platform is provided, the movable platform comprising the infrared image processing system described in the third aspect above.
According to a fifth aspect of the present application, a computer-readable storage medium is provided, storing a computer program which, when executed by a processor, implements the infrared image processing method described in the first aspect above.
With the solution provided by the present application, when the temperature of a target area in an infrared image is determined, the temperature measurement result of the infrared sensor is affected by many factors, so the measurement result must be compensated according to the values of one or more temperature compensation parameters that influence it. The values of the temperature compensation parameters of the target area can be determined from one or more of the pre-stored calibration data of the temperature compensation parameters and the data collected by other sensors used to assist in measuring those parameters; the measurement result is then corrected according to the automatically determined values to obtain the target temperature of the object corresponding to the target area. Because the values of the temperature compensation parameters are determined automatically according to the actual temperature measurement scenario, the determined values are more accurate, and the measured temperature is therefore also more accurate.
Brief Description of the Drawings
In order to explain the technical solutions in the embodiments of the present application more clearly, the accompanying drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of an infrared image processing method according to an embodiment of the present application.
Fig. 2 is a schematic diagram of determining the object type of a target area of an infrared image according to the semantic segmentation result of a visible light image, according to an embodiment of the present application.
Fig. 3 is a schematic diagram of an application scenario according to an embodiment of the present application.
Fig. 4 is a schematic diagram of an infrared image processing method according to an embodiment of the present application.
Fig. 5 is a schematic diagram of the logical structure of an infrared image processing apparatus according to an embodiment of the present application.
Figs. 6a-6d are schematic diagrams of the logical structure of an infrared image processing system according to embodiments of the present application.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by those of ordinary skill in the art on the basis of the embodiments of the present application without creative effort fall within the scope of protection of the present application.
An infrared image collected by an infrared sensor is a thermal-characteristic image and can reflect the temperature information of each object in the image. The temperature of an object can usually be measured with the aid of an infrared sensor, and after the temperature information of each object in the infrared image has been determined, the infrared image can also be further processed on the basis of the temperature information to facilitate subsequent observation and aiming. The infrared sensor can collect the gray value of each pixel of the infrared image, and the temperature corresponding to each pixel can be determined from the gray value of the pixel and the conversion relationship between temperature and gray level. However, when an infrared sensor is used to measure the temperature of an object, characteristics of the object itself, such as emissivity and reflectivity, as well as certain environmental factors at the time the infrared image is collected, such as atmospheric transmittance and the distance from the object to the infrared sensor, affect the measurement result. Therefore, after the temperature corresponding to each pixel has been determined from the gray value and the temperature-gray conversion relationship, the result measured by the infrared sensor must additionally be compensated according to the characteristic parameters of the object itself, such as emissivity and reflectivity, and environmental parameters, such as atmospheric transmittance and the imaging distance from the object to the infrared sensor, so that the temperature measured by the infrared sensor is more accurate.
At present, parameters used to compensate the temperature measurement result, such as object emissivity, reflectivity, atmospheric transmittance, and imaging distance, are entered manually by the user, which offers a low degree of automation. Alternatively, preset fixed values are used directly; since emissivity, reflectivity, imaging distance, and so on differ from object to object, using fixed values leads to inaccurate measurement results. It is therefore necessary to provide a solution that can automatically achieve accurate temperature measurement with an infrared sensor.
On this basis, the present application provides an infrared image processing method that can automatically determine the temperature compensation parameters used to compensate the temperature measurement result of an infrared sensor, and then use these temperature compensation parameters to correct the temperature of each object in the infrared image so as to obtain an accurate measurement result. Specifically, as shown in Fig. 1, the method comprises the following steps:
S102: determining the value of a temperature compensation parameter of a target area in an infrared image collected by an infrared sensor, the value of the temperature compensation parameter being determined on the basis of data collected by one or more other sensors and/or pre-stored calibration data of the temperature compensation parameter;
S104: determining the target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter.
The infrared image processing method of the present application may be executed by an infrared image processing apparatus. The infrared image processing apparatus and the infrared sensor may be integrated in one device; for example, the device may be an infrared camera integrating the infrared sensor and the infrared image processing apparatus. They may, of course, also be located in different devices; for example, the infrared sensor may collect the infrared image and then send it to the infrared image processing apparatus for subsequent image processing.
The temperature compensation parameter in the present application may be any parameter that affects the temperature measurement result during temperature measurement with the infrared sensor. For example, it may be a parameter related to the characteristics of the object itself, such as the object's emissivity or reflectivity; it may be an environment-related parameter, such as the ambient temperature, atmospheric transmittance, or ambient humidity; or it may be a parameter such as the imaging distance of the object.
The other sensor may be any sensor used to assist in determining the above temperature compensation parameters, for example one or more of a temperature sensor, a distance sensor, a humidity sensor, and a visible light sensor. These sensors and the infrared sensor may be integrated in one device or may be independent devices; the present application is not limited in this respect.
The calibration data of the temperature compensation parameters in the present application are values of the temperature compensation parameters under different conditions determined in advance by calibration, measurement, or other means, for example calibration data such as the emissivity and reflectivity of different types of objects, or the atmospheric transmittance under different temperatures, humidities, and weather types.
The target area in the present application may be any area of the infrared image whose temperature is to be determined; it may be a part of the infrared image or the whole of it. When the temperature of the object corresponding to the target area in the infrared image is determined, the values of the temperature compensation parameters of the target area can be determined from one or more of the pre-stored calibration data of the temperature compensation parameters and the data collected by other sensors, and the temperature of the object corresponding to the target area as measured by the infrared sensor is then compensated according to these values to obtain the final target temperature.
By determining the values of the temperature compensation parameters of the target area from one or more of the pre-stored calibration data and the data collected by other sensors, the values of the temperature compensation parameters of the target area of the infrared image can be obtained automatically and accurately; the determined values are then used to compensate the temperature of the object corresponding to the target area as measured by the infrared sensor, making the temperature measurement result of the infrared sensor more accurate.
The principle of infrared temperature measurement is that the infrared energy radiated by an object is positively correlated with its temperature; that is, the higher the temperature of the object, the more infrared energy it radiates. In theory, the temperature of an object can be determined by measuring, with an infrared sensor, the infrared energy the object emits. In practice, however, the infrared energy emitted by an object is not determined solely by its temperature, nor is all of the emitted infrared energy necessarily received by the infrared sensor. For example, besides the temperature of the object, the emitted infrared energy is also related to certain characteristic parameters of the object itself, such as its emissivity and reflectivity. Here, emissivity refers to the ratio of the infrared energy emitted by an object at a given temperature to the value it would emit in the theoretical lossless case, and reflectivity refers to the ratio of the energy reflected back by an object illuminated by a heat source to the energy it receives. For example, for objects at the same temperature, the higher the emissivity, the greater the radiated infrared energy, and therefore the higher the temperature measured by the infrared sensor. In addition, in the process in which the infrared energy radiated by an object is received by the infrared sensor, the energy is also affected by the distance between the object and the sensor, the atmospheric transmittance, the temperature and humidity, and so on, so that the radiated infrared energy is not entirely received by the infrared sensor. All of these parameters thus influence the temperature measurement, and the temperature must be compensated according to them.
Therefore, in some embodiments, the temperature compensation parameter may include one or more of the atmospheric transmittance of the environment in which the infrared sensor is located, the distance (imaging distance) from the object corresponding to the target area to the infrared sensor, and characteristic parameters related to the characteristics of the object corresponding to the target area. Among the characteristic parameters related to the object itself, the object's emissivity and reflectivity have a relatively large influence on the measurement result; therefore, in some embodiments, the characteristic parameter may be one or more of the object's emissivity and the object's reflectivity.
Since characteristic parameters such as emissivity and reflectivity are related to the material, form, transparency, and so on of an object, different objects have different values of these characteristic parameters. In some embodiments, so that the values of the characteristic parameters of the object corresponding to the image area to be measured can be determined automatically, the pre-stored calibration data may include the correspondence between object categories and the values of the characteristic parameters, for example one or more of the emissivities corresponding to different categories of objects and the reflectivities corresponding to different objects.
Since the correspondence between object categories and characteristic parameters is pre-stored, in some embodiments, when the temperature of the object corresponding to the target area is determined, the values of the characteristic parameters of the object can be determined according to the stored correspondence, and the temperature of the object is then corrected according to these values. For example, the emissivity and/or reflectivity of the object corresponding to the target area can be determined from the emissivities and/or reflectivities corresponding to different categories of objects, and the temperature of the object is then corrected according to its emissivity and/or reflectivity to obtain the final temperature.
In some embodiments, when the characteristic parameters of the object corresponding to the target area are determined, the category of the object may first be determined, and the values of the characteristic parameters are then determined according to the pre-stored correspondence between object categories and characteristic parameters. The category of the object corresponding to the target area in the infrared image can be obtained using a general-purpose object detection algorithm.
In some embodiments, the category of the object corresponding to the target area in the infrared image may be determined directly from the infrared image; for example, object detection may be performed directly on the infrared image to determine whether the object corresponding to the target area is, say, a person, a tree, or a vehicle. In some application scenarios, semantic segmentation may instead be performed on the infrared image, which can precisely determine the category of the object corresponding to each pixel, for example by segmenting the infrared image into different regions, each representing one category of object.
However, owing to its imaging characteristics, an infrared image tends to have low resolution and considerable noise, so determining the object category by performing object detection or semantic segmentation directly on the infrared image may give inaccurate results. Therefore, in some embodiments, images collected by other image sensors may be used to assist in identifying the category of the object corresponding to the target area. For example, the other sensor may be a visible light sensor whose field of view overlaps that of the infrared sensor to some extent. Since the resolution of a visible light image is usually high in the daytime or in good lighting, semantic segmentation may first be performed on the visible light image collected by the visible light sensor to determine the category of the object corresponding to each pixel of the visible light image, and the category of the object corresponding to the target area of the infrared image is then determined from the per-pixel object categories of the visible light image and the pixel mapping relationship between the visible light image and the infrared image. For example, as shown in Fig. 2, the left image is a visible light image collected by the visible light sensor and the right image is an infrared image collected by the infrared sensor. Semantic segmentation may first be performed on the visible light image to obtain the object category corresponding to each pixel, segmenting the visible light image into regions such as "sky, road, trees, houses, vehicles, people". The pixel coordinates of the target area A1 in the infrared image can then be determined, the region A2 corresponding to A1 in the visible light image is found through the pixel mapping relationship between the two images, and from the semantic segmentation result of the visible light image it can be determined that the target area A1 is trees.
The infrared sensor and the visible light sensor may be located in one device, for example a dual-light camera, or in separate devices. Their positions may be fixed relative to each other or not, as long as their relative positional relationship can be determined. For example, the two sensors may be mounted on a gimbal and rotate with it, in which case their relative positional relationship during rotation can be calculated from the rotation angle of the gimbal.
In some embodiments, the pixel mapping relationship between the visible light image and the infrared image may be determined from the internal parameters of the infrared sensor, the internal parameters of the visible light sensor, and the positional relationship between the infrared sensor and the visible light sensor. For example, let the pixel mapping relationship between the visible light image and the infrared image be represented by a matrix H, let P1 denote the pixel coordinates of a pixel in the visible light image, and let P2 denote the pixel coordinates of a pixel in the infrared image. Then H can be determined by formula (1), and the relationship between P1 and P2 is given by formula (2):
H = K·R1·R2   formula (1)
P2 = H·P1     formula (2)
where K is a matrix representing the positional relationship between the visible light sensor and the infrared sensor, which can be determined from the extrinsic parameter matrices of the two sensors, R1 denotes the intrinsic parameter matrix of the visible light sensor, and R2 denotes the intrinsic parameter matrix of the infrared sensor. The intrinsic and extrinsic parameter matrices of the visible light sensor and of the infrared sensor can be obtained by calibration using Zhang Zhengyou's calibration method. If the two sensors are integrated in one device, calibration may be performed in advance at the factory; calibration may, of course, also be performed on the fly during infrared image processing, and the present application is not limited in this respect.
When the object corresponding to the target area of the infrared image is identified, the object may be identified either directly from the infrared image or with the assistance of the visible light image, depending on the actual application scenario. For example, in the daytime or in well-lit scenes, the resolution of the image collected by the visible light sensor is usually higher than that of the image collected by the infrared sensor, so the visible light image may be used to assist identification; at night or in poorly lit scenes, the resolution of the infrared image is usually higher than that of the visible light image, so the infrared image may be used directly to identify the object in the target area.
Since an infrared image usually contains considerable noise and has low resolution, in some embodiments, to improve the accuracy of object category identification based on the infrared image, some preprocessing may be applied to the infrared image before object detection or semantic segmentation is performed on it. The preprocessing operations include one or more of image correction, noise removal, temperature drift compensation, contrast stretching, and detail enhancement. For example, the infrared image may be corrected to remove various kinds of noise such as dead pixels, random noise, or fixed-pattern noise; temperature drift compensation may be applied; contrast stretching may be applied to improve the contrast of the infrared image; or detail enhancement may be applied to highlight regions of interest.
Since the infrared energy radiated by an object may not all be received by the infrared sensor — it may, for example, be reflected by the atmosphere and thus fail to reach the sensor — atmospheric transmittance is also a key factor affecting the measurement result. To automatically determine the value of the atmospheric transmittance of the environment in which the infrared sensor is located, in some embodiments the pre-stored calibration data may include the correspondence between weather types and values of atmospheric transmittance. In some embodiments, the weather types may include different types such as cloudy, sunny, rainy, and foggy; a finer division may of course be used according to actual needs, and the present application is not limited in this respect. Reference values of the atmospheric transmittance under different weather types can be measured or calibrated in advance, and the correspondence between the two is obtained and stored.
In some embodiments, when the temperature of the target area is determined, the value of the atmospheric transmittance in the environment can be determined from the pre-stored correspondence between weather types and atmospheric transmittance and the weather type of the environment in which the infrared sensor is located, and the measurement result is then corrected according to the determined value of the atmospheric transmittance.
In some embodiments, the weather type of the environment in which the infrared sensor is located may be determined with the aid of other sensors, for example from a visible light image collected by a visible light sensor: weather analysis is performed on the visible light image to obtain the weather type of the environment in which the infrared sensor is located. Specifically, the weather analysis of the visible light image may be implemented with a machine learning algorithm; for example, a preset model may be trained with a large number of images annotated with weather types to obtain a weather classification model, which is then used for weather analysis.
Of course, in some embodiments, a temperature sensor, a humidity sensor, or the two sensors in combination may also be used to determine the weather type. Usually, the temperature and humidity of the current environment also have some influence on the weather type; for example, on rainy days the humidity tends to be high and the temperature low. Therefore, the ambient temperature can be measured by a temperature sensor and the ambient humidity by a humidity sensor, and the weather type is then determined from the measured ambient temperature and humidity. Of course, in some embodiments, the visible light image, the ambient temperature, and the ambient humidity may also be combined to determine the weather type, so as to obtain a more accurate result.
Since the infrared energy radiated by an object may be reflected by the atmosphere and thus fail to reach the infrared sensor, the distance between the object and the infrared sensor also affects, to some extent, how much infrared energy the sensor receives: the greater the distance, the more of the radiated infrared energy is likely to be reflected away by the atmosphere, and the less infrared energy the sensor receives. Therefore, to measure the distance between the object and the infrared sensor accurately, in some embodiments a distance sensor may be used to measure the distance from the object corresponding to the target area to the infrared sensor, and the measurement result is then corrected according to the measured distance.
In some embodiments, the distance sensor may be one or more of a laser ranging sensor, a TOF (Time of Flight) sensor, a lidar, an ultrasonic ranging sensor, and a terahertz ranging sensor.
After the values of the temperature compensation parameters of the target area of the infrared image have been determined, the measurement result can be corrected according to these values to obtain the corrected target temperature. In some embodiments, a first temperature of the object corresponding to the target area may first be determined from the gray value of the target area and the conversion relationship between gray level and temperature; the values of the temperature compensation parameters are then input into the corresponding correction functions to obtain the correction coefficient corresponding to each temperature compensation parameter; and these correction coefficients are used to correct the first temperature to obtain the target temperature of the object corresponding to the target area. Since the finally determined target temperature takes the influence of multiple temperature compensation parameters into account, the result is more accurate. For example, suppose the target temperature of the object corresponding to the target area is T, the gray value of the target area in the infrared image is X, the distance between the object and the infrared sensor measured by the distance sensor is D, the emissivity and reflectivity of the object determined from the calibration data are a and b respectively, and the atmospheric transmittance determined from the visible light sensor, the temperature and humidity sensors, and the calibration data is c. The target temperature T can then be determined according to formula (3):
T = F0(X)·F1(D)·F2(a)·F3(b)·F4(c)   formula (3)
where F0 is the mapping function from gray value to temperature value, F1 is the distance correction function, F2 is the emissivity correction function, F3 is the reflectivity correction function, and F4 is the atmospheric transmittance correction function. After the above calculation, a high-precision temperature measurement result is obtained.
In some embodiments, the correction coefficient corresponding to emissivity may be determined on the basis of the object's emissivity, the ambient humidity, the distance from the object to the infrared sensor, and/or the response difference parameter of the infrared sensor. The emissivity correction coefficient can be determined from the influence of emissivity on the measurement result; for example, the influence coefficient of emissivity on the measurement result can be expressed by formula (4):
[Formula (4) appears in the original as an image (PCTCN2020085914-appb-000001); it expresses the influence coefficient as a function of mEmiss, mRelHum, mDist, and mDebugPara.]
where mEmiss, mRelHum, mDist, and mDebugPara denote the emissivity of the object to be measured, the ambient humidity, the distance from the object to the infrared sensor, and the response difference parameter of the infrared sensor, respectively. From formula (4) it can be seen that when the emissivity is set lower than the actual emissivity of the object, the calculated influence coefficient is too large, the coefficient value is too large, and the measured temperature will be higher than the actual object temperature. Conversely, when the emissivity is set higher than the actual emissivity, the calculated influence coefficient is too small, the coefficient value is too small, and the measured temperature will be lower than the actual object temperature. The correction coefficient can therefore be determined from the influence coefficient of emissivity on the measurement result, and the measurement result is corrected accordingly.
To further explain the infrared image processing method provided by the present application, it is explained below with reference to a specific embodiment.
As shown in Fig. 3, in one application scenario of the infrared image processing method provided by the present application, the method may be used in an unmanned aerial vehicle (UAV) on which an infrared sensor 31, a visible light sensor 32, a distance sensor 33, a temperature sensor 34, and a humidity sensor 35 are mounted. The specific process of measuring the temperature of an external object from the infrared image collected by the infrared sensor on the UAV is shown in Fig. 4.
After the infrared image collected by the infrared sensor 31 is obtained, one or more preprocessing operations such as image correction, denoising, temperature drift compensation, contrast stretching, and detail enhancement may first be applied to it, and object detection or semantic segmentation may then be performed on the preprocessed infrared image to obtain the category of the object corresponding to the area to be measured in the infrared image. In some scenarios, if the resolution of the visible light image collected by the visible light sensor 32 is higher than that of the infrared image, semantic segmentation may instead be performed on the visible light image first to determine the object category corresponding to each pixel of the visible light image, and the category of the object corresponding to the area to be measured is then determined from the pixel mapping relationship between the visible light image and the infrared image. The pixel mapping relationship between the two images can be determined from the internal parameters of the infrared sensor 31, the internal parameters of the visible light sensor 32, and the positional relationship between the infrared sensor 31 and the visible light sensor 32. After the category of the object corresponding to the area to be measured has been determined, the correspondence between object categories and emissivity and reflectivity can be obtained from the pre-stored calibration data, and the emissivity and reflectivity of the object corresponding to the area to be measured are determined according to this correspondence.
The weather type of the current environment can then be determined from the visible light image collected by the visible light sensor 32, the ambient temperature collected by the temperature sensor 34, and the ambient humidity collected by the humidity sensor 35. For example, the weather types may be divided into sunny, rainy, cloudy, foggy, and so on; other classification schemes may of course also be used, and the present application is not limited in this respect. The correspondence between weather types and atmospheric transmittance is then obtained from the pre-stored calibration data, and the atmospheric transmittance of the current environment is determined according to this correspondence.
Further, the distance between the object corresponding to the area to be measured and the infrared sensor 31 can be determined from the data collected by the distance sensor 33. In this way, the temperature compensation parameters that affect the measurement result — the emissivity and reflectivity of the object corresponding to the area to be measured, its distance from the infrared sensor, the atmospheric transmittance, and so on — can all be determined. The gray value of the object corresponding to the area to be measured can then be determined from the infrared image, the temperature of the object is determined from the conversion relationship between temperature and gray level, and this temperature is then corrected according to the determined emissivity, reflectivity, distance from the infrared sensor, atmospheric transmittance, and other parameters to obtain the final high-precision temperature value.
Of course, in some embodiments, only one or a few of the parameters such as the object's emissivity, reflectivity, distance from the infrared sensor, and atmospheric transmittance may be measured, with default values used for the rest. In that case, the UAV may include one or several of the above sensors according to the parameters to be determined, and need not include all of them.
Correspondingly, the present application also provides an infrared image processing apparatus. As shown in Fig. 5, the infrared image processing apparatus 50 includes a processor 51, a memory 52, and computer instructions stored on the memory and executable by the processor, wherein when the processor 51 executes the computer instructions, the following steps are implemented:
determining the value of a temperature compensation parameter of a target area in an infrared image collected by an infrared sensor, the value of the temperature compensation parameter being determined on the basis of data collected by one or more sensors and/or pre-stored calibration data of the temperature compensation parameter; and
determining the target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter.
In some embodiments, the temperature compensation parameter includes one or more of the following: the atmospheric transmittance of the environment in which the infrared sensor is located, the distance from the object corresponding to the target area to the infrared sensor, and a characteristic parameter related to the characteristics of the object corresponding to the target area.
In some embodiments, the calibration data includes the correspondence between object categories and values of the characteristic parameter.
In some embodiments, the temperature compensation parameter includes a characteristic parameter related to the characteristics of the object corresponding to the target area, and the value of the characteristic parameter of the object corresponding to the target area is determined on the basis of the correspondence.
In some embodiments, when the processor is configured to determine the value of the characteristic parameter of the object corresponding to the target area on the basis of the correspondence, the processor is specifically configured to:
determine the category of the object corresponding to the target area, and determine the value of the characteristic parameter of the object corresponding to the target area according to the correspondence.
In some embodiments, when the processor is configured to determine the category of the object corresponding to the target area, the processor is specifically configured to:
perform object detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area; or,
where the other sensors include a visible light sensor, determine the category of the object corresponding to each pixel of the visible light image collected by the visible light sensor, and determine the category of the object corresponding to the target area on the basis of the category of the object corresponding to each pixel in the visible light image and the pixel mapping relationship between the visible light image and the infrared image.
In some embodiments, the pixel mapping relationship is determined on the basis of the internal parameters of the infrared sensor, the internal parameters of the visible light sensor, and the positional relationship between the infrared sensor and the visible light sensor.
In some embodiments, before performing object detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area, the processor is further configured to:
preprocess the infrared image, the preprocessing including one or more of the following: image correction, noise removal, temperature drift compensation, contrast stretching, and detail enhancement.
In some embodiments, the characteristic parameter includes emissivity and/or reflectivity.
In some embodiments, the calibration data includes the correspondence between weather types and values of atmospheric transmittance.
In some embodiments, the temperature compensation parameter includes the atmospheric transmittance of the environment in which the infrared sensor is located, and the value of the atmospheric transmittance is determined on the basis of the correspondence and the weather type of the environment in which the infrared sensor is located.
In some embodiments, the other sensors include a visible light sensor, and the weather type is determined on the basis of the visible light image collected by the visible light sensor.
In some embodiments, the other sensors include a temperature sensor and/or a humidity sensor, and the weather type is determined on the basis of the ambient temperature collected by the temperature sensor and/or the ambient humidity collected by the humidity sensor.
In some embodiments, the weather type includes one or more of the following: cloudy, sunny, rainy, foggy.
In some embodiments, the other sensors include a distance sensor, the temperature compensation parameter includes the distance from the object corresponding to the target area to the infrared sensor, and that distance is determined on the basis of the distance sensor.
In some embodiments, the distance sensor includes one or more of the following: a laser ranging sensor, a TOF sensor, a lidar, an ultrasonic ranging sensor, and a terahertz ranging sensor.
In some embodiments, when the processor is configured to determine the temperature of the object corresponding to the target area according to the value of the temperature compensation parameter, the processor is specifically configured to:
determine a first temperature of the object corresponding to the target area according to the gray value of the target area;
input the value of the temperature compensation parameter into the corresponding correction function to obtain a correction coefficient; and
correct the first temperature using the correction coefficient to obtain the target temperature of the object corresponding to the target area.
In some embodiments, the temperature compensation parameter includes the emissivity of the object corresponding to the target area, and the correction coefficient is determined on the basis of the emissivity of the object, the ambient humidity, the distance from the object to the infrared sensor, and/or the response difference parameter of the infrared sensor.
For the specific implementation details of how the infrared image processing apparatus processes the infrared image, reference may be made to the description of the infrared image processing method above, which is not repeated here.
Further, the present application also provides an infrared image processing system. As shown in Fig. 6a, the infrared image processing system includes an infrared sensor 61 and an infrared image processing apparatus 62, the infrared image processing apparatus including a processor 621, a memory 622, and computer instructions stored on the memory and executable by the processor, wherein when the processor executes the computer instructions, the following steps are implemented:
determining the value of a temperature compensation parameter of a target area in an infrared image collected by the infrared sensor, the value of the temperature compensation parameter being determined on the basis of data collected by one or more sensors and/or pre-stored calibration data of the temperature compensation parameter; and
determining the target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter.
In some embodiments, the temperature compensation parameter includes one or more of the following: the atmospheric transmittance of the environment in which the infrared sensor is located, the distance from the object corresponding to the target area to the infrared sensor, and a characteristic parameter related to the characteristics of the object corresponding to the target area.
In some embodiments, the calibration data includes the correspondence between weather types and values of atmospheric transmittance.
In some embodiments, the temperature compensation parameter includes the atmospheric transmittance of the environment in which the infrared sensor is located, and the value of the atmospheric transmittance is determined on the basis of the correspondence and the weather type of the environment in which the infrared sensor is located.
In some embodiments, as shown in Fig. 6b, the infrared image processing system further includes a visible light sensor 63 for collecting visible light images, and the weather type is determined on the basis of the visible light image.
In some embodiments, as shown in Fig. 6c, the infrared image processing system further includes a temperature sensor and/or a humidity sensor 64, the temperature sensor being used to collect the ambient temperature and the humidity sensor being used to collect the ambient humidity, and the weather type is determined on the basis of the ambient temperature and/or the ambient humidity.
In some embodiments, the calibration data includes the correspondence between object categories and values of the characteristic parameter.
In some embodiments, the temperature compensation parameter includes a characteristic parameter of the object corresponding to the target area, and the value of the characteristic parameter of the object corresponding to the target area is determined on the basis of the correspondence.
In some embodiments, when the processor is configured to determine the value of the characteristic parameter of the object corresponding to the target area on the basis of the correspondence, the processor is specifically configured to:
determine the category of the object corresponding to the target area, and determine the value of the characteristic parameter of the object corresponding to the target area according to the correspondence.
In some embodiments, as shown in Fig. 6b, the infrared image processing system further includes a visible light sensor 63 for collecting visible light images, and when the processor is configured to determine the category of the object corresponding to the target area, the processor is specifically configured to:
determine the category of the object corresponding to each pixel of the visible light image, and determine the category of the object corresponding to the target area on the basis of the category of the object corresponding to each pixel in the visible light image and the pixel mapping relationship between the visible light image and the infrared image; or
perform object detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area.
In some embodiments, as shown in Fig. 6d, the infrared image processing system further includes a distance sensor 65, the temperature compensation parameter includes the distance from the object corresponding to the target area to the infrared sensor, and the distance sensor is used to determine the distance.
In some embodiments, the distance sensor includes one or more of the following: a laser ranging sensor, a TOF sensor, a lidar, an ultrasonic ranging sensor, and a terahertz ranging sensor.
In some embodiments, when the processor is configured to determine the temperature of the object corresponding to the target area according to the value of the temperature compensation parameter, the processor is specifically configured to:
determine a first temperature of the object corresponding to the target area according to the gray value of the target area;
input the value of the temperature compensation parameter into the corresponding correction function to obtain a correction coefficient; and
correct the first temperature using the correction coefficient to obtain the target temperature of the object corresponding to the target area.
In some embodiments, the temperature compensation parameter includes the emissivity of the object corresponding to the target area, and the correction coefficient is determined on the basis of the emissivity of the object, the ambient humidity, the distance from the object to the infrared sensor, and/or the response difference parameter of the infrared sensor.
Further, the present application also provides a movable platform, which includes the infrared image processing system of any one of the above embodiments. The movable platform may be an unmanned aerial vehicle, an unmanned vehicle, an unmanned ship, or the like.
Correspondingly, an embodiment of this specification also provides a computer storage medium in which a program is stored; when the program is executed by a processor, the infrared image processing method of any one of the above embodiments is implemented.
The embodiments of this specification may take the form of a computer program product implemented on one or more storage media (including but not limited to disk storage, CD-ROM, optical storage, and the like) containing program code. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to: phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
As for the apparatus embodiments, since they substantially correspond to the method embodiments, reference may be made to the relevant parts of the description of the method embodiments. The apparatus embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments. Those of ordinary skill in the art can understand and implement them without creative effort.
It should be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another and do not necessarily require or imply any such actual relationship or order between these entities or operations. The terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
The method and apparatus provided by the embodiments of the present invention have been described in detail above. Specific examples have been used herein to explain the principles and implementations of the present invention, and the description of the above embodiments is intended only to help understand the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, changes may be made to the specific implementations and the scope of application in accordance with the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (52)

  1. An infrared image processing method, characterized in that the method comprises:
    determining the value of a temperature compensation parameter of a target area in an infrared image collected by an infrared sensor, the value of the temperature compensation parameter being determined on the basis of data collected by one or more other sensors and/or pre-stored calibration data of the temperature compensation parameter; and
    determining the target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter.
  2. The method according to claim 1, characterized in that the temperature compensation parameter comprises one or more of the following: the atmospheric transmittance of the environment in which the infrared sensor is located, the distance from the object corresponding to the target area to the infrared sensor, and a characteristic parameter related to the characteristics of the object corresponding to the target area.
  3. The method according to claim 1 or 2, characterized in that the calibration data comprises the correspondence between object categories and values of the characteristic parameter.
  4. The method according to claim 3, characterized in that the temperature compensation parameter comprises a characteristic parameter related to the characteristics of the object corresponding to the target area, and the value of the characteristic parameter of the object corresponding to the target area is determined on the basis of the correspondence.
  5. The method according to claim 4, characterized in that determining the value of the characteristic parameter of the object corresponding to the target area on the basis of the correspondence specifically comprises:
    determining the category of the object corresponding to the target area, and determining the value of the characteristic parameter of the object corresponding to the target area according to the correspondence.
  6. The method according to claim 5, characterized in that determining the category of the object corresponding to the target area comprises:
    performing object detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area; or,
    the other sensors comprising a visible light sensor, determining the category of the object corresponding to each pixel of the visible light image collected by the visible light sensor, and determining the category of the object corresponding to the target area on the basis of the category of the object corresponding to each pixel in the visible light image and the pixel mapping relationship between the visible light image and the infrared image.
  7. The method according to claim 6, characterized in that the pixel mapping relationship is determined on the basis of the internal parameters of the infrared sensor, the internal parameters of the visible light sensor, and the positional relationship between the infrared sensor and the visible light sensor.
  8. The method according to claim 6, characterized in that, before performing object detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area, the method further comprises:
    preprocessing the infrared image, the preprocessing comprising one or more of the following: image correction, noise removal, temperature drift compensation, contrast stretching, and detail enhancement.
  9. The method according to any one of claims 2 to 8, wherein the characteristic parameter comprises emissivity and/or reflectivity.
  10. The method according to any one of claims 1 to 9, characterized in that the calibration data comprises the correspondence between weather types and values of atmospheric transmittance.
  11. The method according to claim 10, characterized in that the temperature compensation parameter comprises the atmospheric transmittance of the environment in which the infrared sensor is located, and the value of the atmospheric transmittance is determined on the basis of the correspondence and the weather type of the environment in which the infrared sensor is located.
  12. The method according to claim 11, characterized in that the other sensors comprise a visible light sensor, and the weather type is determined on the basis of the visible light image collected by the visible light sensor.
  13. The method according to claim 11 or 12, characterized in that the other sensors further comprise a temperature sensor and/or a humidity sensor, and the weather type is determined on the basis of the ambient temperature collected by the temperature sensor and/or the ambient humidity collected by the humidity sensor.
  14. The method according to any one of claims 10 to 13, characterized in that the weather type comprises one or more of the following: cloudy, sunny, rainy, foggy.
  15. The method according to claim 1, characterized in that the other sensors comprise a distance sensor, the temperature compensation parameter comprises the distance from the object corresponding to the target area to the infrared sensor, and the distance from the object corresponding to the target area to the infrared sensor is determined on the basis of the distance sensor.
  16. The method according to claim 15, characterized in that the distance sensor comprises one or more of the following: a laser ranging sensor, a TOF sensor, a lidar, an ultrasonic ranging sensor, and a terahertz ranging sensor.
  17. The method according to claim 1, characterized in that determining the temperature of the object corresponding to the target area according to the value of the temperature compensation parameter comprises:
    determining a first temperature of the object corresponding to the target area according to the gray value of the target area;
    inputting the value of the temperature compensation parameter into the corresponding correction function to obtain a correction coefficient; and
    correcting the first temperature using the correction coefficient to obtain the target temperature of the object corresponding to the target area.
  18. The method according to claim 17, characterized in that the temperature compensation parameter comprises the emissivity of the object corresponding to the target area, and the correction coefficient is determined on the basis of the emissivity of the object, the ambient humidity, the distance from the object to the infrared sensor, and/or the response difference parameter of the infrared sensor.
  19. An infrared image processing apparatus, characterized by comprising a processor, a memory, and computer instructions stored on the memory and executable by the processor, wherein when the processor executes the computer instructions, the following steps are implemented:
    determining the value of a temperature compensation parameter of a target area in an infrared image collected by an infrared sensor, the value of the temperature compensation parameter being determined on the basis of data collected by one or more sensors and/or pre-stored calibration data of the temperature compensation parameter; and
    determining the target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter.
  20. The apparatus according to claim 19, characterized in that the temperature compensation parameter comprises one or more of the following: the atmospheric transmittance of the environment in which the infrared sensor is located, the distance from the object corresponding to the target area to the infrared sensor, and a characteristic parameter related to the characteristics of the object corresponding to the target area.
  21. The apparatus according to claim 19 or 20, characterized in that the calibration data comprises the correspondence between object categories and values of the characteristic parameter.
  22. The apparatus according to claim 21, characterized in that the temperature compensation parameter comprises a characteristic parameter related to the characteristics of the object corresponding to the target area, and the value of the characteristic parameter of the object corresponding to the target area is determined on the basis of the correspondence.
  23. The apparatus according to claim 22, characterized in that, when the processor is configured to determine the value of the characteristic parameter of the object corresponding to the target area on the basis of the correspondence, the processor is specifically configured to:
    determine the category of the object corresponding to the target area, and determine the value of the characteristic parameter of the object corresponding to the target area according to the correspondence.
  24. The apparatus according to claim 23, characterized in that, when the processor is configured to determine the category of the object corresponding to the target area, the processor is specifically configured to:
    perform object detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area; or,
    the other sensors comprising a visible light sensor, determine the category of the object corresponding to each pixel of the visible light image collected by the visible light sensor, and determine the category of the object corresponding to the target area on the basis of the category of the object corresponding to each pixel in the visible light image and the pixel mapping relationship between the visible light image and the infrared image.
  25. The apparatus according to claim 24, characterized in that the pixel mapping relationship is determined on the basis of the internal parameters of the infrared sensor, the internal parameters of the visible light sensor, and the positional relationship between the infrared sensor and the visible light sensor.
  26. The apparatus according to claim 24, characterized in that, before performing object detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area, the processor is further configured to:
    preprocess the infrared image, the preprocessing comprising one or more of the following: image correction, noise removal, temperature drift compensation, contrast stretching, and detail enhancement.
  27. The apparatus according to any one of claims 20 to 26, wherein the characteristic parameter comprises emissivity and/or reflectivity.
  28. The apparatus according to any one of claims 19 to 27, characterized in that the calibration data comprises the correspondence between weather types and values of atmospheric transmittance.
  29. The apparatus according to claim 28, characterized in that the temperature compensation parameter comprises the atmospheric transmittance of the environment in which the infrared sensor is located, and the value of the atmospheric transmittance is determined on the basis of the correspondence and the weather type of the environment in which the infrared sensor is located.
  30. The apparatus according to claim 29, characterized in that the other sensors comprise a visible light sensor, and the weather type is determined on the basis of the visible light image collected by the visible light sensor.
  31. The apparatus according to claim 29 or 30, characterized in that the other sensors comprise a temperature sensor and/or a humidity sensor, and the weather type is determined on the basis of the ambient temperature collected by the temperature sensor and/or the ambient humidity collected by the humidity sensor.
  32. The apparatus according to any one of claims 28 to 31, characterized in that the weather type comprises one or more of the following: cloudy, sunny, rainy, foggy.
  33. The apparatus according to claim 19, characterized in that the other sensors comprise a distance sensor, the temperature compensation parameter comprises the distance from the object corresponding to the target area to the infrared sensor, and the distance from the object corresponding to the target area to the infrared sensor is determined on the basis of the distance sensor.
  34. The apparatus according to claim 33, characterized in that the distance sensor comprises one or more of the following: a laser ranging sensor, a TOF sensor, a lidar, an ultrasonic ranging sensor, and a terahertz ranging sensor.
  35. The apparatus according to claim 19, characterized in that, when the processor is configured to determine the temperature of the object corresponding to the target area according to the value of the temperature compensation parameter, the processor is specifically configured to:
    determine a first temperature of the object corresponding to the target area according to the gray value of the target area;
    input the value of the temperature compensation parameter into the corresponding correction function to obtain a correction coefficient; and
    correct the first temperature using the correction coefficient to obtain the target temperature of the object corresponding to the target area.
  36. The apparatus according to claim 35, characterized in that the temperature compensation parameter comprises the emissivity of the object corresponding to the target area, and the correction coefficient is determined on the basis of the emissivity of the object, the ambient humidity, the distance from the object to the infrared sensor, and/or the response difference parameter of the infrared sensor.
  37. An infrared image processing system, comprising an infrared sensor and an infrared image processing apparatus, the infrared image processing apparatus comprising a processor, a memory, and computer instructions stored on the memory and executable by the processor, wherein the processor, when executing the computer instructions, implements the following steps:
    determining a value of a temperature compensation parameter for a target area in an infrared image captured by the infrared sensor, the value of the temperature compensation parameter being determined based on data collected by one or more sensors and/or pre-stored calibration data of the temperature compensation parameter;
    determining a target temperature of an object corresponding to the target area according to the value of the temperature compensation parameter.
  38. The infrared image processing system according to claim 37, wherein the temperature compensation parameter comprises one or more of the following: an atmospheric transmittance of the environment in which the infrared sensor is located, a distance from the object corresponding to the target area to the infrared sensor, and a characteristic parameter related to an inherent characteristic of the object corresponding to the target area.
  39. The infrared image processing system according to claim 37, wherein the calibration data comprises a correspondence between weather types and values of atmospheric transmittance.
  40. The infrared image processing system according to claim 39, wherein the temperature compensation parameter comprises the atmospheric transmittance of the environment in which the infrared sensor is located, and the value of the atmospheric transmittance is determined based on the correspondence and a weather type of the environment in which the infrared sensor is located.
  41. The infrared image processing system according to claim 39 or 40, further comprising a visible-light sensor configured to capture a visible-light image, wherein the weather type is determined based on the visible-light image.
  42. The infrared image processing system according to any one of claims 39 to 41, further comprising a temperature sensor and/or a humidity sensor, the temperature sensor being configured to collect an ambient temperature and the humidity sensor being configured to collect an ambient humidity, wherein the weather type is determined based on the ambient temperature and/or the ambient humidity.
  43. The infrared image processing system according to claim 37, wherein the calibration data comprises a correspondence between object categories and values of the characteristic parameter.
  44. The infrared image processing system according to claim 43, wherein the temperature compensation parameter comprises a characteristic parameter of the object corresponding to the target area, and the value of the characteristic parameter of the object corresponding to the target area is determined based on the correspondence.
  45. The infrared image processing system according to claim 44, wherein, when determining the value of the characteristic parameter of the object corresponding to the target area based on the correspondence, the processor is specifically configured to:
    determine a category of the object corresponding to the target area, and determine the value of the characteristic parameter of the object corresponding to the target area according to the correspondence.
  46. The infrared image processing system according to claim 45, further comprising a visible-light sensor configured to capture a visible-light image, wherein, when determining the category of the object corresponding to the target area, the processor is specifically configured to:
    determine a category of the object corresponding to each pixel of the visible-light image, and determine the category of the object corresponding to the target area based on the category of the object corresponding to each pixel of the visible-light image and a pixel mapping relationship between the visible-light image and the infrared image; or
    perform object detection or semantic segmentation on the infrared image to determine the category of the object corresponding to the target area.
  47. The infrared image processing system according to any one of claims 37 to 46, further comprising a distance sensor, wherein the temperature compensation parameter comprises the distance from the object corresponding to the target area to the infrared sensor, and the distance sensor is configured to determine the distance.
  48. The infrared image processing system according to claim 47, wherein the distance sensor comprises one or more of the following: a laser ranging sensor, a time-of-flight (TOF) sensor, a lidar, an ultrasonic ranging sensor, and a terahertz ranging sensor.
  49. The infrared image processing system according to any one of claims 37 to 48, wherein, when determining the temperature of the object corresponding to the target area according to the value of the temperature compensation parameter, the processor is specifically configured to:
    determine a first temperature of the object corresponding to the target area according to gray values of the target area;
    input the value of the temperature compensation parameter into a corresponding correction function to obtain a correction coefficient;
    correct the first temperature using the correction coefficient to obtain the target temperature of the object corresponding to the target area.
  50. The infrared image processing system according to claim 49, wherein the temperature compensation parameter comprises an emissivity of the object corresponding to the target area, and the correction coefficient is determined based on the emissivity of the object, an ambient humidity, the distance from the object to the infrared sensor, and/or a response-difference parameter of the infrared sensor.
  51. A movable platform, comprising the infrared image processing system according to any one of claims 37 to 50.
  52. A computer-readable storage medium having a computer program stored thereon, wherein, when the computer program is executed by a processor, the infrared image processing method according to any one of claims 1 to 18 is implemented.
PCT/CN2020/085914 2020-04-21 2020-04-21 Infrared image processing method, apparatus and system, and movable platform WO2021212319A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/085914 WO2021212319A1 (zh) 2020-04-21 2020-04-21 Infrared image processing method, apparatus and system, and movable platform

Publications (1)

Publication Number Publication Date
WO2021212319A1 true WO2021212319A1 (zh) 2021-10-28

Family

ID=78271056

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085914 WO2021212319A1 (zh) 2020-04-21 2020-04-21 Infrared image processing method, apparatus and system, and movable platform

Country Status (1)

Country Link
WO (1) WO2021212319A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104155008A (zh) * 2014-07-28 2014-11-19 上海电力学院 Measurement error correction method for an infrared temperature monitoring system
CN108731819A (zh) * 2017-04-18 2018-11-02 亚迪电子股份有限公司 Thermal image detection apparatus
CN108775963A (zh) * 2018-07-27 2018-11-09 合肥英睿系统技术有限公司 Reflection-affected infrared temperature measurement correction method, apparatus, device, and storage medium
US20180372547A1 (en) * 2017-06-22 2018-12-27 Asahi Kasei Kabushiki Kaisha Radiation temperature measuring device
CN109655162A (zh) * 2018-11-30 2019-04-19 诺仪器(中国)有限公司 Temperature measurement correction system and method for an infrared thermal imager
CN109990907A (zh) * 2017-12-29 2019-07-09 宁波方太厨具有限公司 Infrared parameter measurement apparatus and method for a target body
CN110307904A (zh) * 2019-08-07 2019-10-08 宁波舜宇红外技术有限公司 Temperature measurement apparatus and method for an infrared thermal imaging lens

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114037633A (zh) * 2021-11-18 2022-02-11 南京智谱科技有限公司 Infrared image processing method and apparatus
CN114037633B (zh) * 2021-11-18 2022-07-15 南京智谱科技有限公司 Infrared image processing method and apparatus
CN117387775A (zh) * 2023-12-12 2024-01-12 深圳市云帆自动化技术有限公司 Infrared and wireless temperature measurement monitoring system for electrical equipment
CN117387775B (zh) * 2023-12-12 2024-02-20 深圳市云帆自动化技术有限公司 Infrared and wireless temperature measurement monitoring system for electrical equipment

Similar Documents

Publication Publication Date Title
CN110988912B Road target and distance detection method, system, and apparatus for an autonomous vehicle
CN109961468B Binocular-vision-based volume measurement method and apparatus, and storage medium
de Moraes Frasson et al. Three-dimensional digital model of a maize plant
TW202004560A Object detection system, autonomous vehicle, and object detection method thereof
Tatoglu et al. Point cloud segmentation with LIDAR reflection intensity behavior
WO2021212319A1 (zh) Infrared image processing method, apparatus and system, and movable platform
CN112379352B Lidar calibration method, apparatus, device, and storage medium
CN112562093B Object detection method, electronic medium, and computer storage medium
AU2011226732A1 Sensor data processing
CN112257692A Pedestrian target detection method, electronic device, and storage medium
CN113205604A Drivable-area detection method based on a camera and lidar
CN114155501A Target detection method for an unmanned vehicle in a smoke-obscured environment
CN114089329A Target detection method based on fusion of long- and short-focal-length cameras and millimeter-wave radar
CN111239684A Fast binocular distance measurement method based on YoloV3 deep learning
CN114399882A Fire-source detection, recognition, and early-warning method for a firefighting robot
WO2020027167A1 System, method, and non-transitory, computer-readable medium containing instructions for image processing
Chen et al. A novel calibration method between a camera and a 3D LiDAR with infrared images
CN113327296A Online joint calibration method for a lidar and camera based on depth weighting
CN115376109A Obstacle detection method, obstacle detection apparatus, and storage medium
CN113256493B Thermal infrared remote sensing image reconstruction method and apparatus
CN117215316A Driving-environment perception method and system based on cooperative control and deep learning
CN115937325B Vehicle-side camera calibration method incorporating millimeter-wave radar information
CN115656991A Vehicle extrinsic-parameter calibration method, apparatus, device, and storage medium
WO2021258282A1 Target detection device and method, imaging apparatus, and movable platform
CN114529884A Binocular-camera-based obstacle detection and processing method, apparatus, device, and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20932055

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20932055

Country of ref document: EP

Kind code of ref document: A1