CN111982305A - Temperature measuring method, device and computer storage medium

Temperature measuring method, device and computer storage medium

Info

Publication number
CN111982305A
CN111982305A
Authority
CN
China
Prior art keywords
sub
region
area
temperature measurement
temperature
Prior art date
Legal status
Pending
Application number
CN202010880084.8A
Other languages
Chinese (zh)
Inventor
李佳
王小东
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202010880084.8A
Publication of CN111982305A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0022 Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
    • G01J5/0025 Living bodies
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077 Imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Radiation Pyrometers (AREA)

Abstract

The application discloses a temperature measurement method, a temperature measurement device, and a computer storage medium, and belongs to the technical field of monitoring. The method includes: determining a thermal imaging image of a monitoring area according to thermal radiation data of the monitoring area; acquiring a local temperature measurement parameter corresponding to a first sub-region; and determining temperature information of the first sub-region according to the local temperature measurement parameter and the thermal radiation values of the shooting points corresponding to the pixel points in the first sub-region. The physical environment characteristics of the first sub-region often differ greatly from those of the whole monitoring area, and the local temperature measurement parameter is related to the physical environment characteristics of the first sub-region. Therefore, in the embodiments of the application, the temperature information of any first sub-region is determined based on the local temperature measurement parameter of that first sub-region, which solves the problem in the prior art that the temperature information of the first sub-region is inaccurate because all first sub-regions use the same set of fixed temperature measurement parameters.

Description

Temperature measuring method, device and computer storage medium
Technical Field
The present disclosure relates to monitoring technologies, and in particular, to a temperature measuring method and apparatus, and a computer storage medium.
Background
A terminal can convert the invisible infrared energy radiated by an object into a visible thermal imaging image based on thermal imaging techniques. Different colors in the thermal imaging image represent different thermal radiation values, and the infrared energy radiated by an object is related to its temperature, so the temperature of objects in a monitored area can be monitored visually through the thermal imaging image. If the temperature information determined from the thermal imaging image is not accurate enough, erroneous monitoring results may follow; therefore, when the temperature of a monitored area is monitored, the accuracy of the temperature information determined from the thermal imaging image needs to be improved.
Disclosure of Invention
The embodiment of the application provides a temperature measurement method, a temperature measurement device and a computer storage medium, which can improve the accuracy of temperature measurement. The technical scheme is as follows:
in one aspect, a temperature measurement method is provided, the method including:
determining a thermal imaging image of a monitoring area according to thermal radiation data of the monitoring area, wherein pixel values of pixel points in the thermal imaging image are used for indicating thermal radiation values of shooting points corresponding to the pixel points;
acquiring a local temperature measurement parameter corresponding to a first subregion in the thermal imaging image, wherein the local temperature measurement parameter is related to physical environment characteristics of the first subregion, and the first subregion is any region in the thermal imaging image;
and determining the temperature information of the first sub-area according to the local temperature measurement parameters and the heat radiation value of the shooting point corresponding to the pixel point in the first sub-area.
Optionally, the determining the temperature information of the first sub-region according to the local temperature measurement parameter and the thermal radiation value of the shooting point corresponding to the pixel point in the first sub-region includes:
converting the thermal radiation value of the shooting point corresponding to the pixel point in the first sub-area according to the local temperature measurement parameter to obtain the temperature of the shooting point corresponding to the pixel point in the first sub-area;
and determining temperature information of the first sub-area according to the temperature of a shooting point corresponding to the pixel point in the first sub-area, wherein the temperature information comprises one or more of the lowest temperature, the highest temperature and the average temperature.
Optionally, after determining the temperature information of the first sub-region according to the local temperature measurement parameter and the pixel value of the pixel point in the first sub-region, the method further includes:
displaying temperature information of the first sub-area in the first sub-area, wherein the temperature information comprises one or more of a lowest temperature, a highest temperature and an average temperature.
Optionally, the obtaining of the local temperature measurement parameter corresponding to the first sub-region includes:
acquiring sub-region temperature measurement configuration information, wherein the sub-region temperature measurement configuration information comprises boundary information of a plurality of second sub-regions and local temperature measurement parameters of the plurality of second sub-regions, and the plurality of second sub-regions are a plurality of sub-regions in the monitoring region;
determining a second sub-area to which the first sub-area belongs according to the boundary information of the plurality of second sub-areas;
and determining the local temperature measurement parameter of the second sub-region as the local temperature measurement parameter corresponding to the first sub-region.
Optionally, the method further includes:
detecting a region setting instruction, wherein the region setting instruction carries the sub-region temperature measurement configuration information;
and storing the temperature measurement configuration information of the sub-region.
Optionally, the acquiring the local temperature measurement parameter corresponding to the first sub-region includes:
and under the condition that a region updating instruction for a third sub-region in the thermal imaging image is detected, determining the updated third sub-region as the first sub-region, and determining a local temperature measurement parameter corresponding to the third sub-region as a local temperature measurement parameter corresponding to the first sub-region.
Optionally, the physical environment characteristics include one or more of an ambient temperature, an ambient humidity, a reflection temperature, and a reflectivity of the shooting area corresponding to the first sub-region, a distance between the thermal imaging lens and a shooting target in the shooting area, and a depth of field of the shooting area.
In another aspect, there is provided a temperature measuring device, the device comprising:
the system comprises a first determining module, a second determining module and a control module, wherein the first determining module is used for determining a thermal imaging image of a monitoring area according to thermal radiation data of the monitoring area, and pixel values of pixel points in the thermal imaging image are used for indicating thermal radiation values of shooting points corresponding to the pixel points;
an obtaining module, configured to obtain a local temperature measurement parameter corresponding to a first sub-region in the thermal imaging image, where the local temperature measurement parameter is related to a physical environment characteristic of the first sub-region, and the first sub-region is any region in the thermal imaging image;
and the second determining module is used for determining the temperature information of the first sub-area according to the local temperature measurement parameters and the heat radiation value of the shooting point corresponding to the pixel point in the first sub-area.
Optionally, the second determining module includes:
the conversion sub-module is used for converting the heat radiation value of the shooting point corresponding to the pixel point in the first sub-area according to the local temperature measurement parameter to obtain the temperature of the shooting point corresponding to the pixel point in the first sub-area;
the first determining submodule is used for determining temperature information of the first sub-area according to the temperature of a shooting point corresponding to a pixel point in the first sub-area, and the temperature information comprises one or more of the lowest temperature, the highest temperature and the average temperature.
Optionally, the apparatus further comprises:
the display module is used for displaying the temperature information of the first sub-area in the first sub-area, wherein the temperature information comprises one or more of the lowest temperature, the highest temperature and the average temperature.
Optionally, the obtaining module includes:
the acquisition submodule is used for acquiring sub-region temperature measurement configuration information, wherein the sub-region temperature measurement configuration information comprises boundary information of a plurality of second sub-regions and local temperature measurement parameters of the plurality of second sub-regions, and the plurality of second sub-regions are a plurality of sub-regions in the monitoring region;
the second determining submodule is used for determining a second sub-area to which the first sub-area belongs according to the boundary information of the plurality of second sub-areas;
and the third determining submodule is used for determining the local temperature measurement parameter of the second sub-area as the local temperature measurement parameter corresponding to the first sub-area.
Optionally, the apparatus further comprises:
the detection module is used for detecting a region setting instruction, and the region setting instruction carries the sub-region temperature measurement configuration information;
and the storage module is used for storing the temperature measurement configuration information of the sub-region.
Optionally, the obtaining module is configured to:
and under the condition that a region updating instruction for a third sub-region in the thermal imaging image is detected, determining the updated third sub-region as the first sub-region, and determining a local temperature measurement parameter corresponding to the third sub-region as a local temperature measurement parameter corresponding to the first sub-region.
Optionally, the physical environment characteristics include one or more of an ambient temperature, an ambient humidity, a reflection temperature, and a reflectivity of the shooting area corresponding to the first sub-region, a distance between the thermal imaging lens and a shooting target in the shooting area, and a depth of field of the shooting area.
In another aspect, a temperature measurement device is provided, the temperature measurement device comprising a processor, a memory;
wherein the memory is used for storing computer programs;
the processor is used for executing the program stored in the memory so as to realize the temperature measuring method.
In another aspect, a computer-readable storage medium is provided, in which a computer program is stored which, when being executed by a processor, carries out the steps of the temperature measurement method as provided above.
In another aspect, a computer program product is provided comprising instructions which, when run on a computer, cause the computer to perform the steps of the temperature measurement method provided in the foregoing.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
because the physical environment characteristic of the first sub-region and the physical environment characteristic of the whole monitoring region are often greatly different, and the local temperature measurement parameter is related to the physical environment characteristic of the first sub-region, in the embodiment of the present application, for any first sub-region, the temperature information of the first sub-region is determined based on the local temperature measurement parameter of the first sub-region, so that the accuracy of the determined temperature information can be improved by the method provided by the embodiment of the present application.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a block diagram of a temperature measurement system according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of a temperature measurement method provided by an embodiment of the present application;
FIG. 3 is a flow chart of a method of generating thermal images provided by an embodiment of the present application;
fig. 4 is a schematic diagram of a terminal display interface provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of another terminal display interface provided in an embodiment of the present application;
FIG. 6 is a flowchart of a method for determining a thermometry parameter according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of another terminal display interface provided in an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a temperature measuring device according to an embodiment of the present disclosure;
fig. 9 is a block diagram of a terminal according to an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Before explaining the embodiments of the present application in detail, an application scenario related to the embodiments of the present application will be described.
All objects in nature, whether arctic glaciers, flames, human bodies, or even the extremely cold deep space of the universe, emit infrared radiation as long as their temperature is above absolute zero. This is due to the thermal movement of molecules within the object. The radiated energy is proportional to the fourth power of the object's own temperature, and the wavelength of the radiation is inversely proportional to the object's temperature. Thermal imaging technology converts the detected radiation energy of objects, through system processing, into a thermal imaging image of the whole monitored area, which is displayed as a grayscale image or a pseudo-color image. In this way the temperature distribution of the whole monitored area is obtained, so that the temperature state of each object in the monitored area can be judged.
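As a purely illustrative aside (not part of the patented method), the following sketch evaluates the two physical relationships just mentioned, the Stefan-Boltzmann law for total radiated energy and Wien's displacement law for the radiated wavelength, for a few temperatures:

```python
# Illustrative only: Stefan-Boltzmann law (total radiated power per unit area)
# and Wien's displacement law (peak wavelength) for an ideal black body.
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W / (m^2 * K^4)
WIEN_B = 2.898e-3     # Wien's displacement constant, m * K

def radiant_exitance(temperature_k: float) -> float:
    """Total power radiated per unit area, proportional to T^4."""
    return SIGMA * temperature_k ** 4

def peak_wavelength(temperature_k: float) -> float:
    """Wavelength of peak emission, inversely proportional to T."""
    return WIEN_B / temperature_k

for t in (273.15, 310.15, 1500.0):   # ice, human body, flame-like temperature
    print(f"T = {t:7.2f} K  exitance = {radiant_exitance(t):10.1f} W/m^2  "
          f"peak wavelength = {peak_wavelength(t) * 1e6:.2f} um")
```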
In order to monitor the temperature of the whole monitored area more intuitively, temperature information of the whole monitored area, such as the maximum temperature, the minimum temperature, the average temperature, and the like, may generally be superimposed on the generated thermal imaging image. When the user pays special attention to a local area of the monitored area, the temperature of that local area can be monitored according to its temperature data. However, in the process of calculating the temperature data of each local area, the system often adopts uniform temperature measurement parameters, such as humidity, reflectivity, and the like, which results in a large error in the acquired temperature data of the local area. The temperature measurement method provided by the embodiments of the application is applied to this scenario to improve the accuracy of the determined temperature information.
Next, a system architecture related to the temperature measurement method provided in the embodiment of the present application is described.
Fig. 1 is an architecture diagram of a temperature measurement system according to an embodiment of the present disclosure. As shown in fig. 1, the system 100 includes a thermal imaging device 101 and a terminal 102, which are connected and communicate with each other in a wired or wireless manner.
The thermal imaging device 101 may be deployed with a thermal imaging lens and a temperature sensing chip. Based on the deployed thermal imaging lens and temperature sensing chip, the thermal imaging device 101 may collect the thermal radiation data of each object in the target area, and then transmit the collected thermal radiation data to the terminal 102 through a network. The target area is any area that needs temperature monitoring. The terminal 102 is equipped with thermal imaging temperature measurement software that converts the thermal radiation data of each object in the target area received by the terminal into a thermal imaging image. The thermal imaging temperature measurement software can perform this conversion online, in which case it may also be called thermal imaging online temperature measurement software. Optionally, the thermal imaging temperature measurement software may also convert the received thermal radiation data of each object in the target area into a thermal imaging image offline, which is not limited in the embodiments of the present application.
In addition, the thermal imaging temperature measurement software can also determine temperature information in the thermal imaging image based on the temperature measurement method provided by the embodiment of the application, and render the thermal imaging image and the temperature information in a preview window of the terminal through a rendering engine so as to realize temperature monitoring of the target area.
Furthermore, the thermal imaging device 101 may be an infrared detector, or may be another type of device that may be used to obtain thermal imaging images. The terminal 102 may be a tablet computer, a desktop computer, a mobile phone, or the like. The network for transmitting the thermal radiation data may be an ethernet network, or may be another type of network, which is not limited in this embodiment of the application.
Fig. 1 illustrates only 2 thermal imaging devices 101 and 1 terminal 102 by way of example, and does not limit the temperature measurement system provided by the embodiments of the present application.
The temperature measurement method provided in the embodiments of the present application will be explained in detail below.
Fig. 2 is a flowchart of a temperature measurement method according to an embodiment of the present application, where the method is applied to the terminal shown in fig. 1. Referring to fig. 2, the method includes the following steps.
Step 201: and determining a thermal imaging image of the monitoring area according to the thermal radiation data of the monitoring area, wherein the pixel value of the pixel point in the thermal imaging image is used for indicating the thermal radiation value of the shooting point corresponding to the pixel point.
The thermal radiation data indicates the thermal radiation energy emitted by an object and is proportional to the surface temperature of the object. The thermal radiation data is acquired by the thermal imaging device in fig. 1. Since the thermal radiation data is used for subsequent temperature measurement, it may also be referred to as raw temperature data.
In a possible implementation manner, the possible implementation procedures of step 201 are: the terminal sends a temperature detection instruction to the thermal imaging device, and the thermal imaging device responds to the temperature detection instruction, acquires thermal radiation data of each object in the monitoring area and transmits the thermal radiation data to the terminal. And the terminal converts the thermal radiation data of each object in the target area into a thermal imaging image according to the thermal imaging image temperature algorithm model. Wherein the thermographic image temperature algorithm model is used to convert thermal radiation data into a thermographic image.
In another possible implementation manner, in order to ensure the integrity of the thermal imaging image of the monitored area, a possible implementation procedure of step 201 is: the terminal sends a temperature detection instruction to the thermal imaging device; the thermal imaging device responds to the temperature detection instruction, acquires the thermal radiation data of each object in the monitoring area, verifies the thermal radiation data, and transmits the thermal radiation data to the terminal after the verification succeeds. The terminal then converts the thermal radiation data of each object in the target area into a thermal imaging image according to the thermal imaging image temperature algorithm model, where the thermal imaging image temperature algorithm model is used to convert thermal radiation data into a thermal imaging image.
The heat radiation data are verified to ensure that the current heat radiation data are the complete heat radiation data of each object in a corresponding frame of picture, and the phenomenon that the heat radiation data are incomplete due to network instability is avoided.
The above description takes the case where the thermal imaging device verifies the thermal radiation data as an example. Alternatively, the thermal radiation data may be verified by the terminal. In this scenario, the thermal imaging device responds to the temperature detection instruction sent by the terminal, obtains the thermal radiation data of each object in the monitoring area, and sends the thermal radiation data to the terminal. The terminal checks the thermal radiation data to determine whether the thermal imaging device has finished transmitting the complete thermal radiation data of one frame of picture. When it is determined that the complete thermal radiation data of one frame of picture has been received, the thermal imaging device has finished transmitting the complete thermal radiation data of that frame; at this point the terminal determines that the verification succeeds and converts the thermal radiation data of all objects in the target area into a thermal imaging image according to the thermal imaging image temperature algorithm model. If the verification does not succeed, the operation of converting the thermal radiation data into the thermal imaging image is not executed, and the terminal continues to wait for the thermal radiation data sent by the thermal imaging device until the verification succeeds.
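The patent does not specify how the completeness check is performed. As a minimal sketch of one possible terminal-side check, assuming each frame of thermal radiation data arrives as a fixed number of 16-bit values (the resolution, value width, and all names below are assumptions made only for illustration):

```python
import numpy as np

FRAME_WIDTH, FRAME_HEIGHT = 384, 288          # assumed sensor resolution
BYTES_PER_VALUE = 2                           # assumed 16-bit radiation values
EXPECTED_BYTES = FRAME_WIDTH * FRAME_HEIGHT * BYTES_PER_VALUE

def verify_frame(buffer: bytes) -> bool:
    """Return True only when a complete frame of radiation data was received."""
    return len(buffer) == EXPECTED_BYTES

def assemble_frame(chunks):
    """Accumulate network chunks; convert to a radiation matrix once complete."""
    buffer = b"".join(chunks)
    if not verify_frame(buffer):
        return None                            # keep waiting for more data
    values = np.frombuffer(buffer, dtype=np.uint16)
    return values.reshape(FRAME_HEIGHT, FRAME_WIDTH)
```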
In addition, in order to ensure the security of the terminal information, before the temperature detection instruction is generated, the correctness of the user name and password input by the user, or of the user information associated with the current terminal IP address, may be verified, and whether to generate the temperature detection instruction for the user to obtain the thermal imaging image is decided based on the result of that verification.
The specific function of the thermal imaging image temperature algorithm model is as follows: the configuration parameters input by a user and the thermal radiation data are used as input of the thermal imaging image temperature algorithm model, and through a series of processing the model outputs a thermal imaging image and global temperature information, where the global temperature information may include one or more of the highest temperature, the lowest temperature, and the average temperature of the whole monitoring area. The configuration parameters input by the user at least include global temperature measurement parameters, pseudo-color parameters, alarm parameters, and the like. The global temperature measurement parameters may be determined by the user in advance based on the average physical environment characteristics of the monitored area, or based on the physical environment characteristics of certain areas of interest in the monitored area. The physical environment characteristics will be explained in detail later and are not described here. The pseudo-color parameters include a series of parameters for generating a color image. The alarm parameters include an upper temperature limit for alarming, and the like.
As shown in fig. 3, the specific processing procedure of the thermal imaging image temperature algorithm model can be realized by the following steps L1 to L6.
Step L1: the heat radiation data is converted into grayscale data.
In one possible implementation, the thermal radiation values included in the thermal radiation data are first converted into grayscale data for generating a thermal imaging image according to the thermal radiation values of the respective objects and a mapping table of the thermal radiation values and the grayscale data.
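The mapping table itself is device-specific and is not given in the patent. The sketch below shows one way such a lookup could be applied, assuming 16-bit radiation values and a monotone calibration table mapping them to 8-bit gray levels (the toy two-point table is an assumption for illustration):

```python
import numpy as np

def radiation_to_gray(radiation: np.ndarray,
                      table_radiation: np.ndarray,
                      table_gray: np.ndarray) -> np.ndarray:
    """Map raw thermal radiation values to 8-bit grayscale via a lookup table.

    table_radiation / table_gray describe the (assumed) calibration mapping;
    values between table entries are linearly interpolated."""
    gray = np.interp(radiation.astype(np.float64), table_radiation, table_gray)
    return gray.astype(np.uint8)

# Toy two-point table: radiation 1000 -> gray 0, radiation 9000 -> gray 255.
radiation = np.array([[1000, 5000], [7000, 9000]], dtype=np.uint16)
gray = radiation_to_gray(radiation, np.array([1000, 9000]), np.array([0, 255]))
```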
Step L2: a gray scale map is generated based on the gray scale data, and the contrast of the gray scale map is adjusted.
In one possible implementation, the gray scale image may be adjusted to a suitable contrast by performing an automatic gain balancing process. Contrast is used to indicate the degree of contrast between portions of an image, and generally, the higher the contrast, the better the visual effect.
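The exact automatic gain balancing algorithm is not disclosed; a common stand-in is a percentile-based linear stretch, sketched below under that assumption:

```python
import numpy as np

def auto_gain(gray: np.ndarray, low_pct: float = 1.0,
              high_pct: float = 99.0) -> np.ndarray:
    """Stretch the grayscale histogram so that the central percentile range
    spans the full 0-255 range, which raises the contrast of flat images."""
    lo, hi = np.percentile(gray, [low_pct, high_pct])
    if hi <= lo:                      # degenerate (constant) image
        return gray.copy()
    stretched = (gray.astype(np.float64) - lo) / (hi - lo)
    return (np.clip(stretched, 0.0, 1.0) * 255).astype(np.uint8)
```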
Step L3: and converting the gray scale data corresponding to the adjusted gray scale map into color data.
In one possible implementation, the grayscale data is converted to color data for generating a thermal imaging image based on the grayscale data and a mapping between the grayscale data and the color data. Different types of color data can be generated from the grayscale data according to the pseudo-color mode set by the user. For example, the pseudo-color mode may be a white-hot mode, a rainbow mode, an iron-red mode, or the like.
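The palette details of the white-hot, rainbow, or iron-red modes are not given in the patent. The sketch below applies a generic 256-entry palette as a lookup table, which is a usual way of turning grayscale data into color data; both palettes shown are crude illustrative approximations, not the actual modes:

```python
import numpy as np

def apply_palette(gray: np.ndarray, palette: np.ndarray) -> np.ndarray:
    """Convert an 8-bit grayscale image to color using a 256x3 palette LUT."""
    assert palette.shape == (256, 3)
    return palette[gray]              # fancy indexing: one RGB triple per pixel

# White-hot: gray level maps directly to brightness (assumed, simple palette).
white_hot = np.stack([np.arange(256)] * 3, axis=1).astype(np.uint8)
# A crude iron-red-like ramp, purely illustrative.
iron_red = np.stack([np.clip(np.arange(256) * 2, 0, 255),
                     np.clip(np.arange(256) - 96, 0, 255) * 2,
                     np.clip(np.arange(256) - 192, 0, 255) * 4], axis=1)
iron_red = np.clip(iron_red, 0, 255).astype(np.uint8)

gray_image = np.tile(np.arange(256, dtype=np.uint8), (8, 1))   # 8x256 gradient
colored = apply_palette(gray_image, iron_red)                   # shape (8, 256, 3)
```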
Step L4: the color data is converted into a thermographic image.
In one possible implementation, the color data is converted into a thermographic image based on a mapping between the color data and pixel values of individual colors in the color spectrum.
Step L5: and determining the temperature information of the whole monitoring area according to the global temperature measurement parameters, and displaying the temperature information of the whole monitoring area in the thermal imaging image.
The temperature information may include one or more of a highest temperature, a lowest temperature, and an average temperature. Wherein, the temperature information of the whole monitoring area is also called full screen temperature information.
Step L6: the thermographic images are subjected to two-photon image fusion and a final thermographic image is generated.
The double-light image fusion is used for enabling the boundary outline of the thermal imaging image to be clearer, so that the visual effect of the thermal imaging image is improved.
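The fusion algorithm itself is not specified here. A frequently used baseline is a weighted blend of a registered visible-light image with the pseudo-color thermal image, sketched below under that assumption:

```python
import numpy as np

def fuse_dual_light(thermal_rgb: np.ndarray,
                    visible_rgb: np.ndarray,
                    thermal_weight: float = 0.6) -> np.ndarray:
    """Blend a pseudo-color thermal image with a registered visible-light image.

    Assumes both images are already aligned and share the same HxWx3 shape;
    the visible image contributes contour detail, the thermal image the colors."""
    assert thermal_rgb.shape == visible_rgb.shape
    fused = (thermal_weight * thermal_rgb.astype(np.float64)
             + (1.0 - thermal_weight) * visible_rgb.astype(np.float64))
    return np.clip(fused, 0, 255).astype(np.uint8)
```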
It should be noted that the thermal image obtained through the steps L1-L6 also includes temperature information of the whole monitored area. Alternatively, in the above steps L1-L6, the thermal imaging image temperature algorithm model may not determine the global temperature information for the whole monitored area first, but determine the temperature information of a certain local area through the following steps 202-203. In addition, when the temperature information of the local region is determined through the following steps 202-203, the thermal imaging image temperature algorithm model may also determine the temperature information of other regions according to the global thermometry parameters for other regions except the local region, and will not be described in detail here.
In addition, in the temperature measurement system shown in fig. 1, the thermal imaging device may acquire the thermal radiation data of only one picture, which is then processed by the terminal to realize temperature monitoring. Alternatively, the thermal imaging device may continuously acquire the thermal radiation data of a plurality of frames of pictures and then transmit them to the terminal as a data stream, so that the terminal performs temperature measurement based on the thermal radiation data of each frame separately; in this case, the processing targets the thermal radiation data of each frame of picture. The terminal may generate a thermal imaging image and global temperature information through step 201 and render them in a preview window. In this scenario, for any one frame of thermal imaging image displayed in the preview window, the temperature information of a certain local area in the current frame can be determined through the following steps 202 and 203. That is, when the thermal imaging device continuously acquires the thermal radiation data of a plurality of frames, the determined thermal imaging image of the monitoring area may refer to the thermal imaging images corresponding to those frames; provided that the position of the thermal imaging device does not change during acquisition, a sub-region determined in any one frame of thermal imaging image may serve as the sub-region for the thermal imaging images corresponding to all of the frames.
Step 202: and acquiring a local temperature measurement parameter corresponding to the first subregion in the thermal imaging image, wherein the local temperature measurement parameter is related to the physical environment characteristic of the first subregion.
The physical environment characteristics comprise one or more of the ambient temperature, the ambient humidity, the reflection temperature, and the reflectivity of the shooting area corresponding to the first sub-region, the distance between the thermal imaging lens and a shooting target in the shooting area, and the depth of field of the shooting area. The physical environment characteristics of a local area often differ greatly from those of the whole monitoring area, so in the process of calculating the temperature data of a local area, using the local temperature measurement parameters of that local area makes the obtained temperature data more accurate.
The distance between the thermal imaging lens and a shooting target in the shooting area is configured in advance. For example, when temperature monitoring is required for a stationary object in the shooting area, the distance between the thermal imaging lens and the stationary object may be specified in advance as the distance between the thermal imaging lens and the shooting target. Alternatively, if the shooting target is a moving object in the shooting area, the distance between the thermal imaging lens and the shooting target is set in advance after being estimated by the user.
The depth of field of the shooting area is the spatial range in front of and behind the shooting area that can fall within the camera's shooting range. In the process of acquiring thermal radiation data with the thermal imaging lens, different depths of field affect the accuracy of the acquired thermal radiation data. For example, when the temperature of an indoor area is monitored, the depth of field of the shooting area is small, so the thermal radiation data collected by the thermal imaging lens is relatively accurate. When the temperature of an outdoor area is monitored, the depth of field of the shooting area is large, so the thermal radiation data acquired by the thermal imaging lens has a larger deviation. Therefore, the depth of field of the shooting area can be taken into account when determining the local temperature measurement parameters.
It should be noted that, in the process of converting a thermal radiation value into a temperature, a temperature measurement parameter needs to be considered; the temperature measurement parameter refers to a factor that affects which thermal radiation value corresponds to a given temperature. For example, the conversion relationship between the thermal radiation value and the temperature can be expressed by the following functional relationship:
Y = f(a, X);
where Y indicates the temperature, X indicates the thermal radiation value, and a indicates the temperature measurement parameter. Obviously, for the same temperature, different temperature measurement parameters lead to different corresponding thermal radiation values; equivalently, under different temperature measurement parameters the same thermal radiation value corresponds to different temperatures.
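The patent leaves f unspecified. Purely to illustrate how a temperature measurement parameter changes the radiation-to-temperature conversion, the sketch below uses a simplified radiometric model in which the measured radiation is the sum of an emitted and a reflected component, with emissivity and reflected temperature standing in for the parameter a; this is an assumed model, not the patent's conversion:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 * K^4)

def temperature_from_radiation(x_radiation: float,
                               emissivity: float,
                               reflected_temp_k: float) -> float:
    """Illustrative f(a, X): invert X = e*sigma*T^4 + (1-e)*sigma*T_refl^4.

    emissivity and reflected_temp_k play the role of the local temperature
    measurement parameter a; this simplified model ignores atmospheric effects."""
    emitted = x_radiation - (1.0 - emissivity) * SIGMA * reflected_temp_k ** 4
    return (emitted / (emissivity * SIGMA)) ** 0.25

# The same radiation value yields different temperatures under different parameters.
x = 450.0  # W/m^2, an arbitrary measured value
print(temperature_from_radiation(x, emissivity=0.95, reflected_temp_k=293.15))
print(temperature_from_radiation(x, emissivity=0.80, reflected_temp_k=303.15))
```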
The temperature measurement parameters are usually related to the physical environment of the site where the thermal radiation data is acquired, and therefore differ for different physical environments. For example, consider two shooting points in the monitored area: shooting point 1 and shooting point 2. The reflectivity and the reflection temperature of shooting point 1 are greater than those of shooting point 2, and the temperature of shooting point 1 is greater than that of shooting point 2, yet the actually collected thermal radiation value of shooting point 1 and that of shooting point 2 may well be the same. In this scenario, if the same temperature measurement parameters are used to convert the thermal radiation value of shooting point 1 and the thermal radiation value of shooting point 2, the converted temperatures of the two shooting points will be the same. Obviously, in this scenario, there is an error between the measured temperatures and the actual temperatures of shooting point 1 and shooting point 2.
Therefore, when determining the temperature information of the first sub-region, the temperature measurement parameter of the first sub-region itself needs to be considered. Generally, the physical environment characteristic of the first sub-area may be greatly different from the average physical environment characteristic of the whole monitored area, or the physical environment characteristics of a plurality of different sub-areas are also greatly different, so that the influence of the difference of the physical environment characteristics of each sub-area on the radiation value of the sub-area needs to be considered, and the accuracy of the determined temperature of each sub-area is further improved.
The local temperature measurement parameter of the first sub-region may also be referred to as a temperature measurement parameter in an expert temperature measurement mode for the first sub-region. Therefore, the expert mode for independently measuring the temperature of any local area is provided, so that independent expert temperature measurement parameters can be provided for different local areas, and the temperature measurement accuracy is improved. In addition, in the embodiment of the present application, the local region in the thermal imaging image, where the temperature needs to be measured by the expert mode alone, may also be referred to as a temperature measurement rule.
It is noted that the first sub-region may be one or more of a point, a line, and a plane. As shown in fig. 4, the plurality of first sub-regions in the entire monitoring area are: a first sub-region 1, a first sub-region 2, a first sub-region 3, and a first sub-region 4. The first sub-regions 1 and 3 are planar, the first sub-region 4 is a point, and the first sub-region 2 is linear. Therefore, in the embodiments of the application, the user can change the size and the position of the first sub-region at any time to track the area of greatest concern, which improves the efficiency of the temperature measurement. Besides, the region to be measured can be a planar area, a point, or a line segment, so it is not limited to planar areas, which gives the user more choice.
In order to facilitate rapid determination of the local temperature measurement parameters of the first sub-regions, the monitoring region may be divided into a plurality of second sub-regions in advance. For example, as shown in fig. 5, the monitoring area is divided into four second sub-areas, that is, four rectangular areas in fig. 5, and the local temperature measurement parameters of the four rectangular areas are set in advance. When a certain first sub-area falls into the range of the second sub-area, the local temperature measurement parameter of the second sub-area configured in advance is used as the local temperature measurement parameter of the first sub-area.
In a possible implementation manner, a possible way for the terminal to preset the local temperature measurement parameters of the second sub-regions is as follows: the user triggers a region setting instruction through a preset operation, and the terminal detects the region setting instruction, which carries the sub-region temperature measurement configuration information. The sub-region temperature measurement configuration information comprises boundary information of a plurality of second sub-regions and local temperature measurement parameters of the plurality of second sub-regions. The terminal stores the sub-region temperature measurement configuration information. The boundary information of a second sub-region may also be referred to as the temperature measurement rule of the second sub-region.
The plurality of second sub-areas are a plurality of sub-areas in the monitoring area.
For example, when a user needs to set a local temperature measurement parameter, the user triggers a region setting instruction to the terminal through a preset operation, and when the terminal detects the region setting instruction, the temperature measurement configuration information of the sub-region is stored, where the temperature measurement configuration information includes boundary information of a plurality of second sub-regions and the local temperature measurement parameter of the plurality of second sub-regions. The preset operation may be a click operation, a slide operation, or the like.
Thus, in one possible implementation, step 202 may be implemented as: when a temperature detection instruction for a first sub-region in the thermal imaging image is detected, the terminal acquires pre-stored sub-region temperature measurement configuration information, where the sub-region temperature measurement configuration information comprises boundary information of a plurality of second sub-regions and local temperature measurement parameters of the plurality of second sub-regions; determines, according to the boundary information of the plurality of second sub-regions, the second sub-region to which the first sub-region belongs; and determines the local temperature measurement parameter of that second sub-region as the local temperature measurement parameter corresponding to the first sub-region.
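A minimal sketch of this lookup, assuming the second sub-regions are axis-aligned rectangles stored together with their local temperature measurement parameters (the data layout and all names are assumptions; the patent only requires boundary information in some form):

```python
from dataclasses import dataclass

@dataclass
class SecondSubRegion:
    name: str
    x0: int
    y0: int
    x1: int
    y1: int
    local_params: dict          # e.g. {"emissivity": 0.9, "humidity": 0.5, ...}

    def contains(self, x: int, y: int) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def params_for_first_sub_region(center_xy, config):
    """Pick the pre-configured second sub-region that contains the first
    sub-region (represented here by its center point) and return its
    local temperature measurement parameters."""
    x, y = center_xy
    for region in config:
        if region.contains(x, y):
            return region.local_params
    return None                  # fall back to global parameters if none match

config = [
    SecondSubRegion("second sub-region 1", 0, 0, 383, 143, {"emissivity": 0.92}),
    SecondSubRegion("second sub-region 2", 384, 0, 767, 143, {"emissivity": 0.85}),
]
print(params_for_first_sub_region((100, 50), config))
```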
Wherein the temperature detection instruction may be triggered by a user during the process of constructing the first sub-region in the thermographic image of the monitored region. The process of constructing the first sub-area by the user can be realized by means of a third-party drawing tool.
For example, as shown in fig. 4, the monitoring area may be divided into four sub-areas in advance, where the four sub-areas are also 4 second sub-areas, which are respectively a second sub-area 1, a second sub-area 2, a second sub-area 3, and a second sub-area 4. The temperature measurement configuration information of the 4 second sub-regions is obtained in advance, and includes the boundary information (upper left region) and the local temperature measurement parameter a of the second sub-region 1, the boundary information (upper right region) and the local temperature measurement parameter b of the second sub-region 2, the boundary information (lower left region) and the local temperature measurement parameter c of the second sub-region 3, and the boundary information and the local temperature measurement parameter d of the second sub-region 4 (lower right region). When the terminal detects a temperature detection instruction, assuming that the temperature detection instruction is directed at an oval area in the second sub-area 1 in fig. 4, taking the oval area as the first sub-area, and taking the local temperature measurement parameter of the second sub-area 1 as the local temperature measurement parameter of the first sub-area; assuming that the temperature detection instruction is directed to a linear region in the second sub-region 2 in fig. 4, taking the linear region as the first sub-region, and taking the local temperature measurement parameter of the second sub-region 2 as the local temperature measurement parameter of the first sub-region; assuming that the temperature detection instruction is directed to a triangular area in the second sub-area 3 in fig. 4, the triangular area is taken as the first sub-area, and the local temperature measurement parameter of the second sub-area 3 is taken as the local temperature measurement parameter of the first sub-area.
In another possible implementation manner, the possible implementation manner of step 202 is: the first sub-region and the local temperature measurement parameters of the first sub-region are preset, and at the moment, after the thermal imaging image is obtained, the local temperature measurement parameters corresponding to the first sub-region in the thermal imaging image are directly obtained. In this implementation manner, after the thermal imaging image is acquired, the temperature information of the preset first sub-area is directly displayed through step 202 and step 203 without triggering a temperature detection instruction by a user.
In addition, in the two implementation manners, the local temperature measurement parameters corresponding to the first sub-region are preset. If the boundary information and the local temperature measurement parameters of the plurality of second sub-regions are not configured in advance, the local temperature measurement parameters corresponding to the first sub-region can be directly obtained according to the setting of a user after the first sub-region is determined. At this time, as shown in fig. 6, the step 202 may be implemented by the step B1 to the step B3.
Step B1: a first sub-region is constructed.
In one possible implementation, the user determines the boundary characteristics of the first sub-region in the thermal imaging image in step 201 by means of a drawing tool installed at the terminal, so that the first sub-region can be obtained. The boundary feature may be a point, line or plane.
For example, in step B1, in the case where a region selection instruction for the thermographic image is detected, the first sub-region is determined according to the region selection instruction. The region selection instruction is triggered by the user.
Step B2: and determining the local temperature measurement parameters of the first sub-area.
The local temperature measurement parameters of the first sub-region may be obtained by the terminal by means of other auxiliary devices, or may be input by the user at the terminal.
In a possible implementation manner, in step B2, when the terminal detects a temperature measurement parameter configuration instruction for the first sub-region, the terminal acquires a local temperature measurement parameter corresponding to the first sub-region according to the temperature measurement parameter configuration instruction. The temperature measurement parameter configuration instruction is also triggered by a user.
Step B3: and saving the local temperature measurement parameters of the first subregion.
Based on steps B1-B3, the local temperature measurement parameter of the first sub-region is acquired on the spot after the first sub-region is determined, so that the acquired temperature measurement parameter better matches the current physical environment characteristics of the first sub-region, and the subsequently determined temperature of the first sub-region is more accurate.
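Steps B1-B3 could be mirrored in code roughly as follows; the region representation (a list of vertices) and the storage structure are assumptions made only for illustration:

```python
temperature_rules = {}    # saved first sub-regions and their local parameters

def construct_first_sub_region(rule_id: str, vertices: list) -> dict:
    """Step B1: the user draws a point / line / polygon; keep its boundary."""
    return {"id": rule_id, "vertices": vertices}

def configure_local_params(region: dict, local_params: dict) -> None:
    """Step B2: attach user-supplied local temperature measurement parameters."""
    region["local_params"] = local_params

def save_rule(region: dict) -> None:
    """Step B3: persist the rule so later frames reuse the same region."""
    temperature_rules[region["id"]] = region

rule = construct_first_sub_region("rule-1", [(10, 10), (80, 10), (80, 60), (10, 60)])
configure_local_params(rule, {"emissivity": 0.9, "ambient_temp_c": 24.0})
save_rule(rule)
```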
In addition, as described above, the terminal may display the pictures acquired by the thermal imaging device frame by frame. In this scenario, after steps B1-B3 are performed to determine the first sub-region and its local temperature measurement parameters in a certain frame of thermal imaging image, when another frame of thermal imaging image is displayed subsequently, the first sub-region at the same position in that frame may continue to be measured using the first sub-region and its local temperature measurement parameters. The user may also update an already constructed sub-region while other frames of the thermal imaging image are displayed. In this scenario, the terminal needs to update the stored boundary information of the first sub-region to obtain the updated first sub-region in the current frame of thermal imaging image, and uses the previously stored local temperature measurement parameters of the first sub-region as the local temperature measurement parameters of the updated first sub-region.
Specifically, for convenience of subsequent description, the constructed first sub-region is referred to as a third sub-region, and at this time, when a region update instruction for the third sub-region in the thermal imaging image in step 201 is detected, the updated third sub-region is determined as the first sub-region to be subsequently subjected to temperature monitoring, and the local temperature measurement parameter corresponding to the third sub-region is determined as the local temperature measurement parameter of the reconstructed first sub-region. The region update instruction for the third sub-region may be triggered by a zoom operation of the user for the third sub-region. The third sub-area may be a local area determined by the terminal based on the thermal imaging image displayed before the thermal imaging image in step 201. In the case of determining the thermal imaging images corresponding to the thermal radiation data of the consecutive multiple frames, the third sub-region may be a local region determined by the terminal based on any one frame of the thermal imaging images corresponding to the thermal radiation data of the consecutive multiple frames, and the local region may be a sub-region in the thermal imaging images corresponding to the thermal radiation data of the consecutive multiple frames.
In addition, after determining the first sub-region and the local thermometry parameters of the first sub-region through steps B1-B3, the user may also update the saved local thermometry parameters of the first sub-region. In this scenario, after the terminal determines the thermal imaging image, and under the condition that a temperature measurement parameter configuration instruction for the first sub-region is detected, the updated local temperature measurement parameter of the first sub-region is determined according to the temperature measurement parameter configuration instruction. The temperature measurement parameter configuration instruction for the first sub-region can be triggered by a user through a temperature measurement parameter input window on the terminal.
In addition, if the user updates the first sub-region and/or its local temperature measurement parameters in the above manner after step 203 has been executed, the terminal re-determines the temperature information of the first sub-region according to the updated first sub-region and/or its updated local temperature measurement parameters based on step 203, and renders the updated first sub-region and/or its temperature information in the thermal imaging image of the preview window again, so as to re-render the thermal imaging image.
Step 203: and determining the temperature information of the first sub-area according to the local temperature measurement parameters and the heat radiation value of the shooting point corresponding to the pixel point in the first sub-area.
In order to facilitate a user to more intuitively acquire temperature data of the monitored area, after the thermal imaging image is generated in step 201, temperature information of the first sub-area is determined based on the local temperature measurement parameter of the first sub-area determined in step 202, and then the temperature characteristics of the current local area can be briefly summarized through the temperature information. Wherein the temperature information may include one or more of a lowest temperature, a highest temperature, and an average temperature. Optionally, the temperature information may further include a temperature of a shooting point corresponding to at least one pixel point in the first sub-area, for example, a temperature of a shooting point corresponding to a pixel point at a key position. The at least one pixel point may be designated by a user or may be preconfigured by a terminal, which is not limited in this application embodiment.
In one possible implementation manner, the possible implementation procedures of step 203 are: converting the thermal radiation value of the shooting point corresponding to the pixel point in the first sub-area according to the local temperature measurement parameters to obtain the temperature of the shooting point corresponding to the pixel point in the first sub-area; and determining the temperature information of the first sub-area according to the temperature of the shooting point corresponding to the pixel point in the first sub-area.
For example, the local temperature measurement parameters and the thermal radiation values of the shooting points corresponding to the pixel points in the first sub-region can be fed into the thermal imaging image temperature algorithm model, which converts them into the temperatures of the shooting points corresponding to the pixel points in the first sub-region, so as to obtain the temperatures corresponding to all the pixel points in the first sub-region; the temperature information of the first sub-region is then computed from the temperatures corresponding to all the pixel points in the first sub-region. The temperature information of the first sub-region comprises one or more of a lowest temperature, a highest temperature, and an average temperature.
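Reusing the kind of per-pixel conversion sketched earlier, the per-region statistics could be computed as follows; the boolean mask marking which pixels belong to the first sub-region is assumed to come from the drawn boundary, and the linear conversion is a placeholder:

```python
import numpy as np

def region_temperature_info(radiation: np.ndarray,
                            region_mask: np.ndarray,
                            convert) -> dict:
    """Convert the radiation values of the pixels inside the first sub-region
    to temperatures and summarize them as min / max / average."""
    temps = convert(radiation[region_mask])        # vectorized conversion
    return {"min": float(temps.min()),
            "max": float(temps.max()),
            "avg": float(temps.mean())}

# Toy example: a 4x4 radiation frame, a 2x2 first sub-region, linear conversion.
radiation = np.arange(16, dtype=np.float64).reshape(4, 4) * 10 + 400
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
info = region_temperature_info(radiation, mask, lambda x: 0.05 * x + 270.0)
print(info)
```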
In addition, in order to facilitate the user to monitor the temperature of the first sub-area, after the temperature information of the first sub-area is determined, the temperature information of the first sub-area may be displayed in the first sub-area. That is, the primitives such as the first sub-area and the temperature information of the first sub-area are re-superimposed on the thermal imaging image already displayed in the preview window, so as to implement re-drawing of the thermal imaging image.
As shown in fig. 7, assuming that the oval area located at the upper left portion of the entire monitoring area is a first sub-region, a maximum temperature of 30 ℃ and a minimum temperature of 15 ℃ are displayed in that first sub-region; assuming that the linear region located at the upper right portion of the entire monitoring area is a first sub-region, a maximum temperature of 10 ℃ and a minimum temperature of 5 ℃ are displayed in that first sub-region; assuming that the triangular area located at the lower left portion of the entire monitoring area is a first sub-region, a maximum temperature of 50 ℃ and a minimum temperature of 10 ℃ are displayed in that first sub-region; assuming that the dot-shaped region located at the lower right portion of the entire monitoring area is a first sub-region, an average temperature of 5 ℃ is displayed in that first sub-region.
In addition, after the terminal acquires the temperature information of the first sub-region, a temperature threshold may be set, and when the temperature of the first sub-region reaches the temperature threshold, an alarm sound may be played through the loudspeaker. For example, if the high temperature threshold of the first sub-region 1 is preset to 20 ℃ and the maximum temperature of the first sub-region 1 is 30 ℃, which is higher than the high temperature threshold of the first sub-region 1, the terminal needs to play an alarm sound.
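A simple over-temperature check of the kind described, with assumed names; how the alarm sound is actually played is device-specific and omitted:

```python
def check_alarm(temperature_info: dict, high_threshold_c: float) -> bool:
    """Return True when the region's maximum temperature exceeds the threshold,
    in which case the terminal would play an alarm sound."""
    return temperature_info["max"] > high_threshold_c

info = {"min": 15.0, "max": 30.0, "avg": 22.0}
if check_alarm(info, high_threshold_c=20.0):
    print("alarm: region temperature above threshold")
```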
Through the embodiment shown in fig. 2, the embodiment of the present application provides a complete set of thermal imaging online preview temperature measurement and analysis processes. The process includes how the devices in fig. 1 are connected, how the raw temperature data stream collected by the thermal imaging device is sent, how the color data and temperature information corresponding to the thermal imaging image are determined by the thermal imaging image temperature algorithm model in the terminal, how the terminal renders the thermal imaging image and the temperature information based on that color data and temperature information, how the terminal re-renders the temperature information in the thermal imaging image based on the local area designated by the user, and the like.
Therefore, in the embodiment of the present application, the temperature information of any first sub-region can be determined according to the local temperature measurement parameters and the pixel values of the pixel points in the first sub-region. The physical environment characteristics of a first sub-region often differ considerably from those of the monitoring region as a whole, and the local temperature measurement parameters are related to the physical environment characteristics of that first sub-region. Determining the temperature information of any first sub-region based on its own local temperature measurement parameters therefore solves the problem in the related art that the accuracy of the temperature information of a first sub-region is low because the whole monitoring region uses the same set of fixed temperature measurement parameters. In other words, the method provided by the embodiment of the present application improves the accuracy of the determined temperature information.
All of the above optional technical solutions can be combined in any manner to form optional embodiments of the present application, which are not described in detail here again.
Fig. 8 is a schematic structural diagram of a temperature measuring device provided in an embodiment of the present application, where the temperature measuring device may be implemented by software, hardware, or a combination of the two. The temperature measuring device 800 may include:
a first determining module 801, configured to determine a thermal imaging image of a monitored area according to thermal radiation data of the monitored area, where a pixel value of a pixel point in the thermal imaging image is used to indicate a thermal radiation value of a shooting point corresponding to the pixel point;
an obtaining module 802, configured to obtain a local temperature measurement parameter corresponding to a first sub-region in the thermal imaging image, where the local temperature measurement parameter is related to a physical environment characteristic of the first sub-region, and the first sub-region is any region in the thermal imaging image;
the second determining module 803 is configured to determine the temperature information of the first sub-region according to the local temperature measurement parameter and the thermal radiation value of the shooting point corresponding to the pixel point in the first sub-region.
Optionally, the second determining module includes:
the conversion sub-module is used for converting the heat radiation value of the shooting point corresponding to the pixel point in the first sub-area according to the local temperature measurement parameter to obtain the temperature of the shooting point corresponding to the pixel point in the first sub-area;
the first determining submodule is used for determining temperature information of the first sub-area according to the temperature of a shooting point corresponding to the pixel point in the first sub-area, and the temperature information comprises one or more of the lowest temperature, the highest temperature and the average temperature.
Optionally, the apparatus further comprises:
the display module is used for displaying the temperature information of the first sub-area in the first sub-area, wherein the temperature information comprises one or more of the lowest temperature, the highest temperature and the average temperature.
Optionally, the obtaining module includes:
the acquisition submodule is used for acquiring sub-region temperature measurement configuration information, and the sub-region temperature measurement configuration information comprises boundary information of a plurality of second sub-regions and local temperature measurement parameters of the plurality of second sub-regions;
the second determining submodule is used for determining a second sub-area to which the first sub-area belongs according to the boundary information of the plurality of second sub-areas;
and the third determining submodule is used for determining the local temperature measurement parameter of the second sub-area as the local temperature measurement parameter corresponding to the first sub-area.
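The sketch below illustrates this lookup under the simplifying assumption that the boundary information of each second sub-region is an axis-aligned rectangle; the application itself does not restrict how boundaries are represented, and the function and field names are illustrative.

```python
def local_params_for_first_region(first_rect, second_regions):
    """Find the second sub-region whose boundary contains the first sub-region
    and return that region's local temperature measurement parameters.

    first_rect:     (x, y, w, h) bounding box of the first sub-region.
    second_regions: list of dicts with "rect" (x, y, w, h) and "local_params",
                    i.e. the stored sub-region thermometry configuration.
    Returns None if no configured second sub-region contains the first sub-region.
    """
    fx, fy, fw, fh = first_rect
    for region in second_regions:
        sx, sy, sw, sh = region["rect"]
        if fx >= sx and fy >= sy and fx + fw <= sx + sw and fy + fh <= sy + sh:
            return region["local_params"]
    return None
```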
Optionally, the apparatus further comprises:
the detection module is used for detecting a region setting instruction, and the region setting instruction carries the boundary information of the plurality of second subregions and the local temperature measurement parameters of the plurality of second subregions;
and the storage module is used for storing the boundary information of the plurality of second sub-areas and the local temperature measurement parameters of the plurality of second sub-areas.
Optionally, the obtaining module is configured to:
and under the condition that a region updating instruction for a third sub-region in the thermal imaging image is detected, determining the updated third sub-region as a first sub-region, and determining a local temperature measurement parameter corresponding to the third sub-region as a local temperature measurement parameter corresponding to the first sub-region.
Optionally, the physical environment characteristics include one or more of the ambient temperature, ambient humidity, reflection temperature, and reflectivity of the shooting area corresponding to the first sub-area, the distance between the thermal imaging lens and a shooting target in that shooting area, and the depth of that shooting area.
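For reference, these per-region parameters could be grouped in a small record such as the one below; the field names are illustrative and not taken from the application text.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LocalThermometryParams:
    """Local temperature measurement parameters tied to the physical environment
    characteristics of the shooting area corresponding to one sub-region."""
    ambient_temperature_c: Optional[float] = None
    ambient_humidity_percent: Optional[float] = None
    reflection_temperature_c: Optional[float] = None
    reflectivity: Optional[float] = None
    lens_to_target_distance_m: Optional[float] = None
    shooting_area_depth_m: Optional[float] = None
```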
In the embodiment of the application, the temperature information of any first sub-area can be determined according to the local temperature measurement parameters and the pixel values of the pixel points in the first sub-area. The physical environment characteristics of a first sub-region often differ considerably from those of the monitoring region as a whole, and the local temperature measurement parameters are related to the physical environment characteristics of that first sub-region. Determining the temperature information of any first sub-region based on its own local temperature measurement parameters therefore solves the problem in the related art that the accuracy of the temperature information of a first sub-region is low because all first sub-regions use the same set of fixed temperature measurement parameters.
It should be noted that, when the temperature measuring device provided in the above embodiment measures temperature, the division into the above functional modules is only used as an example; in practical applications, these functions may be assigned to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the temperature measurement device and the temperature measurement method provided by the above embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, and details are not repeated here.
Fig. 9 shows a block diagram of a terminal 900 according to an embodiment of the present application; any one of the temperature measuring devices in the foregoing embodiments may be implemented by the terminal shown in fig. 9. The terminal 900 may be: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. Terminal 900 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like.
In general, terminal 900 includes: a processor 901 and a memory 902.
Processor 901 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 901 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 901 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 901 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 901 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 902 may include one or more computer-readable storage media, which may be non-transitory. The memory 902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 902 is used to store at least one instruction for execution by processor 901 to implement the temperature measurement method provided by the method embodiments herein.
In some embodiments, terminal 900 may also optionally include: a peripheral interface 903 and at least one peripheral. The processor 901, memory 902, and peripheral interface 903 may be connected by buses or signal lines. Each peripheral may be connected to the peripheral interface 903 via a bus, signal line, or circuit board. Specifically, the peripherals include: at least one of a radio frequency circuit 904, a touch display screen 905, a camera assembly 906, an audio circuit 907, a positioning component 908, and a power supply 909.
The peripheral interface 903 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 901 and the memory 902. In some embodiments, the processor 901, memory 902, and peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 901, the memory 902 and the peripheral interface 903 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The radio frequency circuit 904 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 904 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 904 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 904 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 904 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 905 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 905 is a touch display screen, the display screen 905 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 901 as a control signal for processing. At this point, the display screen 905 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 905, provided on the front panel of the terminal 900; in other embodiments, there may be at least two display screens 905, each disposed on a different surface of the terminal 900 or in a foldable design; in still other embodiments, the display screen 905 may be a flexible display screen disposed on a curved surface or a folded surface of the terminal 900. The display screen 905 may even be arranged as a non-rectangular irregular figure, i.e., a shaped screen. The display screen 905 may be an LCD (Liquid Crystal Display) screen, an OLED (Organic Light-Emitting Diode) screen, or the like.
The camera assembly 906 is used to capture images or video. Optionally, camera assembly 906 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fusion shooting functions. In some embodiments, camera assembly 906 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. The dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
Audio circuit 907 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 901 for processing, or inputting the electric signals to the radio frequency circuit 904 for realizing voice communication. For stereo sound acquisition or noise reduction purposes, the microphones may be multiple and disposed at different locations of the terminal 900. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 901 or the radio frequency circuit 904 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuit 907 may also include a headphone jack.
The positioning component 908 is used to locate the current geographic location of the terminal 900 for navigation or LBS (Location Based Service). The positioning component 908 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
Power supply 909 is used to provide power to the various components in terminal 900. The power source 909 may be alternating current, direct current, disposable or rechargeable. When power source 909 comprises a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 900 can also include one or more sensors 910. The one or more sensors 910 include, but are not limited to: acceleration sensor 911, gyro sensor 912, pressure sensor 913, fingerprint sensor 914, optical sensor 915, and proximity sensor 916.
The acceleration sensor 911 can detect the magnitude of acceleration in three coordinate axes of the coordinate system established with the terminal 900. For example, the acceleration sensor 911 may be used to detect the components of the gravitational acceleration in three coordinate axes. The processor 901 can control the touch display 905 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 911. The acceleration sensor 911 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 912 may detect a body direction and a rotation angle of the terminal 900, and the gyro sensor 912 may cooperate with the acceleration sensor 911 to acquire a 3D motion of the user on the terminal 900. The processor 901 can implement the following functions according to the data collected by the gyro sensor 912: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 913 may be disposed on the side bezel of terminal 900 and/or underneath touch display 905. When the pressure sensor 913 is disposed on the side frame of the terminal 900, the user's holding signal of the terminal 900 may be detected, and the processor 901 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 913. When the pressure sensor 913 is disposed at a lower layer of the touch display 905, the processor 901 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 905. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 914 is used for collecting a fingerprint of the user, and the processor 901 identifies the user according to the fingerprint collected by the fingerprint sensor 914, or the fingerprint sensor 914 identifies the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, processor 901 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 914 may be disposed on the front, back, or side of the terminal 900. When a physical key or vendor Logo is provided on the terminal 900, the fingerprint sensor 914 may be integrated with the physical key or vendor Logo.
The optical sensor 915 is used to collect ambient light intensity. In one embodiment, the processor 901 may control the display brightness of the touch display 905 based on the ambient light intensity collected by the optical sensor 915. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 905 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 905 is turned down. In another embodiment, the processor 901 can also dynamically adjust the shooting parameters of the camera assembly 906 according to the ambient light intensity collected by the optical sensor 915.
Proximity sensor 916, also known as a distance sensor, is typically disposed on the front panel of terminal 900. The proximity sensor 916 is used to collect the distance between the user and the front face of the terminal 900. In one embodiment, when the proximity sensor 916 detects that the distance between the user and the front face of the terminal 900 gradually decreases, the processor 901 controls the touch display 905 to switch from the bright-screen state to the screen-off state; when the proximity sensor 916 detects that the distance between the user and the front face of the terminal 900 gradually increases, the processor 901 controls the touch display 905 to switch from the screen-off state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 9 does not constitute a limitation of terminal 900, and may include more or fewer components than those shown, or may combine certain components, or may employ a different arrangement of components.
Embodiments of the present application also provide a non-transitory computer-readable storage medium, where instructions in the storage medium, when executed by a processor of a terminal, enable the terminal to perform the temperature measurement method provided in the above embodiments.
Embodiments of the present application further provide a computer program product containing instructions, which when run on a terminal, cause the terminal to perform the temperature measurement method provided in the foregoing embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method of temperature measurement, the method comprising:
determining a thermal imaging image of a monitoring area according to thermal radiation data of the monitoring area, wherein pixel values of pixel points in the thermal imaging image are used for indicating thermal radiation values of shooting points corresponding to the pixel points;
acquiring a local temperature measurement parameter corresponding to a first subregion in the thermal imaging image, wherein the local temperature measurement parameter is related to physical environment characteristics of the first subregion, and the first subregion is any region in the thermal imaging image;
and determining the temperature information of the first sub-area according to the local temperature measurement parameters and the heat radiation value of the shooting point corresponding to the pixel point in the first sub-area.
2. The method as claimed in claim 1, wherein said determining the temperature information of the first sub-region according to the local thermometry parameters and the thermal radiation value of the shooting point corresponding to the pixel point in the first sub-region comprises:
converting the thermal radiation value of the shooting point corresponding to the pixel point in the first sub-area according to the local temperature measurement parameter to obtain the temperature of the shooting point corresponding to the pixel point in the first sub-area;
and determining temperature information of the first sub-area according to the temperature of a shooting point corresponding to the pixel point in the first sub-area, wherein the temperature information comprises one or more of the lowest temperature, the highest temperature and the average temperature.
3. The method of claim 1, wherein said obtaining local thermometry parameters corresponding to the first sub-region comprises:
acquiring sub-region temperature measurement configuration information, wherein the sub-region temperature measurement configuration information comprises boundary information of a plurality of second sub-regions and local temperature measurement parameters of the plurality of second sub-regions, and the plurality of second sub-regions are a plurality of sub-regions in the monitoring region;
determining a second sub-area to which the first sub-area belongs according to the boundary information of the plurality of second sub-areas;
and determining the local temperature measurement parameter of the second sub-region as the local temperature measurement parameter corresponding to the first sub-region.
4. The method of claim 1, wherein said obtaining local thermometry parameters corresponding to the first sub-region comprises:
and under the condition that a region updating instruction for a third sub-region in the thermal imaging image is detected, determining the updated third sub-region as the first sub-region, and determining a local temperature measurement parameter corresponding to the third sub-region as a local temperature measurement parameter corresponding to the first sub-region.
5. The method according to any one of claims 1 to 4, wherein the physical environment characteristics comprise one or more of the ambient temperature, ambient humidity, reflection temperature, and reflectivity of the shooting area corresponding to the first sub-area, the distance between a thermal imaging lens and a shooting target in that shooting area, and the depth of field of that shooting area.
6. A temperature measurement device, the device comprising:
the system comprises a first determining module, a second determining module and a control module, wherein the first determining module is used for determining a thermal imaging image of a monitoring area according to thermal radiation data of the monitoring area, and pixel values of pixel points in the thermal imaging image are used for indicating thermal radiation values of shooting points corresponding to the pixel points;
an obtaining module, configured to obtain a local temperature measurement parameter corresponding to a first sub-region in the thermal imaging image, where the local temperature measurement parameter is related to a physical environment characteristic of the first sub-region, and the first sub-region is any region in the thermal imaging image;
and the second determining module is used for determining the temperature information of the first sub-area according to the local temperature measurement parameters and the heat radiation value of the shooting point corresponding to the pixel point in the first sub-area.
7. The apparatus of claim 6, wherein the acquisition module comprises:
the acquisition submodule is used for acquiring sub-region temperature measurement configuration information, wherein the sub-region temperature measurement configuration information comprises boundary information of a plurality of second sub-regions and local temperature measurement parameters of the plurality of second sub-regions, and the plurality of second sub-regions are a plurality of sub-regions in the monitoring region;
the second determining submodule is used for determining a second sub-area to which the first sub-area belongs according to the boundary information of the plurality of second sub-areas;
and the third determining submodule is used for determining the local temperature measurement parameter of the second sub-area as the local temperature measurement parameter corresponding to the first sub-area.
8. The apparatus of claim 6,
the second determining module includes:
the conversion sub-module is used for converting the heat radiation value of the shooting point corresponding to the pixel point in the first sub-area according to the local temperature measurement parameter to obtain the temperature of the shooting point corresponding to the pixel point in the first sub-area;
the first determining submodule is used for determining temperature information of the first sub-area according to the temperature of a shooting point corresponding to a pixel point in the first sub-area;
the acquisition module is configured to:
under the condition that a region updating instruction for a third sub-region in the thermal imaging image is detected, determining the updated third sub-region as the first sub-region, and determining a local temperature measurement parameter corresponding to the third sub-region as a local temperature measurement parameter corresponding to the first sub-region;
the physical environment characteristics comprise one or more of the ambient temperature, ambient humidity, reflection temperature, and reflectivity of the shooting area corresponding to the first sub-area, the distance between a thermal imaging lens and a shooting target in that shooting area, and the depth of that shooting area.
9. A temperature measurement device, the device comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of the method of any of the above claims 1 to 5.
10. A computer-readable storage medium having stored thereon instructions which, when executed by a processor, carry out the steps of the method of any of the preceding claims 1 to 5.
CN202010880084.8A 2020-08-27 2020-08-27 Temperature measuring method, device and computer storage medium Pending CN111982305A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010880084.8A CN111982305A (en) 2020-08-27 2020-08-27 Temperature measuring method, device and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010880084.8A CN111982305A (en) 2020-08-27 2020-08-27 Temperature measuring method, device and computer storage medium

Publications (1)

Publication Number Publication Date
CN111982305A true CN111982305A (en) 2020-11-24

Family

ID=73440027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010880084.8A Pending CN111982305A (en) 2020-08-27 2020-08-27 Temperature measuring method, device and computer storage medium

Country Status (1)

Country Link
CN (1) CN111982305A (en)



Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050082483A1 (en) * 2003-10-10 2005-04-21 Takuji Oida Absorbance monitor
CN101281063A (en) * 2008-05-16 2008-10-08 天津市电视技术研究所 High temperature furnace inner video image temperature measuring system
US20110292965A1 (en) * 2010-06-01 2011-12-01 Mihailov Stephen J Method and system for measuring a parameter in a high temperature environment using an optical sensor
JP2013200137A (en) * 2012-03-23 2013-10-03 Omron Corp Infrared temperature measurement device, infrared temperature measurement method, and infrared temperature measurement device control program
CN103901291A (en) * 2012-12-28 2014-07-02 华北电力科学研究院有限责任公司 Method for diagnosing internal insulation defects of transformation equipment
CN104483026A (en) * 2014-12-16 2015-04-01 上海热像机电科技股份有限公司 On-line analysis alarm type thermal infrared imager
CN110969665A (en) * 2018-09-30 2020-04-07 杭州海康威视数字技术股份有限公司 External parameter calibration method, device and system and robot
CN111157123A (en) * 2020-02-17 2020-05-15 北京迈格威科技有限公司 Temperature measuring method, device, server and temperature measuring system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WU Hongtan et al.: "A Surface Temperature Measurement Method Based on Regularization Parameters" (一种基于正则参数的表面温度测量方法), 《自动化仪表》 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112665734A (en) * 2020-12-04 2021-04-16 杭州新瀚光电科技有限公司 Temperature measurement method and device based on reference calibration
CN112665734B (en) * 2020-12-04 2023-08-04 杭州新瀚光电科技有限公司 Temperature measurement method and device based on reference calibration
CN114616445A (en) * 2020-12-30 2022-06-10 深圳市大疆创新科技有限公司 Temperature measurement method and device based on thermal radiation detector and thermal radiation detector
WO2022141188A1 (en) * 2020-12-30 2022-07-07 深圳市大疆创新科技有限公司 Thermal radiation detector-based temperature measurement method and device, and thermal radiation detector
CN113487841A (en) * 2021-07-14 2021-10-08 苏州艾斯莫科信息科技有限公司 Gas safety monitoring system
CN114544002A (en) * 2022-02-17 2022-05-27 深圳市同为数码科技股份有限公司 Temperature measurement jump processing method and device, computer equipment and medium
CN115050325A (en) * 2022-08-12 2022-09-13 江苏迈特菲光电技术有限公司 Temperature control data processing method suitable for OLED display
CN115050325B (en) * 2022-08-12 2022-10-21 江苏迈特菲光电技术有限公司 Temperature control data processing method suitable for OLED display


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201124)