CN112414558A - Temperature detection method and device based on visible light image and thermal imaging image - Google Patents

Temperature detection method and device based on visible light image and thermal imaging image

Info

Publication number
CN112414558A
CN112414558A (application CN202110093688.2A)
Authority
CN
China
Prior art keywords
visible light
user
thermal imaging
image
light image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110093688.2A
Other languages
Chinese (zh)
Other versions
CN112414558B (en)
Inventor
黄怡超
张晓华
陈勇
韦景豹
谢卫良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Smdt Technology Co ltd
Original Assignee
Shenzhen Smdt Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Smdt Technology Co ltd filed Critical Shenzhen Smdt Technology Co ltd
Priority to CN202110093688.2A priority Critical patent/CN112414558B/en
Publication of CN112414558A publication Critical patent/CN112414558A/en
Application granted granted Critical
Publication of CN112414558B publication Critical patent/CN112414558B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 5/00 - Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J 5/0022 - Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
    • G01J 5/0025 - Living bodies
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods
    • G06N 3/084 - Backpropagation, e.g. using gradient descent
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 - Detection; Localisation; Normalisation

Abstract

The application relates to the technical field of artificial intelligence, and provides a temperature detection method and device based on a visible light image and a thermal imaging image. The method comprises: collecting a visible light image of a user with a visible light camera and collecting a thermal imaging image of the user with a thermal imaging camera; detecting the coordinates of the user's forehead in the visible light image; detecting the distance from the user to the visible light camera based on the visible light image; calculating the coordinates of the user's forehead in the thermal imaging image based on the coordinates of the forehead in the visible light image and the distance from the user to the visible light camera; and acquiring the thermal imaging temperature at the forehead coordinates in the thermal imaging image as the temperature detection result for the user. By converting the coordinates of the user's forehead from the visible light image into the thermal imaging image and measuring temperature specifically at the forehead, the application makes the temperature measurement result more accurate.

Description

Temperature detection method and device based on visible light image and thermal imaging image
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a temperature detection method and device based on a visible light image and a thermal imaging image.
Background
At present, on temperature measurement equipment such as an integrated temperature-measuring face recognition terminal, the camera preview picture and the face-frame position prompt are indispensable to the user experience. In particular, because the temperature measurement module has a fixed position and field of view, the temperature reading becomes inaccurate when the face frame is not in the middle of the preview picture. Meanwhile, the visible light camera and the thermal imaging camera on the temperature measurement equipment cannot coincide and must be separated by a certain horizontal distance, so the images collected by the two cameras cannot completely overlap; directly using the face coordinates from the visible light image for temperature measurement therefore inevitably leads to inaccurate results.
Disclosure of Invention
The application mainly aims to provide a temperature detection method and device based on a visible light image and a thermal imaging image, so as to overcome the defect that temperature measurement by existing temperature measurement equipment is inaccurate.
In order to achieve the above object, the present application provides a temperature detection method based on a visible light image and a thermal imaging image, applied to a temperature measurement device that includes a visible light camera and a thermal imaging camera; the method includes the following steps:
collecting a visible light image of a user based on the visible light camera and collecting a thermal imaging image of the user based on the thermal imaging camera;
detecting the coordinates of the forehead of the user in the visible light image;
detecting the distance from the user to the visible light camera based on the visible light image;
calculating the coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and the distance of the user from the visible light camera;
and acquiring the thermal imaging temperature of the coordinate position of the forehead of the user in the thermal imaging image as a temperature detection result of the user.
Further, before the step of detecting the coordinates of the forehead of the user in the visible light image, the method further includes:
detecting a face frame of a user in the visible light image;
detecting the coordinates of the upper left corner and the upper right corner of the face frame, and acquiring the width of the visible light image;
acquiring a first margin between the upper left corner of the face frame and the leftmost side of the visible light image according to the upper left corner coordinate of the face frame;
calculating a second margin from the upper right corner of the face frame to the rightmost side of the visible light image according to the width of the visible light image and the upper right corner coordinate of the face frame;
calculating a margin difference value of the first margin and the second margin, and filtering the margin difference value with a filter to obtain a filtered margin difference value;
judging whether the filtered margin difference value is within a first preset range, and judging whether the ordinate in the upper left corner coordinate or the upper right corner coordinate of the face frame is within a second preset range;
if the filtered margin difference value is within the first preset range and the ordinate is within the second preset range, judging that the face of the user is centered;
and if the filtered margin difference value is not within the first preset range and/or the ordinate is not within the second preset range, judging that the face of the user is not centered, and sending out prompt information prompting the user to move to the center.
Further, the step of obtaining the coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and the distance of the user from the visible light camera comprises:
acquiring the size of a thermal imaging pixel and the size of a visible light pixel;
acquiring a thermal imaging focal length and a visible light focal length;
calculating a geometric scaling factor based on the thermal imaging focal length, the visible light focal length, the thermal imaging pixel size and the visible light pixel size;
acquiring a coordinate of a central area in a display picture of the temperature measuring equipment;
acquiring the horizontal distance between the visible light camera and the thermal imaging camera;
detecting a horizontal rotation angle, a pitch angle and a tilt angle of the face of the user in the visible light image;
calculating the offset of the visible light image relative to the thermal imaging image based on the horizontal rotation angle, the pitch angle and the tilt angle of the user face, the geometric scaling factor, the centered area coordinate, the horizontal distance, the thermal imaging pixel size, the thermal imaging focal length, the coordinate of the user forehead in the visible light image and the distance from the user to the visible light camera;
calculating the coordinates of the user's forehead in the thermal imaging image based on the offset and the coordinates of the user's forehead in the visible light image.
Further, the calculation formula for calculating the geometric scaling factor is as follows:
k = (f_t / p_t) / (f_v / p_v)
wherein k is the geometric scaling coefficient, f_t is the thermal imaging focal length, f_v is the visible light focal length, p_t is the thermal imaging pixel size, and p_v is the visible light pixel size.
Further, the offset δ of the visible light image relative to the thermal imaging image is calculated as a function of the centered area coordinate (x_0, y_0), the thermal imaging focal length f_t, the horizontal distance L between the two cameras, the geometric scaling factor k, the distance D from the user to the visible light camera, the thermal imaging pixel size p_t, the horizontal rotation angle α, the pitch angle θ and the tilt angle Ω of the user's face, and the estimation coefficients γ and A, B, C.
Further, the step of detecting the coordinates of the forehead of the user in the visible-light image includes:
inputting the visible light image into a preset forehead detection network model, and detecting to obtain the coordinates of the forehead of the user in the visible light image;
wherein the training step of the preset forehead detection network model comprises the following steps:
acquiring a face frame in a training sample;
taking the upper left corner and the upper right corner of the face frame in the training sample as reference points, and acquiring an image of a preset area above the face frame as a forehead key point feature image;
inputting the forehead key point feature image into a convolutional neural network for regression, correcting deviations by least squares regression, and training with a gradient descent algorithm and a back propagation algorithm until a global minimum or a local minimum of the loss function is reached, thereby obtaining the forehead detection network model; wherein the loss function of the convolutional neural network is a cross entropy loss function.
Further, the step of detecting the distance from the user to the visible light camera based on the visible light image includes:
detecting the width of the face of the user in the visible light image;
detecting the horizontal rotation angle of the user's face in the visible light image;
calculating the distance between the user and the visible light camera based on the width of the user face in the visible light image and the horizontal rotation angle of the user face in the visible light image;
wherein the calculation formula for calculating the distance from the user to the visible light camera is as follows:
D = λ * (1 - β * cosα) * W;
wherein D is the distance from the user to the visible light camera, W is the width of the user's face in the visible light image, α is the horizontal rotation angle of the user's face in the visible light image, and β and λ are distance conversion coefficients.
Further, before the step of calculating the distance from the user to the visible light camera based on the width of the user's face in the visible light image and the horizontal rotation angle of the user's face in the visible light image, the method includes:
acquiring a plurality of first sample data; each piece of first sample data comprises a sample distance from a sample face to a visible light camera, a horizontal rotation angle of the sample face in a corresponding visible light image, and a width of the sample face in the corresponding visible light image;
and inputting each first sample data into a preset depth network model for iterative training to obtain the distance conversion coefficient.
The application also provides a temperature detection device based on a visible light image and a thermal imaging image, applied to temperature measurement equipment, wherein the temperature measurement equipment includes a visible light camera and a thermal imaging camera, and the temperature detection device based on the visible light image and the thermal imaging image includes:
the acquisition unit is used for acquiring a visible light image of a user based on the visible light camera and acquiring a thermal imaging image of the user based on the thermal imaging camera;
a first coordinate detection unit, configured to detect a coordinate of the forehead of the user in the visible light image;
a distance detection unit for detecting a distance from the user to the visible light camera based on the visible light image;
a second coordinate detection unit, configured to calculate coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and a distance from the user to the visible light camera;
and the temperature detection unit is used for acquiring the thermal imaging temperature of the coordinate position of the forehead of the user in the thermal imaging image as the temperature detection result of the user.
The present application further provides a computer device comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of any one of the above methods when executing the computer program.
The temperature detection method, the temperature detection device and the computer equipment based on the visible light image and the thermal imaging image comprise: collecting a visible light image of a user based on the visible light camera and collecting a thermal imaging image of the user based on the thermal imaging camera; detecting the coordinates of the forehead of the user in the visible light image; detecting the distance from the user to the visible light camera based on the visible light image; calculating the coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and the distance of the user from the visible light camera; and acquiring the thermal imaging temperature at the coordinate position of the forehead of the user in the thermal imaging image as the temperature detection result for the user. By converting the coordinates of the user's forehead in the visible light image into the coordinates of the forehead in the thermal imaging image before measuring temperature, the method and the device make the temperature measurement result more accurate; at the same time, measuring temperature specifically at the forehead further improves the accuracy of temperature measurement.
Drawings
FIG. 1 is a schematic diagram illustrating the steps of a temperature detection method based on visible light images and thermal imaging images according to an embodiment of the present disclosure;
FIG. 2 is a block diagram of a temperature detection device based on visible light images and thermal imaging images according to an embodiment of the present disclosure;
fig. 3 is a block diagram illustrating a structure of a computer device according to an embodiment of the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Referring to fig. 1, an embodiment of the present application provides a temperature detection method based on a visible light image and a thermal imaging image, which is applied to a temperature measurement device, where the temperature measurement device includes a visible light camera and a thermal imaging camera, and includes the following steps:
step S1, collecting a visible light image of a user based on the visible light camera, and collecting a thermal imaging image of the user based on the thermal imaging camera;
step S2, detecting the coordinates of the forehead of the user in the visible light image;
step S3, detecting a distance from the user to the visible light camera based on the visible light image;
step S4, calculating the coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and the distance between the user and the visible light camera;
step S5, acquiring a thermal imaging temperature of the coordinate position of the forehead in the thermal imaging image as a temperature detection result for the user.
In this embodiment, the temperature measurement equipment is an integrated temperature-measuring face recognition terminal that can both recognize faces and measure body temperature. The equipment is provided with a visible light camera and a thermal imaging camera, which are generally mounted on the same vertical plane and located at the same height.
As described in the step S1, the visible light camera is configured to collect a visible light image of a user, and the thermal imaging camera is configured to collect a thermal imaging image of the user, where the visible light image is used for face recognition, and a face frame may be displayed in the visible light image during the face recognition. In this embodiment, in order to improve the accuracy of temperature measurement, the forehead temperature of the user is used as a result of measuring the body temperature of the user, and therefore, the forehead position of the user needs to be obtained from the thermal imaging image.
As described in step S2, the forehead position of the user usually cannot be obtained directly from the thermal imaging image. However, since the visible light camera and the thermal imaging camera are at the same distance from the user and at the same height, an offset between the visible light image and the thermal imaging image can be calculated, and the coordinates of the forehead of the user in the thermal imaging image can then be obtained from the coordinates of the forehead in the visible light image. Therefore, it is necessary to detect the coordinates of the forehead of the user from the visible light image.
As described in steps S3-S4, when the visible light camera and the thermal imaging camera are at the same distance from the user and at the same height, the visible light image and the thermal imaging image are aligned in the vertical direction; however, because of the horizontal spacing between the two cameras, the two images are offset from each other by a certain distance in the horizontal direction. Accordingly, the coordinates of the forehead of the user in the thermal imaging image can be calculated from the coordinates of the forehead of the user in the visible light image.
As described in step S5, once the coordinate position of the forehead in the thermal imaging image is obtained, the thermal imaging temperature at that position, namely the forehead temperature of the user, i.e. the detected body temperature, can be read directly. In this embodiment, the coordinates of the user's forehead in the visible light image are converted into the coordinates of the forehead in the thermal imaging image before measuring temperature, so the temperature measurement result is more accurate; at the same time, measuring temperature specifically at the forehead further improves the accuracy of the measurement.
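As a concrete illustration of steps S1 to S5, a minimal Python sketch of the overall flow is given below. The function and parameter names (detect_forehead, estimate_distance, to_thermal_coords) are hypothetical placeholders standing in for the detection, distance-estimation and coordinate-conversion procedures described in the following embodiments; they are not part of the original filing.

    def measure_forehead_temperature(visible_img, thermal_img,
                                     detect_forehead, estimate_distance,
                                     to_thermal_coords):
        # Steps S2-S5, applied to the images already captured in step S1.
        x_v, y_v = detect_forehead(visible_img)            # S2: forehead in visible image
        distance = estimate_distance(visible_img)          # S3: user-to-camera distance
        x_t, y_t = to_thermal_coords(x_v, y_v, distance)   # S4: map into thermal image
        # S5: read the thermal imaging temperature (deg C) at the forehead position;
        # thermal_img is a 2-D array of temperatures indexed as [row][column].
        return float(thermal_img[int(round(y_t))][int(round(x_t))])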
In an embodiment, before the step S2 of detecting the coordinates of the forehead of the user in the visible light image, the method further includes:
step S101, detecting a face frame of a user in the visible light image;
step S102, detecting the coordinates of the upper left corner and the upper right corner of the face frame, and acquiring the width of the visible light image;
step S103, acquiring a first margin between the upper left corner of the face frame and the leftmost side of the visible light image according to the upper left corner coordinate of the face frame;
step S104, calculating a second margin between the upper right corner of the face frame and the rightmost side of the visible light image according to the width of the visible light image and the upper right corner coordinate of the face frame;
step S105, calculating a margin difference value of the first margin and the second margin, and filtering the margin difference value with a filter to obtain a filtered margin difference value;
step S106, judging whether the filtered margin difference value is within a first preset range, and judging whether the ordinate in the upper left corner coordinate or the upper right corner coordinate of the face frame is within a second preset range;
step S107, if the filtered margin difference value is within the first preset range and the ordinate is within the second preset range, determining that the face of the user is centered; and if the filtered margin difference value is not within the first preset range and/or the ordinate is not within the second preset range, determining that the face of the user is not centered, and sending out prompt information prompting the user to move to the center.
In this embodiment, when the face of the user is not centered, temperature measurement is easily inaccurate; therefore, whether the face is centered needs to be detected before measuring temperature. In this embodiment, the face frame of the user in the visible light image is identified based on a face recognition algorithm, and from the face frame the coordinates of its four corners in the visible light image can be acquired. To judge whether the face is centered, only the upper-left and upper-right corner coordinates of the face frame are needed, together with the width of the visible light image.
Specifically, let the coordinates of the upper left corner of the face frame be (X0, Y0) and the coordinates of the upper right corner be (X1, Y0). From the abscissa X0 of the upper left corner, the first margin from the upper left corner of the face frame to the leftmost side of the visible light image is W1 = X0. From the width W' of the visible light image and the abscissa X1 of the upper right corner, the second margin from the upper right corner of the face frame to the rightmost side of the visible light image is W2 = W' - X1.
Further, the absolute value of the margin difference, W3 = |W1 - W2|, is calculated. Since the face recognition algorithm processes image data continuously, W3 is also computed continuously, and an IIR filter is used to remove noise and smooth the output, yielding the filtered margin difference. It is then judged whether the filtered margin difference is within a first preset range (e.g. 150 pixels) and whether the ordinate of the upper-left or upper-right corner of the face frame, i.e. Y0, is within a second preset range (e.g. between 250 and 550 pixels). If both conditions are met, the current face is judged to be centered and the next action is executed.
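A minimal Python sketch of this centering check follows. The first-order IIR smoothing coefficient and the two preset ranges are illustrative values taken from the example above, not parameters fixed by the application.

    class CenteringChecker:
        def __init__(self, smoothing=0.2, max_margin_diff=150, y_range=(250, 550)):
            self.smoothing = smoothing              # first-order IIR coefficient (assumed)
            self.max_margin_diff = max_margin_diff  # first preset range, in pixels
            self.y_range = y_range                  # second preset range, in pixels
            self.filtered = None

        def update(self, x0, y0, x1, image_width):
            w1 = x0                    # first margin: upper-left corner to the left edge
            w2 = image_width - x1      # second margin: upper-right corner to the right edge
            w3 = abs(w1 - w2)          # margin difference value
            # simple first-order IIR filter to smooth the continuously computed W3
            if self.filtered is None:
                self.filtered = w3
            else:
                self.filtered = self.smoothing * w3 + (1 - self.smoothing) * self.filtered
            return (self.filtered <= self.max_margin_diff
                    and self.y_range[0] <= y0 <= self.y_range[1])

If update returns False, the device would issue the prompt asking the user to move to the center.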
In an embodiment, the step S4 of obtaining the coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and the distance of the user from the visible light camera includes:
acquiring the size of a thermal imaging pixel and the size of a visible light pixel;
acquiring a thermal imaging focal length and a visible light focal length;
calculating a geometric scaling factor based on the thermal imaging focal length, the visible light focal length, the thermal imaging pixel size and the visible light pixel size;
acquiring a coordinate of a central area in a display picture of the temperature measuring equipment;
acquiring the horizontal distance between the visible light camera and the thermal imaging camera;
detecting a horizontal rotation angle, a pitch angle and a tilt angle of the face of the user in the visible light image;
calculating the offset of the visible light image relative to the thermal imaging image based on the horizontal rotation angle, the pitch angle and the tilt angle of the user face, the geometric scaling factor, the centered area coordinate, the horizontal distance, the thermal imaging pixel size, the thermal imaging focal length, the coordinate of the user forehead in the visible light image and the distance from the user to the visible light camera;
calculating the coordinates of the user's forehead in the thermal imaging image based on the offset and the coordinates of the user's forehead in the visible light image.
In this embodiment, since the thermal imaging pixel size differs from the visible light pixel size and the thermal imaging focal length differs from the visible light focal length, the same user appears at different scales in the thermal imaging image and the visible light image; the geometric scaling factor is therefore calculated from the thermal imaging focal length, the visible light focal length, the thermal imaging pixel size and the visible light pixel size. Further, the offset of the visible light image relative to the thermal imaging image is calculated, so that the coordinates of the user's forehead in the thermal imaging image can be obtained from the offset. For example, if the offset is δ and the forehead of the user has coordinates (x, y) in the visible light image, the coordinates of the forehead in the thermal imaging image are obtained by applying the scaling factor k and the offset δ to (x, y).
In this embodiment, the formula for calculating the geometric scaling factor is:
k = (f_t / p_t) / (f_v / p_v)
wherein k is the geometric scaling coefficient, f_t is the thermal imaging focal length, f_v is the visible light focal length, p_t is the thermal imaging pixel size, and p_v is the visible light pixel size.
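Under the reconstructed form of the scaling formula above, which is an assumption of this text rather than a formula confirmed by the filing, the factor can be computed as the ratio of the two cameras' focal lengths expressed in pixels:

    def geometric_scaling_factor(f_thermal, f_visible, p_thermal, p_visible):
        # Assumed form: (f_t / p_t) divided by (f_v / p_v), i.e. the ratio of the
        # cameras' pixel focal lengths; converts a length measured in visible-image
        # pixels into the corresponding length in thermal-image pixels.
        return (f_thermal / p_thermal) / (f_visible / p_visible)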
In an embodiment, the offset δ of the visible light image relative to the thermal imaging image is calculated as a function of the centered area coordinate (x_0, y_0), the thermal imaging focal length f_t, the horizontal distance L between the two cameras, the geometric scaling factor k, the distance D from the user to the visible light camera, the thermal imaging pixel size p_t, the horizontal rotation angle α, the pitch angle θ and the tilt angle Ω of the user's face, and the estimation coefficients γ and A, B, C.
In this embodiment, the parameters γ and A, B, C are estimation coefficients calculated in advance by a local scatter smoothing estimation algorithm from samples of the face size at different distances.
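The sketch below shows how the quantities listed above would be combined to map forehead coordinates from the visible light image into the thermal imaging image. The specific algebraic form used for the offset (a parallax-like horizontal shift plus an angle-dependent correction weighted by γ and A, B, C) and the way the scaling factor is applied about the centered area coordinate are assumptions for illustration; they stand in for the patent's offset formula, which is not reproduced in this text.

    import math

    def offset_estimate(D, L, f_thermal, p_thermal, alpha, theta, omega,
                        gamma, A, B, C):
        # Assumed stand-in for the offset formula: a horizontal parallax term that
        # shrinks with the user distance D, expressed in thermal-image pixels,
        # plus an angle-dependent correction using the estimation coefficients.
        parallax = (L * f_thermal) / (D * p_thermal)
        correction = gamma * (A * math.cos(alpha) + B * math.cos(theta) + C * math.cos(omega))
        return parallax + correction

    def visible_to_thermal(x_v, y_v, k, delta, x0, y0):
        # Scale about the centered area coordinate (x0, y0) and shift horizontally
        # by delta; the two cameras are assumed to be vertically aligned, so only
        # the x coordinate receives the offset.
        x_t = k * (x_v - x0) + x0 + delta
        y_t = k * (y_v - y0) + y0
        return x_t, y_t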
In another embodiment, the step of detecting the coordinates of the forehead of the user in the visible-light image comprises:
inputting the visible light image into a preset forehead detection network model, and detecting to obtain the coordinates of the forehead of the user in the visible light image;
specifically, the training step of the preset forehead detection network model includes:
acquiring a face frame in a training sample;
taking the upper left corner and the upper right corner of the face frame in the training sample as reference points, and acquiring an image of a preset area above the face frame as a forehead key point feature image;
inputting the forehead key point feature image into a convolutional neural network for regression, correcting deviations by least squares regression, and training with a gradient descent algorithm and a back propagation algorithm until a global minimum or a local minimum of the loss function is reached, thereby obtaining the forehead detection network model; wherein the loss function of the convolutional neural network is a cross entropy loss function.
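By way of illustration only, the following PyTorch-style sketch shows a small convolutional network regressing the two forehead keypoint coordinates from a cropped forehead-region image, trained by gradient descent and back propagation. The network size, the input crop size, the use of a mean-squared-error regression loss in place of the cross entropy loss named above, and all hyper-parameters are assumptions, not details taken from the filing.

    import torch
    import torch.nn as nn

    class ForeheadNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.head = nn.Linear(32 * 16 * 16, 2)     # (x, y) of the forehead point

        def forward(self, x):                           # x: (N, 3, 64, 64) crop above the face frame
            return self.head(self.features(x).flatten(1))

    def train_forehead_net(model, loader, epochs=10, lr=1e-3):
        # gradient descent + back propagation toward a (global or local) minimum of the loss
        optimizer = torch.optim.SGD(model.parameters(), lr=lr)
        loss_fn = nn.MSELoss()                          # assumed regression loss
        for _ in range(epochs):
            for crops, keypoints in loader:
                optimizer.zero_grad()
                loss = loss_fn(model(crops), keypoints)
                loss.backward()
                optimizer.step()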
In another embodiment, the step of detecting the distance from the user to the visible light camera based on the visible light image comprises:
detecting the width of the face of the user in the visible light image;
detecting the horizontal rotation angle of the user's face in the visible light image;
calculating the distance between the user and the visible light camera based on the width of the user face in the visible light image and the horizontal rotation angle of the user face in the visible light image;
wherein the calculation formula for calculating the distance from the user to the visible light camera is as follows:
D = λ * (1 - β * cosα) * W;
wherein D is the distance from the user to the visible light camera, W is the width of the user's face in the visible light image, α is the horizontal rotation angle of the user's face in the visible light image, and β and λ are distance conversion coefficients.
In this embodiment, since the face does not necessarily face the camera squarely, the face may appear in the image at a deflection angle (including the horizontal rotation angle above), and both the face deflection angle and the size of the face frame affect the calculation of the distance between the face and the visible light camera; the above estimation formula is therefore proposed in this embodiment to calculate the distance from the user to the visible light camera.
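A direct transcription of this estimation formula into Python could look as follows; α is taken in radians, the face width W in pixels, and β and λ are the pre-fitted distance conversion coefficients:

    import math

    def distance_to_camera(face_width_px, alpha_rad, beta, lam):
        # D = lambda * (1 - beta * cos(alpha)) * W
        return lam * (1.0 - beta * math.cos(alpha_rad)) * face_width_px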
In this embodiment, before the step of calculating the distance from the user to the visible light camera based on the width of the user's face in the visible light image and the horizontal rotation angle of the user's face in the visible light image, the method includes:
acquiring a plurality of first sample data; each piece of first sample data comprises a sample distance from a sample face to a visible light camera, a horizontal rotation angle of the sample face in a corresponding visible light image, and a width of the sample face in the corresponding visible light image;
and inputting each first sample data into a preset depth network model for iterative training to obtain the distance conversion coefficient.
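The application obtains the distance conversion coefficients by iterative training of a preset deep network model. As a simpler illustration of the same fitting problem, the sketch below estimates β and λ from the first sample data by non-linear least squares; substituting a least-squares fit for the deep network model is an assumption made only for this example.

    import numpy as np
    from scipy.optimize import curve_fit

    def fit_distance_coefficients(widths_px, alphas_rad, distances):
        # Fit D = lambda * (1 - beta * cos(alpha)) * W to the collected samples.
        def model(x, beta, lam):
            w, a = x
            return lam * (1.0 - beta * np.cos(a)) * w
        xdata = (np.asarray(widths_px, dtype=float), np.asarray(alphas_rad, dtype=float))
        ydata = np.asarray(distances, dtype=float)
        (beta, lam), _ = curve_fit(model, xdata, ydata, p0=(0.1, 1.0))
        return beta, lam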
In another embodiment, after the step of obtaining the thermal imaging temperature of the coordinate position of the forehead in the thermal imaging image, the method further includes:
detecting whether the user is centered;
if not, acquiring a centering offset value of the user and a centering position;
inputting the centering deviation value into a preset temperature compensation model, and calculating to obtain a temperature compensation value;
and calculating to obtain a final temperature according to the temperature compensation value and the thermal imaging temperature of the coordinate position of the forehead of the user in the thermal imaging image, wherein the final temperature is used as a temperature detection result of the user.
In this embodiment, if the face of the user is not centered in the visible light image, the accuracy of temperature detection may be affected. In order to reduce the temperature detection error, a temperature compensation model is preset in this embodiment to calculate the corresponding temperature compensation value for different offsets of the user from the centered position; the final temperature is then calculated as the sum of the temperature compensation value and the thermal imaging temperature at the coordinate position of the forehead in the thermal imaging image.
In this embodiment, the temperature compensation model is trained based on a convolutional neural network, and the training data include different offsets of the user from the centered position and the corresponding temperature compensation values. In other embodiments, the face deflection angle of the user, the centering offset value of the face and the corresponding temperature compensation value can be combined to train the network, so that a temperature compensation model with better judgment capability can be obtained.
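A minimal sketch of how the compensation would be applied is given below. The small fully connected network standing in for the trained temperature compensation model, and its input features (the centering offset together with an optional face deflection angle), are assumptions for illustration only.

    import torch
    import torch.nn as nn

    class CompensationModel(nn.Module):
        # maps a centering offset (and optionally a deflection angle) to a
        # temperature compensation value in degrees Celsius
        def __init__(self, in_features=2):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_features, 16), nn.ReLU(),
                                     nn.Linear(16, 1))

        def forward(self, x):
            return self.net(x)

    def compensated_temperature(model, thermal_temp, offset_px, deflection_deg=0.0):
        feats = torch.tensor([[float(offset_px), float(deflection_deg)]])
        with torch.no_grad():
            compensation = model(feats).item()
        # final temperature = forehead thermal imaging temperature + compensation value
        return thermal_temp + compensation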
Referring to fig. 2, an embodiment of the present application further provides a temperature detection device based on a visible light image and a thermal imaging image, which is applied to a temperature measurement device, where the temperature measurement device includes a visible light camera and a thermal imaging camera, and the temperature detection device based on the visible light image and the thermal imaging image includes:
the acquisition unit 10 is used for acquiring a visible light image of a user based on the visible light camera and acquiring a thermal imaging image of the user based on the thermal imaging camera;
a first coordinate detecting unit 20, configured to detect a coordinate of the forehead of the user in the visible light image;
a distance detection unit 30 configured to detect a distance from the user to the visible-light camera based on the visible-light image;
a second coordinate detecting unit 40, configured to calculate coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and a distance from the user to the visible light camera;
and the temperature detection unit 50 is configured to obtain a thermal imaging temperature of a coordinate position where the forehead of the user is located in the thermal imaging image, as a temperature detection result for the user.
In this embodiment, please refer to the above embodiments of the temperature detection method for the specific implementation of each unit in the temperature detection apparatus, which is not described herein again.
Referring to fig. 3, an embodiment of the present application also provides a computer device, which may be a server and whose internal structure may be as shown in fig. 3. The computer device includes a processor, a memory, a network interface and a database connected by a system bus, wherein the processor of the computer device is used to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing temperature detection data and the like. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements a temperature detection method.
Those skilled in the art will appreciate that the architecture shown in fig. 3 is only a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects may be applied.
An embodiment of the present application also provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements a temperature detection method. It is to be understood that the computer-readable storage medium in the present embodiment may be a volatile-readable storage medium or a non-volatile-readable storage medium.
In summary, the temperature detection method and device based on a visible light image and a thermal imaging image provided in the embodiments of the present application include: acquiring a visible light image of a user based on the visible light camera and acquiring a thermal imaging image of the user based on the thermal imaging camera; detecting the coordinates of the forehead of the user in the visible light image; detecting the distance from the user to the visible light camera based on the visible light image; calculating the coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and the distance of the user from the visible light camera; and acquiring the thermal imaging temperature at the coordinate position of the forehead of the user in the thermal imaging image as the temperature detection result for the user. By converting the coordinates of the user's forehead in the visible light image into the coordinates of the forehead in the thermal imaging image before measuring temperature, the method and the device make the temperature measurement result more accurate; at the same time, measuring temperature specifically at the forehead further improves the accuracy of temperature measurement.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium provided herein and used in the examples may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, apparatus, article, or method that includes the element.
The above description is only for the preferred embodiment of the present application and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are intended to be included within the scope of the present application.

Claims (10)

1. A temperature detection method based on a visible light image and a thermal imaging image is applied to temperature measurement equipment, wherein the temperature measurement equipment comprises a visible light camera and a thermal imaging camera, and the temperature detection method is characterized by comprising the following steps:
collecting a visible light image of a user based on the visible light camera and collecting a thermal imaging image of the user based on the thermal imaging camera;
detecting the coordinates of the forehead of the user in the visible light image;
detecting the distance from the user to the visible light camera based on the visible light image;
calculating the coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and the distance of the user from the visible light camera;
and acquiring the thermal imaging temperature of the coordinate position of the forehead of the user in the thermal imaging image as a temperature detection result of the user.
2. The method according to claim 1, wherein the step of detecting the coordinates of the forehead of the user in the visible light image further comprises:
detecting a face frame of a user in the visible light image;
detecting the coordinates of the upper left corner and the upper right corner of the face frame, and acquiring the width of the visible light image;
acquiring a first margin between the upper left corner of the face frame and the leftmost side of the visible light image according to the upper left corner coordinate of the face frame;
calculating a second margin from the upper right corner of the face frame to the rightmost side of the visible light image according to the width of the visible light image and the upper right corner coordinate of the face frame;
calculating a margin difference value of the first margin and the second margin, and filtering the margin difference value with a filter to obtain a filtered margin difference value;
judging whether the filtered margin difference value is within a first preset range, and judging whether the ordinate in the upper left corner coordinate or the upper right corner coordinate of the face frame is within a second preset range;
if the filtered margin difference value is within the first preset range and the ordinate is within the second preset range, judging that the face of the user is centered;
and if the filtered margin difference value is not within the first preset range and/or the ordinate is not within the second preset range, judging that the face of the user is not centered, and sending out prompt information prompting the user to move to the center.
3. The method according to claim 1, wherein the step of obtaining the coordinates of the forehead of the user in the thermal image based on the coordinates of the forehead of the user in the visible light image and the distance from the user to the visible light camera comprises:
acquiring the size of a thermal imaging pixel and the size of a visible light pixel;
acquiring a thermal imaging focal length and a visible light focal length;
calculating a geometric scaling factor based on the thermal imaging focal length, the visible light focal length, the thermal imaging pixel size and the visible light pixel size;
acquiring a coordinate of a central area in a display picture of the temperature measuring equipment;
acquiring the horizontal distance between the visible light camera and the thermal imaging camera;
detecting a horizontal rotation angle, a pitch angle and a tilt angle of the face of the user in the visible light image;
calculating the offset of the visible light image relative to the thermal imaging image based on the horizontal rotation angle, the pitch angle and the tilt angle of the user face, the geometric scaling factor, the centered area coordinate, the horizontal distance, the thermal imaging pixel size, the thermal imaging focal length, the coordinate of the user forehead in the visible light image and the distance from the user to the visible light camera;
calculating the coordinates of the user's forehead in the thermal imaging image based on the offset and the coordinates of the user's forehead in the visible light image.
4. The method according to claim 3, wherein the formula for calculating the geometric scaling factor is:
k = (f_t / p_t) / (f_v / p_v)
wherein k is the geometric scaling coefficient, f_t is the thermal imaging focal length, f_v is the visible light focal length, p_t is the thermal imaging pixel size, and p_v is the visible light pixel size.
5. The method according to claim 3, wherein the offset δ of the visible light image relative to the thermal imaging image is calculated as a function of the centered area coordinate (x_0, y_0), the thermal imaging focal length f_t, the horizontal distance L, the geometric scaling factor k, the distance D from the user to the visible light camera, the thermal imaging pixel size p_t, the horizontal rotation angle α, the pitch angle θ and the tilt angle Ω of the user's face, and the estimation coefficients γ and A, B, C.
6. The method according to claim 1, wherein the step of detecting the coordinates of the forehead of the user in the visible light image comprises:
inputting the visible light image into a preset forehead detection network model, and detecting to obtain the coordinates of the forehead of the user in the visible light image;
wherein the training step of the preset forehead detection network model comprises the following steps:
acquiring a face frame in a training sample;
taking the upper left corner and the upper right corner of the face frame in the training sample as reference points, and acquiring an image of a preset area above the face frame as a forehead key point feature image;
inputting the forehead key point feature image into a convolutional neural network for regression, correcting deviations by least squares regression, and training with a gradient descent algorithm and a back propagation algorithm until a global minimum or a local minimum of the loss function is reached, thereby obtaining the forehead detection network model; wherein the loss function of the convolutional neural network is a cross entropy loss function.
7. The method according to claim 1, wherein the step of detecting the distance from the user to the visible-light camera based on the visible-light image comprises:
detecting the width of the face of the user in the visible light image;
detecting the horizontal rotation angle of the user's face in the visible light image;
calculating the distance between the user and the visible light camera based on the width of the user face in the visible light image and the horizontal rotation angle of the user face in the visible light image;
wherein the calculation formula for calculating the distance from the user to the visible light camera is as follows:
D = λ * (1 - β * cosα) * W;
wherein D is the distance from the user to the visible light camera, W is the width of the user's face in the visible light image, α is the horizontal rotation angle of the user's face in the visible light image, and β and λ are distance conversion coefficients.
8. The method according to claim 7, wherein the step of calculating the distance from the user to the visible light camera based on the width of the user's face in the visible light image and the horizontal rotation angle of the user's face in the visible light image is preceded by the step of:
acquiring a plurality of first sample data; each piece of first sample data comprises a sample distance from a sample face to a visible light camera, a horizontal rotation angle of the sample face in a corresponding visible light image, and a width of the sample face in the corresponding visible light image;
and inputting each first sample data into a preset depth network model for iterative training to obtain the distance conversion coefficient.
9. A temperature detection device based on a visible light image and a thermal imaging image, applied to temperature measurement equipment, the temperature measurement equipment comprising a visible light camera and a thermal imaging camera, characterized in that the temperature detection device based on the visible light image and the thermal imaging image comprises:
the acquisition unit is used for acquiring a visible light image of a user based on the visible light camera and acquiring a thermal imaging image of the user based on the thermal imaging camera;
a first coordinate detection unit, configured to detect a coordinate of the forehead of the user in the visible light image;
a distance detection unit for detecting a distance from the user to the visible light camera based on the visible light image;
a second coordinate detection unit, configured to calculate coordinates of the forehead of the user in the thermal imaging image based on the coordinates of the forehead of the user in the visible light image and a distance from the user to the visible light camera;
and the temperature detection unit is used for acquiring the thermal imaging temperature of the coordinate position of the forehead of the user in the thermal imaging image as the temperature detection result of the user.
10. A computer device comprising a memory and a processor, the memory having stored therein a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method according to any of claims 1 to 8.
CN202110093688.2A 2021-01-25 2021-01-25 Temperature detection method and device based on visible light image and thermal imaging image Active CN112414558B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110093688.2A CN112414558B (en) 2021-01-25 2021-01-25 Temperature detection method and device based on visible light image and thermal imaging image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110093688.2A CN112414558B (en) 2021-01-25 2021-01-25 Temperature detection method and device based on visible light image and thermal imaging image

Publications (2)

Publication Number Publication Date
CN112414558A true CN112414558A (en) 2021-02-26
CN112414558B CN112414558B (en) 2021-04-23

Family

ID=74783215

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110093688.2A Active CN112414558B (en) 2021-01-25 2021-01-25 Temperature detection method and device based on visible light image and thermal imaging image

Country Status (1)

Country Link
CN (1) CN112414558B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113008380A (en) * 2021-03-10 2021-06-22 五邑大学 Intelligent AI body temperature early warning method, system and storage medium
CN113095190A (en) * 2021-04-01 2021-07-09 武汉理工大学 Non-contact temperature measurement and identity recognition system
CN113269036A (en) * 2021-04-13 2021-08-17 杭州魔点科技有限公司 Method, system, device and storage medium for determining face coordinates on thermal imaging
CN113297941A (en) * 2021-05-18 2021-08-24 宁波书写芯忆科技有限公司 Remote AI intelligent temperature measurement face recognition system
CN113701894A (en) * 2021-08-30 2021-11-26 深圳科卫机器人科技有限公司 Face temperature measurement method and device, computer equipment and storage medium
CN114152349A (en) * 2021-11-30 2022-03-08 深圳Tcl新技术有限公司 Temperature measuring method, temperature measuring device, storage medium and electronic equipment
CN114960854A (en) * 2022-08-01 2022-08-30 深圳市海清视讯科技有限公司 Hand washing guiding method, device and system based on thermal imaging

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105701827A (en) * 2016-01-15 2016-06-22 中林信达(北京)科技信息有限责任公司 Method and device for jointly calibrating parameters of visible light camera and infrared camera
CN107976257A (en) * 2016-10-21 2018-05-01 杭州海康威视数字技术股份有限公司 A kind of method for displaying image of infrared thermography, device and infrared thermography
CN109670422A (en) * 2018-12-05 2019-04-23 北京旷视科技有限公司 Face datection information display method, device, equipment and storage medium
CN110738142A (en) * 2019-09-26 2020-01-31 广州广电卓识智能科技有限公司 method, system and storage medium for self-adaptively improving face image acquisition
CN111083381A (en) * 2019-12-31 2020-04-28 深圳市道通智能航空技术有限公司 Image fusion method and device, double-optical camera and unmanned aerial vehicle
CN111579077A (en) * 2020-04-08 2020-08-25 北京遥感设备研究所 Dual-band image position calibration and information interaction display system and method
CN112033545A (en) * 2020-08-17 2020-12-04 深圳市视美泰技术股份有限公司 Human body temperature infrared measurement method and device and computer equipment
CN112241700A (en) * 2020-10-15 2021-01-19 希望银蕨智能科技有限公司 Multi-target forehead temperature measurement method for forehead accurate positioning

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113008380A (en) * 2021-03-10 2021-06-22 五邑大学 Intelligent AI body temperature early warning method, system and storage medium
CN113095190A (en) * 2021-04-01 2021-07-09 武汉理工大学 Non-contact temperature measurement and identity recognition system
CN113269036A (en) * 2021-04-13 2021-08-17 杭州魔点科技有限公司 Method, system, device and storage medium for determining face coordinates on thermal imaging
CN113297941A (en) * 2021-05-18 2021-08-24 宁波书写芯忆科技有限公司 Remote AI intelligent temperature measurement face recognition system
CN113701894A (en) * 2021-08-30 2021-11-26 深圳科卫机器人科技有限公司 Face temperature measurement method and device, computer equipment and storage medium
CN114152349A (en) * 2021-11-30 2022-03-08 深圳Tcl新技术有限公司 Temperature measuring method, temperature measuring device, storage medium and electronic equipment
CN114152349B (en) * 2021-11-30 2023-11-14 深圳Tcl新技术有限公司 Temperature measurement method and device, storage medium and electronic equipment
CN114960854A (en) * 2022-08-01 2022-08-30 深圳市海清视讯科技有限公司 Hand washing guiding method, device and system based on thermal imaging
CN114960854B (en) * 2022-08-01 2023-01-06 深圳市海清视讯科技有限公司 Hand washing guiding method, device and system based on thermal imaging

Also Published As

Publication number Publication date
CN112414558B (en) 2021-04-23

Similar Documents

Publication Publication Date Title
CN112414558B (en) Temperature detection method and device based on visible light image and thermal imaging image
CN111964789B (en) Temperature measuring method, temperature measuring device, computer equipment and storage medium
WO2019184885A1 (en) Method, apparatus and electronic device for calibrating extrinsic parameters of camera
CN110211185B (en) Method for identifying characteristic points of calibration pattern in group of candidate points
WO2022121189A1 (en) Method and apparatus for measuring temperature, and computer device
CN103180690B (en) Pattern assay method, pattern determinator and utilize its program
JP6444283B2 (en) Posture determination device
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN111144398A (en) Target detection method, target detection device, computer equipment and storage medium
US20220015644A1 (en) Fever Detector by Distant Multipixel Thermal Imaging
CN110542482B (en) Blind pixel detection method and device and electronic equipment
CN113749646A (en) Monocular vision-based human body height measuring method and device and electronic equipment
CN105551042A (en) Determination method and apparatus for mark point positions of scanning bed
CN112215878B (en) X-ray image registration method based on SURF feature points
JP2008116206A (en) Apparatus, method, and program for pattern size measurement
CN111160442B (en) Image classification method, computer device, and storage medium
CN109902695B (en) Line feature correction and purification method for image pair linear feature matching
CN116797648A (en) Width measuring method, device, equipment, system and medium for material deformation process
CN108416811B (en) Camera self-calibration method and device
CN113808108A (en) Visual inspection method and system for defects of printed film
JP2004220371A (en) Image processing method, image processor, image processing program, and recording medium recorded with image processing program
CN113269036A (en) Method, system, device and storage medium for determining face coordinates on thermal imaging
CN112770041A (en) Image processing method for switching multiple zoom lenses and camera
CN112798812B (en) Target speed measuring method based on monocular vision
CN110658618A (en) Method and device for fitting and focusing sample image, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant