CN112287798A - Temperature measuring method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112287798A
CN112287798A CN202011148359.5A
Authority
CN
China
Prior art keywords
face
region
area
temperature
face frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011148359.5A
Other languages
Chinese (zh)
Inventor
高哲峰
彭恩厚
李若岱
刘杰
左冬冬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN202011148359.5A (publication CN112287798A)
Publication of CN112287798A
Priority to PCT/CN2021/098352 (publication WO2022083130A1)
Priority to TW110131735A (publication TWI779801B)
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G06V40/166 - Detection; Localisation; Normalisation using acquisition arrangements
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 - Radiation pyrometry, e.g. infrared or optical thermometry
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 - Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0022 - Radiation pyrometry for sensing the radiation of moving bodies
    • G01J5/0025 - Living bodies
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 - Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/48 - Thermography; Techniques using wholly visual means
    • G01J5/485 - Temperature profile
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 - Feature extraction; Face representation
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 - Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077 - Imaging

Abstract

The application discloses a temperature measurement method and apparatus, an electronic device, and a storage medium. The method comprises the following steps: acquiring an image to be processed, a temperature thermodynamic diagram, and a homography matrix between the temperature thermodynamic diagram and the image to be processed; performing face detection processing on the image to be processed to obtain a first face region; determining, according to the homography matrix, a pixel point region corresponding to the first face region from the temperature thermodynamic diagram to obtain a second face region; and obtaining the temperature of the temperature measurement object corresponding to the first face region according to the temperatures of the pixel points contained in the second face region.

Description

Temperature measuring method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of temperature measurement technologies, and in particular, to a temperature measurement method and apparatus, an electronic device, and a storage medium.
Background
At present, the demand for non-contact temperature measurement is growing rapidly in the field of temperature measurement. Non-contact temperature measurement can effectively avoid cross-infection between people and plays a positive role in epidemic prevention.
In existing non-contact temperature measurement methods, the temperature measurement device performs face detection on an acquired temperature thermodynamic diagram to obtain a face region in the diagram, and then obtains the temperature of the temperature measurement object according to the temperatures of the pixel points contained in that face region. However, the accuracy of the temperature obtained by this method is relatively low.
Disclosure of Invention
The application provides a temperature measuring method and device, electronic equipment and a storage medium.
The application provides a temperature measurement method, which comprises the following steps:
acquiring an image to be processed, a temperature thermodynamic diagram and a homography matrix between the temperature thermodynamic diagram and the image to be processed;
carrying out face detection processing on the image to be processed to obtain a first face area;
determining a pixel point region corresponding to the first face region from the temperature thermodynamic diagram according to the homography matrix to obtain a second face region;
and obtaining the temperature of the temperature measurement object corresponding to the first face area according to the temperature of the pixel points contained in the second face area.
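For illustration, the four steps above can be sketched as a small pipeline. The face detector and the heat-map source are hypothetical stand-ins (any detector returning a bounding box would do), and taking the maximum pixel temperature over the projected region is only one plausible aggregation; this passage of the application does not fix a specific rule.

```python
import numpy as np

def map_point(H, x, y):
    """Map an (x, y) point through homography H using homogeneous coordinates."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]

def measure_temperature(image, heat_map, H, detect_face):
    """Sketch of the claimed method: detect a face in the RGB image,
    project its bounding box into the temperature heat map, and aggregate
    the temperatures of the covered pixel points."""
    box = detect_face(image)              # (x0, y0, x1, y1) in RGB image coords, or None
    if box is None:
        return None
    x0, y0, x1, y1 = box
    corners = [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]
    mapped = [map_point(H, x, y) for x, y in corners]   # corners of the second face region
    xs = [int(round(p[0])) for p in mapped]
    ys = [int(round(p[1])) for p in mapped]
    # Simplification: aggregate over the bounding box of the mapped quadrilateral.
    region = heat_map[min(ys):max(ys) + 1, min(xs):max(xs) + 1]
    return float(region.max())            # one plausible aggregation (maximum)
```

With an identity homography the RGB box and the heat-map region coincide, which makes the behavior easy to check by hand.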
With reference to any embodiment of the present application, determining, according to the homography matrix, a pixel point region corresponding to the first face region from the temperature thermodynamic diagram to obtain a second face region includes:
determining four pixel points corresponding to the four corner points from the temperature thermodynamic diagram according to the homography matrix and the four corner points of the face frame containing the first face area;
and sequentially connecting the four pixel points to obtain a quadrilateral area, which is taken as the second face area.
With reference to any embodiment of the present application, the performing face detection processing on the image to be processed to obtain a first face region includes:
carrying out face detection on the image to be processed to obtain at least one first face frame;
taking a pixel point region contained in a second face frame of the at least one first face frame as the first face region; the second face frame is a face frame for which the resolution of the contained pixel point region exceeds a resolution threshold.
In combination with any embodiment of the present application, the number of the first face frames exceeds 1, and the at least one first face frame includes a third face frame and a fourth face frame;
the taking a pixel point region contained in a second face frame in the at least one first face frame as the first face region includes:
and under the condition that the overlapping rate between the third face frame and the fourth face frame does not exceed an overlapping rate threshold value, taking a pixel point region contained in any one of the third face frame and the fourth face frame as the first face region.
In combination with any one of the embodiments of the present application, the taking, as the first face region, a pixel point region included in any one of the third face frame and the fourth face frame includes:
taking a pixel point region contained in whichever of the third face frame and the fourth face frame has the larger size measure as the first face region; the size measure of the third face frame is the maximum of the side lengths of the third face frame, and the size measure of the fourth face frame is the maximum of the side lengths of the fourth face frame.
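The selection rules above (overlap rate threshold, size measure as the maximum side length) can be illustrated with a short sketch. The overlap rate is assumed here to be intersection-over-union, which is one common definition; the application does not pin down the exact measure in this passage.

```python
def overlap_rate(a, b):
    """Intersection-over-union of two boxes (x0, y0, x1, y1) -- assumed here
    as one common definition of an overlap rate."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def size_measure(box):
    """Size measure of a face frame: the maximum of its side lengths."""
    return max(box[2] - box[0], box[3] - box[1])

def pick_first_face_frame(box3, box4, overlap_threshold=0.5):
    """When the two frames do not overlap beyond the threshold (i.e. they
    likely belong to different faces), keep the frame with the larger size
    measure; heavily overlapping frames would need other handling."""
    if overlap_rate(box3, box4) <= overlap_threshold:
        return max(box3, box4, key=size_measure)
    return None
```

The threshold value 0.5 is a placeholder; the application leaves the overlap rate threshold unspecified.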
In combination with any embodiment of the present application, obtaining the temperature of the temperature measurement object corresponding to the first face area according to the temperature of the pixel point included in the second face area includes:
carrying out forehead detection on the first face area to obtain a forehead detection result of the first face area;
obtaining the temperature of the temperature measurement object according to the temperatures of the pixel points included in a third face area when the forehead detection result of the first face area indicates that the forehead area in the first face area is not occluded; the third face area is the area in the second face area corresponding to the forehead area.
With reference to any one of the embodiments of the present application, the obtaining the second face area according to the quadrilateral area obtained by sequentially connecting the four pixels includes:
acquiring an intersection point of diagonal lines of the quadrilateral area;
taking the distance between the first point and the intersection point as a first distance; the first point is the pixel point which is closest to the intersection point in the four pixel points;
constructing a first area by taking the intersection point as a circle center and the first distance as a radius;
determining the intersection of the first area and the quadrilateral area to obtain a second area;
and selecting the area containing the intersection point from the second area as the second face area.
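The construction above (circle centered at the intersection of the diagonals, radius equal to the distance to the nearest corner) can be sketched as follows. Only the center and radius of the first area are computed; the quadrilateral is given by its four corners in connection order and is assumed non-degenerate.

```python
import math

def line_intersection(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4 (here, the two diagonals).
    Assumes the lines are not parallel (non-degenerate quadrilateral)."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def first_area(corners):
    """Center and radius of the circular 'first area': the center is the
    intersection point of the quadrilateral's diagonals, and the radius is
    the distance from that point to the nearest of the four corners."""
    a, b, c, d = corners                  # corners in connection order
    cx, cy = line_intersection(a, c, b, d)
    radius = min(math.hypot(x - cx, y - cy) for x, y in corners)
    return (cx, cy), radius
```

For a unit-aligned square the diagonals meet at the center and every corner is equidistant, which gives a quick sanity check.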
With reference to any one of the embodiments of the present application, the obtaining the second face area according to the quadrilateral area obtained by sequentially connecting the four pixels includes:
acquiring an intersection point of diagonal lines of the quadrilateral area;
determining a maximum inscribed area of the quadrilateral area; the maximum inscribed region is a rectangular region or a circular region containing the intersection point;
and taking the maximum inscribed area as the second face area.
With reference to any embodiment of the present application, the determining a maximum inscribed area of the quadrilateral area includes:
selecting a second largest abscissa from the abscissas of the four pixel points to obtain a first abscissa, selecting a third largest abscissa from the abscissas of the four pixel points to obtain a second abscissa, selecting a second largest ordinate from the ordinates of the four pixel points to obtain a first ordinate, and selecting a third largest ordinate from the ordinates of the four pixel points to obtain a second ordinate;
determining a second point according to the first abscissa and the first ordinate, determining a third point according to the first abscissa and the second ordinate, determining a fourth point according to the second abscissa and the first ordinate, and determining a fifth point according to the second abscissa and the second ordinate;
and taking the area obtained by sequentially connecting the second point, the third point, the fourth point and the fifth point as the maximum inscribed area.
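As a sketch, the rectangle described above keeps the second- and third-largest abscissas and ordinates of the four mapped corner points. The four resulting points are returned here in an order that traces the rectangle boundary; this is an illustration of the construction, not code from the application.

```python
def inscribed_rectangle(points):
    """Axis-aligned rectangle built from the second- and third-largest
    abscissas (x) and ordinates (y) of four corner points, as described
    above; returns its four corners in boundary order."""
    xs = sorted((p[0] for p in points), reverse=True)
    ys = sorted((p[1] for p in points), reverse=True)
    x1, x2 = xs[1], xs[2]   # first abscissa (2nd largest), second abscissa (3rd largest)
    y1, y2 = ys[1], ys[2]   # first ordinate (2nd largest), second ordinate (3rd largest)
    # The second, third, fourth and fifth points of the description, ordered
    # so that connecting them in sequence traces the rectangle.
    return [(x1, y1), (x1, y2), (x2, y2), (x2, y1)]
```

Intuitively, discarding the extreme coordinate in each direction trims off the corner that sticks out, so the rectangle tends to lie inside the mapped quadrilateral.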
In some embodiments, the present application further provides a thermometry device comprising:
the acquisition unit is used for acquiring an image to be processed, a temperature thermodynamic diagram and a homography matrix between the temperature thermodynamic diagram and the image to be processed;
the detection unit is used for carrying out face detection processing on the image to be processed to obtain a first face area;
the first processing unit is used for determining a pixel point region corresponding to the first face region from the temperature thermodynamic diagram according to the homography matrix to obtain a second face region;
and the second processing unit is used for obtaining the temperature of the temperature measurement object corresponding to the first face area according to the temperatures of the pixel points contained in the second face area.
In combination with any embodiment of the present application, the first processing unit is configured to:
determining four pixel points corresponding to the four corner points from the temperature thermodynamic diagram according to the homography matrix and the four corner points of the face frame containing the first face area;
and sequentially connecting the four pixel points to obtain a quadrilateral area, which is taken as the second face area.
In combination with any embodiment of the present application, the detection unit is configured to:
carrying out face detection on the image to be processed to obtain at least one first face frame;
taking a pixel point region contained in a second face frame of the at least one first face frame as the first face region; the second face frame is a face frame for which the resolution of the contained pixel point region exceeds a resolution threshold.
In combination with any embodiment of the present application, the number of the first face frames exceeds 1, and the at least one first face frame includes a third face frame and a fourth face frame;
the detection unit is used for:
and under the condition that the overlapping rate between the third face frame and the fourth face frame does not exceed an overlapping rate threshold value, taking a pixel point region contained in any one of the third face frame and the fourth face frame as the first face region.
In combination with any embodiment of the present application, the detection unit is configured to:
taking a pixel point region contained in whichever of the third face frame and the fourth face frame has the larger size measure as the first face region; the size measure of the third face frame is the maximum of the side lengths of the third face frame, and the size measure of the fourth face frame is the maximum of the side lengths of the fourth face frame.
In combination with any embodiment of the present application, the second processing unit is configured to:
carrying out forehead detection on the first face area to obtain a forehead detection result of the first face area;
obtaining the temperature of the temperature measurement object according to the temperatures of the pixel points included in a third face area when the forehead detection result of the first face area indicates that the forehead area in the first face area is not occluded; the third face area is the area in the second face area corresponding to the forehead area.
In combination with any embodiment of the present application, the first processing unit is configured to:
acquiring an intersection point of diagonal lines of the quadrilateral area;
taking the distance between the first point and the intersection point as a first distance; the first point is the pixel point which is closest to the intersection point in the four pixel points;
constructing a first area by taking the intersection point as a circle center and the first distance as a radius;
determining the intersection of the first area and the quadrilateral area to obtain a second area;
and selecting the area containing the intersection point from the second area as the second face area.
In combination with any embodiment of the present application, the first processing unit is configured to:
acquiring an intersection point of diagonal lines of the quadrilateral area;
determining a maximum inscribed area of the quadrilateral area; the maximum inscribed region is a rectangular region or a circular region containing the intersection point;
and taking the maximum inscribed area as the second face area.
In combination with any embodiment of the present application, the first processing unit is configured to:
selecting a second largest abscissa from the abscissas of the four pixel points to obtain a first abscissa, selecting a third largest abscissa from the abscissas of the four pixel points to obtain a second abscissa, selecting a second largest ordinate from the ordinates of the four pixel points to obtain a first ordinate, and selecting a third largest ordinate from the ordinates of the four pixel points to obtain a second ordinate;
determining a second point according to the first abscissa and the first ordinate, determining a third point according to the first abscissa and the second ordinate, determining a fourth point according to the second abscissa and the first ordinate, and determining a fifth point according to the second abscissa and the second ordinate;
and taking the area obtained by sequentially connecting the second point, the third point, the fourth point and the fifth point as the maximum inscribed area.
In some embodiments, the present application further provides a processor configured to perform the method of the first aspect and any one of its possible implementations.
In some embodiments, the present application further provides an electronic device comprising: a processor, transmitting means, input means, output means, and a memory for storing computer program code comprising computer instructions, which, when executed by the processor, cause the electronic device to perform the method of the first aspect and any one of its possible implementations.
In some embodiments, the present application further provides a computer-readable storage medium having stored therein a computer program comprising program instructions which, if executed by a processor, cause the processor to perform the method of the first aspect and any one of its possible implementations.
In some embodiments, the present application also provides a computer program product comprising a computer program or instructions which, if run on a computer, causes the computer to perform the method of the first aspect and any of its possible implementations.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic flow chart of a temperature measurement method according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a temperature measuring device according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a hardware structure of a temperature measuring device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
At present, the demand for non-contact temperature measurement is growing rapidly in the field of temperature measurement. Non-contact temperature measurement can effectively avoid cross-infection between people and plays a positive role in epidemic prevention.
A commonly used temperature measurement method is to perform face detection directly on the temperature thermodynamic diagram generated by the temperature measurement module of a non-contact temperature measurement device, so as to obtain a face region in the diagram, and then to obtain the temperature of the temperature measurement object according to the temperatures of the pixel points contained in that face region. However, the temperature thermodynamic diagrams produced by different manufacturers' temperature measurement modules differ significantly, and building a universal face detection model for thermodynamic diagrams is very costly. Moreover, the face detection model of a non-contact temperature measurement device is generally not trained on temperature thermodynamic diagrams. A temperature thermodynamic diagram carries little information, so the face detection effect is poor and the obtained face region is inaccurate; consequently, the temperature obtained from the pixel points of that inaccurate face region in the diagram is also inaccurate. This temperature measurement method therefore consumes considerable labor cost, has low detection efficiency, and easily produces inaccurate measurement results.
Based on this, the embodiments of the present application provide a temperature measurement method implemented by a temperature measurement apparatus, which is the execution subject of the embodiments. Optionally, the temperature measurement apparatus may be one of the following: a mobile phone, a computer, a tablet computer, or an access control device. The embodiments of the present application are described below with reference to the drawings.
Referring to fig. 1, fig. 1 is a schematic flow chart of a temperature measuring method according to an embodiment of the present disclosure.
101. Acquiring an image to be processed, a temperature thermodynamic diagram and a homography matrix between the temperature thermodynamic diagram and the image to be processed.
In the embodiment of the present application, the image to be processed may be any image. For example, the image to be processed may include a human face, and may also include a human face and an object. The content contained in the image to be processed is not limited.
The temperature thermodynamic diagram is a pseudo-color map, corresponding to the image to be processed, that carries temperature information: each pixel point of the diagram carries corresponding temperature information, and a darker color represents a higher temperature. That is to say, the temperature of the temperature measurement object in the image to be processed can be obtained from the temperatures of the pixel points in the temperature thermodynamic diagram.
In one possible implementation of acquiring the image to be processed, the temperature measurement device is equipped with an RGB camera, through which it acquires the image to be processed. Optionally, the temperature measurement device is assembled with an access control device. The temperature measurement device may be a non-contact temperature measurement product such as an artificial intelligence (AI) infrared imager or a security gate (such products are mainly placed in scenes with dense pedestrian flow, such as stations, airports, subways, shops, supermarkets, schools, company halls, and residential-community gates).
In another possible implementation of acquiring the image to be processed, the temperature measurement device is equipped with a monitoring camera. The video stream sent by the monitoring camera is decoded, and a decoded RGB image is taken as the image to be processed.
In one possible implementation of acquiring the temperature thermodynamic diagram, the temperature measurement device is equipped with an infrared imaging device and acquires the temperature thermodynamic diagram through the infrared imaging camera of that device.
In the embodiment of the present application, a homography is equivalent to a projective transformation that changes the position and shape of an object. The homography matrix H is a pre-calibrated homography mapping transformation parameter. The homography matrix between the temperature thermodynamic diagram and the image to be processed can be acquired using a conventional method, which involves the following steps: extracting feature points from each image; extracting a descriptor for each feature point; finding matched feature point pairs between the two images by matching the feature point descriptors (mismatches may exist); rejecting false matches using the RANSAC algorithm; and solving the resulting equation set to calculate the homography transformation matrix. The temperature thermodynamic diagram is an image captured by an infrared imaging device, and the image to be processed is an image captured by an RGB imaging device.
In one possible implementation of acquiring the homography matrix between the temperature thermodynamic diagram and the image to be processed, an RGB camera and an infrared imaging camera are mounted on the temperature measurement device. A square card is placed about 1 m in front of the device, and two images are captured: an RGB image of the square card by the RGB camera, and an infrared imaging image of the square card by the infrared imaging camera. Local feature points are then extracted from both images; algorithms that may be adopted include scale-invariant feature transform (SIFT), speeded-up robust features (SURF), corner detection (e.g., FAST), and the like, which are not limited herein. Next, a descriptor is extracted for each local feature point, and matched feature point pairs between the RGB image and the infrared imaging image of the square card are found by matching the feature point descriptors. Because mismatches may exist, they are eliminated using the random sample consensus (RANSAC) algorithm; the corresponding equation set is then solved to calculate the homography transformation matrix. It should be understood that the positions and angles at which the RGB camera and the infrared imaging camera of each temperature measurement device are mounted may differ, so the homography matrices obtained for different devices may differ. Therefore, each temperature measurement device needs to be calibrated with a square card: after the image captured by the RGB camera and the image captured by the infrared imaging camera are obtained, the homography matrix is calculated by the above method.
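The final step of this calibration (solving the equation set for the homography from matched point pairs) can be sketched in pure NumPy via the standard direct linear transform. This is an illustration of the underlying mathematics, not code from the application; in practice a library routine such as OpenCV's homography estimator would typically be used.

```python
import numpy as np

def solve_homography(src, dst):
    """Estimate the 3x3 homography H mapping src[i] -> dst[i] from at least
    four matched point pairs, by solving the homogeneous linear system
    A h = 0 (direct linear transform) with SVD."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.array(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)     # null-space vector = stacked entries of H
    return H / H[2, 2]           # normalize so that H[2, 2] == 1

def apply_homography(H, point):
    """Map a point through H using homogeneous coordinates."""
    x, y, w = H @ np.array([point[0], point[1], 1.0])
    return x / w, y / w
```

With exact (noise-free) correspondences the recovered matrix matches the one used to generate them, up to floating-point precision.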
102. Carrying out face detection processing on the image to be processed to obtain a first face area.
In the embodiment of the present application, the face detection processing is used to determine whether the image to be processed contains a face region; the temperature measurement device determines this by performing face detection processing on the image to be processed.
If the temperature measurement device determines that the image to be processed contains a face region, this indicates that the image to be processed contains a temperature measurement object, and the first face region is obtained. It should be understood that the first face region is the face region corresponding to the temperature measurement object in the image to be processed.
In a possible implementation manner of face detection processing, a temperature measuring device performs face feature extraction processing on an image to be processed to obtain face feature data of the image to be processed. And further determining whether the image to be processed contains a face region according to the face feature data.
In another possible implementation of the face detection processing, the temperature measurement device processes the image to be processed with a face detection neural network to determine whether it contains a face region. The face detection neural network is trained with first training images carrying annotation information as training data, where the annotation information indicates whether a first training image contains a face.
In another possible implementation of the face detection processing, a pre-trained neural network performs feature extraction processing on the image to be processed to obtain feature data, and identifies, according to the features in the feature data, whether the image to be processed contains a face. When the feature data indicates that the image contains a face, the position of the face frame in the image is determined, thereby completing face detection. This face detection processing can be implemented by a convolutional neural network.
The convolutional neural network is trained with a number of annotated images as training data, so that the trained network can complete face detection on images. The annotation information of an image in the training data is whether it contains a face and the position of the face. During training, the convolutional neural network extracts feature data from an image, determines from the feature data whether a face exists, and, if so, obtains the position of the face from the feature data. The annotation information serves as supervision for the results obtained during training, and the parameters of the convolutional neural network are updated to complete its training. In this way, the trained convolutional neural network can be used to process the image to be processed and obtain the position of the face in it.
In another possible implementation manner of the face detection processing, the face detection processing may be implemented by a face detection algorithm, where the face detection algorithm may be one of the following: a face detection algorithm based on rough histogram segmentation and singular value features, face detection based on binary wavelet transform, the probabilistic decision-based neural network (PDBNN) method, the hidden Markov model (HMM) method, and so on; the face detection algorithm used to implement the face detection processing is not specifically limited in the present application.
103. Determining a pixel point region corresponding to the first face region from the temperature thermodynamic diagram according to the homography matrix to obtain a second face region;
The pixel points in the temperature thermodynamic diagram that correspond to pixel points in the image to be processed are determined through the homography matrix.
For example, assume that the coordinate of pixel point A in the image to be processed is (2, 4), the pixel point corresponding to pixel point A in the thermodynamic diagram is pixel point D, and the homography matrix is

H = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

The coordinate of pixel point A is first converted into the homogeneous coordinate (2, 4, 1), whose matrix form is B = [2, 4, 1]ᵀ. Multiplying the homography matrix H by the homogeneous coordinate of pixel point A gives the matrix

C = H · B = [13, 34, 55]ᵀ

Dividing the first value 13 of matrix C by the third value 55 gives the abscissa of pixel point D as 13/55; dividing the second value 34 of matrix C by the third value 55 gives the ordinate of pixel point D as 34/55. Therefore, the coordinate of the pixel point D corresponding to pixel point A in the thermodynamic diagram is (13/55, 34/55).
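As an illustration, this coordinate mapping can be sketched in Python with NumPy; `map_point` is a name chosen here, and the matrix H below is filled in so that H·(2, 4, 1)ᵀ = (13, 34, 55)ᵀ, matching the worked example:

```python
import numpy as np

def map_point(H, x, y):
    """Map a pixel coordinate from the image to be processed into the
    temperature thermodynamic diagram via the 3x3 homography matrix H."""
    # Convert (x, y) to the homogeneous coordinate (x, y, 1).
    b = np.array([x, y, 1.0])
    # Multiply by the homography matrix to get c = (c1, c2, c3).
    c = H @ b
    # Divide the first two values by the third to recover 2-D coordinates.
    return c[0] / c[2], c[1] / c[2]

# Homography matrix consistent with the worked example: H @ (2, 4, 1) = (13, 34, 55).
H = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
print(map_point(H, 2, 4))  # abscissa 13/55, ordinate 34/55
```

In practice the homography matrix would come from the calibration between the visible-light camera and the thermal sensor rather than being hard-coded.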
According to the homography matrix, the temperature measuring device can determine the pixel points in the temperature thermodynamic diagram that correspond to the pixel points on the contour of the first face region, and thereby determine the pixel point region in the temperature thermodynamic diagram corresponding to the first face region.
104. And obtaining the temperature of the temperature measurement object corresponding to the first face area according to the temperature of the pixel points contained in the second face area.
In the embodiment of the application, the temperature measuring device can determine the temperature corresponding to any pixel point in the temperature thermodynamic diagram. And when the temperature thermodynamic diagram contains a second face area, the temperature measuring device can obtain the temperature of a temperature measuring object corresponding to the first face area according to the temperature of pixel points contained in the second face area.
In a possible implementation manner, the temperature measuring device uses the average temperature value of at least one pixel point in the second face region as the temperature of the temperature measurement object. For example, suppose the pixel points of the second face region are pixel point a and pixel point b, where the temperature of pixel point a is 36.9 degrees and the temperature of pixel point b is 36.3 degrees. The temperature measuring device may use the average (36.6 degrees) of the temperatures of pixel point a and pixel point b as the temperature of the temperature measurement object; it may also use the temperature of pixel point a (36.9 degrees), or the temperature of pixel point b (36.3 degrees), as the temperature of the temperature measurement object.
In another possible implementation manner, the temperature measuring device uses the highest temperature value corresponding to the pixel point in the second face region as the temperature of the temperature measuring object. For example, the pixel points of the second face region include: the temperature of the pixel point a is 36.9 degrees, the temperature of the pixel point b is 36.3 degrees, and the temperature of the pixel point c is 37 degrees. The temperature measuring device can use the maximum value (37 degrees) of the temperature corresponding to the pixel point a, the temperature corresponding to the pixel point b and the temperature corresponding to the pixel point c as the temperature of the temperature measuring object.
In another possible implementation manner, the temperature measuring device first calculates the average temperature value of the pixel points in the second face region, selects at least one pixel point from those whose difference from the average does not exceed a threshold, and uses the average temperature of the selected pixel points as the temperature of the temperature measurement object. For example, suppose the pixel points of the second face region are pixel point a, pixel point b, pixel point c, and pixel point d, with temperatures of 36.9 degrees, 36.3 degrees, 37.9 degrees, and 35.2 degrees respectively. The temperature measuring device calculates the average temperature of the pixel points in the second face region to be 36.575 degrees. With a threshold of 1 degree, both the difference between the temperature of pixel point c and the average (1.325 degrees) and the difference between the temperature of pixel point d and the average (1.375 degrees) exceed 1 degree, so the temperature measuring device selects at least one pixel point from pixel point a and pixel point b.
Further, the temperature measuring device may use the average (36.6 degrees) of the temperatures of pixel point a and pixel point b as the temperature of the temperature measurement object; it may also use the temperature of pixel point a (36.9 degrees), or the temperature of pixel point b (36.3 degrees), as the temperature of the temperature measurement object.
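The three temperature strategies above (average, maximum, and outlier-filtered average) can be sketched in one helper; `object_temperature` and the strategy names are illustrative, not from the patent:

```python
def object_temperature(temps, strategy="mean", outlier_threshold=1.0):
    """Derive the temperature of the measurement object from the temperatures
    of the pixel points contained in the second face region."""
    if strategy == "mean":
        # Average over the pixel points in the region.
        return sum(temps) / len(temps)
    if strategy == "max":
        # Highest temperature value among the pixel points.
        return max(temps)
    if strategy == "filtered_mean":
        # Drop pixel points whose deviation from the region mean exceeds
        # the threshold, then average the remaining ones.
        mean = sum(temps) / len(temps)
        kept = [t for t in temps if abs(t - mean) <= outlier_threshold]
        return sum(kept) / len(kept)
    raise ValueError(f"unknown strategy: {strategy}")

# Worked example from above: the mean of the four pixels is 36.575 degrees,
# so 37.9 and 35.2 deviate by more than 1 degree and are discarded.
print(object_temperature([36.9, 36.3, 37.9, 35.2], "filtered_mean"))  # ~36.6
```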
In the embodiment of the application, by performing face detection processing on the image to be processed, the temperature measuring device can accurately determine the face region of the temperature measurement object in the image to be processed, obtaining the first face region. Determining the face region of the temperature measurement object in the temperature thermodynamic diagram according to the homography matrix between the image to be processed and the temperature thermodynamic diagram, together with the position of the first face region in the image to be processed, improves the accuracy with which that region, the second face region, is located in the temperature thermodynamic diagram. Obtaining the temperature of the temperature measurement object from the temperatures in the second face region therefore improves the accuracy of the measured temperature.
As an alternative implementation, in order to determine the pixel point region in the temperature thermodynamic diagram corresponding to the first face region, the position of the first face region needs to be determined first. The position of the first face region can be determined from the coordinates of the four corner points of the face frame containing the first face region. Since the face detection processing that yields the first face region can also yield the face frame containing it, the temperature measuring device performs the following steps in the course of performing step 102:
21. carrying out face detection on the image to be processed to obtain at least one first face frame;
in the embodiment of the application, when a plurality of faces exist in the image to be processed, the face detection processing outputs a plurality of face frames, each containing a face region and sized according to that face region in the image to be processed. That is to say, performing face detection processing on the image to be processed yields at least one first face frame. It should be understood that the size of a face frame is determined by the size of the face region it contains. Each of the at least one first face frame is rectangular, and its position is determined by the coordinates of its four corner points.
22. Taking a pixel point region contained in a second face frame in the at least one first face frame as the first face region; the second face frame comprises a face frame with the resolution of the pixel point region exceeding the resolution threshold;
in the embodiment of the application, a face frame whose contained pixel point region has a resolution exceeding the resolution threshold is selected from the at least one first face frame, obtaining the second face frame. In the present application, the face resolution required for the face detection processing is at least 60×60 pixels; that is, the resolution threshold is at least 60×60. For example, when the face resolution of the first face region is only 56×56, the accuracy of the first face region obtained by the face detection processing cannot be guaranteed, and neither can the accuracy of the corresponding second face region in the thermodynamic diagram. Therefore, the resolution threshold is set to at least 60×60, but the specific value of the resolution threshold is not limited in the present application.
A possible implementation manner is that, when the number of first face frames is 1 and the resolution of the pixel point region contained in that face frame exceeds the resolution threshold, the second face frame is obtained and used as the face frame of the temperature measurement object in the image to be processed.
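A minimal sketch of the resolution filtering, assuming each face frame is represented as (x, y, width, height) in pixels and using the 60×60 threshold mentioned above (`filter_by_resolution` is a name chosen here for illustration):

```python
# Minimum width and height of the contained pixel point region, per the
# embodiment above (the exact threshold is not limited by the application).
RESOLUTION_THRESHOLD = (60, 60)

def filter_by_resolution(face_frames, threshold=RESOLUTION_THRESHOLD):
    """Keep only face frames whose contained pixel point region reaches the
    resolution threshold; each frame is (x, y, width, height)."""
    min_w, min_h = threshold
    return [f for f in face_frames if f[2] >= min_w and f[3] >= min_h]

frames = [(10, 20, 56, 56),   # below threshold: accuracy cannot be guaranteed
          (100, 40, 80, 96)]  # qualifies as a second face frame
print(filter_by_resolution(frames))  # [(100, 40, 80, 96)]
```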
As an alternative implementation, the face detection processing performed on the image to be processed may yield a plurality of face frames. Because the present application is directed to single-person temperature measurement, one of these face frames must be selected. When the number of first face frames exceeds 1, the at least one first face frame includes a third face frame and a fourth face frame, and the temperature measuring device performs the following steps in the course of using the pixel point region contained in a second face frame among the at least one first face frame as the first face region:
23. when the overlap ratio between the third face frame and the fourth face frame does not exceed an overlap ratio threshold, regarding a pixel point region included in any one of the third face frame and the fourth face frame as the first face region;
in the embodiment of the present application, when the number of first face frames exceeds 1, the face frame containing the first face region is selected from the at least one first face frame. If, for some face frame among the at least one first face frame, there exists another, different face frame in the image to be processed such that the pixel point regions contained in the two face frames have an overlapping part, the pixel point region contained in that face frame cannot be used as the first face region. That is to say, when the overlap ratio between a face frame and any other face frame exceeds the overlap ratio threshold, the pixel point region contained in that face frame cannot be used as the first face region.
When the overlap ratio between any two face frames among the at least one first face frame does not exceed the overlap ratio threshold, i.e. there is a gap between every two face frames, the area of the overlapping part of the pixel point regions contained in any two face frames is 0, i.e. the overlap ratio between any two face frames is 0%. In that case, the pixel point region contained in any one of the at least one first face frame can be used as the first face region. The overlap ratio threshold is therefore set to 0.
In one possible implementation manner, the number of first face frames is 2; denote the two face frames as the third face frame and the fourth face frame. The overlap ratio between the third and fourth face frames is calculated, and when it is determined not to exceed the overlap ratio threshold, the pixel point region contained in either of the two face frames is used as the first face region. For example, suppose the at least one first face frame consists of two face frames, face frame b and face frame c, and the overlap ratio between them is calculated. When the overlap ratio between face frame b and face frame c does not exceed the overlap ratio threshold, the pixel point regions contained in the two face frames do not overlap; therefore, the pixel point region contained in face frame b can be used as the first face region, and so can the pixel point region contained in face frame c. When the overlap ratio between face frame b and face frame c exceeds the overlap ratio threshold, the two pixel point regions have an overlapping part; therefore, neither the pixel point region contained in face frame b nor that contained in face frame c can be used as the first face region.
In another possible implementation manner, the number of first face frames is 3; denote the three face frames as the third, fourth, and fifth face frames. The overlap ratios between the third and fourth, the third and fifth, and the fourth and fifth face frames are calculated respectively. In one possible case, the overlap ratio between the third and fourth face frames exceeds the overlap ratio threshold, while the overlap ratios between the third and fifth and between the fourth and fifth face frames do not. That is to say, the pixel point regions contained in the third and fourth face frames overlap, while the region contained in the fifth face frame overlaps neither of them. Therefore, neither the pixel point region contained in the third face frame nor that contained in the fourth can be used as the first face region; only the pixel point region contained in the fifth face frame can. In yet another possible case, the overlap ratios between the third and fourth and between the third and fifth face frames both exceed the overlap ratio threshold, while that between the fourth and fifth does not. That is to say, the pixel point region contained in the third face frame overlaps both the region contained in the fourth face frame and the region contained in the fifth, while the fourth and fifth do not overlap each other. Therefore, none of the pixel point regions contained in the third, fourth, or fifth face frames can be used as the first face region. In yet another possible case, none of the three pairwise overlap ratios exceeds the overlap ratio threshold; that is, no two of the three pixel point regions overlap. Therefore, the pixel point region contained in any of the third, fourth, or fifth face frames can be used as the first face region.
In another possible implementation manner, when more than one face frame among the at least one first face frame contains a pixel point region whose resolution exceeds the resolution threshold, the face frame containing the first face region is selected from among those face frames. As before, if, for any face frame whose contained region's resolution exceeds the resolution threshold, there exists another such face frame whose contained pixel point region overlaps it, the pixel point region contained in that face frame cannot be used as the first face region. That is to say, when the overlap ratio between two face frames whose contained regions exceed the resolution threshold exceeds the overlap ratio threshold, the pixel point region contained in either of them cannot be used as the first face region.
In another possible implementation manner, two face frames among the at least one first face frame contain pixel point regions whose resolution exceeds the resolution threshold; denote them as the third face frame and the fourth face frame. The overlap ratio between the third and fourth face frames is calculated, and when it is determined not to exceed the overlap ratio threshold, the pixel point region contained in either of the two face frames is used as the first face region. For example, suppose face frame b and face frame c are the two face frames whose contained pixel point regions exceed the resolution threshold, and the overlap ratio between them is calculated. When the overlap ratio between face frame b and face frame c does not exceed the overlap ratio threshold, the pixel point regions contained in the two face frames do not overlap; therefore, the pixel point region contained in face frame b can be used as the first face region, and so can the pixel point region contained in face frame c. When the overlap ratio exceeds the overlap ratio threshold, the two pixel point regions have an overlapping part; therefore, neither region can be used as the first face region.
In another possible implementation manner, three face frames among the at least one first face frame contain pixel point regions whose resolution exceeds the resolution threshold; denote them as the third, fourth, and fifth face frames. The overlap ratios between the third and fourth, the third and fifth, and the fourth and fifth face frames are calculated respectively. In one possible case, the overlap ratio between the third and fourth face frames exceeds the overlap ratio threshold, while the other two pairwise overlap ratios do not: the pixel point regions contained in the third and fourth face frames overlap, and the region contained in the fifth face frame overlaps neither. Therefore, neither the pixel point region contained in the third face frame nor that contained in the fourth can be used as the first face region; only the pixel point region contained in the fifth face frame can. In yet another possible case, the overlap ratios between the third and fourth and between the third and fifth face frames both exceed the overlap ratio threshold, while that between the fourth and fifth does not: the pixel point region contained in the third face frame overlaps both of the others. Therefore, none of the pixel point regions contained in the third, fourth, or fifth face frames can be used as the first face region. In yet another possible case, none of the three pairwise overlap ratios exceeds the overlap ratio threshold: no two of the three pixel point regions overlap. Therefore, the pixel point region contained in any of the third, fourth, or fifth face frames can be used as the first face region.
In a possible implementation manner of determining whether the overlap ratio between two face frames exceeds the overlap ratio threshold, the edges of both face frames are parallel to the x axis or the y axis, i.e. the face frames are aligned with the horizontal and vertical axes of the pixel coordinate system of the image to be processed. Suppose there are two face frames in the image to be processed: face frame A and face frame B. The condition that the overlap ratio between face frame A and face frame B does not exceed the overlap ratio threshold can be understood as the edges of the two face frames not overlapping at all. There are then four cases in which the overlap ratio between face frame B and face frame A does not exceed the overlap ratio threshold: in the first case, face frame B is above face frame A; in the second case, below it; in the third case, to its left; in the fourth case, to its right. Let the upper-left corner coordinate of face frame A be p1 and its lower-right corner coordinate p2, and let the upper-left corner coordinate of face frame B be p3 and its lower-right corner coordinate p4. When face frame B is above face frame A, the ordinate of p1 is smaller than the ordinate of p4; when face frame B is below face frame A, the ordinate of p3 is smaller than the ordinate of p2; when face frame B is to the left of face frame A, the abscissa of p1 is larger than the abscissa of p4; when face frame B is to the right of face frame A, the abscissa of p2 is smaller than the abscissa of p3. In all four of these cases, the overlap ratio between face frame A and face frame B does not exceed the overlap ratio threshold.
In those cases, both the pixel point region contained in face frame A and the pixel point region contained in face frame B in the image to be processed can be used as the first face region.
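The four separation cases can be sketched as a single disjointness test; the corner layout follows the p1..p4 description above, with the ordinate increasing upward as in that convention, and `frames_disjoint` is an illustrative name:

```python
def frames_disjoint(a, b):
    """True when the overlap ratio between face frames a and b is 0, i.e.
    their pixel point regions have no overlapping part.  Each frame is given
    by its upper-left and lower-right corners ((x_ul, y_ul), (x_lr, y_lr)),
    with the ordinate increasing upward as in the p1..p4 description."""
    (p1, p2), (p3, p4) = a, b
    return (p1[1] < p4[1]      # frame b is above frame a
            or p3[1] < p2[1]   # frame b is below frame a
            or p1[0] > p4[0]   # frame b is to the left of frame a
            or p2[0] < p3[0])  # frame b is to the right of frame a

# Face frame A spans x in [0, 10], y in [0, 10].
A = ((0, 10), (10, 0))
B_above = ((0, 30), (10, 20))    # entirely above A
B_overlap = ((5, 8), (15, 2))    # overlapping part with A
print(frames_disjoint(A, B_above), frames_disjoint(A, B_overlap))  # True False
```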
It should be understood that there are only two relationships between any two different face frames in the plurality of face frames, one is that the overlap ratio between the two different face frames exceeds the overlap ratio threshold, and the other is that the overlap ratio between the two different face frames does not exceed the overlap ratio threshold. The number of the face frames is not limited in the present application.
As an optional implementation manner, when the overlap ratio between the third face frame and the fourth face frame does not exceed the overlap ratio threshold, both the pixel point region contained in the third face frame and that contained in the fourth can be used as the first face region. However, in a scenario where people queue to have their body temperature checked, such as a security checkpoint, the person closer to the temperature measuring device generally corresponds to a larger face region. Therefore, the pixel point region contained in the larger of the third and fourth face frames is used as the first face region, and the temperature measuring device performs the following steps in the course of performing step 23:
24. taking the pixel point region contained in the face frame with the largest size measure among the third face frame and the fourth face frame as the first face region, where the size measure of the third face frame is the maximum of its side lengths and the size measure of the fourth face frame is the maximum of its side lengths;
the embodiment of the application relates to a queuing temperature measurement scenario: the closer a person is to the front of the queue, the larger the corresponding face region shown on the display interface of the temperature measuring device. Therefore, when a plurality of face frames exist in the image to be processed, the pixel point region contained in the largest of them is selected as the first face region. The size measure of each face frame is the maximum of its side lengths; accordingly, the maximum side length of each face frame in the image to be processed is obtained first, and the face frame with the largest size measure is then selected.
In a possible implementation manner, two face frames exist among the at least one first face frame such that the overlap ratio between them does not exceed the overlap ratio threshold; denote them as the third face frame and the fourth face frame. When the maximum side length of the third face frame is greater than that of the fourth face frame, the pixel point region contained in the third face frame is used as the first face region. For example, suppose the third face frame is 40 pixels long and 45 pixels wide, so its size measure is 45 pixels, and the fourth face frame is 36 pixels long and 32 pixels wide, so its size measure is 36 pixels. Since 45 pixels exceeds 36 pixels, the size measure of the third face frame is larger than that of the fourth, and the pixel point region contained in the third face frame is used as the first face region.
In another possible implementation manner, the pixel point regions contained in two face frames among the at least one first face frame exceed the resolution threshold, and the overlap ratio between the two face frames does not exceed the overlap ratio threshold; denote them as the third face frame and the fourth face frame. When the maximum side length of the third face frame is greater than that of the fourth face frame, the pixel point region contained in the third face frame is used as the first face region. For example, suppose the third face frame is 40 pixels long and 45 pixels wide, so its size measure is 45 pixels, and the fourth face frame is 36 pixels long and 32 pixels wide, so its size measure is 36 pixels. Since 45 pixels exceeds 36 pixels, the size measure of the third face frame is larger than that of the fourth, and the pixel point region contained in the third face frame is used as the first face region.
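The size-measure selection can be sketched as follows, assuming each face frame is given by its (length, width) side lengths in pixels; the function names are illustrative, not from the patent:

```python
def size_measure(frame):
    """Size measure of a face frame: the maximum of its side lengths."""
    length, width = frame
    return max(length, width)

def largest_face_frame(frames):
    """Among face frames whose mutual overlap ratio does not exceed the
    threshold, pick the one with the largest size measure (typically the
    nearest person in a queuing scenario)."""
    return max(frames, key=size_measure)

# Worked example from above: 40x45 frame (measure 45) beats 36x32 (measure 36).
third, fourth = (40, 45), (36, 32)
print(largest_face_frame([third, fourth]))  # (40, 45)
```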
It should be understood that the present application only describes, by way of example, selecting the largest face frame from two face frames; the same approach applies to selecting the largest face frame from more than two face frames, and the number of face frames is not limited in the present application.

As an alternative embodiment, the temperature measuring device performs the following steps in the process of performing step 103:
31. Determining, according to the homography matrix and the four corner points of the face frame containing the first face region, the four pixel points in the temperature thermodynamic diagram corresponding to the four corner points.
In the embodiment of the application, the coordinates of the four corner points of the face frame of the temperature measurement object in the image to be processed are mapped, through the homography matrix, to the coordinates of four corresponding pixel points in the temperature thermodynamic diagram. In this method, the homography matrix, obtained by calibrating each temperature measuring device with a square calibration card, is a 3x3 matrix. Suppose the coordinates of the four corner points of the face frame are the first corner point coordinates (a1, b1), the second corner point coordinates (a2, b2), the third corner point coordinates (a3, b3) and the fourth corner point coordinates (a4, b4); the coordinates of the four pixel points obtained from the four corner point coordinates and the homography matrix are the first coordinates (x1, y1), the second coordinates (x2, y2), the third coordinates (x3, y3) and the fourth coordinates (x4, y4), respectively.
It is to be understood that the first coordinates (x1, y1) are obtained from the first corner point coordinates (a1, b1) and the homography matrix H. Similarly, the second coordinates (x2, y2) can be obtained from the second corner point coordinates (a2, b2) and H; the third coordinates (x3, y3) from the third corner point coordinates (a3, b3) and H; and the fourth coordinates (x4, y4) from the fourth corner point coordinates (a4, b4) and H.
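The corner mapping in step 31 can be sketched with standard homogeneous coordinates, assuming H is the 3x3 calibration homography: each corner (a, b) becomes [x', y', w] = H @ [a, b, 1], then (x, y) = (x'/w, y'/w). The identity matrix below is for illustration only, not a real calibration result:

```python
import numpy as np

def map_corners(H, corners):
    """Apply a 3x3 homography H to an (N, 2) array of corner coordinates."""
    corners = np.asarray(corners, dtype=float)
    ones = np.ones((corners.shape[0], 1))
    homogeneous = np.hstack([corners, ones])   # (N, 3): append w = 1
    mapped = homogeneous @ H.T                 # (N, 3): rows are [x', y', w]
    return mapped[:, :2] / mapped[:, 2:3]      # perspective divide by w

H = np.eye(3)  # placeholder homography; a calibrated H would be used here
corners = [(10, 20), (50, 20), (50, 70), (10, 70)]
print(map_corners(H, corners))  # identity: corners map to themselves
```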
32. Sequentially connecting the four pixel points to obtain a quadrilateral area, which is taken as the second face area.
In the embodiment of the present application, the four pixel points are points in the temperature thermodynamic diagram, and connecting them in sequence yields a quadrilateral area. Each pixel point in the temperature thermodynamic diagram carries the temperature information of the corresponding location. Optionally, the temperature thermodynamic diagram is acquired by an infrared thermal imaging device on the temperature measuring device. The temperature measuring device obtains the quadrilateral area on the temperature thermodynamic diagram and takes it as the second face area, that is, the pixel point area corresponding to the first face area.
As an alternative embodiment, the thermometric apparatus performs the following steps in the process of performing step 32:
acquiring an intersection point of the diagonals of the quadrilateral area; taking the distance between a first point and the intersection point as a first distance, where the first point is the pixel point closest to the intersection point among the four pixel points; constructing a first area with the intersection point as the center and the first distance as the radius; determining the intersection of the first area and the quadrilateral area to obtain a second area; and selecting an area containing the intersection point from the second area as the second face area;
in the embodiment of the application, the coordinates of the four pixel points in the temperature thermodynamic diagram are obtained through the homography matrix and the coordinates of the four corner points of the face frame in the RGB image. The four pixel points in the temperature thermodynamic diagram are sequentially connected to obtain a quadrilateral area. If the quadrilateral area were used directly as the second face area, the second face area might contain pixel points corresponding to areas other than the first face area, which would reduce the accuracy of the temperature measurement result to some extent. It is to be understood that the present application relates to close-range single-person temperature measurement, so the middle portion of the quadrilateral area is likely to be part of the pixel point area corresponding to the first face area. Therefore, an area that accurately corresponds to the first face area is determined from the pixel point area contained in the quadrilateral area, where the pixel point at the intersection of the diagonals of the quadrilateral area is likely to be a pixel point corresponding to the first face area.
The distances between the four pixel points and the intersection point are determined; the point closest to the intersection point is taken as the first point, and the distance between the first point and the intersection point is taken as the first distance. The first area is the circular area with the intersection point as the center and the first distance as the radius. However, the first area may contain pixel points outside the quadrilateral area. Therefore, a second area is selected from the first area: the overlapping area of the first area and the pixel point area contained in the quadrilateral area. The second area thus constructed lies in the middle portion of the quadrilateral area.
When the temperature measuring module obtains the temperature of the temperature measurement object in the image to be processed from the temperature of any part of the pixel point region corresponding to the first face region in the temperature thermodynamic diagram, it needs to be established that this pixel point region corresponds to the first face region with high probability. Since the middle of the quadrilateral region can be taken as the pixel point region corresponding to the first face region, the constructed second region has a high probability of corresponding to the first face region. Selecting the area containing the intersection point from the second area as the second face area can therefore improve the accuracy of the temperature measurement result.
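A minimal numeric sketch of the construction above, assuming the quadrilateral is convex so its diagonals intersect; the square coordinates are illustrative values only. With vertices P1..P4 in connection order, the diagonals are P1P3 and P2P4, and the intersection E solves P1 + t(P3-P1) = P2 + s(P4-P2); the first distance is then the distance from E to the nearest vertex:

```python
import numpy as np

def diagonal_intersection(p1, p2, p3, p4):
    """Intersection of the diagonals P1P3 and P2P4 of a convex quadrilateral."""
    p1, p2, p3, p4 = (np.asarray(p, dtype=float) for p in (p1, p2, p3, p4))
    d1, d2 = p3 - p1, p4 - p2
    # Solve the 2x2 linear system t*d1 - s*d2 = p2 - p1 for t.
    A = np.column_stack([d1, -d2])
    t, _ = np.linalg.solve(A, p2 - p1)
    return p1 + t * d1

quad = [(0, 0), (4, 0), (4, 4), (0, 4)]  # a square, for illustration
E = diagonal_intersection(*quad)
first_distance = min(np.linalg.norm(E - np.asarray(v)) for v in quad)
print(E, first_distance)  # centre (2, 2); distance to nearest corner ~ 2.83
```

The circle centred at E with radius `first_distance` is the first area; intersecting it with the quadrilateral gives the second area described above.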
In one possible implementation, the pixel point corresponding to the intersection point is used as the second face region.
In another possible implementation manner, a pixel point region included in the maximum inscribed circle or a pixel point region included in the maximum inscribed rectangle of the second region is used as the second face region.
In another possible implementation manner, a pixel point region included in a maximum inscribed circle in the second region with the intersection point as a center is used as the second face region.
As an alternative embodiment, the thermometric apparatus performs the following steps in the process of performing step 32:
acquiring an intersection point of diagonal lines of the quadrilateral area; determining the maximum inscribed area of the quadrilateral area; the maximum inscribed region is a rectangular region or a circular region containing the intersection point; taking the maximum inscribed region as the second face region;
in the embodiment of the application, the coordinates of the four pixel points in the temperature thermodynamic diagram can be obtained through the homography matrix and the coordinates of the four corner points of the face frame in the RGB image. The four pixel points in the temperature thermodynamic diagram are sequentially connected to obtain a quadrilateral area. The pixel point at the intersection of the diagonals of the quadrilateral area is likely to be one of the pixel points corresponding to the first face area. Because the present application relates to close-range single-person temperature measurement, according to the positional relationship between the RGB imaging device and the infrared imaging device in the temperature measuring device and the mapping relationship of the homography matrix, a maximum inscribed rectangular or circular area containing the intersection point, cut from the quadrilateral area, can be expected to lie within the pixel point area corresponding to the first face area in the temperature thermodynamic diagram. The coordinates of the four pixel points are obtained through the homography matrix and the coordinates of the four corner points, and the maximum inscribed rectangular or circular area is then computed. This maximum inscribed rectangular or circular area is the second face area in the temperature thermodynamic diagram.
In one possible implementation in which the shape of the maximum inscribed region is determined to be circular, let the four pixel points of the quadrilateral region be A, B, C, D, and let the intersection point be E. The four sides of the quadrilateral region are then AB, BD, CD and AC. The distances from the intersection point E to AB, BD, CD and AC are calculated respectively. Suppose AB is the side closest to the intersection point, and take the distance from E to AB as the second distance. The circular pixel point region with the intersection point as the center and the second distance as the radius is determined as the maximum inscribed region.
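The inscribed-circle variant above can be sketched as follows, under the assumption of a convex quadrilateral; the rectangle coordinates and helper names are illustrative, not from the patent. The radius is the smallest distance from the diagonal intersection E to any of the four sides:

```python
import numpy as np

def point_segment_distance(p, a, b):
    """Distance from point p to the line segment ab."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    # Clamp the projection parameter so the foot stays on the segment.
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

quad = [(0, 0), (6, 0), (6, 4), (0, 4)]       # rectangle, for illustration
E = np.array([3.0, 2.0])                       # its diagonal intersection
sides = list(zip(quad, quad[1:] + quad[:1]))   # the four sides in order
radius = min(point_segment_distance(E, a, b) for a, b in sides)
print(radius)  # 2.0: the circle touches the two nearer (long) sides
```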
In another possible implementation in which the shape of the maximum inscribed region is determined to be circular, let the four pixel points of the quadrilateral region be A, B, C, D, and let the intersection point be E. The four sides of the quadrilateral region are then AB, BD, CD and AC. Find a point A* on AB such that A*E is perpendicular to AB; find a point B* on BD such that B*E is perpendicular to BD; find a point C* on CD such that C*E is perpendicular to CD; and find a point D* on AC such that D*E is perpendicular to AC. Take the distance between point D* and point B* as the third distance, and the distance between point A* and point C* as the fourth distance. Compare the third distance with the fourth distance; assuming the third distance is smaller, take the midpoint of the line segment D*B* as F, and take half of the third distance as the fifth distance. The pixel point region contained in the circle with the midpoint F as the center and the fifth distance as the radius is determined as the maximum inscribed region.
In yet another possible implementation in which the shape of the maximum inscribed region is determined to be circular, let the four pixel points of the quadrilateral region be A, B, C, D, and let the intersection point be E. The four sides of the quadrilateral region are then AB, BD, CD and AC. Find a point A* on AB such that A*E is perpendicular to AB; find a point B* on BD such that B*E is perpendicular to BD; find a point C* on CD such that C*E is perpendicular to CD; and find a point D* on AC such that D*E is perpendicular to AC. Take the distance between point D* and point B* as the third distance, and the distance between point A* and point C* as the fourth distance. Let the intersection point of A*C* and B*D* be G; compare the third distance with the fourth distance, and assuming the third distance is smaller, take half of the third distance as the fifth distance. The pixel point region contained in the circle with the point G as the center and the fifth distance as the radius is determined as the maximum inscribed region.
In one possible implementation in which the shape of the maximum inscribed region is determined to be rectangular, let the four pixel points of the quadrilateral region be A, B, C, D. The four sides of the quadrilateral region are then AB, BD, CD and AC. Suppose AB is the shortest of the four sides. Find a point H on CD such that HA is perpendicular to AB, and find a point I on CD such that IB is perpendicular to AB. Then find a point J on BI such that HJ is perpendicular to BI. Since the quadrilateral enclosed by the four points A, B, J and H has three right angles (at A, B and J), the quadrilateral ABJH is a rectangle. The pixel point region contained in the rectangle ABJH is the maximum inscribed region.
In another possible implementation in which the shape of the maximum inscribed region is determined to be rectangular, let the four pixel points of the quadrilateral region be A, B, C, D, and let the intersection point be E. Calculate the lengths of EA, EB, EC and ED, and suppose EA is the shortest. Find a point K on BE such that the length of EK equals the length of EA; find a point L on DE such that the length of EL equals the length of EA; and find a point M on CE such that the length of EM equals the length of EA. That is, EK = EA = EL = EM; since A, E and C lie on one diagonal, E is the midpoint of the line segment AM, and since B, E and D lie on the other diagonal, E is also the midpoint of the line segment KL. Because the diagonals of the quadrilateral formed by the four points A, K, M, L are equal and bisect each other at point E, the quadrilateral AKML is a rectangle. The pixel point region contained in the rectangle AKML is the maximum inscribed region.
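The equal-spoke construction above can be checked numerically; the quadrilateral coordinates and the helper name below are assumed for illustration. Points are placed on the four spokes from E at the shortest spoke length r; because opposite spokes are collinear through E, the resulting diagonals are equal (each 2r) and bisect each other at E:

```python
import numpy as np

def inscribed_rectangle(A, B, C, D, E):
    """Return vertices (toward A, B, C, D) at the shortest spoke length from E."""
    pts = [np.asarray(p, dtype=float) for p in (A, B, C, D)]
    E = np.asarray(E, dtype=float)
    r = min(np.linalg.norm(p - E) for p in pts)  # shortest spoke length
    return [E + r * (p - E) / np.linalg.norm(p - E) for p in pts]

# Convex quadrilateral A, B, C, D with diagonals AC and BD meeting at E.
A, B, C, D = (0, 0), (4, 0), (6, 4), (0, 4)
E = (2.4, 1.6)  # precomputed diagonal intersection for this quadrilateral
rect = inscribed_rectangle(A, B, C, D, E)
# Equal diagonals bisected at E => the constructed quadrilateral is a rectangle.
assert np.allclose((rect[0] + rect[2]) / 2, E)
assert np.allclose((rect[1] + rect[3]) / 2, E)
assert np.isclose(np.linalg.norm(rect[0] - rect[2]),
                  np.linalg.norm(rect[1] - rect[3]))
```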
In yet another possible implementation in which the shape of the maximum inscribed region is determined to be rectangular, let the four pixel points of the quadrilateral region be A, B, C, D. The four sides of the quadrilateral region are then AB, BD, CD and AC. Take the midpoint of AB as N, the midpoint of BD as O, the midpoint of CD as P, and the midpoint of AC as R. Because the quadrilateral enclosed by the midpoints of the sides of any quadrilateral is a parallelogram, the quadrilateral NOPR is a parallelogram, with PN and OR as its two diagonals intersecting at point S. Take the shorter of the diagonals PN and OR. For example, when the length of PN is smaller than the length of OR, PN is taken as a diagonal of the rectangular region, where the length of NS equals the length of SP. Find a point T on SO such that the length of TS equals the length of NS, and find a point U on SR such that the length of SU equals the length of NS. That is, NS = SP and TS = SU; S is the midpoint of the line segment PN and also the midpoint of the line segment UT. Because the diagonals of the quadrilateral formed by the four points N, T, P, U are equal and bisect each other at point S, the quadrilateral NTPU is a rectangle. The pixel point region contained in the rectangle NTPU is the maximum inscribed region.
As an alternative embodiment, in order to obtain an accurate temperature of the temperature measurement object, the second face area needs to be accurately located in the temperature thermodynamic diagram. Therefore, the temperature measuring device performs the following steps in the process of determining the maximum inscribed area of the quadrilateral area:
selecting a second largest abscissa from the abscissas of the four pixel points to obtain a first abscissa; selecting a third largest abscissa from the abscissas of the four pixel points to obtain a second abscissa; selecting a second largest ordinate from the ordinates of the four pixel points to obtain a first ordinate; selecting a third largest ordinate from the ordinates of the four pixel points to obtain a second ordinate; determining a second point according to the first abscissa and the first ordinate; determining a third point according to the first abscissa and the second ordinate; determining a fourth point according to the second abscissa and the first ordinate; determining a fifth point according to the second abscissa and the second ordinate; a region in which the second point, the third point, the fourth point, and the fifth point are connected in this order is defined as the maximum inscribed region;
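The coordinate-sorting steps above can be sketched as follows; the sample pixel coordinates and the function name are assumptions for illustration. The second- and third-largest abscissas and ordinates of the four mapped pixel points bound an axis-aligned rectangle:

```python
# Sketch of the second/third-largest coordinate rule described above.

def inner_rectangle(points):
    """Return the rectangle's corners in perimeter order."""
    xs = sorted((x for x, _ in points), reverse=True)
    ys = sorted((y for _, y in points), reverse=True)
    first_x, second_x = xs[1], xs[2]   # 2nd- and 3rd-largest abscissas
    first_y, second_y = ys[1], ys[2]   # 2nd- and 3rd-largest ordinates
    return [(first_x, first_y), (first_x, second_y),
            (second_x, second_y), (second_x, first_y)]

pixels = [(10, 5), (1, 8), (7, 12), (3, 2)]   # assumed mapped pixel points
print(inner_rectangle(pixels))  # [(7, 8), (7, 5), (3, 5), (3, 8)]
```

Note the corners are returned in perimeter order, so joining them in sequence traces the rectangle's boundary.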
in the embodiment of the present application, suppose the four pixel points are A, B, C and D, with coordinates A (X1, Y1), B (X2, Y2), C (X3, Y3) and D (X4, Y4). X1, X2, X3, X4 are sorted, and Y1, Y2, Y3, Y4 are sorted. Assuming X1 > X2 > X3 > X4, the first abscissa is X2 and the second abscissa is X3; assuming Y1 > Y2 > Y3 > Y4, the first ordinate is Y2 and the second ordinate is Y3. The coordinates of the second point are then (X2, Y2), those of the third point are (X2, Y3), those of the fourth point are (X3, Y2), and those of the fifth point are (X3, Y3). From the coordinate relationship of the second, third, fourth and fifth points, the area bounded by these four points can be determined to be a rectangular area, and this rectangular area is taken as the maximum inscribed area.

As an alternative embodiment, in order to further improve the accuracy of temperature measurement, the temperature measuring device takes the temperature of the forehead region of the first face region as the temperature of the temperature measurement object. The temperature measuring device performs the following steps in the process of performing step 104:
41. carrying out forehead detection on the first face area to obtain a forehead detection result of the first face area;
in this embodiment, forehead detection is performed on the first face region, and the detection result indicates either that the forehead of the person in the first face region is in an occluded state or that it is in an unoccluded state.
In a possible implementation manner, the temperature measuring device performs first feature extraction processing on the first face region to obtain first feature data, where the first feature data carries information about whether the forehead of a person in the first face region is in an occlusion state. The temperature measuring device obtains a detection result according to first characteristic data obtained by forehead detection.
Optionally, the first feature extraction process may be implemented by a forehead detection network. The forehead detection network can be obtained by training a deep convolutional neural network using at least one first training image with annotation information as training data, where the annotation information indicates whether the forehead of the person in the first training image is in an occluded state.
The manner in which forehead detection is performed is not limited in the present application.
42. When the forehead detection result of the first face area indicates that the forehead area in the first face area is not occluded, obtaining the temperature of the temperature measurement object according to the temperature of the pixel points contained in a third face area, where the third face area is the area corresponding to the forehead area within the second face area;
in this embodiment of the application, when the forehead detection result of the first face area indicates that the forehead area in the first face area is not occluded, the third face area is first located in the temperature thermodynamic diagram. The third face area is the pixel point area within the second face area that corresponds to the forehead area in the first face area. Generally, the forehead area lies in the upper 30%-40% of the whole face area; that is, the third face area is the upper 30%-40% of the second face area. The temperature of the temperature measurement object corresponding to the first face area is then obtained according to the temperature of the pixel points contained in the third face area in the temperature thermodynamic diagram. When the forehead detection result indicates that the forehead area in the first face area is occluded, a prompt asking the person to expose the forehead is output.
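Locating the third face area can be sketched as follows, assuming the second face area is available as an axis-aligned crop of the thermodynamic map with rows increasing downward; the 0.35 ratio is one assumed value inside the 30%-40% band quoted above:

```python
import numpy as np

def forehead_region(temperature_crop, ratio=0.35):
    """Take the top `ratio` of rows of the crop as the forehead region."""
    rows = temperature_crop.shape[0]
    return temperature_crop[: max(1, int(rows * ratio))]

crop = np.arange(100, dtype=float).reshape(10, 10)  # fake 10x10 crop
print(forehead_region(crop).shape)  # (3, 10): upper 35% of 10 rows
```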
In a possible implementation manner, the temperature measuring device is placed indoors, and the average temperature value of the pixel point region of the third face region is used as the temperature of the temperature measuring object. For example, the pixel points of the third face region include: pixel a and pixel b. Wherein, the temperature corresponding to the pixel point a is 36.9 degrees, and the temperature corresponding to the pixel point b is 36.3 degrees. The temperature measuring device can use the average value (36.6 degrees) of the temperature corresponding to the pixel point a and the temperature corresponding to the pixel point b as the temperature of the temperature measuring object.
In another possible implementation manner, the temperature measuring device uses the highest temperature value corresponding to the pixel point in the third face region as the temperature of the temperature measuring object. For example, the pixel points of the third face region include: the temperature of the pixel point a is 36.9 degrees, the temperature of the pixel point b is 36.3 degrees, and the temperature of the pixel point c is 37 degrees. The temperature measuring device can use the maximum value (37 degrees) of the temperature corresponding to the pixel point a, the temperature corresponding to the pixel point b and the temperature corresponding to the pixel point c as the temperature of the temperature measuring object.
In yet another possible implementation, the temperature measuring device is placed outdoors, where there may be direct solar radiation, and the third face region may therefore contain a pixel region corresponding to a directly irradiated area. If the temperature of the pixel points in the directly irradiated area were averaged together with the temperature of the actual face area as the temperature of the temperature measurement object, a large error would result. The temperature of a surface under direct solar radiation is around 50 degrees Celsius, while the temperature a human body can tolerate is generally below 45 degrees Celsius. Therefore, the temperature of each pixel point in the third face region is read, pixel points with a temperature higher than 45 degrees Celsius are excluded, and the average temperature of the remaining pixel points below 45 degrees Celsius is taken as the temperature of the temperature measurement object in the image to be processed.
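The indoor-average, maximum-value and outdoor-filtered rules above can be sketched in a few lines. The 45-degree cutoff is the tolerance figure quoted in the text; the function names are assumptions for illustration:

```python
import numpy as np

def indoor_mean(temps):
    """Indoor case: average over all forehead pixels."""
    return float(np.mean(temps))

def highest_value(temps):
    """Alternative: take the hottest forehead pixel."""
    return float(np.max(temps))

def outdoor_mean(temps, cutoff=45.0):
    """Outdoor case: drop pixels hotter than the cutoff (likely direct
    solar radiation) before averaging."""
    temps = np.asarray(temps, dtype=float)
    return float(np.mean(temps[temps < cutoff]))

print(indoor_mean([36.9, 36.3]))           # ~36.6, as in the example above
print(highest_value([36.9, 36.3, 37.0]))   # 37.0
print(outdoor_mean([36.9, 36.3, 50.2]))    # ~36.6: the 50.2 pixel is dropped
```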
It should be understood that, in this embodiment of the application, an average value of temperatures of a part of pixel points included in the third face region may also be used as a temperature of the temperature measurement object, or a maximum value of temperatures of a part of pixel points included in the third face region may also be used as a temperature of the temperature measurement object, which is not limited in this application.
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
The method of the embodiments of the present application is set forth above in detail and the apparatus of the embodiments of the present application is provided below.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a temperature measuring device according to an embodiment of the present application, where the temperature measuring device 1 includes: an acquisition unit 11, a detection unit 12, a first processing unit 13, a second processing unit 14, wherein:
the acquiring unit 11 is used for acquiring an image to be processed, a temperature thermodynamic diagram and a homography matrix between the temperature thermodynamic diagram and the image to be processed;
the detection unit 12 is configured to perform face detection processing on the image to be processed to obtain a first face region;
the first processing unit 13 is configured to determine a pixel point region corresponding to the first face region from the temperature thermodynamic diagram according to the homography matrix, and obtain a second face region;
and the second processing unit 14 is configured to obtain the temperature of the temperature measurement object corresponding to the first face region according to the temperature of the pixel points contained in the second face region.
In combination with any embodiment of the present application, the first processing unit 13 is configured to:
determining four pixel points corresponding to the four corner points from the temperature thermodynamic diagram according to the homography matrix and the four corner points of the face frame containing the first face area;
and sequentially connecting the four pixel points to obtain a quadrilateral area to obtain the second face area.
In combination with any embodiment of the present application, the detecting unit 12 is configured to:
carrying out face detection on the image to be processed to obtain at least one first face frame;
taking a pixel point region contained in a second face frame in the at least one first face frame as the first face region, where the second face frame is a face frame in which the resolution of the contained pixel point region exceeds the resolution threshold.
In combination with any embodiment of the present application, the number of the first face frames exceeds 1, and the at least one first face frame includes a third face frame and a fourth face frame;
the detection unit 12 is configured to:
and under the condition that the overlapping rate between the third face frame and the fourth face frame does not exceed an overlapping rate threshold value, taking a pixel point region contained in any one of the third face frame and the fourth face frame as the first face region.
In combination with any embodiment of the present application, the detection unit is configured to:
taking a pixel point region contained in the face frame with the largest size measure in the third face frame and the fourth face frame as the first face region; the size measurement of the third face frame is the maximum value of the side length of the third face frame; and the size measurement of the fourth face frame is the maximum value of the side length of the fourth face frame.
In combination with any embodiment of the present application, the second processing unit 14 is configured to:
carrying out forehead detection on the first face area to obtain a forehead detection result of the first face area;
obtaining the temperature of the temperature measurement object according to the temperature of pixel points included in a third face area under the condition that the forehead detection result of the first face area is that the forehead area in the first face area is not shielded; the third face area is an area corresponding to the forehead area in the second face area.
In combination with any embodiment of the present application, the first processing unit 13 is configured to:
acquiring an intersection point of diagonal lines of the quadrilateral area;
taking the distance between the first point and the intersection point as a first distance; the first point is the pixel point which is closest to the intersection point in the four pixel points;
constructing a first area by taking the intersection point as a circle center and the first distance as a radius;
determining the intersection of the first area and the quadrilateral area to obtain a second area;
and selecting the area containing the intersection point from the second area as the second face area.
In combination with any embodiment of the present application, the first processing unit 13 is configured to:
acquiring an intersection point of diagonal lines of the quadrilateral area;
determining a maximum inscribed area of the quadrilateral area; the maximum inscribed region is a rectangular region or a circular region containing the intersection point;
and taking the maximum inscribed area as the second face area.
In combination with any embodiment of the present application, the first processing unit 13 is configured to:
selecting a second largest abscissa from the abscissas of the four pixel points to obtain a first abscissa, selecting a third largest abscissa from the abscissas of the four pixel points to obtain a second abscissa, selecting a second largest ordinate from the ordinates of the four pixel points to obtain a first ordinate, and selecting a third largest ordinate from the ordinates of the four pixel points to obtain a second ordinate;
determining a second point according to the first abscissa and the first ordinate, determining a third point according to the first abscissa and the second ordinate, determining a fourth point according to the second abscissa and the first ordinate, and determining a fifth point according to the second abscissa and the second ordinate;
and taking the area obtained by sequentially connecting the second point, the third point, the fourth point and the fifth point as the maximum inscribed area.
In the embodiment of the application, by performing face detection processing on the image to be processed, the temperature measuring device can accurately determine the face area of the temperature measurement object from the image to be processed to obtain the first face area. The face region of the temperature measurement object is then determined from the temperature thermodynamic diagram according to the homography matrix between the image to be processed and the temperature thermodynamic diagram and the position of the first face region in the image to be processed, which improves the accuracy of the face region determined from the temperature thermodynamic diagram, yielding the second face region. The temperature measuring device then obtains the temperature of the temperature measurement object according to the temperature of the second face region, which improves the accuracy of the measured temperature of the temperature measurement object.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present application may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
Fig. 3 is a schematic diagram of a hardware structure of a temperature measuring device according to an embodiment of the present application. The thermometry device 2 includes a processor 21, a memory 22, an input device 23, and an output device 24. The processor 21, the memory 22, the input device 23 and the output device 24 are coupled by a connector, which includes various interfaces, transmission lines or buses, etc., and the embodiment of the present application is not limited thereto. It should be appreciated that in various embodiments of the present application, coupled refers to being interconnected in a particular manner, including being directly connected or indirectly connected through other devices, such as through various interfaces, transmission lines, buses, and the like.
The processor 21 may be one or more Graphics Processing Units (GPUs), and in the case that the processor 21 is one GPU, the GPU may be a single-core GPU or a multi-core GPU. Alternatively, the processor 21 may be a processor group composed of a plurality of GPUs, and the plurality of processors are coupled to each other through one or more buses. Alternatively, the processor may be other types of processors, and the like, and the embodiments of the present application are not limited.
The memory 22 may be used to store computer program instructions and various types of computer program code, including the program code for executing aspects of the present application. Optionally, the memory includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), and is used to hold related instructions and data.
The input means 23 are for inputting data and/or signals and the output means 24 are for outputting data and/or signals. The input device 23 and the output device 24 may be separate devices or may be an integral device.
It can be understood that, in the embodiment of the present application, the memory 22 may be used to store not only the relevant instructions but also relevant data. For example, the memory 22 may store the image to be processed and the temperature thermodynamic diagram acquired by the input device 23, or the temperature of the temperature measurement object obtained by the processor 21, and so on; the embodiment of the present application does not limit the data specifically stored in the memory.
It will be appreciated that fig. 3 only shows a simplified design of the temperature measuring device. In practical applications, the temperature measuring device may also include other necessary components, including but not limited to any number of input/output devices, processors, memories, and the like, and all temperature measuring devices that can implement the embodiments of the present application fall within the scope of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. It is also clear to those skilled in the art that the descriptions of the various embodiments of the present application have different emphasis, and for convenience and brevity of description, the same or similar parts may not be repeated in different embodiments, so that the parts that are not described or not described in detail in a certain embodiment may refer to the descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted through a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., Digital Versatile Disc (DVD)), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media that can store program codes, such as a read-only memory (ROM) or a Random Access Memory (RAM), a magnetic disk, or an optical disk.

Claims (13)

1. A method of measuring temperature, the method comprising:
acquiring an image to be processed, a temperature thermodynamic diagram and a homography matrix between the temperature thermodynamic diagram and the image to be processed;
carrying out face detection processing on the image to be processed to obtain a first face area;
determining a pixel point region corresponding to the first face region from the temperature thermodynamic diagram according to the homography matrix to obtain a second face region;
and obtaining the temperature of the temperature measurement object corresponding to the first face area according to the temperature of the pixel points contained in the second face area.
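The pipeline of claim 1 can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the list-of-lists thermal map, and the choice of the maximum temperature over the warped box's bounding rectangle as the reported statistic are all assumptions.

```python
def warp_point(H, x, y):
    """Map a pixel (x, y) from the image to be processed into the
    temperature thermodynamic diagram using a 3x3 homography H
    (row-major nested lists), with perspective division."""
    xw = H[0][0] * x + H[0][1] * y + H[0][2]
    yw = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xw / w, yw / w

def measure_temperature(thermal, H, face_box):
    """face_box: (x0, y0, x1, y1), the first face region in the visible image.
    thermal: 2D list of per-pixel temperatures (the thermodynamic diagram).
    Returns the maximum temperature inside the bounding rectangle of the
    warped box -- one plausible statistic for the object's temperature."""
    (x0, y0), (x1, y1) = face_box[:2], face_box[2:]
    corners = [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]
    warped = [warp_point(H, x, y) for x, y in corners]
    xs = [p[0] for p in warped]
    ys = [p[1] for p in warped]
    lo_x, hi_x = int(min(xs)), int(max(xs))
    lo_y, hi_y = int(min(ys)), int(max(ys))
    return max(thermal[r][c]
               for r in range(lo_y, hi_y + 1)
               for c in range(lo_x, hi_x + 1))
```

With an identity homography the warped region coincides with the detected box, which makes the mapping easy to check.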
2. The method of claim 1, wherein the determining a pixel region corresponding to the first face region from the temperature thermodynamic diagram according to the homography matrix to obtain a second face region comprises:
determining four pixel points corresponding to the four corner points from the temperature thermodynamic diagram according to the homography matrix and the four corner points of the face frame containing the first face area;
and sequentially connecting the four pixel points to obtain a quadrilateral region, and obtaining the second face region according to the quadrilateral region.
3. The method according to claim 1 or 2, wherein the performing the face detection processing on the image to be processed to obtain a first face region comprises:
carrying out face detection on the image to be processed to obtain at least one first face frame;
taking a pixel point region contained in a second face frame in the at least one first face frame as the first face region; the second face frame is a face frame in which the resolution of the contained pixel point region exceeds a resolution threshold.
4. The method of claim 3, wherein the number of the first face frames exceeds 1, and the at least one first face frame comprises a third face frame and a fourth face frame;
the taking a pixel point region contained in a second face frame in the at least one first face frame as the first face region includes:
and under the condition that the overlapping rate between the third face frame and the fourth face frame does not exceed an overlapping rate threshold value, taking a pixel point region contained in any one of the third face frame and the fourth face frame as the first face region.
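Claims 4 and 5 can be sketched together. The claims do not fix a particular overlap rate; intersection-over-union is one common choice, and the 0.3 threshold and function names below are assumptions for illustration.

```python
def overlap_rate(a, b):
    """Intersection-over-union of two axis-aligned boxes (x0, y0, x1, y1).
    The claim only requires *an* overlap rate; IoU is one common definition."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def pick_first_face(box3, box4, overlap_threshold=0.3):
    """Claims 4-5 combined: when the overlap rate between the third and
    fourth face frames does not exceed the threshold, take the frame whose
    longest side (the claimed size measure) is largest as the first face
    region. Returns None when the frames overlap too much, since that case
    falls outside these claims."""
    if overlap_rate(box3, box4) <= overlap_threshold:
        return max(box3, box4, key=lambda r: max(r[2] - r[0], r[3] - r[1]))
    return None
```

Two disjoint boxes have an overlap rate of zero, so the larger of the two is selected.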
5. The method according to claim 4, wherein the using, as the first face region, a pixel region included in any one of the third face frame and the fourth face frame includes:
taking a pixel point region contained in the face frame with the largest size measure among the third face frame and the fourth face frame as the first face region; the size measure of the third face frame is the maximum value of the side lengths of the third face frame; and the size measure of the fourth face frame is the maximum value of the side lengths of the fourth face frame.
6. The method according to any one of claims 1 to 5, wherein obtaining the temperature of the temperature measurement object corresponding to the first face region according to the temperature of the pixel point included in the second face region comprises:
carrying out forehead detection on the first face area to obtain a forehead detection result of the first face area;
obtaining the temperature of the temperature measurement object according to the temperature of pixel points included in a third face area under the condition that the forehead detection result of the first face area is that the forehead area in the first face area is not shielded; the third face area is an area corresponding to the forehead area in the second face area.
7. The method according to any one of claims 2 to 6, wherein the obtaining the second face region according to the quadrilateral region obtained by sequentially connecting the four pixel points comprises:
acquiring an intersection point of diagonal lines of the quadrilateral area;
taking the distance between the first point and the intersection point as a first distance; the first point is the pixel point which is closest to the intersection point in the four pixel points;
constructing a first area by taking the intersection point as a circle center and the first distance as a radius;
determining the intersection of the first area and the quadrilateral area to obtain a second area;
and selecting the area containing the intersection point from the second area as the second face area.
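The geometric construction of claim 7 — the intersection of the quadrilateral's diagonals, and a disc whose radius is the distance from that intersection to the nearest of the four warped pixel points — can be sketched as follows. The function names and tuple representation are assumptions; clipping the disc against the quadrilateral to obtain the final second face region is omitted.

```python
import math

def diag_intersection(quad):
    """Intersection of the diagonals of quadrilateral [p0, p1, p2, p3]
    (corners in order). Solves p0 + t*(p2 - p0) = p1 + s*(p3 - p1) for t."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = quad
    d1x, d1y = x2 - x0, y2 - y0   # diagonal p0 -> p2
    d2x, d2y = x3 - x1, y3 - y1   # diagonal p1 -> p3
    denom = d1x * d2y - d1y * d2x
    t = ((x1 - x0) * d2y - (y1 - y0) * d2x) / denom
    return (x0 + t * d1x, y0 + t * d1y)

def circle_region(quad):
    """Claim 7 sketch: center = diagonal intersection; radius = distance
    to the nearest corner (the claimed first distance). The second face
    region is, conceptually, the part of this disc inside the quadrilateral
    that contains the center."""
    cx, cy = diag_intersection(quad)
    radius = min(math.hypot(x - cx, y - cy) for x, y in quad)
    return (cx, cy), radius
```

For a unit-style square the diagonals cross at the center and every corner is equidistant, which checks both functions at once.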
8. The method according to any one of claims 2 to 6, wherein the obtaining the second face region according to the quadrilateral region obtained by sequentially connecting the four pixel points comprises:
acquiring an intersection point of diagonal lines of the quadrilateral area;
determining a maximum inscribed area of the quadrilateral area; the maximum inscribed region is a rectangular region or a circular region containing the intersection point;
and taking the maximum inscribed area as the second face area.
9. The method of claim 8, wherein determining the largest inscribed region of the quadrilateral region comprises:
selecting a second largest abscissa from the abscissas of the four pixel points to obtain a first abscissa, selecting a third largest abscissa from the abscissas of the four pixel points to obtain a second abscissa, selecting a second largest ordinate from the ordinates of the four pixel points to obtain a first ordinate, and selecting a third largest ordinate from the ordinates of the four pixel points to obtain a second ordinate;
determining a second point according to the first abscissa and the first ordinate, determining a third point according to the first abscissa and the second ordinate, determining a fourth point according to the second abscissa and the first ordinate, and determining a fifth point according to the second abscissa and the second ordinate;
and taking the area obtained by sequentially connecting the second point, the third point, the fourth point and the fifth point as the maximum inscribed area.
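The corner selection of claim 9 can be sketched as follows; the function name and the (xmin, ymin, xmax, ymax) rectangle representation are assumptions.

```python
def inscribed_rect(quad):
    """Claim 9 sketch: take the second- and third-largest abscissae and
    ordinates of the four warped pixel points; the axis-aligned rectangle
    they bound is the claimed maximum inscribed region."""
    xs = sorted((x for x, _ in quad), reverse=True)
    ys = sorted((y for _, y in quad), reverse=True)
    x1, x2 = xs[1], xs[2]   # first and second abscissae (2nd- and 3rd-largest)
    y1, y2 = ys[1], ys[2]   # first and second ordinates (2nd- and 3rd-largest)
    # The second, third, fourth, and fifth points of the claim are the
    # four combinations of these coordinates; connecting them in order
    # yields an axis-aligned rectangle.
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```

For a tilted quadrilateral such as (0, 1), (3, 0), (4, 3), (1, 4), the 2nd- and 3rd-largest coordinates in each axis are 3 and 1, giving the rectangle from (1, 1) to (3, 3).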
10. A temperature measuring device, said device comprising:
the acquisition unit is used for acquiring an image to be processed, a temperature thermodynamic diagram and a homography matrix between the temperature thermodynamic diagram and the image to be processed;
the detection unit is used for carrying out face detection processing on the image to be processed to obtain a first face area;
the first processing unit is used for determining a pixel point region corresponding to the first face region from the temperature thermodynamic diagram according to the homography matrix to obtain a second face region;
and the second processing unit is used for obtaining the temperature of the temperature measurement object corresponding to the first face region according to the temperature of the pixel points contained in the second face region.
11. A processor configured to perform the method of any one of claims 1 to 9.
12. An electronic device, comprising: a processor and a memory for storing computer program code, the computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any one of claims 1 to 9.
13. A computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions which, when executed by a processor, cause the processor to carry out the method of any one of claims 1 to 9.
CN202011148359.5A 2020-10-23 2020-10-23 Temperature measuring method and device, electronic equipment and storage medium Pending CN112287798A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202011148359.5A CN112287798A (en) 2020-10-23 2020-10-23 Temperature measuring method and device, electronic equipment and storage medium
PCT/CN2021/098352 WO2022083130A1 (en) 2020-10-23 2021-06-04 Temperature measurement method and apparatus, electronic device, and storage medium
TW110131735A TWI779801B (en) 2020-10-23 2021-08-26 Temperature measurement method, electronic equipment and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011148359.5A CN112287798A (en) 2020-10-23 2020-10-23 Temperature measuring method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112287798A true CN112287798A (en) 2021-01-29

Family

ID=74423805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011148359.5A Pending CN112287798A (en) 2020-10-23 2020-10-23 Temperature measuring method and device, electronic equipment and storage medium

Country Status (3)

Country Link
CN (1) CN112287798A (en)
TW (1) TWI779801B (en)
WO (1) WO2022083130A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113063500A (en) * 2021-03-30 2021-07-02 新疆爱华盈通信息技术有限公司 Face temperature measurement method, face temperature measurement instrument and storage medium
WO2022083130A1 (en) * 2020-10-23 2022-04-28 深圳市商汤科技有限公司 Temperature measurement method and apparatus, electronic device, and storage medium
WO2023093407A1 (en) * 2021-11-25 2023-06-01 上海商汤智能科技有限公司 Calibration method and apparatus, and electronic device and computer-readable storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107967677A (en) * 2017-12-15 2018-04-27 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and computer equipment
US20180307928A1 (en) * 2016-04-21 2018-10-25 Tencent Technology (Shenzhen) Company Limited Living face verification method and device
CN109745014A (en) * 2018-12-29 2019-05-14 江苏云天励飞技术有限公司 Thermometry and Related product
CN110248107A (en) * 2019-06-13 2019-09-17 Oppo广东移动通信有限公司 Image processing method and device
CN110276308A (en) * 2019-06-25 2019-09-24 上海商汤智能科技有限公司 Image processing method and device
CN110879972A (en) * 2019-10-24 2020-03-13 深圳云天励飞技术有限公司 Face detection method and device
CN111339951A (en) * 2020-02-26 2020-06-26 北京迈格威科技有限公司 Body temperature measuring method, device and system
CN111414831A (en) * 2020-03-13 2020-07-14 深圳市商汤科技有限公司 Monitoring method and system, electronic device and storage medium
CN111426388A (en) * 2020-03-31 2020-07-17 高新兴科技集团股份有限公司 Personnel body temperature measuring method, system, computer storage medium and electronic equipment
CN111623885A (en) * 2020-06-01 2020-09-04 上海闻泰电子科技有限公司 Infrared temperature measuring device and infrared temperature measuring method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101789078B (en) * 2010-03-01 2012-07-18 江西财经大学 Robust infrared face recognition technology
US9501681B1 (en) * 2015-07-14 2016-11-22 A9.Com, Inc. Decoding visual codes
CN109960974A (en) * 2017-12-22 2019-07-02 北京市商汤科技开发有限公司 Face critical point detection method, apparatus, electronic equipment and storage medium
CN109446981B (en) * 2018-10-25 2023-03-24 腾讯科技(深圳)有限公司 Face living body detection and identity authentication method and device
CN109856979B (en) * 2018-12-21 2022-03-25 深圳云天励飞技术有限公司 Environment adjusting method, system, terminal and medium
CN111507200A (en) * 2020-03-26 2020-08-07 北京迈格威科技有限公司 Body temperature detection method, body temperature detection device and dual-optical camera
CN111626125B (en) * 2020-04-26 2023-04-28 浙江大华技术股份有限公司 Face temperature detection method, system, device and computer equipment
CN112287798A (en) * 2020-10-23 2021-01-29 深圳市商汤科技有限公司 Temperature measuring method and device, electronic equipment and storage medium


Also Published As

Publication number Publication date
WO2022083130A1 (en) 2022-04-28
TW202217647A (en) 2022-05-01
TWI779801B (en) 2022-10-01

Similar Documents

Publication Publication Date Title
CN112287798A (en) Temperature measuring method and device, electronic equipment and storage medium
CN110544258B (en) Image segmentation method and device, electronic equipment and storage medium
US7554575B2 (en) Fast imaging system calibration
CN112348863B (en) Image alignment method, image alignment device and terminal equipment
CN111738225B (en) Crowd gathering detection method, device, equipment and storage medium
TWI787113B (en) Methods, apparatuses, processors, electronic equipment and storage media for image processing
CN111815668A (en) Target tracking method, electronic device and storage medium
JP7255173B2 (en) Human detection device and human detection method
JP7188067B2 (en) Human detection device and human detection method
CN112200002B (en) Body temperature measuring method, device, terminal equipment and storage medium
CN115031635A (en) Measuring method and device, electronic device and storage medium
CN112102391A (en) Measuring method and device, electronic device and storage medium
US11003877B2 (en) Methods and systems for recognizing and reading a coded identification tag from video imagery
CN112200842B (en) Image registration method, device, terminal equipment and storage medium
CN116363583A (en) Human body identification method, device, equipment and medium for top view angle
CN114136462A (en) Calibration method and device, electronic equipment and computer readable storage medium
CN111739086A (en) Method and device for measuring area, electronic equipment and storage medium
CN112488076A (en) Face image acquisition method, system and equipment
CN112150527A (en) Measuring method and device, electronic device and storage medium
CN111724442B (en) Image processing method and device, electronic device and storage medium
US9830528B2 (en) Rotation invariant object feature recognition
CN115393579B (en) Infrared small target detection method based on weighted block contrast
JP2019159787A (en) Person detection method and person detection program
CN116386016B (en) Foreign matter treatment method and device, electronic equipment and storage medium
CN202916878U (en) Infrared thermal image processing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40040128

Country of ref document: HK