CN112001886A - Temperature detection method, device, terminal and readable storage medium - Google Patents
- Publication number: CN112001886A (application CN202010692855.0A)
- Authority
- CN
- China
- Prior art keywords
- face
- image
- temperature
- video stream
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/0012 — Image analysis; inspection of images; biomedical image inspection
- G01J5/0025 — Radiation pyrometry, e.g. infrared or optical thermometry, for sensing the radiation of moving living bodies
- G06T7/11 — Region-based segmentation
- G06T7/277 — Analysis of motion involving stochastic approaches, e.g. using Kalman filters
- G06T7/33 — Image registration using feature-based methods
- G06V40/161 — Human faces: detection; localisation; normalisation
- G06V40/171 — Human faces: local features and components; facial parts
- G06T2207/10016 — Image acquisition modality: video; image sequence
- G06T2207/30196 — Subject of image: human being; person
- G06T2207/30201 — Subject of image: face
Abstract
The application belongs to the technical field of temperature detection and provides a temperature detection method, device, terminal and readable storage medium. The method comprises the following steps: recognizing a face image and extracting a first region of a target face part in the face image; acquiring a second region of the target face part, according to the first region, from a thermal image that has a position mapping relation with the face image; and acquiring the temperature value of the target face part according to the second region. This scheme avoids false detections and improves the accuracy and stability of the temperature measurement result.
Description
Technical Field
The present application belongs to the field of temperature detection technologies, and in particular, to a temperature detection method, apparatus, terminal, and readable storage medium.
Background
With the frequent outbreak of epidemics, non-contact body temperature detection has drawn increasing attention. The usual non-contact approach is to aim a forehead thermometer at the eyebrow-center position of the person being tested and take an infrared measurement.
In some special cases the safety distance between the tester and the person being tested must be further guaranteed, so existing applications use a mobile temperature measurement robot to perform body temperature detection.
Such a robot moves flexibly, is not restricted to one site, and can raise or lower its temperature measurement pan-tilt via a telescopic rod to better acquire temperatures; however, it also has some problems.
The robot is inaccurate in determining the best temperature detection position on the person being tested. In most cases the robot detects temperature by reading the value of the hottest point in view, so if the person carries a heat-emitting object such as a cup of hot water, a false temperature alarm results. In addition, because the pan-tilt's measurement position changes as the robot moves, the measurement is unstable and the readings jump considerably.
Disclosure of Invention
The embodiments of the present application provide a temperature detection method, device, terminal and readable storage medium, aiming to solve the prior-art problems that the temperature measurement robot determines the best temperature detection position of the person being tested inaccurately, which causes false alarms, unstable measurement and large fluctuations in accuracy.
A first aspect of an embodiment of the present application provides a temperature detection method, including:
recognizing a face image and extracting a first region of a target face part in the face image;
acquiring a second region of the target face part from a thermal image which has a position mapping relation with the face image according to the first region;
and acquiring the temperature value of the target face part according to the second region.
A second aspect of an embodiment of the present application provides a temperature detection apparatus, including:
the first coordinate acquisition module is used for identifying a face image and extracting a first area of a target face part in the face image;
the second coordinate acquisition module is used for acquiring a second area of the target face part from a thermal image which has a position mapping relation with the face image according to the first area;
and the temperature acquisition module is used for acquiring the temperature value of the target face part according to the second area.
A third aspect of embodiments of the present application provides a terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method according to the first aspect when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, which, when executed by a processor, performs the steps of the method according to the first aspect.
A fifth aspect of the present application provides a computer program product, which, when run on a terminal, causes the terminal to perform the steps of the method of the first aspect described above.
As can be seen from the above, in the embodiments of the present application, a first region of a target face part in a face image is identified and extracted, a second region of the target face part is obtained, according to the first region, from a thermal image having a position mapping relation with the face image, and the temperature value of the target face part is obtained according to the second region. By mapping the region position of the target face part in the face image into the thermal image via the position mapping relation, the position of the same target face part in the thermal image is obtained, and the temperature value corresponding to that position is read. The target face part is thus determined accurately and its temperature acquired accurately, avoiding false detections and improving the accuracy and stability of the temperature measurement result.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a first flowchart of a temperature detection method provided in an embodiment of the present application;
fig. 2 is a second flowchart of a temperature detection method according to an embodiment of the present disclosure;
FIG. 3 is a display interface diagram of face detection provided by an embodiment of the present application;
FIG. 4 is a diagram of a display interface for displaying a visible light video stream according to an embodiment of the present application;
fig. 5 is a structural diagram of a temperature detection device according to an embodiment of the present application;
fig. 6 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In particular implementations, the terminals described in embodiments of the present application include, but are not limited to, other portable devices such as mobile phones, laptop computers, or tablet computers having touch sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the device is not a portable communication device, but is a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
In the discussion that follows, a terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that the sequence numbers of the steps in this embodiment do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not limit the implementation of the embodiments of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, fig. 1 is a first flowchart of a temperature detection method provided in an embodiment of the present application. As shown in fig. 1, a temperature detection method includes the following steps:
Step 101, recognizing a face image and extracting a first region of a target face part in the face image.
Extracting the first region may be implemented by coordinate extraction: acquiring the region coordinates of the target face part in the face image.
The coordinates may be obtained by establishing a coordinate system based on the recognized face image and determining, in that coordinate system, the coordinate range of the target face part within the face image.
The face image is identified from an image or video acquired by a camera.
The target face part is a part of the face in the recognized face image.
The target face part is a measurement position at which the temperature of the object under measurement can be measured accurately. It may be a specific area or a specific point of the face: for example, the forehead, or more specifically the eyebrow-center point within the forehead. Correspondingly, the first region is the region where the forehead is located, or more specifically the region where the eyebrow-center point is located. That is, the first region may be a patch-like region or a point-like region, and its coordinates may be a point coordinate or a region coordinate range.
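The extraction described above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: it assumes the detector has already produced the two eye keypoints, and the upward offset factor used to step from the eye line to the eyebrow-center point is an invented value.

```python
# Hypothetical sketch: approximate the eyebrow-center point of a face
# from two detected eye keypoints. The offset_factor is an assumption,
# not a value given in the patent.

def eyebrow_center(left_eye, right_eye, offset_factor=0.5):
    """Estimate the eyebrow-center point from the two eye keypoints.

    left_eye / right_eye are (x, y) tuples in image coordinates
    (origin at top-left, y increasing downward).
    """
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    mid_y = (left_eye[1] + right_eye[1]) / 2.0
    inter_eye = abs(right_eye[0] - left_eye[0])
    # Shift upward (smaller y) by a fraction of the inter-eye distance.
    return (mid_x, mid_y - offset_factor * inter_eye)

print(eyebrow_center((100.0, 120.0), (140.0, 120.0)))  # -> (120.0, 100.0)
```

The returned point (or a small window around it) would serve as the first region in step 101.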
Step 102, acquiring a second region of the target face part, according to the first region, from the thermal image having a position mapping relation with the face image.
A thermal image is an image that records the heat of the object itself or the temperature it radiates outward; typically it is an infrared thermal image. Here, the thermal image is an image having a position mapping relation with the face image.
The face region of the object under temperature measurement contains the target face part in both images: the target face part has a relative position within the face region of the face image, and the same relative position within the face region of the thermal image. The two images differ only in acquisition angle or acquisition mode, which shifts the display position of the target face part between the face image and the thermal image; this correspondence constitutes the position mapping relation between the face image and the thermal image.
Specifically, in the actual implementation process, the same object to be measured may be subjected to image acquisition at the same time to obtain the face image and the thermal image, so that a position mapping relationship of the face based on the object to be measured is formed between the face image and the thermal image.
The second region refers to a region where the target face part is located in the thermal image, and the obtaining of the second region may be mapping coordinates of a first region in the face image to the thermal image, and obtaining coordinates of a second region corresponding to the first region.
And acquiring the second area, namely acquiring the area position of the same target human face part in the thermal image by specifically utilizing the position mapping relation of the human face between the human face image and the thermal image. Wherein, different human face parts in the thermal image correspond to temperature values. The method comprises the steps of mapping the region of a target face part in a face image to a thermal image with a position mapping relation, acquiring the region of the same target face part in the thermal image, acquiring a temperature value corresponding to the region position based on the region position of the target face part in the thermal image, and further determining the temperature of the target face part.
Similarly, the second region may be a patch-like or point-like region corresponding to the first region, and its coordinates may be a point coordinate or a region coordinate range.
In this process the target face part is determined accurately, so its temperature can also be acquired accurately.
Step 103, acquiring the temperature value of the target face part according to the second region.
Different location points in the thermal image have corresponding temperature values. Here, the temperature at the position of the second area in the thermal image is the temperature of the target human face portion.
Optionally, the temperature value of the target face part may be an average value of temperatures of a plurality of location points in the second area, or a median value of temperatures of different location points in the second area, and may be set according to specific needs, which is not limited in this embodiment.
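The two aggregation options just mentioned (average or median over the points of the second region) can be sketched with only the Python standard library; the function name and the per-point temperature list are hypothetical.

```python
import statistics

def region_temperature(temps, method="median"):
    """Aggregate the per-point temperature readings of the second region
    into a single value, by mean or median as described above."""
    if method == "mean":
        return statistics.fmean(temps)
    return statistics.median(temps)

print(region_temperature([36.2, 36.5, 36.4]))        # -> 36.4 (median)
print(region_temperature([1.0, 2.0, 3.0], "mean"))   # -> 2.0
```

Either choice can be set according to specific needs, exactly as the text notes.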
In the embodiment of the present application, a face image is recognized, a first region of a target face part in the face image is extracted, a second region of the target face part is obtained, according to the first region, from a thermal image having a position mapping relation with the face image, and the temperature value of the target face part is obtained according to the second region. By mapping the region position of the target face part in the face image into the thermal image, the position of the same target face part in the thermal image is obtained and the corresponding temperature value is read, so that the target face part is determined accurately, its temperature is acquired accurately, false detections are avoided, and the accuracy and stability of the temperature measurement result are improved.
The embodiment of the application also provides different implementation modes of the temperature detection method.
Referring to fig. 2, fig. 2 is a second flowchart of a temperature detection method provided in the embodiment of the present application. As shown in fig. 2, a temperature detection method includes the following steps:
In particular, the visible light video stream is a video stream as opposed to the thermal imaging video stream. It is acquired through the temperature measurement pan-tilt, for example by a camera mounted on the pan-tilt.
When detecting faces, the visible light video stream collected by the pan-tilt can be read directly and face detection performed on it.
The visible light video stream is specifically an RGB (red, green, blue) video stream; face detection on it is performed by a face detection module.
Specifically, when performing face detection, the face detection module in this embodiment outputs two kinds of parameters: the position information of the face in the image, and the key point information of the face. The key point information is used to decide whether a video image contains a face; if so, the position information of the face in the image is determined. As shown in fig. 3, the output may include the position coordinates of the bounding box of the face region (e.g., the upper-left point (x1, y1) and the lower-right point (x2, y2)) and the coordinates of 5 face key points: the left eye, right eye, nose, left mouth corner and right mouth corner.
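The two outputs described above (bounding box plus 5 key points) can be represented as a simple record. This is an illustrative data structure, not the module's actual interface; all field names are invented.

```python
from dataclasses import dataclass
from typing import Tuple

Point = Tuple[float, float]

@dataclass
class FaceDetection:
    """One detector result: bounding-box corners plus the 5 key points."""
    top_left: Point       # (x1, y1)
    bottom_right: Point   # (x2, y2)
    left_eye: Point
    right_eye: Point
    nose: Point
    left_mouth_corner: Point
    right_mouth_corner: Point

    def box_size(self) -> Point:
        """Width and height of the face bounding box."""
        return (self.bottom_right[0] - self.top_left[0],
                self.bottom_right[1] - self.top_left[1])

det = FaceDetection((80.0, 60.0), (180.0, 200.0),
                    (110.0, 110.0), (150.0, 110.0), (130.0, 140.0),
                    (115.0, 170.0), (145.0, 170.0))
print(det.box_size())   # -> (100.0, 140.0)
```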
Specifically, the face detection module is one applied on a mobile terminal (for example, a mobile robot) and uses the RetinaFace face detection algorithm. The original RetinaFace algorithm has a structure of 4 stages and uses ResNet50 as its backbone (neural network model). In this embodiment the algorithm is improved for the mobile terminal: the backbone is changed to ShuffleNetV2×0.5 and the number of stages is reduced to 3. This improvement reduces the computation of the whole network, raises the detection speed, and accelerates face recognition.
Specifically, when obtaining the moving track of the face in the visible light video stream, the face in key frame images of the stream may be acquired, and the face coordinates extracted from those key frames form the moving track. A key frame image may be every frame of the visible light video stream or a subset of frames selected from it.
The purpose of acquiring the moving track is to make the temperature detected for the recognized face as smooth as possible, so that it does not jump excessively, improving the accuracy of the final temperature value.
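The key-frame idea above can be sketched as follows, assuming face detection on each frame yields one face point (e.g. the eyebrow-center); the track is then the per-key-frame sequence of those points. The function name and sampling scheme are hypothetical.

```python
def face_track(points_per_frame, step=1):
    """Build a moving track from per-frame face points.

    points_per_frame: list of (x, y) face positions, one per video frame.
    step=1 treats every frame as a key frame; step>1 selects a subset.
    Returns (frame_index, point) pairs forming the moving track.
    """
    return [(i, p) for i, p in enumerate(points_per_frame) if i % step == 0]

pts = [(100, 50), (102, 50), (104, 51), (106, 51)]
print(face_track(pts, step=2))   # -> [(0, (100, 50)), (2, (104, 51))]
```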
In a specific implementation, an image tracking module may be used; the tracking algorithm adopted is the SORT tracking algorithm, which may specifically include:
Prediction: predict the position of the target in the current frame with a Kalman filter, executing the Kalman prediction equations to obtain the new bounding-box coordinates.
The prediction model assumes the moving object moves linearly, and a Kalman filter performs the prediction. The state of the face's moving track is assumed to be of the form [u, v, s, r, du, dv, ds], where u and v are the center coordinates of the face region, s is the proportion (area) of the face bounding box in the image frame, and r is the aspect ratio of the bounding box, which is assumed to remain constant. The dotted (superscript) terms du, dv and ds are the motion components corresponding to u, v and s respectively.
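The linear-motion assumption corresponds to a constant-velocity Kalman prediction over a seven-dimensional state [u, v, s, r, du, dv, ds], as in the SORT tracker. The sketch below hand-rolls only the predict step with NumPy; the process-noise covariance Q is an arbitrary placeholder, and a full SORT implementation would also include the measurement-update step against new detections.

```python
import numpy as np

# State x = [u, v, s, r, du, dv, ds]: the aspect ratio r carries no rate.
F = np.eye(7)
F[0, 4] = F[1, 5] = F[2, 6] = 1.0   # u += du, v += dv, s += ds per frame

def kalman_predict(x, P, Q=None):
    """One Kalman predict step: x' = F x, P' = F P F^T + Q."""
    if Q is None:
        Q = 1e-2 * np.eye(7)         # placeholder process noise
    return F @ x, F @ P @ F.T + Q

x = np.array([120.0, 100.0, 400.0, 0.8, 2.0, -1.0, 0.0])
x_pred, P_pred = kalman_predict(x, np.eye(7))
print(x_pred)   # center moved by (du, dv); s, r and the rates unchanged
```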
Based on the lightweight face detection and key point prediction algorithms, this process narrows the temperature measurement range down to the face region, making the measured temperature very accurate; combined with the face tracking technique, which detects the moving track of the face, the temperature of the same pedestrian can be displayed stably in the picture.
Wherein, the target face part is the eyebrow center part in the face.
Here, each moving position in the moving track specifically includes the display position of the face in each of the different frame images. Concretely, this can be realized by calculating the display coordinates of the eyebrow-center point of the face.
The thermal imaging video stream and the visible light video stream are obtained by shooting the same object to be measured.
The temperature measurement pan-tilt carries two video streams: one is the RGB visible light video stream and the other is the thermal imaging video stream. After finding the object to be measured, the pan-tilt starts measuring its temperature and simultaneously acquires the visible light video stream and the thermal imaging video stream corresponding to that object.
The thermal imaging video stream can be acquired by an infrared camera; specifically, an infrared camera mounted on the pan-tilt. At the same time, the position mapping relation between the face image and the thermal image is established.
The thermal imaging video stream comprises a number of thermal imaging video frame images. Because the thermal imaging video stream and the visible light video stream are captured from the same object under measurement, once the moving track of the face is determined, the thermal imaging video frames corresponding to the face images at the different moving positions, i.e. the thermal images, can be obtained based on the track.
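Since both streams are captured simultaneously, each face image can be paired with the thermal frame closest to it in time. The nearest-timestamp lookup below is an illustrative assumption about how that pairing might be done; the patent itself only states that the frames correspond.

```python
import bisect

def nearest_thermal_frame(ts, thermal_timestamps):
    """Index of the thermal frame whose timestamp is closest to ts.

    thermal_timestamps must be sorted in ascending order.
    """
    i = bisect.bisect_left(thermal_timestamps, ts)
    if i == 0:
        return 0
    if i == len(thermal_timestamps):
        return len(thermal_timestamps) - 1
    # Choose the neighbour with the smaller time difference.
    before, after = thermal_timestamps[i - 1], thermal_timestamps[i]
    return i if after - ts < ts - before else i - 1

print(nearest_thermal_frame(0.34, [0.0, 0.2, 0.4, 0.6]))   # -> 2
```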
Step 206, acquiring, according to the first region, the second region in the thermal image corresponding to the first region in the face image at each moving position.
After determining the thermal images corresponding to the face images of each moving position, it is necessary to determine the regions of the target face portion in the thermal images from the thermal images, so as to determine the corresponding temperature values based on the regions of the target face portion in the thermal images.
This process can be realized by an image matching module, which maps the position of the target face part in the face image of the visible light video stream into the thermal image, and then reads the temperature value corresponding to the target face part from the thermal imaging video stream.
Specifically, the image matching may be computed by using a homography transformation matrix.
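A minimal sketch of how such a mapping could be applied, assuming a 3x3 homography matrix H has already been obtained from an offline calibration between the two cameras (the matrix values in the example below are made up for illustration):

```python
# Illustrative sketch: map region corner points from visible-light image
# coordinates to thermal image coordinates via a 3x3 homography H.
# H itself would come from a prior calibration; it is assumed given here.
def map_region(H, corners):
    """H: 3x3 homography as a nested list; corners: list of (x, y) points."""
    mapped = []
    for x, y in corners:
        u = H[0][0] * x + H[0][1] * y + H[0][2]
        v = H[1][0] * x + H[1][1] * y + H[1][2]
        w = H[2][0] * x + H[2][1] * y + H[2][2]
        mapped.append((u / w, v / w))  # back from homogeneous coordinates
    return mapped
```

In practice H would typically be estimated once from matched calibration points between the two cameras, for example with OpenCV's findHomography.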
The moving trajectory of the face contains a plurality of moving positions, and the face image at each moving position has a second region of the target face part in its thermal image, so there are multiple second regions. The temperature values corresponding to these second regions are collected into a temperature data set; each value in the set corresponds to one moving position in the trajectory and is the temperature value of the target face part at that position in the visible light video stream.
In this process, the face is located in the visible light video stream, the region where the eyebrow center point of the face lies is extracted, and the corresponding region position in the temperature map is then found through an image matching algorithm, so that the exact temperature of the eyebrow center point is extracted and false detection is avoided.
In this process, the motion trajectory of the face entering the robot's temperature measurement range is obtained through face tracking technology, the temperature data set of the face is obtained from this trajectory, and a stable temperature value is obtained through a smoothing method.
And step 208, determining a temperature value of the target human face part based on the temperature data set.
Specifically, the temperature value of the target face part is the final temperature value, and may be the average or the median of the temperature values in the temperature data set.
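A minimal sketch of this reduction step, assuming the median is chosen as the default (the average mentioned above works the same way); the median is often preferred because it is robust to occasional outlier readings along the trajectory:

```python
# Sketch: reduce the temperature data set collected along the face's
# moving trajectory to one stable final value. The function name and
# the "method" switch are illustrative assumptions.
from statistics import mean, median

def final_temperature(temps, method="median"):
    """temps: temperature values of the target face part at each moving position."""
    if not temps:
        raise ValueError("empty temperature data set")
    return median(temps) if method == "median" else mean(temps)
```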
Further, as an optional implementation manner, before recognizing a face image and extracting a first region of a target face part in the face image, the method further includes:
detecting a distance between the temperature measurement device and an object to be measured;
and under the condition that the distance is within the credible distance range, controlling a visible light image acquisition device to capture video of the object to be measured to obtain the visible light video stream, and controlling a heat acquisition device to capture video of the object to be measured to obtain the thermal imaging video stream.
The distance between the temperature measurement device and the object to be measured can be detected by an infrared ranging device or an ultrasonic ranging device.
The credible distance range can be set according to the effective acquisition distances of the visible light image acquisition device and the heat acquisition device, which improves the reliability of the visible light video stream and the thermal imaging video stream, and thereby the accuracy of subsequent temperature detection.
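The gating described above can be sketched as follows; the numeric range limits and the capture callables are illustrative placeholders, not values specified by the patent:

```python
# Sketch of the credible-distance gate: video capture on both channels
# is only triggered when the measured distance lies inside the effective
# acquisition range of both devices. Range limits are assumed values.
def gated_capture(distance_m, start_visible, start_thermal,
                  min_dist=0.3, max_dist=2.0):
    """Start both video captures only within the credible distance range."""
    if min_dist <= distance_m <= max_dist:
        return start_visible(), start_thermal()
    return None  # out of range: do not acquire
```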
The visible light image acquisition device can be a common camera, such as a front-mounted or rear-mounted photographing camera in a mobile phone or a computer; the heat collection device may be an infrared camera.
Optionally, as shown in fig. 4, as a specific implementation, after acquiring the temperature value of the target face part according to the second region, the method further includes:
displaying the visible light video stream, in which the annotation information of the face image and the temperature value of the target face part associated with that annotation information are displayed.
The annotation information is, for example, a face bounding box on the face image, or a prompt text. Regarding the associated display of temperature values and annotation information: a plurality of face images may be recognized, each corresponding to a temperature value of its target face part (e.g., temperature: 29 in fig. 4); in that case, the annotation information of each face image must be displayed in association with the temperature value of the target face part of the same face image. The associated display may place the two close together, or show the temperature value directly inside the annotation information, and so on.
Furthermore, the temperature value can be compared against a set temperature threshold. When the temperature value exceeds the threshold, the annotation information or the displayed temperature value is rendered in a set alert color, such as red or orange; when it does not, it is rendered in a different color, such as green, thereby realizing early warning and prompt display of the temperature value. Moreover, when a plurality of face images are recognized, the temperature value of each face image can be compared with the threshold separately and displayed in the corresponding color, and face images whose temperature exceeds the threshold can be recorded, which improves the effectiveness of temperature detection and the convenience of information management.
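A sketch of the color selection just described; the 37.3 threshold and the BGR color convention (as used by OpenCV drawing functions) are assumptions for illustration:

```python
# Sketch: pick the annotation color from a temperature threshold.
# Colors are BGR tuples (OpenCV convention); the default threshold
# value is an illustrative assumption, not taken from the patent.
ALERT = (0, 0, 255)    # red for over-threshold temperatures
NORMAL = (0, 255, 0)   # green otherwise

def annotation_color(temp_c, threshold_c=37.3):
    """Return the display color for a face's temperature label."""
    return ALERT if temp_c > threshold_c else NORMAL
```

The returned tuple could then be passed as the color argument of the rectangle and text drawing calls that render the face box and temperature label.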
In the embodiment of the application, a face image is recognized, a first region of a target face part is extracted from it, a second region of the target face part is obtained, according to the first region, from a thermal image that has a position mapping relation with the face image, and the temperature value of the target face part is obtained according to the second region. In this process, the region position of the target face part in the face image is mapped into the thermal image via the position mapping relation, so as to locate the same face part in the thermal image, and the temperature value at that location is then read. The target face part is thus determined accurately and its temperature acquired precisely, false detection is avoided, and the accuracy and stability of the temperature measurement result are improved.
Referring to fig. 5, fig. 5 is a structural diagram of a temperature detection device according to an embodiment of the present application, and for convenience of description, only a part related to the embodiment of the present application is shown.
The temperature detection device 500 includes:
a first coordinate obtaining module 501, configured to identify a face image and extract a first region of a target face portion in the face image;
a second coordinate obtaining module 502, configured to obtain, according to the first region, a second region of the target face portion from a thermal image having a position mapping relationship with the face image;
a temperature obtaining module 503, configured to obtain a temperature value of the target face part according to the second region.
The first coordinate obtaining module 501 is specifically configured to:
carrying out face detection on the collected visible light video stream;
under the condition that a human face is detected, acquiring a moving track of the human face in the visible light video stream;
and acquiring a first area of a target face part in the face image for the face image at each moving position in the moving track.
The second coordinate obtaining module 502 is specifically configured to:
reading a thermal imaging video stream, wherein the thermal imaging video stream and the visible light video stream are obtained by shooting the same object to be measured in temperature;
determining a thermal image corresponding to the face image of each moving position in the moving track from the thermal imaging video stream;
and acquiring, according to the first region, a second region corresponding to the first region in the thermal image for the face image at each moving position.
The temperature obtaining module 503 is specifically configured to:
determining a temperature value corresponding to each second region acquired from the thermal image to obtain a temperature data set;
and determining a temperature value of the target human face part based on the temperature data set.
Wherein, the temperature detection device further includes:
a distance detection module, configured to detect the distance between the temperature detection device and the object to be measured;
and a video stream acquisition module, configured to, when the distance is determined to be within the credible distance range, control the visible light image acquisition device to capture video of the object to be measured to obtain the visible light video stream, and control the heat acquisition device to capture video of the object to be measured to obtain the thermal imaging video stream.
Wherein, the temperature detection device further includes:
the display module is used for displaying the visible light video stream;
and displaying the marking information of the face image and the temperature value of the target face part in association with the marking information in the visible light video stream.
The target face part is an eyebrow center part in the face.
The temperature detection device provided by the embodiment of the application can realize each process of the embodiment of the temperature detection method, can achieve the same technical effect, and is not repeated here for avoiding repetition.
Fig. 6 is a block diagram of a terminal according to an embodiment of the present application. As shown in the figure, the terminal 6 of this embodiment includes: at least one processor 60 (only one shown in fig. 6), a memory 61, and a computer program 62 stored in the memory 61 and executable on the at least one processor 60, the steps of any of the various method embodiments described above being implemented when the computer program 62 is executed by the processor 60.
The terminal 6 may include, but is not limited to, a processor 60 and a memory 61. It will be appreciated by those skilled in the art that fig. 6 is only an example of the terminal 6 and does not constitute a limitation of it; the terminal may comprise more or fewer components than those shown, combine some components, or use different components. For example, the terminal may further comprise input/output devices, network access devices, buses, etc.
The Processor 60 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the terminal 6, such as a hard disk or a memory of the terminal 6. The memory 61 may also be an external storage device of the terminal 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) and the like provided on the terminal 6. Further, the memory 61 may also include both an internal storage unit and an external storage device of the terminal 6. The memory 61 is used for storing the computer program and other programs and data required by the terminal. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal and method may be implemented in other ways. For example, the above-described apparatus/terminal embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The present application realizes all or part of the processes in the method of the above embodiments, and may also be implemented by a computer program product, when the computer program product runs on a terminal, the steps in the above method embodiments may be implemented when the terminal executes the computer program product.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (10)
1. A method of detecting temperature, comprising:
recognizing a face image and extracting a first region of a target face part in the face image;
acquiring a second region of the target face part from a thermal image which has a position mapping relation with the face image according to the first region;
and acquiring the temperature value of the target face part according to the second region.
2. The method of claim 1, wherein the recognizing the face image and extracting the first region of the target face part in the face image comprises:
carrying out face detection on the collected visible light video stream;
under the condition that a human face is detected, acquiring a moving track of the human face in the visible light video stream;
and acquiring a first area of a target face part in the face image for the face image at each moving position in the moving track.
3. The method according to claim 2, wherein the obtaining a second region of the target face portion from the thermal image having a position mapping relationship with the face image according to the first region comprises:
reading a thermal imaging video stream, wherein the thermal imaging video stream and the visible light video stream are obtained by shooting the same object to be measured in temperature;
determining a thermal image corresponding to the face image of each moving position in the moving track from the thermal imaging video stream;
and acquiring, according to the first region, a second region corresponding to the first region in the thermal image for the face image at each moving position.
4. The method according to claim 3, wherein the obtaining the temperature value of the target face portion according to the second region comprises:
determining a temperature value corresponding to each second region acquired from the thermal image to obtain a temperature data set;
and determining a temperature value of the target human face part based on the temperature data set.
5. The method of claim 3, wherein before the recognizing a face image and extracting a first region of a target face part in the face image, the method further comprises:
detecting a distance between the temperature measurement device and an object to be measured;
and under the condition that the distance is within the range of the credible distance, controlling a visible light image acquisition device to perform video acquisition on the object to be subjected to temperature measurement to obtain the visible light video stream, and controlling a heat acquisition device to perform video acquisition on the object to be subjected to temperature measurement to obtain the thermal imaging video stream.
6. The method according to claim 2, wherein after acquiring the temperature value of the target face portion according to the second region, the method further comprises:
displaying the visible light video stream;
and displaying the marking information of the face image and the temperature value of the target face part in association with the marking information in the visible light video stream.
7. The method according to any one of claims 1 to 6, wherein the target face part is an eyebrow center part in a human face.
8. A temperature detection device, comprising:
the first coordinate acquisition module is used for identifying a face image and extracting a first area of a target face part in the face image;
the second coordinate acquisition module is used for acquiring a second area of the target face part from a thermal image which has a position mapping relation with the face image according to the first area;
and the temperature acquisition module is used for acquiring the temperature value of the target face part according to the second region.
9. A terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010692855.0A CN112001886A (en) | 2020-07-17 | 2020-07-17 | Temperature detection method, device, terminal and readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010692855.0A CN112001886A (en) | 2020-07-17 | 2020-07-17 | Temperature detection method, device, terminal and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112001886A true CN112001886A (en) | 2020-11-27 |
Family
ID=73467561
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010692855.0A Pending CN112001886A (en) | 2020-07-17 | 2020-07-17 | Temperature detection method, device, terminal and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112001886A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112906600A (en) * | 2021-03-04 | 2021-06-04 | 联想(北京)有限公司 | Object information monitoring method and device and electronic equipment |
CN113063500A (en) * | 2021-03-30 | 2021-07-02 | 新疆爱华盈通信息技术有限公司 | Face temperature measurement method, face temperature measurement instrument and storage medium |
CN113503972A (en) * | 2021-09-08 | 2021-10-15 | 四川大学 | Local dynamic target temperature measurement system based on low-pixel infrared camera |
CN113642404A (en) * | 2021-07-13 | 2021-11-12 | 季华实验室 | Target identification detection association method, device, medium, and computer program product |
CN114543997A (en) * | 2022-02-11 | 2022-05-27 | 深圳市商汤科技有限公司 | Temperature measurement method and related product |
WO2022121243A1 (en) * | 2020-12-07 | 2022-06-16 | 北京市商汤科技开发有限公司 | Calibration method and apparatus, and electronic device, storage medium, and program product |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101972140A (en) * | 2010-09-07 | 2011-02-16 | 航天海鹰安全技术工程有限公司 | Thermal imaging temperature monitoring device, system and method |
CN101999888A (en) * | 2010-12-01 | 2011-04-06 | 北京航空航天大学 | Epidemic preventing and controlling system for detecting and searching people with abnormal temperatures |
CN105852819A (en) * | 2016-03-23 | 2016-08-17 | 深圳云天励飞技术有限公司 | Body temperature monitoring method and device |
CN108010008A (en) * | 2017-12-01 | 2018-05-08 | 北京迈格威科技有限公司 | Method for tracing, device and the electronic equipment of target |
CN108764071A (en) * | 2018-05-11 | 2018-11-06 | 四川大学 | It is a kind of based on infrared and visible images real human face detection method and device |
CN109846463A (en) * | 2019-03-04 | 2019-06-07 | 武汉迅检科技有限公司 | Infrared face temp measuring method, system, equipment and storage medium |
CN110060272A (en) * | 2018-01-18 | 2019-07-26 | 杭州海康威视数字技术股份有限公司 | Determination method, apparatus, electronic equipment and the storage medium of human face region |
CN111209812A (en) * | 2019-12-27 | 2020-05-29 | 深圳市优必选科技股份有限公司 | Target face picture extraction method and device and terminal equipment |
CN111337142A (en) * | 2020-04-07 | 2020-06-26 | 北京迈格威科技有限公司 | Body temperature correction method and device and electronic equipment |
CN111366252A (en) * | 2020-05-20 | 2020-07-03 | 浙江双视红外科技股份有限公司 | Method for stabilizing screening result of infrared human body surface temperature rapid screening instrument |
CN111366244A (en) * | 2020-03-02 | 2020-07-03 | 北京迈格威科技有限公司 | Temperature measuring method and device, electronic equipment and computer readable storage medium |
-
2020
- 2020-07-17 CN CN202010692855.0A patent/CN112001886A/en active Pending
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101972140A (en) * | 2010-09-07 | 2011-02-16 | 航天海鹰安全技术工程有限公司 | Thermal imaging temperature monitoring device, system and method |
CN101999888A (en) * | 2010-12-01 | 2011-04-06 | 北京航空航天大学 | Epidemic preventing and controlling system for detecting and searching people with abnormal temperatures |
CN105852819A (en) * | 2016-03-23 | 2016-08-17 | 深圳云天励飞技术有限公司 | Body temperature monitoring method and device |
CN108010008A (en) * | 2017-12-01 | 2018-05-08 | 北京迈格威科技有限公司 | Method for tracing, device and the electronic equipment of target |
CN110060272A (en) * | 2018-01-18 | 2019-07-26 | 杭州海康威视数字技术股份有限公司 | Determination method, apparatus, electronic equipment and the storage medium of human face region |
CN108764071A (en) * | 2018-05-11 | 2018-11-06 | 四川大学 | It is a kind of based on infrared and visible images real human face detection method and device |
CN109846463A (en) * | 2019-03-04 | 2019-06-07 | 武汉迅检科技有限公司 | Infrared face temp measuring method, system, equipment and storage medium |
CN111209812A (en) * | 2019-12-27 | 2020-05-29 | 深圳市优必选科技股份有限公司 | Target face picture extraction method and device and terminal equipment |
CN111366244A (en) * | 2020-03-02 | 2020-07-03 | 北京迈格威科技有限公司 | Temperature measuring method and device, electronic equipment and computer readable storage medium |
CN111337142A (en) * | 2020-04-07 | 2020-06-26 | 北京迈格威科技有限公司 | Body temperature correction method and device and electronic equipment |
CN111366252A (en) * | 2020-05-20 | 2020-07-03 | 浙江双视红外科技股份有限公司 | Method for stabilizing screening result of infrared human body surface temperature rapid screening instrument |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022121243A1 (en) * | 2020-12-07 | 2022-06-16 | 北京市商汤科技开发有限公司 | Calibration method and apparatus, and electronic device, storage medium, and program product |
CN112906600A (en) * | 2021-03-04 | 2021-06-04 | 联想(北京)有限公司 | Object information monitoring method and device and electronic equipment |
CN113063500A (en) * | 2021-03-30 | 2021-07-02 | 新疆爱华盈通信息技术有限公司 | Face temperature measurement method, face temperature measurement instrument and storage medium |
CN113642404A (en) * | 2021-07-13 | 2021-11-12 | 季华实验室 | Target identification detection association method, device, medium, and computer program product |
CN113503972A (en) * | 2021-09-08 | 2021-10-15 | 四川大学 | Local dynamic target temperature measurement system based on low-pixel infrared camera |
CN113503972B (en) * | 2021-09-08 | 2022-01-18 | 四川大学 | Local dynamic target temperature measurement system based on low-pixel infrared camera |
CN114543997A (en) * | 2022-02-11 | 2022-05-27 | 深圳市商汤科技有限公司 | Temperature measurement method and related product |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112001886A (en) | Temperature detection method, device, terminal and readable storage medium | |
KR102173123B1 (en) | Method and apparatus for recognizing object of image in electronic device | |
CN110909611B (en) | Method and device for detecting attention area, readable storage medium and terminal equipment | |
TW393629B (en) | Hand gesture recognition system and method | |
US20120027252A1 (en) | Hand gesture detection | |
US20120027263A1 (en) | Hand gesture detection | |
CN108596955B (en) | Image detection method, image detection device and mobile terminal | |
JP2015106281A (en) | Operation determination method, operation determination device, and operation determination program | |
CN111461092B (en) | Method, device and equipment for brushing face, measuring temperature and checking body | |
CN112633084A (en) | Face frame determination method and device, terminal equipment and storage medium | |
CN110264523A (en) | A kind of method and apparatus of the location information of target image in determining test image | |
WO2019095117A1 (en) | Facial image detection method and terminal device | |
CN111297337A (en) | Detection object judgment method, system, machine readable medium and equipment | |
CN112200002B (en) | Body temperature measuring method, device, terminal equipment and storage medium | |
KR20200127928A (en) | Method and apparatus for recognizing object of image in electronic device | |
CN109444905B (en) | Dynamic object detection method and device based on laser and terminal equipment | |
JPWO2017179543A1 (en) | Information processing apparatus, information processing method, and program recording medium | |
CN108763491B (en) | Picture processing method and device and terminal equipment | |
CN110288621A (en) | Lip line complementing method, device, electronic equipment and storage medium based on B-spline | |
JP2012027617A (en) | Pattern identification device, pattern identification method and program | |
CN113269730B (en) | Image processing method, image processing device, computer equipment and storage medium | |
Yousefi et al. | 3D hand gesture analysis through a real-time gesture search engine | |
CN109005357B (en) | Photographing method, photographing device and terminal equipment | |
CN111797656B (en) | Face key point detection method and device, storage medium and electronic equipment | |
WO2019134606A1 (en) | Terminal control method, device, storage medium, and electronic apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |