CN112525355A - Image processing method, device and equipment


Info

Publication number
CN112525355A
Authority
CN
China
Prior art keywords
infrared image
image
human face
target
face
Prior art date
Legal status
Pending
Application number
CN202011494299.2A
Other languages
Chinese (zh)
Inventor
王吉汉
梁云
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202011494299.2A
Publication of CN112525355A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0022 Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
    • G01J5/0025 Living bodies
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40 Spoof detection, e.g. liveness detection
    • G06V40/45 Detection of the body part being alive
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077 Imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Image Processing (AREA)
  • Radiation Pyrometers (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Embodiments of the invention provide an image processing method, apparatus and device. The image processing method includes: acquiring an infrared image; performing face liveness detection on the infrared image with a preset face liveness detection model to obtain a first detection result indicating whether the infrared image is an image of a living human face, where the model detects the temperature distribution in the infrared image and determines that the infrared image shows a living human face when that distribution matches the temperature distribution of a living human face; and, when it is determined from the first detection result that a living human face is present in the infrared image, outputting temperature information of a target area based on the infrared image, the target area being a region within the face area of the infrared image that reflects body temperature. With this scheme, people's temperature can be measured accurately and effectively on the basis of image analysis.

Description

Image processing method, device and equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, and an image processing device.
Background
At present, for reasons of public safety, people's body temperature has to be monitored or screened in many everyday settings, for example before customers enter a supermarket or when staff enter an office area.
In the related art, temperature is measured with handheld contact thermometers. Handheld contact measurement is manual, which severely limits throughput and increases the risk of cross-infection in groups.
Image acquisition devices are now widely deployed, so measuring people's temperature accurately and effectively based on image analysis has become an urgent problem to solve.
Disclosure of Invention
Embodiments of the invention aim to provide an image processing method, apparatus and device that measure a person's temperature accurately and effectively based on image analysis. The specific technical solutions are as follows:
In a first aspect, an embodiment of the present invention provides an image processing method, including:
acquiring an infrared image;
performing face liveness detection on the infrared image based on a preset face liveness detection model to obtain a first detection result, the first detection result indicating whether the infrared image is an image of a living human face; the face liveness detection model detects the temperature distribution in the infrared image and determines that the infrared image is an image of a living human face when that temperature distribution matches the temperature distribution of a living human face;
and, when it is determined from the first detection result that a living human face is present in the infrared image, outputting temperature information of a target area based on the infrared image, the target area being a region within the face area of the infrared image that reflects body temperature.
Optionally, before outputting the temperature information of the target area based on the infrared image, the method further includes:
acquiring a target visible light image, where the target visible light image and the infrared image cover the same acquisition area and have matching acquisition times;
performing face detection on the target visible light image to obtain a second detection result, the second detection result indicating whether a human face is present in the target visible light image;
and the outputting of temperature information of the target area based on the infrared image when a living human face is determined to be present in the infrared image includes:
outputting temperature information of the target area based on the infrared image when the first detection result indicates that a living human face is present in the infrared image and the second detection result indicates that a human face is present in the target visible light image.
Optionally, the outputting of temperature information of the target area based on the infrared image includes:
determining, based on the face area in the infrared image, the image position of the target area in the infrared image as a target position;
and determining the temperature information of the target area based on the target position, and outputting it.
Optionally, before determining the temperature information of the target area based on the target position, the method further includes:
correcting the target position using the position information of face key points in a visible light image corresponding to the infrared image, to obtain a corrected target position;
where the visible light image corresponding to the infrared image covers the same acquisition area as the infrared image and has a matching acquisition time.
Optionally, the determining of the temperature information of the target area based on the target position includes:
determining, in a thermometric matrix, the matrix region indicated by the target position, the thermometric matrix being a matrix of temperature values for the image content of the infrared image;
and selecting the highest temperature value in that matrix region as the temperature information of the target area.
Optionally, before correcting the target position using the position information of face key points in the visible light image corresponding to the infrared image, the method further includes:
identifying specified face angles of the face in the visible light image corresponding to the infrared image;
and, if the specified face angles satisfy a predetermined frontal face condition, performing the step of correcting the target position using the position information of the face key points in the visible light image corresponding to the infrared image.
Optionally, before correcting the target position using the position information of face key points in the visible light image corresponding to the infrared image, the method further includes:
determining the intersection of the face area of the visible light image corresponding to the infrared image and the face area of the infrared image;
calculating the proportion of that intersection within the face area of the infrared image;
and, if the proportion is greater than a preset proportion threshold, performing the step of correcting the target position using the position information of the face key points in the visible light image corresponding to the infrared image.
In a second aspect, an embodiment of the present invention provides an image processing apparatus, including:
a first acquisition module, configured to acquire an infrared image;
a first detection module, configured to perform face liveness detection on the infrared image based on a preset face liveness detection model to obtain a first detection result, the first detection result indicating whether the infrared image is an image of a living human face; the face liveness detection model detects the temperature distribution in the infrared image and determines that the infrared image is an image of a living human face when that temperature distribution matches the temperature distribution of a living human face;
and an output module, configured to output temperature information of a target area based on the infrared image when it is determined from the first detection result that a living human face is present in the infrared image, the target area being a region within the face area of the infrared image that reflects body temperature.
Optionally, the apparatus further includes:
a second acquisition module, configured to acquire a target visible light image before the output module outputs the temperature information of the target area, where the target visible light image and the infrared image cover the same acquisition area and have matching acquisition times;
and a second detection module, configured to perform face detection on the target visible light image to obtain a second detection result, the second detection result indicating whether a human face is present in the target visible light image;
the output module being specifically configured to:
output temperature information of the target area based on the infrared image when the first detection result indicates that a living human face is present in the infrared image and the second detection result indicates that a human face is present in the target visible light image.
Optionally, the output module outputs the temperature information of the target area based on the infrared image by:
determining, based on the face area in the infrared image, the image position of the target area in the infrared image as a target position;
and determining the temperature information of the target area based on the target position, and outputting it.
Optionally, the output module is further configured to, before determining the temperature information of the target area based on the target position, correct the target position using the position information of face key points in a visible light image corresponding to the infrared image, to obtain a corrected target position;
where the visible light image corresponding to the infrared image covers the same acquisition area as the infrared image and has a matching acquisition time.
Optionally, the output module determines the temperature information of the target area based on the target position by:
determining, in a thermometric matrix, the matrix region indicated by the target position, the thermometric matrix being a matrix of temperature values for the image content of the infrared image;
and selecting the highest temperature value in that matrix region as the temperature information of the target area.
Optionally, the output module is further configured to identify specified face angles of the face in the visible light image corresponding to the infrared image before correcting the target position using the position information of the face key points, and to perform the correction step if the specified face angles satisfy a predetermined frontal face condition.
Optionally, the output module is further configured to, before correcting the target position using the position information of the face key points, determine the intersection of the face area of the visible light image corresponding to the infrared image and the face area of the infrared image, calculate the proportion of that intersection within the face area of the infrared image, and perform the correction step if the proportion is greater than a preset proportion threshold.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory and a communication bus, where the processor, the communication interface and the memory communicate with one another over the communication bus;
a memory for storing a computer program;
a processor, configured to implement the method steps provided by the first aspect when executing the program stored in the memory.
In a fourth aspect, the present invention provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method steps provided in the first aspect.
In the image processing method provided by the embodiments of the invention, after an infrared image is acquired, face liveness detection is performed on it with a preset face liveness detection model to obtain a first detection result indicating whether the infrared image is an image of a living human face; then, when it is determined from the first detection result that a living human face is present in the infrared image, temperature information of a target area is output based on the infrared image, the target area being a region within the face area of the infrared image that reflects body temperature. Because face liveness detection is performed on the infrared image with a preset model, the scheme can accurately identify whether a person whose temperature should be measured is present; and when the detection result shows that a living human face is present, outputting the temperature information of the target area based on the infrared image measures that person's temperature. The scheme therefore achieves accurate and effective temperature measurement of people based on image analysis.
Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other embodiments can be obtained by using the drawings without creative efforts.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
FIG. 2 is another flowchart of an image processing method according to an embodiment of the present invention;
Figs. 3(a), 3(b) and 3(c) are exemplary grayscale schematic diagrams of sample infrared images;
FIG. 3(d) is an exemplary schematic diagram of human thermometry using the image processing method of an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of another structure of an image processing apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
To facilitate understanding of the solution, terms referred to in the solution provided by the embodiment of the present invention will be described first.
Liveness detection: a method of confirming, in an identity-verification scenario, that the presented object has the physiological characteristics of a real, live person. It can effectively resist common attacks such as photographs and screen replays, helping users detect fraud and protecting their interests.
Thermal imaging: every object in nature whose temperature is above absolute zero (about -273 °C) emits infrared radiation; the radiated energy is proportional to the fourth power of the temperature, and the peak radiation wavelength is inversely proportional to the temperature. Infrared imaging converts the radiation energy detected from an object into an infrared image of that object, also called a thermal image; in an infrared image, different colors represent different temperatures. The temperature distribution of an object can therefore be read from its infrared image and used to judge its state.
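The two relationships mentioned here are the Stefan-Boltzmann law and Wien's displacement law; the constants below are quoted for reference and are not given in the patent:

    M = \sigma T^{4}, \qquad \sigma \approx 5.67 \times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}
    \lambda_{\mathrm{max}} = b / T, \qquad b \approx 2.898 \times 10^{-3}\ \mathrm{m \cdot K}

where T is the absolute temperature in kelvin and M is the radiated power per unit area.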
Infrared temperature measurement: a thermal sensor converts the thermal radiation of a living body into an infrared image, from which temperature can be read.
Deep Learning (DL): a research direction within Machine Learning (ML) that moves machine learning closer to its original goal, Artificial Intelligence (AI).
Caffe: short for Convolutional Architecture for Fast Feature Embedding, a deep learning framework noted for its expressiveness, speed and modularity.
In order to achieve the purpose of accurately and effectively measuring the temperature of a person based on image analysis, the embodiment of the invention provides an image processing method and device, electronic equipment and a storage medium.
The following first describes an image processing method provided by an embodiment of the present invention.
The execution subject of the image processing method provided by the embodiment of the invention can be an image processing device. The image processing apparatus may be applied to an electronic device, which may be, for example, an image capture device capable of capturing an infrared image, or a device communicating with an image capture device capable of capturing an infrared image, and the embodiment of the present invention does not limit the specific form of the device.
Moreover, the image processing method can be applied to any scene with human temperature measurement, such as: temperature measurement of personnel in office areas, temperature measurement of personnel at store entrances, and the like.
The image processing method provided by the embodiment of the invention can comprise the following steps:
acquiring an infrared image;
performing face liveness detection on the infrared image based on a preset face liveness detection model to obtain a first detection result, the first detection result indicating whether the infrared image is an image of a living human face;
the face liveness detection model detects the temperature distribution in the infrared image and determines that the infrared image is an image of a living human face when that temperature distribution matches the temperature distribution of a living human face;
and, when it is determined from the first detection result that a living human face is present in the infrared image, outputting temperature information of a target area based on the infrared image, the target area being a region within the face area of the infrared image that reflects body temperature.
In this scheme, after the infrared image is acquired, face liveness detection is performed on it with the preset face liveness detection model, yielding a first detection result indicating whether the infrared image is an image of a living human face; then, when a living human face is determined to be present, temperature information of the target area is output based on the infrared image. Because liveness detection is performed with a preset model, whether a person whose temperature should be measured is present can be identified accurately, and outputting the temperature information of the target area based on the infrared image then measures that person's temperature. The scheme therefore achieves accurate and effective temperature measurement of people based on image analysis.
The following describes an image processing method provided by an embodiment of the present invention with reference to the accompanying drawings.
As shown in fig. 1, an image processing method provided in an embodiment of the present invention may include the following steps:
S101, acquiring an infrared image;
An image acquisition device capable of capturing infrared images may capture an infrared image and send it to the image processing apparatus, which thereby acquires the infrared image. The acquisition device may capture infrared images continuously and send all or some of them to the image processing apparatus. Since every infrared image is processed in the same way, the method is described below for a single infrared image.
S102, performing face liveness detection on the infrared image based on a preset face liveness detection model to obtain a first detection result;
The first detection result indicates whether the infrared image is an image of a living human face; the face liveness detection model detects the temperature distribution in the infrared image and determines that the infrared image is an image of a living human face when that temperature distribution matches the temperature distribution of a living human face. The first detection result may also carry the position information of the face region when a living face is present.
To measure temperature accurately and effectively, the method does not output temperature information directly from the infrared image. Instead, the preset face liveness detection model first identifies whether a living human face is present in the infrared image, which accurately establishes whether there is a person to be measured; temperature information is output from the infrared image only when such a person is present. Specifically, after acquiring an infrared image, the image processing apparatus feeds it to the preset face liveness detection model, which performs face liveness detection and returns the first detection result; when the first detection result indicates that the image contains a living human face, temperature information is then determined from the infrared image.
The temperature distribution of an infrared image that contains a living human face differs fundamentally from that of an infrared image that does not: if an infrared image shows a living face, its temperature distribution matches the temperature distribution of a living human face. That distribution can therefore be learned by deep learning from a large number of sample infrared images, training a face liveness detection model that recognizes whether a living human face is present. Once any infrared image is fed to the model, the model examines its temperature distribution and, when that distribution matches the temperature distribution of a living human face, decides that the input image contains a living human face.
The embodiments of the present invention do not limit the network type or structure of the face liveness detection model. For example, it may be a Caffe (Convolutional Architecture for Fast Feature Embedding) model or a YOLO (You Only Look Once) model; YOLO is an object detection and localization algorithm based on a deep neural network whose main advantage is speed, making it suitable for real-time systems.
Illustratively, training the face liveness detection model may include:
acquiring a number of sample infrared images and a liveness calibration result (ground-truth label) for each sample;
feeding each sample infrared image to the neural network model being trained to obtain a prediction result, the prediction result indicating whether that sample is an image of a living human face;
judging, from the difference between the prediction results and the liveness calibration results, whether the neural network model has converged; if it has, ending training and taking the model as the face liveness detection model; if it has not, adjusting the network parameters and returning to the step of feeding each sample to the model to continue training.
After a sample infrared image is fed to the neural network model, the model applies several convolutional layers to it to produce the prediction result. The sample set may contain positive and negative samples: positive samples are images containing a living human face, negative samples are images that do not. For example, faces in positive samples may wear glasses, hats or masks and have long or short hair; negative samples may show non-living faces such as a phone screen, printed paper or a 3D sculpture. For illustration, fig. 3(a) shows a grayscale rendering of a positive sample, fig. 3(b) a negative sample obtained by photographing printed paper, and fig. 3(c) a negative sample obtained by photographing a phone screen.
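As a rough illustration of this training loop, the following is a minimal sketch in PyTorch with randomly generated stand-in data; the network size, the 128 x 128 single-channel input, the optimizer and the number of epochs are all assumptions, since the patent only names Caffe and YOLO as possible model families:

    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    # Stand-in data: 64 single-channel 128x128 "infrared" images, label 1 = living face, 0 = not
    images = torch.rand(64, 1, 128, 128)
    labels = torch.randint(0, 2, (64,)).float()
    loader = DataLoader(TensorDataset(images, labels), batch_size=16, shuffle=True)

    # Small CNN standing in for the face liveness detection model
    model = nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 32 * 32, 1),
    )
    criterion = nn.BCEWithLogitsLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    for epoch in range(5):                    # in practice, loop until the loss converges
        for x, y in loader:
            optimizer.zero_grad()
            logits = model(x).squeeze(1)      # prediction result for each sample
            loss = criterion(logits, y)       # difference from the liveness calibration result
            loss.backward()
            optimizer.step()                  # adjust network parameters and continue training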
S103, when it is determined from the first detection result that a living human face is present in the infrared image, outputting temperature information of a target area based on the infrared image, the target area being a region within the face area of the infrared image that reflects body temperature.
If the first detection result shows that a living human face is present, there is a person to be measured and temperature measurement can proceed based on the infrared image. The target area may, for example, be the forehead area or the eyebrow area. In one implementation, determining from the first detection result that a living human face is present simply means that the first detection result indicates the presence of a living human face in the infrared image.
There are several ways to output the temperature information of the target area based on the infrared image; for clarity of presentation, they are described below in conjunction with the other embodiments.
In the scheme provided by this embodiment, after the infrared image is acquired, face liveness detection is performed on it with the preset face liveness detection model to obtain the first detection result indicating whether the infrared image is an image of a living human face; then, when a living human face is determined to be present, temperature information of the target area is output based on the infrared image, the target area being a region within the face area of the infrared image that reflects body temperature. Because liveness detection is performed with a preset model, whether a person whose temperature should be measured is present can be identified accurately, and outputting the temperature information of the target area then measures that person's temperature, so the scheme achieves accurate and effective temperature measurement of people based on image analysis.
Optionally, to further ensure that the temperature is measured on an actual human face and thereby improve the validity and accuracy of the measurement, another embodiment of the present invention performs temperature measurement based on the infrared image in combination with a visible light image.
For the embodiments that measure temperature in combination with a visible light image, the image processing method may, for example, run on a dual-system imaging device; it is not, however, limited to such a device, as long as a visible light image matching the infrared image can be acquired.
A dual-system imaging device includes a visible light imaging subsystem that captures visible light images and a thermal imaging subsystem that captures infrared images; it may also include a data processing subsystem that processes the images produced by the two imaging subsystems. In a specific application the visible light imaging subsystem and the thermal imaging subsystem may also be called the visible light imaging master system and the thermal imaging slave system, respectively.
The visible light imaging subsystem may include a visible light lens with a matching image sensor and image processor, and the thermal imaging subsystem may include a thermal imaging lens with a matching image sensor and image processor; the visible light lens transmits visible light, and the thermal imaging lens transmits infrared radiation. Before the dual-system imaging device operates, the visible light lens and the thermal imaging lens can be spatially calibrated so that, when both capture the same object, the object's position in the two lens regions is consistent; the two subsystems can also be time-calibrated so that a visible light image and an infrared image captured at the same time correspond to each other, i.e. they are images of the same object at the same moment. The spatial and time calibration may be done manually or otherwise; the embodiments of the present invention do not limit the specific calibration procedure.
When the image processing method is applied to a dual-system imaging device, the image processing apparatus may reasonably run in the visible light imaging subsystem, in the thermal imaging subsystem, in the data processing subsystem, or in another electronic device communicating with the dual-system imaging device.
The image processing method of this embodiment can also be applied to face recognition scenarios such as face-based access control and face-based attendance. In that case the dual-system imaging device may be the imaging device of a face access-control or face attendance system, which performs face recognition and measures temperature at the same time. The method is, of course, not limited to such scenarios; it can also be applied, for example, to temperature screening of specific personnel.
Specifically, on the basis of the embodiment shown in fig. 1, as shown in fig. 2, before S103, the image processing method provided by this embodiment may further include the following steps:
S104, acquiring a target visible light image, where the target visible light image and the infrared image cover the same acquisition area and have matching acquisition times;
S105, performing face detection on the target visible light image to obtain a second detection result, the second detection result indicating whether a human face is present in the target visible light image;
Accordingly, S103 in the above embodiment may include:
S103A, outputting temperature information of the target area based on the infrared image when the first detection result indicates that a living human face is present in the infrared image and the second detection result indicates that a human face is present in the target visible light image.
It should be emphasized that the order of the steps shown in fig. 2 is only an example and should not be taken as limiting; for instance, S101 may run at the same time as S104, and S102 at the same time as S105.
The target visible light image may be acquired while or after the infrared image is acquired, and face detection is then performed on it to obtain the second detection result. The second detection result indicates whether a human face is present in the target visible light image and may also carry the position information of the face region when a face is present.
Any method of detecting faces in a visible light image can be used here; the embodiments of the present invention do not limit it. For example, the target visible light image may be fed to a preset face detection model trained in advance on visible light images with and without faces, or processed with a predetermined face detection algorithm, to obtain the second detection result.
The statement that the target visible light image and the infrared image have matching acquisition times can mean either that the two images have the same acquisition time (which may be represented by a timestamp) or that the difference between their acquisition times lies within a preset time range, as sketched below.
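A minimal sketch of this acquisition-time matching rule (the Frame structure, its field names and the 100 ms tolerance are assumptions made for illustration):

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Frame:
        timestamp_ms: int   # acquisition time of the frame, in milliseconds
        data: object        # image payload (placeholder)

    def match_visible_frame(infrared: Frame, visible_frames: List[Frame],
                            max_diff_ms: int = 100) -> Optional[Frame]:
        """Return the visible light frame whose acquisition time is closest to the
        infrared frame, provided the difference lies within the preset time range."""
        if not visible_frames:
            return None
        best = min(visible_frames, key=lambda f: abs(f.timestamp_ms - infrared.timestamp_ms))
        if abs(best.timestamp_ms - infrared.timestamp_ms) <= max_diff_ms:
            return best
        return None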
In this embodiment, the step of outputting temperature information of the target area based on the infrared image is executed only when the first detection result shows that a living human face is present in the infrared image and the second detection result shows that a human face is present in the target visible light image. Compared with measuring temperature as soon as the first detection result indicates a living face, this not only identifies accurately whether a person to be measured is present but also further ensures that the temperature is measured on a human face, which improves the validity and accuracy of the measurement. Specific ways of outputting the temperature information of the target area based on the infrared image are described in the embodiments below.
In this embodiment, performing face liveness detection on the infrared image with the preset face liveness detection model makes it possible to identify accurately whether a person to be measured is present; and when the detection results show that a living human face is present in the infrared image and a human face is present in the visible light image, outputting temperature information of the target area based on the infrared image measures the person's temperature while ensuring that the measurement is taken on a human face. The scheme therefore achieves accurate and effective temperature measurement of people based on image analysis.
In addition, because the temperature information of the target area in the infrared image is used directly as the measurement result, rather than being obtained by mapping regions between the infrared image and the visible light image, the accuracy of the measurement can be guaranteed even when the spatial calibration is not strict.
Optionally, in another embodiment of the present invention, when it is determined from the first detection result that a living human face is present in the infrared image and the second detection result indicates that a human face is present in the target visible light image, the image processing method may further include:
detecting whether the face in the target visible light image hits a face image in a predetermined face library;
and, if it does, performing the processing action associated with the hit.
In face-recognition scenarios such as access control or attendance, other face-based functions can be carried out while the temperature is measured, for example opening a door or recording attendance. Therefore, when the first detection result shows a living human face in the infrared image and the second detection result shows a human face in the target visible light image, it can be detected whether the face in the target visible light image hits a face image in a predetermined face library, and if it does, the processing action associated with the hit is performed. For a face access-control scenario, the associated action is opening the door; for a face attendance scenario, it is recording attendance information for the person corresponding to the hit face image.
There are several ways to detect whether the face in the target visible light image hits a face image in the predetermined face library. For example, one implementation may include:
cropping the face from the target visible light image to obtain a sub-image containing the face;
computing the face similarity between the sub-image and the face images in the predetermined face library;
and, if there is a face image whose similarity exceeds a preset threshold, concluding that the face in the target visible light image hits a face image in the library, with the most similar face image taken as the hit.
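A minimal sketch of this matching step; it assumes face embeddings have already been extracted by some face recognition model, and the embedding extractor, library layout and 0.8 threshold are illustrative assumptions:

    import numpy as np
    from typing import Dict, Optional

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def match_face(query_embedding: np.ndarray,
                   face_library: Dict[str, np.ndarray],
                   threshold: float = 0.8) -> Optional[str]:
        """face_library maps an identity to the embedding of its enrolled face image.
        Returns the identity of the most similar enrolled face whose similarity exceeds
        the preset threshold, or None if there is no hit."""
        best_id, best_sim = None, -1.0
        for identity, embedding in face_library.items():
            sim = cosine_similarity(query_embedding, embedding)
            if sim > best_sim:
                best_id, best_sim = identity, sim
        return best_id if best_sim > threshold else None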
Moreover, to make it easy to look up a person's temperature later, if there is a hit, the identity information associated with the hit face in the predetermined face library can be determined, and the temperature measurement result of the infrared image corresponding to the target visible light image can be stored together with that identity information. In other words, the scheme of this embodiment can attribute a temperature measurement to a specific person; see the working diagram in fig. 3(d): after the target visible light image and the infrared image are analyzed to obtain the second and first detection results, the temperature information of the person in the target visible light image can be determined from those two results.
With the scheme of this embodiment, people's temperature can be measured accurately and effectively based on image analysis, and in addition the processing action associated with a face-library hit can be carried out, making the functionality more complete.
Optionally, in an embodiment of the present invention, outputting the temperature information of the target area based on the infrared image may include the following steps A and B:
Step A, determining, based on the face area in the infrared image, the image position of the target area in the infrared image as the target position.
When the first detection result carries the position information of the face region for a living face, that position information is available as soon as the first detection result is obtained. When the first detection result only indicates whether the infrared image shows a living face, the position of the face region in the infrared image can be determined by any face-region detection method.
Once the face region in the infrared image, i.e. its position information, has been determined, the image position of the target area in the infrared image can be determined from it as the target position.
Alternatively, in one implementation, the image position of the target region in the infrared image may be determined as the target position based on the position information of the face region in the infrared image according to a calculation formula regarding the image position of the target region. For example, if the target area is a forehead area, the calculation formula of the image position of the target area may be as follows:
Assume the position information of the face region in the infrared image is: lower-left corner (X0, Y0), width W0, height H0. If the target area is the forehead region, its image position is computed as:
X1 = X0 + 0.35 * W0;  Y1 = Y0 + 0.13 * H0
W1 = (1 - (0.35 + 0.35)) * W0;  H1 = 0.21 * H0
where (X1, Y1) is the lower-left corner of the forehead region, W1 its width and H1 its height.
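The same calculation written as a small helper; the coefficients are the example values above and the coordinate convention follows the text (a corner point plus width and height):

    def forehead_region(x0: float, y0: float, w0: float, h0: float):
        """Map a face box (corner (x0, y0), width w0, height h0) to the forehead box
        using the example coefficients from the text."""
        x1 = x0 + 0.35 * w0
        y1 = y0 + 0.13 * h0
        w1 = (1 - (0.35 + 0.35)) * w0   # i.e. 0.30 * w0
        h1 = 0.21 * h0
        return x1, y1, w1, h1

    # Example: a face box at (100, 200) with width 80 and height 100
    # gives the forehead box (128.0, 213.0, 24.0, 21.0).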
It should be emphasized that this embodiment does not limit how the image position of the target area is determined from the face region in the infrared image; any implementation that can do so may be used.
Step B, determining the temperature information of the target area based on the target position, and outputting it.
For example, in one implementation, after the target location is determined, the temperature information of the target area may be determined based on the color value at the target location in the infrared image and the mapping relationship between the color value and the temperature value.
As another example, determining the temperature information of the target area based on the target position may include:
determining, in a thermometric matrix, the matrix region indicated by the target position, the thermometric matrix being a matrix of temperature values for the image content of the infrared image;
and selecting the highest temperature value in that matrix region as the temperature information of the target area.
Any infrared image may have a corresponding thermometric matrix whose elements are temperature values. The infrared image and the thermometric matrix are two representations of the temperature sensed by the thermal sensor: the infrared image expresses it as colors, the thermometric matrix as temperature values. In other words, the thermometric matrix is a temperature matrix that represents the temperature values of the image content in the infrared image.
Because the thermometric matrix represents the temperature values of the image content, once the target position is determined the matrix region it indicates can be located in the thermometric matrix, and the highest temperature value in that region can then be selected as the temperature information of the target area. The size of the matrix region indicated by the target position is related to the size of the image region it indicates and may, for example, be 3 x 3 or 5 x 5.
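A minimal sketch of this lookup; it assumes the thermometric matrix is available as a NumPy array aligned pixel-for-pixel with the infrared image and uses a square window around the target position:

    import numpy as np

    def region_temperature(thermal_matrix: np.ndarray, row: int, col: int,
                           size: int = 3) -> float:
        """Locate the size x size matrix region centred on (row, col) and return
        its highest temperature value as the temperature of the target area."""
        half = size // 2
        r0, r1 = max(row - half, 0), min(row + half + 1, thermal_matrix.shape[0])
        c0, c1 = max(col - half, 0), min(col + half + 1, thermal_matrix.shape[1])
        return float(thermal_matrix[r0:r1, c0:c1].max())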
The above-mentioned specific implementation manner regarding outputting the temperature information of the target area based on the infrared image is only an example, and should not be construed as limiting the embodiments of the present invention.
Optionally, in an embodiment of the present invention, the target position may be corrected based on the visible light image to further improve measurement accuracy. Based on this idea, before determining the temperature information of the target area based on the target position, the method may further include:
Step C, correcting the target position using the position information of face key points in the visible light image corresponding to the infrared image, to obtain a corrected target position.
The visible light image corresponding to the infrared image covers the same acquisition area as the infrared image and has a matching acquisition time; it may or may not be the target visible light image described above.
Accordingly, determining the temperature information of the target area based on the target position may include:
determining the temperature information of the target area based on the corrected target position.
Correcting the target position using the position information of face key points in the visible light image corresponding to the infrared image may include:
determining, from the positions of the face key points in that visible light image, the image position of the face region in that visible light image;
and correcting the target position based on the determined image position to obtain the corrected target position.
Once the face key points in the visible light image corresponding to the infrared image are available, the image position of the face region in that image can be computed by any existing method.
The target position can be corrected from the determined image position in various ways. For example, when image positions are expressed as a corner point plus width and height, the corner coordinates of the determined image position and of the target position can be averaged (or weighted-averaged) to give corrected corner coordinates, and the corrected corner coordinates together with the width and height of the original target position are taken as the corrected target position, as in the sketch below. As another example, when image positions are expressed as a corner point plus width, the corner coordinates of the determined image position together with the width of the target position can be taken as the corrected target position.
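A minimal sketch of the first correction variant; boxes are expressed as a corner point plus width and height, and the 0.5 weight (a plain average) is an assumption:

    def correct_target_position(target_box, visible_face_box, weight: float = 0.5):
        """target_box and visible_face_box are (x, y, w, h) tuples whose (x, y) is a
        corner point. The corrected corner is a weighted average of the two corners;
        the width and height of the original target position are kept."""
        tx, ty, tw, th = target_box
        vx, vy, _, _ = visible_face_box
        cx = weight * tx + (1 - weight) * vx
        cy = weight * ty + (1 - weight) * vy
        return cx, cy, tw, th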
Compared with the previous embodiments, this embodiment not only measures people's temperature accurately and effectively based on image analysis but also corrects the target position with the help of the visible light image, so the measurement result can be made more accurate.
Optionally, in an embodiment of the present invention, before the target position is corrected using the position information of the face key points in the visible light image corresponding to the infrared image, the method further includes:
identifying specified face angles of the face in the visible light image corresponding to the infrared image;
and, if the specified face angles satisfy a predetermined frontal face condition, performing the step of correcting the target position using the position information of the face key points in the visible light image corresponding to the infrared image.
Considering that the human face in the visible light image may be a non-frontal face, in order to further improve the accuracy of correction, angle analysis may be performed on the human face in the visible light image corresponding to the infrared image, and when the human face is a frontal face, the target position is corrected by using the position information of the key point of the human face in the visible light image corresponding to the infrared image.
For example, the specified face angle may include a pitch angle and a yaw angle; accordingly, the predetermined frontal face condition is that the pitch angle and the yaw angle are both within a preset frontal face angle range. Then, after the specified face angle is obtained, if both the pitch angle and the yaw angle in the specified face angle are within the preset frontal face angle range, it is determined that the specified face angle meets the predetermined frontal face condition; otherwise, it is determined that the specified face angle does not meet the predetermined frontal face condition. In addition, any specific implementation manner capable of identifying the specified face angle can be applied to the embodiment of the invention.
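A minimal Python sketch of this check is given below; the range of plus or minus 15 degrees is only an assumed example value for the preset frontal face angle range.

def is_frontal_face(pitch_deg: float, yaw_deg: float,
                    max_abs_angle: float = 15.0) -> bool:
    # The predetermined frontal face condition: both angles lie within the
    # preset frontal face angle range (here assumed to be [-15, 15] degrees).
    return abs(pitch_deg) <= max_abs_angle and abs(yaw_deg) <= max_abs_angle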
In addition, if the specified face angle does not meet the predetermined frontal face condition, the method may further include:
outputting guide information for guiding the person to face the camera for temperature measurement;
or,
determining the posture information of the human face in the visible light image corresponding to the infrared image, and outputting guide information which contains a posture adjustment mode matched with the posture information and is used for guiding the person to face the camera for temperature measurement.
The posture information can be represented by the pitch angle and the yaw angle of the human face. Any manner of determining the posture information of the human face in the visible light image corresponding to the infrared image may be applied to this embodiment. The posture adjustment mode matched with the posture information is a mode of adjusting the posture so that the human face turns toward the camera. For example: suppose the yaw angle of the human face is 0 degrees when the person faces straight ahead, a positive number when the person turns left, and a negative number when the person turns right. Then, if the yaw angle of the human face in the visible light image is -30 degrees, the posture adjustment mode matched with the posture information is turning left; and if the yaw angle of the human face in the visible light image is 30 degrees, the posture adjustment mode matched with the posture information is turning right.
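Under the sign convention just described, a minimal Python sketch of generating the guide information could look as follows; the threshold value and the message wording are assumptions made for illustration.

def guidance_from_yaw(yaw_deg: float, frontal_threshold: float = 15.0) -> str:
    # Yaw is 0 when the person faces straight ahead, positive when the person
    # has turned left, negative when the person has turned right.
    if abs(yaw_deg) <= frontal_threshold:
        return "Please hold still for temperature measurement."
    if yaw_deg < 0:
        return "Please turn left so that your face is toward the camera."
    return "Please turn right so that your face is toward the camera."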
Compared with the foregoing embodiment, this embodiment achieves accurate and effective image-based temperature measurement of a person, and at the same time the target position can be effectively corrected, so that a higher accuracy of the temperature measurement result can be ensured.
Optionally, in an embodiment of the present invention, before the position information of the key point of the human face in the visible light image corresponding to the infrared image is used to correct the target position, the method further includes:
determining the intersection area of the face area of the visible light image corresponding to the infrared image and the face area of the infrared image;
calculating the proportion of the intersection area in the face area of the infrared image;
and if the ratio is larger than a preset ratio threshold value, executing a step of correcting the target position by using the position information of the key points of the human face in the visible light image corresponding to the infrared image.
In order to ensure the effectiveness of the correction, before the position correction is performed by using the visible light image corresponding to the infrared image, it may first be determined whether the face in the infrared image and the face in the visible light image corresponding to the infrared image are the same face, and if so, the step of correcting the target position by using the position information of the key points of the face in the visible light image corresponding to the infrared image is executed. The ratio threshold may be set according to actual conditions, for example, 90%, 95%, and so on.
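For illustration, a minimal Python sketch of this check follows; both face areas are assumed to be given as (corner_x, corner_y, width, height) boxes in a common coordinate system, and the 90% threshold is one of the example values mentioned above.

def faces_match(ir_face, vis_face, ratio_threshold: float = 0.90) -> bool:
    ix, iy, iw, ih = ir_face   # face area of the infrared image
    vx, vy, vw, vh = vis_face  # face area of the corresponding visible light image
    # Intersection rectangle of the two face areas.
    x1, y1 = max(ix, vx), max(iy, vy)
    x2, y2 = min(ix + iw, vx + vw), min(iy + ih, vy + vh)
    intersection = max(0, x2 - x1) * max(0, y2 - y1)
    ir_area = iw * ih
    if ir_area == 0:
        return False
    # Proportion of the intersection area in the face area of the infrared image.
    return intersection / ir_area > ratio_threshold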
Compared with the foregoing embodiment, this embodiment achieves accurate and effective image-based temperature measurement of a person, and at the same time the target position can be effectively corrected, so that a higher accuracy of the temperature measurement result can be ensured.
Corresponding to the method embodiment, the embodiment of the invention provides an image processing device. As shown in fig. 4, the image processing apparatus may include:
a first obtaining module 410, configured to obtain an infrared image;
a first detection module 420, configured to perform living human face detection on the infrared image based on a preset living human face detection model to obtain a first detection result, where the first detection result is used to indicate whether the infrared image is an image of a living human face; the human face living body detection model is used for detecting the temperature distribution in the infrared image, and when the temperature distribution in the infrared image accords with the temperature distribution of a human face living body, the infrared image is determined to be the image of the human face living body;
an output module 430, configured to output temperature information of a target area based on the infrared image when it is determined that a living human face exists in the infrared image based on the first detection result, where the target area is an area capable of reflecting a human body temperature in a human face area of the infrared image.
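Purely as an illustration of the role of the first detection module 420 above, the following Python sketch shows one very simplified, rule-based stand-in for a temperature-distribution based face liveness check; the physiological temperature band and the coverage ratio are assumptions made for this example, and this embodiment actually relies on a preset face liveness detection model rather than such a fixed rule.

import numpy as np

def looks_like_live_face(face_temps: np.ndarray,
                         low: float = 30.0, high: float = 40.0,
                         min_coverage: float = 0.6) -> bool:
    # face_temps: 2-D array of per-pixel temperature values (in degrees Celsius)
    # of the face area in the infrared image.
    in_band = (face_temps >= low) & (face_temps <= high)
    # Treat the region as a live face when most of it lies in a body-temperature band.
    return bool(in_band.mean() >= min_coverage)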
Optionally, as shown in fig. 5, the apparatus further includes:
a second obtaining module 440, configured to obtain a target visible light image before the output module outputs temperature information of a target area based on the infrared image; the target visible light image and the infrared image are images which are related to the same acquisition area and are matched in acquisition time;
the second detection module 450 is configured to perform face detection on the target visible light image to obtain a second detection result; the second detection result is used for representing whether a human face exists in the target visible light image or not;
the output module 430 is specifically configured to:
and outputting temperature information of a target area based on the infrared image when the first detection result shows that the infrared image has the human face living body and the second detection result shows that the target visible light image has the human face.
Optionally, the output module 430 outputs temperature information of a target area based on the infrared image, including:
determining the image position of a target area in the infrared image as a target position based on the face area in the infrared image;
and determining the temperature information of the target area based on the target position, and outputting the temperature information of the target area.
Optionally, the output module 430 is further configured to, before determining the temperature information of the target area based on the target position, correct the target position by using the position information of the key point of the face in the visible light image corresponding to the infrared image, so as to obtain the corrected target position;
the visible light image and the infrared image corresponding to the infrared image are images which are related to the same acquisition area and are matched in acquisition time.
Optionally, the output module 430 determines the temperature information of the target area based on the target position, including:
determining a matrix area indicated by the target position in the temperature measurement thermodynamic matrix; the temperature measurement thermodynamic matrix is a temperature matrix representing the temperature value of the image content in the infrared image;
and selecting the highest temperature value from the matrix area as the temperature information of the target area.
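As a minimal Python sketch of this step, assuming the thermometric matrix is aligned with the infrared image and the target position is expressed in matrix coordinates as (x, y, width, height):

import numpy as np

def temperature_of_target(thermo_matrix: np.ndarray, target_pos) -> float:
    # thermo_matrix holds one temperature value per position of the infrared image content.
    x, y, w, h = target_pos
    region = thermo_matrix[y:y + h, x:x + w]  # matrix area indicated by the target position
    return float(region.max())                # highest temperature value in that area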
Optionally, the output module 430 is further configured to identify a specified face angle of the face in the visible light image corresponding to the infrared image before correcting the target position by using the position information of the key points of the face in the visible light image corresponding to the infrared image; and if the specified face angle meets the predetermined frontal face condition, execute the step of correcting the target position by using the position information of the key points of the face in the visible light image corresponding to the infrared image.
Optionally, the output module 430 is further configured to determine an intersection region of a face region of the visible light image corresponding to the infrared image and a face region of the infrared image before correcting the target position by using the position information of the key point of the face in the visible light image corresponding to the infrared image; calculating the proportion of the intersection area in the face area of the infrared image; and if the ratio is larger than a preset ratio threshold value, executing the step of correcting the target position by using the position information of the key points of the human face in the visible light image corresponding to the infrared image.
An embodiment of the present invention further provides an electronic device, as shown in fig. 6, including a processor 601, a communication interface 602, a memory 603, and a communication bus 604, where the processor 601, the communication interface 602, and the memory 603 communicate with each other through the communication bus 604;
a memory 603 for storing a computer program;
the processor 601 is configured to implement the steps of the image processing method provided by the embodiment of the present invention when executing the program stored in the memory 603.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; or may be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment provided by the present invention, a computer-readable storage medium is further provided, in which a computer program is stored, and the computer program realizes the steps of any one of the image processing methods described above when executed by a processor.
In a further embodiment provided by the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the steps of any of the image processing methods of the above embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the invention are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave, etc.) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the device and apparatus embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference may be made to some descriptions of the method embodiments for relevant points.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. An image processing method, comprising:
acquiring an infrared image;
performing face living body detection on the infrared image based on a preset face living body detection model to obtain a first detection result, wherein the first detection result is used for indicating whether the infrared image is an image of a face living body; the human face living body detection model is used for detecting the temperature distribution in the infrared image, and when the temperature distribution in the infrared image accords with the temperature distribution of a human face living body, the infrared image is determined to be the image of the human face living body;
and outputting temperature information of a target area based on the infrared image when the human face living body exists in the infrared image based on the first detection result, wherein the target area is an area capable of reflecting the temperature of a human body in the human face area of the infrared image.
2. The method of claim 1, wherein before outputting temperature information of a target area based on the infrared image, the method further comprises:
acquiring a target visible light image; the target visible light image and the infrared image are images which are related to the same acquisition area and are matched in acquisition time;
carrying out face detection on the target visible light image to obtain a second detection result; the second detection result is used for representing whether a human face exists in the target visible light image or not;
when the human face living body exists in the infrared image based on the first detection result, outputting temperature information of a target area based on the infrared image, wherein the method comprises the following steps:
and outputting temperature information of a target area based on the infrared image when the first detection result shows that the infrared image has the human face living body and the second detection result shows that the target visible light image has the human face.
3. The method according to claim 1 or 2, wherein outputting temperature information of a target area based on the infrared image comprises:
determining the image position of a target area in the infrared image as a target position based on the face area in the infrared image;
and determining the temperature information of the target area based on the target position, and outputting the temperature information of the target area.
4. The method of claim 3, wherein prior to determining the temperature information of the target area based on the target location, the method further comprises:
correcting the target position by using the position information of key points of the human face in the visible light image corresponding to the infrared image to obtain the corrected target position;
the visible light image corresponding to the infrared image and the infrared image are images which are related to the same acquisition area and are matched in acquisition time.
5. The method of claim 3, wherein determining temperature information for the target area based on the target location comprises:
determining a matrix area indicated by the target position in the temperature measurement thermodynamic matrix; the temperature measurement thermodynamic matrix is a temperature matrix representing the temperature value of the image content in the infrared image;
and selecting the highest temperature value from the matrix area as the temperature information of the target area.
6. The method according to claim 4, wherein before the target position is corrected by using the position information of the key points of the human face in the visible light image corresponding to the infrared image, the method further comprises:
identifying a specified face angle of the face in the visible light image corresponding to the infrared image;
and if the specified face angle meets the predetermined frontal face condition, executing the step of correcting the target position by using the position information of the key point of the face in the visible light image corresponding to the infrared image.
7. The method according to claim 4, wherein before the target position is corrected by using the position information of the key points of the human face in the visible light image corresponding to the infrared image, the method further comprises:
determining an intersection area of a face area of the visible light image corresponding to the infrared image and a face area of the infrared image;
calculating the proportion of the intersection area in the face area of the infrared image;
and if the ratio is larger than a preset ratio threshold value, executing the step of correcting the target position by using the position information of the key points of the human face in the visible light image corresponding to the infrared image.
8. An image processing apparatus characterized by comprising:
the first acquisition module is used for acquiring an infrared image;
the first detection module is used for carrying out human face living body detection on the infrared image based on a preset human face living body detection model to obtain a first detection result, and the first detection result is used for indicating whether the infrared image is an image of a human face living body; the human face living body detection model is used for detecting the temperature distribution in the infrared image, and when the temperature distribution in the infrared image accords with the temperature distribution of a human face living body, the infrared image is determined to be the image of the human face living body;
and the output module is used for outputting temperature information of a target area based on the infrared image when the human face living body exists in the infrared image based on the first detection result, wherein the target area is an area capable of reflecting the temperature of a human body in the human face area of the infrared image.
9. An electronic device, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1 to 7 when executing a program stored in the memory.
10. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 1 to 7.
CN202011494299.2A 2020-12-17 2020-12-17 Image processing method, device and equipment Pending CN112525355A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011494299.2A CN112525355A (en) 2020-12-17 2020-12-17 Image processing method, device and equipment


Publications (1)

Publication Number Publication Date
CN112525355A true CN112525355A (en) 2021-03-19

Family

ID=75000979

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011494299.2A Pending CN112525355A (en) 2020-12-17 2020-12-17 Image processing method, device and equipment

Country Status (1)

Country Link
CN (1) CN112525355A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113408377A (en) * 2021-06-03 2021-09-17 山东交通学院 Face living body detection method based on temperature information
CN114360697A (en) * 2021-12-15 2022-04-15 深圳市航通智能技术有限公司 Remote epidemic prevention operation method, system, equipment and storage medium
CN114894337A (en) * 2022-07-11 2022-08-12 深圳市大树人工智能科技有限公司 Temperature measurement method and device for outdoor face recognition

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109446981A (en) * 2018-10-25 2019-03-08 腾讯科技(深圳)有限公司 A kind of face's In vivo detection, identity identifying method and device
CN110060272A (en) * 2018-01-18 2019-07-26 杭州海康威视数字技术股份有限公司 Determination method, apparatus, electronic equipment and the storage medium of human face region
CN111414831A (en) * 2020-03-13 2020-07-14 深圳市商汤科技有限公司 Monitoring method and system, electronic device and storage medium
CN111507200A (en) * 2020-03-26 2020-08-07 北京迈格威科技有限公司 Body temperature detection method, body temperature detection device and dual-optical camera


Similar Documents

Publication Publication Date Title
CN112525355A (en) Image processing method, device and equipment
CN100361131C (en) Information processing apparatus, information processing method, and computer program
CN111339951A (en) Body temperature measuring method, device and system
CN111626125A (en) Face temperature detection method, system and device and computer equipment
CN105740780B (en) Method and device for detecting living human face
CN111091063A (en) Living body detection method, device and system
CN110390229B (en) Face picture screening method and device, electronic equipment and storage medium
CN111914635A (en) Human body temperature measurement method, device and system and electronic equipment
CN111598865B (en) Hand-foot-mouth disease detection method, device and system based on thermal infrared and RGB double-shooting
CN110059579B (en) Method and apparatus for in vivo testing, electronic device, and storage medium
CN111307331A (en) Temperature calibration method, device, equipment and storage medium
EP2546798A1 (en) Biometric authentication device and biometric authentication method
CN113642639B (en) Living body detection method, living body detection device, living body detection equipment and storage medium
CN112541403B (en) Indoor personnel falling detection method by utilizing infrared camera
CN112084882A (en) Behavior detection method and device and computer readable storage medium
CN111027400A (en) Living body detection method and device
CN108268839A (en) A kind of live body verification method and its system
CN114170690A (en) Method and device for living body identification and construction of living body identification model
CN108875553A (en) Method, apparatus, system and the computer storage medium that the testimony of a witness is veritified
CN113837006A (en) Face recognition method and device, storage medium and electronic equipment
CN113792587A (en) Method and device for acquiring and identifying image, storage medium and electronic equipment
JP7269897B2 (en) Data registration device, biometric authentication device, and data registration program
CN115937971B (en) Method and device for identifying hand-lifting voting
CN116052225A (en) Palmprint recognition method, electronic device, storage medium and computer program product
CN108875472B (en) Image acquisition device and face identity verification method based on image acquisition device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination