WO2021227350A1 - Method, apparatus, electronic device and computer-readable storage medium for measuring temperature - Google Patents

Method, apparatus, electronic device and computer-readable storage medium for measuring temperature

Info

Publication number
WO2021227350A1
Authority
WO
WIPO (PCT)
Prior art keywords
temperature
target part
key point
information
weight information
Prior art date
Application number
PCT/CN2020/120964
Other languages
English (en)
French (fr)
Inventor
冯浩城
岳海潇
王珂尧
张刚
范彦文
余席宇
韩钧宇
刘经拓
丁二锐
王海峰
Original Assignee
北京百度网讯科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京百度网讯科技有限公司 filed Critical 北京百度网讯科技有限公司
Priority to JP2022569508A priority Critical patent/JP2023525378A/ja
Priority to KR1020227041892A priority patent/KR20220166368A/ko
Priority to US17/998,881 priority patent/US20230213388A1/en
Priority to EP20935969.4A priority patent/EP4151968A4/en
Publication of WO2021227350A1 publication Critical patent/WO2021227350A1/zh

Classifications

    • G01J5/0025 Radiation pyrometry, e.g. infrared or optical thermometry, for sensing the radiation of moving living bodies
    • G01J5/025 Interfacing a pyrometer to an external device or network; user interface
    • G01J2005/0077 Imaging
    • G01K7/42 Circuits effecting compensation of thermal inertia; circuits for predicting the stationary value of a temperature
    • G01K13/223 Infrared clinical thermometers, e.g. tympanic
    • G06N3/045 Neural network architectures: combinations of networks
    • G06N3/0464 Neural network architectures: convolutional networks [CNN, ConvNet]
    • G06N3/08 Neural network learning methods
    • G06N3/09 Supervised learning
    • G06V10/143 Image acquisition: sensing or illuminating at different wavelengths
    • G06V10/454 Local feature extraction: integrating filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • G06V10/82 Image or video recognition or understanding using neural networks
    • G06V40/161 Human faces: detection; localisation; normalisation
    • G06V40/166 Human faces: detection; localisation; normalisation using acquisition arrangements
    • G06V40/168 Human faces: feature extraction; face representation
    • G06V40/171 Human faces: local features and components; facial parts; occluding parts, e.g. glasses

Definitions

  • the embodiments of the present disclosure mainly relate to the field of artificial intelligence, specifically computer vision, and more specifically, to methods, devices, electronic devices, and computer-readable storage media for measuring temperature.
  • a method for measuring temperature may include detecting the target part of the object in the input image.
  • the method further includes determining the key points of the target part and weight information of the key points based on the detection result of the target part, and the weight information indicates the probability that the key points are occluded.
  • the method can also include obtaining temperature information at key points.
  • the method may further include determining the temperature of the target part based on at least the temperature information and weight information of the key points.
  • an electronic device, including one or more processors and a storage device for storing one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to the first aspect of the present disclosure.
  • a computer-readable storage medium having a computer program stored thereon, and when the program is executed by a processor, the method according to the first aspect of the present disclosure is implemented.
  • a system for measuring temperature, including: an image acquisition module configured to provide an input image associated with a target part of an object; a calculation module communicatively connected with the image acquisition module and configured to implement the method according to the first aspect of the present disclosure; and an output display module configured to display the processing result of the calculation module.
  • FIG. 1 shows a schematic diagram of an example environment in which multiple embodiments of the present disclosure can be implemented
  • FIG. 2 shows a schematic diagram of a detailed example environment in which multiple embodiments of the present disclosure can be implemented
  • FIG. 3 shows a flowchart of a process for measuring temperature according to an embodiment of the present disclosure
  • FIG. 4 shows a schematic diagram for determining key points and their weight information based on detection results according to an embodiment of the present disclosure
  • a non-contact and automated body temperature measurement method based on thermal imaging technology can usually be used to measure the body temperature of multiple pedestrians at the same time.
  • an infrared thermal imaging device can be used to obtain an infrared thermal imaging map of a pedestrian's face.
  • the detection device can determine that the pedestrian has a fever and issue an alarm message.
  • the accuracy of this type of temperature measurement technology is easily affected by the surrounding environment.
  • the periphery of a pedestrian's face may be blocked by high-temperature objects (for example, mobile phones, hot drinks, etc.).
  • the periphery of the pedestrian's face may be blocked by low-temperature objects (for example, cold food, cold drinks, etc.).
  • the face of a pedestrian who is not feverish may be blocked by a pedestrian who is feverish.
  • a solution for measuring temperature is proposed.
  • the key points of the target part of the object can be determined based on the input image collected by the camera, and the weight information of each key point can be further determined.
  • the weight information is used to indicate the probability that the key point is occluded.
  • the input image 110 may be a real-time monitoring image acquired by an image acquisition device connected to the computing device 120.
  • the image acquisition device may be set in a public place with a large traffic volume, so as to acquire the image information of each person in the crowd passing by the place.
  • the object for which image information is obtained is not limited to humans; it may also include animals whose body temperature needs to be measured in batches (for example, animals in zoos or breeding facilities).
  • the input image 110 may also be a multi-frame image containing the monitored object, that is, a video.
  • the computing device 120 may receive the input image 110, and use the CNN 140 in the computing device 120 to determine the detection area of the target part of the monitored object, such as the face, and then determine the key points and their weight information.
  • the computing device 120 also receives the temperature sensing image 130.
  • the temperature sensing image 130 may be acquired by a temperature sensing device such as an infrared thermal imaging device.
  • the computing device 120 can determine the temperature information of each key point, and determine the temperature 150 of the monitored object from the key points with greater reference value, for example through a weighted average.
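The weighted-average step described above can be sketched as follows. This is an illustrative assumption of how the fusion might work: the function name, the `min_weight` cut-off, and the numbers are hypothetical and not taken from the patent.

```python
def fuse_temperature(temps, weights, min_weight=0.1):
    """Weighted average over key-point temperatures, skipping key points
    whose weight (i.e. confidence of being unoccluded) is too low."""
    pairs = [(t, w) for t, w in zip(temps, weights) if w >= min_weight]
    if not pairs:
        raise ValueError("all key points are occluded")
    total_w = sum(w for _, w in pairs)
    return sum(t * w for t, w in pairs) / total_w

# Third key point (occluded by a hot object, weight 0.05) is excluded,
# so the fused temperature stays near normal body temperature.
temp = fuse_temperature([36.5, 36.7, 39.2], [0.9, 0.8, 0.05])
```

Excluding low-weight key points is what keeps a hot drink or phone near the face from inflating the result.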
  • the key to generating the temperature 150 based on the input image 110 and the temperature sensing image 130 is that the CNN 140 in the computing device 120 is constructed through pre-training. The construction and use of the CNN 140 will be described below with reference to Figure 2.
  • FIG. 2 shows a schematic diagram of a detailed example environment 200 in which multiple embodiments of the present disclosure can be implemented.
  • the example environment 200 may include a computing device 220, an input image 210 and an output result 250.
  • the example environment 200 may include a model training system 270 and a model application system 280 as a whole.
  • the model training system 270 and/or the model application system 280 may be implemented in the computing device 120 as shown in FIG. 1 or the computing device 220 as shown in FIG. 2.
  • the structure and functions of the example environment 200 are described for exemplary purposes only and are not intended to limit the scope of the subject matter described herein.
  • the subject matter described herein can be implemented in different structures and/or functions.
  • the model training system 270 may use the training data set 260 to train the CNN 240 that determines the key points and their weight information.
  • the model application system 280 may receive the trained CNN 240, so that the CNN 240 determines the key points and their weight information based on the determined detection area. It should be understood that the CNN 240 can also be trained to directly determine the key points and their weight information based on the input image 210.
  • CNN 240 may be constructed as a learning network for determining key points and their weight information.
  • a learning network can also be called a learning model, or simply called a network or model.
  • the learning network used to determine the key points and their weight information may include multiple networks, where each network may be a multilayer neural network, which may be composed of a large number of neurons. Through the training process, the corresponding parameters of the neurons in each network can be determined. The parameters of the neurons in these networks are collectively referred to as the parameters of CNN 240.
  • the training process of CNN 240 can be performed in an iterative manner.
  • the model training system 270 may obtain reference images from the training data set 260, and use the reference images to perform one iteration of the training process to update the corresponding parameters of the CNN 240.
  • the model training system 270 may repeatedly perform the above process based on multiple reference images in the training data set 260 until at least some of the parameters of the CNN 240 converge, thereby obtaining the final model parameters.
  • FIG. 3 shows a flowchart of a process 300 for measuring temperature according to an embodiment of the present disclosure.
  • the process 300 may be implemented in the computing device 120 of FIG. 1, the computing device 220 of FIG. 2, or the device shown in FIG. 6.
  • a process 300 for measuring temperature according to an embodiment of the present disclosure will now be described with reference to FIG. 1.
  • the specific examples mentioned in the following description are all exemplary and are not used to limit the protection scope of the present disclosure.
  • the computing device 120 may detect the target part of the object in the input image 110.
  • the computing device 120 may determine the detection area of the target part in the input image 110 through a CNN 140 (such as a detection area generation model).
  • the CNN 140 may perform face region detection on the input image 110.
  • a six-layer convolutional network can be used to extract basic facial features from the input image 110, with each convolutional layer downsampling the image; a fixed number of face anchor regions of different sizes can be preset on the final three convolutional layers, face detection region regression is performed on these anchor regions, and the face detection region is finally obtained.
  • the foregoing examples are only exemplary; convolutional networks with other numbers of layers may also be used, and the approach is not limited to determining the detection area of a human face. In this way, the detection area of the target part in the input image 110 can be quickly identified based on the detection area generation model, so as to prepare for subsequent temperature measurement and even face recognition.
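A small arithmetic sketch of the anchor arrangement described above, assuming (hypothetically) a 512-pixel square input, halving at each of the six layers, and three preset anchor sizes per feature-map cell; none of these numbers come from the patent itself.

```python
def feature_map_sizes(input_size, num_layers=6):
    """Spatial size of each feature map when every conv layer halves resolution."""
    sizes = []
    s = input_size
    for _ in range(num_layers):
        s //= 2
        sizes.append(s)
    return sizes

sizes = feature_map_sizes(512)      # six downsampling conv layers
anchor_maps = sizes[-3:]            # anchors live on the final three maps
anchors_per_cell = 3                # hypothetical number of preset anchor sizes
total_anchors = sum(s * s * anchors_per_cell for s in anchor_maps)
```

The regression head then predicts a face-box offset per anchor; placing anchors only on the coarser final maps keeps the candidate count manageable.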
  • the computing device 120 may determine the key points of the target part and the weight information of the key points based on the detection result of the target part.
  • the weight information is used to indicate the probability that the key point is occluded.
  • the computing device 120 may apply the detection result of the target part to the CNN 140 (such as a key point determination model) to determine key points and weight information.
  • the CNN 140 is trained based on the reference target part in the reference image and the reference key points and reference weight information in the reference target part.
  • the CNN 140 may determine the key points of the face and the weight information of each key point based on the detection result of the face. In this way, the focus of temperature measurement can be focused on parts that are not blocked or affected by objects with abnormal temperatures, thereby improving the accuracy of temperature measurement.
  • FIG. 4 shows in more detail a schematic diagram for determining the key point 420 and its weight information based on the detection result 410 according to an embodiment of the present disclosure.
  • the detected object is a pedestrian
  • the target part is a pedestrian's face, that is, a human face.
  • the CNN 140 can determine multiple key points in the face detection area 410, such as key points 420.
  • CNN 140 can also determine the weight information of each key point. For example, since the key point 420 is occluded by the hand, its weight is determined to be very small. As an example, the weight information is usually set to a value between 0 and 1: the greater the probability that the CNN 140 predicts a key point to be occluded, the smaller the value of its weight information, meaning that the temperature at that key point has little reference value.
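One simple way to realize this mapping from occlusion probability to a weight in [0, 1] is the complement, clamped to the valid range. This is a hypothetical sketch; the patent does not specify the exact mapping.

```python
def occlusion_to_weight(p_occluded):
    """Map a predicted occlusion probability to a key-point weight in [0, 1]:
    the more likely a key point is occluded, the smaller its weight."""
    return max(0.0, min(1.0, 1.0 - p_occluded))

# An almost-certainly-occluded key point gets a near-zero weight.
weights = [occlusion_to_weight(p) for p in (0.02, 0.5, 0.95)]
```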
  • the computing device 120 may obtain temperature information of the key point.
  • the computing device 120 may obtain the temperature sensing image 130 for the target part.
  • the temperature sensing image 130 may be acquired by a temperature sensing device such as an infrared thermal imaging device.
  • the computing device 120 can determine the temperature information corresponding to the position of the key point from the temperature sensing image 130. In this way, the temperature measurement of the identified key points is realized, so as to prepare for the subsequent temperature calculation.
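Looking up temperature at the key-point positions can be as simple as indexing the temperature sensing image with each key point's pixel coordinates. A minimal sketch, assuming the thermal image is a 2-D grid of degrees Celsius and key points are (row, col) pairs (both assumptions for illustration):

```python
def temps_at_keypoints(thermal, keypoints):
    """Read the temperature value at each key point's (row, col) pixel."""
    return [thermal[r][c] for (r, c) in keypoints]

thermal = [
    [36.1, 36.2, 36.3],
    [36.4, 38.9, 36.6],
    [36.7, 36.8, 36.9],
]
readings = temps_at_keypoints(thermal, [(1, 1), (2, 0)])
```

In practice the RGB key-point coordinates would first need to be registered to the infrared image's coordinate frame.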
  • although the temperature information obtained at this point can be used as a basis for calculating the temperature 150, errors may still exist due to factors such as the environment. Therefore, a functional relationship between the measured temperature and the actual temperature can be created for the location where the temperature sensing device and the image acquisition device are installed. For example, the functional relationship can be fitted by the least squares method based on prior knowledge.
  • the computing device 120 can obtain the measured temperature of the key point, and determine the actual temperature of the key point based on the measured temperature. At this time, the accuracy of the actual temperature determined by the computing device 120 is significantly improved.
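A least-squares calibration of this kind, assuming (for illustration) a linear relationship actual = a·measured + b fitted from paired reference readings taken at the installation site; the sample data and function names are hypothetical.

```python
def fit_linear(measured, actual):
    """Ordinary least-squares fit of actual = a * measured + b."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(actual) / n
    sxx = sum((x - mx) ** 2 for x in measured)
    sxy = sum((x - mx) * (y - my) for x, y in zip(measured, actual))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Paired (measured, actual) reference readings; values are made up.
a, b = fit_linear([35.0, 36.0, 37.0, 38.0], [35.6, 36.5, 37.4, 38.3])
calibrated = a * 36.4 + b   # correct a new measured reading
```

Non-linear fits follow the same pattern with more coefficients if the sensor's error is not linear in temperature.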
  • the computing device 120 may determine the temperature of the target site based on at least the temperature information and weight information of the key points.
  • the target part described herein may be at least one of the subject's face, eyes, and hands (including fingerprints), and the subject is not limited to being a human.
  • the computing device 120 can compare the temperature with a threshold temperature, and issue an alarm when the temperature is higher than the threshold temperature. Since the temperatures of various parts of the human body differ, when the detected part is a human face, the corresponding threshold temperature can be set differently from the threshold temperature corresponding to a human hand.
  • the corresponding threshold temperature can also be determined for different types of animals, so as to realize the body temperature test and alarm of different animals.
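The per-part, per-species thresholding above can be kept in a simple lookup table. The threshold values and key names below are illustrative assumptions, not figures from the patent.

```python
# Hypothetical (species, body part) -> alarm threshold in degrees Celsius.
THRESHOLDS = {
    ("human", "face"): 37.3,
    ("human", "hand"): 36.5,
    ("cow", "face"): 39.5,
}

def should_alarm(species, part, temperature):
    """True if the measured temperature exceeds the threshold for this
    species and body part."""
    return temperature > THRESHOLDS[(species, part)]
```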
  • the present disclosure can improve the accuracy of temperature measurement.
  • the present disclosure can be applied to scenarios with multiple pedestrians, multiple animals, etc., without staff intervention, the time and labor costs for temperature measurement can be reduced, and the risk of staff infection during the epidemic can be reduced.
  • the computing device 120 may also recognize the target part based on at least the key points and weight information, and determine the object based on the recognition result. In some embodiments, the computing device 120 may recognize a face based on key points and weight information, and then determine the identity information of the monitored pedestrian. In other embodiments, the computing device 120 may also determine the type of animal being monitored based on key points and weight information. Due to the weighting mechanism, the occluded part will not be used, or will rarely be used, by the computing device 120 for the recognition operation, thereby reducing the possibility of misrecognition by the CNN 140 in the computing device 120.
  • the present disclosure also provides a system 500 for measuring temperature.
  • the system includes an image acquisition module 510, which may be an image sensing device such as an RGB camera and a temperature sensing device such as an infrared thermal imaging device.
  • the system 500 may further include a calculation module 520 communicatively connected with the image acquisition module 510, and the calculation module 520 is used for the various methods and processes described above, such as the process 300.
  • the system 500 may include an output display module 530 for displaying the processing result of the calculation module 520 to the user.
  • the output display module 530 may display the temperature of the monitored object to the user. When the body temperature of the monitored object is higher than a predetermined threshold, the output display module 530 may also be used to send out an alarm signal.
  • the system 500 may be applied to a temperature measurement scenario of multiple pedestrians.
  • the image acquisition module 510 in the system 500 can be deployed at the entrance of a subway or a stadium, so as to acquire images of pedestrians, such as RGB images and infrared images, in real time.
  • the calculation module 520 can perform image processing, such as the process 300, on the RGB images and infrared images, thereby obtaining the pedestrians' temperature information.
  • the output display module 530 can flag the pedestrian through a variety of warning methods, and the system can monitor the temperature information of multiple pedestrians passing through the entrance in real time. In this way, direct contact between security and epidemic prevention personnel and suspected patients is avoided or reduced, the temperature measurement process is simple and efficient, and artificial congestion is avoided.
  • the system 500 may be applied to a farm or zoo.
  • the image acquisition module 510 in the system 500 can be installed at an optimal viewing angle in a farm or zoo, so as to monitor animal body temperature information in real time.
  • the calculation module 520 can identify the types of animals, thereby determining the type of animal whose temperature has been measured and obtaining the body temperature threshold for that type of animal. Once an animal's body temperature is found to be higher than the threshold, the output display module 530 can flag the animal through a variety of warning methods, so that the staff can treat or otherwise deal with it. In this way, direct contact between workers and animals that may carry germs is avoided or reduced.
  • FIG. 6 shows a block diagram of an apparatus 600 for measuring temperature according to an embodiment of the present disclosure.
  • the device 600 may include: a target part detection module 602 configured to detect the target part of the object in the input image; a key point information determination module 604 configured to determine, based on the detection result of the target part, the key points of the target part and the weight information of the key points, the weight information indicating the probability that the key points are occluded; a temperature information acquisition module 606 configured to acquire temperature information of the key points; and a temperature determination module 608 configured to determine the temperature of the target part based on at least the temperature information and weight information of the key points.
  • the key point information determination module 604 may include: a detection result application module configured to apply the detection result of the target part to the key point determination model to determine key points and weight information.
  • the key point determination model is trained based on the reference target part in the reference image and the reference key points and reference weight information in the reference target part.
  • the temperature information acquisition module 606 may include: a temperature sensing image acquisition module configured to acquire a temperature sensing image for the target part; and a temperature information determination module configured to determine, from the temperature sensing image, the temperature information corresponding to the locations of the key points.
  • the temperature information acquisition module 606 may include: a measured temperature acquisition module configured to acquire the measured temperature of the key point; and an actual temperature determination module configured to determine the actual temperature of the key point based on the measured temperature.
  • the device 600 may further include: a target part recognition module configured to recognize the target part based on at least key points and weight information; and an object determination module configured to determine the object based on the recognition result.
  • the target part detection module may include: a detection area determination module configured to determine the detection area of the target part in the input image through the detection area generation model.
  • FIG. 7 shows a block diagram of a computing device 700 capable of implementing various embodiments of the present disclosure.
  • the device 700 may be used to implement the computing device 120 in FIG. 1 or the computing device 220 in FIG. 2.
  • the device 700 includes a central processing unit (CPU) 701, which can perform various appropriate actions and processing based on computer program instructions stored in a read-only memory (ROM) 702 or loaded from a storage unit 708 into a random access memory (RAM) 703.
  • in the RAM 703, various programs and data required for the operation of the device 700 can also be stored.
  • the CPU 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704.
  • An input/output (I/O) interface 705 is also connected to the bus 704.
  • connected to the I/O interface 705 are: an input unit 706, such as a keyboard or mouse; an output unit 707, such as various types of displays and speakers; a storage unit 708, such as a magnetic disk or optical disk; and a communication unit 709, such as a network card, modem, or wireless communication transceiver.
  • the communication unit 709 allows the device 700 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
  • the processing unit 701 executes the various methods and processes described above, such as the process 300.
  • the process 300 may be implemented as a computer software program, which is tangibly contained in a machine-readable medium, such as the storage unit 708.
  • part or all of the computer program may be loaded and/or installed on the device 700 via the ROM 702 and/or the communication unit 709.
  • the CPU 701 may be configured to execute the process 300 in any other suitable manner (for example, by means of firmware).
  • exemplary types of hardware logic components include: Field Programmable Gate Arrays (FPGA), Application Specific Integrated Circuits (ASIC), Application Specific Standard Products (ASSP), Systems on Chip (SOC), Complex Programmable Logic Devices (CPLD), and so on.
  • the program code for implementing the method of the present disclosure can be written in any combination of one or more programming languages. These program codes can be provided to the processors or controllers of general-purpose computers, special-purpose computers, or other programmable data processing devices, so that when the program codes are executed by the processor or controller, the functions and operations specified in the flowcharts and/or block diagrams are implemented.
  • the program code can be executed entirely on the machine, partly executed on the machine, partly executed on the machine and partly executed on the remote machine as an independent software package, or entirely executed on the remote machine or server.
  • a machine-readable medium may be a tangible medium, which may contain or store a program for use by or in combination with an instruction execution system, apparatus, or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • the machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • machine-readable storage media would include electrical connections based on one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the foregoing.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Radiation Pyrometers (AREA)
  • Image Analysis (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A method (300), apparatus (600), electronic device and computer-readable storage medium for measuring temperature, relating to the field of artificial intelligence, and in particular to computer vision. The method (300) comprises: detecting a target part of an object in an input image (110, 210) (302); determining, based on a detection result (410) of the target part, key points (420) of the target part and weight information of the key points (420), the weight information indicating the probability that a key point (420) is occluded (304); acquiring temperature information of the key points (420) (306); and determining the temperature of the target part based at least on the temperature information and the weight information of the key points (420) (308). The method can acquire pedestrians' body temperature information quickly, efficiently, and at low cost, thereby reducing the time and labor cost of temperature measurement.

Description

Method and apparatus for measuring temperature, electronic device, and computer-readable storage medium
This application claims priority to Chinese Patent Application No. 202010415405.7, filed on May 15, 2020.
Technical Field
Embodiments of the present disclosure relate generally to the field of artificial intelligence, and in particular to computer vision, and more specifically to a method, apparatus, electronic device, and computer-readable storage medium for measuring temperature.
Background
With economic development and the expansion of transportation infrastructure, high-density flows of people frequently occur at public transport entrances and at the entrances of scenic spots and venues. Because epidemics of infectious disease may break out and persist for long periods, both body temperature measurement and identity recognition for high-density crowds are important parts of epidemic prevention and control. However, current contactless temperature measurement typically requires security or epidemic-prevention personnel to check pedestrians one by one with a thermometer gun and to verify identities one by one by scanning ID cards. This approach is clearly inefficient. Moreover, since it can cause congestion, the gathering of high-density crowds is often unavoidable, which in turn puts more people at risk of infection. How to measure body temperature efficiently and accurately is therefore an urgent problem for epidemic prevention and control.
Summary
According to example embodiments of the present disclosure, a solution for measuring temperature is provided.
In a first aspect of the present disclosure, a method for measuring temperature is provided. The method may include detecting a target part of an object in an input image. The method further includes determining, based on a detection result of the target part, key points of the target part and weight information of the key points, the weight information indicating the probability that a key point is occluded. The method may also include acquiring temperature information of the key points. In addition, the method may further include determining the temperature of the target part based at least on the temperature information and the weight information of the key points.
In a second aspect of the present disclosure, an apparatus for measuring temperature is provided, including: a target part detection module configured to detect a target part of an object in an input image; a key point information determination module configured to determine, based on a detection result of the target part, key points of the target part and weight information of the key points, the weight information indicating the probability that a key point is occluded; a temperature information acquisition module configured to acquire temperature information of the key points; and a temperature determination module configured to determine the temperature of the target part based at least on the temperature information and the weight information of the key points.
In a third aspect of the present disclosure, an electronic device is provided, including one or more processors and a storage apparatus for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to the first aspect of the present disclosure.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored, the program implementing the method according to the first aspect of the present disclosure when executed by a processor.
In a fifth aspect of the present disclosure, a system for measuring temperature is provided, including: an image acquisition module configured to provide an input image associated with a target part of an object; a computing module communicatively connected to the image acquisition module, the computing module being configured to implement the method according to the first aspect of the present disclosure; and an output presentation module configured to present a processing result of the computing module.
It should be understood that the content described in this Summary is not intended to limit key or essential features of embodiments of the present disclosure, nor to limit the scope of the present disclosure. Other features of the present disclosure will become readily understood from the following description.
Brief Description of the Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent in conjunction with the accompanying drawings and with reference to the following detailed description. In the drawings, identical or similar reference numerals denote identical or similar elements, in which:
Fig. 1 shows a schematic diagram of an example environment in which various embodiments of the present disclosure can be implemented;
Fig. 2 shows a schematic diagram of a detailed example environment in which various embodiments of the present disclosure can be implemented;
Fig. 3 shows a flowchart of a process for measuring temperature according to an embodiment of the present disclosure;
Fig. 4 shows a schematic diagram of determining key points and their weight information based on a detection result according to an embodiment of the present disclosure;
Fig. 5 shows a block diagram of a system for measuring temperature according to an embodiment of the present disclosure;
Fig. 6 shows a block diagram of an apparatus for measuring temperature according to an embodiment of the present disclosure; and
Fig. 7 shows a block diagram of a computing device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure can be implemented in various forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of protection of the present disclosure.
In the description of embodiments of the present disclosure, the term "including" and similar terms should be understood as open-ended inclusion, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first", "second", and the like may refer to different or identical objects. Other explicit and implicit definitions may also be included below.
For epidemic prevention and control, contactless and automated temperature measurement based on thermal imaging can typically be used to measure the body temperature of multiple pedestrians at the same time. For example, an infrared thermal imaging device can be used to acquire an infrared thermogram of a pedestrian's face. When the thermogram shows the pedestrian's face in a high-temperature state (for example, the forehead area appears red), the detection device can determine that the pedestrian has a fever and issue an alarm.
However, the accuracy of this type of temperature measurement is easily affected by the surrounding environment. For example, the area around a pedestrian's face may be occluded by a hot object (for example, a mobile phone or a hot drink). As another example, it may be occluded by a cold object (for example, cold food or a cold drink). As yet another example, the face of a pedestrian without a fever may be occluded by a pedestrian with a fever. Such occlusions usually lead to false temperature alarms, degrading the performance of the detection device. In addition, when the face is occluded, conventional face recognition mechanisms also usually output inaccurate recognition results.
As mentioned above, there is an urgent need for a temperature measurement method that can acquire pedestrians' body temperature quickly, efficiently, and at low cost, thereby reducing the time and labor cost of temperature measurement.
According to embodiments of the present disclosure, a solution for measuring temperature is proposed. In this solution, key points of a target part of an object can be determined based on an input image captured by a camera, and weight information of each key point is further determined. Here, the weight information indicates the probability that a key point is occluded. On the other hand, temperature information at the key points can be acquired by a temperature sensing device such as an infrared thermal imaging device, and the temperature of the target part is determined based on the temperature information and the weight information of the key points. In this way, even if a target part such as a person's face is occluded by a hot object (for example, a mobile phone or a hot drink), the weight information of the key points in the occluded region can be determined to be very small or negligible. As a result, the pedestrian's body temperature can be determined mainly based on the temperature of the parts that are not occluded and not affected by other abnormal temperatures.
Embodiments of the present disclosure will be described in detail below with reference to the drawings. Fig. 1 shows a schematic diagram of an example environment 100 in which various embodiments of the present disclosure can be implemented. As shown in Fig. 1, the example environment 100 includes an input image 110, a computing device 120, a temperature sensing image 130, and an output temperature 150. In addition, the computing device 120 includes a convolutional neural network (CNN) 140. It should be understood that the CNN 140 in Fig. 1 is merely exemplary and can be replaced by other artificial intelligence networks with learning capability.
The input image 110 may be a real-time monitoring image acquired by an image acquisition device connected to the computing device 120. As an example, the image acquisition device may be installed in a public place with heavy pedestrian traffic, so as to acquire image information of each person in the crowd passing through that place. It should be understood that the objects whose image information is acquired are not limited to people, but may also include animals whose body temperature needs to be measured in batches (for example, animals in a zoo or a breeding facility). In addition, the input image 110 may also be multiple frames of images of the monitored object, i.e., a video. The computing device 120 may receive the input image 110 and, through the CNN 140 in the computing device 120, determine a detection region of a target part of the monitored object, such as the face, and then determine the key points and their weight information.
On the other hand, the computing device 120 also receives a temperature sensing image 130. The temperature sensing image 130 may be acquired by a temperature sensing device such as an infrared thermal imaging device. By registering the temperature sensing device with the image acquisition device described above, pixel-level alignment of the images of the two devices is achieved. Thus, the computing device 120 can determine the temperature information of each key point and determine the temperature 150 of the monitored object based on the temperature information of the more informative key points, for example by weighted averaging.
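The weighted-averaging step mentioned above can be sketched as follows. This is a minimal illustration in plain Python; the function name and the fallback behavior when every key point is occluded are assumptions for illustration, not part of the disclosure.

```python
def fuse_keypoint_temperatures(temps, weights, eps=1e-6):
    """Fuse per-key-point temperatures into one target-part temperature.

    temps   : per-key-point temperatures (deg C)
    weights : per-key-point weights in [0, 1]; a small weight means the
              key point is likely occluded, so its reading contributes less.
    """
    total = sum(weights)
    if total < eps:
        # Every key point occluded: no reliable reading is available.
        return None
    return sum(t * w for t, w in zip(temps, weights)) / total
```

For example, a key point occluded by a hot mobile phone (weight near 0) barely shifts the fused estimate, which is the intended effect of the weight mechanism.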
In Fig. 1, the key to generating the temperature 150 based on the input image 110 and the temperature sensing image 130 is that the CNN 140 in the computing device 120 is constructed through pre-training. The construction and use of the CNN 140 will be described below with reference to Fig. 2.
Fig. 2 shows a schematic diagram of a detailed example environment 200 in which various embodiments of the present disclosure can be implemented. Similar to Fig. 1, the example environment 200 may include a computing device 220, an input image 210, and an output result 250. The difference is that the example environment 200 may generally include a model training system 270 and a model application system 280. As an example, the model training system 270 and/or the model application system 280 may be implemented in the computing device 120 shown in Fig. 1 or the computing device 220 shown in Fig. 2. It should be understood that the structure and functions of the example environment 200 are described for exemplary purposes only and are not intended to limit the scope of the subject matter described herein, which may be implemented in different structures and/or functions.
As described above, the process of determining the detection region of a target part of the monitored object, such as the face, and the process of determining the key points and their weight information can each be divided into two stages: a model training stage and a model application stage. As an example, for the process of determining the detection region of the target part, in the model training stage the model training system 270 may use a training data set 260 to train a CNN 240 that determines detection regions. In the model application stage, the model application system 280 may receive the trained CNN 240, so that the CNN 240 determines a detection region based on the input image 210. It should be understood that the training data set 260 may be a massive set of annotated monitoring images.
As another example, for the process of determining the key points and their weight information, in the model training stage the model training system 270 may use the training data set 260 to train a CNN 240 that determines key points and their weight information. In the model application stage, the model application system 280 may receive the trained CNN 240, so that the CNN 240 determines the key points and their weight information based on the determined detection region. It should be understood that the CNN 240 may also be trained to determine the key points and their weight information directly from the input image 110.
In other embodiments, the CNN 240 may be constructed as a learning network for determining key points and their weight information. Such a learning network may also be referred to as a learning model, or simply as a network or model. In some embodiments, the learning network for determining key points and their weight information may include multiple networks, each of which may be a multi-layer neural network composed of a large number of neurons. Through the training process, the respective parameters of the neurons in each network can be determined. The parameters of the neurons in these networks are collectively referred to as the parameters of the CNN 240.
The training process of the CNN 240 may be performed iteratively. Specifically, the model training system 270 may acquire a reference image from the training data set 260 and use the reference image to perform one iteration of the training process, updating the respective parameters of the CNN 240. The model training system 270 may repeat this process based on multiple reference images in the training data set 260 until at least some of the parameters of the CNN 240 converge, thereby obtaining the final model parameters.
The technical solutions described above are merely examples and do not limit the present invention. It should be understood that the networks may also be arranged in other manners and connection relationships. To explain the principle of the above solution more clearly, the temperature measurement process will be described in more detail below with reference to Fig. 3.
Fig. 3 shows a flowchart of a process 300 for measuring temperature according to an embodiment of the present disclosure. In some embodiments, the process 300 may be implemented in the computing device 120 of Fig. 1, the computing device 220 of Fig. 2, or the device shown in Fig. 6. The process 300 for measuring temperature according to an embodiment of the present disclosure is now described with reference to Fig. 1. For ease of understanding, the specific examples mentioned in the following description are all exemplary and do not limit the scope of protection of the present disclosure.
At 302, the computing device 120 may detect a target part of an object in the input image 110. As an example, the computing device 120 may determine a detection region of the target part in the input image 110 through the CNN 140 (such as a detection region generation model). In some embodiments, the CNN 140 may perform face region detection on the input image 110. For example, basic facial features may be extracted from the input image 110 through a six-layer convolutional network, where each convolutional layer performs one image downsampling; based on the last three convolutional layers, fixed numbers of face anchor regions of different sizes are preset respectively to perform face detection region regression, finally obtaining the detection region of the face. It should be understood that this example is merely illustrative: convolutional networks with other numbers of layers may be used, and the detection is not limited to determining the detection region of a face. In this way, the detection region of the target part in the input image 110 can be quickly identified based on the detection region generation model, preparing for subsequent temperature measurement and even face recognition.
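As a rough illustration of the resolution schedule implied by the six-layer network above, the following sketch computes the feature-map sizes after each downsampling stage. The input size and the use of integer halving are assumptions for illustration; the disclosure does not specify them.

```python
def feature_map_sizes(input_size, num_stages=6):
    """Spatial size after each of six convolution stages, where each
    stage performs one downsampling (halving) of the image."""
    sizes = []
    size = input_size
    for _ in range(num_stages):
        size = max(1, size // 2)  # halve, never below 1 pixel
        sizes.append(size)
    return sizes

def anchor_map_sizes(input_size):
    """Anchor regions of preset sizes are attached to the last three
    feature maps, which are used for detection region regression."""
    return feature_map_sizes(input_size)[-3:]
```

For a 256-pixel input this yields feature maps of 128, 64, 32, 16, 8, and 4 pixels, with anchors on the 16-, 8-, and 4-pixel maps.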
At 304, the computing device 120 may determine, based on the detection result of the target part, key points of the target part and weight information of the key points. The weight information indicates the probability that a key point is occluded. As an example, the computing device 120 may apply the detection result of the target part to the CNN 140 (such as a key point determination model) to determine the key points and the weight information. The CNN 140 is trained based on reference target parts in reference images as well as reference key points and reference weight information in the reference target parts. In some embodiments, the CNN 140 may determine, based on the detection result of a face, the key points of the face and the weight information of each key point. In this way, temperature measurement can focus on the parts that are not occluded and not affected by objects with abnormal temperatures, thereby improving the accuracy of temperature measurement.
Fig. 4 shows in more detail a schematic diagram of determining key points 420 and their weight information based on a detection result 410 according to an embodiment of the present disclosure. As shown in Fig. 4, the detected object is a pedestrian, and the target part is the pedestrian's face. After the CNN 140, such as a key point determination model, obtains an image in which the face detection region 410 has been identified, the CNN 140 can determine multiple key points within the face detection region 410, such as the key point 420. At the same time, the CNN 140 can also determine the weight information of each key point. For example, since the key point 420 is occluded by a hand and a mobile phone, its weight information is determined to be very small. As an example, the weight information is usually set to a value between 0 and 1; the greater the probability predicted by the CNN 140 that a key point is occluded, the smaller the value of its weight information, meaning that the temperature at that key point has less reference value.
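One simple way to realize the inverse relationship described above — a larger predicted occlusion probability yielding a smaller weight in [0, 1] — is a linear mapping. This is only a sketch under that assumption; the disclosure does not fix a particular mapping.

```python
def occlusion_to_weight(p_occluded):
    """Map a predicted occlusion probability in [0, 1] to a key-point
    weight: the more likely the key point is occluded, the smaller the
    weight, so its temperature reading contributes less downstream."""
    if not 0.0 <= p_occluded <= 1.0:
        raise ValueError("occlusion probability must lie in [0, 1]")
    return 1.0 - p_occluded
```

A fully visible key point (probability 0) keeps full weight 1, while a key point judged certainly occluded (probability 1) is effectively ignored.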
Returning to Fig. 3: at 306, the computing device 120 may acquire the temperature information of the key points. As an example, the computing device 120 may acquire a temperature sensing image 130 for the target part. The temperature sensing image 130 may be acquired by a temperature sensing device such as an infrared thermal imaging device. By registering the temperature sensing device with the image acquisition device described above, pixel-level alignment of the images of the two devices is achieved. Thus, the computing device 120 can determine, from the temperature sensing image 130, the temperature information corresponding to the positions of the key points. In this way, temperature measurement of the identified key points is achieved, preparing for the subsequent temperature calculation.
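Because the two devices are pixel-aligned after registration, the per-key-point lookup reduces to indexing the thermal image at each key point's coordinates. The sketch below assumes a plain 2-D array of temperatures and simple rounding/clamping of coordinates; these details are illustrative assumptions.

```python
def keypoint_temperatures(thermal_image, keypoints):
    """Look up the temperature at each key point in a registered
    (pixel-aligned) thermal image.

    thermal_image : 2-D nested list, thermal_image[row][col] in deg C
    keypoints     : iterable of (row, col) pixel coordinates
    """
    height = len(thermal_image)
    width = len(thermal_image[0])
    temps = []
    for row, col in keypoints:
        # Round sub-pixel coordinates and clamp to image bounds.
        r = min(max(int(round(row)), 0), height - 1)
        c = min(max(int(round(col)), 0), width - 1)
        temps.append(thermal_image[r][c])
    return temps
```

The resulting per-key-point readings feed directly into the weighted fusion of step 308.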
It should be understood that although the temperature information acquired at this point can serve as the basis for calculating the temperature 150, it may still contain errors due to environmental and other factors. Therefore, a functional relationship between the measured temperature and the actual temperature at the location where the temperature sensing device and the image acquisition device are installed can be created. For example, this functional relationship can be fitted by the least squares method based on prior knowledge. Thus, the computing device 120 can acquire the measured temperature of a key point and determine the actual temperature of the key point based on the measured temperature. The accuracy of the actual temperature determined by the computing device 120 is thereby significantly improved.
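A minimal least-squares calibration of the kind described could look like the following sketch. A linear model actual ≈ a·measured + b is an assumption here; the disclosure only says the relationship is fitted by least squares from prior data.

```python
def fit_calibration(measured, actual):
    """Fit actual = a * measured + b by ordinary least squares over
    paired reference readings collected at the installation site."""
    n = len(measured)
    mean_x = sum(measured) / n
    mean_y = sum(actual) / n
    sxx = sum((x - mean_x) ** 2 for x in measured)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(measured, actual))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

def calibrate(a, b, t_measured):
    """Convert a raw sensor reading into a calibrated temperature."""
    return a * t_measured + b
```

The coefficients would be fitted once per installation site and then applied to every key-point reading.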
At 308, the computing device 120 may determine the temperature of the target part based at least on the temperature information and the weight information of the key points. It should be understood that the target part described herein may be at least one of the object's face, eyes, or hands (including fingerprints), and the object is not limited to a person. For example, after determining the temperature of the target part, the computing device 120 may compare the temperature with a threshold temperature and issue an alarm when the temperature is higher than the threshold temperature. Since the temperatures of different parts of the human body differ, when a person's face is detected, the corresponding threshold temperature may be set differently from the threshold temperature corresponding to a person's hand. In addition, when measuring the body temperature of animals in a zoo or breeding facility, since the body temperature of each species is usually different, corresponding threshold temperatures can also be determined for different types of animals, thereby enabling body temperature testing and alarming for different animals.
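The per-part, per-species threshold comparison described above can be sketched as a lookup table plus a comparison. The table keys and the numeric thresholds below are purely hypothetical placeholders; real values would come from domain knowledge, not from the disclosure.

```python
# Hypothetical fever thresholds (deg C) per detected part or species.
FEVER_THRESHOLDS = {
    "human_face": 37.3,
    "human_hand": 36.5,
}

def should_alarm(part_or_species, temperature,
                 thresholds=FEVER_THRESHOLDS):
    """Return True when the fused temperature exceeds the threshold
    configured for the detected body part or animal species."""
    threshold = thresholds.get(part_or_species)
    if threshold is None:
        raise KeyError(f"no threshold configured for {part_or_species!r}")
    return temperature > threshold
```

Keeping the thresholds in a table makes it straightforward to add entries for new animal species recognized by the computing module.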
In the above manner, by identifying multiple key points and determining the temperature and weight information at each key point in a fine-grained way, the present disclosure can improve the accuracy of temperature measurement. In addition, since the present disclosure can be applied to scenarios with multiple pedestrians, multiple animals, and the like without requiring staff intervention, it can reduce the time and labor cost of temperature measurement and lower the risk of staff infection during an epidemic.
In addition, the computing device 120 may also recognize the target part based at least on the key points and the weight information, and determine the object based on the recognition result. In some embodiments, the computing device 120 may recognize a face based on the key points and the weight information, and then determine the identity information of the monitored pedestrian. In other embodiments, the computing device 120 may also determine the species of the monitored animal based on the key points and the weight information. Owing to the weight mechanism, the occluded parts are in fact not used, or rarely used, by the computing device 120 in the recognition operation, thereby reducing the possibility of misrecognition by the CNN 140 in the computing device 120.
In addition, the present disclosure also provides a system 500 for measuring temperature. As shown in Fig. 5, the system includes an image acquisition module 510, which may be an image sensing device such as an RGB camera together with a temperature sensing device such as an infrared thermal imaging device. The system 500 may also include a computing module 520 communicatively connected to the image acquisition module 510, the computing module 520 being used for the various methods and processes described above, such as the process 300. In addition, the system 500 may include an output presentation module 530 for presenting the processing results of the computing module 520 to a user. For example, the output presentation module 530 may present the temperature of the monitored object to the user. When the body temperature of the monitored object is higher than a predetermined threshold, the output presentation module 530 may also issue an alarm signal.
In this way, system-level contactless temperature measurement can be achieved, and the accuracy of temperature measurement is significantly improved without increasing the demand for computing power.
In some embodiments, the system 500 can be applied to temperature measurement scenarios with multiple pedestrians. For example, the image acquisition module 510 of the system 500 can be deployed at a subway or venue entrance to capture pedestrians' RGB images and infrared images in real time; the computing module 520 can process the RGB images and infrared images as in the process 300 to obtain the pedestrians' body temperature information. Once a pedestrian's body temperature is found to be higher than a predetermined threshold, the output presentation module 530 can lock onto the pedestrian through various alarm methods. The system can monitor in real time the body temperature information of multiple pedestrians passing through the entrance. In this way, direct contact between security and epidemic-prevention personnel and suspected patients is avoided or reduced, and the temperature measurement process is simple and efficient without causing congestion.
In some embodiments, the system 500 can be applied to a farm or a zoo. For example, the image acquisition module 510 of the system 500 can be deployed at the best viewing angle of the farm or zoo to monitor the animals' body temperature information in real time. In addition, the computing module 520 can perform species recognition on the animals to determine the species of the animal being measured and thereby obtain the body temperature threshold of that species. Once an animal's body temperature is found to be higher than the threshold, the output presentation module 530 can lock onto the animal through various alarm methods, making it convenient for staff to treat or handle it. In this way, direct contact between staff and animals that may carry pathogens is avoided or reduced.
Fig. 6 shows a block diagram of an apparatus 600 for measuring temperature according to an embodiment of the present disclosure. As shown in Fig. 6, the apparatus 600 may include: a target part detection module 602 configured to detect a target part of an object in an input image; a key point information determination module 604 configured to determine, based on a detection result of the target part, key points of the target part and weight information of the key points, the weight information indicating the probability that a key point is occluded; a temperature information acquisition module 606 configured to acquire temperature information of the key points; and a temperature determination module 608 configured to determine the temperature of the target part based at least on the temperature information and the weight information of the key points.
In some embodiments, the key point information determination module 604 may include: a detection result application module configured to apply the detection result of the target part to a key point determination model to determine the key points and the weight information, the key point determination model being trained based on reference target parts in reference images as well as reference key points and reference weight information in the reference target parts.
In some embodiments, the temperature information acquisition module 606 may include: a temperature sensing image acquisition module configured to acquire a temperature sensing image for the target part; and a temperature information determination module configured to determine, from the temperature sensing image, the temperature information corresponding to the positions of the key points.
In some embodiments, the temperature information acquisition module 606 may include: a measured temperature acquisition module configured to acquire a measured temperature of the key points; and an actual temperature determination module configured to determine an actual temperature of the key points based on the measured temperature.
In some embodiments, the apparatus 600 may further include: a target part recognition module configured to recognize the target part based at least on the key points and the weight information; and an object determination module configured to determine the object based on the recognition result.
In some embodiments, the target part may be at least one of the object's face, eyes, or fingerprints.
In some embodiments, the target part detection module may include: a detection region determination module configured to determine a detection region of the target part in the input image through a detection region generation model.
Fig. 7 shows a block diagram of a computing device 700 capable of implementing various embodiments of the present disclosure. The device 700 can be used to implement the computing device 120 of Fig. 1 or the computing device 220 of Fig. 2. As shown, the device 700 includes a central processing unit (CPU) 701, which can perform various appropriate actions and processes according to computer program instructions stored in a read-only memory (ROM) 702 or loaded from a storage unit 708 into a random access memory (RAM) 703. The RAM 703 can also store various programs and data required for the operation of the device 700. The CPU 701, the ROM 702, and the RAM 703 are connected to one another through a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
Multiple components in the device 700 are connected to the I/O interface 705, including: an input unit 706, such as a keyboard or a mouse; an output unit 707, such as various types of displays and speakers; a storage unit 708, such as a magnetic disk or an optical disk; and a communication unit 709, such as a network card, a modem, or a wireless communication transceiver. The communication unit 709 allows the device 700 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
The processing unit 701 performs the various methods and processes described above, such as the process 300. For example, in some embodiments, the process 300 may be implemented as a computer software program tangibly contained in a machine-readable medium, such as the storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed on the device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the CPU 701, one or more steps of the process 300 described above can be performed. Alternatively, in other embodiments, the CPU 701 may be configured to perform the process 300 in any other suitable manner (for example, by means of firmware).
The functions described herein above may be performed at least in part by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that can be used include: Field Programmable Gate Arrays (FPGA), Application Specific Integrated Circuits (ASIC), Application Specific Standard Products (ASSP), Systems on Chip (SOC), Complex Programmable Logic Devices (CPLD), and so on.
The program code for implementing the methods of the present disclosure can be written in any combination of one or more programming languages. The program code can be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, so that, when executed by the processor or controller, the program code causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code can be executed entirely on a machine, partly on a machine, partly on a machine and partly on a remote machine as an independent software package, or entirely on a remote machine or server.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that may contain or store a program for use by or in combination with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of machine-readable storage media would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In addition, although the operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, although several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features described in the context of separate embodiments can also be implemented in combination in a single implementation. Conversely, various features described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable sub-combination.
Although the subject matter has been described in language specific to structural features and/or methodological logical actions, it should be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or actions described above. Rather, the specific features and actions described above are merely example forms of implementing the claims.

Claims (17)

  1. A method for measuring temperature, comprising:
    detecting a target part of an object in an input image;
    determining, based on a detection result of the target part, key points of the target part and weight information of the key points, the weight information indicating the probability that a key point is occluded;
    acquiring temperature information of the key points; and
    determining the temperature of the target part based at least on the temperature information and the weight information of the key points.
  2. The method according to claim 1, wherein determining the key points and the weight information comprises:
    applying the detection result of the target part to a key point determination model to determine the key points and the weight information, the key point determination model being trained based on a reference target part in a reference image as well as reference key points and reference weight information in the reference target part.
  3. The method according to claim 1, wherein acquiring the temperature information of the key points comprises:
    acquiring a temperature sensing image for the target part; and
    determining, from the temperature sensing image, temperature information corresponding to the positions of the key points.
  4. The method according to claim 1, wherein acquiring the temperature information of the key points comprises:
    acquiring a measured temperature of the key points; and
    determining an actual temperature of the key points based on the measured temperature.
  5. The method according to claim 1, further comprising:
    recognizing the target part based at least on the key points and the weight information; and
    determining the object based on a result of the recognition.
  6. The method according to claim 1, wherein the target part is at least one of the object's face, eyes, or fingerprints.
  7. The method according to claim 1, wherein detecting the target part comprises:
    determining a detection region of the target part in the input image through a detection region generation model.
  8. An apparatus for measuring temperature, comprising:
    a target part detection module configured to detect a target part of an object in an input image;
    a key point information determination module configured to determine, based on a detection result of the target part, key points of the target part and weight information of the key points, the weight information indicating the probability that a key point is occluded;
    a temperature information acquisition module configured to acquire temperature information of the key points; and
    a temperature determination module configured to determine the temperature of the target part based at least on the temperature information and the weight information of the key points.
  9. The apparatus according to claim 8, wherein the key point information determination module comprises:
    a detection result application module configured to apply the detection result of the target part to a key point determination model to determine the key points and the weight information, the key point determination model being trained based on a reference target part in a reference image as well as reference key points and reference weight information in the reference target part.
  10. The apparatus according to claim 8, wherein the temperature information acquisition module comprises:
    a temperature sensing image acquisition module configured to acquire a temperature sensing image for the target part; and
    a temperature information determination module configured to determine, from the temperature sensing image, temperature information corresponding to the positions of the key points.
  11. The apparatus according to claim 8, wherein the temperature information acquisition module comprises:
    a measured temperature acquisition module configured to acquire a measured temperature of the key points; and
    an actual temperature determination module configured to determine an actual temperature of the key points based on the measured temperature.
  12. The apparatus according to claim 8, further comprising:
    a target part recognition module configured to recognize the target part based at least on the key points and the weight information; and
    an object determination module configured to determine the object based on a result of the recognition.
  13. The apparatus according to claim 8, wherein the target part is at least one of the object's face, eyes, or fingerprints.
  14. The apparatus according to claim 8, wherein the target part detection module comprises:
    a detection region determination module configured to determine a detection region of the target part in the input image through a detection region generation model.
  15. An electronic device, comprising:
    one or more processors; and
    a storage apparatus for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-7.
  16. A computer-readable storage medium on which a computer program is stored, the program implementing the method according to any one of claims 1-7 when executed by a processor.
  17. A system for measuring temperature, comprising:
    an image acquisition module configured to provide an input image associated with a target part of an object;
    a computing module communicatively connected to the image acquisition module, the computing module being configured to implement the method according to any one of claims 1-7; and
    an output presentation module configured to present a processing result of the computing module.
PCT/CN2020/120964 2020-05-15 2020-10-14 测量温度的方法、装置、电子设备和计算机可读存储介质 WO2021227350A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2022569508A JP2023525378A (ja) 2020-05-15 2020-10-14 温度を測定するための方法、装置、電子機器及びコンピュータ読み取り可能な記憶媒体
KR1020227041892A KR20220166368A (ko) 2020-05-15 2020-10-14 온도 측정 방법, 장치, 전자 기기 및 컴퓨터 판독 가능 저장 매체
US17/998,881 US20230213388A1 (en) 2020-05-15 2020-10-14 Method and apparatus for measuring temperature, and computer-readable storage medium
EP20935969.4A EP4151968A4 (en) 2020-05-15 2020-10-14 METHOD AND DEVICE FOR TEMPERATURE MEASUREMENT, ELECTRONIC DEVICE AND COMPUTER READABLE STORAGE MEDIUM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010415405.7A CN111595450B (zh) 2020-05-15 2020-05-15 测量温度的方法、装置、电子设备和计算机可读存储介质
CN202010415405.7 2020-05-15

Publications (1)

Publication Number Publication Date
WO2021227350A1 true WO2021227350A1 (zh) 2021-11-18

Family

ID=72182846

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/120964 WO2021227350A1 (zh) 2020-05-15 2020-10-14 测量温度的方法、装置、电子设备和计算机可读存储介质

Country Status (6)

Country Link
US (1) US20230213388A1 (zh)
EP (1) EP4151968A4 (zh)
JP (1) JP2023525378A (zh)
KR (1) KR20220166368A (zh)
CN (1) CN111595450B (zh)
WO (1) WO2021227350A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114022907A (zh) * 2021-12-27 2022-02-08 东北农业大学 一种基于深度学习的猪只体表测温装置及方法
CN114152349A (zh) * 2021-11-30 2022-03-08 深圳Tcl新技术有限公司 温度测量方法、装置、存储介质及电子设备

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111595450B (zh) * 2020-05-15 2022-03-25 北京百度网讯科技有限公司 测量温度的方法、装置、电子设备和计算机可读存储介质
CN112418513A (zh) * 2020-11-19 2021-02-26 青岛海尔科技有限公司 温度预测方法及装置、存储介质、电子装置
CN116897012A (zh) * 2021-07-26 2023-10-17 京东方科技集团股份有限公司 中医体质识别方法、装置、电子设备、存储介质及程序
CN114550269A (zh) * 2022-03-02 2022-05-27 北京百度网讯科技有限公司 口罩佩戴检测方法、设备和介质
CN117611378A (zh) * 2022-05-25 2024-02-27 连云港银丰食用菌科技有限公司 一种食用菌储存和养护方法及系统

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5333784A (en) * 1993-03-02 1994-08-02 Exergen Corporation Radiation detector with thermocouple calibration and remote temperature reference
CN105095905A (zh) * 2014-04-18 2015-11-25 株式会社理光 目标识别方法和目标识别装置
CN108353128A (zh) * 2015-10-27 2018-07-31 富士胶片株式会社 摄像系统以及对象检测装置及其工作方法
CN208420179U (zh) * 2018-05-29 2019-01-22 浙江双视红外科技股份有限公司 一种闸机单元及闸机系统
CN110987189A (zh) * 2019-11-21 2020-04-10 北京都是科技有限公司 对目标对象进行温度检测的方法、系统以及装置
CN111595450A (zh) * 2020-05-15 2020-08-28 北京百度网讯科技有限公司 测量温度的方法、装置、电子设备和计算机可读存储介质
CN111666826A (zh) * 2020-05-15 2020-09-15 北京百度网讯科技有限公司 处理图像的方法、装置、电子设备和计算机可读存储介质

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080142713A1 (en) * 1992-05-05 2008-06-19 Automotive Technologies International, Inc. Vehicular Occupant Sensing Using Infrared
CN100587422C (zh) * 2006-09-27 2010-02-03 费安路 大流量人群体温检测方法及其装置
CN102147835A (zh) * 2010-11-26 2011-08-10 中华人民共和国深圳出入境检验检疫局 用于口岸车道的司机体温自动测量系统及实现方法
JP6630999B2 (ja) * 2014-10-15 2020-01-15 日本電気株式会社 画像認識装置、画像認識方法、および、画像認識プログラム
CN109196517B (zh) * 2016-06-08 2022-06-21 松下知识产权经营株式会社 对照装置和对照方法
CN108932456B (zh) * 2017-05-23 2022-01-28 北京旷视科技有限公司 人脸识别方法、装置和系统及存储介质
CN107403141B (zh) * 2017-07-05 2020-01-10 中国科学院自动化研究所 人脸检测方法及装置、计算机可读存储介质、设备
CN107361748B (zh) * 2017-07-20 2023-11-28 歌尔股份有限公司 一种体温测试方法和装置
CN109960974A (zh) * 2017-12-22 2019-07-02 北京市商汤科技开发有限公司 人脸关键点检测方法、装置、电子设备及存储介质
JP6981277B2 (ja) * 2018-01-26 2021-12-15 富士フイルムビジネスイノベーション株式会社 検出装置、及び検出プログラム
CN108344525B (zh) * 2018-02-09 2020-06-23 英华达(上海)科技有限公司 自适应体温监控方法及系统
JP7061551B2 (ja) * 2018-10-17 2022-04-28 鹿島建設株式会社 状態判定システム
CN110595622A (zh) * 2019-08-01 2019-12-20 广东美的白色家电技术创新中心有限公司 一种红外测温方法及加热设备

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5333784A (en) * 1993-03-02 1994-08-02 Exergen Corporation Radiation detector with thermocouple calibration and remote temperature reference
CN105095905A (zh) * 2014-04-18 2015-11-25 株式会社理光 目标识别方法和目标识别装置
CN108353128A (zh) * 2015-10-27 2018-07-31 富士胶片株式会社 摄像系统以及对象检测装置及其工作方法
CN208420179U (zh) * 2018-05-29 2019-01-22 浙江双视红外科技股份有限公司 一种闸机单元及闸机系统
CN110987189A (zh) * 2019-11-21 2020-04-10 北京都是科技有限公司 对目标对象进行温度检测的方法、系统以及装置
CN111595450A (zh) * 2020-05-15 2020-08-28 北京百度网讯科技有限公司 测量温度的方法、装置、电子设备和计算机可读存储介质
CN111666826A (zh) * 2020-05-15 2020-09-15 北京百度网讯科技有限公司 处理图像的方法、装置、电子设备和计算机可读存储介质

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
See also references of EP4151968A4
YU, KUAI: "AI Security Team in Anti-epidemic Battle", BIG DATA ERA, no. 2, 30 April 2020 (2020-04-30), pages 38 - 49, XP009531813, ISSN: 2096-255X *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114152349A (zh) * 2021-11-30 2022-03-08 深圳Tcl新技术有限公司 温度测量方法、装置、存储介质及电子设备
CN114152349B (zh) * 2021-11-30 2023-11-14 深圳Tcl新技术有限公司 温度测量方法、装置、存储介质及电子设备
CN114022907A (zh) * 2021-12-27 2022-02-08 东北农业大学 一种基于深度学习的猪只体表测温装置及方法

Also Published As

Publication number Publication date
EP4151968A1 (en) 2023-03-22
JP2023525378A (ja) 2023-06-15
US20230213388A1 (en) 2023-07-06
KR20220166368A (ko) 2022-12-16
EP4151968A4 (en) 2024-03-27
CN111595450A (zh) 2020-08-28
CN111595450B (zh) 2022-03-25

Similar Documents

Publication Publication Date Title
WO2021227350A1 (zh) 测量温度的方法、装置、电子设备和计算机可读存储介质
Mohammed et al. Novel COVID-19 detection and diagnosis system using IOT based smart helmet
KR102021999B1 (ko) 인체 감시 발열 경보 장치
CN111666826A (zh) 处理图像的方法、装置、电子设备和计算机可读存储介质
CN111402481A (zh) 一种带有测量体温功能的智能门禁系统、控制方法及计算机可读存储介质
CN111898580B (zh) 针对戴口罩人群的体温和呼吸数据采集系统、方法及设备
CN111626125A (zh) 人脸温度检测的方法、系统、装置和计算机设备
CN112432709B (zh) 人体测温的方法和系统
US11170894B1 (en) Access and temperature monitoring system (ATMs)
JP2012235415A (ja) 画像処理システム、発熱者特定方法、画像処理装置およびその制御方法と制御プログラム
Garg Drowsiness detection of a driver using conventional computer vision application
US11393595B2 (en) Method for determining a disease outbreak condition at a transit facility
WO2021227351A1 (zh) 目标部位跟踪方法、装置、电子设备和可读存储介质
Ulleri et al. Development of contactless employee management system with mask detection and body temperature measurement using TensorFlow
CN113569671A (zh) 异常行为报警方法、装置
US20220208392A1 (en) Monitoring potential contact based infection events
CN109545392B (zh) 基于物联网的远程监控方法、装置、设备及介质
US11727568B2 (en) Rapid illness screening of a population using computer vision and multispectral data
Caliwag et al. Distance estimation in thermal cameras using multi-task cascaded convolutional neural network
Suryadi et al. On the comparison of social distancing violation detectors with graphical processing unit support
Yan et al. Dynamic group difference coding based on thermal infrared face image for fever screening
Al Maashri et al. A novel drone-based system for accurate human temperature measurement and disease symptoms detection using thermography and AI
TWI777689B (zh) 物件辨識暨體溫量測方法
CN113158933A (zh) 一种走失人员识别的方法、系统、装置及存储介质
CN111854963A (zh) 温度检测的方法、装置、设备和计算机设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20935969

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022569508

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20227041892

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020935969

Country of ref document: EP

Effective date: 20221215