CN111297337A - Detection object judgment method, system, machine readable medium and equipment

Detection object judgment method, system, machine readable medium and equipment

Info

Publication number
CN111297337A
CN111297337A
Authority
CN
China
Prior art keywords
detection
target
face
temperature
area
Prior art date
Legal status: Pending (assumed; not a legal conclusion)
Application number
CN202010116863.0A
Other languages
Chinese (zh)
Inventor
周曦
姚志强
李军
Current Assignee
Yuncong Technology Group Co Ltd
Original Assignee
Yuncong Technology Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Yuncong Technology Group Co Ltd filed Critical Yuncong Technology Group Co Ltd
Priority claimed from application CN202010116863.0A
Publication of CN111297337A
Legal status: Pending

Classifications

    • A61B 5/015 — Measuring temperature of body parts by temperature mapping
    • A61B 5/746 — Alarms related to a physiological condition, e.g. setting alarm thresholds or avoiding false alarms
    • G06F 18/22 — Pattern recognition: matching criteria, e.g. proximity measures
    • G06F 18/24 — Pattern recognition: classification techniques
    • G06N 3/02 — Neural networks
    • G06V 40/10 — Recognition of human or animal bodies or body parts in image or video data
    • G06V 40/166 — Human face detection, localisation or normalisation using acquisition arrangements
    • A61B 2560/0252 — Measuring ambient temperature for compensation or correction of the measured physiological value
    • G06V 40/178 — Estimating age from a face image; using age information to improve recognition


Abstract

The invention provides a method for judging a detection object, comprising: acquiring multiple types of images of the detection object; determining a target detection area of the detection object based on the multiple types of images; acquiring a detection index of the target detection area; and acquiring an attribute of the target detection area, where different target detection area attributes correspond to different detection index thresholds. Whether the detection object is a target object is then judged according to the detection index of the target detection area and the detection index threshold corresponding to the attribute associated with that area. With this method, infrared imaging can screen out a person with abnormal body temperature as long as the person's target detection area appears in the thermal infrared image, without requiring the person to stand at a specific temperature measurement target point; this markedly reduces the miss rate and improves system throughput.

Description

Detection object judgment method, system, machine readable medium and equipment
Technical Field
The invention relates to the field of abnormal condition detection, in particular to a method, a system, a machine readable medium and equipment for judging a detected object.
Background
When screening for abnormal human body temperature in open places with heavy foot traffic, such as passage checkpoints, railway stations and subway stations, detection equipment is often needed to quickly screen people with abnormal body temperature in a non-contact manner. Infrared thermal imaging devices are now commonly used for this, but the devices currently on the market often present the following problems in use:
In existing infrared thermal imaging temperature measurement, accurate measurement often requires the person being measured to stand at a fixed position relative to the measuring equipment (the "temperature measurement target point" for short), and, while at that point, to face the equipment so that the forehead directly faces it. These two requirements are often difficult to satisfy simultaneously, so such temperature measurement systems suffer from a high miss rate and low throughput.
Moreover, infrared thermal imaging equipment measures body temperature at a fixed temperature measurement target point, and the measured value exhibits temperature drift under different environments, so such systems also suffer from insufficient reliability.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, it is an object of the present invention to provide a method, system, machine-readable medium and device for determining a detected object, which are used to solve the problems of the prior art.
In order to achieve the above and other related objects, the present invention provides a method for determining a detection object, including:
acquiring multiple types of images of a detection object;
determining a target detection area of the detection object based on the plurality of types of images;
acquiring a detection index of the target detection area;
and judging whether the detection object is the target object or not according to the detection index of the target detection area and the detection index threshold corresponding to the attribute associated with the target detection area.
Optionally, the method further comprises:
acquiring the attribute associated with the target detection area; the attributes associated with different target detection areas correspond to different detection index thresholds;
and determining a detection index threshold corresponding to the attribute according to the attribute associated with the target detection area.
Optionally, the attribute associated with the target detection area includes at least one of: face attributes, environment attributes, distance attributes.
Optionally, if the attribute associated with the target detection area is a distance attribute, a corresponding detection index threshold is determined according to the distance between the target detection area and the image acquisition device.
Optionally, if the attribute associated with the target detection area is an environmental attribute, a corresponding detection index threshold is determined according to the ambient temperature of the environment in which the target detection area is located.
Optionally, the face attributes include at least one of: age, face angle, gender.
Optionally, a corresponding detection index threshold is determined according to the face angle.
Optionally, when the detection index of the detection object meets a first preset condition, the detection object is a target object.
Optionally, a second preset condition is set based on the target object, and when the detection object meets the second preset condition, the detection object is a potential target object.
Optionally, the method further comprises:
and tracking the target object or/and the potential target object by adopting a face recognition technology or a human body recognition technology.
Optionally, the detection indicator comprises temperature.
Optionally, the multiple types of images include visible light images and infrared images.
Optionally, the determining a target detection area of the detection object based on the multiple types of images includes:
detecting a target part of the visible light image to obtain a target part position;
and determining a target detection area of the target part position in the infrared image of the detection object.
Optionally, the target portion includes a face, a forehead, a back of hand, a neck, and a shoulder, and the target detection region includes a face region, a forehead region, a back of hand region, a neck region, and a shoulder region.
Optionally, sorting the detection objects or the potential target objects according to detection indexes; or classifying the detection objects according to age, face angle, gender and face size, and sequencing the detection objects in each classification according to detection indexes.
Optionally, the method further comprises:
and when the target object is detected, sending an alarm prompt.
To achieve the above and other related objects, the present invention provides a detection object judging system, including:
the image acquisition module is used for acquiring various types of images of the detection object;
a target area detection module for determining a target detection area of the detection object based on the plurality of types of images;
the detection index acquisition module is used for acquiring the detection index of the target detection area;
and the target object judging module is used for judging whether the detection object is the target object according to the detection index of the target detection area and the detection index threshold corresponding to the attribute associated with the target detection area.
Optionally, the system further comprises:
the attribute acquisition module is used for acquiring the attribute associated with the target detection area; the attributes associated with different target detection areas correspond to different detection index thresholds;
and the detection index threshold value determining module is used for determining a detection index threshold value corresponding to the attribute according to the attribute associated with the target detection area.
Optionally, the attribute associated with the target detection area includes at least one of: face attributes, environment attributes, distance attributes.
Optionally, if the attribute associated with the target detection area is a distance attribute, a corresponding detection index threshold is determined according to the distance between the target detection area and the image acquisition device.
Optionally, if the attribute associated with the target detection area is an environmental attribute, a corresponding detection index threshold is determined according to the ambient temperature of the environment in which the target detection area is located.
Optionally, the face attributes include at least one of: age, face angle, gender.
Optionally, a corresponding detection index threshold is determined according to the face angle.
Optionally, when the detection index of the detection object meets a first preset condition, the detection object is a target object.
Optionally, a second preset condition is set based on the target object, and when the detection object meets the second preset condition, the detection object is a potential target object.
Optionally, the system further comprises:
and the tracking module is used for tracking the target object or/and the potential target object by adopting a face recognition technology or a human body recognition technology.
Optionally, the detection indicator comprises temperature.
Optionally, the multiple types of images include visible light images and infrared images.
Optionally, the target area detection module includes:
the target part detection unit is used for detecting a target part of the visible light image to obtain a target part position;
and a target area determination unit, configured to determine the target detection area corresponding to the target part position in the infrared image of the detection object.
Optionally, the target portion includes a face, a forehead, a back of hand, a neck, and a shoulder, and the target detection region includes a face region, a forehead region, a back of hand region, a neck region, and a shoulder region.
Optionally, sorting the detection objects or the potential target objects according to detection indexes; or classifying the detection objects according to age, face angle, gender and face size, and sequencing the detection objects in each classification according to detection indexes.
Optionally, the system further comprises:
and the alarm module is used for sending out an alarm prompt when the target object is detected.
To achieve the above and other related objects, the present invention provides an apparatus comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform one or more of the methods described previously.
To achieve the foregoing and other related objectives, the present invention provides one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform one or more of the methods described above.
As described above, the method, system, machine-readable medium and device for determining a detection object provided by the present invention have the following advantages:
the invention provides a method for judging a detection object, which comprises the following steps: acquiring multiple types of images of a detection object; determining a target detection area of the detection object based on the plurality of types of images; acquiring a detection index of the target detection area; and judging whether the detection object is the target object or not according to the detection index of the target detection area and the detection index threshold corresponding to the attribute associated with the target detection area. According to the method, the abnormal body temperature person can be screened out only by the fact that the positive thermal infrared image appears in the target detection area of the person to be detected and not requiring the abnormal body temperature person to be located at a specific temperature measurement target point position through the infrared imaging technology, and through the method, the omission factor can be obviously reduced, and the system throughput is improved. Meanwhile, a human body temperature relative value detection technology is adopted to detect the abnormal body temperature personnel.
Drawings
Fig. 1 is a flowchart of a method for determining a detection object according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating a temperature detection process performed on a face area according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of spatial matching of a visible light image and a thermal infrared image according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a hardware structure of a detected object management system according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a hardware structure of a terminal device according to another embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
As shown in fig. 1, a detection object determination method includes:
S11, acquiring multiple types of images of the detection object;
S12, determining a target detection area of the detection object based on the multiple types of images;
S13, acquiring a detection index of the target detection area;
S14, judging whether the detection object is a target object according to the detection index of the target detection area and the detection index threshold corresponding to the attribute associated with the target detection area.
The invention acquires multiple types of images of a detection object; determines a target detection area of the detection object based on the multiple types of images; acquires a detection index of the target detection area; and judges whether the detection object is a target object according to the detection index of the target detection area and the detection index threshold corresponding to the attribute associated with that area. With this method, infrared imaging can screen out a person with abnormal body temperature as long as the person's target detection area appears in the thermal infrared image, without requiring the person to stand at a specific temperature measurement target point; this markedly reduces the miss rate and improves system throughput. Meanwhile, a relative-value detection technique for human body temperature is used to detect persons with abnormal body temperature.
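As an illustrative sketch only (not the patent's actual implementation), the flow of steps S11 to S14 can be arranged as below; the stub detectors, function names and threshold values are all assumptions:

```python
# Sketch of the S11-S14 flow. The stub detectors and the numeric
# threshold values are illustrative assumptions.

def determine_target_area(visible_img, infrared_img):
    # S12: stand-in for face detection on the visible image plus mapping
    # into the registered infrared image; here a fixed region is returned.
    return {"attribute": "frontal_face", "region": (100, 80, 60, 60)}

def read_detection_index(infrared_img, region):
    # S13: stand-in for reading the region temperature from the IR image.
    return 37.5

# Assumed per-attribute detection index (temperature) thresholds.
THRESHOLDS = {"frontal_face": 37.3, "side_face": 36.9}

def is_target_object(visible_img, infrared_img):
    # S14: compare the detection index against the threshold that
    # corresponds to the attribute associated with the target area.
    area = determine_target_area(visible_img, infrared_img)
    index = read_detection_index(infrared_img, area["region"])
    return index > THRESHOLDS[area["attribute"]]

print(is_target_object(None, None))  # True with these stub values
```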
In an embodiment, the method further comprises:
acquiring the attribute associated with the target detection area; the attributes associated with different target detection areas correspond to different detection index thresholds;
and determining a detection index threshold corresponding to the attribute according to the attribute associated with the target detection area.
In one embodiment, the attribute associated with the target detection area includes at least one of: face attributes, environment attributes, distance attributes.
In this embodiment, the multiple types of images may include visible light images and infrared images. The visible light image may be acquired by a visible light image sensor and the infrared image by an infrared image sensor; alternatively, an RGB-IR image sensor (which can receive the RGB and IR components simultaneously) may capture the image, after which an RGB-IR processing unit separates the received RGB-IR image data into a synchronized RGB image (visible light image) and IR image (infrared image). In yet another embodiment, both types of images may be collected by a single device, for example an infrared temperature probe capable of capturing the visible light image and the infrared image at the same time.
In this embodiment, the determining a target detection area of the detection object based on the plurality of types of images includes:
detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the infrared image of the detection object.
After the target detection area is determined, the detection index of the target detection area of the detection object can be obtained, and whether the detection object is the target object is judged according to that index. In this embodiment, different detection index thresholds may be set according to the attribute of the target detection area, and the threshold used for comparison is the one corresponding to the attribute at hand. Specifically, for example, if the attribute of the target detection area is A, the detection index threshold is B; if the attribute is A1, the threshold is B1, and so on.
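The "attribute A corresponds to threshold B, attribute A1 to threshold B1" correspondence above amounts to a lookup table; a minimal sketch with assumed placeholder values:

```python
# Placeholder thresholds for the abstract attributes A and A1 in the text;
# the numeric values are assumptions for illustration only.
DETECTION_INDEX_THRESHOLDS = {"A": 37.3, "A1": 37.0}

def threshold_for(attribute: str) -> float:
    """Return the detection index threshold for a target-area attribute."""
    return DETECTION_INDEX_THRESHOLDS[attribute]

print(threshold_for("A"))   # 37.3
print(threshold_for("A1"))  # 37.0
```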
In one embodiment, the target part includes a face, a forehead, a back of a hand, a neck, and a shoulder, and the target detection region includes a face region, a forehead region, a back of a hand region, a neck region, and a shoulder region. In the present embodiment, the detection index includes temperature.
The invention uses infrared images to acquire the temperature of the target detection area. From the principles of infrared thermometry: any object with a temperature above absolute zero (−273.15 °C) continuously emits infrared (thermal) radiation. Infrared radiation is an electromagnetic wave with wavelengths in the range of about 0.7 µm to 1000 µm; it is invisible to the human eye, and the spectrum of the emitted radiation varies with the object's temperature. When a heat-sensitive material (the detector) absorbs infrared radiation, its temperature rises, and the thermal imaging equipment derives the corresponding temperature information from that rise. Generally, infrared thermometry acquires the temperature of a detection object, compares it with a set temperature threshold, for example 37.3 °C, and then determines whether the object is feverish. In infrared temperature measurement, many factors affect accuracy, so the actual temperature of the target detection area may differ from the measured temperature; comparing against a single fixed temperature threshold therefore degrades the result. Based on this, the present invention compares against different temperature thresholds, i.e. different detection index thresholds are used for different target detection area attributes.
In an embodiment, a description is given of performing temperature detection on a face area of a detection object by using a target detection area as the face area and using a detection index as a temperature, where a specific flow is shown in fig. 2, and includes:
S21, acquiring a visible light image of the detection object in the imaging area;
S22, acquiring an infrared image of the detection object in the imaging area;
S23, performing image space registration by a pattern recognition method; the registration ensures that the visible light image and the infrared image are spatially matched;
as shown in fig. 3, in the ABCD rectangular area in the visible light image, the ABCD rectangular area in the infrared image forms spatial registration, and in the two registration areas, the relative coordinates have a correspondence relationship. Therefore, the position of the target site in the visible light image in the target detection region in the infrared image can be determined by the coordinate conversion algorithm.
S24, analyzing the visible light image with a face detection algorithm, detecting the face region in the image, determining the temperature measurement target point within it, and recording the relative position of that point in the registration area;
S25, mapping the detected temperature measurement target point into the registration area of the infrared image, and acquiring the temperature of the face area there.
It will be appreciated that, since infrared detection is a continuous process, detection may run over a period of time and produce multiple infrared images. When detecting the temperature of the face region, images from the same moment are therefore needed: the infrared image at the current time and the visible light image at the current time. To determine the face area, face detection is first performed on the visible light image at the current time to obtain the face position; that position is then mapped into the infrared image of the detection object at the same time, yielding the face area of the detection object in the current infrared image.
After the face area is determined, the temperature of the face area in the infrared image at the current moment can be measured to obtain the temperature of the face area.
It can be understood that, since the image of the face area is an infrared image, a correspondence between color and temperature can be obtained in advance; the temperature corresponding to the colors in the face area is then read from this correspondence, giving the temperature of the face area.
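A sketch of reading a region temperature from such a pre-obtained correspondence; the linear intensity-to-temperature mapping and taking the maximum as the region reading are assumptions for illustration (real devices ship their own calibration curves):

```python
def region_temperature(region, t_min=20.0, t_max=42.0):
    """Convert an 8-bit infrared region (list of pixel rows) to degrees
    Celsius with an assumed linear intensity-to-temperature mapping,
    then take the hottest pixel as the face-region reading."""
    temps = [t_min + (p / 255.0) * (t_max - t_min)
             for row in region for p in row]
    return max(temps)

# A 2x2 toy region; the hottest pixel (200) dominates the reading.
print(round(region_temperature([[100, 120], [200, 180]]), 2))  # 37.25
```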
In one embodiment, the face attributes include at least one of: age, face angle, gender. And setting different temperature thresholds according to different human face attributes. And if the face attribute is a face angle, determining a corresponding detection index threshold value, such as a temperature threshold value, according to the face angle.
Different temperature thresholds can be set according to different age groups because different ages have different temperatures.
For example, children have a higher metabolic rate and a body temperature slightly higher than adults, so the temperature threshold for children can be set slightly higher; the elderly have a lower metabolic rate and may have a body temperature slightly lower than younger adults, so their threshold can be set slightly lower. The temperature threshold may be an empirical value, or the age of the detected object may be input to a trained neural-network-based temperature threshold model to obtain the threshold corresponding to that age.
The specific method is as follows: first determine the age (or age group) of the detection object and the temperature threshold corresponding to it, then compare the detected temperature with that threshold to judge whether the detection object is a target object; if the detected temperature is higher than the threshold, the detection object is a target object.
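The age-banded comparison can be sketched as follows; the band boundaries and threshold values are illustrative assumptions, not empirical values from the patent:

```python
# Assumed age bands (upper bound, threshold in degrees Celsius).
AGE_THRESHOLDS = [
    (12, 37.5),   # children: higher metabolic rate, slightly higher threshold
    (60, 37.3),   # adults
    (200, 37.1),  # elderly: lower metabolic rate, slightly lower threshold
]

def is_target_by_age(temperature: float, age: int) -> bool:
    """Pick the threshold for the age band, then compare: the object is a
    target object when the detected temperature exceeds that threshold."""
    for upper, threshold in AGE_THRESHOLDS:
        if age < upper:
            return temperature > threshold
    return False

print(is_target_by_age(37.4, 70))  # True: exceeds the elderly threshold 37.1
print(is_target_by_age(37.4, 8))   # False: below the child threshold 37.5
```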
Similarly, differences in sex also lead to differences in body temperature: on average, women's body temperature is about 0.3 °C higher than men's, so different temperature thresholds need to be set for measurements of different sexes.
The specific method is as follows: first determine the sex of the detection object and the temperature threshold corresponding to it, then compare the detected temperature with that threshold to judge whether the detection object is a target object; if the detected temperature is higher than the threshold, the detection object is a target object. The temperature threshold referred to here may be an empirical value.
In one embodiment, the temperature of the face is measured, and the measured face temperature is different at different face angles, while the measured face temperature is considered to be the most accurate. Therefore, if a frontal face is not obtained when measuring the face temperature, different temperature thresholds need to be set according to the face angle (the angle of the face with respect to the acquisition device).
The temperature threshold may be an empirical value, or a face angle of the detected object may be input to a trained neural network-based temperature threshold model to obtain a temperature threshold corresponding to the face angle.
The specific method is as follows: first determine the face angle of the detection object and the temperature threshold corresponding to it, then compare the detected temperature with that threshold to judge whether the detection object is a target object; if the detected temperature is higher than the temperature threshold, the detection object is a target object.
In this embodiment, the distance attribute refers to the distance between the detection object and the image acquisition device, and the environment attribute refers to the temperature of the environment in which the detection object is located.
It can be understood that the measured temperature of the target detection area is related to the distance between the image acquisition device and the target detection area. Therefore, in an embodiment, that distance is divided into segments A, B, C and D with corresponding temperature thresholds A1, B1, C1 and D1. If the distance between the target detection area and the image acquisition device falls in segment A, threshold A1 is used for comparison when measuring temperature; if it falls in segment B, threshold B1 is used, and so on.
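The segmented distance scheme can be sketched as a simple lookup. The boundary distances and threshold values below are assumptions chosen only to illustrate the A/B/C/D segmentation; a thermal camera generally reads lower at longer range, hence the decreasing thresholds.

```python
import bisect

# Assumed segment boundaries (metres) and thresholds (deg C):
# segment A: < 1.0 m, B: < 2.0 m, C: < 4.0 m, D: >= 4.0 m
SEGMENT_BOUNDS = [1.0, 2.0, 4.0]
SEGMENT_THRESHOLDS = [37.3, 37.2, 37.1, 37.0]  # A1, B1, C1, D1

def threshold_for_distance(distance_m: float) -> float:
    """Pick the threshold of the segment the measured distance falls in."""
    return SEGMENT_THRESHOLDS[bisect.bisect_right(SEGMENT_BOUNDS, distance_m)]
```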
The temperature threshold may be an empirical value, or the distance between the detection object and the image acquisition device may be input to a trained neural-network-based temperature threshold model to obtain the temperature threshold corresponding to that distance.
In one embodiment, the environment attribute refers to the temperature of the environment in which the detection object is located. Because environmental factors affect the accuracy of infrared temperature measurement, different temperature thresholds can be used for comparison according to the ambient temperature of the target detection area. Specifically, the ambient temperature is obtained and the corresponding threshold is selected. More specifically, when the ambient temperature is low, the detected temperature value tends to be lower than the true body temperature, so a lower temperature threshold is used for comparison; when the ambient temperature is high, the detected value may be higher than the true body temperature, so a higher threshold should be used.
The temperature threshold may be an empirical value, or the temperature of the environment in which the detection object is located may be input to a trained neural-network-based temperature threshold model to obtain the temperature threshold corresponding to that ambient temperature.
In an embodiment, when the detection index of a detection object meets a first preset condition, the detection object is a target object.
Taking temperature as an example, the first preset condition may be exceeding a preset threshold, for example a preset temperature threshold, which may be set to 37.3 ℃ or to another value; this embodiment does not limit it. When the temperature of the target detection area, i.e. the face region, exceeds 37.3 ℃, the detection object can be classified as a target object, and an alarm prompt is issued when a target object is detected.
It will be appreciated that the user may set alarm parameters such as preset temperature thresholds, alarm sensitivity, etc. For example, when the temperature of the target detection area exceeds 37.3 ℃, an alarm is given. The alarm mode can be various, such as sound and light indication alarm or voice alarm.
The alarm can also be shown in eye-catching colors on the screen that displays the infrared image. Alarm levels can be set, with different levels displayed in different colors and triggering different alarm signals. For example, a low alarm level may trigger only a sound-and-light alert, a higher level may trigger a voice alarm, and the highest level may trigger sound-and-light and voice alarms simultaneously.
Still taking temperature as an example, all detection objects are sorted by temperature, and the first preset condition may be falling within the top N% of detection objects. N may be preset, for example 10; alternatively, outliers of the statistical human body temperature distribution curve, identified through curve analysis, may be used as the criterion.
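The "top N% by temperature" condition can be sketched as below. The reading format and the default N are assumptions for the example.

```python
def top_percent_targets(readings, n_percent=10):
    """Return ids of the hottest n_percent of detection objects.

    readings: list of (object_id, temperature) pairs.
    """
    if not readings:
        return []
    ranked = sorted(readings, key=lambda r: r[1], reverse=True)
    # At least one object is kept whenever any readings exist.
    count = max(1, round(len(ranked) * n_percent / 100))
    return [object_id for object_id, _ in ranked[:count]]
```

A deployed system might instead fit the temperature distribution curve and flag statistical outliers, as the text notes.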
In an embodiment, a second preset condition is set based on the target object; when a detection object meets the second preset condition, it is a potential target object. A potential target object is defined as a close contact of the target object, that is, a detection object who lives or works directly with a target object, including colleagues in the same office, students in the same class, classroom or dormitory, passengers on the same plane, and so on. Other forms of direct contact include directly touching the target object, accompanying them, or sharing a taxi or an elevator with them. In another embodiment, the second preset condition may also be being present in the same field of view at the same time.
In an embodiment, the target object and/or the potential target object may be tracked using face recognition or human body recognition technology.
After the face or body of the detection object is captured, its detection index is measured; once it is judged to be a target object, it is tracked through the recognized face or body features.
Or, after the face or the human body of the detection object is captured, the face or the human body is compared with the face or the human body in the target object library, and if the comparison is successful, the detection object is tracked.
This embodiment adopts face detection technology and can detect multiple faces in a video frame simultaneously to obtain the face data of each detection object. Further, an optimal face can be selected from the face data, and the face region of that optimal face is used as the temperature measurement target. The optimal face can be selected comprehensively across multiple dimensions such as face quality score, face size, face angle and face occlusion ratio.
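One plausible way to combine the dimensions above into an "optimal face" choice is a weighted score. The weights, the 200-pixel saturation point and the field names are assumptions for this sketch; the application does not specify a scoring formula.

```python
def face_score(face):
    """Higher is better: reward quality and size, penalise angle and occlusion.

    face: dict with keys quality (0-1), size_px, yaw_deg, occlusion (0-1).
    Weights below are illustrative assumptions.
    """
    size_term = min(face["size_px"] / 200.0, 1.0)          # saturate at 200 px
    angle_term = 1.0 - min(abs(face["yaw_deg"]) / 90.0, 1.0)
    return (0.4 * face["quality"] + 0.2 * size_term
            + 0.2 * angle_term + 0.2 * (1.0 - face["occlusion"]))

def optimal_face(faces):
    """Pick the best face among all faces detected in one video frame."""
    return max(faces, key=face_score)
```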
In an embodiment, the detection objects or potential target objects may also be sorted by detection index, or classified by age, face angle, gender and face size (which may be characterized by, but is not limited to, interpupillary distance), with the detection objects within each class sorted by detection index. For example, all detection objects are sorted by temperature, or the potential target objects are sorted by temperature.
Alternatively, the detection objects are classified by face attributes and then sorted by temperature, including:
body temperature ranking of all analyzed adult males;
body temperature ranking of all analyzed adult females;
body temperature ranking of all analyzed boys;
body temperature ranking of all analyzed girls;
and body temperature ranking of people with similar face sizes.
Within each category, the N highest temperature persons are screened, and these persons may be considered target objects.
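The classify-then-rank scheme can be sketched as below. The grouping key (gender plus adult/child at an assumed cutoff of 18) and the record format are illustrative assumptions.

```python
from collections import defaultdict

def rank_within_classes(people):
    """Group by (gender, adult/child) and sort each group hottest-first.

    people: list of dicts with keys gender ('M'/'F'), age, temperature.
    """
    groups = defaultdict(list)
    for p in people:
        key = (p["gender"], "adult" if p["age"] >= 18 else "child")
        groups[key].append(p)
    return {key: sorted(group, key=lambda p: p["temperature"], reverse=True)
            for key, group in groups.items()}

def hottest_n(ranked, key, n):
    """The N highest-temperature persons within one class."""
    return ranked.get(key, [])[:n]
```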
As shown in fig. 4, a detection object judgment system includes:
an image acquisition module 41 for acquiring a plurality of types of images of the detection object;
a target area detection module 42 for determining a target detection area of the detection object based on the plurality of types of images;
a detection index obtaining module 43, configured to obtain a detection index of the target detection area;
and a target object determination module 44, configured to determine whether the detection object is the target object according to the detection index of the target detection area and the detection index threshold corresponding to the attribute associated with the target detection area.
The invention acquires multiple types of images of a detection object; determines a target detection area of the detection object based on those images; acquires a detection index of the target detection area; and judges whether the detection object is a target object according to the detection index of the target detection area and the detection index threshold corresponding to the attribute associated with the target detection area. With infrared imaging technology, persons with abnormal body temperature can be screened out as long as they appear in the thermal infrared image of the target detection area, without requiring them to stand at a specific temperature measurement point; this significantly reduces the miss rate and improves system throughput. Meanwhile, a relative-value human body temperature detection technique is used to detect persons with abnormal body temperature.
In one embodiment, the system further comprises:
the attribute acquisition module is used for acquiring the attribute associated with the target detection area; the attributes associated with different target detection areas correspond to different detection index thresholds;
and the detection index threshold value determining module is used for determining a detection index threshold value corresponding to the attribute according to the attribute associated with the target detection area.
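The data flow through the modules of fig. 4, extended with the attribute and threshold modules above, can be sketched as follows. The module internals are placeholders supplied by the caller; only the wiring follows the description, and all names are illustrative.

```python
class DetectionPipeline:
    """Minimal sketch of the module wiring; not the patented implementation."""

    def __init__(self, acquire, detect_area, measure, thresholds):
        self.acquire = acquire            # image acquisition module
        self.detect_area = detect_area    # target area detection module
        self.measure = measure            # detection index acquisition module
        self.thresholds = thresholds      # attribute -> detection index threshold

    def is_target(self, obj):
        images = self.acquire(obj)            # multiple image types
        area = self.detect_area(images)       # target detection area + attribute
        index = self.measure(area)            # e.g. face temperature
        threshold = self.thresholds[area["attribute"]]
        return index > threshold
```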
In one embodiment, the attribute associated with the target detection area includes at least one of: face attributes, environment attributes, distance attributes.
In this embodiment, the multiple types of images may include visible light images and infrared images. The visible light image may be acquired by a visible light image sensor and the infrared image by an infrared image sensor; alternatively, an RGB-IR image sensor (which receives RGB and IR components simultaneously) acquires the image, and an RGB-IR processing unit separates the received RGB-IR image data to obtain a synchronized RGB image (visible light image) and IR image (infrared image). In another embodiment, both types of images can be collected by a single device, for example an infrared temperature measurement probe capable of collecting the visible light image and the infrared image at the same time.
In this embodiment, the target area detection module includes:
the target part detection unit is used for detecting a target part of the visible light image to obtain a target part position;
and a target detection area determination unit, configured to determine the target detection area corresponding to the target part position in the infrared image of the detection object.
After the target detection area is determined, the detection index of the target detection area of the detection object can be obtained, and whether the detection object is a target object is judged according to that index. In this embodiment, different detection index thresholds may be set according to the attribute of the target detection area, and the comparison uses the threshold corresponding to that attribute. For example, if the attribute of the target detection area is A, the detection index threshold is B; if the attribute is A1, the threshold is B1, and so on.
In one embodiment, the target part includes a face, a forehead, a back of a hand, a neck, and a shoulder, and the target detection region includes a face region, a forehead region, a back of a hand region, a neck region, and a shoulder region. In the present embodiment, the detection index includes temperature.
The invention uses infrared images to acquire the temperature of the target detection area. From the principles of infrared thermometry: any object with a temperature above absolute zero (-273.15 ℃) continuously emits infrared (thermal) radiation. Infrared radiation is an electromagnetic wave with wavelengths in the range 0.7 µm to 1000 µm; it is invisible to the human eye, and the spectrum of the emitted radiation varies with temperature. When infrared radiation is absorbed, the temperature of the heat-sensitive material (detector) rises, and the thermal imaging device derives the corresponding temperature information from that rise. Typically, infrared thermometry acquires the temperature of a detection object, compares it with a set temperature threshold, for example 37.3 ℃, and then judges whether the person has a fever. However, many factors affect the accuracy of infrared temperature measurement, making the actual temperature of the target detection area inconsistent with the measured temperature, so comparing against a fixed threshold degrades accuracy. For this reason, the invention uses different temperature thresholds for comparison, i.e. different detection index thresholds according to different target detection area attributes.
In an embodiment, a description is given of performing temperature detection on a face area of a detection object by using a target detection area as the face area and using a detection index as a temperature, where a specific flow is shown in fig. 2, and includes:
s21, acquiring a visible light image of the detection object in the imaging area;
s22, acquiring an infrared image of the detection object in the imaging area;
s23, performing image space registration by a pattern recognition method; the registration ensures that the visible light image and the infrared image match spatially;
As shown in fig. 3, the ABCD rectangular area in the visible light image and the ABCD rectangular area in the infrared image are spatially registered; within the two registered areas, relative coordinates have a one-to-one correspondence. Therefore, the position in the infrared image of a target part detected in the visible light image can be determined by a coordinate conversion algorithm.
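For axis-aligned registered rectangles, the coordinate conversion reduces to mapping relative coordinates from one rectangle to the other. This sketch assumes rectangles given as (x, y, width, height); a real registration would typically use a full homography rather than this simplified rectangle-to-rectangle mapping.

```python
def map_point(pt, rect_vis, rect_ir):
    """Convert pixel pt from the visible-light rectangle to the infrared one.

    pt: (x, y); rect_vis, rect_ir: (x, y, width, height).
    """
    (x, y), (vx, vy, vw, vh), (ix, iy, iw, ih) = pt, rect_vis, rect_ir
    rel_x, rel_y = (x - vx) / vw, (y - vy) / vh   # relative coordinates
    return (ix + rel_x * iw, iy + rel_y * ih)
```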
S24, analyzing the visible light image with a face detection algorithm, detecting the face region in the image as the temperature measurement target point, and recording the relative position of that target point within the registered area;
s25, mapping the detected temperature measurement target point to the registered area of the infrared image, and acquiring the temperature of the face region there.
It will be appreciated that since infrared detection is an ongoing process, detection may be performed continuously over a period of time to obtain a plurality of infrared images. Therefore, when detecting the temperature of the face region, it is necessary to obtain images at the same time, that is, an infrared image at the current time and a visible light image at the current time. In the process of determining the face area, firstly, the face detection is carried out on the visible light image at the current moment to obtain the position of the face, and then the face position in the visible light image at the current moment is mapped to the infrared image of the detection object at the current moment to obtain the face area of the detection object in the infrared image at the current moment.
After the face area is determined, the temperature of the face area in the infrared image at the current moment can be measured to obtain the temperature of the face area.
It can be understood that the image of the face area is an infrared image, a corresponding relationship between color and temperature can be obtained in advance, and the temperature corresponding to the color in the face area is determined according to the corresponding relationship between the color and the temperature, so that the temperature of the face area can be determined.
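Such a pre-obtained color-to-temperature correspondence can be sketched as piecewise-linear interpolation over calibration points. The calibration table below is an assumption for the example; a real thermal camera supplies its own radiometric calibration.

```python
# Assumed calibration points: (pixel value, temperature in deg C).
CALIBRATION = [(0, 20.0), (128, 33.0), (255, 42.0)]

def pixel_to_temp(value):
    """Linearly interpolate temperature between adjacent calibration points."""
    for (v0, t0), (v1, t1) in zip(CALIBRATION, CALIBRATION[1:]):
        if v0 <= value <= v1:
            return t0 + (t1 - t0) * (value - v0) / (v1 - v0)
    raise ValueError("pixel value out of calibration range")

def face_region_temp(pixels):
    """Take the maximum temperature over the face-region pixels."""
    return max(pixel_to_temp(v) for v in pixels)
```

Taking the maximum over the face region is one plausible choice for fever screening; a mean or percentile could be used instead.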
In one embodiment, the face attributes include at least one of: age, face angle, gender. And setting different temperature thresholds according to different human face attributes. And if the face attribute is a face angle, determining a corresponding detection index threshold value, such as a temperature threshold value, according to the face angle.
Different temperature thresholds can be set according to different age groups because different ages have different temperatures.
In an embodiment, the system further comprises a tracking module for tracking the target object or/and the potential target object by using a face recognition technique or a body recognition technique.
An embodiment of the present application further provides an apparatus, which may include: one or more processors; and one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method of fig. 1. In practical applications, the device may be used as a terminal device, and may also be used as a server, where examples of the terminal device may include: a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, a vehicle-mounted computer, a desktop computer, a set-top box, an intelligent television, a wearable device, and the like.
The present application further provides a non-transitory readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to a device, the device may be caused to execute instructions (instructions) of steps included in the method in fig. 1 according to the present application.
Fig. 5 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application. As shown, the terminal device may include: an input device 1100, a first processor 1101, an output device 1102, a first memory 1103, and at least one communication bus 1104. The communication bus 1104 is used to implement communication connections between the elements. The first memory 1103 may include a high-speed RAM memory, and may also include a non-volatile storage NVM, such as at least one disk memory, and the first memory 1103 may store various programs for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the first processor 1101 may be, for example, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the first processor 1101 is coupled to the input device 1100 and the output device 1102 through a wired or wireless connection.
Optionally, the input device 1100 may include a variety of input devices, such as at least one of a user-oriented user interface, a device-oriented device interface, a software programmable interface, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware plug-in interface (e.g., a USB interface, a serial port, etc.) for data transmission between devices; optionally, the user-facing user interface may be, for example, a user-facing control key, a voice input device for receiving voice input, and a touch sensing device (e.g., a touch screen with a touch sensing function, a touch pad, etc.) for receiving user touch input; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip; the output devices 1102 may include output devices such as a display, audio, and the like.
In this embodiment, the processor of the terminal device includes a module for executing functions of each module in each device, and specific functions and technical effects may refer to the foregoing embodiments, which are not described herein again.
Fig. 6 is a schematic hardware structure diagram of a terminal device according to an embodiment of the present application. FIG. 6 is a specific embodiment of the implementation of FIG. 5. As shown, the terminal device of the present embodiment may include a second processor 1201 and a second memory 1202.
The second processor 1201 executes the computer program code stored in the second memory 1202 to implement the method described in fig. 1 in the above embodiment.
The second memory 1202 is configured to store various types of data to support operations at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, videos, and so forth. The second memory 1202 may include a Random Access Memory (RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
Optionally, the second processor 1201 is provided in a processing component 1200. The terminal device may further include: a communication component 1203, a power component 1204, a multimedia component 1205, a voice component 1206, an input/output interface 1207, and/or a sensor component 1208. The specific components included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 1200 generally controls the overall operation of the terminal device. The processing component 1200 may include one or more second processors 1201 to execute instructions to perform all or part of the steps of the data processing method described above. Further, the processing component 1200 may include one or more modules that facilitate interaction between the processing component 1200 and other components. For example, the processing component 1200 may include a multimedia module to facilitate interaction between the multimedia component 1205 and the processing component 1200.
The power component 1204 provides power to the various components of the terminal device. The power component 1204 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia components 1205 include a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The voice component 1206 is configured to output and/or input voice signals. For example, the voice component 1206 includes a microphone (MIC) configured to receive external voice signals when the terminal device is in an operating mode, such as a voice recognition mode. The received voice signal may further be stored in the second memory 1202 or transmitted via the communication component 1203. In some embodiments, the voice component 1206 further includes a speaker for outputting voice signals.
The input/output interface 1207 provides an interface between the processing component 1200 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 1208 includes one or more sensors for providing various aspects of status assessment for the terminal device. For example, the sensor component 1208 may detect an open/closed state of the terminal device, relative positioning of the components, presence or absence of user contact with the terminal device. The sensor assembly 1208 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 1208 may also include a camera or the like.
The communication component 1203 is configured to facilitate communications between the terminal device and other devices in a wired or wireless manner. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot therein for inserting a SIM card therein, so that the terminal device may log onto a GPRS network to establish communication with the server via the internet.
As can be seen from the above, the communication component 1203, the voice component 1206, the input/output interface 1207 and the sensor component 1208 referred to in the embodiment of fig. 6 can be implemented as the input device in the embodiment of fig. 5.
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those skilled in the art without departing from the spirit and technical concepts disclosed by the present invention shall be covered by the claims of the present invention.

Claims (34)

1. A detected object determination method, characterized by comprising:
acquiring multiple types of images of a detection object;
determining a target detection area of the detection object based on the plurality of types of images;
acquiring a detection index of the target detection area;
and determining whether the detection object is a target object according to the detection index of the target detection area and a detection index threshold corresponding to an attribute associated with the target detection area.
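The judgment step of claim 1 can be illustrated with a short sketch. This is a hypothetical illustration, not the patented implementation: the area attributes and threshold values are invented, and the detection index is taken to be a measured temperature, as in claim 11.

```python
# Illustrative per-area detection index thresholds (degrees Celsius).
# These values and attribute names are assumptions, not from the patent.
THRESHOLDS = {
    "forehead": 37.3,
    "neck": 37.0,
    "back_of_hand": 36.5,
}

def is_target_object(detection_index: float, area_attribute: str) -> bool:
    """Return True when the detection index of the target detection area
    exceeds the threshold associated with that area's attribute."""
    threshold = THRESHOLDS[area_attribute]
    return detection_index > threshold

print(is_target_object(37.8, "forehead"))  # True: elevated forehead reading
print(is_target_object(36.9, "forehead"))  # False: below the forehead threshold
```

The key point is that the comparison threshold is not global but is looked up from the attribute associated with the specific target detection area.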
2. The detection object determination method according to claim 1, further comprising:
acquiring the attribute associated with the target detection area; the attributes associated with different target detection areas correspond to different detection index thresholds;
and determining a detection index threshold corresponding to the attribute according to the attribute associated with the target detection area.
3. The detected object determination method according to claim 2, wherein the attribute associated with the target detection region includes at least one of: face attributes, environment attributes, distance attributes.
4. The detected object determination method according to claim 3, wherein if the attribute is a distance attribute, a corresponding detection index threshold is determined according to a distance between the target detection area and an image processing apparatus.
5. The detected object determination method according to claim 3, wherein if the attribute is an environmental attribute, a corresponding detection index threshold is determined according to the ambient temperature of the environment in which the target detection region is located.
6. The detected object determination method according to claim 3, wherein the face attribute includes at least one of: age, face angle, gender.
7. The detected object determination method according to claim 6, wherein a corresponding detection index threshold is determined according to the face angle.
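Claims 4 to 7 make the detection index threshold depend on attributes such as distance, ambient temperature, and face angle. The following sketch shows one hypothetical way such attribute-dependent threshold selection could look; the base value and all compensation offsets are invented for illustration and would need calibration in a real system.

```python
BASE_THRESHOLD_C = 37.3  # assumed baseline temperature threshold (illustrative)

def threshold_for(distance_m: float, ambient_c: float,
                  face_angle_deg: float) -> float:
    """Lower the threshold when the subject is far away, the environment is
    cold, or the face is turned, since all three tend to make the measured
    skin temperature read lower than the true value."""
    t = BASE_THRESHOLD_C
    t -= 0.1 * max(0.0, distance_m - 1.0)  # distance attribute (cf. claim 4)
    if ambient_c < 15.0:                   # environmental attribute (cf. claim 5)
        t -= 0.2
    if abs(face_angle_deg) > 30.0:         # face angle (cf. claims 6-7)
        t -= 0.1
    return round(t, 2)

print(threshold_for(1.0, 25.0, 0.0))   # 37.3: near, warm room, frontal face
print(threshold_for(3.0, 10.0, 45.0))  # 36.8: far, cold room, turned face
```

The direction of each correction (thresholds relaxed downward under unfavorable measurement conditions) is an assumption; the patent only states that different attributes correspond to different thresholds.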
8. The detected object determination method according to claim 1, wherein the detected object is a target object when the detection index of the detected object satisfies a first preset condition.
9. The method of claim 8, wherein a second preset condition is set based on the target object, and when the detection object meets the second preset condition, the detection object is a potential target object.
10. The detection object determining method according to claim 9, further comprising:
and tracking the target object or/and the potential target object by adopting a face recognition technology or a human body recognition technology.
11. The detection object determination method according to claim 1, wherein the detection index includes a temperature.
12. The detected object determination method according to claim 1, wherein the plurality of types of images include a visible light image and an infrared image.
13. The detected object determination method according to claim 12, wherein said determining a target detection area of the detected object based on the multi-type images includes:
detecting a target part in the visible light image to obtain a target part position;
and determining a target detection area corresponding to the target part position in the infrared image of the detection object.
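The two steps of claim 13 — locating the target part in the visible-light image and mapping that position into the infrared image — can be sketched as follows. The scale-and-translation registration and all coordinates here are assumptions for illustration; a real dual-camera system would use a calibrated mapping between the two sensors.

```python
from typing import NamedTuple

class Box(NamedTuple):
    """Axis-aligned bounding box in pixel coordinates."""
    x: int
    y: int
    w: int
    h: int

def map_to_infrared(box: Box, scale: float, dx: int, dy: int) -> Box:
    """Map a bounding box from visible-light coordinates into infrared
    coordinates under an assumed scale-and-translation registration."""
    return Box(int(box.x * scale) + dx, int(box.y * scale) + dy,
               int(box.w * scale), int(box.h * scale))

# Hypothetical face box detected in the visible-light image.
face_in_visible = Box(200, 120, 80, 80)
# The infrared sensor is assumed to be half the resolution, slightly offset.
roi = map_to_infrared(face_in_visible, scale=0.5, dx=10, dy=5)
print(roi)  # Box(x=110, y=65, w=40, h=40)
```

The temperature reading would then be taken from the pixels of `roi` in the infrared frame.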
14. The detection target determination method according to claim 13, wherein the target portion includes a face, a forehead, a back of a hand, a neck, and a shoulder, and the target detection region includes a face region, a forehead region, a back of a hand region, a neck region, and a shoulder region.
15. The detected object determination method according to claim 9, wherein the detected objects or potential target objects are sorted according to detection indexes; or classifying the detection objects according to age, face angle, gender and face size, and sequencing the detection objects in each classification according to detection indexes.
16. The detection object determination method according to claim 1, further comprising:
and when the target object is detected, sending an alarm prompt.
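Claims 15 and 16 describe sorting detection objects by detection index and issuing an alarm when a target object is detected. A minimal sketch, with invented record fields and values:

```python
# Hypothetical detection records; field names are illustrative assumptions.
detections = [
    {"id": "a", "temperature": 36.6, "is_target": False},
    {"id": "b", "temperature": 38.1, "is_target": True},
    {"id": "c", "temperature": 37.4, "is_target": True},
]

# Highest detection index first, so likely fevers appear at the top (claim 15).
ranked = sorted(detections, key=lambda d: d["temperature"], reverse=True)

# Collect IDs that should trigger an alarm prompt (claim 16).
alarms = [d["id"] for d in ranked if d["is_target"]]

print([d["id"] for d in ranked])  # ['b', 'c', 'a']
print(alarms)                     # ['b', 'c']
```

Claim 15 also allows grouping by age, face angle, gender, and face size before sorting; that would simply partition `detections` into classes and apply the same sort within each class.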
17. A detected object determination system characterized by comprising:
the image acquisition module is used for acquiring various types of images of the detection object;
a target area detection module for determining a target detection area of the detection object based on the plurality of types of images;
the detection index acquisition module is used for acquiring the detection index of the target detection area;
and the target object judging module is used for judging whether the detection object is the target object according to the detection index of the target detection area and the detection index threshold corresponding to the attribute associated with the target detection area.
18. The detected object determination system according to claim 17, further comprising:
the attribute acquisition module is used for acquiring the attribute associated with the target detection area; the attributes associated with different target detection areas correspond to different detection index thresholds;
and the detection index threshold value determining module is used for determining a detection index threshold value corresponding to the attribute according to the attribute associated with the target detection area.
19. The detected object determination system according to claim 18, wherein the attribute associated with the target detection region includes at least one of: face attributes, environment attributes, distance attributes.
20. The detected object determination system according to claim 19, wherein in the case of a distance attribute, a corresponding detection index threshold value is determined based on a distance between the target detection region and an image processing apparatus.
21. The detected object determination system according to claim 19, wherein if the attribute is an environmental attribute, a corresponding detection index threshold is determined according to the ambient temperature of the environment in which the target detection region is located.
22. The detected object determination system of claim 19, wherein the face attributes include at least one of: age, face angle, gender.
23. The detected object determination system of claim 22, wherein a corresponding detection index threshold is determined according to the face angle.
24. The detected object determination system according to claim 17, wherein the detected object is a target object when the detection index of the detected object satisfies a first preset condition.
25. The system of claim 24, wherein a second preset condition is set based on the target object, and when the detection object meets the second preset condition, the detection object is a potential target object.
26. The detected object determination system according to claim 25, further comprising:
and the tracking module is used for tracking the target object or/and the potential target object by adopting a face recognition technology or a human body recognition technology.
27. The detection object determination system according to claim 17, wherein the detection index includes a temperature.
28. The detected object determination system according to claim 17, wherein the plurality of types of images include a visible light image and an infrared image.
29. The detected object determination system according to claim 28, wherein the target region detection module includes:
a target part detection unit, configured to detect a target part in the visible light image to obtain a target part position;
and a target area determination unit, configured to determine a target detection area corresponding to the target part position in the infrared image of the detection object.
30. The system of claim 29, wherein the target region includes a face, a forehead, a back of a hand, a neck, and a shoulder, and the target detection region includes a face region, a forehead region, a back of a hand region, a neck region, and a shoulder region.
31. The detected object determination system of claim 25, wherein the detected objects or potential target objects are ranked according to detection criteria; or classifying the detection objects according to age, face angle, gender and face size, and sequencing the detection objects in each classification according to detection indexes.
32. The detected object determination system according to claim 17, further comprising:
and the alarm module is used for sending out an alarm prompt when the target object is detected.
33. An apparatus, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method of one or more of claims 1-16.
34. One or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause an apparatus to perform the method of one or more of claims 1-16.
CN202010116863.0A 2020-02-25 2020-02-25 Detection object judgment method, system, machine readable medium and equipment Pending CN111297337A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010116863.0A CN111297337A (en) 2020-02-25 2020-02-25 Detection object judgment method, system, machine readable medium and equipment

Publications (1)

Publication Number Publication Date
CN111297337A 2020-06-19

Family

ID=71152950

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010116863.0A Pending CN111297337A (en) 2020-02-25 2020-02-25 Detection object judgment method, system, machine readable medium and equipment

Country Status (1)

Country Link
CN (1) CN111297337A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1530069A (en) * 1996-06-12 2004-09-22 精工爱普生株式会社 Thermometer
CN1825075A (en) * 2005-02-25 2006-08-30 安捷伦科技有限公司 System and method for detecting thermal anomalies
CN102147835A (en) * 2010-11-26 2011-08-10 中华人民共和国深圳出入境检验检疫局 Driver body temperature automatic measurement system applied to port lanes and implementation method thereof
CN103565422A (en) * 2013-11-06 2014-02-12 江苏大学 Medical infrared thermometer and measurement compensating method of medical infrared thermometer
CN104873172A (en) * 2015-05-11 2015-09-02 京东方科技集团股份有限公司 Apparatus having physical examination function, and method, display apparatus and system thereof
CN105125181A (en) * 2015-09-23 2015-12-09 广东小天才科技有限公司 Method and device for measuring body temperature of user
CN109389036A (en) * 2018-08-29 2019-02-26 中国建设银行股份有限公司 A kind of information-pushing method based on AR, device, terminal and readable medium
CN110160670A (en) * 2019-05-05 2019-08-23 深圳中集智能科技有限公司 Body temperature detection device
CN110276301A (en) * 2019-06-24 2019-09-24 泰康保险集团股份有限公司 Face identification method, device, medium and electronic equipment
CN110555819A (en) * 2019-08-20 2019-12-10 中国石油大学(北京) Equipment monitoring method, device and equipment based on infrared and visible light image fusion
CN111310692A (en) * 2020-02-25 2020-06-19 云从科技集团股份有限公司 Detection object management method, system, machine readable medium and equipment

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111898462A (en) * 2020-07-08 2020-11-06 浙江大华技术股份有限公司 Object attribute processing method and device, storage medium and electronic device
CN111898462B (en) * 2020-07-08 2023-04-07 浙江大华技术股份有限公司 Object attribute processing method and device, storage medium and electronic device
CN112001953A (en) * 2020-07-27 2020-11-27 浙江大华技术股份有限公司 Temperature detection method, device, equipment and computer equipment
WO2022022425A1 (en) * 2020-07-27 2022-02-03 Zhejiang Dahua Technology Co., Ltd. Systems and methods for temperature measurement
EP4162446A4 (en) * 2020-07-27 2023-08-09 Zhejiang Dahua Technology Co., Ltd. Systems and methods for temperature measurement
CN113084776A (en) * 2021-03-19 2021-07-09 上海工程技术大学 Intelligent epidemic prevention robot and system based on vision and multi-sensor fusion

Similar Documents

Publication Publication Date Title
CN111310692B (en) Detection object management method, system, machine readable medium and equipment
WO2021159682A1 (en) Abnormal object management method and system, machine-readable medium, and device
US11650659B2 (en) User input processing with eye tracking
WO2021159683A1 (en) Exception object determination method and system, and machine readable medium and device
US9443144B2 (en) Methods and systems for measuring group behavior
CN110916620A (en) Body temperature measuring method and terminal
CN111297337A (en) Detection object judgment method, system, machine readable medium and equipment
GB2498299B (en) Evaluating an input relative to a display
US9508005B2 (en) Method for warning a user about a distance between user' s eyes and a screen
CN110196103A (en) Thermometry and relevant device
KR102523871B1 (en) Terminal for measuring skin and method for controlling the same
US9013591B2 (en) Method and system of determing user engagement and sentiment with learned models and user-facing camera images
CN112001886A (en) Temperature detection method, device, terminal and readable storage medium
US9361705B2 (en) Methods and systems for measuring group behavior
Pascali et al. Face morphology: Can it tell us something about body weight and fat?
US11730372B2 (en) Accessory device and imaging device
EP4044907A1 (en) Automatic pressure ulcer measurement
CN109419498A (en) A kind of multifunctional human sensory perceptual system
CN112132056A (en) Living body detection method, system, equipment and medium
TW202002608A (en) Operation method for security monitoring system
CN111695509A (en) Identity authentication method, identity authentication device, machine readable medium and equipment
EP2879018A1 (en) Estimating gaze from un-calibrated eye measurement points
TWI699703B (en) Detection system with recognition label compensation function
KR20170059347A (en) Multi-function physical state measuring devices and systems
US10942575B2 (en) 2D pointing indicator analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200619)