US20240019309A1 - Remaining thermal trace extraction method, remaining thermal trace extraction apparatus and program - Google Patents
- Publication number
- US20240019309A1 (application US 18/248,295)
- Authority
- US
- United States
- Prior art keywords
- image
- heat trace
- differential
- thermal
- trace area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/48—Thermography; Techniques using wholly visual means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/0022—Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
- G01J5/0025—Living bodies
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/02—Constructional details
- G01J5/08—Optical arrangements
- G01J5/0859—Sighting arrangements, e.g. cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/803—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J2005/0077—Imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/80—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for detecting, monitoring or modelling epidemics or pandemics, e.g. flu
Definitions
- FIG. 1 is a schematic diagram of visible images and thermal images obtained by simultaneously photographing the same place.
- FIG. 2 is a diagram showing an example of images obtained according to background subtraction.
- FIG. 3 is a diagram showing a hardware configuration example of a heat trace area extraction apparatus 10 according to an embodiment of the present invention.
- FIG. 4 is a diagram showing a functional configuration example of the heat trace area extraction apparatus 10 according to an embodiment of the present invention.
- FIG. 5 is a flowchart for describing an example of a processing procedure executed by the heat trace area extraction apparatus 10 .
- FIG. 6 is a schematic diagram showing an output example of an extraction result of a heat trace area.
- a device, a method, and a program for detecting a place touched by a person using a thermal image in order to help sterilize or disinfect viruses are disclosed. Since humans are homeothermic animals and heat emanates from their hands and feet, heat remains in a contact place for a certain period of time after a person touches something. For example, a method of exploiting such heat traces to decode the passcode of a smartphone has been reported (“Yomna Abdelrahman, Mohamed Khamis, Stefan Schneegass, and Florian Alt. 2017. Stay Cool! Understanding Thermal Attacks on Mobile-based User Authentication. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), pp. 3751-3763, 2017”).
- Heat traces remain not only on the screen of a smartphone but also on various places such as desks and walls. That is, if a heat trace is identified on the basis of a video (thermal image) of a thermal camera, a place touched by a person indoors or the like can be detected precisely.
- a heat trace area can be extracted according to background subtraction with a thermal image from before a person touches as a background.
- However, the human body area is also extracted along with a heat trace in this method. Therefore, in the present embodiment, a visible image is acquired simultaneously with a thermal image, and a heat trace area is extracted by comparing the thermal image with the visible image.
- background subtraction is performed for each of a visible image and a thermal image, and a heat trace area is extracted from a difference in results of background subtraction. Since a heat trace cannot be observed in a visible image (that is, with the naked eye), the heat trace cannot be extracted even if background subtraction is performed for the visible image with a visible image from before a person touches as a background. On the other hand, when a person is present on the spot, if background subtraction is performed with a visible image captured in a state where the person is not present as a background, the area of the person is extracted. That is, when an area extracted according to background subtraction in a thermal image is similarly extracted in a visible image, it can be ascertained that the area is not a heat trace.
- an area extracted in the thermal image according to background subtraction and not extracted in the visible image is highly likely to be a heat trace.
- a heat trace area extracted by such a method is visualized, and a place touched by a person is transmitted to a user.
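The rule above can be sketched at the pixel level. The sketch below is a simplified illustration (the embodiment described later compares labeled regions rather than individual pixels), and the function name and toy masks are hypothetical:

```python
import numpy as np

def extract_heat_trace_mask(diff_thermal: np.ndarray,
                            diff_visible: np.ndarray) -> np.ndarray:
    # A pixel is a heat-trace candidate when it changed relative to
    # the background in the thermal image but not in the visible image.
    return diff_thermal & ~diff_visible

# Toy 1x4 masks: the first pixel changed in both images (human body),
# the second changed only in the thermal image (heat trace).
diff_thermal = np.array([[True, True, False, False]])
diff_visible = np.array([[True, False, False, False]])
mask = extract_heat_trace_mask(diff_thermal, diff_visible)
# mask is [[False, True, False, False]]: only the thermal-only pixel survives
```

Only the second pixel survives, which corresponds to the time-t3 case discussed below.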
- Simultaneous acquisition of a thermal image and a visible image may be performed using a device such as a sensor node (“Yoshinari Shirai, Yasue Kishino, Takayuki Suyama, Shin Mizutani: PASNIC: a thermal based privacy-aware sensor node for image capturing, UbiComp/ISWC '19 Adjunct, pp. 202-205, 2019”) including a visible light camera and a thermal camera.
- FIG. 1 is a schematic diagram of images obtained by simultaneously photographing a state in which a hand touches a door with a handle using a visible light camera and a thermal camera.
- (a) and (a′) are a visible image and a thermal image at a time t 1 (before the hand touches the door), (b) and (b′) are a visible image and a thermal image at a time t 2 (in a state in which the hand is touching the door), and (c) and (c′) are a visible image and a thermal image at a time t 3 (after the hand touches the door).
- the temperature of the touched place increases as shown in FIG. 1 ( c ′).
- FIG. 2 is a diagram showing an example of images obtained according to background subtraction.
- differential images at the time t 2 and the time t 3 when the image at the time t 1 in FIG. 1 is used as a background image are shown.
- the shape of the arm is extracted as a differential area for both the visible image and the thermal image at the time t 2 , whereas a part touching the door is extracted as a differential area only in the thermal image at the time t 3 .
- a differential area extracted by background subtraction for the thermal image at the time t 3 may be disinfected.
- a differential area extracted by background subtraction for the thermal image at the time t 2 includes a part that does not touch the door.
- the differential area extracted in the thermal image at the time t 2 is actually an area where a human body is present and is not an area of a heat trace remaining after the door is actually touched. From the viewpoint of disinfection, the differential area extracted at the time t 3 may be specified, and the differential area extracted by background subtraction at the time t 2 is not necessary.
- Since the shape of the arm is extracted even in the visible image at the time t2, it is determined that the differential area extracted in the thermal image at the time t2 is not a heat trace area (that is, not a part touched by the person). On the other hand, since the area extracted by background subtraction of the thermal image at the time t3 is not extracted in the visible image, it is determined that the differential area extracted in the thermal image is a heat trace area (that is, the part touched by the person). When the system presents information indicating the heat trace area extracted on the basis of such determination, a user who has viewed the information can efficiently disinfect the part touched by the person.
- FIG. 3 is a diagram showing a hardware configuration example of the heat trace area extraction apparatus 10 according to an embodiment of the present invention.
- the heat trace area extraction apparatus 10 in FIG. 3 includes a drive device 100 , an auxiliary storage device 102 , a memory device 103 , a CPU 104 , an interface device 105 , and the like which are connected via a bus B.
- a program that realizes processing of the heat trace area extraction apparatus 10 is provided by a recording medium 101 such as a CD-ROM.
- the program is installed from the recording medium 101 to the auxiliary storage device 102 via the drive device 100 .
- the program may not necessarily be installed from the recording medium 101 and may be downloaded from another computer via a network.
- the auxiliary storage device 102 stores the installed program and stores necessary files, data, and the like.
- the memory device 103 reads the program from the auxiliary storage device 102 and stores the program when an instruction for starting the program is issued.
- the CPU 104 executes functions relevant to the heat trace area extraction apparatus 10 according to the program stored in the memory device 103 .
- the interface device 105 is used as an interface for connection to a network.
- FIG. 4 is a diagram showing a functional configuration example of the heat trace area extraction apparatus 10 according to the embodiment of the present invention.
- the heat trace area extraction apparatus 10 includes a visible image acquisition unit 11 , a background visible image generation unit 12 , a differential visible image generation unit 13 , a thermal image acquisition unit 14 , a background thermal image generation unit 15 , a differential thermal image generation unit 16 , a heat trace area extraction unit 17 , a heat trace area output unit 18 , and the like. These units are realized by processing caused by one or more programs installed in the heat trace area extraction apparatus 10 to be executed by the CPU 104 .
- the heat trace area extraction apparatus 10 is connected to each of a visible light camera 21 and a thermal camera 22 such that images can be input from the visible light camera 21 and the thermal camera 22 .
- the visible light camera 21 and the thermal camera 22 are installed so as to be able to photograph the same place (the same certain range). That is, the present embodiment is based on the assumption that the photographing area of the visible light camera 21 and the photographing area of the thermal camera 22 coincide with each other in units of pixels. When the photographing ranges of the visible light camera 21 and the thermal camera 22 do not coincide with each other, calibration may be performed in advance to ascertain the correspondence between pixels of the visible image and the thermal image.
- FIG. 5 is a flowchart for describing an example of a processing procedure executed by the heat trace area extraction apparatus 10 .
- In step S101, the visible image acquisition unit 11 acquires a visible image input from the visible light camera 21.
- Similarly, the thermal image acquisition unit 14 acquires a thermal image input from the thermal camera 22.
- acquisition of the visible image by the visible image acquisition unit 11 and acquisition of the thermal image by the thermal image acquisition unit 14 may be performed simultaneously or may not be performed simultaneously. If simultaneous acquisition is not performed, some frames of a camera having a higher frame rate may be ignored in accordance with a camera having a lower frame rate.
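The idea of ignoring surplus frames of the faster camera can be sketched by pairing each frame of the slower stream with its nearest-in-time frame of the faster stream. The function name, the timestamp representation, and the tolerance value are assumptions for illustration, not part of the embodiment:

```python
import bisect

def pair_frames(visible_ts, thermal_ts, tolerance=0.05):
    """Pair frames of two streams by timestamp (seconds, sorted).
    The stream with fewer frames is treated as the reference; surplus
    frames of the faster stream are simply ignored."""
    slow, fast, swapped = visible_ts, thermal_ts, False
    if len(thermal_ts) < len(visible_ts):
        slow, fast, swapped = thermal_ts, visible_ts, True
    pairs = []
    for t in slow:
        i = bisect.bisect_left(fast, t)
        # Nearest neighbour among the frames just before/after t.
        candidates = [c for c in (i - 1, i) if 0 <= c < len(fast)]
        best = min(candidates, key=lambda c: abs(fast[c] - t))
        if abs(fast[best] - t) <= tolerance:
            # Always return (visible_time, thermal_time) pairs.
            pairs.append((fast[best], t) if swapped else (t, fast[best]))
    return pairs

# A 30 fps visible stream paired with a 10 fps thermal stream:
pairs = pair_frames([0.0, 0.033, 0.066, 0.1], [0.0, 0.1])
```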
- the visible image acquisition unit 11 transmits the acquired visible image to the background visible image generation unit 12
- the thermal image acquisition unit 14 transmits the acquired thermal image to the background thermal image generation unit 15 .
- the background visible image generation unit 12 stores the visible image transmitted from the visible image acquisition unit 11 in the auxiliary storage device 102
- the background thermal image generation unit 15 stores the thermal image transmitted from the thermal image acquisition unit 14 in the auxiliary storage device 102 (S 102 ).
- Steps S 101 and S 102 are repeated until a predetermined time T 1 elapses.
- the predetermined time T 1 may be a period in which one or more visible images and one or more thermal images are accumulated in the auxiliary storage device 102 .
- In step S104, the background visible image generation unit 12 generates a background image of the photographing range (referred to as a “background visible image” hereinafter) on the basis of the visible image group accumulated in the auxiliary storage device 102 during the predetermined time T1.
- Similarly, the background thermal image generation unit 15 generates a background image of the photographing range (referred to as a “background thermal image” hereinafter) on the basis of the thermal image group accumulated in the auxiliary storage device 102 during the predetermined time T1.
- Each background image (the background visible image and the background thermal image) may be generated, for example, by taking the median of pixel values at each pixel position over the accumulated image group (the per-channel RGB median in the case of the visible image).
- the predetermined time T 1 corresponds to the time t 1 in FIG. 1 . That is, the time T 1 may not be an instantaneous timing.
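The per-pixel median background generation can be sketched as follows; the function name is hypothetical and the stack shapes are one plausible convention:

```python
import numpy as np

def build_background(frames) -> np.ndarray:
    """Per-pixel median over the frames accumulated during T1.
    For a visible stack of shape (N, H, W, 3) this is the per-channel
    RGB median; for a thermal stack (N, H, W) it is the median
    temperature reading at each pixel."""
    stack = np.stack(frames, axis=0)
    return np.median(stack, axis=0).astype(stack.dtype)

# A transient bright frame (e.g., a person passing through) is
# suppressed as long as it occupies each pixel for less than half of T1.
frames = [np.full((2, 2), 10), np.full((2, 2), 10), np.full((2, 2), 200)]
bg = build_background(frames)
```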
- step S 105 and subsequent steps are executed.
- Steps S 101 to S 104 and step S 105 may not be executed synchronously.
- step S 105 and subsequent steps may be started according to an instruction different from the execution instruction for steps S 101 to S 104 .
- step S 105 the visible image acquisition unit 11 and the thermal image acquisition unit 14 wait for the elapse of a predetermined time T 2 .
- the predetermined time T 2 is, for example, an elapsed time from the time t 2 to the time t 3 in FIG. 2 .
- the visible image acquisition unit 11 acquires a visible image (referred to as a “target visible image” hereinafter) input from the visible light camera 21
- the thermal image acquisition unit 14 acquires a thermal image (referred to as a “target thermal image” hereinafter) input from the thermal camera 22 (S 106 ). It is desirable that the target visible image and the target thermal image be images captured simultaneously (or almost simultaneously).
- the differential visible image generation unit 13 compares the background visible image generated by the background visible image generation unit 12 with the target visible image according to a background subtraction method and extracts a differential area with respect to the background visible image (area different from the background visible image) from the target visible image to generate a differential image representing a difference (referred to as a “differential visible image” hereinafter).
- the differential thermal image generation unit 16 compares the background thermal image generated by the background thermal image generation unit 15 with the target thermal image according to the background subtraction method and extracts a differential area with respect to the background thermal image (area different from the background thermal image) from the target thermal image, thereby generating a differential image representing a difference (referred to as a “differential thermal image” hereinafter).
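The background subtraction step can be sketched as a simple absolute-difference threshold. Real systems often use more robust background models, and the threshold value here is an assumption that would be tuned per camera:

```python
import numpy as np

def differential_image(background: np.ndarray, target: np.ndarray,
                       threshold: float) -> np.ndarray:
    """Binary differential image: a pixel belongs to the differential
    area when it deviates from the background by more than `threshold`.
    For a multi-channel (visible) image the maximum deviation across
    channels is used."""
    diff = np.abs(target.astype(np.float64) - background.astype(np.float64))
    if diff.ndim == 3:  # (H, W, C) visible image
        diff = diff.max(axis=2)
    return diff > threshold

# One pixel of the target deviates strongly from a flat background:
mask = differential_image(np.zeros((2, 2)),
                          np.array([[50, 0], [0, 0]]),
                          threshold=10)
```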
- each differential image is sent to the heat trace area extraction unit 17 .
- the heat trace area extraction unit 17 compares the differential visible image with the differential thermal image and extracts a heat trace area in the imaging range (S 108 ).
- the heat trace area extraction unit 17 first performs labeling (extraction of connection area) on each binary image which is the differential visible image or the differential thermal image. Next, the heat trace area extraction unit 17 compares degrees of overlap between one or more differential areas obtained by labeling of the differential thermal image (referred to as “differential thermal areas” hereinafter) and one or more differential areas obtained by labeling of the differential visible image (referred to as “differential visible areas” hereinafter).
- the heat trace area extraction unit 17 counts, in units of pixels, whether or not the differential areas to be compared match, and determines that the two compared differential areas are similar when the matching rate is equal to or greater than a certain threshold value.
- the heat trace area extraction unit 17 extracts a differential thermal area which is not similar to any differential visible area as a heat trace area.
- the heat trace area extraction unit 17 transmits information indicating the heat trace area and the background visible image to the heat trace area output unit 18 .
- the heat trace area extraction unit 17 may generate a binary image in which the heat trace area part is white and the other part is black and transmit the binary image to the heat trace area output unit 18 as the information indicating the heat trace area. It should be noted that determination of similarity between areas is an active research topic in pattern matching and the like, and the present embodiment is not limited to a specific method.
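The labeling and matching-rate comparison can be sketched as follows. The 4-connected labeling, the definition of the matching rate as the fraction of thermal-area pixels covered by a visible area, and the 0.5 threshold are all illustrative assumptions:

```python
import numpy as np
from collections import deque

def label_regions(mask: np.ndarray):
    """4-connected component labeling of a boolean mask; returns a
    list of pixel-coordinate sets, one per connected differential area."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    regions = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                region, queue = set(), deque([(y, x)])
                seen[y, x] = True
                while queue:
                    cy, cx = queue.popleft()
                    region.add((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                regions.append(region)
    return regions

def heat_trace_areas(diff_thermal, diff_visible, match_threshold=0.5):
    """Keep each differential thermal area that is not similar to any
    differential visible area (matching rate below the threshold)."""
    visible_areas = label_regions(diff_visible)
    traces = []
    for area in label_regions(diff_thermal):
        rates = [len(area & v) / len(area) for v in visible_areas]
        if all(r < match_threshold for r in rates):
            traces.append(area)
    return traces

# Two thermal blobs; only the first also appears in the visible image.
diff_thermal = np.array([[1, 1, 0, 0], [0, 0, 0, 0],
                         [0, 0, 0, 0], [0, 0, 0, 1]], dtype=bool)
diff_visible = np.array([[1, 1, 0, 0], [0, 0, 0, 0],
                         [0, 0, 0, 0], [0, 0, 0, 0]], dtype=bool)
traces = heat_trace_areas(diff_thermal, diff_visible)
```

The blob shared by both images (the human body) is discarded; only the thermal-only blob remains as a heat trace area.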
- the heat trace area output unit 18 outputs information indicating the extraction result of the heat trace area such that a user can check the information (S 109 ).
- the heat trace area output unit 18 may output an image obtained by combining white pixels of the binary image indicating the heat trace area on the background visible image.
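The compositing step can be sketched by painting the heat-trace pixels onto a copy of the background visible image; as noted below, the color choice is arbitrary:

```python
import numpy as np

def overlay_heat_trace(background_visible: np.ndarray,
                       trace_mask: np.ndarray,
                       color=(255, 255, 255)) -> np.ndarray:
    """Paint the heat-trace pixels of a binary mask onto a copy of the
    background visible image (H, W, 3). The color is for convenience."""
    out = background_visible.copy()
    out[trace_mask] = color
    return out

bg = np.zeros((2, 2, 3), dtype=np.uint8)
trace_mask = np.array([[True, False], [False, False]])
out = overlay_heat_trace(bg, trace_mask)
```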
- the output form is not limited to a specific form. For example, display on a display device, storage in the auxiliary storage device 102, or transmission to a user terminal via a network may be performed.
- Thereafter, step S105 and subsequent steps are executed again.
- step S109 may be executed after steps S105 to S108 are repeatedly executed a plurality of times. In this case, the heat trace areas extracted over the plurality of repetitions can be output collectively.
- FIG. 6 is a schematic diagram showing an output example of a heat trace area extraction result.
- a part of the background visible image touched by a hand of a person is painted black (black is used here for convenience; the actual color may be another color such as white). The user can recognize the part as a heat trace area.
- steps S 101 to S 103 may be executed in parallel with step S 105 and subsequent steps.
- the background visible image and the background thermal image are periodically updated. Therefore, it is possible to expect improvement of resistance to change in the background according to the lapse of time.
- the present embodiment is also applicable to a movable object if the position of the object touched by a person can be specified in an image.
- For example, if a QR code (registered trademark) is attached to the seat surface of a chair, the position of the seat surface in the image can be specified even when the chair moves. A heat trace remaining on the seat surface can thus be estimated with the seat surface as a background, and the heat trace can be displayed on the seat surface in the image.
- Various techniques for estimating the position of an object in an image have been proposed, and the method is not limited to the use of the QR code (registered trademark).
- the differential visible image is an example of a first differential image.
- the differential thermal image is an example of a second differential image.
- the heat trace area extraction unit 17 is an example of an extraction unit.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Signal Processing (AREA)
- General Health & Medical Sciences (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Databases & Information Systems (AREA)
- Pathology (AREA)
- Data Mining & Analysis (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- Image Analysis (AREA)
Abstract
A heat trace area extraction method executed by a computer, includes: generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured; generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured by a thermal camera; and extracting a heat trace area by removing an area of a real object from the thermal image on the basis of the first differential image and the second differential image.
Description
- The present invention relates to a heat trace area extraction method, a heat trace area extraction apparatus, and a program.
- With the epidemic of the novel coronavirus infection (COVID-19), various measures have been taken to prevent infection, among which are disinfection and sterilization. Since virus infection occurs when the mouth, nose, or mucous membranes of the eyes are touched with fingers to which the virus adheres, disinfection of objects around the body that are likely to be touched by the fingers of an infected person is recommended. For this reason, in restaurants and sports gyms used by many people, measures of periodically wiping desks, doors, training equipment, and the like with an alcohol disinfectant or the like are taken.
- It is difficult to visually confirm which part of an object has actually been touched, and in such stores, a countermeasure of disinfecting, at predetermined intervals, all places that people may touch is often adopted. However, the work of periodically disinfecting all places that people may touch involves a large amount of labor. For this reason, a method for improving the efficiency of disinfection using a drone has been proposed (for example, NPL 1).
- On the other hand, a method of detecting a place where a person is present using a monitoring camera or the like and disinfecting only that place is conceivable. There are many examples of research on human detection using video, and in recent years a place where a person is present can be identified from a video with considerably high accuracy. If only a place where a person is present can be disinfected using such a technology, it is considered that the labor of disinfection is reduced. For example, it is not necessary to disinfect the periphery of a place during a time period in which no person uses it at all. On the other hand, if it is known that a plurality of persons use the place, the place can be disinfected earlier. In disinfection at regular time intervals, infection mediated by an object, that is, infection caused by an infected person touching an object and another person then touching the same object, cannot be prevented within the interval, but it is considered that such spread of infection can be further reduced if disinfection can be performed flexibly according to human use. In addition, such a method can prevent unnecessary disinfection, and thus the effect of reducing the amount of disinfectant used can be expected. That is, if disinfection can be performed in accordance with the use of an object by a person detected by a monitoring camera, it is possible to reduce labor, prevent the spread of infection, and save disinfectant as compared to disinfecting everything that may have been used at regular time intervals.
- Labor and disinfectant would be further reduced if it were possible to disinfect only the places that a person actually touches, instead of disinfecting everything around a place where a person has been present. However, it is difficult to identify whether or not a person who is present on the spot actually touches an object with a method using a visible video captured by a monitoring camera or the like. For example, assume that a camera is installed on the ceiling facing downward and captures an image of a table. When a hand is extended over the table, it is difficult to determine from the captured video whether the hand is actually touching the table or not. Therefore, a method of detecting contact with an object using a shadow has been proposed (NPL 2).
- [NPL 1] "Efficiently disinfect stadium using drone! Devised by a US start-up for the novel coronavirus," [online], Internet<URL:https://techable.jp/archives/124749>
- [NPL 2] Ryusei Yoshida, Feng Yaokai, Seiichi Uchida, Touch sensing with image recognition, 2011 Electrical Association Kyushu Branch Joint Conference, 2011.
- However, the method of NPL 2 requires a strong light source such as a projector. Further, recognition accuracy is assumed to be considerably affected by the positional relationship between the camera and the light source. A strong light source cannot be freely installed in many environments, and the method is therefore not considered suitable for the purpose of detecting and presenting places touched by people in various locations and supporting disinfection.
- The present invention was made in view of the aforementioned circumstances and an object of the present invention is to improve the accuracy of detection of places touched by people.
- Accordingly, in order to solve the above problem, a computer executes a differential visible image generation procedure of generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured, a differential thermal image generation procedure of generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured, and an extraction procedure of extracting a heat trace area on the basis of the first differential image and the second differential image.
- It is possible to improve the accuracy of detection of places touched by people.
-
FIG. 1 is a schematic diagram of visible images and thermal images obtained by simultaneously photographing the same place. -
FIG. 2 is a diagram showing an example of images obtained according to background subtraction. -
FIG. 3 is a diagram showing a hardware configuration example of a heat trace area extraction apparatus 10 according to an embodiment of the present invention. -
FIG. 4 is a diagram showing a functional configuration example of the heat trace area extraction apparatus 10 according to an embodiment of the present invention. -
FIG. 5 is a flowchart for describing an example of a processing procedure executed by the heat trace area extraction apparatus 10. -
FIG. 6 is a schematic diagram showing an output example of an extraction result of a heat trace area. - In the present embodiment, a device, a method, and a program for detecting a place touched by a person using a thermal image in order to help sterilize or disinfect viruses are disclosed. Since humans are homeothermic animals and heat emanates from their hands and feet, heat remains at a contact place for a certain period of time after a person touches something. For example, a method of exploiting such heat traces to infer the passcode of a smartphone has been reported ("Yomna Abdelrahman, Mohamed Khamis, Stefan Schneegass, and Florian Alt. 2017. Stay Cool! Understanding Thermal Attacks on Mobile-based User Authentication. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), pp. 3751-3763, 2017").
- Heat traces remain not only on the screen of a smartphone but also on various places such as desks and walls. That is, if a heat trace is identified on the basis of a video (thermal image) of a thermal camera, a place touched by a person indoors or the like can be detected precisely.
- A heat trace area can be extracted by background subtraction using a thermal image from before a person touches as the background. However, since the human body itself is warm, the human body area is extracted along with the heat trace by this method. Therefore, in the present embodiment, a visible image is acquired simultaneously with a thermal image, and a heat trace area is extracted by comparing the thermal image with the visible image.
- Specifically, background subtraction is performed for each of a visible image and a thermal image, and a heat trace area is extracted from a difference in results of background subtraction. Since a heat trace cannot be observed in a visible image (that is, with the naked eye), the heat trace cannot be extracted even if background subtraction is performed for the visible image with a visible image from before a person touches as a background. On the other hand, when a person is present on the spot, if background subtraction is performed with a visible image captured in a state where the person is not present as a background, the area of the person is extracted. That is, when an area extracted according to background subtraction in a thermal image is similarly extracted in a visible image, it can be ascertained that the area is not a heat trace. On the other hand, an area extracted in the thermal image according to background subtraction and not extracted in the visible image is highly likely to be a heat trace. In the present embodiment, a heat trace area extracted by such a method is visualized, and a place touched by a person is transmitted to a user. Simultaneous acquisition of a thermal image and a visible image may be performed using a device such as a sensor node (“Yoshinari Shirai, Yasue Kishino, Takayuki Suyama, Shin Mizutani: PASNIC: a thermal based privacy-aware sensor node for image capturing, UbiComp/ISWC '19 Adjunct, pp. 202-205, 2019”) including a visible light camera and a thermal camera.
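The two-stream comparison described above can be sketched in a few lines. The grayscale array representation, the threshold value, and the function name here are illustrative assumptions, not part of the embodiment:

```python
import numpy as np

def extract_heat_trace_mask(bg_visible, visible, bg_thermal, thermal, thresh=30):
    """Two-stream background subtraction: a pixel that changed in the
    thermal image but not in the visible image is a heat trace candidate.
    All inputs are 2-D grayscale arrays of the same shape."""
    diff_vis = np.abs(visible.astype(int) - bg_visible.astype(int)) >= thresh
    diff_thm = np.abs(thermal.astype(int) - bg_thermal.astype(int)) >= thresh
    # Changed in the thermal image AND unchanged in the visible image
    return diff_thm & ~diff_vis
```

In this sketch, an arm present in front of the camera changes both images and is therefore suppressed, while a heat trace left after the person moves away changes only the thermal image and survives the mask.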
- An overview of an embodiment of the present invention will be described with reference to FIGS. 1 and 2. FIG. 1 is a schematic diagram of visible images and thermal images obtained by simultaneously photographing the same place, here a state in which a hand touches a door with a handle, using a visible light camera and a thermal camera. (a) and (a′) are a visible image and a thermal image at a time t1 (before the hand touches the door), (b) and (b′) are a visible image and a thermal image at a time t2 (a state in which the hand is touching the door), and (c) and (c′) are a visible image and a thermal image at a time t3 (after the hand touches the door). When a person touches the door, the temperature of the touched place increases as shown in FIG. 1 (c′). -
FIG. 2 is a diagram showing an example of images obtained by background subtraction. In FIG. 2, differential images at the time t2 and the time t3, with the image at the time t1 in FIG. 1 used as the background image, are shown. The shape of the arm is extracted as a differential area in both the visible image and the thermal image at the time t2, whereas the part touching the door is extracted as a differential area only in the thermal image at the time t3. - Considering disinfection of the area actually touched by the hand, the differential area extracted by background subtraction from the thermal image at the time t3 may be disinfected. On the other hand, the differential area extracted by background subtraction from the thermal image at the time t2 includes parts that do not touch the door. The differential area extracted in the thermal image at the time t2 is actually an area where a human body is present, not an area of a heat trace remaining after the door was actually touched. From the viewpoint of disinfection, the differential area extracted at the time t3 should be specified, and the differential area extracted by background subtraction at the time t2 is not necessary.
- Therefore, in the present embodiment, when a similar differential area is extracted by background subtraction even in the visible image, it is determined that the differential area is not a heat trace area. Since the shape of the arm is extracted even in the visible image at the time t2, it is determined that the differential area extracted in the thermal image is not a heat trace area (that is, a part touched by the person). On the other hand, since the area extracted by background subtraction of the thermal image is not extracted in the visible image at the time t3, it is determined that the differential area extracted in the thermal image is a heat trace area (that is, the part touched by the person). When the system presents information indicating the heat trace area extracted on the basis of such determination, a user who has viewed the information can efficiently disinfect the part touched by the person.
- Hereinafter, the heat trace area extraction apparatus 10 for realizing the above-described embodiment will be described in detail. FIG. 3 is a diagram showing a hardware configuration example of the heat trace area extraction apparatus 10 according to an embodiment of the present invention. The heat trace area extraction apparatus 10 in FIG. 3 includes a drive device 100, an auxiliary storage device 102, a memory device 103, a CPU 104, an interface device 105, and the like, which are connected via a bus B. - A program that realizes the processing of the heat trace area extraction apparatus 10 is provided on a recording medium 101 such as a CD-ROM. When the recording medium 101 storing the program is set in the drive device 100, the program is installed from the recording medium 101 to the auxiliary storage device 102 via the drive device 100. The program need not necessarily be installed from the recording medium 101 and may be downloaded from another computer via a network. The auxiliary storage device 102 stores the installed program as well as necessary files, data, and the like. - The memory device 103 reads the program from the auxiliary storage device 102 and stores it when an instruction to start the program is issued. The CPU 104 executes the functions of the heat trace area extraction apparatus 10 according to the program stored in the memory device 103. The interface device 105 is used as an interface for connection to a network. -
FIG. 4 is a diagram showing a functional configuration example of the heat trace area extraction apparatus 10 according to the embodiment of the present invention. In FIG. 4, the heat trace area extraction apparatus 10 includes a visible image acquisition unit 11, a background visible image generation unit 12, a differential visible image generation unit 13, a thermal image acquisition unit 14, a background thermal image generation unit 15, a differential thermal image generation unit 16, a heat trace area extraction unit 17, a heat trace area output unit 18, and the like. These units are realized by one or more programs installed in the heat trace area extraction apparatus 10 being executed by the CPU 104. - As shown in FIG. 4, the heat trace area extraction apparatus 10 is connected to a visible light camera 21 and a thermal camera 22 such that images can be input from both. The visible light camera 21 and the thermal camera 22 are installed so as to photograph the same place (the same certain range). That is, the present embodiment is based on the assumption that the photographing area of the visible light camera 21 and that of the thermal camera 22 coincide with each other in units of pixels. When the photographed areas of the visible light camera 21 and the thermal camera 22 do not coincide, calibration may be performed in advance to ascertain the correspondence between pixels of a visible image and a thermal image. -
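When the two photographing areas do not coincide pixel-for-pixel, such a calibration typically yields a mapping between the two image planes, for example a 3x3 homography. A minimal sketch of applying such a mapping is shown below; the matrix H is assumed to be already estimated (for example from marker correspondences), and obtaining it is outside this sketch:

```python
import numpy as np

def thermal_to_visible(H, x, y):
    """Map a thermal-image pixel (x, y) into visible-image coordinates
    using a 3x3 homography H obtained by prior calibration."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

With an identity H the two coordinate systems coincide, which corresponds to the pixel-aligned assumption of the embodiment.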
FIG. 5 is a flowchart for describing an example of a processing procedure executed by the heat trace area extraction apparatus 10. -
image acquisition unit 11 acquires a visible image captured by thevisible light camera 21, which is input from thevisible light camera 21, and the thermalimage acquisition unit 14 acquires a thermal image captured by thethermal camera 22, which is input from thethermal camera 22. In step S101, acquisition of the visible image by the visibleimage acquisition unit 11 and acquisition of the thermal image by the thermalimage acquisition unit 14 may be performed simultaneously or may not be performed simultaneously. If simultaneous acquisition is not performed, some frames of a camera having a higher frame rate may be ignored in accordance with a camera having a lower frame rate. In addition, there is no problem if fps is high to some extent even in a method of alternately acquiring still images from thevisible light camera 21 and thethermal camera 22 and regarding the acquired images as being simultaneously acquired. The visibleimage acquisition unit 11 transmits the acquired visible image to the background visibleimage generation unit 12, and the thermalimage acquisition unit 14 transmits the acquired thermal image to the background thermalimage generation unit 15. - Subsequently, the background visible
image generation unit 12 stores the visible image transmitted from the visibleimage acquisition unit 11 in anauxiliary storage device 102, and the background thermalimage generation unit 15 stores the thermal image transmitted from the thermalimage acquisition unit 14 in the auxiliary storage device 102 (S102). - Steps S101 and S102 are repeated until a predetermined time T1 elapses. The predetermined time T1 may be a period in which one or more visible images and one or more thermal images are accumulated in the
auxiliary storage device 102. - When the predetermined time T1 elapses from execution of the first step S101 (Yes in S103), processing proceeds to step S104. In step S104, the background visible
image generation unit 12 generates a background image of a photographing range (referred to as a “background visible image” hereinafter) on the basis of a visible image group accumulated in theauxiliary storage device 102 in the predetermined period T1. - In addition, in step S104, the background thermal
image generation unit 15 generates a background image of a photographing range (referred to as a “background thermal image” hereinafter) on the basis of a thermal image group accumulated in theauxiliary storage device 102 in the predetermined period T1. - For example, when the visible image group or the thermal image group (the visible image group and the thermal image group are simply referred to as a “captured image group” if they are not distinguished from each other) includes a plurality of captured images, a background image (background visible image and background thermal image) may be generated for each image group using an average value or a median value of pixel values (RGB) of each captured image group as a pixel value of each pixel. Accordingly, a background image in which a person or the like that has passed temporarily is removed and only a person or the like that continues to stay on the spot for a long time is included can be generated. Many studies have been made on how to dynamically create a background from images captured for a predetermined time, and a background image generation method in the present embodiment is not limited to a predetermined method.
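The per-pixel median variant of this background generation can be sketched as follows (the mean is equally possible per the description above; the function name is an illustrative assumption):

```python
import numpy as np

def build_background(frames):
    """Per-pixel median over the frames accumulated during T1.
    A person passing through in only a few frames does not survive
    the median, leaving a clean background image."""
    return np.median(np.stack(frames), axis=0).astype(frames[0].dtype)
```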
- The predetermined time T1 corresponds to the time t1 in FIG. 1. That is, the time T1 need not be an instantaneous timing. - When the background visible image and the background thermal image have been generated, step S105 and subsequent steps are executed. Steps S101 to S104 and step S105 onward need not be executed synchronously. For example, step S105 and subsequent steps may be started according to an instruction different from the execution instruction for steps S101 to S104.
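The frame-rate matching mentioned for step S101 (ignoring surplus frames of the faster camera) could be realized, for example, by nearest-timestamp pairing. The timestamped frame lists assumed here are an illustration, not part of the embodiment:

```python
def pair_frames(visible, thermal):
    """Pair frames of two streams with different frame rates by nearest
    timestamp, keeping one pair per frame of the slower stream.
    Each stream is a list of (timestamp, frame) tuples; the result is a
    list of (visible_frame, thermal_frame) pairs."""
    if len(visible) <= len(thermal):
        return [(fv, min(thermal, key=lambda f: abs(f[0] - tv))[1])
                for tv, fv in visible]
    return [(min(visible, key=lambda f: abs(f[0] - tt))[1], ft)
            for tt, ft in thermal]
```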
- In step S105, the visible image acquisition unit 11 and the thermal image acquisition unit 14 wait for the elapse of a predetermined time T2. The predetermined time T2 is, for example, the elapsed time from the time t2 to the time t3 in FIG. 2. - When the predetermined time T2 has elapsed (Yes in S105), the visible image acquisition unit 11 acquires a visible image (referred to as a "target visible image" hereinafter) input from the visible light camera 21, and the thermal image acquisition unit 14 acquires a thermal image (referred to as a "target thermal image" hereinafter) input from the thermal camera 22 (S106). It is desirable that the target visible image and the target thermal image be captured simultaneously (or almost simultaneously). - In subsequent step S107, the differential visible image generation unit 13 compares the background visible image generated by the background visible image generation unit 12 with the target visible image according to a background subtraction method and extracts, from the target visible image, a differential area with respect to the background visible image (an area different from the background visible image) to generate a differential image representing the difference (referred to as a "differential visible image" hereinafter). In addition, the differential thermal image generation unit 16 compares the background thermal image generated by the background thermal image generation unit 15 with the target thermal image according to the background subtraction method and extracts, from the target thermal image, a differential area with respect to the background thermal image (an area different from the background thermal image), thereby generating a differential image representing the difference (referred to as a "differential thermal image" hereinafter). - In the comparison with the background image, a pixel whose difference in pixel value is equal to or greater than a certain threshold value is regarded as different from the background; for example, a binary image in which such pixels have the value 1 and pixels regarded as the background have the value 0 is generated as each differential image (differential visible image and differential thermal image). Each differential image is then sent to the heat trace
area extraction unit 17. - Subsequently, the heat trace area extraction unit 17 compares the differential visible image with the differential thermal image and extracts a heat trace area in the imaging range (S108). - To extract, from the differential thermal image, areas that are not similar to any area of the differential visible image, similarity determination between the differential areas of the two differential images may be used. For example, the heat trace area extraction unit 17 first performs labeling (extraction of connected areas) on each binary image, that is, the differential visible image and the differential thermal image. Next, the heat trace area extraction unit 17 compares the degrees of overlap between the one or more differential areas obtained by labeling the differential thermal image (referred to as "differential thermal areas" hereinafter) and the one or more differential areas obtained by labeling the differential visible image (referred to as "differential visible areas" hereinafter). Specifically, the heat trace area extraction unit 17 counts, in pixel units, how far the differential areas to be compared match, and determines that the two compared differential areas are similar when the matching rate is equal to or greater than a certain threshold value. The heat trace area extraction unit 17 extracts each differential thermal area that is not similar to any differential visible area as a heat trace area. - The heat trace area extraction unit 17 transmits information indicating the heat trace area and the background visible image to the heat trace area output unit 18. In this case, the heat trace area extraction unit 17 may generate a binary image in which the heat trace area is white and the other part is black and transmit this binary image to the heat trace area output unit 18 as the information indicating the heat trace area. It should be noted that determination of similarity between areas is actively studied in pattern matching and related fields, and the present embodiment is not limited to any particular method. - Subsequently, the heat trace area output unit 18 outputs information indicating the extraction result of the heat trace area such that a user can check the information (S109). For example, the heat trace area output unit 18 may output an image obtained by combining the white pixels of the binary image indicating the heat trace area with the background visible image. The output form is not limited to any particular form; for example, display on a display device, storage in the auxiliary storage device 102, or transmission to a user terminal via a network may be performed. - Subsequent to step S109, step S105 and the subsequent steps are executed again. Alternatively, step S109 may be executed after steps S105 to S108 have been repeated a plurality of times. In this case, the heat trace areas extracted over the plurality of repetitions can be output collectively.
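The similarity determination of step S108 might look as follows. Connected-component labeling is done here with a simple flood fill (a library routine would normally be used), and the 0.5 overlap threshold is an illustrative assumption:

```python
import numpy as np
from collections import deque

def label_regions(binary):
    """4-connected component labeling of a boolean mask; returns one
    boolean mask per differential area."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    regions = []
    for sy, sx in zip(*np.nonzero(binary)):
        if seen[sy, sx]:
            continue
        mask = np.zeros((h, w), dtype=bool)
        queue = deque([(sy, sx)])
        seen[sy, sx] = True
        while queue:
            y, x = queue.popleft()
            mask[y, x] = True
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        regions.append(mask)
    return regions

def heat_trace_areas(diff_thermal, diff_visible, match_thresh=0.5):
    """Keep each differential thermal area whose pixel overlap with every
    differential visible area stays below match_thresh."""
    visible_regions = label_regions(diff_visible)
    traces = []
    for thermal_region in label_regions(diff_thermal):
        n = thermal_region.sum()
        similar = any((thermal_region & v).sum() / n >= match_thresh
                      for v in visible_regions)
        if not similar:
            traces.append(thermal_region)
    return traces
```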
-
FIG. 6 is a schematic diagram showing an output example of a heat trace area extraction result. In FIG. 6, the part of the background visible image touched by a hand of a person is painted black (black is used here for convenience; the actual color may be another color such as white). The user can recognize this part as a heat trace area. - It is also possible to project the binary image representing the heat trace area onto the photographing range in the environment using a projector or the like. In this case, by projecting the heat trace image onto the heat trace part in the environment, the part touched by a person can be conveyed directly to each person in the environment. Since a person who comes to the place can thereby be aware of a part touched by another person, it is possible to encourage behavior that avoids touching that part even without disinfecting it. -
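The composition of step S109 (and equally the projector output) amounts to painting the mask pixels onto an image; the default color here follows the white-on-binary-image convention of the description, but is only one presentation choice:

```python
import numpy as np

def overlay_heat_trace(background_visible, trace_mask, color=(255, 255, 255)):
    """Combine the background visible image (H x W x 3) with the binary
    heat trace mask (H x W) by painting the trace pixels in `color`.
    The input image is left unmodified."""
    out = background_visible.copy()
    out[trace_mask] = color
    return out
```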
- In addition, steps S101 to S103 may be executed in parallel with step S105 and subsequent steps. In this case, the background visible image and the background thermal image are periodically updated. Therefore, it is possible to expect improvement of resistance to change in the background according to the lapse of time.
- Although it is assumed in the above description that a camera is fixed to an indoor place and heat traces left on relatively fixed objects such as a wall or a desk are extracted, the present embodiment is also applicable to a moving object if the position of the object touched by a person can be specified in an image. For example, if QR codes (registered trademark) for identifying positions are attached to the four corners of the seat surface of a chair and the position of the seat surface can be estimated using them as clues, a heat trace remaining on the seat surface can be estimated with the seat surface as the background even when the chair moves, and the heat trace can be displayed on the seat surface in an image. Various techniques for estimating the position of an object in an image have been proposed, and the method is not limited to the use of QR codes (registered trademark).
- As described above, according to the present embodiment, it is possible to improve the accuracy of detection of a place touched by a person. As a result, it is possible to efficiently sterilize and disinfect a place where a virus such as a novel coronavirus may adhere, for example.
- It should be noted that in the present embodiment, the differential visible image is an example of a first differential image. The differential thermal image is an example of a second differential image. The heat trace
area extraction unit 17 is an example of an extraction unit. - Although an embodiment of the present invention has been described in detail above, the present invention is not limited to the specific embodiment described above, and various modifications and changes can be made within the concept of the present invention described in the claims.
-
-
- 10 Heat trace area extraction apparatus
- 11 Visible image acquisition unit
- 12 Background visible image generation unit
- 13 Differential visible image generation unit
- 14 Thermal image acquisition unit
- 15 Background thermal image generation unit
- 16 Differential thermal image generation unit
- 17 Heat trace area extraction unit
- 18 Heat trace area output unit
- 21 Visible light camera
- 22 Thermal camera
- 100 Drive device
- 101 Recording medium
- 102 Auxiliary storage device
- 103 Memory device
- 104 CPU
- 105 Interface device
- B Bus
Claims (10)
1. A heat trace area extraction method executed by a computer, the heat trace area extraction method comprising:
generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured;
generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured by a thermal camera; and
extracting a heat trace area by removing an area of a real object from the thermal image on the basis of the first differential image and the second differential image.
2. A heat trace area extraction method executed by a computer, the heat trace area extraction method comprising:
generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured;
generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured by a thermal camera; and
extracting, as the heat trace area, an area that is not similar to any of one or more differential visible areas among one or more differential thermal areas, with areas different from the visible image of the background in the first differential image as the differential visible areas and with areas different from the thermal image of the background in the second differential image as the differential thermal areas.
3. The heat trace area extraction method according to claim 1 , further comprising:
outputting an image obtained by combining the visible image of the background of the certain range and a binary image in which the heat trace area extracted in the extracting is white and an area other than the heat trace area is black.
4. A heat trace area extraction apparatus comprising:
a memory; and
a processor configured to execute
generating a first differential image with respect to a visible image of a background of a certain range for a visible image in which the certain range is captured;
generating a second differential image with respect to a thermal image of the background for a thermal image in which the certain range is captured by a thermal camera; and
extracting a heat trace area by removing an area of a real object from the thermal image on the basis of the first differential image and the second differential image.
5. (canceled)
6. The heat trace area extraction apparatus according to claim 4 , wherein the processor is further configured to execute
outputting an image obtained by combining the visible image of the background of the certain range and a binary image in which the heat trace area extracted in the extracting is white and an area other than the heat trace area is black.
7. (canceled)
8. The heat trace area extraction method according to claim 1 , further comprising:
projecting a binary image in which the heat trace area extracted in the extracting is white and an area other than the heat trace area is black to the certain range.
9. The heat trace area extraction apparatus according to claim 4 , wherein the processor is further configured to execute projecting a binary image in which the heat trace area extracted in the extracting is white and an area other than the heat trace area is black to the certain range.
10. A non-transitory computer-readable recording medium having computer-readable instructions stored thereon, which when executed, cause a computer to execute the heat trace area extraction method according to claim 1 .
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/039126 WO2022079910A1 (en) | 2020-10-16 | 2020-10-16 | Thermal trace area extraction method, thermal trace area extraction apparatus, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
US20240019309A1 true US20240019309A1 (en) | 2024-01-18 |
Family
ID=81208224
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/248,295 Pending US20240019309A1 (en) | 2020-10-16 | 2020-10-16 | Remaining thermal trace extraction method, remaining thermal trace extraction apparatus and program |
US18/031,341 Pending US20230377165A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
US18/031,141 Pending US20230384162A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
US18/031,147 Pending US20230412768A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
US18/031,346 Pending US20230377159A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/031,341 Pending US20230377165A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
US18/031,141 Pending US20230384162A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
US18/031,147 Pending US20230412768A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
US18/031,346 Pending US20230377159A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
Country Status (3)
Country | Link |
---|---|
US (5) | US20240019309A1 (en) |
JP (5) | JPWO2022079910A1 (en) |
WO (5) | WO2022079910A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024047807A1 (en) * | 2022-08-31 | 2024-03-07 | 日本電信電話株式会社 | Threshold determination device, method, and program |
WO2024052974A1 (en) * | 2022-09-06 | 2024-03-14 | 日本電信電話株式会社 | Background updating device, method, and program |
WO2024052973A1 (en) * | 2022-09-06 | 2024-03-14 | 日本電信電話株式会社 | Background updating device, method, and program |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5354767B2 (en) | 2007-10-17 | 2013-11-27 | Hitachi Kokusai Electric Inc. | Object detection device |
JP2017067503A (en) * | 2015-09-28 | 2017-04-06 | Fujitsu Limited | Position estimation system, position estimation method, and position estimation program |
JP2017090277A (en) * | 2015-11-11 | 2017-05-25 | Kyushu University | Gripping information acquisition device, robot teaching device and robot controller, and gripping information acquisition method, and robot teaching method and robot control method |
JP7144678B2 (en) * | 2018-08-03 | 2022-09-30 | Nippon Telegraph and Telephone Corporation | Image processing device, image processing method, and image processing program |
2020
- 2020-10-16 WO PCT/JP2020/039126 patent/WO2022079910A1/en active Application Filing
- 2020-10-16 JP JP2022556819A patent/JPWO2022079910A1/ja active Pending
- 2020-10-16 US US18/248,295 patent/US20240019309A1/en active Pending
2021
- 2021-10-12 JP JP2022556990A patent/JP7485070B2/en active Active
- 2021-10-12 WO PCT/JP2021/037676 patent/WO2022080350A1/en active Application Filing
- 2021-10-12 JP JP2022556991A patent/JP7485071B2/en active Active
- 2021-10-12 WO PCT/JP2021/037677 patent/WO2022080351A1/en active Application Filing
- 2021-10-12 JP JP2022556992A patent/JP7485072B2/en active Active
- 2021-10-12 US US18/031,341 patent/US20230377165A1/en active Pending
- 2021-10-12 JP JP2022556993A patent/JPWO2022080353A1/ja active Pending
- 2021-10-12 US US18/031,141 patent/US20230384162A1/en active Pending
- 2021-10-12 WO PCT/JP2021/037678 patent/WO2022080352A1/en active Application Filing
- 2021-10-12 WO PCT/JP2021/037679 patent/WO2022080353A1/en active Application Filing
- 2021-10-12 US US18/031,147 patent/US20230412768A1/en active Pending
- 2021-10-12 US US18/031,346 patent/US20230377159A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2022080350A1 (en) | 2022-04-21 |
JP7485071B2 (en) | 2024-05-16 |
US20230384162A1 (en) | 2023-11-30 |
JP7485070B2 (en) | 2024-05-16 |
WO2022079910A1 (en) | 2022-04-21 |
JPWO2022080351A1 (en) | 2022-04-21 |
JPWO2022080350A1 (en) | 2022-04-21 |
US20230377165A1 (en) | 2023-11-23 |
JPWO2022080353A1 (en) | 2022-04-21 |
JPWO2022079910A1 (en) | 2022-04-21 |
JPWO2022080352A1 (en) | 2022-04-21 |
WO2022080351A1 (en) | 2022-04-21 |
WO2022080352A1 (en) | 2022-04-21 |
US20230412768A1 (en) | 2023-12-21 |
WO2022080353A1 (en) | 2022-04-21 |
US20230377159A1 (en) | 2023-11-23 |
JP7485072B2 (en) | 2024-05-16 |
Similar Documents
Publication | Title |
---|---|
US20240019309A1 (en) | Remaining thermal trace extraction method, remaining thermal trace extraction apparatus and program |
KR102021999B1 (en) | Apparatus for alarming thermal heat detection results obtained by monitoring heat from human using thermal scanner |
JP5991224B2 (en) | Image processing apparatus, image processing method, and image processing program |
KR101118654B1 (en) | rehabilitation device using motion analysis based on motion capture and method thereof |
US20120140054A1 (en) | Intelligent hand washing monitoring system |
CN103955272A (en) | Terminal equipment user posture detecting system |
US20180307896A1 (en) | Facial detection device, facial detection system provided with same, and facial detection method |
RU2017116701A (en) | PATIENT MONITORING SYSTEM AND METHOD |
US20210378520A1 (en) | Free flow fever screening |
JP2014157316A (en) | Projector device |
WO2020171554A1 (en) | Method and apparatus for measuring body temperature using a camera |
JP6302007B2 (en) | Clean room resident cleanliness management method and management system |
JP2014021619A (en) | Patient recognition device |
JP7047945B2 (en) | Information processing equipment, information processing methods, and programs |
JP7463792B2 (en) | Information processing system, information processing device, and information processing method |
WO2017029841A1 (en) | Image analyzing device, image analyzing method, and image analyzing program |
Hossen | Social distance monitoring using a low-cost 3d sensor |
Ko et al. | An efficient method for extracting the depth data from the user |
WO2021246011A1 (en) | Image processing device, image processing method, and program |
KR102452214B1 (en) | Digital signage with quarantine function and how it works |
JP7197651B2 (en) | MONITORING SYSTEM, MONITORING METHOD, COMPUTER PROGRAM PRODUCT AND CLEANING SYSTEM FOR SURFACE MONITORING |
Grimaldo et al. | Multiple Face Mask Scanner as an Innovation for Implementing Health Protocols |
JP2022137414A (en) | Spatial infection risk determination system, spatial infection risk determination method and program |
KR20230062439A (en) | System and method for detecting infectious disease patients and tracking the movement of the infectious disease patients using 3d thermal data analysis |
KR20220144684A (en) | Touch-image device supporting anti-corrosion function and its operation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: NIPPON TELEGRAPH AND TELEPHONE CORPORATION, JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHIRAI, YOSHINARI;KISHINO, YASUE;SUYAMA, TAKAYUKI;AND OTHERS;SIGNING DATES FROM 20210218 TO 20210308;REEL/FRAME:063259/0450 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |