WO2022080350A1 - Heat trace region extraction device, heat trace region extraction method, and program - Google Patents
- Publication number
- WO2022080350A1 (PCT/JP2021/037676)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/0022—Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
- G01J5/0025—Living bodies
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/02—Constructional details
- G01J5/08—Optical arrangements
- G01J5/0859—Sighting arrangements, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/48—Thermography; Techniques using wholly visual means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/803—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/80—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for detecting, monitoring or modelling epidemics or pandemics, e.g. flu
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J2005/0077—Imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
Definitions
- the present invention relates to a heat trace region extraction device, a heat trace region extraction method and a program.
- a method for improving the efficiency of disinfection using a drone has been proposed (for example, Non-Patent Document 1).
- disinfection at regular time intervals cannot prevent object-mediated infections, that is, infections caused by an infected person touching an object and another person touching the same object within that interval. However, if objects can be disinfected flexibly according to how people use them, it is thought that this kind of spread of infection can be further reduced. In addition, since such a method avoids unnecessary disinfection, a reduction in the amount of disinfectant used can be expected. In other words, if disinfection is performed according to people's use of objects as detected by surveillance cameras, then compared with disinfecting at regular intervals everything that might have been used, labor can be reduced, the spread of infection can be expected to be prevented, and disinfectant can be saved.
- Non-Patent Document 2 proposes a method for detecting contact with an object using a shadow.
- however, the method of Non-Patent Document 2 requires a strong light source such as a projector, and its recognition accuracy is significantly affected by the positional relationship between the camera and the light source. In addition, a strong light source cannot be installed freely in many environments, so the method is not well suited to the purpose of detecting and presenting places touched by a person in various locations in order to support disinfection.
- the present invention has been made in view of the above points, and an object of the present invention is to improve the accuracy of detecting places touched by a person.
- the heat trace region extraction device includes: a difference real object image generation unit that generates a difference real object image, which is an image of the difference between a real object image, obtained by photographing a certain range with a real object camera for photographing real objects, and a background real object image serving as its background; a differential thermal image generation unit that generates a differential thermal image, which is an image of the difference between a thermal image, obtained by photographing the same range with a thermal camera for photographing the heat emitted by real objects, and a background thermal image serving as its background; and a heat trace region extraction unit that extracts a heat trace region by removing the region of the real object from the thermal image based on the difference real object image and the differential thermal image.
- an apparatus, a method and a program for detecting places touched by a person using a thermal image are disclosed, with the aim of helping to sterilize or disinfect bacteria and viruses. Since a person, being a homeothermic animal, has warm limbs, when the person touches an object the heat remains at the touched place for a certain period of time. For example, it has been reported that such a heat trace, a trace of human touch identified by the heat remaining where a person touched, can be used to decipher the passcode of a smartphone (Yomna Abdelrahman, Mohamed Khamis, Stefan Schneegass, and Florian Alt. 2017. Stay Cool! Understanding Thermal Attacks on Mobile-based User Authentication. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), pp. 3751-3763, 2017).
- a thermal image is an image of the heat emitted by objects (that is, of the heat rays radiated by objects: electromagnetic waves with wavelengths in the far-infrared region), obtained by photographing a certain range with a thermal camera. More specifically, a thermal image is an image of a temperature distribution, obtained by detecting the infrared radiant energy emitted from objects and converting it into apparent temperatures. In other words, a thermal image is not an image of infrared rays reflected by objects.
- the heat trace region can be extracted by background subtraction using, as the background, the thermal image from before a person touched the object.
- this method extracts the human body region as well as the heat trace. Therefore, in the first embodiment, the visible image is acquired at the same time as the thermal image, and the thermal trace region is extracted by comparing the thermal image with the visible image.
- background subtraction is performed for each of the visible image and the thermal image, and the thermal trace region is extracted by the difference in the result of the background subtraction. Since heat traces cannot be observed with a visible image (that is, with the naked eye), they cannot be extracted even if background subtraction is performed on the visible image with the visible image before being touched by a person as the background. On the other hand, when there is a person on the spot, the area of the person is extracted by performing background subtraction with the visible image taken in the absence of the person as the background. That is, when the region extracted by background subtraction in the thermal image is similarly extracted in the visible image, it can be seen that the region is not a thermal trace.
- the region extracted in the thermal image due to background subtraction and not extracted in the visible image is likely to be a thermal trace.
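The decision rule above, namely that a region appearing in the thermal difference but not in the visible difference is likely a heat trace, can be sketched at the pixel level as follows. This is an illustrative sketch, not the patent's actual implementation; the region-level matching the embodiment actually uses is described later.

```python
import numpy as np

def heat_trace_candidates(thermal_diff, visible_diff):
    """Pixels that changed relative to the background in the thermal
    image but not in the visible image are heat-trace candidates."""
    return thermal_diff & ~visible_diff

# Toy example: pixel 0 changed only thermally (a heat trace);
# pixel 1 changed in both images (e.g. a person still in frame).
thermal = np.array([True, True, False])
visible = np.array([False, True, False])
print(heat_trace_candidates(thermal, visible))  # [ True False False]
```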
- the heat trace area extracted by such a method is visualized, and the place touched by a person is transmitted to the user.
- a sensor node equipped with a visible light camera and a thermal camera may be used (Yoshinari Shirai, Yasue Kishino, Takayuki Suyama, Shin Mizutani: PASNIC: a thermal based privacy-aware sensor node for image capturing, UbiComp/ISWC '19 Adjunct, pp. 202-205, 2019).
- FIG. 1 is a schematic diagram of a visible image and a thermal image taken at the same place at the same time.
- FIG. 1 shows a schematic diagram of an image of a hand touching a door with a handle taken simultaneously with a visible light camera and a thermal camera.
- FIGS. 1(a) and 1(a') are the visible image and the thermal image at time t1 (before the hand touches the door), respectively.
- FIGS. 1(b) and 1(b') are the visible image and the thermal image at time t2 (while the hand is touching the door), respectively.
- FIGS. 1(c) and 1(c') are the visible image and the thermal image at time t3 (after the hand has touched the door). When a person touches the door, the temperature of the touched place rises, as shown in FIG. 1(c').
- FIG. 2 is a diagram showing an example of an image obtained by background subtraction.
- FIG. 2 shows a difference image which is the background difference between the image at time t2 and the image at time t3 when the image at time t1 in FIG. 1 is used as the background image.
- at time t2, the arm is extracted as a difference region (a region that differs from the background image) in both the visible image and the thermal image, whereas at time t3 the part that touched the door is extracted as a difference region only in the thermal image.
- from the viewpoint of disinfection, it suffices to disinfect the difference region extracted by background subtraction of the thermal image at time t3 against the thermal image at time t1 (the background thermal image).
- on the other hand, the difference region extracted by background subtraction of the thermal image at time t2 against the background thermal image at time t1 includes portions that did not touch the door.
- that is, the difference region extracted by background subtraction of the thermal image at time t2 against the background thermal image at time t1 is the region where the human body was present, not the region of the heat trace left by actually touching the door. From the viewpoint of disinfection, it is sufficient to identify the difference region extracted by background subtraction at time t3; the difference region extracted by background subtraction at time t2 is unnecessary.
- in such a case, the difference region is determined not to be a heat trace region.
- conversely, when a region extracted by background subtraction of the thermal image is not extracted in the visible image, the difference region extracted from the thermal image is determined to be a heat trace region (that is, a place touched by a person). If the system presents information indicating the heat trace region extracted on the basis of such a determination, a user who sees the information can efficiently disinfect the places touched by a person.
- FIG. 3 is a diagram showing a hardware configuration example of the heat trace region extraction device 10 in the first embodiment, in the modification of the first embodiment described later, and in the second to fifth embodiments.
- the heat trace area extraction device 10 of FIG. 3 has a drive device 100, an auxiliary storage device 102, a memory device 103, a CPU 104, an interface device 105, and the like, which are connected to each other by a bus B, respectively.
- the program that realizes the processing in the heat trace area extraction device 10 is provided by a recording medium 101 such as a CD-ROM.
- the program is installed in the auxiliary storage device 102 from the recording medium 101 via the drive device 100.
- the program does not necessarily have to be installed from the recording medium 101, and may be downloaded from another computer via the network.
- the auxiliary storage device 102 stores the installed program and also stores necessary files, data, and the like.
- the memory device 103 reads a program from the auxiliary storage device 102 and stores it when there is an instruction to start the program.
- the CPU 104 executes the function related to the heat trace area extraction device 10 according to the program stored in the memory device 103.
- the interface device 105 is used as an interface for connecting to a network.
- FIG. 4 is a diagram showing a functional configuration example of the heat trace region extraction device 10 in the first embodiment.
- the heat trace region extraction device 10 includes a visible image acquisition unit 11, a background visible image generation unit 12, a difference visible image generation unit 13, a thermal image acquisition unit 14, a background thermal image generation unit 15, a differential thermal image generation unit 16, a heat trace region extraction unit 17, a heat trace region output unit 18, and the like. Each of these units is realized by processing that one or more programs installed in the heat trace region extraction device 10 cause the CPU 104 to execute.
- the heat trace area extraction device 10 is connected to each of these cameras so that images can be input from the visible light camera 21 and the thermal camera 22.
- the visible light camera 21 and the thermal camera 22 are installed so as to be able to photograph the same place (the same range). That is, the first embodiment is premised on the fact that the shooting area of the visible light camera 21 and the shooting area of the thermal camera 22 coincide with each other on a pixel-by-pixel basis. If the captured portions of the visible light camera 21 and the thermal camera 22 do not match, it is sufficient to perform calibration in advance so that the correspondence between the pixels of the visible image and the thermal image can be grasped.
- FIG. 5 is a flowchart for explaining an example of the processing procedure executed by the heat trace area extraction device 10.
- in step S101, the visible image acquisition unit 11 acquires the visible image input from the visible light camera 21, and the thermal image acquisition unit 14 acquires the thermal image input from the thermal camera 22.
- the acquisition of the visible image by the visible image acquisition unit 11 and the acquisition of the thermal image by the thermal image acquisition unit 14 may or may not be performed at the same time. If they are not simultaneous, some frames of the camera with the faster frame rate may be discarded to match the camera with the slower frame rate. Further, as long as the frame rate is reasonably high, still images may be acquired alternately from the visible light camera 21 and the thermal camera 22 and treated as if they had been acquired at the same time.
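One way to realize the idea of discarding surplus frames of the faster camera is nearest-timestamp pairing. The sketch below is a hypothetical illustration; the frame rates and the pairing criterion are assumptions, not taken from the document.

```python
def pair_frames(slow_ts, fast_ts):
    """For each timestamp in slow_ts, return the index of the closest
    timestamp in fast_ts (both lists sorted in ascending order)."""
    pairs = []
    j = 0
    for t in slow_ts:
        # advance j while the next fast-camera frame is closer to t
        while j + 1 < len(fast_ts) and abs(fast_ts[j + 1] - t) <= abs(fast_ts[j] - t):
            j += 1
        pairs.append(j)
    return pairs

# Illustrative timestamps: thermal camera ~9 fps, visible camera ~30 fps.
thermal_ts = [0.00, 0.11, 0.22]
visible_ts = [0.00, 0.033, 0.067, 0.10, 0.133, 0.167, 0.20, 0.233]
print(pair_frames(thermal_ts, visible_ts))  # [0, 3, 7]
```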
- the visible image acquisition unit 11 transmits the acquired visible image to the background visible image generation unit 12, and the thermal image acquisition unit 14 transmits the acquired thermal image to the background thermal image generation unit 15.
- the background visible image generation unit 12 stores the visible image transmitted from the visible image acquisition unit 11 in the auxiliary storage device 102, and the background thermal image generation unit 15 stores the thermal image transmitted from the thermal image acquisition unit 14 in the auxiliary storage device 102 (S102).
- Steps S101 and S102 are repeated from the start of execution until a predetermined time T1 elapses.
- the predetermined time T1 may be a period during which one or more visible images and one or more thermal images are accumulated in the auxiliary storage device 102.
- in step S104, the background visible image generation unit 12 generates a background image of the shooting range (hereinafter referred to as the "background visible image") based on the group of visible images accumulated in the auxiliary storage device 102 during the predetermined time T1. Likewise, the background thermal image generation unit 15 generates a background image of the shooting range (hereinafter referred to as the "background thermal image") based on the group of thermal images accumulated during the predetermined time T1. Each background image may be generated, for example, from the per-pixel center value of the pixel values (e.g., the center of the RGB values for the visible images) over the accumulated image group.
- the predetermined time T1 is a time interval corresponding to the time t1 in FIG. 1, and includes the time t1.
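Assuming the per-pixel "center value" used to build each background image is a median (an interpretation; the document does not name a specific statistic), background generation over the accumulated image group might look like:

```python
import numpy as np

def build_background(frames):
    """Per-pixel median over a stack of frames (shape N x H x W).
    Objects that appear in only a few frames are suppressed,
    leaving the static background."""
    return np.median(frames, axis=0)

# Three 2x2 grayscale frames; one pixel is briefly occluded.
frames = np.array([
    [[10, 10], [10, 10]],
    [[10, 99], [10, 10]],   # transient foreground at (0, 1)
    [[10, 10], [10, 10]],
], dtype=float)
print(build_background(frames))
# [[10. 10.]
#  [10. 10.]]
```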
- after that, step S105 and subsequent steps are executed. Note that steps S101 to S104 and step S105 and subsequent steps do not have to be executed synchronously; for example, step S105 and subsequent steps may be started in response to an instruction different from the execution instruction for steps S101 to S104.
- step S105 the visible image acquisition unit 11 and the thermal image acquisition unit 14 wait for the elapse of the predetermined time T2.
- the predetermined time T2 is, for example, the elapsed time from the time t2 to the time t3 in FIG. 2.
- after the predetermined time T2 has elapsed, the visible image acquisition unit 11 acquires a visible image input from the visible light camera 21 (hereinafter referred to as the "target visible image"), and the thermal image acquisition unit 14 acquires a thermal image input from the thermal camera 22 (hereinafter referred to as the "target thermal image") (S106). It is desirable that the target visible image and the target thermal image be captured at the same time (or almost the same time).
- the difference visible image generation unit 13 compares the background visible image generated by the background visible image generation unit 12 with the target visible image using the background subtraction method, and extracts from the target visible image the difference region with respect to the background visible image (the region that differs from the background visible image), thereby generating a difference image showing the difference (hereinafter referred to as the "difference visible image"). Similarly, the differential thermal image generation unit 16 compares the background thermal image generated by the background thermal image generation unit 15 with the target thermal image using the background subtraction method, and extracts from the target thermal image the difference region with respect to the background thermal image, thereby generating a difference image showing the difference (hereinafter referred to as the "differential thermal image").
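A minimal background-subtraction step, assuming a simple absolute-difference threshold (the document does not fix a particular subtraction algorithm; the threshold value is illustrative):

```python
import numpy as np

def background_subtract(image, background, threshold=20.0):
    """Binary difference image: True where the absolute per-pixel
    difference from the background exceeds the threshold."""
    return np.abs(image.astype(float) - background.astype(float)) > threshold

background = np.array([[10, 10], [10, 10]], dtype=float)
target = np.array([[10, 60], [12, 10]], dtype=float)  # one warm spot
print(background_subtract(target, background))
# [[False  True]
#  [False False]]
```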
- each difference image is sent to the heat trace region extraction unit 17.
- the heat trace region extraction unit 17 compares the difference visible image with the differential thermal image and extracts the heat trace region in the photographing range (S108).
- when extracting regions of the differential thermal image that are dissimilar to the difference regions of the difference visible image, the heat trace region extraction unit 17 may use a similarity determination between the difference regions of the two difference images. For example, the heat trace region extraction unit 17 first performs labeling (extraction of connected regions) on each binary image, that is, on the difference visible image and the differential thermal image. Next, for each of the one or more difference regions obtained by labeling the differential thermal image (hereinafter referred to as "differential thermal regions"), the heat trace region extraction unit 17 compares its degree of overlap with each of the one or more difference regions obtained by labeling the difference visible image (hereinafter referred to as "difference visible regions").
- specifically, the heat trace region extraction unit 17 counts, on a pixel-by-pixel basis, whether the two difference regions being compared match, and if the match rate is below a certain threshold, the two difference regions are determined to be dissimilar. The heat trace region extraction unit 17 extracts every differential thermal region that is dissimilar to all of the difference visible regions as a heat trace region.
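The labeling-and-overlap procedure can be sketched as follows. The 4-connected BFS labeling and the intersection-over-union match rate are assumed stand-ins for whichever labeling and similarity measures an implementation would actually use:

```python
import numpy as np
from collections import deque

def label_regions(mask):
    """4-connected component labeling of a binary mask; returns a
    list of pixel-coordinate sets, one set per connected region."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    regions = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                region, queue = set(), deque([(y, x)])
                seen[y, x] = True
                while queue:
                    cy, cx = queue.popleft()
                    region.add((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                regions.append(region)
    return regions

def heat_trace_regions(thermal_mask, visible_mask, match_threshold=0.5):
    """Keep each differential thermal region whose overlap (here:
    intersection-over-union, an assumed metric) with every difference
    visible region is below the threshold, i.e. dissimilar to all."""
    visible_regions = label_regions(visible_mask)
    return [t for t in label_regions(thermal_mask)
            if all(len(t & v) / len(t | v) < match_threshold
                   for v in visible_regions)]

# The arm appears in both masks; one extra warm pixel only thermally.
thermal_mask = np.array([[1, 1, 0, 0],
                         [0, 0, 0, 1]], dtype=bool)
visible_mask = np.array([[1, 1, 0, 0],
                         [0, 0, 0, 0]], dtype=bool)
print(heat_trace_regions(thermal_mask, visible_mask))  # [{(1, 3)}]
```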
- the heat trace region extraction unit 17 transmits information indicating the heat trace region and the background visible image to the heat trace region output unit 18. At this time, the heat trace region extraction unit 17 may generate a binary image in which the heat trace region is white and the rest is black, and transmit it to the heat trace region output unit 18 as the information indicating the heat trace region.
- region similarity determination is actively studied in pattern matching research and the like, and the method used here is not limited to any particular one.
- the heat trace region output unit 18 outputs the information indicating the heat trace region in such a way that the user can confirm it (S109). Here, the user is the person to whom the information indicating the heat trace region is notified.
- the heat trace region output unit 18 may output an image obtained by synthesizing white pixels of a binary image, which is an example of information indicating a heat trace region, on a background visible image.
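Synthesizing the white pixels of the binary heat-trace image onto the background visible image can be as simple as a masked assignment (an illustrative sketch; array shapes and the highlight color are assumptions):

```python
import numpy as np

def overlay_heat_trace(background_rgb, trace_mask, color=(255, 255, 255)):
    """Paint heat-trace pixels onto a copy of the background visible
    image in a highlight color (the white pixels of the binary image)."""
    out = background_rgb.copy()
    out[trace_mask] = color
    return out

bg = np.zeros((2, 2, 3), dtype=np.uint8)     # stand-in background image
mask = np.array([[False, True],
                 [False, False]])            # one heat-trace pixel
print(overlay_heat_trace(bg, mask)[0, 1])    # [255 255 255]
```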
- the output form is not limited to any particular form. For example, display on a display device, storage in the auxiliary storage device 102, transmission to a user terminal via a network, and the like may be performed.
- step S109 steps S105 and subsequent steps are repeated.
- step S109 may be executed after steps S105 to S108 are repeated a plurality of times. In this case, the heat trace regions extracted in the plurality of times can be collectively output.
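Collecting the heat trace regions extracted over several repetitions into one output can be done by taking the union of the per-iteration masks (a sketch under the assumption that each iteration yields a binary mask):

```python
import numpy as np

def accumulate_traces(masks):
    """Union of the binary heat-trace masks extracted over several
    repetitions of steps S105 to S108, for collective output in S109."""
    combined = np.zeros_like(masks[0], dtype=bool)
    for m in masks:
        combined |= m
    return combined

runs = [np.array([[True, False]]),   # trace found in iteration 1
        np.array([[False, True]])]   # trace found in iteration 2
print(accumulate_traces(runs))  # [[ True  True]]
```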
- FIG. 6 is a schematic diagram showing an example of an image output by the heat trace region output unit 18.
- the portion touched by a human hand is painted black (black is merely a color chosen for convenience; a different color such as white may be used in practice).
- the user can recognize the portion as a heat trace area.
- the heat trace region output unit 18 may also use a projector or the like to project the binary image showing the heat trace region onto the photographing range in the environment, after appropriate alignment.
- in that case, the image of the heat trace region is projected onto the heat trace portions in the environment, so the places touched by a person can be conveyed directly to each user present in the environment.
- the user can know the heat trace area, and thus can prompt the user to take an action such as avoiding touching the heat trace area.
- steps S101 to S103 may be executed in parallel with steps S105 and subsequent steps.
- the background visible image and the background thermal image are updated periodically. Therefore, it can be expected that the resistance to the change of the background with the passage of time will be improved.
- the first embodiment can also be applied to a moving object as long as its position in the image can be specified. For example, if QR codes (registered trademark) for identifying position are attached to the four corners of the seat surface of a chair, and the positions of the QR codes can be used as clues to estimate the position of the seat surface, then even if the chair moves, the heat traces left on the seat surface can be estimated and displayed at the seat surface in the image.
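Estimating seat-surface coordinates from the four detected QR-code corners amounts to fitting a homography. The sketch below uses a standard direct linear transform, with the corner coordinates invented for illustration (QR-code detection itself is out of scope here):

```python
import numpy as np

def homography_from_points(src, dst):
    """Direct linear transform: solve for H (with h33 fixed to 1)
    such that each dst point ~ H @ src point, from 4 correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(H, pt):
    """Apply homography H to an image point, returning seat coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Invented positions of the four seat QR codes in the image, mapped
# to a canonical 100x100 seat-surface coordinate frame:
corners_img = [(12, 34), (112, 30), (118, 128), (10, 130)]
seat_frame = [(0, 0), (100, 0), (100, 100), (0, 100)]
H = homography_from_points(corners_img, seat_frame)
print([round(c) for c in map_point(H, (118, 128))])  # [100, 100]
```

With H in hand, any heat-trace pixel detected inside the quadrilateral can be mapped into the seat's own coordinate frame, so the trace stays attached to the seat even when the chair moves.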
- a virus such as a novel coronavirus, or a bacterium (hereinafter referred to as a "virus" for convenience), may be attached to a heat trace region.
- the difference visible image is an example of the first difference image.
- the differential thermal image is an example of the second differential image.
- the heat trace region extraction unit 17 is an example of the extraction unit.
- the heat trace region extraction system 1 of the modified example of the first embodiment includes a heat trace region extraction device 10, a physical camera 210, and a thermal camera 22.
- the heat trace region extraction device 10 of the modified example of the first embodiment includes, for example, a physical image acquisition unit 110, a background physical image generation unit 120, a differential physical image generation unit 130, a thermal image acquisition unit 14, a background thermal image generation unit 15, a differential thermal image generation unit 16, a heat trace region extraction unit 17, and a heat trace region output unit 18.
- the physical camera 210 and the thermal camera 22 are connected to the heat trace region extraction device 10 of the modification of the first embodiment, and the images taken by the physical camera 210 and the thermal camera 22 are input to the heat trace region extraction device 10.
- the heat trace region extraction method of the modified example of the first embodiment is realized by each part of the heat trace region extraction device executing the processes of steps S101 to S109 shown in FIG. 8 and described below.
- description of the parts that are the same as in the first embodiment will be omitted as appropriate, and the parts that differ from the first embodiment will be mainly described.
- the physical camera 210 is a camera for photographing a physical object. It is the same as the visible light camera 21 of the first embodiment except that the wavelength band of the electromagnetic wave obtained by the physical camera 210 is not limited to the wavelength band of the visible light.
- the physical camera 210 captures a physical image of a certain shooting range, more specifically, an image composed of the physical objects existing in the shooting range as viewed from the physical camera 210 side.
- the real object image taken by the real object camera 210 is input to the real object image acquisition unit 110.
- the wavelength band of the electromagnetic wave captured by the physical camera 210 may be any wavelength band in which the physical object can be photographed.
- a specific example of the wavelength band of the electromagnetic wave captured by the physical camera 210 will be described in the fourth embodiment, but the wavelength band captured by the physical camera 210 needs to be a wavelength band in which electromagnetic waves exist in the shooting range, for example due to illumination or the like.
- alternatively, the heat trace region extraction system 1 of the modified example of the first embodiment may be provided with an irradiator 41, and the irradiator 41 may be used to irradiate the imaging range with electromagnetic waves in the wavelength band captured by the physical camera 210.
- a physical object is something that physically exists. Examples of physical objects existing in the imaging range are an object existing as the background of the imaging range, a person in the imaging range, and a person entering the imaging range.
- the physical image acquisition unit 110 is the same as the visible image acquisition unit 11 of the first embodiment, except that it acquires and outputs a physical image instead of a visible image. That is, the physical image acquisition unit 110 acquires the physical image taken by the physical camera 210 (S101) and outputs the acquired physical image. The physical image acquired by the physical image acquisition unit 110 is input to the background physical image generation unit 120 and/or the differential physical image generation unit 130.
- the background physical image generation unit 120 is the same as the background visible image generation unit 12 of the first embodiment, except that it processes physical images instead of visible images and generates and outputs a background physical image instead of a background visible image. That is, the background physical image generation unit 120 generates a background physical image, which is the physical image of the background of the shooting range, based on the physical images acquired by the physical image acquisition unit 110 (S104), and outputs the generated background physical image.
- the physical image of the background of the shooting range is an image of the objects existing as the background of the shooting range, more specifically, an image composed of those objects as viewed from the physical camera 210 side.
- the generated background substance image is input to the difference substance image generation unit 130.
- the differential physical image generation unit 130 is the same as the differential visible image generation unit 13 of the first embodiment, except that it processes physical images instead of visible images, processes the background physical image instead of the background visible image, and generates a differential physical image instead of a differential visible image. That is, the differential physical image generation unit 130 generates a differential physical image, which is an image of the difference between the physical image at a certain time acquired by the physical image acquisition unit 110 and the background physical image input from the background physical image generation unit 120 (S107), and outputs the generated differential physical image at that time.
- in other words, the differential physical image generation unit 130 generates a differential physical image, which is an image of the difference between a physical image obtained by photographing a certain range with the physical camera 210 for photographing physical objects and a background physical image, which is the physical image of the background of that range.
- the generated difference substance image is input to the heat trace region extraction unit 17.
- the difference physical image generation unit 130 may operate for the physical image at the time to be processed by the heat trace region extraction unit 17, which will be described later.
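As a rough illustration of the differential image generation described above, a per-pixel background subtraction might look like the following sketch; the function name `difference_image` and the threshold value are assumptions for illustration, since the disclosure only refers to "a certain threshold":

```python
import numpy as np

def difference_image(image, background, threshold=10):
    # Pixels differing from the background by at least `threshold`
    # become 1; everything else 0 (an assumed binary representation).
    diff = np.abs(image.astype(np.int32) - background.astype(np.int32))
    return (diff >= threshold).astype(np.uint8)

background = np.array([[50, 50], [50, 50]], dtype=np.uint8)
image = np.array([[50, 65], [52, 200]], dtype=np.uint8)  # e.g. a hand enters
mask = difference_image(image, background)
```

The same subtraction applies equally to the physical image and, later, to the thermal image; only the inputs differ.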
- the thermal camera 22 is the same as the thermal camera 22 of the first embodiment. That is, the thermal camera 22 is a camera for photographing the heat generated by the substance.
- the thermal camera 22 captures a thermal image of the heat generated by physical objects in the same shooting range as the physical camera 210, more specifically, an image composed of the heat radiated by the physical objects existing in the shooting range as observed from the thermal camera 22 side.
- the thermal image taken by the thermal camera 22 is input to the thermal image acquisition unit 14.
- the thermal image acquisition unit 14 is the same as the thermal image acquisition unit 14 of the first embodiment. That is, the thermal image acquisition unit 14 acquires the thermal image taken by the thermal camera 22 (S101) and outputs the acquired thermal image.
- the thermal image acquired by the thermal image acquisition unit 14 is input to the background thermal image generation unit 15 and / or the differential thermal image generation unit 16.
- the background thermal image generation unit 15 is the same as the background thermal image generation unit 15 of the first embodiment. That is, the background thermal image generation unit 15 generates a background thermal image, which is the thermal image of the background of the shooting range, based on the thermal images acquired by the thermal image acquisition unit 14 (S104), and outputs the generated background thermal image.
- the thermal image of the background of the shooting range is an image of the heat generated by the objects existing as the background of the shooting range, more specifically, an image composed of that heat as observed from the thermal camera 22 side.
- the generated background thermal image is input to the differential thermal image generation unit 16.
- the differential thermal image generation unit 16 is the same as the differential thermal image generation unit 16 of the first embodiment. That is, the differential thermal image generation unit 16 generates a differential thermal image, which is an image of the difference between the thermal image at a certain time acquired by the thermal image acquisition unit 14 and the background thermal image input from the background thermal image generation unit 15 (S107), and outputs the generated differential thermal image at that time.
- in other words, the differential thermal image generation unit 16 generates a differential thermal image, which is an image of the difference between a thermal image of the heat generated by physical objects, obtained by photographing the same range as the physical camera 210 with the thermal camera 22 for photographing that heat, and a background thermal image, which is the thermal image of the background of that range.
- the generated differential thermal image is input to the heat trace region extraction unit 17.
- the differential thermal image generation unit 16 may operate on the thermal image at the time to be processed by the heat trace region extraction unit 17, which will be described later.
- the heat trace region extraction unit 17 is the same as the heat trace region extraction unit 17 of the first embodiment, except that it uses the differential physical image instead of the differential visible image and the background physical image instead of the background visible image. That is, the heat trace region extraction unit 17 extracts the heat trace region by removing the regions of physical objects from the differential thermal image, based on the differential physical image at a certain time generated by the differential physical image generation unit 130 and the differential thermal image at that time generated by the differential thermal image generation unit 16 (S108), and outputs information indicating the extracted heat trace region at that time.
- specifically, the heat trace region extraction unit 17 sets each region of the differential physical image that differs from the background physical image as a differential physical region, sets each region of the differential thermal image that differs from the background thermal image as a differential thermal region, and extracts, as a heat trace region, each differential thermal region that is dissimilar to all of the one or more differential physical regions.
- the information indicating the extracted heat trace area is input to the heat trace area output unit 18.
- the heat trace region extraction unit 17 generates, as the information indicating the heat trace region, for example a binary image in which the heat trace region portion is white and the rest is black, and outputs the generated binary image to the heat trace region output unit 18.
- the heat trace region extraction unit 17 may also output the background substance image to the heat trace region output unit 18.
- the heat trace region extraction unit 17 may operate on the difference substance image and the difference thermal image at the time to be processed by the heat trace region extraction unit 17.
- the time to be processed by the heat trace area extraction unit 17 may be each time at predetermined intervals, or may be a time designated by an operator or the like of the heat trace area extraction device 10.
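The extraction step above (removing the regions of physical objects from the warm regions) can be sketched at pixel level as follows. This deliberately simplifies the region-level comparison in the text to a per-pixel mask operation, and all names are illustrative:

```python
import numpy as np

def extract_heat_trace(diff_thermal, diff_physical):
    # Keep the warm pixels that are NOT explained by a physical object
    # still present in the scene. A pixel-level simplification of the
    # region-level matching described in the text.
    return np.logical_and(diff_thermal == 1, diff_physical == 0).astype(np.uint8)

# The hand is still present in column 0 (physical AND warm); column 1
# is warm but the hand has left, so only column 1 is a heat trace.
diff_physical = np.array([[1, 0], [1, 0]], dtype=np.uint8)
diff_thermal  = np.array([[1, 1], [1, 1]], dtype=np.uint8)
trace = extract_heat_trace(diff_thermal, diff_physical)
```

A region-level implementation would instead label connected components in each mask and discard every differential thermal region similar to some differential physical region, but the subtraction idea is the same.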
- the heat trace region output unit 18 is the same as the heat trace region output unit 18 of the first embodiment, except for a portion where a background substance image is used instead of the background visible image.
- the heat trace region output unit 18 outputs information indicating the heat trace region so that the user can confirm it (S109).
- the heat trace region output unit 18 outputs an image obtained by synthesizing white pixels of a binary image, which is an example of information indicating a heat trace region, on a background physical image.
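The compositing described above might be sketched as follows, painting the mask pixels onto a copy of the background image; the helper name and the use of white are assumptions matching the binary-image example, not a prescribed implementation:

```python
import numpy as np

def overlay_heat_trace(background_rgb, mask, color=(255, 255, 255)):
    # Copy the background image and paint the mask == 1 pixels in
    # `color` (white here, as in the binary image described above).
    out = background_rgb.copy()
    out[mask == 1] = color
    return out

bg_img = np.zeros((2, 2, 3), dtype=np.uint8)        # toy background image
mask = np.array([[1, 0], [0, 0]], dtype=np.uint8)   # one heat trace pixel
composite = overlay_heat_trace(bg_img, mask)
```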
- the heat trace region output unit 18 projects a binary image showing the heat trace region with respect to the photographing range in the environment after appropriately aligning the position using a projector or the like.
- the heat trace region output unit 18 may start outputting the information indicating the heat trace region when, for example, that information is input. Further, the heat trace region output unit 18 may, for example, continue the output for a predetermined time and end it after that time has elapsed, or may continue the output until the operator of the heat trace region extraction device 10 or the like instructs it to end, and end the output in accordance with that instruction.
- the heat trace region extraction devices and methods of the first embodiment and the modified example of the first embodiment can thus inform the user of regions whose temperature has risen due to human touch, that is, regions to which a virus or the like may be attached.
- however, among the regions whose temperature has risen due to human touch, areas that have been disinfected with alcohol or the like after being touched are areas where the possibility of being infected with a virus or the like is low even if a person touches them.
- the heat trace region extraction device and method of the second embodiment extract such regions, which are unlikely to cause infection with a virus or the like even if a person touches them.
- the heat trace region extraction system 1 of the second embodiment includes a heat trace region extraction device 10, a physical camera 210, and a thermal camera 22.
- the heat trace region extraction device 10 of the second embodiment has a real object image acquisition unit 110, a background real object image generation unit 120, a difference real object image generation unit 130, a thermal image acquisition unit 14, and a background. It includes, for example, a thermal image generation unit 15, a differential thermal image generation unit 16, a differential cold image generation unit 161, a heat trace region extraction unit 17, a cold trace region extraction unit 171 and an information output unit 181.
- the physical camera 210 and the thermal camera 22 are connected to the heat trace region extraction device 10 of the second embodiment, and the images taken by the physical camera 210 and the thermal camera 22 are input to the heat trace region extraction device 10.
- the heat trace region extraction method of the second embodiment is realized by each part of the heat trace region extraction device executing the processes of steps S101 to S1091 shown in FIG. 10 and described below.
- a part different from the modified example of the first embodiment will be mainly described. Duplicate explanations will be omitted as appropriate for the parts similar to the modified examples of the first embodiment.
- the physical camera 210 is the same as the physical camera 210 of the modified example of the first embodiment.
- the physical camera 210 captures an image of a real object in a certain shooting range.
- the real object image taken by the real object camera 210 is input to the real object image acquisition unit 110.
- the physical image acquisition unit 110 is the same as the physical image acquisition unit 110 of the modified example of the first embodiment. That is, the substance image acquisition unit 110 acquires the substance image taken by the substance camera 210 (S101) and outputs the acquired substance image. The substance image acquired by the substance image acquisition unit 110 is input to the background substance image generation unit 120 and / or the difference substance image generation unit 130.
- the background substance image generation unit 120 is the same as the background substance image generation unit 120 of the modification of the first embodiment. That is, the background substance image generation unit 120 generates a background substance image which is a background substance image in the shooting range based on the substance image acquired by the substance image acquisition unit 110 (S104). Output the generated background entity image. The generated background substance image is input to the difference substance image generation unit 130.
- the difference substance image generation unit 130 is the same as the difference substance image generation unit 130 of the modification of the first embodiment. That is, the difference real image generation unit 130 is the difference between the real image at a certain time acquired by the real image acquisition unit 110 and the background real image input from the background real image generation unit 120. The difference substance image which is the image of the above is generated (S107), and the generated difference substance image at a certain time is output.
- in other words, the differential physical image generation unit 130 generates a differential physical image, which is an image of the difference between a physical image obtained by photographing a certain range with the physical camera for photographing physical objects and a background physical image, which is the physical image of the background of that range.
- the difference substance image generated by the difference substance image generation unit 130 is input not only to the heat trace region extraction unit 17 but also to the cold trace region extraction unit 171.
- the differential physical image generation unit 130 need only operate on the physical images at the times to be processed by the heat trace region extraction unit 17 described later and at the times to be processed by the cold trace region extraction unit 171 described later.
- the thermal camera 22 is the same as the thermal camera 22 of the modified example of the first embodiment. That is, the thermal camera 22 is a camera for photographing the heat generated by the substance. The thermal camera 22 captures an image of the heat generated by the entity in the same imaging range as the entity camera 210. The thermal image taken by the thermal camera 22 is input to the thermal image acquisition unit 14.
- the thermal image acquisition unit 14 is the same as the thermal image acquisition unit 14 of the modified example of the first embodiment. That is, the thermal image acquisition unit 14 acquires the thermal image taken by the thermal camera 22 (S101) and outputs the acquired thermal image.
- the thermal image acquired by the thermal image acquisition unit 14 is input to the background thermal image generation unit 15 and / or the differential thermal image generation unit 16 and the differential cold image generation unit 161.
- the background thermal image generation unit 15 is the same as the background thermal image generation unit 15 of the modified example of the first embodiment. That is, the background thermal image generation unit 15 generates a background thermal image which is a thermal image of the background in the shooting range based on the thermal image acquired by the thermal image acquisition unit 14 (S104), and the generated background thermal image. Is output. The generated background thermal image is input to the differential thermal image generation unit 16 and the differential cold image generation unit 161.
- the differential thermal image generation unit 16 is the same as the differential thermal image generation unit 16 of the modified example of the first embodiment.
- the differential thermal image generation unit 16 has a region where the temperature is higher than the background thermal image input from the background thermal image generation unit 15 from the thermal image at a certain time acquired by the thermal image acquisition unit 14 (“high temperature region”). By extracting), a differential thermal image is generated (S107), and the generated differential thermal image at a certain time is output. That is, the differential thermal image generation unit 16 is an image in which a region of the difference between the thermal image and the background thermal image and the temperature in the thermal image is higher than the temperature in the background thermal image is included as a high temperature region. Generate a thermal image.
- for example, for each pixel, if the difference between the pixel value of the thermal image and the pixel value of the background thermal image is equal to or greater than a certain threshold value and the temperature indicated by the pixel value of the thermal image is higher than the temperature indicated by the pixel value of the background thermal image, the pixel value of the differential thermal image is set to 1; otherwise (that is, if the difference is less than the threshold value, or the temperature indicated by the pixel value of the thermal image is not higher than the temperature indicated by the pixel value of the background thermal image), the pixel value of the differential thermal image is set to 0.
- the differential thermal image generation unit 16 may operate on a thermal image at a time desired to be processed by the thermal trace region extraction unit 17, which will be described later.
- the differential cold image generation unit 161 is the same as the differential thermal image generation unit 16 of the modified example of the first embodiment, except that it generates, instead of a differential thermal image, a differential cold image, which is an image of the difference between the thermal image and the background thermal image and is an image of the regions whose temperature is lower than the background.
- the differential cold image generation unit 161 is a region where the temperature is lower than the background thermal image input from the background thermal image generation unit 15 from the thermal image at a certain time acquired by the thermal image acquisition unit 14 (“low temperature region”). By extracting), a differential cold image is generated (S107), and the generated differential cold image at a certain time is output.
- in other words, the differential cold image generation unit 161 generates a differential cold image, which is an image including, as low temperature regions, the regions of difference between the thermal image and the background thermal image in which the temperature in the thermal image is lower than the temperature in the background thermal image. For example, for each pixel, if the difference between the pixel value of the thermal image and the pixel value of the background thermal image is equal to or greater than a certain threshold value and the temperature indicated by the pixel value of the thermal image is lower than the temperature indicated by the pixel value of the background thermal image, the pixel value of the differential cold image is set to 1; otherwise (that is, if the difference is less than the threshold value, or the temperature indicated by the pixel value of the thermal image is not lower than the temperature indicated by the pixel value of the background thermal image), the pixel value of the differential cold image is set to 0, thereby generating the differential cold image.
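The per-pixel rules for the differential thermal image and the differential cold image differ only in the sign of the temperature difference, which can be sketched as follows (the threshold value and all names are assumed examples, since the text only refers to "a certain threshold"):

```python
import numpy as np

def split_thermal_difference(thermal, background, threshold=2.0):
    # Pixels at least `threshold` warmer than the background go into
    # the differential thermal (hot) image; pixels at least `threshold`
    # cooler go into the differential cold image.
    delta = thermal.astype(np.float64) - background.astype(np.float64)
    diff_hot = (delta >= threshold).astype(np.uint8)
    diff_cold = (delta <= -threshold).astype(np.uint8)
    return diff_hot, diff_cold

background = np.array([[20.0, 20.0, 20.0]])
thermal    = np.array([[33.0, 20.5, 15.0]])  # touched / unchanged / wiped
hot, cold = split_thermal_difference(thermal, background)
```

In this toy example the touched pixel appears only in the hot image and the alcohol-wiped pixel only in the cold image, mirroring the two generation units above.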
- the generated differential cold image is input to the cold trace region extraction unit 171.
- the differential cold image generation unit 161 may operate on a thermal image at a time desired to be processed by the cold trace region extraction unit 171 described later.
- the heat trace region extraction unit 17 is the same as the heat trace region extraction unit 17 of the modified example of the first embodiment. That is, the heat trace region extraction unit 17 extracts the heat trace region by removing the regions of physical objects from the high temperature regions included in the differential thermal image, based on the differential physical image generated by the differential physical image generation unit 130 and the differential thermal image generated by the differential thermal image generation unit 16 (S108), and outputs information indicating the extracted heat trace region. Specifically, the heat trace region extraction unit 17 sets each region of the differential physical image that differs from the background physical image as a differential physical region, sets each region of the differential thermal image that differs from the background thermal image as a differential thermal region, and extracts, as a heat trace region, each differential thermal region that is dissimilar to all of the one or more differential physical regions.
- Information indicating the extracted heat trace area is input to the information output unit 181.
- the heat trace region extracted by the heat trace region extraction unit 17 of the second embodiment is a region whose temperature has risen due to contact with a physical object, for example, a region of an object existing as the background of the photographing range that a person has touched.
- the heat trace region extraction unit 17 generates, as the information indicating the heat trace region, for example a binary image in which the heat trace region portion is white and the rest is black, and outputs the generated binary image to the information output unit 181.
- the heat trace region extraction unit 17 may operate on the difference substance image and the difference thermal image at the time to be processed by the heat trace region extraction unit 17.
- the time to be processed by the heat trace area extraction unit 17 may be each time at predetermined intervals, or may be a time designated by an operator or the like of the heat trace area extraction device 10.
- the cold trace region extraction unit 171 is the same as the heat trace region extraction unit 17 of the modified example of the first embodiment, except that it performs its processing using the differential cold image instead of the differential thermal image, and that it extracts not a heat trace region but a cold trace region, which is a region where the temperature of the heat generated by a physical object has dropped. That is, the cold trace region extraction unit 171 extracts the cold trace region by removing the regions of physical objects from the low temperature regions included in the differential cold image, based on the differential physical image generated by the differential physical image generation unit 130 and the differential cold image generated by the differential cold image generation unit 161 (S1081), and outputs information indicating the extracted cold trace region.
- specifically, the cold trace region extraction unit 171 sets each region of the differential physical image that differs from the background physical image as a differential physical region, sets each region of the differential cold image that differs from the background thermal image as a differential cold region, and extracts, as a cold trace region, each differential cold region that is dissimilar to all of the one or more differential physical regions.
- the information indicating the extracted cold trace area is input to the information output unit 181.
- the cold trace region extraction unit 171 generates, as the information indicating the cold trace region, for example a binary image in which the cold trace region portion is white and the rest is black, and outputs the generated binary image to the information output unit 181.
- the cold trace area extraction unit 171 may operate for the difference real image and the difference cold image at the time to be processed by the cold trace area extraction unit 171.
- the time to be processed by the cold trace area extraction unit 171 may be each time at predetermined intervals, or may be a time designated by an operator or the like of the heat trace area extraction device 10.
- the cold trace region extracted by the cold trace region extraction unit 171 is a region whose temperature has dropped due to contact with a physical object.
- when an area is wiped with alcohol or another disinfectant, the temperature of the wiped area generally decreases due to the heat of vaporization. The cold trace region extracted by the cold trace region extraction unit 171 can therefore be considered a region whose temperature has dropped due to disinfection with alcohol or the like, that is, a region where the possibility of being infected with a virus or the like is low even if a person touches it. In other words, the processing of the cold trace region extraction unit 171 can extract regions that are unlikely to cause infection with a virus or the like even if a person touches them.
- note that an area such as a bottle containing the disinfectant used for disinfection is also an area where the temperature in the thermal image is lower than the temperature in the background thermal image, and such an area may therefore also appear in the differential cold image obtained by the differential cold image generation unit 161.
- for example, suppose the background physical image is the image of FIG. 11(a), the physical image is the image of FIG. 11(b), the differential physical image is the image of FIG. 11(c), and the background thermal image is as shown in FIG. 11. In this case, the differential cold image obtained by the differential cold image generation unit 161 may be the image of FIG. 11(f).
- the information output unit 181 uses at least the information indicating the cold trace area, and outputs at least the information indicating the cold trace area so that the user can confirm it (S1091).
- the information output unit 181 operates as in the first and second examples below, for example.
- the information output unit 181 of the first example uses the information indicating the heat trace region and the information indicating the cold trace region, and outputs at least one of them so that the user can confirm it. More specifically, the information output unit 181 of the first example continues to output the information indicating a heat trace region once it has been extracted by the heat trace region extraction unit 17 so that the user can confirm it, but determines whether a cold trace region at a time after the time corresponding to that heat trace region overlaps with the heat trace region, and stops outputting all or part of the information indicating the heat trace region according to the determination result.
- that is, the information output unit 181 of the first example starts outputting the information indicating the heat trace region when the heat trace region is extracted by the heat trace region extraction unit 17, and when a cold trace region is subsequently extracted by the cold trace region extraction unit 171, determines whether or not the heat trace region and the cold trace region overlap; if it determines that they overlap, it ends all or part of the output of the information indicating the heat trace region.
- an example of processing of the information output unit 181 of the first example will be described.
- the information output unit 181 outputs the information indicating the heat trace region so that the user can check it, in the same manner as the heat trace region output unit 18 of the first embodiment.
- for example, the information output unit 181 appropriately aligns a binary image, which is information indicating whether each portion is a heat trace region as shown in FIG. 12(a), and then projects it onto the shooting range in the environment, thereby displaying the information indicating the heat trace region so that the user can confirm it.
- In FIG. 12A, the information indicating the heat trace region is a binary image: the region filled with the dot pattern is the heat trace region, and the region filled with white is not the heat trace region.
- the cold trace region is extracted by the cold trace region extraction unit 171.
- An example of this cold trace region is shown by the broken line in FIG. 12(b); the broken line indicates the edge of the cold trace region, so the area surrounded by the broken line is the cold trace region.
- In this determination, a matching rate based on the number of pixels in which the heat trace region and the cold trace region overlap may be used, for example.
- Then, the information output unit 181 stops outputting information indicating the heat trace region for the portion of the heat trace region that overlaps with the cold trace region. For example, when the heat trace region and the cold trace region are in the state illustrated in FIG. 12(b), the binary image illustrated in FIG. 12(d) is projected onto the shooting range in the environment. In other words, the information output unit 181 may display only the region obtained by excluding, from the heat trace region, the portion where it overlaps with the cold trace region. In this way, the information output unit 181 may stop the output of part of the information indicating the heat trace region according to the result of determining whether or not the heat trace region overlaps with the cold trace region.
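The overlap determination and partial output stop described above can be sketched as follows. This is a minimal illustration, not part of the embodiment itself: it assumes the heat trace region and cold trace region are given as binary numpy masks, and computes both a matching rate (overlapping pixels relative to the heat trace region) and the remaining mask to project.

```python
import numpy as np

def overlap_stats(heat_mask, cold_mask):
    """Return the matching rate (overlapping pixels relative to the heat
    trace region) and the heat trace mask with the overlap removed."""
    heat = heat_mask.astype(bool)
    cold = cold_mask.astype(bool)
    overlap = np.logical_and(heat, cold)
    rate = overlap.sum() / max(heat.sum(), 1)
    remaining = np.logical_and(heat, np.logical_not(cold))
    return rate, remaining

# Hypothetical 3x3 masks: the heat trace occupies a 2x2 block and the
# cold trace (disinfected area) covers its right half.
heat = np.array([[1, 1, 0],
                 [1, 1, 0],
                 [0, 0, 0]])
cold = np.array([[0, 1, 1],
                 [0, 1, 1],
                 [0, 0, 0]])
rate, remaining = overlap_stats(heat, cold)
```

Here `remaining` corresponds to the binary image of FIG. 12(d): only the heat trace pixels not covered by the cold trace region are still output.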
- the heat trace region is a region where the temperature rises when touched by a person, and is considered to be a region where a virus or the like may be attached.
- On the other hand, the cold trace region is a region whose temperature has been lowered by disinfection with alcohol or the like, and can be considered a region that is unlikely to transmit a virus or the like even if touched by a human.
- Therefore, the information indicating a heat trace region, once extracted, is continuously output so that the user can confirm it, but output is stopped for the portion that overlaps with a cold trace region extracted later.
- In a second example, the information output unit 181 uses the information indicating the cold trace region and outputs it so that the user can confirm it. More specifically, the information output unit 181 of the second example outputs only the information indicating the cold trace region extracted by the cold trace region extraction unit 171 so that the user can confirm it. For example, the information output unit 181 outputs an image obtained by synthesizing the white pixels of a binary image, which is an example of information indicating a cold trace region, onto a background substance image. Further, for example, the information output unit 181 appropriately aligns a binary image showing the cold trace region and projects it onto the shooting range in the environment using a projector or the like.
- the information output unit 181 may start outputting the information indicating the cold trace area when, for example, the information indicating the cold trace area is input.
- In the second example, the heat trace region extraction device 10 does not need to obtain information indicating the heat trace region. Therefore, the heat trace region extraction device 10 may omit the differential thermal image generation unit 16 and the heat trace region extraction unit 17; a heat trace region extraction device 10 that omits them can be said to be a cold trace region extraction device.
- With the heat trace region extraction device and method of the first embodiment, it is possible to inform the user of a region where the temperature has risen due to human touch, that is, a region where a virus or the like may be attached.
- However, even a region where the temperature has risen due to human contact is unlikely to have a virus or the like attached if the body temperature of the person who touched it is low, because such a person is unlikely to be infected with a virus or the like. Conversely, a region where the temperature has risen due to contact by a person with a high body temperature is likely to have a virus or the like attached, because that person is likely to be infected with a virus or the like. For these reasons, in order to more appropriately convey to the user the regions where a virus or the like may be attached, it is desirable to identify not only the heat trace region, which is the region where the temperature has risen due to human touch, but also the person and body temperature that caused the heat trace corresponding to the heat trace region.
- Therefore, the heat trace region extraction device and method of the third embodiment identify the temperature of the cause of the heat trace corresponding to the heat trace region.
- the heat trace region extraction system 1 of the third embodiment includes a heat trace region extraction device 10, a physical camera 210, and a thermal camera 22.
- the heat trace region extraction device 10 of the third embodiment includes, for example, a real object image acquisition unit 110, a background real object image generation unit 120, a differential real object image generation unit 130, a thermal image acquisition unit 14, a background thermal image generation unit 15, a differential thermal image generation unit 16, a heat trace region extraction unit 17, a heat trace cause identification unit 31, and an information output unit 181.
- the physical camera 210 and the thermal camera 22 are connected to the heat trace region extraction device 10 of the third embodiment, and the images taken by the physical camera 210 and the thermal camera 22 are input to the heat trace region extraction device 10.
- the heat trace region extraction method of the third embodiment is realized by each part of the heat trace region extraction device executing the processes of steps S101 to S109 shown in FIG. 14 and below.
- In the following, parts that differ from the modified example of the first embodiment will be mainly described; overlapping description is omitted for parts that are the same as the modified example of the first embodiment.
- The physical camera 210, real object image acquisition unit 110, background real object image generation unit 120, differential real object image generation unit 130, thermal camera 22, thermal image acquisition unit 14, background thermal image generation unit 15, and differential thermal image generation unit 16 are the same as those of the modified example of the first embodiment. However, the differential thermal image generated by the differential thermal image generation unit 16 and the thermal image acquired by the thermal image acquisition unit 14 are further input to the heat trace cause identification unit 31.
- The heat trace region extraction unit 17 generates and outputs, for each differential heat region, a differential heat region image showing only that one differential heat region as the difference, and, for each heat trace region, a heat trace region image showing only that one heat trace region as the difference, and also outputs information on the time corresponding to each heat trace region image.
- the heat trace area extraction unit 17 assigns a file name to each image and also outputs the file name.
- the differential heat region image, the heat trace region image, the time information corresponding to the heat trace region image, and the file name of each image obtained by the heat trace region extraction unit 17 are input to the heat trace cause identification unit 31.
- When one differential heat region is obtained, the heat trace region extraction unit 17 generates, as a differential heat region image, an image showing that one differential heat region as the difference. Further, when p is a positive integer of 2 or more and the heat trace region extraction unit 17 obtains p differential heat regions separated from each other, it generates, as differential heat region images, p images each showing only one of the p differential heat regions as the difference.
- The heat trace region extraction unit 17 generates, as a differential heat region image, for example, a binary image in which the pixel value of each pixel in the differential heat region is 1 and the pixel value of each pixel outside the differential heat region is 0.
- Likewise, when one heat trace region is obtained, the heat trace region extraction unit 17 generates, as a heat trace region image, an image showing that one heat trace region as the difference. Further, when q is a positive integer of 2 or more and q heat trace regions separated from each other are obtained, the heat trace region extraction unit 17 generates, as heat trace region images, q images each showing only one of the q heat trace regions as the difference.
- The heat trace region extraction unit 17 generates, as a heat trace region image, for example, a binary image in which the pixel value of each pixel in the heat trace region is 1 and the pixel value of each pixel outside the heat trace region is 0.
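The splitting of separated regions into one binary image each can be sketched as follows. This is an illustrative sketch only (the embodiment does not prescribe an algorithm): it assumes the combined regions are given as a binary numpy mask and uses a simple 4-connected flood fill.

```python
import numpy as np

def split_regions(mask):
    """Split a binary mask into one binary image per 4-connected region,
    as the heat trace region extraction unit 17 does for p separated
    differential heat regions or q separated heat trace regions."""
    mask = mask.astype(bool)
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    images = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                region = np.zeros((h, w), dtype=np.uint8)
                stack = [(y, x)]
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    region[cy, cx] = 1  # pixel value 1 inside the region, 0 outside
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                images.append(region)
    return images

mask = np.array([[1, 0, 0, 1],
                 [1, 0, 0, 1],
                 [0, 0, 0, 0]])
images = split_regions(mask)  # two separated regions -> two binary images
```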
- the heat trace region image and the time information corresponding to the heat trace region image obtained by the heat trace region extraction unit 17 are used to create a heat trace table.
- the heat trace table is stored in the auxiliary storage device 102.
- An example of a heat trace table is shown in FIG.
- In the heat trace table, a record is kept for each heat trace region at each time extracted by the heat trace region extraction unit 17; each record includes the time corresponding to the heat trace region image, the file name of the heat trace region image, and the group ID to which the heat trace region belongs.
- the group ID to which each heat trace region belongs in the heat trace table is generated by the heat trace cause identification unit 31 described later.
- The heat trace region at the time "2021-08-10 10:50:48.000" in FIG. 15 is the heat trace region extracted by the process of the heat trace region extraction unit 17 corresponding to the latest time "2021-08-10 10:50:48.000", and has not yet been processed by the heat trace cause identification unit 31. Therefore, no group ID has been assigned to the record of the heat trace region at the time "2021-08-10 10:50:48.000" in FIG. 15.
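The structure of such a heat trace table can be sketched as below. The field names, file names, and ID values are hypothetical, chosen only to mirror the columns described above (time, file name of the heat trace region image, and group ID, which is empty until the cause identification unit processes the record).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HeatTraceRecord:
    time: str                # e.g. "2021-08-10 10:50:48.000"
    file_name: str           # file name of the heat trace region image
    group_id: Optional[int]  # None until the cause identification unit assigns one

heat_trace_table = [
    HeatTraceRecord("2021-08-10 10:50:43.000", "trace_0001.png", 27),
    HeatTraceRecord("2021-08-10 10:50:48.000", "trace_0002.png", None),
]

# Records not yet processed by the heat trace cause identification unit 31
unassigned = [r for r in heat_trace_table if r.group_id is None]
```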
- The heat trace cause identification unit 31 identifies the cause of the corresponding heat trace for each heat trace region extracted by the process corresponding to the latest time in the heat trace region extraction unit 17, and assigns a group ID in the course of this process.
- Specifically, the heat trace cause identification unit 31 performs a first specific process that identifies, among the heat trace regions extracted by the process corresponding to the latest time, a heat trace region due to a cause different from the causes of the heat trace regions extracted by the processes corresponding to past times; a second specific process that identifies the cause region, which is the region that caused the heat trace of the heat trace region identified by the first specific process, using the differential thermal images extracted by the processes corresponding to past times; and a third specific process that identifies the temperature of the cause region identified by the second specific process from the thermal image corresponding to the differential thermal image used to identify the cause region (S310).
- the temperature of the cause region specified in the third specific process (that is, the temperature of the cause region of the heat trace corresponding to each heat trace region) is input to the information output unit 181.
- The second specific process performed by the heat trace cause identification unit 31 goes back through the differential thermal images at times (past times) before the differential thermal image corresponding to the heat trace region identified by the first specific process. Among the differential thermal images that satisfy the condition of including, as a differential thermal region, the region corresponding to the heat trace region identified by the first specific process, it identifies a differential thermal image that includes a differential thermal region in which a region different from the region corresponding to that heat trace region is connected to the region corresponding to that heat trace region, and identifies that different region in the identified differential thermal image as the cause region that caused the heat trace corresponding to the heat trace region identified by the first specific process.
- the heat trace region extracted by the heat trace region extraction unit 17 in the process corresponding to the latest time is referred to as a new heat trace region.
- the heat trace region extracted by the process corresponding to the time before the latest time is also referred to as an old heat trace region.
- the heat trace cause identification unit 31 performs the following processing each time the heat trace region is extracted by the heat trace region extraction unit 17 in the processing corresponding to the latest time. That is, the heat trace cause identification unit 31 performs the following processing on each new heat trace region.
- The heat trace cause identification unit 31 determines the group to which the new heat trace region belongs by performing the processes of steps S3101 to S3106 shown in FIG. 16 and described below.
- First, the heat trace cause identification unit 31 selects the newest of the old heat trace regions that have not yet been selected (S3101), and proceeds to the process of step S3102.
- That is, the heat trace cause identification unit 31 selects the old heat trace region whose group ID recorded in the heat trace table differs from the group IDs of the old heat trace regions selected so far and whose time recorded in the heat trace table is the latest.
- the old heat trace region selected by the heat trace cause specifying unit 31 is abbreviated as “selected old heat trace region”.
- the heat trace cause identification unit 31 determines whether or not the difference between the time corresponding to the new heat trace region and the time corresponding to the selected old heat trace region is equal to or less than a predetermined time (S3102).
- The heat trace cause identification unit 31 may, for example, determine whether the difference between the time included in the record of the new heat trace region recorded in the heat trace table and the time included in the record of the selected old heat trace region recorded in the heat trace table is equal to or less than the predetermined time.
- the predetermined time is longer than the average time during which the temperature of the heat trace region drops to the same as the background temperature. For example, the predetermined time is 10 seconds.
- When it is determined in step S3102 that the difference between the time corresponding to the new heat trace region and the time corresponding to the selected old heat trace region is equal to or less than the predetermined time, the heat trace cause identification unit 31 proceeds to step S3103; when it is determined that the difference is not equal to or less than the predetermined time, the process proceeds to step S3106. Step S3106, the process performed in the latter case, will be described later.
- In step S3103, the heat trace cause identification unit 31 determines whether or not the size of the part of the new heat trace region that is not included in the selected old heat trace region is equal to or smaller than a predetermined size (S3103).
- The heat trace cause identification unit 31 may, for example, make the determination in step S3103 using the heat trace region image of the file specified by the file name of the new heat trace region recorded in the heat trace table and the heat trace region image of the file specified by the file name of the selected old heat trace region recorded in the heat trace table.
- When it is determined in step S3103 that the size of the part of the new heat trace region not included in the selected old heat trace region is not equal to or smaller than the predetermined size, the process returns to step S3101. When it is determined that the size is equal to or smaller than the predetermined size, the process proceeds to step S3104.
- In step S3104, the heat trace cause identification unit 31 determines whether the temperature corresponding to the new heat trace region is lower than the temperature corresponding to the selected old heat trace region (S3104).
- the temperature corresponding to the new heat trace region can be obtained from the thermal image corresponding to the new heat trace region.
- For example, the heat trace cause identification unit 31 calculates the average of the temperatures indicated by the pixels included in the region corresponding to the new heat trace region in the thermal image corresponding to the heat trace region image of the new heat trace region, and takes this average as the temperature corresponding to the new heat trace region.
- the temperature corresponding to the selected old heat trace region can be obtained from the thermal image corresponding to the selected old heat trace region.
- Likewise, the heat trace cause identification unit 31 calculates the average of the temperatures indicated by the pixels included in the region corresponding to the selected old heat trace region in the thermal image corresponding to the heat trace region image of the selected old heat trace region, and takes this average as the temperature corresponding to the selected old heat trace region.
- When it is determined in step S3104 that the temperature corresponding to the new heat trace region is not lower than the temperature corresponding to the selected old heat trace region, the process returns to step S3101. When it is determined that it is lower, the process proceeds to step S3105.
- In step S3105, the heat trace cause identification unit 31 determines that the new heat trace region belongs to the same group as the selected old heat trace region (S3105).
- In this case, the heat trace cause identification unit 31 records in the heat trace table, as the group ID of the record of the new heat trace region, for example, the same number as the group ID of the record of the selected old heat trace region.
- When the size of the part of the new heat trace region not included in the selected old heat trace region is equal to or smaller than the predetermined size and the temperature corresponding to the new heat trace region is lower than the temperature corresponding to the selected old heat trace region, the new heat trace region can be regarded as the selected old heat trace region shrinking and cooling with the passage of time. Therefore, in this case, the heat trace cause identification unit 31 can determine that the cause of the heat trace corresponding to the new heat trace region is the same as the cause of the heat trace corresponding to the selected old heat trace region. This determination is performed in the processes of steps S3103 to S3105.
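The grouping decisions of steps S3102 to S3104 can be sketched as a single predicate. This is an illustrative sketch only: the 10-second gap comes from the text above, while the size threshold `max_uncovered_size` is a hypothetical value introduced here for the example.

```python
def belongs_to_same_group(time_gap_s, uncovered_size, new_temp, old_temp,
                          max_gap_s=10.0, max_uncovered_size=25):
    """Sketch of steps S3102-S3104: the new heat trace region is grouped
    with the selected old one only if little time has passed, it has
    barely grown beyond the old region, and it is cooler."""
    if time_gap_s > max_gap_s:               # S3102: exceeds the predetermined time
        return False
    if uncovered_size > max_uncovered_size:  # S3103: exceeds the predetermined size
        return False
    return new_temp < old_temp               # S3104: temperature has dropped
```

Each `False` branch corresponds to trying the next old heat trace region (back to S3101) or, when no candidate remains, assigning a new group (S3106).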
- When it is determined in the process of step S3102 that the difference between the time corresponding to the new heat trace region and the time corresponding to the selected old heat trace region is not equal to or less than the predetermined time, the heat trace cause identification unit 31 determines that the new heat trace region belongs to a new group (S3106). That is, it determines that the new heat trace region is a heat trace region due to a new cause different from the causes of the heat trace regions extracted by the processes corresponding to past times. In this case, the heat trace cause identification unit 31 records in the heat trace table, as the group ID of the record of the new heat trace region, for example, a number different from the group ID of the record of any old heat trace region recorded in the heat trace table.
- Once more than the predetermined time has passed, the old heat trace should already have cooled to the background temperature, so the new heat trace region must have a new cause; steps S3102 and S3106 make the determination based on this idea.
- Even when there is no old heat trace region to be selected in the first execution of step S3101, in other words, even when no old heat trace region exists, the heat trace cause identification unit 31 determines that the new heat trace region belongs to a new group (S3106).
- the heat trace cause identification unit 31 records, for example, an arbitrary number in the heat trace table as the group ID of the record of the new heat trace area.
- For example, the heat trace region at the time "2021-08-10 10:50:48.000" in the heat trace table of FIG. 15 belongs to a new group, so a new group ID "28" is assigned to it.
- When it is determined that the new heat trace region belongs to a new group, the heat trace cause identification unit 31 performs the processes of steps S3107 to S3110 shown in FIG. 17 and described below, thereby identifying the cause region, which is the region that caused the heat trace of the new heat trace region, and estimating the temperature of that cause region.
- First, the heat trace cause identification unit 31 acquires the newest of the not-yet-selected differential thermal images among those at times before the time corresponding to the new heat trace region determined to belong to the new group (S3107), and proceeds to the process of step S3108.
- That is, in the first execution of step S3107, the heat trace cause identification unit 31 acquires the differential thermal image one time step before the time corresponding to the new heat trace region. In the k-th and subsequent executions of step S3107, it acquires the differential thermal image one time step before the time of the differential thermal image acquired in the (k-1)-th execution.
- The heat trace cause identification unit 31 may use, instead of the differential thermal image, a differential heat region image in which the region corresponding to the new heat trace region is included as the differential heat region. When the heat trace cause identification unit 31 uses the differential heat region image instead of the differential thermal image, the differential thermal image generated by the differential thermal image generation unit 16 need not be input to the heat trace cause identification unit 31.
- Next, the heat trace cause identification unit 31 determines whether or not the acquired differential thermal image includes a differential thermal region in which a region different from the region corresponding to the new heat trace region is connected to the region corresponding to the new heat trace region (S3108). If it determines that the acquired differential thermal image does not include such a differential thermal region, the process returns to step S3107. If it determines that it does, the process proceeds to step S3109.
- When the differential thermal image includes a differential thermal region in which a region different from the region corresponding to the new heat trace region (hereinafter referred to as the "connected region" for convenience) is connected to the region corresponding to the new heat trace region, the heat trace cause identification unit 31 identifies the connected region as the cause region that caused the heat trace of the new heat trace region (S3109), and proceeds to the process of step S3110.
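The connectivity check of steps S3108 and S3109 can be sketched as follows. This is an illustrative sketch under the assumption that both the differential thermal region and the new heat trace region are given as binary numpy masks; it walks through the differential region starting from the heat trace pixels and collects any reached pixels outside the heat trace region as the connected (cause) region.

```python
import numpy as np

def find_connected_cause(diff_mask, trace_mask):
    """Sketch of steps S3108-S3109: starting from the pixels corresponding
    to the new heat trace region, walk through the differential thermal
    region; any reached pixels outside the heat trace region form the
    connected region, i.e. the candidate cause region. Returns None if
    no such pixels exist (continue to the next earlier image, S3107)."""
    diff = diff_mask.astype(bool)
    trace = trace_mask.astype(bool)
    h, w = diff.shape
    seen = np.logical_and(diff, trace).copy()
    stack = list(zip(*np.nonzero(seen)))
    while stack:
        y, x = stack.pop()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and diff[ny, nx] and not seen[ny, nx]:
                seen[ny, nx] = True
                stack.append((ny, nx))
    cause = np.logical_and(seen, np.logical_not(trace))
    return cause if cause.any() else None
```

For example, if the differential thermal region covers the heat trace region plus two adjacent pixels (a hand touching the spot), those two pixels are returned as the cause region.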
- Then, the heat trace cause identification unit 31 estimates the temperature of the cause region from the temperature of the region corresponding to the cause region in the thermal image corresponding to the differential thermal image (S3110). For example, the heat trace cause identification unit 31 calculates the average of the temperatures indicated by the pixels included in the region corresponding to the cause region in the thermal image corresponding to the differential thermal image, and takes this average as the temperature of the cause region. The heat trace cause identification unit 31 may instead use the median or maximum of the temperatures indicated by those pixels as the temperature of the cause region.
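The temperature estimation of step S3110 can be sketched as a masked reduction over the thermal image. This is an illustrative sketch with hypothetical pixel temperatures; the statistic (mean, median, or max) mirrors the alternatives named in the text.

```python
import numpy as np

def cause_region_temperature(thermal_image, cause_mask, statistic="mean"):
    """Sketch of step S3110: estimate the cause-region temperature from the
    thermal image pixels inside the cause region, using the average (or,
    as the text allows, the median or maximum)."""
    values = thermal_image[cause_mask.astype(bool)]
    if statistic == "median":
        return float(np.median(values))
    if statistic == "max":
        return float(values.max())
    return float(values.mean())

thermal = np.array([[36.0, 36.5],
                    [20.0, 20.5]])  # hypothetical temperatures in deg C
mask = np.array([[1, 1],
                 [0, 0]])           # cause region: the two warm pixels
t_mean = cause_region_temperature(thermal, mask)
```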
- In short, with the extracted heat trace region taken as the new heat trace region and a heat trace region extracted at a time before the extraction of the new heat trace region taken as the old heat trace region, the heat trace cause identification unit operates as follows. (1) When the difference between the time corresponding to the new heat trace region and the time corresponding to the old heat trace region is equal to or less than a predetermined time, the size of the part of the new heat trace region not included in the old heat trace region is equal to or smaller than a predetermined size, and the temperature corresponding to the new heat trace region is lower than the temperature corresponding to the old heat trace region, it determines that the new heat trace region has the same cause as the old heat trace region. (2) Otherwise, going back through past differential thermal images, it identifies a differential thermal image including a differential thermal region in which a different region is connected to the region corresponding to the new heat trace region, and identifies that different region in the identified differential thermal image as the cause region that caused the heat trace corresponding to the new heat trace region. Further, the heat trace cause identification unit estimates the temperature of the region identified as the cause region from the thermal image corresponding to the differential thermal image used to identify the cause region.
- FIG. 18(a) is a schematic diagram of the background physical image and the physical image at each time, FIG. 18(b) is a schematic diagram of the background thermal image and the thermal image at each time, FIG. 18(c) is a schematic diagram of the differential real object image at each time, FIG. 18(d) is a schematic diagram of the differential thermal image at each time, and FIG. 18(e) is a schematic diagram of the image showing the heat trace region at each time.
- the heat trace region r1 is referred to as a new heat trace region r1.
- Since the heat trace cause identification unit 31 cannot select an old heat trace region (S3101), it determines that the new heat trace region r1 belongs to a new group (S3106).
- Next, the heat trace cause identification unit 31 acquires the differential thermal image at time t0, which is the newest differential thermal image before time t1 (S3107), and determines whether or not the differential thermal image at time t0 includes a differential thermal region in which a region different from the region corresponding to the new heat trace region r1 is connected to the region corresponding to the new heat trace region r1 (S3108).
- In the differential thermal image at time t0, a region different from the region corresponding to the new heat trace region r1 (the region of person A) is included as a differential thermal region connected to the region corresponding to the new heat trace region r1. Therefore, the heat trace cause identification unit 31 identifies the region of person A in the differential thermal image at time t0 as the cause region of the heat trace of the new heat trace region r1 (S3109).
- the heat trace region r2 is referred to as a new heat trace region r2.
- the heat trace cause identification unit 31 selects the heat trace region r1 extracted at time t1 as the old heat trace region (S3101).
- the heat trace region r1 will be referred to as an old heat trace region r1.
- The heat trace cause identification unit 31 determines whether the difference between the time t2 corresponding to the new heat trace region r2 and the time t1 corresponding to the old heat trace region r1 is equal to or less than the predetermined time (S3102), whether the size of the part of the new heat trace region r2 not included in the old heat trace region r1 is equal to or smaller than the predetermined size (S3103), and whether the temperature corresponding to the new heat trace region r2 is lower than the temperature corresponding to the old heat trace region r1 (S3104).
- the difference between the time t2 corresponding to the new heat trace region r2 and the time t1 corresponding to the old heat trace region r1 is less than or equal to a predetermined time, and the old heat trace region in the new heat trace region r2. It is assumed that the size of the region not included in r1 is smaller than the predetermined size, and the temperature corresponding to the new heat trace region r2 is lower than the temperature corresponding to the old heat trace region r1. Therefore, the heat trace cause identification unit 31 determines that the new heat trace region r2 belongs to the same group as the old heat trace region r1 (S3105).
- the fact that the new heat trace region r2 belongs to the same group as the old heat trace region r1 means that the cause of the heat trace in the new heat trace region r2 is the same as the cause of the heat trace in the old heat trace region r1.
- the region that caused the heat trace of the old heat trace region r1 has already been identified as the region of person A in the above-mentioned example at time t1. Therefore, the region that caused the heat trace of the new heat trace region r2 is the region of person A.
- the heat trace region r4 is referred to as a new heat trace region r4.
- the heat trace cause identification unit 31 selects the heat trace region r2 extracted at time t2 as the old heat trace region (S3101).
- the heat trace region r2 will be referred to as an old heat trace region r2.
- The heat trace cause identification unit 31 determines whether the difference between the time t4 corresponding to the new heat trace region r4 and the time t2 corresponding to the old heat trace region r2 is equal to or less than the predetermined time (S3102), whether the size of the part of the new heat trace region r4 not included in the old heat trace region r2 is equal to or smaller than the predetermined size (S3103), and whether the temperature corresponding to the new heat trace region r4 is lower than the temperature corresponding to the old heat trace region r2 (S3104).
- the difference between the time t4 corresponding to the new heat trace region r4 and the time t2 corresponding to the old heat trace region r2 is assumed to be less than or equal to the predetermined time, the size of the part of the new heat trace region r4 not included in the old heat trace region r2 is assumed to be smaller than the predetermined size, and the temperature corresponding to the new heat trace region r4 is assumed to be lower than the temperature corresponding to the old heat trace region r2. Therefore, the heat trace cause identification unit 31 determines that the new heat trace region r4 belongs to the same group as the old heat trace region r2 (S3105).
- the fact that the new heat trace region r4 belongs to the same group as the old heat trace region r2 means that the old heat trace region r2 belongs to the same group as the old heat trace region r1 in the above-mentioned example at time t2. Therefore, the new heat trace region r4 belongs to the same group as the old heat trace region r1. That is, the cause of the heat trace in the new heat trace region r4 is the same as the cause of the heat trace in the old heat trace region r1.
- the region that caused the heat trace of the old heat trace region r1 has already been identified as the region of person A in the above-mentioned example at time t1. Therefore, the region that caused the heat trace of the new heat trace region r4 is the region of person A.
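The same-group determination walked through above (S3102 to S3104, with the grouping decision of S3105) can be sketched as a small predicate. This is only an illustrative sketch: the `Region` container, the pixel-set representation, and the concrete threshold values are assumptions, not part of the specification.

```python
from dataclasses import dataclass


@dataclass
class Region:
    time: float        # capture time of the heat trace region (seconds)
    pixels: set        # set of (x, y) pixel coordinates in the region
    temperature: float # representative temperature of the region


def same_group(new: Region, old: Region,
               max_dt: float = 10.0, max_new_pixels: int = 50) -> bool:
    """Return True when `new` belongs to the same group as `old`:
    S3102: the time difference is within the predetermined time,
    S3103: the part of `new` not contained in `old` is small enough,
    S3104: `new` is cooler than `old` (heat traces decay over time)."""
    dt_ok = (new.time - old.time) <= max_dt
    growth_ok = len(new.pixels - old.pixels) <= max_new_pixels
    cooler = new.temperature < old.temperature
    return dt_ok and growth_ok and cooler
```

With regions r1 (old, time t1) and r2 (new, time t2) as in the example above, `same_group(r2, r1)` returning True means the two regions share the same heat trace cause.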
- through the processing of the heat trace cause identification unit 31 described above, the person who caused the heat trace in each heat trace region can be identified more appropriately.
- the information indicating the heat trace regions extracted by the heat trace region extraction unit 17 and the temperature of the heat trace cause region corresponding to each heat trace region identified by the heat trace cause identification unit 31 are input to the information output unit 181.
- the information output unit 181 outputs the information indicating the heat trace region input from the heat trace region extraction unit 17 so that the user can confirm it, using an expression method determined according to the temperature of the cause region input from the heat trace cause identification unit 31 (S109). For example, the information output unit 181 displays the information indicating the heat trace region extracted by the heat trace region extraction unit 17 by an expression method determined according to the temperature identified by the heat trace cause identification unit 31. For example, the information output unit 181 takes a binary image, which is information indicating the heat trace region, in which the heat trace region portion is white and the rest is black, processes it into an image or video according to the temperature of the cause region input from the heat trace cause identification unit 31, and projects the result onto the shooting range.
- An example of an expression method according to the temperature of the cause region is an expression method that becomes stronger as the temperature is higher.
- an expression method that becomes stronger as the temperature is higher is, in other words, an expression method that attracts the user's attention more strongly as the temperature is higher.
- An example of an expression method that becomes stronger as the temperature is higher is one in which, when the temperature of the heat trace region is higher than a predetermined threshold, the display of the information indicating the heat trace region, or the heat trace region portion itself, is blinked.
- for example, the information output unit 181 sets the brightness to 0 in regions other than the heat trace region, displays heat trace regions whose temperature is equal to or lower than a predetermined threshold at a predetermined brightness, and blinks the display of heat trace regions whose temperature is higher than the predetermined threshold.
- Another example of an expression method that becomes stronger as the temperature is higher is one in which the higher the temperature of the heat trace region, the faster the color of the display of the information indicating the heat trace region changes. Another example is one in which the higher the temperature of the heat trace region, the higher the frequency of vibration of the display of the information indicating the heat trace region. Another example is one in which the higher the temperature of the heat trace region, the brighter the display of the information indicating the heat trace region.
- Another example of an expression method that differs depending on the temperature of the cause region is one that displays the numerical value of the temperature together with the information indicating the heat trace region.
- Another example of an expression method that differs depending on the temperature of the cause region is one in which the information indicating the heat trace region is shown in different colors depending on the temperature.
- An example of this expression method displays the information indicating the heat trace region in a first color when the temperature of the heat trace region is higher than a predetermined threshold, and in a second color otherwise. For example, the first color is red and the second color is blue.
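The threshold-based color choice just described can be sketched as follows. The text only fixes "first color = red, second color = blue"; the concrete threshold value and RGB tuples here are illustrative assumptions.

```python
def display_color(temperature: float, threshold: float = 34.0) -> tuple:
    """Pick the display color for a heat trace region: the first color
    (red) when its temperature exceeds a predetermined threshold, the
    second color (blue) otherwise."""
    RED, BLUE = (255, 0, 0), (0, 0, 255)
    return RED if temperature > threshold else BLUE
```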
- the information output unit 181 may create the group information table illustrated in FIG. 19 and display it with reference to the heat trace table and the group information table stored in the auxiliary storage device 102.
- the group information table of FIG. 19 stores a record for each heat trace region, and each record includes the group ID to which the heat trace region belongs, the temperature of the region that caused the heat trace of the heat trace region, and an ID that specifies the display method corresponding to the heat trace region.
- the ID that specifies the display method is represented by a larger number as the temperature is higher; when display is performed with a stronger expression method for higher temperatures, the larger the value of the ID, the stronger the expression method used.
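A record of the group information table of FIG. 19, with a display-method ID that grows with the cause-region temperature, could be built as in the sketch below. The temperature band edges and the dict-based record layout are illustrative assumptions; only the three fields per record come from the description.

```python
def display_method_id(temperature: float, bands=(30.0, 33.0, 36.0)) -> int:
    """Map the cause-region temperature to a display-method ID: the
    higher the temperature, the larger the ID (and hence the stronger
    the expression method used for display)."""
    return sum(1 for edge in bands if temperature >= edge) + 1


def group_record(region_id: int, group_id: int, temperature: float) -> dict:
    """One record of the group information table: group ID, cause-region
    temperature, and display-method ID for a heat trace region."""
    return {"region": region_id, "group": group_id,
            "temperature": temperature,
            "display_method": display_method_id(temperature)}
```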
- a region whose temperature has risen due to human touch, in other words, a region to which a virus such as the novel coronavirus may be attached, is projected by a projector or the like.
- the visible light camera 21 also captures the visible light projected by the projector, so a visible image is obtained in which the shades of the light projected by the projector are superimposed on the people and objects in the shooting range.
- because the light and shade produced by the projected light are strongly superimposed on the images of the people and objects existing in the shooting range, the heat trace region extraction devices and methods described above may not be able to extract the heat trace region properly from such a visible image.
- a physical camera 210 that does not capture the visible light wavelength band projected by the projector is therefore used instead of the visible light camera 21 as the camera for photographing physical objects. The wavelength band of the electromagnetic waves obtained by the physical camera 210 and the wavelength band of the visible light projected by the projector are prevented from overlapping so that the projected light does not affect the extraction of the heat trace region.
- the heat trace region extraction system 1 of the fourth embodiment includes a heat trace region extraction device 10, a physical camera 210, and a thermal camera 22.
- the heat trace region extraction device 10 of the fourth embodiment includes, for example, a physical image acquisition unit 110, a background physical image generation unit 120, a differential physical image generation unit 130, a thermal image acquisition unit 14, a background thermal image generation unit 15, a differential thermal image generation unit 16, a heat trace region extraction unit 17, and an information output unit 181.
- the physical camera 210 and the thermal camera 22 are connected to the heat trace region extraction device 10 of the fourth embodiment, and the images taken by the physical camera 210 and the thermal camera 22 are input to the heat trace region extraction device 10.
- the heat trace region extraction method of the fourth embodiment is realized by each part of the heat trace region extraction device executing the processes of steps S101 to S109 shown in FIG. 21 and described below.
- the parts different from the first embodiment to the third embodiment will be mainly described. Duplicate explanations will be omitted for the same parts as those in the first to third embodiments.
- the physical camera 210, the physical image acquisition unit 110, the background physical image generation unit 120, the differential physical image generation unit 130, the thermal camera 22, the thermal image acquisition unit 14, the background thermal image generation unit 15, the differential thermal image generation unit 16, and the heat trace region extraction unit 17 are the same as the corresponding parts of the modification of the first embodiment.
- the physical camera 210 is a camera in which the wavelength band of the electromagnetic wave to be photographed meets the requirements described later. Further, it is assumed that the information indicating the heat trace region extracted by the heat trace region extraction unit 17 is input to the information output unit 181.
- the information output unit 181 has a projector function. Similar to the heat trace region output unit 18 of the first embodiment, the information output unit 181 having a projector function projects information indicating the heat trace region with visible light onto a certain range that is the shooting range (S109).
- the physical camera 210 is a camera that does not capture the wavelength band of the visible light projected by the projector. That is, the physical camera 210 captures electromagnetic waves in a wavelength band different from that of the visible light projected by the projector. Also, as a matter of course, the physical camera 210 does not capture the heat emitted by physical objects. That is, the physical camera 210 captures electromagnetic waves in a wavelength band different from that of the electromagnetic waves obtained by the thermal camera 22. Likewise, since the information output unit 181 projects visible light, it projects visible light in a wavelength band different from that of the electromagnetic waves obtained by the thermal camera 22.
- thus, the wavelength band of the electromagnetic waves obtained by the physical camera 210, the wavelength band of the electromagnetic waves obtained by the thermal camera 22, and the wavelength band of the visible light projected by the information output unit 181 are set so as not to overlap.
- the wavelength band is a range of wavelengths in which the intensity of electromagnetic waves or visible light is equal to or higher than a predetermined intensity.
- for example, the wavelength band of visible light projected by the information output unit 181 is set to 400 nm to 780 nm, the wavelength band of electromagnetic waves obtained by the physical camera 210 is set to a wavelength band in the near-infrared region (for example, 780 nm to 2500 nm), and the wavelength band of electromagnetic waves obtained by the thermal camera 22 is set to 8 μm to 14 μm, which is the wavelength band of heat rays (far-infrared region).
- the wavelength band in the near infrared region may be 800 nm to 1000 nm.
- for example, the upper limit of the wavelength band of visible light projected by the information output unit 181 may be set to 800 nm, the lower limit of the wavelength band of electromagnetic waves obtained by the physical camera 210 may be set to 850 nm, and the wavelength band of electromagnetic waves obtained by the thermal camera 22 may be 8 μm to 14 μm.
- for example, the wavelength band of visible light projected by the information output unit 181 may be set to 400 nm to 600 nm, the lower limit of the wavelength band of electromagnetic waves obtained by the physical camera 210 may be set to 600 nm, which is a visible light wavelength, and the wavelength band of electromagnetic waves obtained by the thermal camera 22 may be 8 μm to 14 μm.
- in this case, both the wavelength band of the electromagnetic waves obtained by the physical camera 210 and the wavelength band of the visible light projected by the information output unit 181 may be wavelength bands of visible light.
- for example, the wavelength band of visible light projected by the information output unit 181 may be set to 600 nm to 800 nm, the wavelength band of electromagnetic waves obtained by the physical camera 210 may be set to 400 nm to 600 nm, and the wavelength band of electromagnetic waves obtained by the thermal camera 22 may be 8 μm to 14 μm.
- the order of the wavelength band of visible light projected by the information output unit 181 and the wavelength band of electromagnetic waves obtained by the physical camera 210 does not matter. That is, it is not essential that the wavelength band of the electromagnetic waves obtained by the physical camera 210 lies on the longer-wavelength side of the wavelength band of the visible light projected by the information output unit 181; the wavelength band of the visible light projected by the information output unit 181 may instead lie on the longer-wavelength side of the wavelength band of the electromagnetic waves obtained by the physical camera 210.
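The non-overlap requirement on the three wavelength bands can be checked with a small helper. The `(low, high)` pair representation and the convention that a shared endpoint (as in the 400-780 nm / 780-2500 nm example above) counts as non-overlapping are assumptions of this sketch.

```python
def bands_disjoint(*bands_nm):
    """Check that none of the given wavelength bands overlap.
    Each band is a (low, high) pair in nanometres; order of the
    arguments does not matter, matching the text above."""
    ordered = sorted(bands_nm)
    return all(prev_high <= low
               for (_, prev_high), (low, _) in zip(ordered, ordered[1:]))


# Example configuration from the text: projector 400-780 nm, physical
# camera 780-2500 nm (near infrared), thermal camera 8-14 um (far infrared).
assert bands_disjoint((400, 780), (780, 2500), (8_000, 14_000))
```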
- the wavelength band may be controlled by a well-known technique. For example, this can be done by attaching a physical filter that passes electromagnetic waves and visible light in a predetermined wavelength band to the lens of the physical camera 210, the lens of the thermal camera 22, and the projection lens of the projector of the information output unit 181.
- the wavelength band may be controlled by signal processing.
- the information output unit 181 may control the wavelength band by filtering the signal generated for projection and projecting based on the filtered signal.
- the physical camera 210 may control the wavelength band by filtering the signal acquired by the physical camera 210 and acquiring the physical image from the filtered signal.
- the thermal camera 22 may control the wavelength band by filtering the signal acquired by the thermal camera 22 and acquiring a thermal image from the filtered signal.
- similarly, in the heat trace region extraction devices and methods of the modification of the first embodiment, the second embodiment, and the third embodiment, the wavelength band of the electromagnetic waves obtained by the physical camera 210, the wavelength band of the electromagnetic waves obtained by the thermal camera 22, and the wavelength band of the visible light projected by the information output unit 181 may be made not to overlap in the same manner as described above. The fourth embodiment also includes such cases.
- if the wavelength band of the electromagnetic waves obtained by the physical camera 210, the wavelength band of the electromagnetic waves obtained by the thermal camera 22, and the wavelength band of the visible light projected by the information output unit 181 do not overlap, the visible light projected by the information output unit 181 is not captured by the physical camera 210. Therefore, the heat trace region extraction device and method of the fourth embodiment can appropriately extract the heat trace region.
- the shooting range may be irradiated with an irradiator 41 that emits electromagnetic waves in the wavelength band obtained by the physical camera 210.
- for example, when the wavelength band of the electromagnetic waves obtained by the physical camera 210 is a wavelength band in the near-infrared region and the illumination of the shooting range, such as a fluorescent lamp, does not include the near-infrared wavelength band, the shooting range may be irradiated with the irradiator 41, which emits electromagnetic waves in the near-infrared wavelength band obtained by the physical camera 210.
- the heat trace region extraction system 1 of the fourth embodiment may further include an irradiator 41, for example, as shown by a broken line in FIGS. 7, 9, 13, and 20.
- the information output unit 181 may output the information indicating the heat trace region in various forms.
- in the following, the heat trace region output unit 18 of the first embodiment and of the modifications of the first embodiment will be referred to as the information output unit 181.
- the information output unit 181 outputs information indicating a heat trace region, which is a region of traces of heat extracted based on a physical image, which is an image of physical objects obtained by photographing a certain range with a physical camera for photographing physical objects, and a thermal image, which is an image of the heat emitted by physical objects obtained by photographing the same range with a thermal camera for photographing that heat.
- the parts different from the heat trace region extraction apparatus and method of the first to fourth embodiments will be mainly described. Duplicate explanations will be omitted for the same parts as those in the first to fourth embodiments.
- the information output unit 181 may display information indicating the heat trace region on the transmissive display.
- An example of a transmissive display is a transmissive head-mounted display.
- for example, the information output unit 181 includes a transmissive display and, by detecting the position and orientation of the transmissive display, aligns the real space visually recognized by the user through the transmissive display (that is, the heat trace region that actually exists in the shooting range of the physical camera and the thermal camera) with the information indicating the heat trace region displayed on the transmissive display.
- this allows the user to see the information indicating the heat trace region displayed on the transmissive display as if it were superimposed on the real space seen through the transmissive display, so the user can intuitively understand where a virus or the like may be attached.
- the information output unit 181 may display the information indicating the heat trace region on the display by superimposing the information indicating the heat trace region on the image or the video in a certain range which is the photographing range. Examples of displays are digital signage, smartphones, tablet terminals, electric bulletin boards, and TVs. For example, the information output unit 181 may display information indicating a heat trace region on a display by superimposing it on a physical image or a visible image.
- here, the physical image or visible image is one acquired by the physical image acquisition unit 110 or the visible image acquisition unit 11.
- the information output unit 181 superimposes information indicating a heat trace region on an image or video of a certain range, which is a shooting range, taken by a camera different from the visible light camera 21 and the real camera 210. It may be displayed on the display.
- for example, the information output unit 181 may superimpose the information indicating the heat trace region on an image or video of the shooting range taken by a camera provided in a smartphone or tablet terminal, which is a camera different from the visible light camera 21 and the physical camera 210.
- by detecting the position and orientation of the display of the smartphone or tablet terminal, the information output unit 181 aligns in real time, according to the position and angle of the smartphone or tablet terminal, the real space that the user sees without going through the display (that is, the shooting range of the physical camera and the thermal camera) with the space displayed on the display. This allows the user to understand more intuitively where a virus or the like may be attached.
- the information output unit 181 may indicate the information indicating the heat trace region by sound. For example, the information output unit 181 may output a voice such as "x people touched here." To this end, the information output unit 181 may store the number of times x that the heat trace region has been extracted. Further, the information output unit 181 may output a warning sound when the number of times the heat trace region has been extracted after disinfection exceeds a predetermined number. The warning sound may be voice or non-voice.
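The counting behavior just described — counting extractions of a heat trace region since the last disinfection and warning past a threshold — can be sketched as below. The class name, the spoken-message format, and the default threshold are illustrative assumptions.

```python
class TouchCounter:
    """Count how many times a heat trace region has been extracted since
    the last disinfection, and decide whether a warning sound is due."""

    def __init__(self, warn_after: int = 3):
        self.warn_after = warn_after  # predetermined number of times
        self.count = 0

    def record_extraction(self) -> str:
        """Called each time the heat trace region is extracted;
        returns the message to voice, e.g. 'x people touched here.'"""
        self.count += 1
        return f"{self.count} people touched here."

    def disinfected(self):
        self.count = 0  # disinfection resets the count

    def should_warn(self) -> bool:
        return self.count > self.warn_after
```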
- the information output unit 181 may change the presence / absence of a warning and / or the warning sound according to the movement of the user.
- the information output unit 181 may output a sound expressed by a stronger expression method as the distance between the heat trace region and a person, or a person's hand, included in the visible image taken by the visible light camera 21 or in the physical image taken by the physical camera 210 is shorter.
- An example of such a sound is one that becomes louder as the distance between the heat trace region and the person or the person's hand becomes shorter.
- the information output unit 181 may indicate the information indicating the heat trace region by sound only when the distance between the heat trace region and a person, or a person's hand, included in the visible image taken by the visible light camera 21 or in the physical image taken by the physical camera 210 is equal to or less than a predetermined distance.
- the information output unit 181 may use two or more directional speakers having different sound transmission regions to output a sound expressed by a stronger expression method as the distance between the heat trace region and the person or the person's hand is shorter. In this case, the information output unit 181 controls the directional speakers so that the speaker whose sound transmission region is closer to the heat trace region outputs the sound expressed by the stronger expression method.
- the information output unit 181 may display information indicating the heat trace area as text.
- the information output unit 181 may send a text corresponding to the information indicating the heat trace area by e-mail, a short message service, a notification in an SNS (Social Network Service), or a message function.
- the text corresponding to the information indicating the heat trace area is transmitted to, for example, a manager of the shooting range and a person who disinfects the shooting range.
- the destination of the text corresponding to the information indicating the heat trace region may be one destination or two or more destinations.
- the information output unit 181 may store the number of times x that the heat trace region has been extracted and output the text corresponding to the information indicating the heat trace region when that number exceeds a predetermined number. Further, the information output unit 181 may project a text such as "someone touched the door" with a projector or display it on a display. Examples of displays here are digital signage, electric bulletin boards, and TVs installed near the shooting range.
- the information output unit 181 may output information indicating a region other than the heat trace region instead of the information indicating the heat trace region.
- the region other than the heat trace region is a region where the temperature does not rise due to human touch, that is, a region where it is unlikely that a virus such as a new coronavirus is attached. By showing a region where a virus such as a new coronavirus is unlikely to be attached, it is possible to show the user a region that is safe to touch.
- the information output unit 181 may brightly display information indicating regions other than the heat trace region. Further, the information output unit 181 may divide the region other than the heat trace region into a plurality of partial regions, randomly select one of the partial regions, and brightly display information indicating the selected partial region. As a result, the information output unit 181 can show the user an area, for example on a handrail, that is safe to touch.
- the information output unit 181 may display information indicating a region other than the heat trace region brighter than the information indicating the heat trace region. Therefore, for example, the information output unit 181 displays the shooting range brightly in advance, and when the heat trace region is extracted, the extracted heat trace region is darkly displayed.
- the information output unit 181 may display information indicating the heat trace region by a different expression method depending on the number of times the heat trace region is extracted. That is, the information output unit 181 may output different information as information indicating the heat trace region, depending on the number of times the heat trace region is extracted. For example, the information output unit 181 may display the region as the heat trace region, which has been extracted more frequently, in a stronger expression method. For example, as illustrated in FIG. 22, the information output unit 181 may display a region as a heat trace region, which is extracted more frequently, in a darker color. By doing so, it is possible to inform the user that the darker the area, the more it should not be touched or should be disinfected.
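The "more extractions, darker color" display of FIG. 22 can be sketched as a mapping from extraction count to a grey level. The linear mapping and the `max_count` saturation point are assumptions of this sketch; the description only requires that more frequently extracted regions be shown more strongly.

```python
def trace_shade(times_extracted: int, max_count: int = 10) -> int:
    """Map the number of times a region was extracted as a heat trace
    to an 8-bit grey level: 255 (bright) for never extracted, down to
    0 (darkest) at or beyond `max_count` extractions."""
    n = min(times_extracted, max_count)
    return 255 - (255 * n) // max_count
```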
Description
In the first embodiment, an apparatus, method, and program for detecting places touched by a person using thermal images are disclosed, with the aim of assisting in the sterilization or disinfection of bacteria or viruses. Since humans are homeotherms, their hands and feet carry heat, and when a person touches an object, heat remains at the touched place for a certain period of time. For example, a method of exploiting a heat trace, which is a trace of a person's touch identified by the heat remaining at a touched place, to decipher the passcode of a smartphone has been reported ("Yomna Abdelrahman, Mohamed Khamis, Stefan Schneegass, and Florian Alt. 2017. Stay Cool! Understanding Thermal Attacks on Mobile-based User Authentication. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), pp. 3751-3763, 2017").
The aim of the first embodiment will be described with reference to FIGS. 1 and 2. FIG. 1 is a schematic diagram of a visible image and a thermal image of the same place taken at the same time. FIG. 1 shows schematic diagrams of images of a hand touching a door with a handle, taken simultaneously with a visible light camera and a thermal camera. FIGS. 1(a) and 1(a') are a visible image and a thermal image, respectively, at time t1 (before the hand touches the door). FIGS. 1(b) and 1(b') are a visible image and a thermal image, respectively, at time t2 (while the hand is touching the door). FIGS. 1(c) and 1(c') are a visible image and a thermal image at time t3 (after the hand has touched the door). When a person touches the door, the temperature of the touched place rises, as shown in FIG. 1(c').
Subsequently, the heat trace region output unit 18 outputs the information indicating the heat trace region so that the user can confirm it (S109). Here, the user is the person to whom the information indicating the heat trace region is to be notified. For example, the heat trace region output unit 18 may output an image in which the white pixels of a binary image, which is an example of the information indicating the heat trace region, are composited onto the background visible image. The output form is not limited to a predetermined form. For example, display on a display device, storage in the auxiliary storage device 102, transmission to a user terminal via a network, or the like may be performed.
In the first embodiment, in order to explain the difference from the thermal image in an easy-to-understand manner, the description used an example in which a visible image acquired by a visible light camera is used as the image of physical objects, such as people and things, existing in the shooting range. However, of course, the image of physical objects may be taken in any wavelength band, and this form will be described as a modification of the first embodiment.
The physical camera 210 is a camera for photographing physical objects. It is the same as the visible light camera 21 of the first embodiment, except that the wavelength band of the electromagnetic waves obtained by the physical camera 210 is not limited to the wavelength band of visible light. The physical camera 210 takes an image of the physical objects in a certain shooting range, more specifically, a physical image, which is an image composed of the images of the physical objects existing in the shooting range as seen from the physical camera 210 side. The physical image taken by the physical camera 210 is input to the physical image acquisition unit 110.
The physical image acquisition unit 110 is the same as the visible image acquisition unit 11 of the first embodiment, except that it acquires and outputs a physical image instead of a visible image. That is, the physical image acquisition unit 110 acquires the physical image taken by the physical camera 210 (S101) and outputs the acquired physical image. The physical image acquired by the physical image acquisition unit 110 is input to the background physical image generation unit 120 and/or the differential physical image generation unit 130.
The background physical image generation unit 120 is the same as the background visible image generation unit 12 of the first embodiment, except that it processes a physical image instead of a visible image and that it generates and outputs a background physical image instead of a background visible image. That is, the background physical image generation unit 120 generates a background physical image, which is a physical image of the background of the shooting range, based on the physical image acquired by the physical image acquisition unit 110 (S104), and outputs the generated background physical image. The physical image of the background of the shooting range is an image of the things existing as the background in the shooting range, more specifically, an image composed of the images of those things as seen from the physical camera 210 side. The generated background physical image is input to the differential physical image generation unit 130.
The differential physical image generation unit 130 is the same as the differential visible image generation unit 13 of the first embodiment, except that it processes a physical image instead of a visible image, processes a background physical image instead of a background visible image, and generates a differential physical image instead of a differential visible image. That is, the differential physical image generation unit 130 generates a differential physical image, which is an image of the difference between the physical image at a certain time acquired by the physical image acquisition unit 110 and the background physical image input from the background physical image generation unit 120 (S107), and outputs the generated differential physical image at that time. In short, the differential physical image generation unit 130 generates a differential physical image, which is an image of the difference between a physical image, which is an image of physical objects obtained by photographing a certain range with the physical camera 210 for photographing physical objects, and a background physical image, which is a physical image of the background of that range. The generated differential physical image is input to the heat trace region extraction unit 17. Note that the differential physical image generation unit 130 only needs to operate on the physical image at the time to be processed by the heat trace region extraction unit 17 described later.
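The difference-image generation of S107 (used both by the differential physical image generation unit 130 and, analogously, by the differential thermal image generation unit 16) amounts to background subtraction. A minimal sketch is given below; the per-pixel threshold value and the binary-mask output format are illustrative assumptions.

```python
import numpy as np


def differential_image(frame: np.ndarray, background: np.ndarray,
                       threshold: float = 10.0) -> np.ndarray:
    """Generate a differential image: pixels whose absolute difference
    from the background image exceeds a threshold are marked 1
    (they differ from the background), all others 0."""
    diff = np.abs(frame.astype(np.float32) - background.astype(np.float32))
    return (diff > threshold).astype(np.uint8)
```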
The thermal camera 22 is the same as the thermal camera 22 of the first embodiment. That is, the thermal camera 22 is a camera for photographing the heat emitted by physical objects. The thermal camera 22 takes an image of the heat emitted by the physical objects in the same shooting range as the physical camera 210, more specifically, a thermal image, which is an image composed of the heat radiated by the physical objects existing in the shooting range as acquired from the thermal camera 22 side. The thermal image taken by the thermal camera 22 is input to the thermal image acquisition unit 14.
The thermal image acquisition unit 14 is the same as the thermal image acquisition unit 14 of the first embodiment. That is, the thermal image acquisition unit 14 acquires the thermal image taken by the thermal camera 22 (S101) and outputs the acquired thermal image. The thermal image acquired by the thermal image acquisition unit 14 is input to the background thermal image generation unit 15 and/or the differential thermal image generation unit 16.
The background thermal image generation unit 15 is the same as the background thermal image generation unit 15 of the first embodiment. That is, the background thermal image generation unit 15 generates a background thermal image, which is a thermal image of the background of the shooting range, based on the thermal image acquired by the thermal image acquisition unit 14 (S104), and outputs the generated background thermal image. The thermal image of the background of the shooting range is an image of the heat emitted by the things existing as the background in the shooting range, more specifically, an image composed of that heat as acquired from the thermal camera 22 side. The generated background thermal image is input to the differential thermal image generation unit 16.
The differential thermal image generation unit 16 is the same as the differential thermal image generation unit 16 of the first embodiment. That is, the differential thermal image generation unit 16 generates a differential thermal image, which is an image of the difference between the thermal image at a certain time acquired by the thermal image acquisition unit 14 and the background thermal image input from the background thermal image generation unit 15 (S107), and outputs the generated differential thermal image at that time. In short, the differential thermal image generation unit 16 generates a differential thermal image, which is an image of the difference between a thermal image, which is an image of the heat emitted by physical objects obtained by photographing with the thermal camera 22 the same range as the physical camera 210, and a background thermal image, which is a thermal image of the background of that range. The generated differential thermal image is input to the heat trace region extraction unit 17. Note that the differential thermal image generation unit 16 only needs to operate on the thermal image at the time to be processed by the heat trace region extraction unit 17 described later.
The heat trace region extraction unit 17 is the same as the heat trace region extraction unit 17 of the first embodiment, except that it uses a differential physical image instead of a differential visible image and a background physical image instead of a background visible image. That is, based on the differential physical image at a certain time generated by the differential physical image generation unit 130 and the differential thermal image at that time generated by the differential thermal image generation unit 16, the heat trace region extraction unit 17 extracts a heat trace region by removing the regions of physical objects from the thermal image (S108), and outputs information indicating the extracted heat trace region at that time. Specifically, the heat trace region extraction unit 17 takes each region of the differential physical image that differs from the background physical image as a differential physical region and each region of the differential thermal image that differs from the background thermal image as a differential thermal region, and extracts, from among the one or more differential thermal regions, those regions that are dissimilar to all of the one or more differential physical regions as heat trace regions. The information indicating the extracted heat trace regions is input to the heat trace region output unit 18. For example, as the information indicating the heat trace region, the heat trace region extraction unit 17 generates a binary image in which the heat trace region portion is white and the rest is black, and outputs the generated binary image to the heat trace region output unit 18. The heat trace region extraction unit 17 may also output the background physical image to the heat trace region output unit 18. Note that the heat trace region extraction unit 17 only needs to operate on the differential physical image and the differential thermal image at the times to be processed. The times to be processed by the heat trace region extraction unit 17 may be times at predetermined intervals, or times specified by the operator of the heat trace region extraction device 10 or the like.
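The extraction step S108 — keeping the differential thermal regions that are dissimilar to every differential physical region — can be sketched as below. Judging dissimilarity by pixel overlap ratio is an illustrative stand-in (the description leaves the similarity criterion open), and the tiny BFS connected-component labelling is a stand-in for a library routine.

```python
import numpy as np


def label_regions(mask: np.ndarray):
    """4-connected component labelling of a binary mask."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue
        current += 1
        stack = [(sy, sx)]
        while stack:
            y, x = stack.pop()
            if (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]
                    and mask[y, x] and not labels[y, x]):
                labels[y, x] = current
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, current


def extract_heat_traces(diff_thermal: np.ndarray, diff_physical: np.ndarray,
                        max_overlap: float = 0.3) -> np.ndarray:
    """Keep the differential thermal regions that are dissimilar to every
    differential physical region (S108). A thermal region explained by a
    physical object (large overlap) is discarded; the rest form the
    binary heat trace image (1 = heat trace, 0 = background)."""
    labels, n = label_regions(diff_thermal)
    out = np.zeros_like(diff_thermal)
    for i in range(1, n + 1):
        region = labels == i
        overlap = np.logical_and(region, diff_physical).sum() / region.sum()
        if overlap <= max_overlap:  # not explained by a physical object
            out[region] = 1
    return out
```

With a thermal difference mask containing a person's hand region (which also appears in the physical difference mask) and a residual heat trace (which does not), only the latter survives.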
The heat trace area output unit 18 is the same as the heat trace area output unit 18 of the first embodiment, except that it uses the background physical-object image instead of the background visible image. The heat trace area output unit 18 outputs the information indicating the heat trace areas so that the user can check it (S109). For example, the heat trace area output unit 18 outputs an image in which the white pixels of the binary image, one example of the information indicating the heat trace areas, are composited onto the background physical-object image. As another example, the heat trace area output unit 18 uses a projector or the like to project the binary image indicating the heat trace areas onto the shooting range in the environment, after appropriate alignment. The heat trace area output unit 18 may, for example, start outputting the information indicating the heat trace areas when that information is input. It may continue the output for a predetermined time and end it after that time has elapsed, or it may continue the output until an end-of-output instruction is given by, for example, the operator of the heat trace area extraction apparatus 10, and end the output in accordance with that instruction.
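Compositing the binary mask onto the background image for display (S109) is a one-line masked assignment. A minimal NumPy sketch; the function name `overlay_mask` and the default white highlight color are illustrative assumptions.

```python
import numpy as np

def overlay_mask(background_img, mask, color=(255, 255, 255)):
    """Composite the mask pixels (e.g. the white pixels of the binary
    heat trace image) onto a copy of the background image (S109)."""
    out = background_img.copy()
    out[mask.astype(bool)] = color
    return out
```

Copying first keeps the stored background image intact so later frames can be composited against it again.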
With the heat trace area extraction apparatuses and methods of the first embodiment and its modification, areas whose temperature has risen because a person touched them, that is, areas to which viruses or the like may have adhered, can be communicated to the user. However, even among areas whose temperature has risen through human touch, an area that has been disinfected with alcohol or the like after being touched carries a low risk of virus infection even if a person touches it. The heat trace area extraction apparatus and method of the second embodiment extract such areas, which carry a low risk of virus infection even when touched.
The physical-object camera 210 is the same as the physical-object camera 210 of the modification of the first embodiment. The physical-object camera 210 captures images of the physical objects in a certain shooting range. The physical-object images captured by the physical-object camera 210 are input to the physical-object image acquisition unit 110.
The physical-object image acquisition unit 110 is the same as in the modification of the first embodiment. That is, it acquires the physical-object images captured by the physical-object camera 210 (S101) and outputs the acquired physical-object images. The acquired physical-object images are input to the background physical-object image generation unit 120 and/or the difference physical-object image generation unit 130.
The background physical-object image generation unit 120 is the same as in the modification of the first embodiment. That is, based on the physical-object images acquired by the physical-object image acquisition unit 110, it generates a background physical-object image, which is the physical-object image of the background of the shooting range (S104), and outputs the generated background physical-object image. The generated background physical-object image is input to the difference physical-object image generation unit 130.
The difference physical-object image generation unit 130 is the same as in the modification of the first embodiment. That is, it generates a difference physical-object image, the image of the difference between the physical-object image at a certain time acquired by the physical-object image acquisition unit 110 and the background physical-object image input from the background physical-object image generation unit 120 (S107), and outputs the generated difference physical-object image for that time. In short, the difference physical-object image generation unit 130 generates a difference physical-object image: the image of the difference between a physical-object image, obtained by photographing a certain range with a physical-object camera for photographing physical objects, and a background physical-object image, which is the physical-object image of the background of that range. In the second embodiment, the difference physical-object image generated by the difference physical-object image generation unit 130 is input not only to the heat trace area extraction unit 17 but also to the cold trace area extraction unit 171. The difference physical-object image generation unit 130 need only operate on the physical-object images at the times to be processed by the heat trace area extraction unit 17 described later and at the times to be processed by the cold trace area extraction unit 171 described later.
The thermal camera 22 is the same as in the modification of the first embodiment. That is, the thermal camera 22 is a camera for photographing the heat emitted by physical objects, and it captures images of the heat emitted by the physical objects in the same shooting range as the physical-object camera 210. The thermal images captured by the thermal camera 22 are input to the thermal image acquisition unit 14.
The thermal image acquisition unit 14 is the same as in the modification of the first embodiment. That is, it acquires the thermal images captured by the thermal camera 22 (S101) and outputs the acquired thermal images. The acquired thermal images are input to the background thermal image generation unit 15, and/or to the difference thermal image generation unit 16 and the difference cold image generation unit 161.
The background thermal image generation unit 15 is the same as in the modification of the first embodiment. That is, based on the thermal images acquired by the thermal image acquisition unit 14, it generates a background thermal image, which is the thermal image of the background of the shooting range (S104), and outputs the generated background thermal image. The generated background thermal image is input to the difference thermal image generation unit 16 and the difference cold image generation unit 161.
The difference thermal image generation unit 16 is the same as in the modification of the first embodiment. The difference thermal image generation unit 16 generates a difference thermal image by extracting, from the thermal image at a certain time acquired by the thermal image acquisition unit 14, the regions whose temperature is higher than in the background thermal image input from the background thermal image generation unit 15 (hereinafter, "high-temperature regions") (S107), and outputs the generated difference thermal image for that time. That is, the difference thermal image generation unit 16 generates a difference thermal image: an image containing, as high-temperature regions, the regions of difference between the thermal image and the background thermal image in which the temperature in the thermal image is higher than the temperature in the background thermal image. For example, for each pixel, if the difference between the pixel value of the thermal image and that of the background thermal image is at least a certain threshold and the temperature indicated by the thermal image's pixel value is higher than that indicated by the background thermal image's pixel value, the difference thermal image generation unit 16 sets the pixel value of the difference thermal image to 1; otherwise (that is, when the thermal image's temperature is higher but the pixel-value difference is below the threshold, or when the thermal image's temperature is not higher), it sets the pixel value of the difference thermal image to 0. The generated difference thermal image is input to the heat trace area extraction unit 17. The difference thermal image generation unit 16 need only operate on the thermal images at the times to be processed by the heat trace area extraction unit 17 described later.
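The per-pixel rule above, and its mirror image for regions colder than the background, can be written as one signed comparison. A minimal NumPy sketch, assuming pixel values are directly comparable temperatures and a positive threshold; the function name `temperature_difference_image` and the `mode` parameter are illustrative.

```python
import numpy as np

def temperature_difference_image(thermal, background, threshold, mode="high"):
    """Pixel rule from the text: 1 where the thermal image differs from
    the background by at least `threshold` AND is hotter (mode='high')
    or colder (mode='low') than the background; 0 otherwise."""
    diff = thermal.astype(np.float64) - background.astype(np.float64)
    if mode == "high":
        return (diff >= threshold).astype(np.uint8)
    return (diff <= -threshold).astype(np.uint8)
```

With `mode="high"` this yields the difference thermal image; with `mode="low"` it yields the difference cold image used by the second embodiment.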
The difference cold image generation unit 161 is the same as the difference thermal image generation unit 16 of the modification of the first embodiment, except that, instead of a difference thermal image, it generates a difference cold image: the image of the difference between the thermal image and the background thermal image consisting of the regions whose temperature has fallen below the background. The difference cold image generation unit 161 generates a difference cold image by extracting, from the thermal image at a certain time acquired by the thermal image acquisition unit 14, the regions whose temperature is lower than in the background thermal image input from the background thermal image generation unit 15 (hereinafter, "low-temperature regions") (S107), and outputs the generated difference cold image for that time. That is, the difference cold image generation unit 161 generates a difference cold image: an image containing, as low-temperature regions, the regions of difference between the thermal image and the background thermal image in which the temperature in the thermal image is lower than the temperature in the background thermal image. For example, for each pixel, if the difference between the pixel value of the thermal image and that of the background thermal image is at least a certain threshold and the temperature indicated by the thermal image's pixel value is lower than that indicated by the background thermal image's pixel value, the difference cold image generation unit 161 sets the pixel value of the difference cold image to 1; otherwise (that is, when the thermal image's temperature is lower but the pixel-value difference is below the threshold, or when the thermal image's temperature is not lower), it sets the pixel value of the difference cold image to 0. The generated difference cold image is input to the cold trace area extraction unit 171. The difference cold image generation unit 161 need only operate on the thermal images at the times to be processed by the cold trace area extraction unit 171 described later.
The heat trace area extraction unit 17 is the same as in the modification of the first embodiment. That is, based on the difference physical-object image generated by the difference physical-object image generation unit 130 and the difference thermal image generated by the difference thermal image generation unit 16, the heat trace area extraction unit 17 extracts heat trace areas by removing the physical-object regions from the high-temperature regions contained in the difference thermal image (S108), and outputs information indicating the extracted heat trace areas. Specifically, taking each region of the difference physical-object image that differs from the background physical-object image as a difference physical-object region, and each region of the difference thermal image that differs from the background thermal image as a difference thermal region, the heat trace area extraction unit 17 extracts, from among the one or more difference thermal regions, those regions dissimilar to every one of the one or more difference physical-object regions as heat trace areas. The information indicating the extracted heat trace areas is input to the information output unit 181. The heat trace areas extracted by the heat trace area extraction unit 17 of the second embodiment are areas whose temperature has risen because a physical object touched them, for example the areas of the objects present as background in the shooting range that a person has touched. For example, as the information indicating the heat trace areas, the heat trace area extraction unit 17 generates a binary image in which the heat trace areas are white and everything else is black, and outputs the generated binary image to the information output unit 181. The heat trace area extraction unit 17 need only operate on the difference physical-object images and difference thermal images at the times it is to process; those times may be times at predetermined intervals, or times designated by, for example, the operator of the heat trace area extraction apparatus 10.
The cold trace area extraction unit 171 is the same as the heat trace area extraction unit 17 of the modification of the first embodiment, except that it performs its processing using difference cold images instead of difference thermal images, and that it extracts, instead of heat trace areas, cold trace areas: areas in which the temperature of the heat emitted by physical objects has fallen. That is, based on the difference physical-object image generated by the difference physical-object image generation unit 130 and the difference cold image generated by the difference cold image generation unit 161, the cold trace area extraction unit 171 extracts cold trace areas by removing the physical-object regions from the low-temperature regions contained in the difference cold image (S1081), and outputs information indicating the extracted cold trace areas. Specifically, taking each region of the difference physical-object image that differs from the background physical-object image as a difference physical-object region, and each region of the difference cold image that differs from the background thermal image as a difference cold region, the cold trace area extraction unit 171 extracts, from among the one or more difference cold regions, those regions dissimilar to every one of the one or more difference physical-object regions as cold trace areas. The information indicating the extracted cold trace areas is input to the information output unit 181. For example, as the information indicating the cold trace areas, the cold trace area extraction unit 171 generates a binary image in which the cold trace areas are white and everything else is black, and outputs the generated binary image to the information output unit 181. The cold trace area extraction unit 171 need only operate on the difference physical-object images and difference cold images at the times it is to process; those times may be times at predetermined intervals, or times designated by, for example, the operator of the heat trace area extraction apparatus 10.
Using at least the information indicating the cold trace areas, the information output unit 181 outputs at least the information indicating the cold trace areas so that the user can check it (S1091). The information output unit 181 operates, for example, as in the first or second example below.
The information output unit 181 of the first example uses the information indicating the heat trace areas and the information indicating the cold trace areas, and outputs at least one of them so that the user can check it. More specifically, the information output unit 181 of the first example keeps outputting the information indicating a heat trace area once the heat trace area extraction unit 17 has extracted it, but determines whether that heat trace area overlaps a cold trace area at a time later than the time to which the heat trace area corresponds, and stops outputting all or part of the information indicating the heat trace area depending on the result of that determination. That is, the information output unit 181 of the first example starts outputting the information indicating a heat trace area when the heat trace area extraction unit 17 extracts it; when the cold trace area extraction unit 171 extracts a cold trace area, it determines whether the heat trace area and the cold trace area overlap, and, if they are determined to overlap, ends the output of all or part of the information indicating that heat trace area. An example of the processing of the information output unit 181 of the first example is described below.
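The "stop outputting the overlapped part" behaviour can be sketched as a mask subtraction: pixels of a displayed heat trace that a later cold trace (for instance, evaporating disinfectant) covers are retired from the display. A minimal NumPy sketch of the pixel-wise ("part of") variant; the function name `remaining_heat_trace` is an illustrative assumption, and a region-level variant would instead drop an entire heat trace region once any overlap is found.

```python
import numpy as np

def remaining_heat_trace(heat_trace, cold_trace):
    """First-example behaviour, pixel-wise variant: keep only the
    heat-trace pixels that no later cold trace overlaps."""
    return np.logical_and(heat_trace.astype(bool),
                          np.logical_not(cold_trace.astype(bool)))
```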
The information output unit 181 of the second example uses the information indicating the cold trace areas and outputs it so that the user can check it. More specifically, the information output unit 181 of the second example outputs only the information indicating the cold trace areas extracted by the cold trace area extraction unit 171. For example, the information output unit 181 outputs an image in which the white pixels of the binary image, one example of the information indicating the cold trace areas, are composited onto the background physical-object image. As another example, the information output unit 181 uses a projector or the like to project the binary image indicating the cold trace areas onto the shooting range in the environment, after appropriate alignment. In this way, the user can learn which areas have been disinfected with alcohol or the like and therefore carry a low risk of virus infection even if touched. The information output unit 181 may, for example, start outputting the information indicating the cold trace areas when that information is input.
With the heat trace area extraction apparatus and method of the first embodiment, areas whose temperature has risen through human touch, that is, areas to which viruses or the like may have adhered, can be communicated to the user. However, even for an area whose temperature has risen through human touch, if the body temperature of the person who touched it is low, that person is unlikely to be infected with a virus or the like, so the area is unlikely to carry viruses. Conversely, an area whose temperature has risen because a person with a high body temperature touched it is an area to which viruses or the like are likely to have adhered, since the person who touched it is likely to be infected. For these reasons, to communicate areas possibly carrying viruses to the user more appropriately, it is desirable not only to extract the heat trace areas, the areas whose temperature has risen through human touch, but also to identify the person, or the body temperature, that caused the heat trace corresponding to each heat trace area. The heat trace area extraction apparatus and method of the third embodiment identify the temperature that caused the heat trace corresponding to a heat trace area.
The heat trace cause identification unit 31 performs three identification processes (S310): a first identification process that identifies, among the heat trace areas extracted in the processing corresponding to the latest time, those heat trace areas whose heat traces have a cause different from the causes of the heat trace areas extracted in the processing corresponding to past times; a second identification process that identifies, using the difference thermal images extracted in the past-time processing of the heat trace area extraction unit 17, the cause region of each heat trace area identified by the first identification process, that is, the region that caused its heat trace; and a third identification process that identifies the temperature of each cause region identified by the second identification process from the thermal image corresponding to the difference thermal image used to identify that cause region. The temperatures of the cause regions identified by the third identification process (that is, the temperature of the cause region of the heat trace corresponding to each heat trace area) are input to the information output unit 181.
The information output unit 181 receives the information indicating the heat trace areas extracted by the heat trace area extraction unit 17, and the temperatures, identified by the heat trace cause identification unit 31, of the cause regions of the heat traces corresponding to the heat trace areas.
With the heat trace area extraction apparatus and method of the first embodiment, areas whose temperature has risen through human touch, in other words areas to which viruses such as the novel coronavirus may have adhered, can be communicated to the user by projecting them with a projector or the like. However, when the projector projects visible light, the visible-light camera 21 also captures the projected light, so the resulting visible image has the light and shade of the projected light superimposed on the people and objects present in the shooting range. Especially when the projector output is large, that light and shade is strongly superimposed on the images of the people and objects, and the heat trace area extraction apparatus and method of the first embodiment may then be unable to extract the heat trace areas appropriately.
In the heat trace area extraction apparatuses and methods of the first to fourth embodiments, the information output unit 181 may output the information indicating the heat trace areas in various forms. The heat trace area extraction apparatus and method of the fifth embodiment are those of the first to fourth embodiments in which the information output unit 181 outputs the information indicating the heat trace areas in such various forms. For brevity of description, in the fifth embodiment the heat trace area output unit 18 of the first embodiment and its modification is also called the information output unit 181.
The information output unit 181 may display the information indicating the heat trace areas on a transmissive display, for example a transmissive head-mounted display. In this case, the information output unit 181 includes the transmissive display and, by detecting the display's position and orientation, aligns the heat trace areas that actually exist in the real space the user sees through the display (that is, the shooting range of the physical-object camera and the thermal camera) with the information indicating the heat trace areas shown on the display. The user thus sees the displayed information as if superimposed on the real space seen through the display, and can intuitively grasp where viruses or the like may have adhered.
The information output unit 181 may display the information indicating the heat trace areas on a display, superimposed on an image or video of the certain range that is the shooting range.
Examples of the display are digital signage, a smartphone, a tablet terminal, an electronic bulletin board, and a TV. For example, the information output unit 181 may display the information indicating the heat trace areas superimposed on a physical-object image or visible image, namely one acquired by the physical-object image acquisition unit 110 or the visible image acquisition unit 11. As another example, the information output unit 181 may superimpose the information indicating the heat trace areas on an image or video of the shooting range captured by a camera other than the visible-light camera 21 and the physical-object camera 210, for example a camera built into a smartphone or tablet terminal.
The information output unit 181 may present the information indicating the heat trace areas as sound. For example, the information output unit 181 may output speech such as "x people have touched this spot." To do so, the information output unit 181 may store the number of times x that a heat trace area has been extracted. The information output unit 181 may also output a warning sound when the number of times heat trace areas have been extracted since disinfection exceeds a predetermined number; the warning sound may be speech or non-speech.
For example, the information output unit 181 may output a sound expressed in a stronger manner the closer a heat trace area is to a person, or a person's hand, appearing in the visible image or physical-object image captured by the visible-light camera 21 or the physical-object camera 210; one example of such a sound is one that grows louder as that distance shrinks. The information output unit 181 may also present the information indicating the heat trace areas as sound only when the distance between the heat trace area and the person or the person's hand is at most a predetermined distance.
The information output unit 181 may present the information indicating the heat trace areas as text. For example, the information output unit 181 may send text corresponding to the information indicating the heat trace areas by e-mail, short message service, or the notification or messaging function of an SNS (Social Networking Service). The text is sent, for example, to the administrator of the shooting range or to the person who disinfects the shooting range, and may have one or more destinations.
Instead of the information indicating the heat trace areas, the information output unit 181 may output information indicating the areas other than the heat trace areas. The areas other than the heat trace areas are areas whose temperature has not risen through human touch, in other words areas to which viruses such as the novel coronavirus are unlikely to have adhered. By indicating those areas, the user can be shown the areas that are safe to touch.
The information output unit 181 may display the information indicating the heat trace areas in different forms of expression depending on the number of times each area has been extracted as a heat trace area; that is, it may output different information as the information indicating the heat trace areas according to that count. For example, the information output unit 181 may display areas extracted as heat trace areas more often in a stronger form of expression, such as, as illustrated in Fig. 22, in a darker color. In this way, the user can be told that the darker an area's color, the more it must not be touched or the more it needs to be disinfected.
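The count-to-darkness mapping just described can be sketched as a simple normalization of a per-pixel extraction counter to gray levels. A minimal NumPy sketch; the function name `shade_by_count` and the linear white-to-black mapping are illustrative assumptions, not the rendering the document specifies.

```python
import numpy as np

def shade_by_count(counts, max_count=None):
    """Map per-pixel extraction counts to gray levels: areas extracted
    as heat trace areas more often are drawn in a darker color
    (Fig. 22 style). 0 extractions -> white (255), max -> black (0)."""
    counts = np.asarray(counts, dtype=np.float64)
    if max_count is None:
        m = counts.max()
        max_count = m if m > 0 else 1.0
    ratio = np.clip(counts / max_count, 0.0, 1.0)
    return (255 * (1.0 - ratio)).astype(np.uint8)
```

Passing a fixed `max_count` keeps the shading comparable across frames instead of rescaling to each frame's busiest area.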
Claims (6)
- A heat trace area extraction apparatus comprising: a difference physical-object image generation unit that generates a difference physical-object image, which is the image of the difference between a physical-object image, being an image of physical objects obtained by photographing a certain range with a physical-object camera for photographing physical objects, and a background physical-object image, being the physical-object image of the background of the certain range; a difference thermal image generation unit that generates a difference thermal image, which is the image of the difference between a thermal image, being an image of the heat emitted by physical objects obtained by photographing the certain range with a thermal camera for photographing the heat emitted by physical objects, and a background thermal image, being the thermal image of the background; and a heat trace area extraction unit that extracts a heat trace area by removing the regions of physical objects from the thermal image based on the difference physical-object image and the difference thermal image.
- The heat trace area extraction apparatus according to claim 1, further comprising an information output unit that projects information indicating the extracted heat trace area onto the certain range with visible light, wherein the wavelength band of the electromagnetic waves acquired by the physical-object camera, the wavelength band of the electromagnetic waves acquired by the thermal camera, and the wavelength band of the projected visible light do not overlap.
- The heat trace area extraction apparatus according to claim 1 or 2, wherein, taking each region of the difference physical-object image that differs from the background physical-object image as a difference physical-object region and each region of the difference thermal image that differs from the background thermal image as a difference thermal region, the heat trace area extraction unit extracts, from among the one or more difference thermal regions, a region dissimilar to every one of the one or more difference physical-object regions as the heat trace area.
- A heat trace area extraction method comprising: a difference physical-object image generation step of generating a difference physical-object image, which is the image of the difference between a physical-object image, being an image of physical objects obtained by photographing a certain range with a physical-object camera for photographing physical objects, and a background physical-object image, being the physical-object image of the background of the certain range; a difference thermal image generation step of generating a difference thermal image, which is the image of the difference between a thermal image, being an image of the heat emitted by physical objects obtained by photographing the certain range with a thermal camera for photographing the heat emitted by physical objects, and a background thermal image, being the thermal image of the background; and a heat trace area extraction step of extracting a heat trace area by removing the regions of physical objects from the thermal image based on the difference physical-object image and the difference thermal image.
- The heat trace area extraction method according to claim 4, further comprising an information output step of projecting information indicating the extracted heat trace area onto the certain range with visible light, wherein the wavelength band of the electromagnetic waves acquired by the physical-object camera, the wavelength band of the electromagnetic waves acquired by the thermal camera, and the wavelength band of the projected visible light do not overlap.
- A program for causing a computer to function as each unit of the heat trace area extraction apparatus according to any one of claims 1 to 3.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/031,141 US20230384162A1 (en) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method and program |
JP2022556990A JP7485070B2 (ja) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JPPCT/JP2020/039126 | 2020-10-16 | ||
PCT/JP2020/039126 WO2022079910A1 (ja) | 2020-10-16 | 2020-10-16 | Heat trace area extraction method, heat trace area extraction apparatus, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022080350A1 true WO2022080350A1 (ja) | 2022-04-21 |
Family
ID=81208224
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/039126 WO2022079910A1 (ja) | 2020-10-16 | 2020-10-16 | Heat trace area extraction method, heat trace area extraction apparatus, and program |
PCT/JP2021/037676 WO2022080350A1 (ja) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method, and program |
PCT/JP2021/037677 WO2022080351A1 (ja) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method, and program |
PCT/JP2021/037679 WO2022080353A1 (ja) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method, and program |
PCT/JP2021/037678 WO2022080352A1 (ja) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method, and program |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/039126 WO2022079910A1 (ja) | 2020-10-16 | 2020-10-16 | Heat trace area extraction method, heat trace area extraction apparatus, and program |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/037677 WO2022080351A1 (ja) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method, and program |
PCT/JP2021/037679 WO2022080353A1 (ja) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method, and program |
PCT/JP2021/037678 WO2022080352A1 (ja) | 2020-10-16 | 2021-10-12 | Heat trace area extraction apparatus, heat trace area extraction method, and program |
Country Status (3)
Country | Link |
---|---|
US (5) | US20240019309A1 (ja) |
JP (5) | JPWO2022079910A1 (ja) |
WO (5) | WO2022079910A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024047807A1 (ja) * | 2022-08-31 | 2024-03-07 | 日本電信電話株式会社 | 閾値決定装置、方法及びプログラム |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024052973A1 (ja) * | 2022-09-06 | 2024-03-14 | 日本電信電話株式会社 | 背景更新装置、方法及びプログラム |
WO2024052974A1 (ja) * | 2022-09-06 | 2024-03-14 | 日本電信電話株式会社 | 背景更新装置、方法及びプログラム |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009100256A (ja) * | 2007-10-17 | 2009-05-07 | Hitachi Kokusai Electric Inc | Object detection device |
JP2017067503A (ja) * | 2015-09-28 | 2017-04-06 | Fujitsu Ltd | Position estimation device, position estimation method, and position estimation program |
JP2017090277A (ja) * | 2015-11-11 | 2017-05-25 | Kyushu University | Grasp information acquisition device, robot teaching device and robot control device, and grasp information acquisition method, robot teaching method and robot control method |
WO2020027210A1 (ja) * | 2018-08-03 | 2020-02-06 | Nippon Telegraph And Telephone Corporation | Image processing device, image processing method, and image processing program |
-
2020
- 2020-10-16 US US18/248,295 patent/US20240019309A1/en active Pending
- 2020-10-16 WO PCT/JP2020/039126 patent/WO2022079910A1/ja active Application Filing
- 2020-10-16 JP JP2022556819A patent/JPWO2022079910A1/ja active Pending
-
2021
- 2021-10-12 WO PCT/JP2021/037676 patent/WO2022080350A1/ja active Application Filing
- 2021-10-12 US US18/031,341 patent/US20230377165A1/en active Pending
- 2021-10-12 JP JP2022556990A patent/JP7485070B2/ja active Active
- 2021-10-12 WO PCT/JP2021/037677 patent/WO2022080351A1/ja active Application Filing
- 2021-10-12 JP JP2022556991A patent/JP7485071B2/ja active Active
- 2021-10-12 US US18/031,346 patent/US20230377159A1/en active Pending
- 2021-10-12 US US18/031,147 patent/US20230412768A1/en active Pending
- 2021-10-12 JP JP2022556993A patent/JPWO2022080353A1/ja active Pending
- 2021-10-12 WO PCT/JP2021/037679 patent/WO2022080353A1/ja active Application Filing
- 2021-10-12 WO PCT/JP2021/037678 patent/WO2022080352A1/ja active Application Filing
- 2021-10-12 US US18/031,141 patent/US20230384162A1/en active Pending
- 2021-10-12 JP JP2022556992A patent/JP7485072B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
US20230377159A1 (en) | 2023-11-23 |
WO2022080352A1 (ja) | 2022-04-21 |
US20230377165A1 (en) | 2023-11-23 |
JPWO2022080352A1 (ja) | 2022-04-21 |
US20230412768A1 (en) | 2023-12-21 |
JPWO2022080351A1 (ja) | 2022-04-21 |
WO2022079910A1 (ja) | 2022-04-21 |
WO2022080353A1 (ja) | 2022-04-21 |
JPWO2022079910A1 (ja) | 2022-04-21 |
JP7485070B2 (ja) | 2024-05-16 |
US20230384162A1 (en) | 2023-11-30 |
JP7485072B2 (ja) | 2024-05-16 |
JP7485071B2 (ja) | 2024-05-16 |
WO2022080351A1 (ja) | 2022-04-21 |
US20240019309A1 (en) | 2024-01-18 |
JPWO2022080350A1 (ja) | 2022-04-21 |
JPWO2022080353A1 (ja) | 2022-04-21 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21880089 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022556990 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18031141 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21880089 Country of ref document: EP Kind code of ref document: A1 |