WO2020012906A1 - Display device, image processing device, and control method - Google Patents

Display device, image processing device, and control method

Info

Publication number
WO2020012906A1
Authority
WO
WIPO (PCT)
Prior art keywords
image, aerosol, space, type, sensor
Prior art date
Application number
PCT/JP2019/024410
Other languages
English (en)
Japanese (ja)
Inventor
大山 達史
宮下 万里子
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date
Filing date
Publication date
Priority claimed from JP2019108042A (JP7113375B2)
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Priority to CN201980029258.5A (CN112041665A)
Publication of WO2020012906A1
Priority to US17/120,085 (US11694659B2)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00: Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/47: Scattering, i.e. diffuse reflection
    • G01N21/49: Scattering, i.e. diffuse reflection within a body or fluid
    • G01N21/62: Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63: Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light, optically excited
    • G01N21/64: Fluorescence; Phosphorescence
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A50/00: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE in human health protection, e.g. against extreme weather
    • Y02A50/20: Air quality improvement or preservation, e.g. vehicle emission control or emission reduction by using catalytic converters

Definitions

  • the present disclosure relates to a display device, an image processing device, and a control method.
  • Patent Documents 1 and 2 disclose such a terminal device.
  • the present disclosure provides a display device, an image processing device, and a control method capable of accurately presenting the position of an aerosol.
  • a display device according to one aspect of the present disclosure includes a display screen and a control unit that displays, on the display screen, a composite image in which a first image obtained by imaging a space with a camera and a second image representing at least one type of aerosol existing in the space are synthesized. The position of the at least one type of aerosol in the depth direction in the first image is reflected in the second image.
  • the image processing device includes an acquisition circuit that acquires three-dimensional coordinate data representing a position in the space of at least one type of aerosol existing in the space, and a processor.
  • the processor generates, based on the three-dimensional coordinate data, a composite image in which a first image obtained by imaging the space with a camera and a second image representing the at least one type of aerosol existing in the space are synthesized. The position of the at least one type of aerosol in the depth direction in the first image is reflected in the second image.
  • a control method according to one aspect of the present disclosure is a control method for a system including a sensor and a display device, the sensor including a light source that emits irradiation light toward at least one type of object in a space and a photodetector that detects return light from the at least one type of object, and outputting data representing the result of detection of the return light by the photodetector. The control method includes: acquiring the data from the sensor; generating, based on the data, three-dimensional coordinate data representing the position of the at least one type of object in the space; generating, based on the three-dimensional coordinate data, a composite image in which a first image obtained by imaging the space with a camera and a second image representing the at least one type of object are synthesized, the second image reflecting the position of the at least one type of object in the depth direction in the first image; and causing the display device to display the composite image.
  • one embodiment of the present disclosure can be realized as a program that causes a computer to execute the control method.
  • it can be realized as a non-transitory computer-readable recording medium storing the program.
  • the position of the aerosol can be presented with high accuracy.
  • FIG. 1 is a top view illustrating a space to which the non-contact sensing system according to the embodiment is applied.
  • FIG. 2 is a block diagram illustrating a configuration of the non-contact sensing system according to the embodiment.
  • FIG. 3 is a diagram illustrating an example of the sensor device according to the embodiment.
  • FIG. 4 is a diagram illustrating an example of sensor data output from the sensor device according to the embodiment.
  • FIG. 5 is a diagram showing a conditional expression for determining a management level in the non-contact sensing system according to the embodiment.
  • FIG. 6 is a diagram illustrating an example of a reference value database indicating a reference value for each substance.
  • FIG. 7 is a diagram for explaining a method of determining an aerosol contour by the non-contact sensing system according to the embodiment.
  • FIG. 8 is a diagram illustrating a representative value of the management level for each object in the space obtained by the non-contact sensing system according to the embodiment.
  • FIG. 9 is a diagram illustrating a display example on the display screen of the display device according to the embodiment.
  • FIG. 10 is a sequence diagram showing an operation of the non-contact sensing system according to the embodiment.
  • FIG. 11 is a flowchart illustrating a process of converting captured image data into a 3D database, among operations of the non-contact sensing system according to the embodiment.
  • FIG. 12 is a flowchart illustrating a process of converting the sensor data into a 3D database in the operation of the non-contact sensing system according to the embodiment.
  • FIG. 13 is a diagram illustrating an example of a 3D database generated by the non-contact sensing system according to the embodiment.
  • FIG. 14 is a flowchart illustrating a level distribution generation process in the operation of the non-contact sensing system according to the embodiment.
  • FIG. 15 is a flowchart illustrating a process of generating auxiliary information in the operation of the non-contact sensing system according to the embodiment.
  • FIG. 16 is a diagram showing another example of the display on the display screen of the display device according to the embodiment.
  • FIG. 17 is a diagram illustrating another example of the display on the display screen of the display device according to the embodiment.
  • FIG. 18 is a diagram illustrating another example of the display on the display screen of the display device according to the embodiment.
  • FIG. 19 is a diagram illustrating another example of the display on the display screen of the display device according to the embodiment.
  • FIG. 20 is a diagram showing another example of the display on the display screen of the display device according to the embodiment.
  • FIG. 21 is a diagram illustrating another example of the display on the display screen of the display device according to the embodiment.
  • FIG. 22 is a diagram illustrating a display device integrally including the non-contact sensing system according to the embodiment.
  • In the display device according to an embodiment of the present disclosure, a composite image in which a first image obtained by imaging a space with a camera and a second image representing at least one type of aerosol existing in the space are synthesized is displayed on the display screen.
  • the position of the aerosol in the depth direction is reflected in the second image representing the aerosol. For this reason, not only the position of the aerosol in the vertical and horizontal directions represented by the first image but also its position in the depth direction with respect to the first image is displayed on the display screen. Thereby, the position of the aerosol can be accurately presented.
  • for example, the first image may represent a two-dimensional space, and the control unit may further generate the second image by projecting three-dimensional coordinate data, which represents the position of the at least one type of aerosol in the space, onto the two-dimensional space, and may synthesize the first image and the second image to generate the composite image.
  • for example, the control unit may further obtain the three-dimensional coordinate data from a sensor that obtains the position of the at least one type of aerosol in the space, convert the first image into a pseudo three-dimensional image, associate the pseudo three-dimensional image with the three-dimensional coordinate data, and generate the second image by projecting the three-dimensional coordinate data onto the two-dimensional space.
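  • As an illustration of the projection and compositing described above, the following is a minimal sketch in Python using NumPy and OpenCV. The pinhole-camera intrinsics, the marker style, and the blending weight are assumptions made for this sketch and are not specified in the present disclosure.

```python
# Minimal sketch: project 3D aerosol coordinates onto the 2D captured image
# and composite the result. The camera intrinsics (fx, fy, cx, cy) and the
# alpha value are illustrative assumptions, not values from the disclosure.
import numpy as np
import cv2

def project_points(points_xyz: np.ndarray, fx: float, fy: float,
                   cx: float, cy: float) -> np.ndarray:
    """Pinhole projection of Nx3 camera-frame points to Nx2 pixel coordinates."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    u = fx * x / z + cx
    v = fy * y / z + cy
    return np.stack([u, v], axis=1)

def composite(first_image: np.ndarray, points_xyz: np.ndarray) -> np.ndarray:
    """Overlay a second image (aerosol markers) onto the first image."""
    h, w = first_image.shape[:2]
    second_image = np.zeros_like(first_image)
    pixels = project_points(points_xyz, fx=800.0, fy=800.0, cx=w / 2, cy=h / 2)
    for (u, v), (_, _, z) in zip(pixels, points_xyz):
        if 0 <= u < w and 0 <= v < h:
            # Farther points (larger z) are drawn smaller so that the
            # depth-direction position is reflected in the second image.
            radius = max(2, int(40.0 / z))
            cv2.circle(second_image, (int(u), int(v)), radius, (0, 0, 255), -1)
    return cv2.addWeighted(first_image, 1.0, second_image, 0.4, 0.0)

if __name__ == "__main__":
    room = np.full((480, 640, 3), 200, dtype=np.uint8)       # stand-in captured image
    aerosol = np.array([[0.3, -0.1, 2.0], [-0.5, 0.2, 4.5]])  # camera-frame coordinates [m]
    cv2.imwrite("composite.png", composite(room, aerosol))
```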
  • the second image may include a contour indicating a range where the at least one type of aerosol exists, and distance information indicating a distance from a reference position in the space to a representative position in the contour.
  • the range in which the aerosol exists can be displayed on the display screen, and the position in the depth direction of the aerosol can be displayed as a representative position. For this reason, the position of the aerosol can be displayed simply, that is, in a display mode that is easy for the user viewing the display screen to understand.
  • the representative position may be a center of gravity of the concentration distribution of the at least one type of aerosol in the outline.
  • the position of the aerosol can be presented with high accuracy by setting the center of gravity of the concentration distribution as the representative position.
  • the distance information may be a numerical value indicating the distance.
  • the position of the aerosol can be displayed in a display mode that is easy for the user to understand.
  • the distance information may be a color given in the outline, which is predetermined according to the distance.
  • the position of the aerosol can be displayed in a display mode that is easy for the user to understand.
  • the composite image may show a three-dimensional model including the space and a contour representing a range in which the at least one type of aerosol exists.
  • for example, the second image may be a moving image in which a plurality of images are temporally switched, each of the plurality of images corresponding to a distance from a reference position in the space and including a contour representing a range where the at least one type of aerosol exists.
  • the second image may further reflect the concentration of the at least one aerosol.
  • the second image may include level information indicating a concentration level of the at least one type of aerosol.
  • the aerosol concentration can be displayed in a simple manner, that is, in a display mode that is easy to understand for a user viewing the display screen.
  • the at least one type of aerosol may include a plurality of types of aerosols, and the second image may represent the plurality of types of aerosols in different display modes.
  • control unit may display an image for calling a user's attention on the display screen.
  • the image processing apparatus includes an acquisition circuit that acquires three-dimensional coordinate data representing a position in the space of at least one type of aerosol existing in the space, and a processor,
  • the processor is configured to generate, based on the three-dimensional coordinate data, a composite image in which a first image obtained by imaging the space with a camera and a second image representing the at least one type of aerosol existing in the space are synthesized.
  • the second image reflects the position of the at least one aerosol in the depth direction in the first image.
  • the position of the aerosol in the depth direction is reflected in the second image representing the aerosol.
  • in the composite image displayed on the display screen, not only the position of the aerosol in the vertical and horizontal directions represented by the first image but also the position of the aerosol in the depth direction with respect to the first image appears.
  • a control method according to an embodiment of the present disclosure is a control method for a system including a sensor and a display device, the sensor including a light source that emits irradiation light toward at least one type of object in a space and a photodetector that detects return light from the at least one type of object, and outputting data representing a result of detection of the return light by the photodetector.
  • the control method includes: acquiring the data from the sensor; generating, based on the data, three-dimensional coordinate data representing the position of the at least one type of object in the space; generating a composite image in which a first image obtained by imaging the space with a camera and a second image representing the at least one type of object existing in the space are synthesized, the position of the at least one type of object in the depth direction in the first image being reflected in the second image; and causing the display device to display the composite image.
  • the display device displays not only the position of the object in the vertical and horizontal directions represented by the first image but also the position of the object in the depth direction with respect to the first image. Thereby, the position of the target object can be presented with high accuracy.
  • for example, the return light may be fluorescence emitted when the at least one type of object is excited by the irradiation light. In the generation of the composite image, the type of the at least one type of object may be further determined by analyzing the fluorescence, and the type may be reflected in the second image.
  • for example, the irradiation light may include a predetermined polarization component. In the generation of the composite image, the type of the at least one type of object may be further determined based on the degree of depolarization of the polarization component included in the return light, and the type may be reflected in the second image.
  • for example, the three-dimensional coordinate data may be generated using a relative positional relationship between the sensor and the at least one type of object, which is calculated based on the difference between the time at which the irradiation light is emitted and the time at which the return light is detected, and the coordinates of the sensor in the space.
  • the detection of the object and the distance to the detected object can be executed by the same light source and photodetector.
  • the configuration of the sensor device can be simplified.
  • the at least one type of object may be an organic substance attached to an object existing in the space.
  • a substance containing an organic substance, such as vomit or pollen, can be detected, and its position can be presented with high accuracy.
  • the at least one type of object may be an aerosol existing in the space.
  • a substance floating in the air, such as pollen or dust, can be detected, and its position can be presented with high accuracy.
  • the return light may be backscattered light generated by the irradiation light being scattered by the at least one type of object.
  • the aerosol can be detected with high accuracy.
  • a computer-readable recording medium according to an embodiment of the present disclosure stores a program for controlling a system including a sensor and a display device, the sensor including a light source that emits irradiation light toward at least one type of object in a space and a photodetector that detects return light from the at least one type of object, and outputting data representing a result of detection of the return light by the photodetector. When the program is executed by a computer, the program causes the computer to: acquire the data from the sensor; generate, based on the data, three-dimensional coordinate data representing the position of the at least one type of object in the space; generate, based on the three-dimensional coordinate data, a composite image in which a first image obtained by imaging the space with a camera and a second image reflecting the position of the at least one type of object in the depth direction in the first image are synthesized; and cause the display device to display the composite image.
  • a program according to an embodiment of the present disclosure is a computer-executable program for controlling a system including a sensor and a display device, the sensor including a light source that emits irradiation light toward at least one type of object in a space and a photodetector that detects return light from the at least one type of object, and outputting data representing a result of detection of the return light by the photodetector. The program causes a computer to: acquire the data from the sensor; generate, based on the data, three-dimensional coordinate data representing the position of the at least one type of object in the space; generate, based on the three-dimensional coordinate data, a composite image in which a first image obtained by imaging the space with the camera and a second image reflecting the position of the at least one type of object in the depth direction in the first image are synthesized; and cause the display device to display the composite image.
  • all or a part of a circuit, a unit, a device, a member, or a portion, or all or a part of a functional block in a block diagram, may be implemented by one or more electronic circuits including, for example, a semiconductor device, a semiconductor integrated circuit (IC), or an LSI (large-scale integration).
  • the LSI or IC may be integrated on one chip or may be configured by combining a plurality of chips.
  • functional blocks other than the storage element may be integrated on one chip.
  • Abbreviations used here: LSI (large-scale integration), IC (integrated circuit), FPGA (Field Programmable Gate Array).
  • the software is recorded on one or more non-transitory recording media such as a ROM, an optical disc, or a hard disk drive, and when the software is executed by a processor, a function specified by the software is performed by the processor and peripheral devices.
  • the system or apparatus may include one or more non-transitory storage media on which the software is recorded, a processor, and any required hardware devices, such as an interface.
  • each drawing is a schematic diagram, and is not necessarily strictly illustrated. Therefore, for example, the scales and the like do not always match in each drawing. Further, in each of the drawings, substantially the same configuration is denoted by the same reference numeral, and redundant description will be omitted or simplified.
  • the non-contact sensing system captures an image of a space and detects an object existing in the space in a non-contact manner.
  • the non-contact sensing system displays, on a display screen, a combined image in which a first image indicating a captured space and a second image representing a detected target object are combined. At this time, the position of the detected object in the depth direction in the first image is reflected in the second image.
  • FIG. 1 is a top view showing a space 95 to which the non-contact sensing system according to the present embodiment is applied.
  • the space 95 is, for example, a room in a building such as a residence, an office, a nursing facility, or a hospital.
  • the space 95 is, for example, a space partitioned by walls, windows, doors, floors, ceilings, and the like, and is a closed space, but is not limited thereto.
  • the space 95 may be an outdoor open space.
  • the space 95 may be an internal space of a moving object such as a bus or an airplane.
  • a first target object 90 to be detected by the non-contact sensing system exists in a space 95.
  • the first object 90 is, specifically, an aerosol floating in the space 95.
  • the aerosol includes dust such as house dust or dirt, suspended particulate matter such as PM2.5, biological particles such as pollen, or fine water droplets. Biological particles also include molds and mites floating in the air.
  • the aerosol may also include substances that are dynamically generated from the human body, such as coughing or sneezing.
  • the aerosol may include a substance related to air quality, such as carbon dioxide (CO2).
  • the detection target is not limited to the aerosol.
  • the target object may be an organic stain.
  • the organic stain is food or vomit attached to an object, such as a wall, a floor, or furniture, that forms the space 95, and does not have to be floating in the air.
  • FIG. 2 is a block diagram showing a configuration of the non-contact sensing system 10 according to the present embodiment.
  • the non-contact sensing system 10 includes a camera 20, a first sensor 30, a second sensor 40, a third sensor 50, a computer 60, a server device 70, a tablet terminal 80, Is provided.
  • the configuration of the non-contact sensing system 10 is not limited to the example shown in FIG.
  • the non-contact sensing system 10 may include only one of the first sensor 30, the second sensor 40, and the third sensor 50. That is, the number of sensor devices included in the non-contact sensing system 10 may be only one, or may be plural.
  • the non-contact sensing system 10 may not include the computer 60 and the server device 70. Further, for example, the non-contact sensing system 10 may include a display connected to the computer 60 instead of the tablet terminal 80.
  • each of the camera 20, the first sensor 30, the second sensor 40, the third sensor 50, the server device 70, and the tablet terminal 80 has a communication interface. Various data and information are transmitted and received via the communication interface.
  • the camera 20 generates a captured image by capturing an image of the space 95.
  • the captured image is an example of a first image generated when the camera 20 captures an image of the space 95.
  • the camera 20 is, for example, a fixed-point camera fixed at a position where the space 95 can be imaged, but is not limited thereto.
  • the camera 20 may be a movable camera in which at least one of the shooting direction and the shooting position is variable.
  • the camera 20 may generate a plurality of captured images by imaging the space 95 from a plurality of viewpoints.
  • the camera 20 transmits the captured image data obtained by the imaging to the computer 60.
  • the camera 20 may be a visible light camera that captures a space visible to humans.
  • the first sensor 30, the second sensor 40, and the third sensor 50 are each an example of a sensor device that detects an object to be detected in a non-contact manner. That is, the non-contact sensing system 10 according to the present embodiment includes three sensor devices according to the type of the detection target.
  • the first target object 90 illustrated in FIG. 2 is pollen detected by the first sensor 30.
  • the second object 92 is dust detected by the second sensor 40.
  • the third object 94 is an organic stain detected by the third sensor 50.
  • Each of the first sensor 30, the second sensor 40, and the third sensor 50 is a non-contact sensor device using, for example, LIDAR (Laser Imaging Detection and Ranging).
  • FIG. 3 is a diagram showing a first sensor 30 which is an example of the sensor device according to the present embodiment.
  • the first sensor 30 is an autonomously moving sensor device.
  • the second sensor 40 and the third sensor 50 have the same configuration as the first sensor 30.
  • the first sensor 30 can travel on the floor 96 of the space 95. After emitting irradiation light L1 from a predetermined position on the floor surface 96, the first sensor 30 receives return light L2 returning from the first object 90. The first sensor 30 measures the distance to the first object 90 based on the time difference between the emission of the irradiation light L1 and the reception of the return light L2. Further, the first sensor 30 measures the density of the first object 90 based on the intensity of the return light L2.
  • the first sensor 30 includes a light source 32, a photodetector 34, and a signal processing circuit 36, as shown in FIG.
  • the light source 32 is a light source that emits the irradiation light L1 toward the first object 90 in the space 95.
  • the light source 32 is, for example, an LED (Light Emitting Diode) or a laser element.
  • the irradiation light L1 emitted from the light source 32 includes a wavelength component for exciting the first object 90.
  • the irradiation light L1 is light having a peak wavelength in a range from 220 nm to 550 nm.
  • the irradiation light L1 is, for example, pulsed light.
  • the photodetector 34 is a photodetector that detects the return light L2 from the first object 90.
  • the return light L2 detected by the photodetector 34 is fluorescence emitted when the first object 90 is excited by the irradiation light L1 emitted from the light source 32. The fluorescence contains longer-wavelength components than the irradiation light L1.
  • the photodetector 34 is, for example, a photodiode having a light receiving sensitivity to a wavelength component of fluorescence.
  • the photodetector 34 outputs an output signal corresponding to the intensity of the received fluorescence to the signal processing circuit 36.
  • the output signal is, for example, an electric signal whose signal intensity increases as the intensity of the received fluorescence increases.
  • the signal processing circuit 36 processes the output signal output from the photodetector 34 to determine the distance to the first object 90 and the density of the first object 90. As shown in FIG. 2, the signal processing circuit 36 includes a position information obtaining unit 37 and a density information obtaining unit 38.
  • the position information acquiring unit 37 acquires position information indicating a three-dimensional position of the first object 90 in the space 95.
  • the position information includes a distance and a direction to the first target object 90.
  • the position information acquisition unit 37 calculates the distance by a TOF (Time of Flight) method.
  • the position information acquisition unit 37 acquires distance information based on a time difference between emission of the irradiation light L1 by the light source 32 and detection of fluorescence by the photodetector 34.
  • the distance information includes a distance ri to the first object 90, a horizontal angle θi indicating a direction in which the first object 90 is detected, and a vertical angle φi.
  • the direction in which the first target object 90 is detected is the direction in which the light source 32 emits the irradiation light L1.
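  • For reference, the TOF distance computation described above amounts to halving the round-trip time multiplied by the speed of light. A minimal sketch follows; the example time difference is illustrative.

```python
# Minimal sketch of the TOF distance estimate: the distance is half the
# round-trip time multiplied by the speed of light. Values are illustrative.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(t_emit_s: float, t_detect_s: float) -> float:
    """Distance ri [m] from the time the irradiation light L1 is emitted
    to the time the return light is detected."""
    round_trip = t_detect_s - t_emit_s
    return SPEED_OF_LIGHT * round_trip / 2.0

# Example: a 20 ns round trip corresponds to roughly 3 m.
print(tof_distance(0.0, 20e-9))  # ≈ 2.998
```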
  • the density information acquisition unit 38 acquires density information indicating the density of the first object 90. Specifically, the density information acquisition unit 38 determines the density of the first object 90 according to the signal strength of the output signal. For example, when the signal intensity is Si, the density Di is calculated based on equation (1), in which the coefficient relating Si to Di is a constant.
  • the subscript "i" of Di and Si and of the above-mentioned ri, θi, and φi indicates the data number of the sensor data. Note that the method of calculating the density Di used by the density information acquisition unit 38 is not limited to this. For example, instead of the output signal itself, the density information acquisition unit 38 may use a signal obtained by removing a noise component from the output signal.
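  • Because equation (1) itself is only summarized in the text, the following sketch assumes the simplest reading: a constant coefficient relating the (optionally noise-reduced) signal intensity Si to the density Di. The coefficient value is an illustrative assumption.

```python
# Minimal sketch assuming equation (1) is a linear relation Di = a * Si,
# where a is a constant; the value of a below is an illustrative assumption.
A_COEFF = 0.05  # hypothetical calibration constant

def density_from_signal(signal_si: float, noise_floor: float = 0.0) -> float:
    """Density Di estimated from the output-signal intensity Si.
    A noise component may optionally be removed first, as noted in the text."""
    effective = max(signal_si - noise_floor, 0.0)
    return A_COEFF * effective

print(density_from_signal(120.0, noise_floor=10.0))  # 5.5
```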
  • the signal processing circuit 36 may determine the type of the first target object 90 by analyzing the fluorescence. Specifically, the signal processing circuit 36 determines the type of the first object 90 based on a combination of the wavelength of the irradiation light and the wavelength of the fluorescence. For example, in the first sensor 30, the light source 32 may emit a plurality of irradiation lights corresponding to a plurality of excitation wavelengths, and the photodetector 34 may receive a plurality of fluorescences corresponding to a plurality of light reception wavelengths. The signal processing circuit 36 can accurately determine the type of the first object 90 that has generated the fluorescence by generating a three-dimensional matrix of the excitation wavelength, the reception wavelength, and the reception intensity, that is, a so-called fluorescent fingerprint.
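  • The fluorescence-fingerprint determination can be illustrated as follows: received intensities indexed by excitation and received wavelength form a matrix, which is compared against reference fingerprints. The wavelength grids, reference values, and correlation-based matching below are assumptions for illustration only.

```python
# Minimal sketch of type determination by a fluorescence fingerprint
# (excitation wavelength x received wavelength -> received intensity).
# The wavelength grids, reference fingerprints, and the matching rule
# are illustrative assumptions.
import numpy as np

EXCITATION_NM = [280, 340, 405]   # assumed excitation wavelengths
EMISSION_NM = [330, 450, 520]     # assumed received wavelengths

REFERENCE_FINGERPRINTS = {
    "pollen": np.array([[0.9, 0.3, 0.1],
                        [0.2, 0.8, 0.4],
                        [0.1, 0.3, 0.6]]),
    "mold":   np.array([[0.2, 0.7, 0.3],
                        [0.1, 0.4, 0.8],
                        [0.0, 0.2, 0.5]]),
}

def classify_fingerprint(measured: np.ndarray) -> str:
    """Return the reference substance whose fingerprint correlates best
    with the measured excitation/emission intensity matrix."""
    best_name, best_score = "", -np.inf
    m = measured.ravel()
    for name, ref in REFERENCE_FINGERPRINTS.items():
        score = float(np.corrcoef(m, ref.ravel())[0, 1])
        if score > best_score:
            best_name, best_score = name, score
    return best_name

measured = np.array([[0.85, 0.25, 0.15],
                     [0.25, 0.75, 0.35],
                     [0.05, 0.30, 0.55]])
print(classify_fingerprint(measured))  # "pollen"
```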
  • the signal processing circuit 36 outputs the density information indicating the determined density Di and the position information to the computer 60 as sensor data.
  • the first sensor 30 and the computer 60 are wirelessly connected, for example, so that data can be transmitted and received.
  • the first sensor 30 performs wireless communication based on a wireless communication standard such as Wi-Fi (registered trademark), Bluetooth (registered trademark), or ZigBee (registered trademark). Note that the first sensor 30 and the computer 60 may be connected by wire.
  • the second sensor 40 detects the second object 92 by emitting irradiation light toward the second object 92 and receiving return light from the second object 92.
  • the second object 92 is a substance that does not emit fluorescence, and is, for example, dust.
  • the second sensor 40 includes a light source 42, a photodetector 44, and a signal processing circuit 46.
  • the light source 42, the light detector 44, and the signal processing circuit 46 correspond to the light source 32, the light detector 34, and the signal processing circuit 36 of the first sensor 30, respectively.
  • the light source 42 is a light source that emits irradiation light toward the second object 92.
  • the light source 42 is, for example, an LED or a laser element.
  • the irradiation light emitted from the light source 42 does not need to excite the second object 92. Therefore, a wavelength component selected from a wide wavelength band can be used as the wavelength component of the irradiation light.
  • the irradiation light emitted from the light source 42 is light having a peak wavelength in a range from 300 nm to 1300 nm. That is, the irradiation light may be ultraviolet light, visible light, or near-infrared light.
  • the irradiation light is, for example, pulsed light.
  • the photodetector 44 is a photodetector that detects the return light from the second object 92.
  • the return light detected by the photodetector 44 is backscattered light generated when the irradiation light emitted from the light source 42 is scattered by the second object 92.
  • the backscattered light is, for example, scattered light due to Mie scattering.
  • the backscattered light has the same wavelength component as the irradiation light.
  • the photodetector 44 is a photodiode having a light receiving sensitivity to a wavelength component of irradiation light.
  • the photodetector 44 outputs an output signal corresponding to the intensity of the received backscattered light to the signal processing circuit 46.
  • the output signal is, for example, an electric signal whose signal intensity increases as the intensity of the received backscattered light increases.
  • the irradiation light emitted from the light source 42 may include a predetermined polarization component.
  • the signal processing circuit 46 may determine the type of the second object 92 based on the degree of depolarization of the polarization component included in the return light.
  • the polarization component is, for example, linearly polarized light, but may be circularly polarized light or elliptically polarized light.
  • the signal processing circuit 46 can determine the type of the second object 92 based on the degree of depolarization of the backscattered light. For example, the degree of depolarization of yellow sand is about 10%, and the degree of depolarization of pollen is about 1 to about 4%.
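  • A common way to quantify the degree of depolarization is the fraction of the return light whose polarization was rotated, estimated from co-polarized and cross-polarized intensities. The sketch below uses that definition together with the approximate figures quoted above; the exact definition and thresholds used in the disclosure are not stated.

```python
# Minimal sketch of type discrimination by depolarization degree.
# Assumes the degree is the cross-polarized fraction of the backscatter and
# uses the approximate percentages quoted in the text as example thresholds.
def depolarization_degree(i_parallel: float, i_perpendicular: float) -> float:
    """Fraction of the return light whose polarization was rotated."""
    return i_perpendicular / (i_parallel + i_perpendicular)

def classify_by_depolarization(degree: float) -> str:
    if degree >= 0.08:           # ~10 %: strongly depolarizing, e.g. yellow sand
        return "yellow sand"
    if 0.01 <= degree <= 0.04:   # ~1 to 4 %: weakly depolarizing, e.g. pollen
        return "pollen"
    return "unknown"

deg = depolarization_degree(i_parallel=97.0, i_perpendicular=3.0)
print(deg, classify_by_depolarization(deg))  # 0.03 pollen
```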
  • the signal processing circuit 46 determines the distance to the second object 92 and the density of the second object 92 by processing the output signal output from the photodetector 44.
  • the signal processing circuit 46 includes a position information acquisition unit 47 and a density information acquisition unit 48, as shown in FIG. The specific operation of determining the distance and the density is the same as that of the signal processing circuit 36 of the first sensor 30.
  • the third sensor 50 detects the third object 94 by emitting irradiation light toward the third object 94 and receiving return light from the third object 94.
  • the third object 94 is an organic stain that emits fluorescence when irradiated with excitation light.
  • the third sensor 50 includes a light source 52, a photodetector 54, and a signal processing circuit 56.
  • the signal processing circuit 56 includes a position information acquisition unit 57 and a density information acquisition unit 58.
  • the light source 52, the light detector 54, and the signal processing circuit 56 correspond to the light source 32, the light detector 34, and the signal processing circuit 36 of the first sensor 30, respectively.
  • the first sensor 30 and the third sensor 50 differ in the direction in which each light source emits irradiation light. For example, while the light source 32 emits irradiation light toward the air in the space 95, the light source 52 emits irradiation light toward the floor or wall surface of the space 95.
  • the operation of each of the light source 52, the photodetector 54, and the signal processing circuit 56 is the same as that of each of the light source 32, the photodetector 34, and the signal processing circuit 36.
  • the first sensor 30, the second sensor 40, and the third sensor 50 each detect an object located in a direction in which the irradiation light is emitted. At this time, when there are a plurality of objects in the emission direction of the irradiation light, the return light returns at different times according to the positions of the objects. Therefore, a plurality of objects located in the emission direction of the irradiation light can be detected at a time based on the time at which the return light is received. When the target object does not exist in the emission direction of the irradiation light, the return light does not return. Therefore, when the return light does not return, it is detected that the target object does not exist on the path of the irradiation light. Each of the first sensor 30, the second sensor 40, and the third sensor 50 transmits a detection result to the computer 60 as sensor data.
  • FIG. 4 is a diagram showing an example of a database including sensor data output from the sensor device according to the present embodiment.
  • the database shown in FIG. 4 is managed by the processor 64 of the computer 60 and stored in the memory 66.
  • the substance name Mi, the sensor data, and the sensor reference position are associated with each other for each i.
  • the sensor data includes the density Di, the distance ri, the horizontal angle θi, and the vertical angle φi.
  • A data number i is assigned to each piece of sensor data received by the computer 60.
  • the processor 64 assigns data numbers in ascending order, for example, in the order in which the communication interface 62 receives the sensor data.
  • the substance name Mi is information indicating the type of the detection target.
  • the type of the target object corresponds to each sensor device. Therefore, the processor 64 can determine the substance name Mi corresponding to the sensor data by identifying the transmission source of the sensor data received by the communication interface 62.
  • for example, the sensor data of data number 1 was transmitted from the first sensor 30, which detects pollen.
  • the density Di is a value calculated based on the above equation (1).
  • Each of the signal processing circuits 36, 46, and 56 of the sensor devices calculates the density Di based on the signal strength Si.
  • the distance ri, the horizontal angle θi, and the vertical angle φi are data indicating the three-dimensional position of the object obtained using LIDAR. Since the position data obtained by LIDAR is expressed in a polar coordinate system, in the present embodiment the computer 60 converts the position data into a three-dimensional orthogonal coordinate system. Details of the coordinate conversion will be described later.
  • the sensor reference position is, for example, the installation position of the sensor device that has transmitted the sensor data among the first sensor 30, the second sensor 40, and the third sensor 50.
  • the sensor reference position does not change.
  • the computer 60 is an example of an image processing device, and includes a communication interface 62, a processor 64, and a memory 66, as shown in FIG.
  • the communication interface 62 transmits and receives data by communicating with each device constituting the non-contact sensing system 10.
  • Communication with each device is, for example, wireless communication based on a wireless communication standard such as Wi-Fi (registered trademark), Bluetooth (registered trademark), or ZigBee (registered trademark), but may be wired communication.
  • the communication interface 62 is an example of an acquisition circuit that acquires three-dimensional coordinate data.
  • the communication interface 62 acquires sensor data from each of the first sensor 30, the second sensor 40, and the third sensor 50 by communicating with each of the first sensor 30, the second sensor 40, and the third sensor 50.
  • the sensor data includes position information, which is an example of three-dimensional coordinate data representing the position of at least one type of object in the space 95. Further, the sensor data includes density information.
  • the three-dimensional coordinate data is generated using the relative positional relationship between the sensor device and the object, which is calculated based on the difference between the time at which the irradiation light is emitted and the time at which the return light is detected, and the coordinates of the sensor device in the space 95. The relative positional relationship corresponds to the distance ri shown in FIG. 4.
  • the coordinates of the sensor device in the space 95 correspond to the coordinates (x0, y0, z0) indicating the sensor reference position shown in FIG. 4.
  • the communication interface 62 acquires captured image data from the camera 20 by, for example, communicating with the camera 20.
  • the communication interface 62 may transmit a control signal including a shooting instruction or a sensing instruction to at least one of the camera 20, the first sensor 30, the second sensor 40, and the third sensor 50.
  • the communication interface 62 further transmits level distribution information corresponding to the concentration distribution of the target object to the server device 70 by communicating with the server device 70.
  • the communication interface 62 transmits the composite image data to the tablet terminal 80 by communicating with the tablet terminal 80.
  • the processor 64 generates a composite image based on the sensor data acquired by the communication interface 62.
  • the composite image is a composite image in which a captured image representing the space 95 captured by the camera 20 and an object image are composited.
  • the object image is an example of a second image representing at least one type of object existing in the space 95.
  • the processor 64 generates a concentration distribution of the object in the space 95 based on the sensor data. Specifically, the processor 64 generates a three-dimensional distribution of the density by expressing the space 95 by coordinates in a three-dimensional orthogonal coordinate system and associating the density with each coordinate.
  • the x-axis, y-axis, and z-axis shown in FIG. 1 indicate three axes of a three-dimensional orthogonal coordinate system.
  • the x axis and the y axis are two axes parallel to the floor of the space 95, and the z axis is one axis perpendicular to the floor.
  • the setting example of the three axes is not limited to this.
  • the processor 64 generates a level distribution which is an example of the concentration distribution of the object.
  • the level distribution is a distribution of the management level Ci determined based on the density information.
  • the density Di is classified into a plurality of level values according to its magnitude.
  • the management level Ci is a level value at which the density Di indicated by the density information is classified.
  • the processor 64 determines the management level Ci based on the conditional expression shown in FIG.
  • FIG. 5 is a diagram showing a conditional expression for determining the management level Ci in the non-contact sensing system 10 according to the present embodiment.
  • the conditional expression is stored in the memory 66, for example.
  • the management level Ci is represented by five levels from “1” to “5”. Based on the relationship between the density Di and the reference value Lm, the processor 64 determines the management level Ci. As shown in FIG. 6, the reference value Lm is a value predetermined for each type of target object.
  • FIG. 6 is a diagram illustrating an example of a reference value database indicating a reference value for each substance. The reference value database is stored in the memory 66, for example.
  • the level of the management level Ci is not limited to five levels, but may be two levels, three levels, or four levels, or may be six levels or more. In the conditional expression shown in FIG. 5, the value of the coefficient (for example, “0.4”) by which the reference value Lm is multiplied is merely an example.
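  • Since only one coefficient (0.4) of the conditional expression in FIG. 5 is reproduced in the text, the following sketch uses evenly spaced placeholder coefficients; the actual boundaries are those defined in FIG. 5.

```python
# Minimal sketch of management-level determination from a density Di and a
# substance-specific reference value Lm. FIG. 5 is not reproduced here, so the
# coefficients below (other than the 0.4 mentioned in the text) are placeholders.
LEVEL_COEFFICIENTS = [0.2, 0.4, 0.6, 0.8]  # assumed boundaries as multiples of Lm

def management_level(density_di: float, reference_lm: float) -> int:
    """Classify Di into one of five levels, 1 (lowest) to 5 (highest)."""
    level = 1
    for coeff in LEVEL_COEFFICIENTS:
        if density_di > coeff * reference_lm:
            level += 1
    return level

print(management_level(density_di=5.5, reference_lm=10.0))  # 3 with these placeholders
```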
  • the processor 64 further determines the contour of the object based on the generated three-dimensional distribution. Further, the processor 64 determines a predetermined position in the determined contour as a representative position.
  • the object image includes the determined outline and the representative position.
  • the processor 64 determines the contour of the target object based on the density Di for each coordinate. Specifically, the processor 64 determines the contour of the aerosol existing in the space 95 based on the management level Ci calculated based on the density Di for each coordinate.
  • FIG. 7 is a diagram for explaining a method of determining an aerosol contour by the non-contact sensing system 10 according to the present embodiment.
  • a method of determining a contour in a two-dimensional level distribution defined by the x-axis and the y-axis will be described.
  • the same method can be applied to a three-dimensional case.
  • the management level Ci is calculated for each coordinate represented by the x coordinate and the y coordinate.
  • the processor 64 determines, for example, a region where the management level Ci is equal to or greater than a set value, and determines the outline of the region as the aerosol contour. For example, when the set value is "2", the processor 64 determines the outline 90a of the area where the management level Ci is "2" or more as the aerosol contour. In FIG. 7, the area where the management level Ci is "2" or more is shaded with dots. The example shown in FIG. 7 indicates that aerosol was detected at two places in the space.
  • the set value for determining the contour may be changeable. For example, when the set value is increased, only the portion where the concentration of the aerosol is sufficiently high can be determined as the aerosol existence range. Alternatively, when the set value is reduced, it can be determined as the existence range of the aerosol including the portion where the concentration of the aerosol is low.
  • the processor 64 may determine the contour for each set value using a plurality of set values. For example, in the example shown in FIG. 7, a contour 90a corresponding to the set value "2" and a contour 90b corresponding to the set value "3" are determined.
  • the contour 90a is the outermost contour of the determined plurality of contours, and corresponds to a contour indicating an aerosol existence range.
  • the outline 90b corresponds to an outline indicating a region where the concentration of the aerosol is higher in the range where the aerosol exists. As described above, the difference in the concentration of the aerosol can be represented by the outline within the aerosol existing range.
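  • The contour determination on a gridded level distribution can be sketched as follows: cells at or above the set value are grouped into connected regions, and the boundary cells of each region form a contour. The grid values and the 4-connectivity used here are illustrative assumptions.

```python
# Minimal sketch of contour determination on a 2D level distribution.
# Cells with management level >= the set value are grouped into connected
# regions (scipy.ndimage.label), and each region's boundary cells are taken
# as the aerosol contour. Grid values below are illustrative.
import numpy as np
from scipy import ndimage

def contours_from_levels(level_grid: np.ndarray, set_value: int):
    """Return a list of boundary-cell coordinate arrays, one per region."""
    mask = level_grid >= set_value
    labeled, num_regions = ndimage.label(mask)   # 4-connected regions
    contours = []
    for region in range(1, num_regions + 1):
        region_mask = labeled == region
        # A boundary cell is a region cell with at least one neighbor outside.
        eroded = ndimage.binary_erosion(region_mask)
        boundary = region_mask & ~eroded
        contours.append(np.argwhere(boundary))
    return contours

levels = np.array([[1, 1, 1, 1, 1],
                   [1, 2, 3, 2, 1],
                   [1, 2, 2, 2, 1],
                   [1, 1, 1, 1, 1]])
for outline in contours_from_levels(levels, set_value=2):
    print(outline.tolist())
```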
  • the representative position in the contour is the center of gravity of the aerosol concentration distribution in the contour.
  • the processor 64 determines the center of gravity based on the management level Ci for each coordinate existing in the contour. For example, when the coordinates of the center of gravity are (Xc, Yc, Zc), the processor 64 determines them based on equation (2), a weighted average in which each coordinate within the contour is weighted by its management level Ci.
  • Here, Σ() in equation (2) is an operator representing the sum of the terms in the parentheses, and i ranges over the coordinates located within the determined contour.
  • the representative position may be the center of gravity of the three-dimensional figure having the determined contour as the outer periphery.
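  • A sketch of the representative-position computation follows, assuming equation (2) is the management-level-weighted average of the coordinates inside the contour.

```python
# Minimal sketch of the center-of-gravity computation inside a contour,
# assuming equation (2) is a weighted average in which each coordinate
# (Xi, Yi, Zi) inside the contour is weighted by its management level Ci.
import numpy as np

def center_of_gravity(coords_xyz: np.ndarray, levels_ci: np.ndarray) -> np.ndarray:
    """Return (Xc, Yc, Zc) = sum(Ci * coord_i) / sum(Ci) over points in the contour."""
    weights = levels_ci.astype(float)
    return (coords_xyz * weights[:, None]).sum(axis=0) / weights.sum()

coords = np.array([[1.0, 2.0, 1.0],
                   [1.5, 2.5, 1.0],
                   [2.0, 2.0, 1.2]])
ci = np.array([2, 3, 2])
print(center_of_gravity(coords, ci))  # concentration-weighted representative position
```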
  • the memory 66 is a storage device for storing captured image data and sensor data.
  • the memory 66 stores a program executed by the processor 64, parameters necessary for executing the program, and the like.
  • the memory 66 also functions as an area for executing a program by the processor 64.
  • the memory 66 has, for example, a non-volatile memory such as an HDD (Hard Disk Drive) or a semiconductor memory, and a volatile memory such as a RAM (Random Access Memory).
  • the server device 70 receives the level distribution information transmitted from the computer 60, and performs a predetermined process using the received level distribution information. Specifically, server device 70 alerts a person using space 95 based on the level distribution information. For example, the server device 70 generates an attention image which is an image for alerting, and transmits the generated attention image to the tablet terminal 80.
  • the server device 70 determines whether or not the detected concentration of at least one type of target object exceeds a threshold value. Specifically, the server device 70 determines whether or not the representative management level C in the space 95 exceeds a threshold. If the server device 70 determines that the representative management level C exceeds the threshold, it generates a caution image.
  • the threshold value is a predetermined fixed value, but is not limited to this. For example, the threshold may be appropriately updated by machine learning.
  • the representative management level C is calculated based on, for example, a representative value Cm of the management level for each object.
  • the representative value Cm is a value representing the management level of the corresponding object, and is, for example, the maximum value of the management level in the level distribution of the corresponding object.
  • the server device 70 calculates a representative value Cm for each object based on the level distribution.
  • FIG. 8 is a diagram showing a representative value Cm of the management level for each object in the space 95 obtained by the non-contact sensing system 10 according to the present embodiment.
  • the server device 70 calculates the representative management level C by averaging the representative values for each object. For example, in the example shown in FIG. 8, the representative management level C is “3.8”.
  • the representative management level C may not be the average value of the plurality of representative values Cm.
  • the representative management level C may be a weighted addition value of a plurality of representative values Cm.
  • for example, when the weight of pollen and dust is set to 1, the weights of CO2, moisture, and surface organic stain may be set to 0.3, 0.1, and 0.1, respectively.
  • the weight value is not limited to these, and may be changeable based on an instruction from a user or the like.
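  • The representative management level C can then be obtained as a weighted combination of the per-object representative values Cm. The sketch below uses the example weights quoted above and a normalized weighted average; whether the disclosure normalizes the weighted addition is not stated, and the Cm values are illustrative.

```python
# Minimal sketch of the representative management level C as a weighted
# average of per-object representative values Cm, using the example weights
# from the text (pollen and dust = 1, CO2 = 0.3, moisture and organic stain = 0.1).
# The Cm values below are illustrative.
WEIGHTS = {"pollen": 1.0, "dust": 1.0, "CO2": 0.3, "moisture": 0.1, "organic stain": 0.1}

def representative_level(cm_by_object: dict) -> float:
    num = sum(WEIGHTS[name] * cm for name, cm in cm_by_object.items())
    den = sum(WEIGHTS[name] for name in cm_by_object)
    return num / den

print(representative_level(
    {"pollen": 5, "dust": 4, "CO2": 3, "moisture": 2, "organic stain": 3}))
```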
  • the server device 70 may control an air conditioner installed in the space 95. Alternatively, the server device 70 may give preventive advice for suppressing an increase in the concentration of pollen or dust, for example.
  • the preventive advice is, for example, an instruction to prompt the user to ventilate the space 95 or an instruction to drive the device such as an air purifier disposed in the space 95.
  • the server device 70 outputs image data or audio data including preventive advice to the tablet terminal 80. For example, the server device 70 obtains information on alerting or preventive advice by referring to weather observation data and the like. In addition, the server device 70 may generate information on alerting or preventive advice by performing machine learning based on a temporal change in the concentration or the management level.
  • the tablet terminal 80 is a portable information processing terminal.
  • the tablet terminal 80 may be, for example, a multifunctional information terminal such as a tablet PC or a smartphone, or may be an information terminal dedicated to the non-contact sensing system 10.
  • the tablet terminal 80 is an example of a display device including a display screen 82 and a control unit 84.
  • the display screen 82 displays the composite image.
  • the display screen 82 is, for example, a liquid crystal display panel, but is not limited to this.
  • the display screen 82 may be a self-luminous display panel using an organic EL (Electroluminescence) element.
  • the display screen 82 is, for example, a touch panel display, and may be capable of receiving an input from a user.
  • the control unit 84 causes the display screen 82 to display the composite image.
  • the control unit 84 includes, for example, a nonvolatile memory storing a program, a volatile memory serving as a temporary storage area for executing the program, an input / output port, a processor executing the program, and the like.
  • the control unit 84 acquires the composite image data transmitted from the computer 60, and displays the composite image on the display screen 82 based on the acquired composite image data. For example, the control unit 84 causes the display screen 82 to display the composite image 100 shown in FIG. 9.
  • FIG. 9 is a diagram showing a display example on the display screen 82 of the tablet terminal 80 which is an example of the display device according to the present embodiment. As shown in FIG. 9, the composite image 100 is displayed on the display screen 82.
  • the composite image 100 is an image in which the captured image 101 and the aerosol image 102 are composited.
  • the composite image 100 is, for example, a still image.
  • the photographed image 101 represents the space 95 captured by the camera 20.
  • the captured image 101 is an example of a first image.
  • the captured image 101 is an image obtained by capturing the space 95 in the horizontal direction, but is not limited to this.
  • the captured image 101 may be, for example, an image obtained by imaging the space 95 from above. In this case, the captured image 101 corresponds to the top view illustrated in FIG. 1.
  • the aerosol image 102 is an example of an object image representing at least one type of object existing in the space 95.
  • the aerosol image 102 represents pollen, which is an example of an aerosol.
  • the aerosol image 102 reflects the position of at least one type of object in the captured image 101 in the depth direction.
  • the aerosol image 102 is an example of a second image.
  • the aerosol image 102 includes a contour 102a and distance information 102b.
  • the outline 102a represents, for example, a range where the first object 90 detected by the first sensor 30 exists.
  • the distance information 102b is a numerical value indicating the distance from the reference position to the representative position in the outline 102a.
  • the reference position is a position existing in the space 95.
  • the reference position is the installation position of the camera 20.
  • the reference position may be a position of a person or an apparatus such as an air purifier existing in the space 95.
  • the aerosol image 102 may reflect the concentration of the aerosol.
  • the aerosol image 102 may include level information indicating the management level Ci of the concentration of the aerosol.
  • the aerosol image may represent two or more types of aerosols in different display modes. Further, when the concentration of the aerosol exceeds the threshold value, a caution image for calling the user's attention may be displayed on the display screen 82.
  • FIG. 10 is a sequence diagram showing an operation of the non-contact sensing system 10 according to the present embodiment.
  • the camera 20 captures an image of the space 95 (S10).
  • the camera 20 transmits the captured image data obtained by the imaging to the computer 60 (S12).
  • the first sensor 30 performs a process of detecting the first object 90 (S14). Specifically, in the first sensor 30, the light source 32 emits irradiation light toward the first target 90, and the photodetector 34 receives the return light from the first target 90.
  • the signal processing circuit 36 generates sensor data including the distance and the density of the first object 90 based on the signal intensity of the return light.
  • the first sensor 30 transmits the generated sensor data to the computer 60 (S16).
  • the second sensor 40 performs a process of detecting the second object 92 (S18). Specifically, in the second sensor 40, the light source 42 emits irradiation light toward the second object 92, and the photodetector 44 receives return light from the second object 92. The signal processing circuit 46 generates sensor data including the distance and the density of the second object 92 based on the signal intensity of the return light. The second sensor 40 transmits the generated sensor data to the computer 60 (S20).
  • the third sensor 50 performs the detection processing of the third object 94 (S22). Specifically, in the third sensor 50, the light source 52 emits irradiation light toward the third object 94, and the photodetector 54 receives return light from the third object 94. The signal processing circuit 56 generates sensor data including the distance and the density of the third object 94 based on the signal intensity of the return light. The third sensor 50 transmits the generated sensor data to the computer 60 (S24).
  • any of the imaging by the camera 20 (S10), the detection processing by the first sensor 30 (S14), the detection processing by the second sensor 40 (S18), and the detection processing by the third sensor 50 (S22) may be performed first, or they may be performed simultaneously.
  • the imaging (S10) and the detection processing (S14, S18, and S22) may be performed at a timing based on an instruction from the computer 60, the server device 70, or the like.
  • Each device transmits the captured image data or the sensor data when the captured image data or the sensor data is obtained. Alternatively, each device may transmit captured image data or sensor data when receiving a request from the computer 60.
  • the computer 60 receives the captured image data and each sensor data, and performs a process of creating a 3D database based on the received captured image data and each sensor data (S26). Specifically, the processor 64 of the computer 60 converts a two-dimensional captured image into a pseudo three-dimensional image. Further, the processor 64 performs coordinate conversion of the sensor data obtained in the polar coordinate system into a three-dimensional orthogonal coordinate system.
  • FIG. 11 is a flowchart showing a process of converting captured image data into a 3D database in the operation of the non-contact sensing system 10 according to the present embodiment.
  • FIG. 11 is an example of the detailed operation of step S26 in FIG.
  • the processor 64 acquires captured image data via the communication interface 62 (S102).
  • the captured image included in the captured image data is a two-dimensional image.
  • the processor 64 converts the two-dimensional captured image into a pseudo three-dimensional image by using a generally known technique of converting a two-dimensional image into a pseudo three-dimensional image (S104).
  • the captured image data may include a distance image indicating distances to the walls, floor, and ceiling constituting the space 95, and to a person, furniture, and the like located in the space 95.
  • the captured image data may include a plurality of captured images captured from a plurality of different viewpoints.
  • the processor 64 may generate a three-dimensional image using a captured image and a distance image, or using a plurality of captured images. Thereby, the certainty of the three-dimensional image can be increased.
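Neither the pseudo three-dimensionalization technique nor the camera model is specified in this excerpt. The following is a minimal sketch of how a captured image could be expanded into three dimensions when a distance image is available, assuming a pinhole camera model; the function name and the intrinsic parameters (fx, fy, cx, cy) are illustrative assumptions, not values from the patent.

```python
import numpy as np

def back_project(depth, fx, fy, cx, cy):
    """Back-project a per-pixel distance image into 3D points (camera frame).

    depth[v, u] is assumed to be the distance along the optical axis for
    pixel (u, v); fx, fy are focal lengths in pixels and (cx, cy) is the
    principal point. All of these are illustrative assumptions.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)  # shape (h, w, 3)
```

Under these assumptions, each pixel of the captured image receives a three-dimensional coordinate, so the photographed scene can be placed in the same coordinate system as the sensor data described next.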
  • FIG. 12 is a flowchart showing a process of converting the sensor data into a 3D database in the operation of the non-contact sensing system 10 according to the present embodiment.
  • FIG. 12 is an example of the detailed operation of step S26 in FIG.
  • the processor 64 acquires sensor data from the database stored in the memory 66 (S112). Specifically, the processor 64 acquires the distance ri, the horizontal angle ⁇ i, the vertical angle ⁇ i, and the substance name Mi. The processor 64 converts the acquired sensor data into spatial coordinates, which is a three-dimensional orthogonal coordinate system, based on the following equation (3) (S114).
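Equation (3) is not reproduced in this excerpt. As a sketch, the conversion in step S114 can be written as the standard spherical-to-Cartesian relation below, assuming the horizontal angle θi is measured in the X-Y plane and the vertical angle φi is the elevation from that plane; the patent's actual axis convention may differ.

```python
import math

def to_spatial_coordinates(r_i, theta_i, phi_i):
    """Convert one sensor sample (distance ri, horizontal angle θi,
    vertical angle φi, in radians) into orthogonal coordinates (Xi, Yi, Zi).
    The axis convention is an assumption standing in for equation (3)."""
    x_i = r_i * math.cos(phi_i) * math.cos(theta_i)
    y_i = r_i * math.cos(phi_i) * math.sin(theta_i)
    z_i = r_i * math.sin(phi_i)
    return x_i, y_i, z_i
```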
  • Either the pseudo three-dimensionalization of the captured image data shown in FIG. 11 and the three-dimensionalization of the sensor data shown in FIG. 12 may be performed first or simultaneously.
  • the spatial coordinates (Xi, Yi, Zi) expressed in the three-dimensional orthogonal coordinate system are associated with the data number No. i.
  • FIG. 13 is a diagram illustrating an example of a 3D database generated by the non-contact sensing system 10 according to the present embodiment.
  • a substance name Mi, a concentration Di, a management level Ci, and space coordinates (Xi, Yi, Zi) are associated with each i.
  • After the 3D database is generated, as shown in FIG. 10, the computer 60 generates a level distribution based on the generated 3D database (S28). The computer 60 transmits level distribution information indicating the generated level distribution to the server device 70 (S30).
  • FIG. 14 is a flowchart showing a level distribution generation process in the operation of the non-contact sensing system 10 according to the present embodiment.
  • FIG. 14 shows an example of the detailed operation of step S28 in FIG.
  • the processor 64 acquires density information and spatial coordinates (Xi, Yi, Zi) by reading from the memory 66 (S122).
  • the processor 64 determines the management level Ci based on the comparison with the reference value Lm for each substance, and generates a level distribution (S124).
  • the processor 64 determines a contour and a representative position in the contour based on the generated level distribution (S126). The process of determining the contour and the representative position is, for example, as described above with reference to FIG.
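The concrete rule for comparing a concentration Di with the reference value Lm in step S124 is described elsewhere in the patent and is not reproduced here. The sketch below simply assumes that the management level Ci counts whole multiples of a per-substance reference value, capped at a maximum level; the reference values and the cap are illustrative assumptions.

```python
# Illustrative per-substance reference values Lm; not values from the patent.
REFERENCE_VALUES = {"pollen": 30.0, "dust": 50.0}

def management_level(substance, concentration, max_level=3):
    """Assign a management level Ci from a concentration Di (assumed rule:
    number of whole multiples of the reference value Lm, capped)."""
    lm = REFERENCE_VALUES[substance]
    return min(int(concentration // lm), max_level)

def level_distribution(records):
    """records: iterable of (substance Mi, concentration Di, (Xi, Yi, Zi)).
    Returns (Xi, Yi, Zi, Mi, Ci) tuples forming the level distribution."""
    return [(x, y, z, m, management_level(m, d)) for m, d, (x, y, z) in records]
```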
  • After the level distribution is generated, as shown in FIG. 10, the computer 60 generates a composite image (S32). Specifically, the computer 60 composites an image including the contour and the distance information with the captured image by mapping the level distribution onto the captured image. The computer 60 transmits the composite image data to the tablet terminal 80 (S34).
  • the image including the contour and the distance information is an example of a second image generated by projecting three-dimensional coordinate data, which represents the position in the space of at least one type of aerosol, onto the two-dimensional space represented by the captured image.
  • the image including the contour and the distance information is, for example, the aerosol image 102 shown in FIG.
  • the computer 60 generates the image including the contour and the distance information by projecting the three-dimensional coordinate data representing the position of the at least one type of aerosol in the space onto the two-dimensional space represented by the captured image.
  • Specifically, the computer 60 generates the image including the contour and the distance information by expanding the captured image into a pseudo three-dimensional image, associating the expanded three-dimensional image with the three-dimensional coordinate data, and performing the projection.
  • Associating the three-dimensional image with the three-dimensional coordinate data means matching the origin and the three axes of the coordinate system of the three-dimensional image with the origin and the three axes of the coordinate system of the three-dimensional coordinate data so that they coincide at the same positions in space.
  • the computer 60 generates a synthesized image by synthesizing the image including the contour and the distance information with the photographed image.
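As a concrete illustration of steps S32 and S34, the sketch below projects the three-dimensional aerosol coordinates onto the image plane of the captured image and draws a contour and a distance label onto a copy of that image. The pinhole intrinsics, the use of OpenCV for drawing, and the convex hull as the contour are assumptions; the patent only states that the three-dimensional coordinate data is projected onto the two-dimensional space of the captured image and composited.

```python
import numpy as np
import cv2  # OpenCV, used here only for drawing; an assumption, not required by the patent

def project_points(points_3d, fx, fy, cx, cy):
    """Project 3D aerosol coordinates (camera frame) onto the image plane."""
    pts = np.asarray(points_3d, dtype=float)
    u = fx * pts[:, 0] / pts[:, 2] + cx
    v = fy * pts[:, 1] / pts[:, 2] + cy
    return np.stack([u, v], axis=-1)

def composite(captured_bgr, points_3d, distance_m, fx, fy, cx, cy):
    """Draw the aerosol contour and a distance label onto the captured image."""
    out = captured_bgr.copy()
    uv = project_points(points_3d, fx, fy, cx, cy).astype(np.int32)
    hull = cv2.convexHull(uv)                       # contour of the projected range
    cv2.polylines(out, [hull], True, (0, 0, 255), 2)
    rep = uv.mean(axis=0).astype(int)               # representative position
    cv2.putText(out, f"{distance_m:.1f} m", (int(rep[0]), int(rep[1])),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
    return out
```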
  • the server device 70 acquires auxiliary information based on the level distribution information transmitted from the computer 60 (S36).
  • the auxiliary information is information including, for example, alert or preventive advice.
  • the server device 70 transmits the acquired auxiliary information to the tablet terminal 80 (S38).
  • FIG. 15 is a flowchart illustrating a process of generating auxiliary information in the operation of the non-contact sensing system 10 according to the present embodiment.
  • FIG. 15 shows an example of the detailed operation of step S36 in FIG.
  • the server device 70 determines a representative value Cm of the management level for each object in the space 95 (S132). Next, the server device 70 determines a representative management level C in the space 95 (S134). The specific method of determining the representative management level C is as described above with reference to FIG.
  • the server device 70 compares the representative management level C with the threshold (S136). When the representative management level C is larger than the threshold (Yes in S136), the server device 70 generates a caution image (S138). Preventive advice may be generated instead of a caution image. When the representative management level C is equal to or smaller than the threshold (No in S136), the processing for generating the auxiliary information ends.
  • the server device 70 may compare the representative value Cm of the management level for each object with the threshold value. That is, the server device 70 need not determine the representative management level C. For example, when at least one representative value Cm among the representative values Cm of the management levels of a plurality of objects such as pollen and dust is larger than a threshold, the server device 70 may generate a caution image.
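As a sketch of the checks in steps S132 to S138, the server-side logic can be written as below, assuming that the representative value Cm of each object is the maximum management level observed for that object and that the threshold is a given integer; both assumptions are illustrative and not stated verbatim in this excerpt.

```python
def representative_levels(levels_by_object):
    """levels_by_object: dict mapping an object name (e.g. 'pollen', 'dust')
    to the management levels Ci observed in the space. Returns the
    representative value Cm per object (assumed here to be the maximum)."""
    return {name: max(levels) for name, levels in levels_by_object.items()}

def auxiliary_info(levels_by_object, threshold=2):
    """Per-object variant: return a caution message if any representative
    value Cm exceeds the threshold, otherwise None (no auxiliary information)."""
    cm = representative_levels(levels_by_object)
    exceeded = [name for name, level in cm.items() if level > threshold]
    if exceeded:
        return "Caution: high concentration of " + ", ".join(exceeded)
    return None
```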
  • the tablet terminal 80 acquires the composite image data transmitted from the computer 60 and the auxiliary information transmitted from the server device 70, and displays the composite image on the display screen 82.
  • the composite image displayed on the display screen 82 may not include the auxiliary information.
  • a composite image 100 as shown in FIG. 9 is displayed on the display screen 82.
  • FIG. 9 shows a display example that does not include the auxiliary information. A display example including the auxiliary information will be described later with reference to FIG. 19.
  • FIG. 16 is a diagram showing another example of the display on the display screen 82 of the tablet terminal 80 according to the present embodiment. As shown in FIG. 16, a composite image 110 is displayed on the display screen 82.
  • the composite image 110 is an image in which the captured image 101 and the aerosol images 112 and 114 are composited.
  • Each of the aerosol images 112 and 114 is an example of a second image representing at least one type of aerosol present in the space 95.
  • the aerosol images 112 and 114 each represent pollen.
  • the aerosol image 112 includes a contour 112a and distance information 112b.
  • the aerosol image 114 includes a contour 114a and distance information 114b.
  • the distance information 112b is a color applied within the outline 112a.
  • the type or shade of the color is predetermined in accordance with the distance.
  • in the figure, the color is represented by the density of the dot hatching provided within the outline 112a.
  • the color given in the outline 114a as the distance information 114b is a darker color than the color given in the outline 112a as the distance information 112b.
  • the composite image 110 shows that the pollen represented by the aerosol image 114 is at a shorter distance than the pollen represented by the aerosol image 112.
  • the distance information 112b and 114b may be represented by shading instead of colors.
  • the distance may be represented by the density of dots provided in the outline 112a or 114a.
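One possible realization of the color- or shade-based distance display described above is a simple linear mapping from distance to gray level, as sketched below; the distance bounds and gray range are illustrative assumptions, not values from the patent.

```python
def distance_to_gray(distance_m, d_min=0.5, d_max=3.0):
    """Map a distance in metres to an 8-bit gray level so that nearer
    aerosols are drawn darker (the bounds and levels are assumptions)."""
    t = min(max((distance_m - d_min) / (d_max - d_min), 0.0), 1.0)
    return int(round(80 + t * (220 - 80)))  # 80 (dark, near) .. 220 (light, far)
```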
  • the aerosol image 112 further includes level information 112c.
  • the aerosol image 114 further includes level information 114c.
  • the level information 112c indicates the type and density of the aerosol represented by the aerosol image 112.
  • the density represented by the level information 112c is, for example, a value representing the management level Ci of each coordinate in the outline 112a.
  • the level information 112c indicates the maximum value or the average value of the management level Ci of each coordinate in the outline 112a.
  • the level information 112c includes a character representing pollen, which is the type of aerosol, and a numerical value indicating the management level Ci. The same applies to the level information 114c.
  • Since the distance to the aerosol is displayed in a display mode other than numerical values, an increase in the number of characters, including numerical values, in the image is suppressed, so complication of the image can be suppressed.
  • Meanwhile, the concentration of the aerosol can be represented using a numerical value and characters. This makes it possible to increase the amount of information that can be presented to the user while suppressing complication of the image.
  • FIG. 17 is a diagram showing another example of the display on the display screen 82 of the tablet terminal 80 according to the present embodiment. As shown in FIG. 17, on the display screen 82, a composite image 120 is displayed.
  • the composite image 120 is an image in which the captured image 101 and the aerosol images 122, 124, 126, and 128 are composited.
  • Each of the aerosol images 122, 124, 126 and 128 represents at least one type of aerosol present in the space 95.
  • the aerosol images 122 and 128 represent pollen.
  • Aerosol images 124 and 126 represent dust.
  • the aerosol image 122 includes a contour 122a and distance information 122b.
  • the aerosol image 124 includes an outline 124a and distance information 124b.
  • the aerosol image 126 includes an outline 126a and distance information 126b.
  • the aerosol image 128 includes an outline 128a and distance information 128b.
  • Each of the distance information 122b, 124b, 126b, and 128b is a numerical value representing the distance, similarly to the composite image 100 shown in FIG. 9.
  • the aerosol image 122 further includes the level information 122c.
  • the aerosol image 124 further includes level information 124c.
  • the aerosol image 126 further includes level information 126c.
  • the aerosol image 128 further includes level information 128c.
  • the level information 122c is a color or hatching applied within the outline 122a. Specifically, the level information 122c indicates the magnitude of the management level Ci by the shade of the color or the density of the hatching. For example, the level information 122c indicates that the management level Ci is higher as the color is darker or the hatching is denser, and that the management level Ci is lower as the color is lighter or the hatching is sparser. The same applies to the level information 124c, 126c, and 128c.
  • the level information 122c indicates the type of the aerosol by the color or the type of hatching. That is, the same type of color or hatching indicates the same type of aerosol. For example, in the example shown in FIG. 17, the dot hatching represents pollen, and the grid-like hatching represents dust. The same applies to the level information 124c, 126c, and 128c.
  • the aerosol represented by the aerosol image 122 is of the same type as the aerosol represented by the aerosol image 128, and has a lower concentration and is located at a greater distance than the aerosol represented by the aerosol image 128.
  • the aerosol represented by the aerosol image 124 is of the same type as the aerosol represented by the aerosol image 126, and has a higher concentration and is located at a shorter distance than the aerosol represented by the aerosol image 126.
  • FIG. 18 is a diagram showing another example of the display on the display screen 82 of the tablet terminal 80 according to the present embodiment. As shown in FIG. 18, a composite image 130 is displayed on the display screen 82.
  • the composite image 130 is an image in which the photographed image 101 and the aerosol images 132, 134, 136, and 138 are composited. Aerosol images 132, 134, 136 and 138 each represent at least one aerosol present in space 95. In the example shown in FIG. 18, the aerosol images 132 and 138 represent pollen. Aerosol images 134 and 136 represent dust.
  • the aerosol image 132 includes a contour 132a, distance information 132b, and level information 132c.
  • the aerosol image 134 includes an outline 134a, distance information 134b, and level information 134c.
  • the aerosol image 136 includes an outline 136a, distance information 136b, and level information 136c.
  • the aerosol image 138 includes an outline 138a, distance information 138b, and level information 138c.
  • the distance information 132b, 134b, 136b, and 138b is each a color applied within the corresponding outline and predetermined according to the distance, similarly to the composite image 110 shown in FIG. 16.
  • the distance information 132b, 134b, 136b, and 138b also indicates the type of the aerosol by the color or the type of hatching. That is, the same type of color or hatching indicates the same type of aerosol.
  • the hatching of dots represents pollen
  • the grid-like shading represents dust.
  • the level information 132c, 134c, 136c, and 138c each include a character indicating the type of the aerosol and a numerical value indicating the management level Ci, similarly to the composite image 110 illustrated in FIG. 16.
  • FIG. 19 is a diagram showing another example of the display on the display screen 82 of the tablet terminal 80 according to the present embodiment. As shown in FIG. 19, on the display screen 82, a composite image 140 is displayed.
  • the composite image 140 differs from the composite image 130 shown in FIG. 18 in that an aerosol image 148 is composited instead of the aerosol image 138.
  • the aerosol image 148 includes an outline 148a, distance information 148b, and level information 148c.
  • the contour 148a, the distance information 148b, and the level information 148c are the same as the contour 138a, the distance information 138b, and the level information 138c shown in FIG. 18, respectively.
  • the level information 148c of the aerosol image 148 indicates a management level Ci of "3". Since this management level Ci exceeds the threshold value, the display screen 82 displays a caution image 141 for calling the user's attention.
  • the attention image 141 is, for example, a character calling for attention, but is not limited thereto.
  • the attention image 141 may be, for example, a predetermined figure.
  • the display mode is not particularly limited as long as it can attract the user's attention.
  • the entire composite image 140 displayed on the display screen 82 may blink, or its color tone may be changed.
  • preventive advice may be displayed on the display screen 82 in addition to or instead of the caution image 141.
  • the preventive advice is displayed, for example, as character information.
  • Alternatively, a URL (Uniform Resource Locator) or a QR code (registered trademark) may be displayed.
  • FIG. 20 is a diagram showing another example of the display on the display screen 82 of the tablet terminal 80 according to the present embodiment. As shown in FIG. 20, on the display screen 82, a composite image 200 is displayed.
  • the composite image 200 is an image in which the space 95 is three-dimensionally modeled and a contour representing a range in which at least one type of aerosol exists is shown. Specifically, the composite image 200 is a pseudo three-dimensional image whose viewpoint can be changed.
  • the composite image 200 is an image in which the captured image 201 and the aerosol image 202 are composited.
  • the aerosol image 202 includes an outline 202a and level information 202c.
  • the composite image 200 when the space 95 is viewed in the horizontal direction is displayed on the display screen 82, as shown in part (a) of FIG. 20.
  • the display screen 82 displays the composite image 200 when the space 95 is viewed from obliquely above, as shown in part (b) of FIG.
  • the viewpoint can be freely changed.
  • the composite image 200 may be displayed so as to be freely enlarged and reduced.
  • FIG. 21 is a diagram showing another example of the display on the display screen 82 of the tablet terminal 80 according to the present embodiment. Portions (a) to (e) of FIG. 21 each show a time change of the display on the display screen 82.
  • the composite image 300 displayed on the display screen 82 is sequentially switched at intervals of, for example, one second to several seconds.
  • the composite image 300 is an image in which the captured image 301 and a plurality of aerosol images 312, 322, 332, and 342 are composited.
  • Each of the plurality of aerosol images 312, 322, 332, and 342 corresponds to a distance from the reference position.
  • distance information 302 is displayed on the display screen 82.
  • the distance information 302 represents the distance in the depth direction by a numerical value.
  • the plurality of aerosol images 312, 322, 332, and 342 represent aerosols at distances of 0.8 m, 1.1 m, 1.4 m, and 1.7 m, respectively.
  • the aerosol image 312 includes an outline 312a and level information 312c.
  • the aerosol image 322 includes an outline 322a and level information 322c.
  • the aerosol image 332 includes an outline 332a and level information 332c.
  • the aerosol image 342 includes an outline 342a and level information 342c.
  • the contours 312a, 322a, 332a, and 342a each represent the aerosol presence range at the corresponding distance.
  • the level information 312c, 322c, 332c, and 342c indicates the concentration of the aerosol at the corresponding distance.
  • the level information 312c, 322c, 332c, and 342c each represent the maximum value of the concentration among the coordinates within the corresponding outline at that distance. As shown in part (d) of FIG. 21, the management level Ci of the aerosol, that is, the concentration, is highest at the distance of 1.4 m.
  • the captured image 301 is a still image, but may change with time. That is, the captured image 301 may be a moving image.
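A sketch of how the per-distance slices of FIG. 21 could be prepared from the 3D database is shown below: the entries are grouped into distance bins and, for each bin, the maximum management level is kept as the level information. The bin width, key names, and grouping rule are illustrative assumptions.

```python
from collections import defaultdict

def slices_by_distance(records, bin_width=0.3):
    """records: iterable of dicts with keys 'distance' (from the reference
    position), 'uv' (projected image position), and 'level' (management
    level Ci). Returns (distance, {points, level}) pairs sorted by distance,
    one per bin, keeping the maximum level in each bin."""
    bins = defaultdict(lambda: {"points": [], "level": 0})
    for rec in records:
        d = round(rec["distance"] / bin_width) * bin_width  # quantize to a bin
        bins[d]["points"].append(rec["uv"])
        bins[d]["level"] = max(bins[d]["level"], rec["level"])
    return sorted(bins.items())
```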
  • the first sensor 30, the second sensor 40, and the third sensor 50 have been described as examples in which each of the sensors is an autonomous mobile sensor, but the present invention is not limited to this.
  • At least one of the first sensor 30, the second sensor 40, and the third sensor 50 may be a stationary sensor device fixed at a predetermined position in the space 95.
  • the predetermined position is, for example, a ceiling, a floor, a wall, or the like that forms the space 95.
  • the value of the density itself may be displayed as a numerical value instead of the management level.
  • the line type of the outline may be varied according to the type of aerosol. For example, pollen may be represented by a solid outline, and dust may be represented by a broken outline.
  • the non-contact sensing system 10 may not include the camera 20.
  • a photographed image of the space 95 may be stored in the memory 66 of the computer 60 in advance.
  • At least one of the first sensor 30, the second sensor 40, and the third sensor 50 may be a contact-type sensor.
  • the communication method between the devices described in the above embodiment is not particularly limited.
  • the wireless communication method is, for example, short-range wireless communication such as ZigBee (registered trademark), Bluetooth (registered trademark), or wireless LAN (Local Area Network).
  • the wireless communication method may be communication via a wide area communication network such as the Internet. Wired communication may be performed between the devices instead of wireless communication.
  • the wired communication is power line communication (PLC) or communication using a wired LAN.
  • another processing unit may execute a process executed by a specific processing unit. Further, the order of the plurality of processes may be changed, or the plurality of processes may be executed in parallel.
  • the distribution of the components included in the non-contact sensing system 10 to a plurality of devices is an example. For example, components provided in one device may be provided in another device. Further, the non-contact sensing system 10 may be realized as a single device.
  • FIG. 22 is a diagram showing a tablet terminal 480 integrally provided with the non-contact sensing system 10 according to the embodiment.
  • the tablet terminal 480 is a plate-like device.
  • Parts (a) and (b) of FIG. 22 are plan views showing one surface and the other surface of the tablet terminal 480, respectively.
  • a display screen 482 is provided on one surface of the tablet terminal 480.
  • the camera 20, the light source 32, and the photodetector 34 are provided on the other surface of the tablet terminal 480.
  • the tablet terminal 480 includes the processor 64 and the memory 66 of the computer 60 in the embodiment.
  • the tablet terminal 480 may be configured such that the display screen 482 displaying the composite image 481, the camera 20, the sensor device, and the computer 60 are integrated.
  • the processing performed by the server device 70 may be performed by the computer 60 or the tablet terminal 80.
  • the processing performed by the computer 60 may be performed by the server device 70 or the tablet terminal 80.
  • control unit 84 of the tablet terminal 80 may generate a composite image.
  • the control unit 84 may perform the processing of converting the captured image data into a 3D database illustrated in FIG. 11 and the processing of converting the sensor data into a 3D database illustrated in FIG. 12.
  • the control unit 84 may generate the 3D database (S26), generate the level distribution (S28), and generate the composite image (S32) shown in FIG.
  • the processing described in the above embodiment may be realized by centralized processing using a single device or system, or may be realized by distributed processing using a plurality of devices.
  • the number of processors that execute the program may be one or more. That is, centralized processing or distributed processing may be performed.
  • All or some of the components such as the control unit may be configured by dedicated hardware, or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit, such as a CPU (Central Processing Unit) or a processor, reading and executing a software program recorded on a recording medium such as an HDD (Hard Disk Drive) or a semiconductor memory.
  • the components such as the control unit may be configured by one or a plurality of electronic circuits.
  • Each of the one or more electronic circuits may be a general-purpose circuit or a dedicated circuit.
  • the one or more electronic circuits may include, for example, a semiconductor device, an integrated circuit (IC), or a large scale integration (LSI).
  • the IC or LSI may be integrated on one chip, or may be integrated on a plurality of chips.
  • Although the terms IC and LSI are used here, the name varies depending on the degree of integration, and the circuit may also be referred to as a system LSI, a VLSI (Very Large Scale Integration), or a ULSI (Ultra Large Scale Integration).
  • an FPGA (Field Programmable Gate Array) programmed after the manufacture of the LSI can be used for the same purpose.
  • general or specific aspects of the present disclosure may be realized by a system, an apparatus, a method, an integrated circuit, or a computer program.
  • the present invention may be implemented by a computer-readable non-transitory recording medium such as an optical disk, an HDD, or a semiconductor memory in which the computer program is stored.
  • the present invention may be realized by any combination of a system, an apparatus, a method, an integrated circuit, a computer program, and a recording medium.
  • the present disclosure can be used as a display device or the like that can accurately indicate the position of an aerosol in a space, and can be used, for example, for air-conditioning control or control of space purification processing.
  • 10 non-contact sensing system; 20 camera; 30 first sensor; 32, 42, 52 light source; 34, 44, 54 photodetector; 36, 46, 56 signal processing circuit; 37, 47, 57 position information acquisition unit; 38, 48, 58 density information acquisition unit; 40 second sensor; 50 third sensor; 60 computer; 62 communication interface; 64 processor; 66 memory; 70 server device; 80, 480 tablet terminal; 82, 482 display screen; 84 control unit; 90 first object; 90a, 90b contour; 92 second object; 94 third object; 95 space; 96 floor surface; 100, 110, 120, 130, 140, 200, 300, 481 composite image; 101, 201, 301 photographed image; 102, 112, 114, 122, 124, 126, 128, 132, 134, 136, 138, 148, 202, 312, 322, 332, 342 aerosol image; 102a, 112a, 114a, 122a, 124a, 126a, 128a, 132a, 134a, 136a, 138a, 148a, 202a, 312a,

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Dispersion Chemistry (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

According to one embodiment, the invention relates to a display device provided with a display screen and a control unit that causes the display screen to display a composite image in which a first image, obtained by imaging a space with a camera, is combined with a second image representing at least one type of aerosol present in the space. The position of the at least one type of aerosol in the depth direction of the first image is reflected in the second image.
PCT/JP2019/024410 2018-07-11 2019-06-20 Dispositif d'affichage, dispositif de traitement d'image et procédé de commande WO2020012906A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980029258.5A CN112041665A (zh) 2018-07-11 2019-06-20 显示装置、图像处理装置及控制方法
US17/120,085 US11694659B2 (en) 2018-07-11 2020-12-11 Display apparatus, image processing apparatus, and control method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018131681 2018-07-11
JP2018-131681 2018-07-11
JP2019-108042 2019-06-10
JP2019108042A JP7113375B2 (ja) 2018-07-11 2019-06-10 表示装置、画像処理装置及び制御方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/120,085 Continuation US11694659B2 (en) 2018-07-11 2020-12-11 Display apparatus, image processing apparatus, and control method

Publications (1)

Publication Number Publication Date
WO2020012906A1 true WO2020012906A1 (fr) 2020-01-16

Family

ID=69141986

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/024410 WO2020012906A1 (fr) 2018-07-11 2019-06-20 Dispositif d'affichage, dispositif de traitement d'image et procédé de commande

Country Status (1)

Country Link
WO (1) WO2020012906A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003294567A (ja) * 2002-03-29 2003-10-15 Osaka Gas Co Ltd 気体漏洩可視化および測距装置
US20060203248A1 (en) * 2005-03-11 2006-09-14 Reichardt Thomas A Natural gas leak mapper
JP2006343153A (ja) * 2005-06-07 2006-12-21 Konica Minolta Sensing Inc 3次元位置計測方法および3次元位置計測に用いる装置
JP2007232374A (ja) * 2006-02-27 2007-09-13 Shikoku Res Inst Inc ラマン散乱光による水素ガス可視化方法及びシステム
JP2011529189A (ja) * 2008-07-24 2011-12-01 マサチューセッツ インスティテュート オブ テクノロジー 吸収を利用して画像形成を行うためのシステム及び方法
JP2017032362A (ja) * 2015-07-30 2017-02-09 株式会社キーエンス 測定対象物計測プログラム、測定対象物計測方法および拡大観察装置
US20170089800A1 (en) * 2015-09-30 2017-03-30 General Monitors, Inc. Ultrasonic gas leak location system and method
WO2018061816A1 (fr) * 2016-09-28 2018-04-05 パナソニックIpマネジメント株式会社 Dispositif d'imagerie
WO2019138641A1 (fr) * 2018-01-15 2019-07-18 コニカミノルタ株式会社 Système et procédé de surveillance de gaz

Similar Documents

Publication Publication Date Title
CN111033300B (zh) 用于确定至少一项几何信息的测距仪
JP6579450B2 (ja) スマート照明システム、照明を制御するための方法及び照明制御システム
KR102644439B1 (ko) 하나 이상의 물체를 광학적으로 검출하기 위한 검출기
JP5950296B2 (ja) 人物追跡属性推定装置、人物追跡属性推定方法、プログラム
US9117281B2 (en) Surface segmentation from RGB and depth images
US11568511B2 (en) System and method for sensing and computing of perceptual data in industrial environments
CN107449459A (zh) 自动调试系统和方法
US11694659B2 (en) Display apparatus, image processing apparatus, and control method
CN107025663A (zh) 视觉系统中用于3d点云匹配的杂波评分系统及方法
US20180137369A1 (en) Method and system for automatically managing space related resources
US20150042645A1 (en) Processing apparatus for three-dimensional data, processing method therefor, and processing program therefor
EP3226212A1 (fr) Dispositif de modélisation, dispositif de production de modèle tridimensionnel, procédé de modélisation et programme
CN104919334B (zh) 车辆上用于检测操作手势的通用传感器系统
US11602845B2 (en) Method of detecting human and/or animal motion and performing mobile disinfection
US11055841B2 (en) System and method for determining the quality of concrete from construction site images
JP2023512280A (ja) 物体認識のための検出器
TW201724022A (zh) 對象辨識系統,對象辨識方法及電腦記憶媒體
WO2020012906A1 (fr) Dispositif d'affichage, dispositif de traitement d'image et procédé de commande
Burbano et al. 3D cameras benchmark for human tracking in hybrid distributed smart camera networks
KR102089719B1 (ko) 기계 설비 공사 공정을 위한 제어 방법 및 장치
JP2019075037A (ja) 情報処理装置、情報処理方法、およびプログラム
WO2021199225A1 (fr) Système de traitement d'informations, système de capteur, procédé de traitement d'informations et programme
KR102645539B1 (ko) 구조적 깊이 카메라 시스템에서 인코딩 장치 및 방법
JP7051031B1 (ja) 情報処理装置及び検知システム
WO2022220064A1 (fr) Dispositif de test, procédé de test, programme de test et système de test

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19833562

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19833562

Country of ref document: EP

Kind code of ref document: A1