WO2023090188A1 - Light detection system, processing device, method for controlling light detection system, and program - Google Patents

Light detection system, processing device, method for controlling light detection system, and program

Info

Publication number
WO2023090188A1
Authority
WO
WIPO (PCT)
Prior art keywords
time
distance
light pulse
photodetector
living body
Application number
PCT/JP2022/041391
Other languages
English (en)
Japanese (ja)
Inventor
俊輔 今井
貴真 安藤
Original Assignee
パナソニックIpマネジメント株式会社
Application filed by パナソニックIpマネジメント株式会社
Priority to JP2023561536A (JPWO2023090188A1)
Publication of WO2023090188A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B 5/1455 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters

Definitions

  • The present disclosure relates to a photodetection system, a processing device, a method of controlling the photodetection system, and a program.
  • Patent Literature 1 discloses an apparatus for acquiring internal information of a subject.
  • The present disclosure provides a photodetection system capable of stably acquiring biometric information of a subject in a non-contact manner in an environment in which the living body moves.
  • A photodetection system according to one aspect of the present disclosure includes a light source that emits a light pulse toward a subject part of a living body, a photodetector that detects a reflected light pulse generated when the light pulse is reflected by the subject part, and a processing circuit that controls the light source and the photodetector and acquires distance data regarding the distance between the photodetector and the living body. The processing circuit causes the light source to emit the light pulse, causes the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and causes the photodetector to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection started.
  • The processing circuit sets the time at which the distance deviates from a first range as a first time point, and sets the time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point.
  • The processing circuit then determines whether or not to change the first time based on the distance at the second time point.
  • Computer-readable recording media include non-volatile recording media such as CD-ROMs (Compact Disc-Read Only Memory).
  • According to the present disclosure, it is possible to provide a photodetection system capable of stably acquiring biometric information of a subject in a non-contact manner in an environment in which the living body moves.
  • FIG. 1A is a diagram showing the relationship between the time change of the reflected light pulse and the fixed exposure period when the distance between the imaging device and the subject changes.
  • FIG. 1B is a diagram for explaining an example of resetting the initial value of cerebral blood flow and the exposure period when the distance between the imaging device and the subject falls below a predetermined threshold.
  • FIG. 2 is a block diagram that schematically illustrates the configuration of a photodetection system according to an exemplary embodiment of the present disclosure;
  • FIG. 3A is a diagram schematically showing an example of temporal changes in surface reflection components and internal scattering components included in a reflected light pulse when the light pulse has an impulse waveform.
  • FIG. 3B is a diagram schematically showing an example of temporal changes in the surface reflection component and the internal scattering component included in the reflected light pulse when the light pulse has a rectangular waveform.
  • FIG. 4 is a diagram showing the positional relationship between the photodetector and the living body in this embodiment.
  • FIG. 5 is a flow chart schematically showing an example of correction operation performed by the processing device in this embodiment when the measured distance between the photodetector and the living body changes.
  • FIG. 6A is a diagram for explaining an example of applying the correction operation of the processing device according to the present embodiment when the measurement distance between the photodetector and the living body becomes short.
  • FIG. 6B is a diagram for explaining an example of applying the correction operation of the processing device according to the present embodiment when the measurement distance between the photodetector and the living body decreases and then increases.
  • FIG. 7 is a diagram for explaining a method of generating distance image data in this embodiment.
  • FIG. 8 is a flowchart schematically showing another example of correction operation performed by the processing device according to the present embodiment when the measured distance between the photodetector and the living body changes.
  • FIG. 9 is a diagram showing an example of the configuration of a photodetector.
  • FIG. 10A is a diagram showing an example of the operation of emitting the first optical pulse and the second optical pulse.
  • FIG. 10B is a diagram showing another example of the operation of emitting the first optical pulse and the second optical pulse.
  • FIG. 11 is a flowchart outlining the operation of the processor with respect to the light source and photodetector.
  • FIG. 12 is a flowchart for explaining a modification.
  • FIG. 13A is a timing chart for explaining a modification.
  • FIG. 13B is a timing chart for explaining a modification.
  • FIG. 14 is a table for explaining the modification.
  • the reflected light pulse generated by irradiating the forehead of the subject with the light pulse contains much surface information of the subject during the rising period and contains much internal information of the subject during the falling period.
  • The imaging device disclosed in Patent Document 1 includes a light-receiving element, and the light-receiving element detects the fall period component of the reflected light pulse. From the received light intensity, the amount of change from the initial values of oxygenated hemoglobin (HbO2) and deoxygenated hemoglobin (Hb) in the blood in the brain can be calculated as the amount of change in cerebral blood flow.
  • The initial value is the concentration of oxygenated hemoglobin and deoxygenated hemoglobin in the blood in the brain at the start of measurement of the subject.
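  • The detailed calculation is described later in the source document; purely as an illustration, the following sketch shows a generic two-wavelength modified Beer-Lambert computation of the changes in Hb and HbO2 concentrations from detected intensities. The function name, the extinction-coefficient matrix, and the path length are assumptions made for this example, not values taken from the patent.

```python
import numpy as np

def delta_hb_concentrations(i1, i1_0, i2, i2_0,
                            eps=np.array([[0.69, 0.28],    # wavelength 1: [eps_Hb, eps_HbO2] (assumed)
                                          [0.19, 0.27]]),  # wavelength 2: [eps_Hb, eps_HbO2] (assumed)
                            path_length_cm=20.0):          # assumed effective optical path length
    """Return (delta Hb, delta HbO2) from detected intensities at two wavelengths.

    i1, i2     : current detected intensities at the first and second wavelength
    i1_0, i2_0 : detected intensities at the start of measurement (initial values)
    """
    # Change in attenuation at each wavelength (modified Beer-Lambert law)
    d_atten = np.array([-np.log(i1 / i1_0), -np.log(i2 / i2_0)])
    # Solve (eps * path_length) @ [dHb, dHbO2] = d_atten for the concentration changes
    d_hb, d_hbo2 = np.linalg.solve(eps * path_length_cm, d_atten)
    return d_hb, d_hbo2
```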
  • FIG. 1A is a diagram showing the relationship between the time change of the reflected light pulse and the fixed exposure period when the distance between the imaging device and the subject changes.
  • (a) of FIG. 1A shows the temporal change of the intensity of the light emission pulse.
  • (b) of FIG. 1A shows temporal changes in the intensity of the reflected light pulse in the light receiving element when the distance between the imaging device and the subject is appropriate for the exposure period.
  • (c) of FIG. 1A shows temporal changes in the intensity of the reflected light pulse when the distance between the imaging device and the subject is longer than the appropriate distance.
  • (d) of FIG. 1A shows temporal changes in the intensity of the reflected light pulse when the distance between the imaging device and the subject is shorter than the appropriate distance.
  • (e) of FIG. 1A shows an exposure period during which the light receiving element receives the reflected light pulse.
  • When the distance is appropriate, the light-receiving element receives part of the fall period component of the reflected light pulse. As a result, the amount of change in cerebral blood flow can be measured accurately.
  • When the distance between the imaging device and the subject increases, the reflected light pulse reaches the light-receiving element later. As a result, the light-receiving element receives not only part of the fall period component of the reflected light pulse but also components of other periods, and the measurement accuracy of the amount of change in cerebral blood flow decreases.
  • Patent Document 1 discloses a method of resetting the initial value of cerebral blood flow and the exposure period when the distance between the imaging device and the subject changes.
  • FIG. 1B is a diagram for explaining an example of resetting the initial value of cerebral blood flow and the exposure period when the distance between the imaging device and the subject falls below a predetermined threshold. The distance between the imaging device and the subject was measured by a TOF (Time Of Flight) method.
  • (a) of FIG. 1B shows the change over time of the distance between the imaging device and the subject. Dashed lines represent the thresholds.
  • (b) of FIG. 1B shows the relationship between the exposure period and time.
  • the vertical axis represents a plurality of preset exposure periods P1 to P4.
  • the exposure start times are different in the plurality of exposure periods P1 to P4, and the smaller the number, the earlier the exposure start time. In a plurality of exposure periods P1 to P4, the time difference between the exposure start time and the exposure end time is constant.
  • (c) of FIG. 1B shows temporal changes in the amount of change in cerebral blood flow when the subject is at rest.
  • the amount of change in cerebral blood flow is a value obtained by averaging the amount of change in oxygenated hemoglobin in the forehead of the optical model.
  • the optical model is an object imitating the light absorption characteristics of the human forehead, the light scattering characteristics of the human forehead, and the shape of the human forehead. A specific method for calculating the amount of change in cerebral blood flow will be described in detail later.
  • When the distance fell below the threshold, the exposure period was reset from P4 to P3 as shown in (b) of FIG. 1B, and the initial value of cerebral blood flow was reset to the cerebral blood flow value at that timing.
  • Ideally, the amount of change in cerebral blood flow would remain substantially zero even when the distance changes.
  • In practice, however, the amount of change in cerebral blood flow fluctuated greatly while the distance was changing, and even after the distance change ended, the value deviated greatly from zero. This is because the initial value of cerebral blood flow and the exposure period were reset while the distance was still changing.
  • In the present disclosure, therefore, the exposure period is changed after the change in the distance has subsided to some extent.
  • the biological information includes surface information and/or internal information of the subject.
  • A photodetection system according to the first item includes a light source that emits a light pulse toward a subject part of a living body, a photodetector that detects a reflected light pulse generated when the light pulse is reflected by the subject part, and a processing circuit that controls the light source and the photodetector and acquires distance data relating to the distance between the photodetector and the living body.
  • The processing circuit causes the light source to emit the light pulse, causes the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and causes the photodetector to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection started.
  • The processing circuit sets the time at which the distance deviates from the first range as a first time point, sets the time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point, and determines whether or not to change the first time based on the distance at the second time point.
  • The photodetection system according to the second item is the photodetection system according to the first item, wherein the processing circuit changes the first time in the following case: the distance at the second time point is outside the first range.
  • In this way, the timing for starting the detection of the reflected light pulse can be changed according to the distance between the photodetector and the living body.
  • The photodetection system according to the third item is the photodetection system according to the second item, wherein the processing circuit changes the first time to a second time corresponding to a second range in the following case: the distance at the second time point is within the second range, which is different from the first range.
  • The photodetection system according to the fourth item is the photodetection system according to the first item, wherein the processing circuit maintains the first time in the following case: the distance at the second time point is within the first range.
  • The photodetection system according to the fifth item is the photodetection system according to any one of the first to fourth items, wherein the processing circuit determines that the change over time of the distance falls within the predetermined range in the following case: a reference value determined based on at least one of the fluctuation width, the variance, and the standard deviation of the change over time of the distance is equal to or less than a threshold.
  • In this way, the distance between the photodetector and the living body can be accurately determined after body movement of the living body.
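  • As a minimal sketch of the criterion in the fifth item, the snippet below computes a reference value from the fluctuation width and the standard deviation of recent distance samples and compares it with a threshold; the specific combination of statistics and the 10 mm threshold are assumptions chosen within the ranges mentioned in the text.

```python
import statistics

def distance_has_settled(distances_mm, threshold_mm=10.0):
    """Return True when the change over time of the distance falls within the
    predetermined range, i.e. when a reference value derived from the
    fluctuation width (max - min) and the standard deviation of the recent
    distance samples is equal to or less than the threshold."""
    fluctuation_width = max(distances_mm) - min(distances_mm)
    std_dev = statistics.pstdev(distances_mm)
    reference_value = max(fluctuation_width, std_dev)  # one possible choice of reference value
    return reference_value <= threshold_mm
```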
  • The photodetection system according to the sixth item is the photodetection system according to any one of the first to fifth items, wherein the photodetector is an image sensor having a plurality of pixels arranged two-dimensionally.
  • The processing circuit generates the distance data based on the intensity of the reflected light pulse detected by each of the plurality of pixels.
  • The photodetection system according to the seventh item is the photodetection system according to the sixth item, wherein the processing circuit determines whether or not to change the first time based on, among the plurality of pixels, the number of pixels that do not correspond to the subject part, or on the number of pixels that do not correspond to the subject part and the number of pixels that correspond to a peripheral region of the subject part.
  • In this way, the timing for starting the detection of the reflected light pulse can be set appropriately based on the number of pixels that do not correspond to the subject part, or on that number together with the number of pixels that correspond to the peripheral region of the subject part.
  • The photodetection system according to the eighth item is the photodetection system according to any one of the first to seventh items, wherein the living body is a human and the subject part is the forehead of the human.
  • the processing circuit generates brain activity data relating to brain activity of the living body based on a signal corresponding to the intensity of the reflected light pulse detected by the photodetector.
  • This light detection system can generate human brain activity data.
  • A processing device according to the ninth item is a processing device used in a photodetection system that includes a light source that emits a light pulse toward a subject part of a living body and a photodetector that detects a reflected light pulse generated when the light pulse is reflected by the subject part.
  • the processing device comprises a processor and a memory storing a computer program executed by the processor.
  • The computer program causes the processor to: cause the light source to emit the light pulse; acquire distance data relating to the distance between the subject part and the photodetector; cause the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection started; set the time at which the distance deviates from a first range as a first time point; set the time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point; and determine whether or not to change the first time based on the distance at the second time point.
  • A method according to the tenth item is a method of controlling a photodetection system that includes a light source that emits a light pulse toward a subject part of a living body and a photodetector that detects a reflected light pulse generated when the light pulse is reflected by the subject part. The method includes: causing the light source to emit the light pulse; acquiring distance data relating to the distance between the subject part and the photodetector; causing the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection started; setting the time at which the distance deviates from a first range as a first time point; setting the time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point; and determining whether or not to change the first time based on the distance at the second time point.
  • A program according to the eleventh item is a program executed by a computer that controls a photodetection system including a light source that emits a light pulse toward a subject part of a living body and a photodetector that detects a reflected light pulse generated when the light pulse is reflected by the subject part. The program causes the computer to: cause the light source to emit the light pulse; acquire distance data relating to the distance between the subject part and the photodetector; cause the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection started; set the time at which the distance deviates from a first range as a first time point; set the time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point; and determine whether or not to change the first time based on the distance at the second time point.
  • All or part of a circuit, unit, device, member, or section, or all or part of a functional block in a block diagram, may be implemented by one or more electronic circuits including, for example, a semiconductor device, a semiconductor integrated circuit (IC), or an LSI (large-scale integration).
  • An LSI or IC may be integrated on one chip, or may be configured by combining a plurality of chips.
  • functional blocks other than memory elements may be integrated on one chip.
  • LSIs or ICs may be called system LSIs, VLSIs (very large scale integration), or ULSIs (ultra large scale integration) depending on the degree of integration.
  • A field programmable gate array (FPGA), which is programmed after the LSI is manufactured, or a reconfigurable logic device in which the connection relationships inside the LSI can be reconfigured or the circuit partitions inside the LSI can be set up, can also be used for the same purpose.
  • Furthermore, the functions or operations of all or part of the circuits, units, devices, members, or parts can be performed by software processing.
  • In this case, the software is recorded on one or more non-transitory storage media such as a ROM, an optical disc, or a hard disk drive, and when the software is executed by a processor, the functions specified in the software are performed by the processor and peripheral devices.
  • a system or apparatus may include one or more non-transitory storage media on which software is recorded, a processor, and required hardware devices such as interfaces.
  • In the present disclosure, "light" refers not only to visible light (wavelengths of about 400 nm to about 700 nm) but also to electromagnetic waves including ultraviolet rays (wavelengths of about 10 nm to about 400 nm) and infrared rays (wavelengths of about 700 nm to about 1 mm).
  • FIG. 2 is a block diagram that schematically illustrates the configuration of a photodetection system according to an exemplary embodiment of the present disclosure
  • FIG. 2 shows a human head assuming that the living body 10 is a human.
  • The living body 10 is not always stationary; for example, when working or driving a vehicle, it may move back and forth.
  • the living body 10 may be an animal instead of a human.
  • the subject 12 of the living body 10 is the forehead, but it may be a portion other than the forehead.
  • the photodetection system 100 shown in FIG. 2 includes a first light source 20a, a second light source 20b, a photodetection device 30, and a processing device 40.
  • The processing device 40 includes a control circuit 42, a signal processing device 44, and a memory 46.
  • the first light source 20a and the second light source 20b are also referred to as "light source 20" without distinction.
  • the light source 20 emits light pulses for irradiating the subject 12 of the living body 10 .
  • the photodetector 30 acquires biometric information of the subject 12 by detecting a reflected light pulse generated by the light pulse being reflected by the subject 12 during a predetermined exposure period.
  • a reflected light pulse generated by being reflected by the test part 12 may include a component reflected by the surface of the test part 12 and a component diffusely reflected inside the test part 12 .
  • the processing device 40 suppresses fluctuations in biological information of the subject 12 by appropriately resetting the initial value of the cerebral blood flow and the exposure period. As a result, it is possible to stably acquire the biological information of the subject 12 in a non-contact manner.
  • The biological information may be, for example, blood flow information on the face or scalp of the living body 10, or cerebral blood flow information. Alternatively, the biological information may include both types of blood flow information.
  • the first light source 20a emits a first light pulse Ip1 for irradiating the part 12 to be inspected, as shown in FIG.
  • the first light pulse Ip1 has a first wavelength.
  • the second light source 20b emits a second light pulse Ip2 for illuminating the subject 12, as shown in FIG.
  • the second light pulse Ip2 has a second wavelength that is longer than the first wavelength.
  • the number of first light sources 20a is one, but it may be plural. The same applies to the number of second light sources 20b. Depending on the application, it is not necessary to use both the first light source 20a and the second light source 20b, and either one may be used.
  • the first optical pulse I p1 and the second optical pulse I p2 are also referred to herein as “optical pulses I p ” without distinction.
  • the light pulse Ip includes a rising portion and a falling portion.
  • the rising portion is the portion of the optical pulse Ip from when the intensity starts to increase until when the increase ends.
  • the trailing portion is the portion of the optical pulse Ip from when the intensity starts to decrease until the decrease ends.
  • A portion of the light pulse Ip that reaches the subject part 12 becomes a surface reflection component I1 reflected at the surface of the subject part 12, and the remaining portion becomes an internal scattering component I2 that is reflected or scattered once, or multiply scattered, inside the subject part 12.
  • The surface reflection component I1 includes three components: a direct reflection component, a diffuse reflection component, and a scattered reflection component.
  • a direct reflection component is a reflection component for which the angle of incidence is equal to the angle of reflection.
  • the diffuse reflection component is a component that diffuses and reflects due to the uneven shape of the surface.
  • the scattered reflection component is the component that is scattered and reflected by the internal tissue near the surface.
  • the scattered reflection component is a component that is scattered and reflected inside the epidermis.
  • the surface reflection component I1 reflected on the surface of the subject 12 includes these three components.
  • the internal scattering component I2 will be described as not including the component scattered and reflected by the internal tissue near the surface.
  • When the surface reflection component I1 and the internal scattering component I2 are reflected or scattered, their directions of travel change, and part of each component reaches the photodetector 30 as a reflected light pulse.
  • the surface reflection component I1 includes surface information of the living body 10, such as blood flow information of the face and scalp.
  • the internal scattering component I2 contains internal information of the living body 10, such as cerebral blood flow information.
  • For example, the cerebral blood flow, blood pressure, blood oxygen saturation, or heart rate of the living body 10 can be obtained from the cerebral blood flow information.
  • a method for detecting the internal scattering component I2 from the reflected light pulse will be described later in detail.
  • the first wavelength of the first optical pulse I p1 and the second wavelength of the second optical pulse I p2 may be arbitrary wavelengths included in the wavelength range of 650 nm to 950 nm, for example. This wavelength range is included in the red to near-infrared wavelength range.
  • the above wavelength range is called the "window of the body" and has the property of being relatively difficult to be absorbed by moisture and skin in the body.
  • detection sensitivity can be increased by using light in the above wavelength range.
  • the light used is believed to be absorbed primarily by oxygenated hemoglobin and deoxygenated hemoglobin.
  • Changes in blood flow result in changes in the concentrations of oxygenated hemoglobin and deoxygenated hemoglobin.
  • Accordingly, the degree of light absorption also changes. Therefore, when the blood flow changes, the amount of detected light also changes with time.
  • Oxygenated hemoglobin and deoxygenated hemoglobin differ in the wavelength dependence of light absorption. When the wavelength is 650 nm or more and shorter than 805 nm, the light absorption coefficient of deoxygenated hemoglobin is greater than that of oxygenated hemoglobin. At a wavelength of 805 nm, the light absorption coefficient of deoxygenated hemoglobin and the light absorption coefficient of oxygenated hemoglobin are equal. When the wavelength is longer than 805 nm and 950 nm or less, the light absorption coefficient of oxygenated hemoglobin is greater than that of deoxygenated hemoglobin.
  • For example, the first wavelength of the first light pulse Ip1 may be set to 650 nm or more and shorter than 805 nm, and the second wavelength of the second light pulse Ip2 may be set to longer than 805 nm and 950 nm or less.
  • the light source 20 can be designed in consideration of the influence on the user's retina.
  • the light source 20 is a laser light source such as a laser diode, and can satisfy class 1 of the laser safety standards established by various countries. If Class 1 is satisfied, the test area 12 is illuminated with light of such low intensity that the accessible emission limit (AEL) is less than 1 mW. Note that the light source 20 itself does not need to satisfy Class 1.
  • a diffuser plate or neutral density filter may be placed in front of the light source 20 to diffuse or attenuate the light so that class 1 laser safety standards are met.
  • the photodetector 30 detects at least a part of the rise period component of the reflected light pulse generated by the light pulse Ip being reflected by the part 12 to be inspected, and outputs a signal corresponding to the intensity thereof.
  • the signal includes surface information of the test part 12 .
  • the photodetector 30 detects at least a part of the falling period component of the reflected light pulse generated by the light pulse Ip being reflected by the part 12 to be inspected, and outputs a signal corresponding to the intensity thereof.
  • the signal includes internal information of the test part 12 .
  • the “rising period” of the light pulse refers to the period from when the intensity of the light pulse starts increasing to when it ends increasing on the photodetection surface of the photodetector device 30 .
  • the “falling period” of the light pulse refers to the period from when the intensity of the light pulse starts decreasing to when it ends decreasing on the photodetection surface of the photodetector 30 . More precisely, the “rising period” means the period from when the intensity of the light pulse exceeds a preset lower limit to when it reaches a preset upper limit.
  • the “falling period” means a period from when the intensity of the light pulse falls below a preset upper limit to when it reaches a preset lower limit.
  • the upper limit value can be set to a value that is, for example, 90% of the peak value of the intensity of the light pulse
  • the lower limit value can be set to a value that is, for example, 10% of the peak value.
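  • As an illustration of these definitions, the sketch below locates the rising and falling periods of a sampled reflected-pulse waveform using the 10% and 90% limits mentioned above; representing the waveform as NumPy arrays is an assumption made only for this example.

```python
import numpy as np

def rising_and_falling_periods(t, intensity, lower=0.10, upper=0.90):
    """Return ((rise_start, rise_end), (fall_start, fall_end)) in units of t.

    The rising period runs from the first sample exceeding lower * peak to the
    sample reaching upper * peak; the falling period runs from the first sample
    below upper * peak after the peak to the sample reaching lower * peak.
    """
    peak = float(intensity.max())
    lo, hi = lower * peak, upper * peak
    i_peak = int(intensity.argmax())

    before = intensity[:i_peak + 1]
    rise_start = int(np.argmax(before > lo))
    rise_end = int(np.argmax(before >= hi))

    after = intensity[i_peak:]
    fall_start = i_peak + int(np.argmax(after < hi))
    fall_end = i_peak + int(np.argmax(after <= lo))

    return (t[rise_start], t[rise_end]), (t[fall_start], t[fall_end])
```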
  • the photodetector device 30 may be equipped with an electronic shutter.
  • the electronic shutter is a circuit that controls imaging timing.
  • the electronic shutter controls one signal accumulation period during which the received light is converted into an effective electrical signal and accumulated, and a period during which the signal accumulation is stopped.
  • the signal accumulation period is also called an "exposure period”.
  • the width of the exposure period is also called “shutter width”.
  • the time from the end of one exposure period to the start of the next exposure period is also called a "non-exposure period”.
  • the photodetection device 30 can adjust the exposure period and the non-exposure period within a sub-nanosecond range, for example, 30 ps to 1 ns, using an electronic shutter.
  • a conventional TOF camera whose purpose is to measure distance detects all of the light emitted from the light source 20 and returned after being reflected by the subject.
  • Conventional TOF cameras require the shutter width to be greater than the light pulse width.
  • In the photodetection system 100 of this embodiment, on the other hand, the shutter width need not be greater than the pulse width.
  • the shutter width can be set to a value of 1 ns or more and 30 ns or less, for example. According to the photodetection system 100 of this embodiment, since the shutter width can be reduced, the influence of dark current contained in the detection signal can be reduced.
  • the photodetection device 30 has one or more photodetection cells.
  • the photodetection device 30 may be an image sensor having a plurality of photodetection cells two-dimensionally arranged along a photodetection surface.
  • the image sensor can be any image sensor such as a CCD image sensor or a CMOS image sensor.
  • Each photodetector cell may comprise a photoelectric conversion element 32 , such as a photodiode, and one or more charge storages 34 .
  • the photodetector cells are also referred to as "pixels", and the intensity of light detected by the photodetector cells is also referred to as a "luminance value".
  • the above-described signal detected and output by the photodetector 30 is an image signal indicating luminance values of a plurality of pixels distributed two-dimensionally.
  • the image signal may contain imaged information, or may contain numerical information of luminance values for a plurality of pixels. The details of the configuration of the photodetector 30 will be described later.
  • Control circuitry 42 included in processor 40 controls the operation of light source 20 , photodetector 30 , and signal processor 44 .
  • the control circuit 42 adjusts the time difference between the emission timing of the light pulse Ip from the light source 20 and the shutter timing of the photodetector 30 .
  • the time difference is also called "phase difference”.
  • the “emission timing” of the light source 20 is the timing at which the light pulse emitted from the light source 20 starts rising.
  • “Shutter timing” is the timing to start exposure.
  • the control circuit 42 may adjust the phase difference by changing the emission timing, or may adjust the phase difference by changing the shutter timing.
  • the control circuit 42 may be configured to remove the offset component from the signal detected by each pixel of the photodetector 30 .
  • the offset component is a signal component due to environmental light such as sunlight or illumination light, or disturbance light.
  • a signal processing device 44 included in the processing device 40 generates and outputs data indicating biological information of the subject 12 of the living body 10 based on the signal output from the photodetector 30 .
  • the data includes surface information and/or internal information of the test part 12 .
  • the signal processing device 44 can also estimate the psychological state and/or physical state of the living body 10 based on the surface information and/or internal information of the subject 12 .
  • the signal processor 44 may generate and output data indicating the psychological and/or physical state of the living body 10 .
  • a psychological state can be, for example, a mood, an emotion, a state of health, or a temperature sensation.
  • Moods can include, for example, moods such as pleasant or unpleasant.
  • Emotions may include, for example, feelings of relief, anxiety, sadness, or resentment.
  • a state of health may include, for example, a state of well-being or fatigue.
  • Temperature sensations may include, for example, sensations of hot, cold, or muggy.
  • Indices representing the degree of brain activity, such as interest, proficiency, and concentration, can also be included in the psychological state.
  • the physical condition can be, for example, the degree of fatigue, drowsiness, or drunkenness.
  • The control circuit 42 may be, for example, a combination of a processor and a memory, or an integrated circuit such as a microcontroller containing a processor and a memory.
  • the control circuit 42 executes a computer program recorded in the memory 46 by the processor, for example, to adjust the emission timing and the shutter timing, and cause the signal processing device 44 to perform signal processing.
  • The signal processing device 44 can be realized by, for example, a digital signal processor (DSP), a programmable logic device (PLD) such as a field programmable gate array (FPGA), or a combination of a central processing unit (CPU) or an arithmetic processor for image processing (GPU) and a computer program. The signal processing device 44 executes signal processing when the processor executes a computer program recorded in the memory 46.
  • the signal processor 44 and the control circuit 42 may be one integrated circuit or separate individual circuits.
  • the signal processor 44 and control circuitry 42 are also collectively referred to herein as "processing circuitry.”
  • At least one of signal processor 44, control circuitry 42, and memory 46 may be components of an external device, such as a remotely located server. In this case, an external device such as a server exchanges data with the rest of the components via wireless or wired communication.
  • Hereinafter, the operation of the control circuit 42 and the operation of the signal processing device 44 are collectively described as the operation of the processing device 40.
  • the photodetection system 100 may include an imaging optical system that forms a two-dimensional image of the subject 12 on the photodetection surface of the photodetection device 30 .
  • the optical axis of the imaging optical system is substantially orthogonal to the photodetection surface of the photodetector 30 .
  • the imaging optics may include a zoom lens. When the position of the zoom lens changes, the magnification of the two-dimensional image of the living body 10 and its subject 12 changes, and the resolution of the two-dimensional image on the photodetector 30 changes. Therefore, even if the living body 10 is far away, it is possible to magnify the desired subject 12 and observe it in detail.
  • The photodetection system 100 may include, between the subject part 12 and the photodetector 30, a band-pass filter that passes light in the wavelength band emitted from the light source 20 or light in the vicinity of that band.
  • a band-pass filter can be constituted by a multilayer filter or an absorption filter, for example.
  • the bandwidth of the band-pass filter may have a width of about 20 nm or more and 100 nm or less.
  • the photodetection system 100 may include polarizing plates between the test section 12 and the light source 20 and between the test section 12 and the photodetector 30, respectively.
  • the polarization directions of the polarizing plate arranged on the light source 20 side and the polarizing plate arranged on the photodetector 30 side may have a crossed Nicols relationship.
  • the internal information is the internal scattering component I2 .
  • FIG. 3A is a diagram schematically showing an example of temporal changes of the surface reflection component I1 and the internal scattering component I2 included in the reflected light pulse when the light pulse Ip has an impulse waveform.
  • FIG. 3B is a diagram schematically showing an example of temporal changes in the surface reflection component I1 and the internal scattering component I2 included in the reflected light pulse when the light pulse Ip has a rectangular waveform.
  • In each of FIGS. 3A and 3B, the diagram on the left shows an example of the waveform of the light pulse Ip emitted from the light source 20, and the diagram on the right shows an example of the waveforms of the surface reflection component I1 and the internal scattering component I2 included in the reflected light pulse.
  • As shown in the right-hand diagram of FIG. 3A, when the light pulse Ip has an impulse waveform, the surface reflection component I1 has a waveform similar to that of the light pulse Ip, whereas the internal scattering component I2 has an impulse response waveform that lags behind the surface reflection component I1. This is because the internal scattering component I2 corresponds to a combination of light rays that have traveled along various paths inside the subject part 12.
  • As shown in the right-hand diagram of FIG. 3B, when the light pulse Ip has a rectangular waveform, the surface reflection component I1 likewise has a waveform similar to that of the light pulse Ip, whereas the internal scattering component I2 has a waveform in which a plurality of impulse response waveforms are superimposed.
  • The present inventors confirmed that, by superimposing a plurality of impulse response waveforms, the amount of light of the internal scattering component I2 detected by the photodetector 30 can be amplified compared to the case where the light pulse Ip has an impulse waveform.
  • As a result, the internal scattering component I2 can be detected effectively.
  • FIG. 3B also shows an example of the exposure period during which the electronic shutter of the photodetection device 30 is open. If the pulse width of the rectangular pulse is on the order of 1 ns to 10 ns, the light source 20 can be driven with a low voltage. Therefore, the size and cost of the photodetection system 100 in this embodiment can be reduced.
  • streak cameras have been used to distinguish and detect information such as light absorption coefficients or light scattering coefficients at different locations in the depth direction inside a living body.
  • JP-A-4-189349 discloses an example of such a streak camera.
  • These streak cameras use ultrashort light pulses with femtosecond or picosecond pulse widths to measure at the desired spatial resolution.
  • In the photodetection system 100 of this embodiment, by contrast, the surface reflection component I1 and the internal scattering component I2 can be detected separately. Therefore, the light pulse emitted from the light source 20 does not have to be an ultrashort light pulse, and the pulse width can be selected arbitrarily.
  • The amount of light of the internal scattering component I2 can be extremely small, approximately one ten-thousandth to several ten-thousandths of the amount of light of the surface reflection component I1. Furthermore, considering the laser safety standards, the amount of light that can be emitted is extremely small. Therefore, detection of the internal scattering component I2 becomes very difficult. Even in such a case, if the light source 20 emits a light pulse Ip with a relatively large pulse width, the integrated amount of the internal scattering component I2 detected with a time delay can be increased. As a result, the amount of detected light can be increased and the SN ratio can be improved.
  • the light source 20 can emit a light pulse Ip with a pulse width of 3 ns or more, for example.
  • the light source 20 may emit a light pulse Ip with a pulse width of 5 ns or more, or 10 ns or more.
  • the light source 20 can emit an optical pulse Ip with a pulse width of 50 ns or less, for example.
  • the light source 20 may emit an optical pulse Ip with a pulse width of 30 ns or less, or even 20 ns or less. If the pulse width of the rectangular pulse is several ns to several tens of ns, the light source 20 can be driven at a low voltage. Therefore, it is possible to reduce the cost of the photodetection system 100 in this embodiment.
  • the irradiation pattern of the light source 20 may be, for example, a pattern having a uniform intensity distribution within the irradiation area.
  • the photodetection system 100 of this embodiment differs from the conventional device disclosed in, for example, Japanese Patent Application Laid-Open No. 11-164826.
  • In the conventional device, the detector and the light source must be separated by about 3 cm so that the surface reflection component is spatially separated from the internal scattering component, which constrains the irradiation pattern.
  • In the photodetection system 100 of this embodiment, the surface reflection component I1 can be temporally separated from the internal scattering component I2 and reduced. Therefore, the light source 20 having an irradiation pattern with a uniform intensity distribution can be used.
  • An irradiation pattern having a uniform intensity distribution may be formed by diffusing the light emitted from the light source 20 with a diffusion plate.
  • In this case, the internal scattering component I2 can be detected even directly below the irradiation point on the subject part 12.
  • As a result, the measurement resolution can also be increased.
  • The configuration of the photodetector 30, the operation of emitting the first light pulse Ip1 and the second light pulse Ip2, and the calculation of the amounts of change from the initial values of the HbO2 and Hb concentrations in the blood will be described later in detail.
  • FIG. 4 is a diagram showing the positional relationship between the photodetector 30 and the living body 10 in this embodiment.
  • the measured distance Z 0 between the photodetector 30 and the living body 10 is the average value of the distances between the photodetection surface of the photodetector 30 and the surface of the test site 12 of the living body 10 .
  • four measurement ranges R1-R4 are defined.
  • the measurement range R1 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D11 and less than or equal to D12 .
  • the measurement range R2 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D21 and less than or equal to D22 .
  • the measurement range R3 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D31 and less than or equal to D32 .
  • the measurement range R4 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D41 and less than or equal to D42 .
  • the measurement range R1 and the measurement range R2 are set so as to partially overlap each other.
  • the measurement range R2 and the measurement range R3 are set so as to partially overlap each other.
  • The measurement range R3 and the measurement range R4 are set so as to partially overlap each other. That is, the relationship D11 ≤ D21 ≤ D12 ≤ D31 ≤ D22 ≤ D41 ≤ D32 ≤ D42 is satisfied.
  • the distance D11 is a value greater than or equal to zero, and can be, for example, greater than or equal to 5 cm and less than or equal to 100 cm.
  • the distance width of each of the measurement ranges R1 to R4 can be, for example, 3 cm or more and 24 cm or less. With such a distance width, the measurement ranges R1 to R4 can be accurately distinguished.
  • the width of the range in which two adjacent measurement ranges partially overlap can be, for example, 0 cm or more and 8 cm or less.
  • the number of measurement ranges does not have to be four, it can be any number.
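  • The sketch below sets up overlapping measurement ranges like R1 to R4; all of the numbers are assumptions chosen within the example limits just given (starting distance, range width, and overlap width), not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class MeasurementRange:
    lower_cm: float   # e.g. D11 for R1
    upper_cm: float   # e.g. D12 for R1

    def contains(self, z0_cm: float) -> bool:
        return self.lower_cm <= z0_cm <= self.upper_cm

    @property
    def center(self) -> float:
        return 0.5 * (self.lower_cm + self.upper_cm)

# Illustrative values only: four ranges, each 12 cm wide, adjacent ranges overlapping by 4 cm.
R1, R2, R3, R4 = (MeasurementRange(30, 42), MeasurementRange(38, 50),
                  MeasurementRange(46, 58), MeasurementRange(54, 66))

# The ordering D11 <= D21 <= D12 <= D31 <= D22 <= D41 <= D32 <= D42 holds for these values:
bounds = [R1.lower_cm, R2.lower_cm, R1.upper_cm, R3.lower_cm,
          R2.upper_cm, R4.lower_cm, R3.upper_cm, R4.upper_cm]
assert all(a <= b for a, b in zip(bounds, bounds[1:]))
```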
  • Exposure periods P1 to P4 are set for the measurement ranges R1 to R4, respectively.
  • the exposure period can be determined, for example, using an optical model that simulates changes in human cerebral blood flow.
  • The state of the human brain is divided into an inactive state, which simulates the absorption and scattering characteristics of the skin or brain at rest, and an active state, which simulates the absorption and scattering characteristics when brain activity is activated and cerebral blood flow increases.
  • The exposure period P1 is the exposure period that maximizes the amount of change in luminance value between the inactive state and the active state when the optical model is placed at the center of the measurement range R1, that is, at a distance of (D11 + D12)/2.
  • the exposure start times are different in the plurality of exposure periods P1 to P4, and the smaller the number, the earlier the exposure start time. In a plurality of exposure periods P1 to P4, the time difference between the exposure start time and the exposure end time is constant.
  • the processing device 40 may further include a storage device (not shown), and the storage device may store data indicating the exposure periods P1 to P4 associated with the measurement ranges R1 to R4, respectively.
  • the processing device 40 acquires the data from the storage device as needed.
  • When the measured distance Z0 is within one of the measurement ranges, the processing device 40 selects the exposure period corresponding to that measurement range from the exposure periods P1 to P4 and acquires the biological information of the subject part 12.
  • When the measured distance Z0 moves into another measurement range, the processing device 40 selects the exposure period corresponding to that other measurement range from the exposure periods P1 to P4 and acquires the biological information of the subject part 12.
  • FIG. 5 is a flow chart schematically showing an example of correction operation performed by the processing device 40 in this embodiment when the measured distance Z0 between the photodetector 30 and the living body 10 changes.
  • the processing device 40 executes the operations of steps S101 to S108 shown in FIG.
  • the photodetector 30 is an image sensor having a plurality of pixels distributed two-dimensionally.
  • the biological information acquired by the processing device 40 is cerebral blood flow information.
  • <Step S101> The processing device 40 acquires luminance image data as follows.
  • the processing device 40 causes the light source 20 to emit a light pulse Ip for irradiating the subject 12 of the living body 10 .
  • the processing device 40 causes the photodetector 30 to detect the light pulse Ip or the reflected light pulse generated by being reflected by the part 12 to be inspected during the currently set exposure period, and to generate and output luminance image data.
  • the processing device 40 causes the photodetector 30 to start detecting the reflected light pulse when the first time elapses after the light pulse Ip is emitted.
  • the detection of the reflected light pulse is terminated when a predetermined exposure time has elapsed from the start of the operation.
  • the predetermined exposure time is the duration of the currently set exposure period.
  • The processing device 40 causes the photodetector 30 to send the luminance image data to the processing device 40.
  • Starting the detection of the reflected light pulse means causing the charge accumulation unit to start accumulating charge corresponding to a component of the reflected light pulse.
  • Ending the detection of the reflected light pulse means causing the charge accumulation unit to stop accumulating the charge corresponding to the component of the reflected light pulse.
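  • The timing relationship in step S101 can be summarized by the small sketch below: detection starts a first time after emission and ends after the exposure time has elapsed. The concrete numbers in the usage example are assumptions within the shutter-width range mentioned earlier.

```python
def exposure_window(emission_time_ns: float, first_time_ns: float, exposure_time_ns: float):
    """Return (start, end) of the exposure window for one light pulse.

    Detection of the reflected light pulse starts when the first time has
    elapsed after emission and ends when the predetermined exposure time (the
    duration of the currently set exposure period) has elapsed after the start.
    """
    start = emission_time_ns + first_time_ns
    return start, start + exposure_time_ns

# Example with assumed numbers: emission at t = 0 ns, first time 12 ns, shutter width 10 ns.
print(exposure_window(0.0, 12.0, 10.0))   # -> (12.0, 22.0)
```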
  • <Step S102> The processing device 40 acquires distance data regarding the distance between the photodetector 30 and the living body 10 based on the luminance image data, as follows. Calibration is performed before measuring the distance between the photodetector 30 and the living body 10. In the calibration, reflected light pulses generated by irradiating an object positioned in the measurement ranges R1 to R4 with light pulses are detected in the exposure periods P1 to P4, respectively.
  • the object is, for example, a flat plate, and the absorption and scattering coefficients of the flat plate may be close to the absorption and scattering coefficients of humans, respectively.
  • By changing the distance Z, data indicating the relationship between the distance Z and the luminance value I at each position (X, Y) in the luminance image are obtained, and from these data the function represented by Equation (1) is obtained. Interpolation and extrapolation methods may be applied to interpolate between data points, and various regression methods may also be applied. This data acquisition may be performed each time the cerebral blood flow of the living body 10 is measured, or may be performed once at the beginning.
  • the function represented by Equation (1) may be stored in the above-described storage device in table format.
  • the processing device 40 can calculate the distance Z at each pixel by inputting the brightness image data acquired in step S101 into the function represented by Equation (1).
  • the processing device 40 detects, for example, the position of the face of the living body 10 in the image, and extracts the forehead from the relative positional relationship with the position of the face.
  • the processing device 40 calculates the average value of the distance Z at the forehead as the measured distance Z0 .
  • the processing unit 40 generates data regarding the measured distance Z0 based on the intensity of the reflected light pulse detected by each of the plurality of pixels.
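  • Equation (1) itself is not reproduced in this excerpt; as a rough sketch under that limitation, the code below builds a luminance-to-distance function from calibration data by interpolation (a single shared curve is assumed instead of a per-pixel function) and then computes the measured distance Z0 as the average distance over the forehead pixels.

```python
import numpy as np
from scipy.interpolate import interp1d

def build_distance_function(calib_luminance, calib_distance):
    """Approximate the function of Equation (1) from calibration measurements
    of the luminance value I at known distances Z; interpolation with
    extrapolation is used here, but a regression model could be used instead."""
    return interp1d(calib_luminance, calib_distance, fill_value="extrapolate")

def measured_distance_z0(luminance_image, forehead_mask, z_of_luminance):
    """Compute Z0 as the average of the distance Z over the forehead pixels."""
    z_image = z_of_luminance(luminance_image)      # distance Z at every pixel
    return float(np.mean(z_image[forehead_mask]))  # average over the forehead region
```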
  • <Step S103> The processing device 40 determines whether or not the measured distance Z0 is within the measurement range corresponding to the currently set exposure period. In this specification, this measurement range is called the "first range". If the determination is Yes, the processing device 40 performs the operation of step S104. If the determination is No, the processing device 40 performs the operation of step S105.
  • <Step S104> If the determination in step S103 is Yes, the processing device 40 generates and outputs cerebral blood flow data indicating the amount of change in cerebral blood flow based on the luminance image data acquired in step S101.
  • the cerebral blood flow data can also be said to be brain activity data relating to brain activity.
  • The time at which the measured distance Z0 deviates from the first range is defined as the first time point.
  • <Step S105> The processing device 40 determines whether or not the amount of variation in the measured distance Z0 is equal to or less than a predetermined threshold during a predetermined determination period after the first time point. In other words, the processing device 40 determines whether or not the change over time of the measured distance Z0 falls within a predetermined range during the determination period.
  • the amount of variation in distance may be, for example, a reference value determined based on at least one of the variation width, variance, and standard deviation of the variation over time in distance within the determination period.
  • The duration of the non-determination period, from the first time point until the determination starts, can be, for example, 1 second or more and 10 seconds or less.
  • the time width of the determination period can be, for example, 1 second or more and 10 seconds or less.
  • The predetermined threshold may be, for example, 1 mm or more and 30 mm or less. If the determination is No, the processing device 40 performs the operation of step S106; the case in which the system is still within the non-determination period is also treated as No. If the determination is Yes, the processing device 40 performs the operation of step S107.
  • The time at which the amount of variation in the measured distance Z0 becomes equal to or less than the predetermined threshold is defined as the second time point.
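  • A minimal sketch of the flow around steps S105 to S107 is shown below: after a non-determination period following the first time point, the fluctuation of the measured distance within a sliding determination window is compared with the threshold, and the second time point is the first time at which it falls to or below the threshold. The window lengths and threshold are assumptions within the 1 s to 10 s and 1 mm to 30 mm examples given above, and only the fluctuation width is used as the reference value.

```python
def find_second_time_point(times_s, distances_mm, first_time_point_s,
                           non_determination_s=2.0, determination_s=3.0,
                           threshold_mm=10.0):
    """Return the second time point, or None if the distance has not settled yet."""
    earliest_end = first_time_point_s + non_determination_s + determination_s
    for t in times_s:
        if t < earliest_end:
            continue
        # Distance samples inside the determination window ending at time t
        window = [d for ti, d in zip(times_s, distances_mm)
                  if t - determination_s <= ti <= t]
        if window and max(window) - min(window) <= threshold_mm:
            return t            # second time point: variation <= threshold
    return None
```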
  • <Step S107> The processing device 40 initializes the cerebral blood flow value and resets the initial value of the cerebral blood flow to the cerebral blood flow value at the second time point.
  • <Step S108> If the measured distance Z0 at the second time point is outside the first range, the measured distance Z0 at the second time point is in a measurement range different from the first range.
  • a measurement range different from the first range is referred to as a "second range”.
  • In this case, the processing device 40 resets the exposure period from the exposure period corresponding to the first range to the exposure period corresponding to the second range. In other words, the processing device 40 changes the first time, which elapses from the emission of the light pulse Ip to the start of detection of the reflected light pulse in step S101, to a second time different from the first time.
  • The processing device 40 selects the measurement range on the condition that the measurement range whose center is closer to the measured distance Z0 is adopted. On the other hand, when the measured distance Z0 at the second time point is again within the first range, the processing device 40 maintains the exposure period corresponding to the first range. As described above, the processing device 40 determines whether or not to change the first time based on the measured distance Z0 at the second time point.
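  • The decision in step S108 can be sketched as follows, reusing the MeasurementRange class from the earlier sketch; the tie-breaking by range center follows the rule stated above, and the behaviour when Z0 lies outside every range is an assumption made for the example.

```python
def decide_exposure_period(z0_at_second_time_point, ranges, exposure_periods, current_index):
    """Return (range_index, exposure_period) to use after the second time point.

    ranges           : list of MeasurementRange objects (see the earlier sketch)
    exposure_periods : exposure settings P1..P4 associated with each range
    """
    # If Z0 at the second time point is back in the first range, keep the current setting.
    if ranges[current_index].contains(z0_at_second_time_point):
        return current_index, exposure_periods[current_index]

    # Otherwise adopt the range containing Z0 whose center is closest to Z0.
    candidates = [i for i, r in enumerate(ranges) if r.contains(z0_at_second_time_point)]
    if not candidates:                         # outside every range: keep as-is (assumption)
        return current_index, exposure_periods[current_index]
    best = min(candidates, key=lambda i: abs(ranges[i].center - z0_at_second_time_point))
    return best, exposure_periods[best]
```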
  • the processing device 40 repeatedly executes the operations of steps S101 to S108 for each frame until the measurement of cerebral blood flow is completed.
  • FIG. 6A is a diagram for explaining an example in which the correcting operation of the processing device in this embodiment is applied when the measurement distance between the photodetector 30 and the living body 10 becomes short.
  • (a) of FIG. 6A shows the time change of the measured distance Z0 .
  • (b) of FIG. 6A shows the relationship between the exposure period and time.
  • (c) of FIG. 6A shows temporal changes in the amount of change in cerebral blood flow when the living body 10 is at rest.
  • the amount of change in cerebral blood flow is a value obtained by averaging the amount of change in oxygenated hemoglobin in the subject 12 .
  • the sum of the amount of change in oxygenated hemoglobin and the amount of change in deoxygenated hemoglobin in the test area 12 may be used.
  • the initial measurement distance Z0 is in the measurement range R2. If the measured distance Z0 is shorter than D21 , the determination in step S103 is No. The time when the measured distance Z0 becomes shorter than D21 is the first time. While the measured distance Z0 is changing, the output of the cerebral blood flow data at the first time in step S106 continues, as shown in FIG. 6A(c). If the amount of variation in the measured distance Z0 is less than or equal to the threshold during the hatched determination period, the determination in step S105 is Yes. The second time is the time when the amount of variation in the measured distance Z0 becomes equal to or less than the threshold. In step S107, the cerebral blood flow value at the second time becomes the initial value of cerebral blood flow. In step S108, as shown in (b) of FIG. 6A, the exposure period P2 is reset to the exposure period P1. After the second time, the output of the cerebral blood flow data in step S104 is continued until the determination in step S103 becomes No.
  • FIG. 6B is a diagram for explaining an example in which the correcting operation of the processing device in this embodiment is applied when the measurement distance between the photodetector 30 and the living body 10 decreases and then increases.
  • (a) to (c) of FIG. 6B correspond to (a) to (c) of FIG. 6A, respectively.
  • The time at which the measured distance Z0 deviates from the measurement range R2 is the first time. Thereafter, the measured distance Z0 becomes longer than D12, and the measured distance Z0 at the second time is within the measurement range R2.
  • step S107 the cerebral blood flow value at the second time becomes the initial value of cerebral blood flow.
  • step S108 as shown in (b) of FIG. 6B, the exposure period is maintained as the exposure period P2.
  • the output of the cerebral blood flow data in step S104 is continued until the determination in step S103 becomes No.
  • the correction operation performed by the processing device 40 in this embodiment can suppress the influence of the body movement of the living body 10 on the cerebral blood flow data, making it possible to acquire the cerebral blood flow data of the living body 10 stably and without contact.
  • In the example described above, the measured distance Z0 is calculated based on the luminance image data. If the measured distance Z0 is calculated based on distance image data instead of the luminance image data, the measured distance Z0 can be measured with higher accuracy. As a result, the initial value of the cerebral blood flow and the exposure period can be reset with more accurate timing.
  • In this embodiment, in addition to the exposure periods P1 to P4, exposure periods P1A to P4A, each including the rising period of the reflected light pulse, and exposure periods P1B to P4B, each including the falling period of the reflected light pulse, are set. That is, three types of exposure periods are set for each of the measurement ranges R1 to R4. Detection of the reflected light pulse in the three types of exposure periods may be performed for each different pixel, may be performed for each different charge accumulation portion of the same pixel, may be performed by switching the exposure period over time, or may be performed by a combination of these.
  • FIG. 7 is a diagram for explaining a method of generating distance image data in this embodiment.
  • FIG. 7(a) shows the temporal change of the light emission pulse.
  • (b) of FIG. 7 shows the temporal change of the reflected light pulse.
  • (c) of FIG. 7 shows the relationship between the exposure periods P1 to P4 and time.
  • (d) of FIG. 7 shows the relationship between the exposure periods P1A to P4A and time.
  • (e) of FIG. 7 shows the relationship between the exposure periods P1B to P4B and time.
  • the signal indicating the amount of charge accumulated in the exposure periods P1 to P4 is assumed to be S0 .
  • Signal S0 contains cerebral blood flow information.
  • the processor 40 generates and outputs luminance image data based on the signal S0 .
  • the time width of the light emission pulse and the reflected light pulse is set to T0 .
  • the time from when the light source 20 starts emitting light pulses to when the exposure periods P1A to P4A end is tA .
  • the time from when the light source 20 starts emitting light pulses to when the exposure periods P1B to P4B start is tB .
  • Let SA be the signal indicating the amount of charge accumulated in the exposure periods P1A to P4A.
  • the signal indicating the amount of charge accumulated in the exposure periods P1B to P4B is SB .
  • the intensity of each of the signals SA and SB varies with the distance Z. Assuming that the speed of light in air is c (approximately 3.0 × 10^8 m/s), the distance Z can be calculated using the following Equation (2).
  • the processing device 40 generates and outputs distance image data based on Equation (2). Details of the method of generating distance image data based on Equation (2) are disclosed in Japanese Patent Application No. 2021-012027 (filed on January 28, 2021). For reference, the entire disclosure of Japanese Patent Application No. 2021-012027 is incorporated herein by reference.
  • the processing device 40 acquires range image data in addition to luminance image data for acquiring cerebral blood flow data, and calculates the measured distance Z0 based on the range image data. Since the measurement distance Z0 can be accurately measured, the initial value of the cerebral blood flow and the exposure period can be reset with more accurate timing.
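  • Equation (2) itself is not reproduced in this text. Purely as a sketch, one common two-window formulation that is consistent with the definitions of T0, tA, tB, SA, and SB given above is shown below; the exact form of Equation (2) in the original disclosure may differ.

```python
C = 3.0e8  # speed of light in air (m/s), as assumed in the text

def distance_from_two_windows(s_a, s_b, t_a, t_b, t0):
    """Estimate the distance Z from the two exposure signals.

    s_a, s_b : charges accumulated in the exposure windows ending at t_a
               (rising-edge windows P1A-P4A) and starting at t_b
               (falling-edge windows P1B-P4B)
    t_a, t_b : times measured from the start of light-pulse emission (s)
    t0       : time width of the emitted light pulse (s)

    Assumes a rectangular reflected pulse whose rising edge falls inside the A
    window and whose falling edge falls inside the B window, with accumulated
    charge proportional to the overlapped duration.
    """
    frac = s_b / (s_a + s_b)                 # fraction of the pulse arriving after the B window opens
    t_delay = (t_b - t0) + frac * (t_a - t_b + t0)   # round-trip delay of the reflected pulse
    return C * t_delay / 2.0
```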
  • FIG. 8 is a flowchart schematically showing another example of the correction operation performed by the processing device 40 when the measured distance Z0 between the photodetector 30 and the living body 10 changes.
  • the processing device 40 executes the operations of steps S201 to S212 shown in FIG. 8. In the following, the points different from the correction operation shown in FIG. 5 will be mainly described.
  • the processing device 40 acquires luminance image data and distance image data.
  • the processing device 40 extracts the forehead, which is the subject 12 of the living body 10, from the distance image indicated by the distance image data, and converts the distance values of the pixels that do not correspond to the forehead into missing values. This processing can prevent the distance values of the pixels that do not correspond to the forehead from affecting the calculation of the measured distance Z0 .
  • Extraction of the forehead can be performed, for example, as follows. A distance is estimated for each pixel, and the portion of the distance image that includes pixels whose estimated distance falls within a certain range is extracted as the forehead. Alternatively, the forehead is extracted by face detection processing using an image showing the appearance of the living body 10. These methods may also be combined.
  • Due to movement of the living body 10 during measurement, the distance value for a pixel near the edge of the forehead may become unreliable. Therefore, the distance values for the pixels corresponding to the peripheral region of the forehead may also be converted to missing values.
  • the peripheral region of the forehead may be, for example, a region 0 pixels or more and 5 pixels or less inward from the edge of the forehead.
  • Converting the distance values of the pixels located 0 pixels from the edge of the forehead into missing values means converting the distance values of the pixels corresponding to the edge itself into missing values. Converting the distance values of the pixels located in the region up to n pixels inward from the edge of the forehead (where n is an integer from 1 to 5) into missing values means repeating the following operation n times: convert the distance values of the pixels corresponding to the edge into missing values, re-detect the edge from the remaining forehead region, and convert the distance values of the pixels corresponding to the re-detected edge into missing values.
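  • A minimal sketch of this conversion is shown below, assuming the forehead has already been extracted as a boolean mask and using binary erosion as an approximation of the repeated edge re-detection described above; the library choice and parameter names are assumptions.

```python
import numpy as np
from scipy.ndimage import binary_erosion

def mask_forehead_periphery(distance_image, forehead_mask, n=2):
    """Convert distance values outside the forehead, and within roughly n pixels of
    its edge, into missing values (NaN).

    distance_image : 2-D array of per-pixel distance values
    forehead_mask  : 2-D boolean array, True where the pixel corresponds to the forehead
    n              : number of additional edge-removal repetitions (0 to 5 above)
    """
    out = distance_image.astype(float).copy()
    out[~forehead_mask] = np.nan              # pixels that do not correspond to the forehead
    kept = binary_erosion(forehead_mask)      # removing the edge itself corresponds to n = 0
    for _ in range(n):                        # repeat: re-detect the edge and remove it again
        kept = binary_erosion(kept)
    out[forehead_mask & ~kept] = np.nan       # peripheral region of the forehead
    return out
```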
  • Step S203 The processing device 40 determines whether or not the number of pixels showing missing values is equal to or less than a predetermined threshold.
  • the threshold can be determined, for example, according to the size of the image of the living body 10 in the image. If the determination is No, the processing device 40 performs the operation of step S204. If the determination is Yes, the processing device 40 performs the operation of step S206.
  • If the determination in step S203 is No, that is, if the number of missing-value pixels exceeds the threshold, the measured distance Z0 is not within the measurement range corresponding to the currently set three types of exposure periods. If the measured distance Z0 is longer than the upper limit of the measurement range, the reflected light pulse reaches the photodetector 30 later, so the amount of light detected during the exposure periods P1A to P4A relatively decreases and the amount of light detected during the exposure periods P1B to P4B relatively increases. Conversely, if the measured distance Z0 is shorter than the lower limit of the measurement range, the reflected light pulse reaches the photodetector 30 earlier, so the amount of light detected during the exposure periods P1A to P4A relatively increases and the amount of light detected during the exposure periods P1B to P4B relatively decreases. The accuracy of the distance data can be reduced by such an increase or decrease in the amount of detected light.
  • the processing device 40 resets the initial value of the cerebral blood flow in step S204, and resets the three exposure periods in step S205.
  • the processing device 40 may search the three types of exposure periods corresponding to the measurement ranges by switching in order of the measurement ranges R1 to R4.
  • the measurement ranges R1 to R4 may be switched in order from the measurement range closest to the measurement range corresponding to the measurement distance Z0 . By such switching, it is possible to efficiently search for three types of exposure periods corresponding to appropriate measurement ranges.
  • the processing device 40 executes the operations of steps S201 to S203 each time switching is performed. If the three switched exposure periods are appropriate, the determination in step S203 is Yes. If the three switched exposure periods are not appropriate, the determination in step S203 is No.
  • the processing device 40 acquires distance data from the distance image data.
  • the processing device 40 calculates the average value of the distances on the forehead as the measured distance Z0 .
  • the processing unit 40 generates data regarding the measured distance Z0 based on the intensity of the reflected light pulse detected by each of the plurality of pixels.
  • Steps S207 to S212 The operations from steps S207 to S212 are the same as the operations from steps S103 to S108 shown in FIG.
  • the processing device 40 repeatedly executes the operations of steps S201 to S212 for each frame until the measurement of cerebral blood flow is completed. As a result, the accuracy of the measured distance Z0 can be improved, and the cerebral blood flow data of the living body 10 can be acquired more stably without contact.
  • the living body 10 can move forward and backward, move horizontally, and tilt.
  • An example of the correction operation performed by the processing device 40 according to the present embodiment when the living body 10 makes such movements will be described below.
  • In step S206, the processing device 40 expresses the distance indicated by the distance data as the sum of the initial distance before body movement of the living body 10 and the deviation ΔZ from that initial distance (i.e., Z0 = initial distance + ΔZ).
  • the processing device 40 processes the range images before and after the body motion of the living body 10 using a known ICP algorithm, and aligns the range images before and after the body motion using the rotation matrix and the translation vector.
  • the processing device 40 sets the average value of the distances at the forehead before body movement of the living body 10 as the initial distance, and sets the component of the translation vector in the front-rear direction as the distance ⁇ Z.
  • a rotation matrix and a translation vector that define the body motion of the living body 10 can be obtained by the ICP algorithm.
  • the processing device 40 may make the determinations in steps S207 and S209 based not only on the distance but also on the amount of horizontal body movement and/or the amount of face rotation.
  • the amount of body motion in the horizontal direction and the amount of rotation of the face can be known from the above translation vector and rotation matrix, respectively. As described above, fluctuations in the cerebral blood flow data due to body movements other than the back and forth movement of the living body 10 can be suppressed, and the cerebral blood flow data can be stably acquired.
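  • As an illustration, once a rotation matrix R and a translation vector t have been obtained by the ICP algorithm, the quantities used in these determinations might be derived as in the sketch below; the axis convention (z = front-back direction of the face) is an assumption.

```python
import numpy as np

def body_motion_metrics(R, t, z0_init):
    """Derive body-motion quantities from an ICP registration result.

    R       : 3x3 rotation matrix aligning the range images before and after body motion
    t       : length-3 translation vector from the same registration
    z0_init : average forehead distance before body motion (initial distance)

    Returns the corrected measured distance Z0, the horizontal body-motion amount,
    and the face rotation angle in degrees (axis convention is an assumption:
    x = horizontal, y = vertical, z = front-back).
    """
    delta_z = float(t[2])                      # front-back component of the translation
    z0 = z0_init + delta_z                     # Z0 = initial distance + deviation ΔZ
    horizontal = float(np.hypot(t[0], t[1]))   # in-plane body-motion amount
    # Rotation angle about an arbitrary axis, from the trace of R.
    angle = float(np.degrees(np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))))
    return z0, horizontal, angle
```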
  • the photodetection system 100 may further include, in addition to the photodetection device 30, another photodetection device for measuring the distance between the photodetection device 30 and the living body 10.
  • the other photodetection device can be, for example, an RGB camera, an omnidirectional camera, or a ToF camera.
  • With the RGB camera, the distance can be calculated from the magnification or reduction ratio of the image of the living body 10 in the acquired image.
  • With the omnidirectional camera, the distance can be calculated by, for example, mounting a camera directly above each of the photodetector 30 and the living body 10.
  • With the ToF camera, the distance can be calculated from its distance measurement result.
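  • For the RGB-camera case, a sketch of estimating the distance from the magnification or reduction of the face in the image is shown below, under the pinhole-camera assumption that apparent size is inversely proportional to distance; the reference values are hypothetical.

```python
def distance_from_scale(face_width_px, ref_face_width_px, ref_distance_m):
    """Estimate distance from an RGB image, assuming a pinhole camera so that the
    apparent face width scales inversely with distance.

    face_width_px     : face width in the current frame (pixels)
    ref_face_width_px : face width measured once at a known distance (pixels)
    ref_distance_m    : that known reference distance (m)
    """
    return ref_distance_m * ref_face_width_px / face_width_px
```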
  • the other photodetector can measure the body motion of the living body 10 from a plurality of viewpoints, and the distance between the photodetector 30 and the living body 10 can be measured more accurately.
  • whether or not the correction operation of the processing device 40 according to this embodiment is being executed can be known by checking the time change of the voltage output by the processing device.
  • the output voltage is zero during the non-exposure period, and becomes non-zero during the exposure period. That is, a voltage pulse corresponding to the exposure period appears in the temporal change of the output voltage.
  • When the living body 10 moves as shown in (a) of FIG. 6A, the voltage pulse shifts along the time axis because the exposure period is reset. When the living body 10 moves as shown in (a) of FIG. 6B, or when the living body 10 does not move, the exposure period is not reset, so no shift of the voltage pulse occurs.
  • whether or not the voltage pulse shifts depends on the body motion of the living body 10 .
  • The items described below are the configuration of the photodetector 30, the operation of emitting the first light pulse Ip1 and the second light pulse Ip2, the operation of the processing device 40, and the calculation of the amounts of change from the initial values of the concentrations of HbO2 and Hb in blood.
  • FIG. 9 is a diagram showing an example of the configuration of the photodetector 30.
  • a region surrounded by a two-dot chain line frame corresponds to one pixel 201 .
  • Pixel 201 includes one photodiode, not shown. Although eight pixels arranged in two rows and four columns are shown in FIG. 9, more pixels may actually be arranged.
  • Each pixel 201 includes a first floating diffusion layer 204 and a second floating diffusion layer 206 .
  • the wavelength of the first optical pulse Ip1 is 650 nm or more and shorter than 805 nm
  • the wavelength of the second optical pulse Ip2 is longer than 805 nm and 950 nm or less.
  • the first floating diffusion layer 204 accumulates charges generated by receiving the first reflected light pulse from the first light pulse Ip1 .
  • the second floating diffusion layer 206 accumulates charges generated by receiving the second reflected light pulse from the second light pulse Ip2 .
  • the signals accumulated in the first floating diffusion layer 204 and the second floating diffusion layer 206 are treated as if they were two pixel signals of a general CMOS image sensor, and output from the photodetector 30 .
  • Each pixel 201 has four signal detection circuits.
  • Each signal detection circuit includes a source follower transistor 309 , a row select transistor 308 and a reset transistor 310 .
  • Each transistor is, for example, a field effect transistor formed on a semiconductor substrate, but is not limited to this.
  • one of the input and output terminals of source follower transistor 309 is connected to one of the input and output terminals of row select transistor 308 .
  • the one of the input and output terminals of source follower transistor 309 is typically the source.
  • the one of the input and output terminals of row select transistor 308 is typically the drain.
  • the gate, which is the control terminal of the source follower transistor 309, is connected to the photodiode. Signal charges of holes or electrons generated by the photodiode are accumulated in a floating diffusion layer, which is a charge accumulation portion between the photodiode and the source follower transistor 309.
  • the first floating diffusion layer 204 and the second floating diffusion layer 206 are connected to photodiodes.
  • a switch may be provided between the photodiode and each of the first floating diffusion layer 204 and the second floating diffusion layer 206 . This switch switches the conduction state between the photodiode and each of the first floating diffusion layer 204 and the second floating diffusion layer 206 in response to the signal accumulation pulse from the processing device 40 . This controls the start and stop of signal charge accumulation in each of the first floating diffusion layer 204 and the second floating diffusion layer 206 .
  • the electronic shutter in this embodiment has a mechanism for such exposure control.
  • the signal charges accumulated in the first floating diffusion layer 204 and the second floating diffusion layer 206 are read out by turning on the gate of the row selection transistor 308 by the row selection circuit 302 .
  • the current flowing from the source follower power supply 305 to the source follower transistor 309 and the source follower load 306 is amplified according to the signal potential of the first floating diffusion layer 204 and the second floating diffusion layer 206 .
  • An analog signal based on this current read out from the vertical signal line 304 is converted into digital signal data by an analog-digital (AD) conversion circuit 307 connected for each column. This digital signal data is read column by column by the column selection circuit 303 and output from the photodetector 30 .
  • After reading one row, the row selection circuit 302 and the column selection circuit 303 read out the next row, and in the same way read out the signal charge information of the floating diffusion layers of all the rows. After all the signal charges have been read out, the processing device 40 resets all the floating diffusion layers by turning on the gate of the reset transistor 310. This completes the imaging of one frame. By repeating frame imaging at high speed in the same manner, the photodetector 30 completes the imaging of a series of frames.
  • the photodetector 30 may be another type of imaging device.
  • the photodetector 30 may be, for example, a CCD type, a single photon counting device, or an intensifying image sensor such as an EMCCD or an ICCD.
  • FIG. 10A is a diagram showing an example of the operation of emitting the first optical pulse Ip1 and the second optical pulse Ip2 .
  • the emission of the first optical pulse Ip1 and the emission of the second optical pulse Ip2 may be alternately switched multiple times. As a result, the time difference between the acquisition timings of the detection images at the two wavelengths can be reduced, and imaging with the first optical pulse Ip1 and the second optical pulse Ip2 can be performed almost simultaneously even when the subject 12 is moving.
  • FIG. 10B is a diagram showing another example of the operation of emitting the first optical pulse Ip1 and the second optical pulse Ip2 .
  • the emission of the first optical pulse Ip1 and the emission of the second optical pulse Ip2 may be switched for each frame.
  • detection of the first reflected light pulse by the first light pulse Ip1 and detection of the second reflected light pulse by the second light pulse Ip2 can be switched for each frame.
  • each pixel 201 may have a single charge reservoir. With such a configuration, the number of charge storage units in each pixel 201 can be reduced, so the size of each pixel 201 can be increased, and the sensitivity can be improved.
  • FIG. 11 is a flow chart outlining the operation of the processor 40 with respect to the light source 20 and the photodetector 30.
  • the processor 40 causes the photodetector 30 to detect at least part of the fall period components of each of the first and second reflected light pulses by performing the operation schematically shown in FIG. 11.
  • In step S301, the processing device 40 causes the first light source 20a to emit the first light pulse Ip1 for a predetermined time. At this time, the electronic shutter of the photodetector 30 is in a state in which exposure is stopped. The processing device 40 keeps the electronic shutter in the exposure-stopped state during the period in which the surface reflection component I1 of the first reflected light pulse reaches the photodetector 30.
  • In step S302, the processor 40 causes the electronic shutter to start exposure at the timing when the internal scattering component I2 of the first reflected light pulse reaches the photodetector 30.
  • step S303 the processing device 40 causes the electronic shutter to stop exposure after a predetermined time has elapsed.
  • Signal charges are accumulated in the first floating diffusion layer 204 shown in FIG. 9 by steps S302 and S303. These signal charges are referred to as the "first signal charges".
  • In step S304, the processing device 40 causes the second light source 20b to emit the second light pulse Ip2 for a predetermined time. At this time, the electronic shutter of the photodetector 30 is in a state in which exposure is stopped. The processing device 40 keeps the electronic shutter in the exposure-stopped state during the period in which the surface reflection component I1 of the second reflected light pulse reaches the photodetector 30.
  • In step S305, the processor 40 causes the electronic shutter to start exposure at the timing when the internal scattering component I2 of the second reflected light pulse reaches the photodetector 30.
  • Step S306 the processing device 40 causes the electronic shutter to stop exposure after a predetermined time has elapsed.
  • By steps S305 and S306, signal charges are accumulated in the second floating diffusion layer 206 shown in FIG. 9. These signal charges are referred to as the "second signal charges".
  • step S307 the processing device 40 determines whether or not the number of times the above signal accumulation has been performed has reached a predetermined number. If the determination in step S307 is No, the processing device 40 repeats steps S301 to S306 until it determines Yes. If the determination in step S307 is Yes, the processing device 40 performs the operation of step S308.
  • Step S308 the processor 40 causes the photodetector 30 to generate and output a first signal and a second signal based on the first signal charge and the second signal charge, respectively.
  • the first signal and the second signal contain internal information of the test part 12 .
  • the operation shown in FIG. 11 is summarized as follows.
  • the processing device 40 performs a first operation of causing the first light source 20a to emit the first light pulse Ip1 and causing the photodetector device 30 to detect at least part of the falling edge period of the first reflected light pulse.
  • the processing device 40 causes the second light source 20b to emit the second light pulse Ip2 and performs a second operation of causing the photodetector 30 to detect at least part of the fall period component of the second reflected light pulse.
  • the processing device 40 repeats a series of operations including the first operation and the second operation a predetermined number of times. Alternatively, the processing device 40 may repeat the first action a predetermined number of times, and then repeat the second action a predetermined number of times. The first action and the second action may be interchanged.
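  • Summarized as control flow, the repetition of the first operation and the second operation might be sketched as follows; emit_pulse, open_shutter, close_shutter, and read_out are hypothetical wrappers for the light-source and image-sensor drivers, not functions of any actual API.

```python
def acquire_two_wavelength_frame(dev, n_repeat, delay1_ns, delay2_ns, exposure_ns):
    """One frame consisting of the first and second operations repeated n_repeat times.

    dev         : hypothetical device handle wrapping the light-source and sensor drivers
    delay1_ns   : time from emission of Ip1 to the start of exposure (first operation)
    delay2_ns   : time from emission of Ip2 to the start of exposure (second operation)
    exposure_ns : predetermined exposure time
    """
    for _ in range(n_repeat):
        # First operation: emit Ip1 and accumulate charge into the first floating diffusion layer 204.
        dev.emit_pulse(source="20a")
        dev.open_shutter(after_ns=delay1_ns, store="FD204")
        dev.close_shutter(after_ns=delay1_ns + exposure_ns)
        # Second operation: emit Ip2 and accumulate charge into the second floating diffusion layer 206.
        dev.emit_pulse(source="20b")
        dev.open_shutter(after_ns=delay2_ns, store="FD206")
        dev.close_shutter(after_ns=delay2_ns + exposure_ns)
    # After the predetermined number of repetitions, read out the first and second signals.
    return dev.read_out(["FD204", "FD206"])
```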
  • the internal scattering component I2 can be detected with high sensitivity.
  • The attenuation rate of light inside the living body is very large; the light emerging from the living body can be attenuated to about 1/1,000,000 of the incident light. Therefore, with a single pulse irradiation, the amount of light may be insufficient to detect the internal scattering component I2. In the case of irradiation at class 1 of the laser safety standards, the amount of light is particularly weak.
  • the light source 20 emits light pulses a plurality of times, and the photodetector 30 is also exposed a plurality of times by the electronic shutter accordingly, thereby integrating detection signals and improving sensitivity.
  • the multiple times of light emission and exposure are not essential, and are performed as necessary.
  • By causing the photodetector 30 to detect at least part of the rising period of each of the first and second reflected light pulses, the surface reflection component I1 of each reflected light pulse can be detected, making it possible to obtain surface information such as blood flow in the face and scalp.
  • In that case, the first floating diffusion layer 204 and the second floating diffusion layer 206 included in each pixel 201 shown in FIG. 9 can accumulate the charge generated by that detection.
  • two pixels 201 adjacent to each other in the row direction shown in FIG. 9 may be treated as one pixel.
  • the first floating diffusion layer 204 and the second floating diffusion layer 206 included in one of the two pixels 201 can accumulate the charges generated by receiving at least part of the falling-period components of the first and second reflected light pulses, respectively. The first floating diffusion layer 204 and the second floating diffusion layer 206 included in the other pixel 201 can accumulate the charges generated by receiving at least part of the rising-period components of the first and second reflected light pulses, respectively.
  • not only the internal information of the living body 10 but also the surface information can be obtained.
  • Equations (3) and (4) below represent examples of simultaneous equations.
  • ΔHbO2 and ΔHb represent the amounts of change from the initial values of the concentrations of HbO2 and Hb in blood, respectively.
  • ε750OXY and ε750deOXY represent the molar extinction coefficients of HbO2 and Hb at a wavelength of 750 nm, respectively.
  • ε850OXY and ε850deOXY represent the molar extinction coefficients of HbO2 and Hb at a wavelength of 850 nm, respectively.
  • I750ini and I750now represent the detected intensities at a wavelength of 750 nm at the reference time (initial time) and at a certain time, respectively.
  • I850ini and I850now represent the detected intensities at a wavelength of 850 nm at the reference time (initial time) and at a certain time, respectively. These quantities represent, for example, the detected intensities in the non-activated state and the activated state of the brain.
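  • Equations (3) and (4) themselves are not reproduced in this text. As a sketch only, a modified-Beer-Lambert-style pair of simultaneous equations built from the quantities defined above could be solved for ΔHbO2 and ΔHb as follows; the actual Equations (3) and (4) may include additional factors such as an optical path length.

```python
import numpy as np

def hemoglobin_changes(i750_ini, i750_now, i850_ini, i850_now,
                       e750_oxy, e750_deoxy, e850_oxy, e850_deoxy):
    """Solve a 2x2 system for the changes ΔHbO2 and ΔHb from their initial values.

    The left-hand side uses the molar extinction coefficients at 750 nm and 850 nm;
    the right-hand side uses the log-ratio of the detected intensities at a certain
    time relative to the reference (initial) time. This is a modified Beer-Lambert
    sketch, not the literal Equations (3) and (4).
    """
    A = np.array([[e750_oxy, e750_deoxy],
                  [e850_oxy, e850_deoxy]], dtype=float)
    b = np.array([-np.log(i750_now / i750_ini),
                  -np.log(i850_now / i850_ini)], dtype=float)
    d_hbo2, d_hb = np.linalg.solve(A, b)
    return d_hbo2, d_hb
```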
  • FIG. 12 is a flowchart for explaining this modification.
  • FIG. 13A is a timing chart for explaining this modification.
  • FIG. 13B is a timing chart for explaining this modification.
  • FIG. 14 is a table for explaining this modified example.
  • This modification may be as shown below.
  • A method performed by one or more processors configured to execute instructions stored in one or more memories, the method comprising: (a) determining the distance d(i) between the apparatus and the living body based on one or more images taken of the living body including the test site at time t(i), whereby the distances d(k), ..., and the distance d(k+n0) are determined, wherein i is a variable taking integer values of 1 or more, k is an integer of 1 or more, and n0 is a predetermined integer of 1 or more; and (b-1) performing a first process when the distances d(k), ... The first process includes: (c-1) determining p(k+n0) based on the distances d(k), ...
  • the first processing may include generating information Inf(k+n0) on the blood of the living body based on the light r(k+n0).
  • the light r(k+n0) may be light from the subject based on the light o(k+n0).
  • the first processing described above corresponds to S1004 to S1005 in FIG. 12.
  • the photodetection system 100 includes a processor (not shown) and memory 46 .
  • a processor may be one or more processors.
  • Memory 46 may be one or more memories.
  • the light detection system 100 may further include a first camera (not shown) and a second camera (not shown). The memory 46 stores a plurality of instructions, which are executed by the processor. The plurality of instructions include the processes shown in S1001 to S1006 of FIG. 12.
  • the photodetection system 100 accepts an instruction to start processing.
  • the processor sets the value of the timer of the photodetection system 100 to 0, starts the timer, and executes the processes shown in S1001 to S1006 of FIG. 12.
  • the time indicated in this modified example may be the time indicated by the timer.
  • the processor causes the first camera to image the living body 10 including the subject 12 at time t(i).
  • the first camera thereby generates an image IL(i).
  • Image IL(i) is stored in memory 46 .
  • the processor causes the second camera to image the living body 10 including the subject 12 at time t(i).
  • the second camera thereby produces an image IR(i).
  • Image IR(i) is stored in memory 46 .
  • the time interval for imaging the living body 10 including the subject 12 by the first camera may be constant.
  • a time interval for imaging the living body 10 including the test part 12 by the second camera may be constant.
  • the processor calculates the distance d(i) between the points included in the living body 10 and the points included in the photodetection system 100 based on the images IL(i) and IR(i) stored in the memory 46 .
  • Distance d(i) may be referred to as the distance between detection system 100 and living body 10 .
  • the processor stores the calculated distance d(i) in the memory 46.
  • the points included in the living body 10 may be predetermined points included in the subject 12 .
  • the points included in the photodetection system 100 may be predetermined points included in the photodetection device 30 .
  • The distance may be determined based on the three-dimensional coordinates (xi, yi, zi) of points included in the living body 10. The three-dimensional coordinates (xi, yi, zi) may be determined by providing a stereo camera system and having the processor use a technique for measuring the distance to points included in the living body 10. The stereo camera system includes the above-described first camera and second camera. Alternatively, a single-camera ranging technique may be used to determine the distance.
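  • One standard way to obtain such coordinates with the first camera and the second camera is sketched below, assuming a rectified stereo pair with known focal length and baseline; the parameter names are illustrative.

```python
def stereo_distance(u_left_px, u_right_px, focal_px, baseline_m):
    """Depth of one matched point from a rectified stereo pair.

    u_left_px, u_right_px : horizontal pixel coordinates of the same point on the
                            living body in images IL(i) and IR(i)
    focal_px              : focal length expressed in pixels
    baseline_m            : distance between the two camera centres (m)
    """
    disparity = u_left_px - u_right_px
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    return focal_px * baseline_m / disparity  # z-coordinate, i.e. the distance d(i)
```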
  • In this way, the distance d(1), the distance d(2), ..., the distance d(k), ..., and the distance d(k+n0) are determined. The distances d(k), ..., d(k+n0) are shown in FIGS. 13A and 13B.
  • i is a variable, i takes the value of an integer of 1 or more, k is an integer of 1 or more, and n0 is a predetermined integer of 1 or more.
  • the processor determines d(k+n0)max, which is the maximum value of the distances d(k) to d(k+n0).
  • the processor determines d(k+n0)min, which is the minimum value of the distances d(k) to d(k+n0).
  • the processor determines whether Δd(k+n0), which represents the variation of the distances d(k) to d(k+n0) (for example, the difference between d(k+n0)max and d(k+n0)min), is less than or equal to a predetermined value d0.
  • If Δd(k+n0) ≤ d0, the processor executes the processing shown in S1004 and the processing shown in S1005. If Δd(k+n0) > d0, the processor executes the processing shown in S1006 without executing the processing shown in S1004 and S1005, and then, in S1002, determines the distance d(k+n0+1) based on one or more images of the living body captured at time t(k+n0+1).
  • the processor refers to the table shown in FIG. 14 recorded in the memory 46 to determine the mask time p(k+n0) corresponding to d(k+n0)avg, where d(k+n0)avg is, for example, the average of the distances d(k) to d(k+n0). For example, if L1 ≤ d(k+n0)avg < L2, the mask time p(k+n0) is p2. In FIG. 14, p1 < p2 < ... may be satisfied.
  • the processor causes the light source 20 to emit a light pulse Ip toward the test site 12 of the living body 10 at time {t(k+n0)+t0}. That is, the processor causes the light source 20 to emit the light o(k+n0) toward the test site 12 of the living body 10 at time {t(k+n0)+t0}.
  • the processor causes the photodetector 30 to start detecting the light r(k+n0) from the test site 12 of the living body 10 at time {t(k+n0)+t0+p(k+n0)}.
  • the processor causes the photodetector 30 to finish detecting the light r(k+n0) at time {t(k+n0)+t0+p(k+n0)+(predetermined exposure period)}.
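  • Putting S1003 to S1005 together, a minimal sketch is shown below; the table thresholds and mask times mirror the structure of FIG. 14 but the concrete numbers are hypothetical, and Δd(k+n0) is taken here as the spread between the maximum and minimum of the recent distances.

```python
# Hypothetical stand-in for the FIG. 14 table: (upper bound of d(k+n0)avg in m, mask time in s).
MASK_TABLE = [(0.5, 3.3e-9), (0.8, 5.3e-9), (1.1, 7.3e-9), (float("inf"), 9.3e-9)]

def decide_and_schedule(distances, d0, t_emit_base):
    """Sketch of S1003 (stability check), S1004 (mask-time lookup), and S1005 (scheduling).

    distances   : recent distances d(k), ..., d(k+n0) in metres
    d0          : allowed spread of the distances (m)
    t_emit_base : time {t(k+n0) + t0} at which the light o(k+n0) would be emitted (s)
    """
    if max(distances) - min(distances) > d0:     # S1003 is No: skip S1004 and S1005
        return None
    d_avg = sum(distances) / len(distances)
    # S1004: determine the mask time p(k+n0) corresponding to d(k+n0)avg.
    p = next(mask for bound, mask in MASK_TABLE if d_avg < bound)
    # S1005: emit at t_emit_base and start detection after the mask time has elapsed.
    return {"emit_at": t_emit_base, "detect_from": t_emit_base + p}
```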
  • FIG. 13A will be described.
  • FIG. 13A shows an overview of processing related to the case of Yes in S1003.
  • the processor calculates the distance d(k) based on the image IL(k) and the image IR(k) captured at time t(k), and calculates the distance d(k+n0) based on the image IL(k+n0) and the image IR(k+n0) captured at time t(k+n0).
  • the processor causes the light source 20 to emit the light o(k+n0) toward the test site 12 of the living body 10 at time {t(k+n0)+t0}.
  • the processor causes the photodetector 30 to detect the light r(k+n0) from the test site 12 of the living body 10 from time {t(k+n0)+t0+p(k+n0)}.
  • FIG. 13B will be described.
  • FIG. 13B shows an overview of the processing related to No in S1003.
  • the processor calculates the distance d(k) based on the image IL(k) and the image IR(k) captured at time t(k), and calculates the distance d(k+n0) based on the image IL(k+n0) and the image IR(k+n0) captured at time t(k+n0).
  • the processor does not cause the light source 20 to emit the light o(k+n0) toward the test site 12 of the living body 10 .
  • the processor does not cause the photodetector 30 to detect the light r(k+n0) from the subject 12 of the living body 10 .
  • Information Inf(k+n0) regarding the blood of the living body may be generated.
  • the information about the blood of the living body may be the concentration of HbO 2 and/or the concentration of Hb in the blood generated using equations (3) and (4).
  • the processor may end the processing shown in FIG. 12.
  • The processing shown in S1003 may instead be the following: the processor calculates σ(k+n0), which is the deviation (for example, the standard deviation) of the distances d(k) to d(k+n0), and determines whether it is less than or equal to a predetermined value σ0. If σ(k+n0) ≤ σ0, the processor executes the processing shown in S1004 and the processing shown in S1005. If σ(k+n0) > σ0, the processor executes the processing shown in S1006 without executing the processing shown in S1004 and S1005.
  • t0 may be 0.
  • (i) the processor causes the first camera to image the living body 10 including the test site 12 at time t(k+n0); (ii) the processor causes the second camera to image the living body 10 including the test site 12 at time t(k+n0); and (iii) the processor causes the light source 20 to emit the light o(k+n0) toward the test site 12 of the living body 10 at time t(k+n0).
  • the processor causes the photodetector 30 to detect the light r(k+n0) from the test site 12 of the living body 10 from time {t(k+n0)+p(k+n0)}.
  • Alternatively, (i) the processor causes the first camera to image the living body 10 including the test site 12 at time t(k+n0); (ii) the processor causes the second camera to image the living body 10 including the test site 12 at time t(k+n0); and (iii) the processor causes the light source 20 to emit the light o(k+n0) toward the test site 12 of the living body 10 at time t(k+n0).
  • In this case, the processor does not cause the photodetector 30 to detect the light r(k+n0) from the test site 12 of the living body 10 from time {t(k+n0)+p(k+n0)}.
  • the distance d(i) is obtained using the first camera and the second camera.
  • light source 20 and photodetector 30 may be used to determine distance d(i).
  • t0 0.
  • the light detection system according to the present disclosure is capable of acquiring biometric information on a subject of a living body.
  • Optical detection systems in the present disclosure are useful, for example, for biosensing.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)

Abstract

A light detection system (100) comprises: a light source (20) that emits a light pulse toward a part to be examined of a living body; a light detection device (30) that detects a reflected light pulse generated by the light pulse being reflected by the part to be examined; and a processing circuit (42, 44) that controls the light source and the light detection device and acquires distance data relating to the distance between the light detection device and the living body. The processing circuit causes the light source to emit the light pulse, causes the light detection device to start detecting the reflected light pulse when a first period has elapsed since the light pulse was emitted, and causes the light detection device to end the detection of the reflected light pulse when a predetermined exposure time has elapsed since the detection of the reflected light pulse was started. When a time at which the distance deviates from a first range is defined as a first time and a time after the first time at which the temporal change of the distance falls within a predetermined range is defined as a second time, whether or not the first period needs to be changed is determined on the basis of the distance at the second time.
PCT/JP2022/041391 2021-11-22 2022-11-07 Système de détection de lumière, dispositif de traitement, procédé de commande de système de détection de lumière et programme WO2023090188A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023561536A JPWO2023090188A1 (fr) 2021-11-22 2022-11-07

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021189218 2021-11-22
JP2021-189218 2021-11-22

Publications (1)

Publication Number Publication Date
WO2023090188A1 true WO2023090188A1 (fr) 2023-05-25

Family

ID=86396947

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/041391 WO2023090188A1 (fr) 2021-11-22 2022-11-07 Système de détection de lumière, dispositif de traitement, procédé de commande de système de détection de lumière et programme

Country Status (2)

Country Link
JP (1) JPWO2023090188A1 (fr)
WO (1) WO2023090188A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017124153A (ja) * 2016-01-07 2017-07-20 パナソニックIpマネジメント株式会社 生体情報計測装置
JP2017185200A (ja) * 2016-03-31 2017-10-12 パナソニックIpマネジメント株式会社 光源と、光検出器と、制御回路とを備える撮像装置

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017124153A (ja) * 2016-01-07 2017-07-20 パナソニックIpマネジメント株式会社 生体情報計測装置
JP2017185200A (ja) * 2016-03-31 2017-10-12 パナソニックIpマネジメント株式会社 光源と、光検出器と、制御回路とを備える撮像装置

Also Published As

Publication number Publication date
JPWO2023090188A1 (fr) 2023-05-25

Similar Documents

Publication Publication Date Title
US11303828B2 (en) Imaging apparatus including light source that emits pulsed light, image sensor, and control circuit
JP6887097B2 (ja) 撮像装置
JP7065421B2 (ja) 撮像装置および対象物の内部の情報を取得する方法
JP6998529B2 (ja) 撮像装置
US10194094B2 (en) Imaging apparatus including light source that emits pulsed light, image sensor, and control circuit
WO2020044854A1 (fr) Dispositif de mesure biologique et procédé de mesure biologique
JPWO2019230306A1 (ja) 識別装置および識別方法
WO2020137276A1 (fr) Dispositif d'imagerie
WO2023090188A1 (fr) Système de détection de lumière, dispositif de traitement, procédé de commande de système de détection de lumière et programme
WO2021182018A1 (fr) Appareil de mesure, et procédé de commande d'un appareil de mesure
JP7417867B2 (ja) 光計測装置
JP7142246B2 (ja) 生体計測装置、ヘッドマウントディスプレイ装置、および生体計測方法
JP2021141949A (ja) 測定装置、およびプログラム
WO2022138063A1 (fr) Dispositif de mesure biologique, procédé de mesure biologique et programme informatique
WO2023079862A1 (fr) Système d'imagerie, dispositif de traitement et procédé exécuté par ordinateur dans un système d'imagerie
WO2020137352A1 (fr) Procédé de biodétection, procédé de génération de données cartographiques, programme, support lisible par ordinateur, et dispositif de biodétection
JP2020032105A (ja) 生体計測装置、生体計測システム、制御方法、およびコンピュータプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22895470

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023561536

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE