WO2023090188A1 - Light detecting system, processing device, method for controlling light detecting system, and program

Light detecting system, processing device, method for controlling light detecting system, and program

Info

Publication number
WO2023090188A1
Authority
WO
WIPO (PCT)
Prior art keywords
time
distance
light pulse
photodetector
living body
Prior art date
Application number
PCT/JP2022/041391
Other languages
French (fr)
Japanese (ja)
Inventor
俊輔 今井
貴真 安藤
Original Assignee
Panasonic Intellectual Property Management Co., Ltd. (パナソニックIpマネジメント株式会社)
Priority date
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Publication of WO2023090188A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B5/1455 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters

Definitions

  • the present disclosure relates to a photodetection system, a processing device, a method of controlling the photodetection system, and a program.
  • Patent Literature 1 discloses an apparatus for acquiring internal information of a subject.
  • the present disclosure provides a photodetection system capable of stably acquiring biometric information of a subject in a non-contact manner in an environment in which a living body moves.
  • According to one aspect of the present disclosure, a light detection system includes a light source that emits a light pulse toward a subject part of a living body, a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the subject part, and a processing circuit that controls the light source and the photodetector and acquires distance data regarding the distance between the photodetector and the living body. The processing circuit causes the light source to emit the light pulse, causes the photodetector to start detecting the reflected light pulse when a first time elapses after the light pulse is emitted, and causes the photodetector to end the detection of the reflected light pulse when a predetermined exposure time elapses after the detection of the reflected light pulse is started.
  • The processing circuit sets the point in time at which the distance deviates from a first range as a first time point, sets a point in time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point, and determines whether or not to change the first time based on the distance at the second time point.
  • Computer-readable recording media include non-volatile recording media such as CD-ROMs (Compact Disc-Read Only Memory).
  • According to the present disclosure, it is possible to provide a photodetection system capable of stably acquiring biometric information of a subject without contact in an environment in which the living body moves.
  • FIG. 1A is a diagram showing the relationship between the time change of the reflected light pulse and the fixed exposure period when the distance between the imaging device and the subject changes.
  • FIG. 1B is a diagram for explaining an example of resetting the initial value of cerebral blood flow and the exposure period when the distance between the imaging device and the subject falls below a predetermined threshold.
  • FIG. 2 is a block diagram that schematically illustrates the configuration of a photodetection system according to an exemplary embodiment of the present disclosure;
  • FIG. 3A is a diagram schematically showing an example of temporal changes in surface reflection components and internal scattering components included in a reflected light pulse when the light pulse has an impulse waveform.
  • FIG. 3B is a diagram schematically showing an example of temporal changes in the surface reflection component and the internal scattering component included in the reflected light pulse when the light pulse has a rectangular waveform.
  • FIG. 4 is a diagram showing the positional relationship between the photodetector and the living body in this embodiment.
  • FIG. 5 is a flow chart schematically showing an example of correction operation performed by the processing device in this embodiment when the measured distance between the photodetector and the living body changes.
  • FIG. 6A is a diagram for explaining an example of applying the correction operation of the processing device according to the present embodiment when the measurement distance between the photodetector and the living body becomes short.
  • FIG. 6B is a diagram for explaining an example of applying the correction operation of the processing device according to the present embodiment when the measurement distance between the photodetector and the living body decreases and then increases.
  • FIG. 7 is a diagram for explaining a method of generating distance image data in this embodiment.
  • FIG. 8 is a flowchart schematically showing another example of correction operation performed by the processing device according to the present embodiment when the measured distance between the photodetector and the living body changes.
  • FIG. 9 is a diagram showing an example of the configuration of a photodetector.
  • FIG. 10A is a diagram showing an example of the operation of emitting the first optical pulse and the second optical pulse.
  • FIG. 10B is a diagram showing another example of the operation of emitting the first optical pulse and the second optical pulse.
  • FIG. 11 is a flowchart outlining the operation of the processor with respect to the light source and photodetector.
  • FIG. 12 is a flowchart for explaining a modification.
  • FIG. 13A is a timing chart for explaining a modification.
  • FIG. 13B is a timing chart for explaining a modification.
  • FIG. 14 is a table for explaining the modification.
  • the reflected light pulse generated by irradiating the forehead of the subject with the light pulse contains much surface information of the subject during the rising period and contains much internal information of the subject during the falling period.
  • the imaging device disclosed in Patent Document 1 includes a light receiving element, and the light receiving element detects the fall period component of the reflected light pulse. From the received light intensity, the amount of change from the initial values of oxygenated hemoglobin (HbO 2 ) and deoxygenated hemoglobin (Hb) in blood in the brain can be calculated as the amount of change in cerebral blood flow.
  • the initial value is the concentration of oxygenated hemoglobin and deoxygenated hemoglobin in the brain blood at the start of measurement of the subject.
  • FIG. 1A is a diagram showing the relationship between the time change of the reflected light pulse and the fixed exposure period when the distance between the imaging device and the subject changes.
  • (a) of FIG. 1A shows the temporal change of the intensity of the light emission pulse.
  • (b) of FIG. 1A shows temporal changes in the intensity of the reflected light pulse in the light receiving element when the distance between the imaging device and the subject is appropriate for the exposure period.
  • (c) of FIG. 1A shows temporal changes in the intensity of the reflected light pulse when the distance between the imaging device and the subject is longer than the appropriate distance.
  • (d) of FIG. 1A shows temporal changes in the intensity of the reflected light pulse when the distance between the imaging device and the subject is shorter than the appropriate distance.
  • (e) of FIG. 1A shows an exposure period during which the light receiving element receives the reflected light pulse.
  • When the distance between the imaging device and the subject is appropriate, the light receiving element receives part of the fall period component of the reflected light pulse. As a result, it is possible to accurately measure the amount of change in cerebral blood flow.
  • When the distance between the imaging device and the subject increases, the arrival of the reflected light pulse at the light receiving element is delayed. As a result, the light receiving element receives not only part of the fall period component of the reflected light pulse but also components from other periods, and the measurement accuracy of the amount of change in cerebral blood flow decreases.
  • Patent Document 1 discloses a method of resetting the initial value of cerebral blood flow and the exposure period when the distance between the imaging device and the subject changes.
  • FIG. 1B is a diagram for explaining an example of resetting the initial value of cerebral blood flow and the exposure period when the distance between the imaging device and the subject falls below a predetermined threshold. The distance between the imaging device and the subject was measured by a TOF (Time Of Flight) method.
  • (a) of FIG. 1B shows the change over time of the distance between the imaging device and the subject. The dashed line represents the threshold.
  • (b) of FIG. 1B shows the relationship between the exposure period and time.
  • the vertical axis represents a plurality of preset exposure periods P1 to P4.
  • The exposure start times differ among the exposure periods P1 to P4; the smaller the number, the earlier the exposure start time. The time difference between the exposure start time and the exposure end time is the same for all of the exposure periods P1 to P4.
  • (c) of FIG. 1B shows temporal changes in the amount of change in cerebral blood flow when the subject is at rest.
  • the amount of change in cerebral blood flow is a value obtained by averaging the amount of change in oxygenated hemoglobin in the forehead of the optical model.
  • the optical model is an object imitating the light absorption characteristics of the human forehead, the light scattering characteristics of the human forehead, and the shape of the human forehead. A specific method for calculating the amount of change in cerebral blood flow will be described in detail later.
  • When the distance fell below the threshold, the exposure period was reset from P4 to P3 as shown in (b) of FIG. 1B, and the initial value of the cerebral blood flow was reset to the cerebral blood flow value at that timing.
  • Ideally, the amount of change in cerebral blood flow should remain substantially zero even if the distance changes.
  • In practice, however, the amount of change in cerebral blood flow fluctuated greatly while the distance was changing, and even after the distance change ended, the value deviated greatly from zero. This is because the initial value of the cerebral blood flow and the exposure period were reset while the distance was still changing.
  • In the photodetection system according to the present disclosure, therefore, the exposure period is changed after the change in the distance has subsided to some extent.
  • the biological information includes surface information and/or internal information of the subject.
  • A light detection system according to the first item includes a light source that emits a light pulse to a test site of a living body, a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the test site, and a processing circuit that controls the light source and the photodetector and acquires distance data relating to the distance between the photodetector and the living body.
  • The processing circuit causes the light source to emit the light pulse, causes the photodetector to start detecting the reflected light pulse when a first time elapses after the light pulse is emitted, and causes the photodetector to end the detection of the reflected light pulse when a predetermined exposure time has passed since the detection of the reflected light pulse was started.
  • The processing circuit sets the point in time at which the distance deviates from a first range as a first time point, sets a point in time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point, and determines whether or not to change the first time based on the distance at the second time point.
  • The photodetection system according to the second item is the photodetection system according to the first item, wherein the processing circuit changes the first time when the distance at the second time point is outside the first range.
  • the timing to start detecting reflected light pulses can be changed according to the distance between the photodetector and the living body.
  • The photodetection system according to the third item is the photodetection system according to the second item, wherein the processing circuit changes the first time to a second time corresponding to a second range when the distance at the second time point is within the second range, the second range being different from the first range.
  • The photodetection system according to the fourth item is the photodetection system according to the first item, wherein the processing circuit maintains the first time when the distance at the second time point is within the first range.
  • The photodetection system according to the fifth item is the photodetection system according to any one of the first to fourth items, wherein the processing circuit determines that the change over time of the distance falls within the predetermined range when a reference value determined based on at least one of the fluctuation width, the variance, and the standard deviation of the change over time of the distance is equal to or less than a threshold.
  • With this configuration, the distance between the photodetector and the living body can be accurately determined after the body movement of the living body.
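  • The following is a minimal sketch (not part of the patent text) of the fifth-item criterion: the change over time of the distance is considered settled when a reference value derived from the distance samples of the determination period is at or below a threshold. Here the fluctuation width is used, with the variance or standard deviation as alternatives; the 10 mm default is an assumption, chosen within the 1 mm to 30 mm example range given later in the text.

```python
def distance_change_is_settled(distance_samples_mm, threshold_mm=10.0):
    """Return True if the change over time of the distance has settled.

    distance_samples_mm: distances measured during the determination period.
    threshold_mm: assumed threshold (the text only gives an example range of 1 mm to 30 mm).
    """
    fluctuation_width = max(distance_samples_mm) - min(distance_samples_mm)
    # The variance or standard deviation of the samples could be used instead,
    # as the fifth item allows.
    return fluctuation_width <= threshold_mm
```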
  • The photodetection system according to the sixth item is the photodetection system according to any one of the first to fifth items, wherein the photodetector is an image sensor having a plurality of pixels arranged two-dimensionally.
  • the processing circuitry generates the distance data based on the intensity of the reflected light pulses detected by each of the plurality of pixels.
  • The photodetection system according to the seventh item is the photodetection system according to the sixth item, wherein the processing circuit determines whether or not to change the first time based on, among the plurality of pixels, the number of pixels that do not correspond to the test site, or the number of pixels that do not correspond to the test site together with the number of pixels that correspond to a peripheral region of the test site.
  • With this configuration, whether the timing to start detecting the reflected light pulse is appropriate can be judged from the number of pixels that do not correspond to the test site, or from that number together with the number of pixels that correspond to the peripheral region of the test site, as sketched below.
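  • The sketch below illustrates one hypothetical way to apply the seventh-item criterion; the pixel labels and the 30% threshold are assumptions, not values given in the patent.

```python
def should_reconsider_first_time(pixel_labels, max_non_target_fraction=0.3):
    """Hypothetical check based on the seventh item.

    pixel_labels: per-pixel classification of the sensor image, e.g. "target"
                  (test site), "periphery", or "other".
    max_non_target_fraction: assumed threshold (not specified in the patent).
    """
    total = len(pixel_labels)
    if total == 0:
        return False
    non_target = sum(1 for label in pixel_labels if label != "target")
    # The seventh item also allows combining this count with the number of
    # pixels labelled "periphery".
    return non_target / total > max_non_target_fraction
```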
  • The photodetection system according to the eighth item is the photodetection system according to any one of the first to seventh items, wherein the living body is a human and the subject is a human forehead.
  • the processing circuit generates brain activity data relating to brain activity of the living body based on a signal corresponding to the intensity of the reflected light pulse detected by the photodetector.
  • This light detection system can generate human brain activity data.
  • A processing device according to the ninth item is a processing device used in a light detection system including a light source that emits a light pulse to a subject part of a living body and a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the subject part.
  • the processing device comprises a processor and a memory storing a computer program executed by the processor.
  • The computer program causes the processor to: cause the light source to emit the light pulse; acquire distance data relating to the distance between the subject part and the photodetector; cause the photodetector to start detecting the reflected light pulse when a first time has passed since the light pulse was emitted, and to end the detection of the reflected light pulse when a predetermined exposure time has passed since the detection of the reflected light pulse was started; set the point in time at which the distance deviates from a first range as a first time point; set a point in time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point; and determine whether or not to change the first time based on the distance at the second time point.
  • A method according to the tenth item is a method of controlling a photodetection system including a light source that emits a light pulse to a subject part of a living body and a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the subject part. The method includes: causing the light source to emit the light pulse; acquiring distance data relating to the distance between the subject part and the photodetector; causing the photodetector to start detecting the reflected light pulse when a first time elapses after the light pulse is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time elapses after the detection of the reflected light pulse is started; setting the point in time at which the distance deviates from a first range as a first time point; setting a point in time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point; and determining whether or not to change the first time based on the distance at the second time point.
  • A program according to the eleventh item is a program executed by a computer that controls a photodetection system including a light source that emits a light pulse to a subject part of a living body and a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the subject part. The program causes the computer to: cause the light source to emit the light pulse; acquire distance data relating to the distance between the subject part and the photodetector; cause the photodetector to start detecting the reflected light pulse when a first time elapses after the light pulse is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time elapses after the detection of the reflected light pulse is started; set the point in time at which the distance deviates from a first range as a first time point; set a point in time after the first time point at which the change over time of the distance falls within a predetermined range as a second time point; and determine whether or not to change the first time based on the distance at the second time point.
  • In the present disclosure, all or part of a circuit, unit, device, member, or section, or all or part of a functional block in a block diagram, may be implemented by one or more electronic circuits including, for example, a semiconductor device, a semiconductor integrated circuit (IC), or an LSI (large scale integration).
  • An LSI or IC may be integrated on one chip, or may be configured by combining a plurality of chips.
  • functional blocks other than memory elements may be integrated on one chip.
  • LSIs or ICs may be called system LSIs, VLSIs (very large scale integration), or ULSIs (ultra large scale integration) depending on the degree of integration.
  • A Field Programmable Gate Array (FPGA), which is programmed after the LSI is manufactured, or a reconfigurable logic device in which the connection relationships inside the LSI can be reconfigured or the circuit partitions inside the LSI can be set up, can also be used for the same purpose.
  • Furthermore, the functions or operations of all or part of the circuits, units, devices, members, or parts can be executed by software processing.
  • In this case, the software is recorded on one or more non-transitory storage media such as a ROM, an optical disc, or a hard disk drive, and when the software is executed by a processor, the functions specified in the software are executed by the processor and peripheral devices.
  • a system or apparatus may include one or more non-transitory storage media on which software is recorded, a processor, and required hardware devices such as interfaces.
  • In the present disclosure, "light" means electromagnetic waves including not only visible light (wavelength of about 400 nm to about 700 nm) but also ultraviolet rays (wavelength of about 10 nm to about 400 nm) and infrared rays (wavelength of about 700 nm to about 1 mm).
  • FIG. 2 is a block diagram that schematically illustrates the configuration of a photodetection system according to an exemplary embodiment of the present disclosure
  • FIG. 2 shows a human head assuming that the living body 10 is a human.
  • the living body 10 is not always stationary, such as when working or driving a vehicle, but may move back and forth.
  • the living body 10 may be an animal instead of a human.
  • the subject 12 of the living body 10 is the forehead, but it may be a portion other than the forehead.
  • the photodetection system 100 shown in FIG. 2 includes a first light source 20a, a second light source 20b, a photodetection device 30, and a processing device 40.
  • the processor 40 comprises a control circuit 42 , a signal processor 44 and a memory 46 .
  • the first light source 20a and the second light source 20b are also referred to as "light source 20" without distinction.
  • the light source 20 emits light pulses for irradiating the subject 12 of the living body 10 .
  • the photodetector 30 acquires biometric information of the subject 12 by detecting a reflected light pulse generated by the light pulse being reflected by the subject 12 during a predetermined exposure period.
  • a reflected light pulse generated by being reflected by the test part 12 may include a component reflected by the surface of the test part 12 and a component diffusely reflected inside the test part 12 .
  • the processing device 40 suppresses fluctuations in biological information of the subject 12 by appropriately resetting the initial value of the cerebral blood flow and the exposure period. As a result, it is possible to stably acquire the biological information of the subject 12 in a non-contact manner.
  • The biological information may be, for example, blood flow information on the face or scalp of the living body 10, or cerebral blood flow information. Alternatively, the biological information may include both types of blood flow information.
  • the first light source 20a emits a first light pulse Ip1 for irradiating the part 12 to be inspected, as shown in FIG.
  • the first light pulse Ip1 has a first wavelength.
  • the second light source 20b emits a second light pulse Ip2 for illuminating the subject 12, as shown in FIG.
  • the second light pulse Ip2 has a second wavelength that is longer than the first wavelength.
  • the number of first light sources 20a is one, but it may be plural. The same applies to the number of second light sources 20b. Depending on the application, it is not necessary to use both the first light source 20a and the second light source 20b, and either one may be used.
  • the first optical pulse I p1 and the second optical pulse I p2 are also referred to herein as “optical pulses I p ” without distinction.
  • the light pulse Ip includes a rising portion and a falling portion.
  • the rising portion is the portion of the optical pulse Ip from when the intensity starts to increase until when the increase ends.
  • the trailing portion is the portion of the optical pulse Ip from when the intensity starts to decrease until the decrease ends.
  • A portion of the light pulse Ip that has reached the test site 12 becomes a surface reflection component I1 reflected on the surface of the test site 12, and the other portion becomes an internal scattering component I2 that is reflected or scattered once, or multiply scattered, inside the test site 12.
  • The surface reflection component I1 includes three components: a direct reflection component, a diffuse reflection component, and a scattered reflection component.
  • a direct reflection component is a reflection component for which the angle of incidence is equal to the angle of reflection.
  • the diffuse reflection component is a component that diffuses and reflects due to the uneven shape of the surface.
  • the scattered reflection component is the component that is scattered and reflected by the internal tissue near the surface.
  • the scattered reflection component is a component that is scattered and reflected inside the epidermis.
  • the surface reflection component I1 reflected on the surface of the subject 12 includes these three components.
  • the internal scattering component I2 will be described as not including the component scattered and reflected by the internal tissue near the surface.
  • the surface reflection component I1 and the internal scattering component I2 are reflected or scattered, the direction of travel of these components is changed, and some of them reach the photodetector 30 as reflected light pulses.
  • the surface reflection component I1 includes surface information of the living body 10, such as blood flow information of the face and scalp.
  • the internal scattering component I2 contains internal information of the living body 10, such as cerebral blood flow information.
  • For example, the cerebral blood flow, blood pressure, blood oxygen saturation, or heart rate of the living body 10 can be known from the cerebral blood flow information.
  • a method for detecting the internal scattering component I2 from the reflected light pulse will be described later in detail.
  • the first wavelength of the first optical pulse I p1 and the second wavelength of the second optical pulse I p2 may be arbitrary wavelengths included in the wavelength range of 650 nm to 950 nm, for example. This wavelength range is included in the red to near-infrared wavelength range.
  • the above wavelength range is called the "window of the body" and has the property of being relatively difficult to be absorbed by moisture and skin in the body.
  • detection sensitivity can be increased by using light in the above wavelength range.
  • the light used is believed to be absorbed primarily by oxygenated hemoglobin and deoxygenated hemoglobin.
  • changes in blood flow result in changes in the concentration of oxygenated hemoglobin and deoxygenated hemoglobin.
  • When these concentrations change, the degree of light absorption also changes. Therefore, when the blood flow changes, the amount of detected light also changes with time.
  • Oxygenated hemoglobin and deoxygenated hemoglobin differ in the wavelength dependence of light absorption. When the wavelength is 650 nm or more and shorter than 805 nm, the light absorption coefficient of deoxygenated hemoglobin is greater than that of oxygenated hemoglobin. At a wavelength of 805 nm, the light absorption coefficient of deoxygenated hemoglobin and the light absorption coefficient of oxygenated hemoglobin are equal. When the wavelength is longer than 805 nm and 950 nm or less, the light absorption coefficient of oxygenated hemoglobin is greater than that of deoxygenated hemoglobin.
  • For example, the first wavelength of the first optical pulse Ip1 may be set to 650 nm or more and shorter than 805 nm, and the second wavelength of the second optical pulse Ip2 may be set to longer than 805 nm and 950 nm or less.
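  • With two wavelengths chosen on either side of 805 nm, changes in the concentrations of HbO2 and Hb can be separated from the detected intensities. The sketch below uses the standard modified Beer-Lambert approach as a stand-in; the patent's own calculation is only referenced later and is not reproduced here, and the extinction coefficients and path length below are placeholders.

```python
import numpy as np

# Placeholder extinction coefficients [epsilon_HbO2, epsilon_Hb] at the two wavelengths.
EPSILON = np.array([
    [0.7, 1.0],   # first wavelength (650 nm to shorter than 805 nm): Hb absorbs more
    [1.1, 0.8],   # second wavelength (longer than 805 nm to 950 nm): HbO2 absorbs more
])
PATH_LENGTH = 1.0  # effective optical path length (placeholder)

def hemoglobin_changes(i_now, i_initial):
    """Estimate changes in HbO2 and Hb concentration from detected intensities.

    i_now, i_initial: intensities [I_wavelength1, I_wavelength2] at the current
    frame and at the initial (reference) frame, e.g. at the start of measurement.
    """
    delta_absorbance = -np.log(np.asarray(i_now, dtype=float) / np.asarray(i_initial, dtype=float))
    # Solve (EPSILON * PATH_LENGTH) @ [dHbO2, dHb] = delta_absorbance.
    d_hbo2, d_hb = np.linalg.solve(EPSILON * PATH_LENGTH, delta_absorbance)
    return d_hbo2, d_hb
```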
  • the light source 20 can be designed in consideration of the influence on the user's retina.
  • the light source 20 is a laser light source such as a laser diode, and can satisfy class 1 of the laser safety standards established by various countries. If Class 1 is satisfied, the test area 12 is illuminated with light of such low intensity that the accessible emission limit (AEL) is less than 1 mW. Note that the light source 20 itself does not need to satisfy Class 1.
  • a diffuser plate or neutral density filter may be placed in front of the light source 20 to diffuse or attenuate the light so that class 1 laser safety standards are met.
  • the photodetector 30 detects at least a part of the rise period component of the reflected light pulse generated by the light pulse Ip being reflected by the part 12 to be inspected, and outputs a signal corresponding to the intensity thereof.
  • the signal includes surface information of the test part 12 .
  • the photodetector 30 detects at least a part of the falling period component of the reflected light pulse generated by the light pulse Ip being reflected by the part 12 to be inspected, and outputs a signal corresponding to the intensity thereof.
  • the signal includes internal information of the test part 12 .
  • the “rising period” of the light pulse refers to the period from when the intensity of the light pulse starts increasing to when it ends increasing on the photodetection surface of the photodetector device 30 .
  • the “falling period” of the light pulse refers to the period from when the intensity of the light pulse starts decreasing to when it ends decreasing on the photodetection surface of the photodetector 30 . More precisely, the “rising period” means the period from when the intensity of the light pulse exceeds a preset lower limit to when it reaches a preset upper limit.
  • the “falling period” means a period from when the intensity of the light pulse falls below a preset upper limit to when it reaches a preset lower limit.
  • the upper limit value can be set to a value that is, for example, 90% of the peak value of the intensity of the light pulse
  • the lower limit value can be set to a value that is, for example, 10% of the peak value.
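  • As a worked sketch of these definitions, the rising and falling periods of a sampled pulse waveform can be located from the 10% and 90% levels of the peak intensity. The function below is illustrative only and assumes a single, reasonably clean pulse.

```python
def rising_and_falling_periods(samples, lower_frac=0.1, upper_frac=0.9):
    """Estimate the rising and falling periods of a sampled light-pulse waveform.

    samples: intensity samples on the photodetection surface.
    Returns (rise_start, rise_end, fall_start, fall_end) as sample indices,
    or None if the pulse never crosses both limits.
    """
    peak = max(samples)
    lower, upper = lower_frac * peak, upper_frac * peak

    rise_start = next((i for i, v in enumerate(samples) if v > lower), None)
    rise_end = next((i for i, v in enumerate(samples) if v >= upper), None)
    fall_start = next((i for i in reversed(range(len(samples))) if samples[i] >= upper), None)
    fall_end = next((i for i in reversed(range(len(samples))) if samples[i] > lower), None)

    if None in (rise_start, rise_end, fall_start, fall_end):
        return None
    return rise_start, rise_end, fall_start, fall_end
```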
  • the photodetector device 30 may be equipped with an electronic shutter.
  • the electronic shutter is a circuit that controls imaging timing.
  • the electronic shutter controls one signal accumulation period during which the received light is converted into an effective electrical signal and accumulated, and a period during which the signal accumulation is stopped.
  • the signal accumulation period is also called an "exposure period”.
  • the width of the exposure period is also called “shutter width”.
  • the time from the end of one exposure period to the start of the next exposure period is also called a "non-exposure period”.
  • the photodetection device 30 can adjust the exposure period and the non-exposure period within a sub-nanosecond range, for example, 30 ps to 1 ns, using an electronic shutter.
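  • A quick calculation shows why shutter timing on this scale matters: the round-trip arrival time of the reflected pulse shifts by 2ΔZ/c when the distance between the photodetector and the living body changes by ΔZ, so a change of only 15 cm moves the pulse by about 1 ns, comparable to the shutter resolution quoted above. The snippet below is only this arithmetic, not anything taken from the patent.

```python
SPEED_OF_LIGHT_M_PER_S = 2.998e8

def arrival_shift_ns(distance_change_m):
    """Round-trip arrival-time shift caused by a change in the subject distance."""
    return 2.0 * distance_change_m / SPEED_OF_LIGHT_M_PER_S * 1e9

print(arrival_shift_ns(0.15))  # about 1.0 ns for a 15 cm change in distance
```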
  • a conventional TOF camera whose purpose is to measure distance detects all of the light emitted from the light source 20 and returned after being reflected by the subject.
  • Conventional TOF cameras require the shutter width to be greater than the light pulse width.
  • In the photodetection system 100 of this embodiment, by contrast, the shutter width need not be greater than the pulse width.
  • the shutter width can be set to a value of 1 ns or more and 30 ns or less, for example. According to the photodetection system 100 of this embodiment, since the shutter width can be reduced, the influence of dark current contained in the detection signal can be reduced.
  • the photodetection device 30 has one or more photodetection cells.
  • the photodetection device 30 may be an image sensor having a plurality of photodetection cells two-dimensionally arranged along a photodetection surface.
  • the image sensor can be any image sensor such as a CCD image sensor or a CMOS image sensor.
  • Each photodetector cell may comprise a photoelectric conversion element 32 , such as a photodiode, and one or more charge storages 34 .
  • the photodetector cells are also referred to as "pixels", and the intensity of light detected by the photodetector cells is also referred to as a "luminance value".
  • the above-described signal detected and output by the photodetector 30 is an image signal indicating luminance values of a plurality of pixels distributed two-dimensionally.
  • the image signal may contain imaged information, or may contain numerical information of luminance values for a plurality of pixels. The details of the configuration of the photodetector 30 will be described later.
  • Control circuitry 42 included in processor 40 controls the operation of light source 20 , photodetector 30 , and signal processor 44 .
  • the control circuit 42 adjusts the time difference between the emission timing of the light pulse Ip from the light source 20 and the shutter timing of the photodetector 30 .
  • the time difference is also called "phase difference”.
  • the “emission timing” of the light source 20 is the timing at which the light pulse emitted from the light source 20 starts rising.
  • “Shutter timing” is the timing to start exposure.
  • the control circuit 42 may adjust the phase difference by changing the emission timing, or may adjust the phase difference by changing the shutter timing.
  • the control circuit 42 may be configured to remove the offset component from the signal detected by each pixel of the photodetector 30 .
  • the offset component is a signal component due to environmental light such as sunlight or illumination light, or disturbance light.
  • a signal processing device 44 included in the processing device 40 generates and outputs data indicating biological information of the subject 12 of the living body 10 based on the signal output from the photodetector 30 .
  • the data includes surface information and/or internal information of the test part 12 .
  • the signal processing device 44 can also estimate the psychological state and/or physical state of the living body 10 based on the surface information and/or internal information of the subject 12 .
  • the signal processor 44 may generate and output data indicating the psychological and/or physical state of the living body 10 .
  • a psychological state can be, for example, a mood, an emotion, a state of health, or a temperature sensation.
  • Moods can include, for example, moods such as pleasant or unpleasant.
  • Emotions may include, for example, feelings of relief, anxiety, sadness, or resentment.
  • a state of health may include, for example, a state of well-being or fatigue.
  • Temperature sensations may include, for example, sensations of hot, cold, or muggy.
  • Indices representing the degree of brain activity, such as interest, proficiency, mastery, and concentration, can also be included in psychological states.
  • the physical condition can be, for example, the degree of fatigue, drowsiness, or drunkenness.
  • The control circuit 42 may be, for example, a combination of a processor and a memory, or an integrated circuit such as a microcontroller containing a processor and a memory.
  • the control circuit 42 executes a computer program recorded in the memory 46 by the processor, for example, to adjust the emission timing and the shutter timing, and cause the signal processing device 44 to perform signal processing.
  • The signal processor 44 can be realized by, for example, a combination of a computer program and a digital signal processor (DSP), a programmable logic device (PLD) such as a field programmable gate array (FPGA), or a central processing unit (CPU) or a graphics processing unit (GPU) for image processing. The signal processing device 44 executes signal processing when the processor executes a computer program recorded in the memory 46.
  • the signal processor 44 and the control circuit 42 may be one integrated circuit or separate individual circuits.
  • the signal processor 44 and control circuitry 42 are also collectively referred to herein as "processing circuitry.”
  • At least one of signal processor 44, control circuitry 42, and memory 46 may be components of an external device, such as a remotely located server. In this case, an external device such as a server exchanges data with the rest of the components via wireless or wired communication.
  • Hereinafter, the operation of the control circuit 42 and the operation of the signal processing device 44 are collectively described as the operation of the processing device 40.
  • the photodetection system 100 may include an imaging optical system that forms a two-dimensional image of the subject 12 on the photodetection surface of the photodetection device 30 .
  • the optical axis of the imaging optical system is substantially orthogonal to the photodetection surface of the photodetector 30 .
  • the imaging optics may include a zoom lens. When the position of the zoom lens changes, the magnification of the two-dimensional image of the living body 10 and its subject 12 changes, and the resolution of the two-dimensional image on the photodetector 30 changes. Therefore, even if the living body 10 is far away, it is possible to magnify the desired subject 12 and observe it in detail.
  • The photodetection system 100 may include, between the subject 12 and the photodetector 30, a band-pass filter that passes light in the wavelength band emitted from the light source 20 or light in a nearby band.
  • a band-pass filter can be constituted by a multilayer filter or an absorption filter, for example.
  • the bandwidth of the band-pass filter may have a width of about 20 nm or more and 100 nm or less.
  • the photodetection system 100 may include polarizing plates between the test section 12 and the light source 20 and between the test section 12 and the photodetector 30, respectively.
  • the polarization directions of the polarizing plate arranged on the light source 20 side and the polarizing plate arranged on the photodetector 30 side may have a crossed Nicols relationship.
  • the internal information is the internal scattering component I2 .
  • FIG. 3A is a diagram schematically showing an example of temporal changes of the surface reflection component I1 and the internal scattering component I2 included in the reflected light pulse when the light pulse Ip has an impulse waveform.
  • FIG. 3B is a diagram schematically showing an example of temporal changes in the surface reflection component I1 and the internal scattering component I2 included in the reflected light pulse when the light pulse Ip has a rectangular waveform.
  • In each of FIGS. 3A and 3B, the diagram on the left shows an example of the waveform of the light pulse Ip emitted from the light source 20, and the diagram on the right shows an example of the waveforms of the surface reflection component I1 and the internal scattering component I2 included in the reflected light pulse.
  • As shown in the right-hand diagram of FIG. 3A, the surface reflection component I1 has a waveform similar to that of the light pulse Ip, whereas the internal scattering component I2 has an impulse-response-like waveform that lags behind the surface reflection component I1. This is because the internal scattering component I2 corresponds to a combination of light rays that have passed through various paths within the subject 12.
  • As shown in the right-hand diagram of FIG. 3B, when the light pulse Ip has a rectangular waveform, the surface reflection component I1 again has a waveform similar to that of the light pulse Ip, while the internal scattering component I2 has a waveform in which a plurality of impulse response waveforms are superimposed.
  • The present inventors confirmed that, by superimposing a plurality of impulse response waveforms, the amount of light of the internal scattering component I2 detected by the photodetector 30 can be amplified compared to the case where the light pulse Ip has an impulse waveform.
  • the internally scattered component I2 can be effectively detected.
  • The right-hand diagram of FIG. 3B also indicates an example of the exposure period during which the electronic shutter of the photodetector 30 is open. If the pulse width of the rectangular pulse is on the order of 1 ns to 10 ns, the light source 20 can be driven with a low voltage. Therefore, it is possible to reduce the size and cost of the photodetection system 100 in this embodiment.
  • streak cameras have been used to distinguish and detect information such as light absorption coefficients or light scattering coefficients at different locations in the depth direction inside a living body.
  • JP-A-4-189349 discloses an example of such a streak camera.
  • These streak cameras use ultrashort light pulses with femtosecond or picosecond pulse widths to measure at the desired spatial resolution.
  • In the photodetection system 100 of this embodiment, the surface reflection component I1 and the internal scattering component I2 can be detected separately. Therefore, the light pulse emitted from the light source 20 does not have to be an ultra-short light pulse, and the pulse width can be arbitrarily selected.
  • The amount of light of the internal scattering component I2 is extremely small, approximately one to several ten-thousandths of the amount of light of the surface reflection component I1. Furthermore, considering the laser safety standards, the amount of light that can be emitted is extremely small, so detection of the internal scattering component I2 becomes very difficult. Even in such a case, if the light source 20 emits a light pulse Ip with a relatively large pulse width, it is possible to increase the integrated amount of the internal scattering component I2 that arrives with a time delay. As a result, the amount of detected light can be increased and the SN ratio can be improved.
  • the light source 20 can emit a light pulse Ip with a pulse width of 3 ns or more, for example.
  • the light source 20 may emit a light pulse Ip with a pulse width of 5 ns or more, or 10 ns or more.
  • the light source 20 can emit an optical pulse Ip with a pulse width of 50 ns or less, for example.
  • the light source 20 may emit an optical pulse Ip with a pulse width of 30 ns or less, or even 20 ns or less. If the pulse width of the rectangular pulse is several ns to several tens of ns, the light source 20 can be driven at a low voltage. Therefore, it is possible to reduce the cost of the photodetection system 100 in this embodiment.
  • the irradiation pattern of the light source 20 may be, for example, a pattern having a uniform intensity distribution within the irradiation area.
  • the photodetection system 100 of this embodiment differs from the conventional device disclosed in, for example, Japanese Patent Application Laid-Open No. 11-164826.
  • In that conventional device, the detector and the light source are separated by about 3 cm so that the surface reflection component is spatially separated from the internal scattering component, and the irradiation therefore has to be a spatially limited pattern.
  • In the photodetection system 100 of this embodiment, by contrast, the surface reflection component I1 can be temporally separated from the internal scattering component I2 and reduced. Therefore, the light source 20 having an irradiation pattern with a uniform intensity distribution can be used.
  • An irradiation pattern having a uniform intensity distribution may be formed by diffusing the light emitted from the light source 20 with a diffusion plate.
  • the internal scattering component I2 can be detected even just below the irradiation point of the subject 12 .
  • the measurement resolution can be increased.
  • The configuration of the photodetector 30, the operation of emitting the first light pulse Ip1 and the second light pulse Ip2, and the calculation of the amount of change from the initial values of the concentrations of HbO2 and Hb in the blood will be described later in detail.
  • FIG. 4 is a diagram showing the positional relationship between the photodetector 30 and the living body 10 in this embodiment.
  • the measured distance Z 0 between the photodetector 30 and the living body 10 is the average value of the distances between the photodetection surface of the photodetector 30 and the surface of the test site 12 of the living body 10 .
  • four measurement ranges R1-R4 are defined.
  • the measurement range R1 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D11 and less than or equal to D12 .
  • the measurement range R2 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D21 and less than or equal to D22 .
  • the measurement range R3 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D31 and less than or equal to D32 .
  • the measurement range R4 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D41 and less than or equal to D42 .
  • the measurement range R1 and the measurement range R2 are set so as to partially overlap each other.
  • the measurement range R2 and the measurement range R3 are set so as to partially overlap each other.
  • The measurement range R3 and the measurement range R4 are set so as to partially overlap each other. That is, the relationship D11 ≤ D21 ≤ D12 ≤ D31 ≤ D22 ≤ D41 ≤ D32 ≤ D42 is satisfied.
  • the distance D11 is a value greater than or equal to zero, and can be, for example, greater than or equal to 5 cm and less than or equal to 100 cm.
  • the distance width of each of the measurement ranges R1 to R4 can be, for example, 3 cm or more and 24 cm or less. With such a distance width, the measurement ranges R1 to R4 can be accurately distinguished.
  • the width of the range in which two adjacent measurement ranges partially overlap can be, for example, 0 cm or more and 8 cm or less.
  • the number of measurement ranges does not have to be four, it can be any number.
  • Exposure periods P1 to P4 are set for the measurement ranges R1 to R4, respectively.
  • the exposure period can be determined, for example, using an optical model that simulates changes in human cerebral blood flow.
  • In the optical model, the state of the human brain is divided into an inactive state, which simulates the absorption and scattering characteristics of the skin or brain at rest, and an active state, which simulates the absorption and scattering characteristics when brain activity is activated and cerebral blood flow increases.
  • The exposure period P1 is the exposure period during which the amount of change in luminance value between the inactive state and the active state is maximized when the optical model is placed at the center of the measurement range R1, that is, at a distance of (D11 + D12)/2.
  • The exposure start times differ among the exposure periods P1 to P4; the smaller the number, the earlier the exposure start time. The time difference between the exposure start time and the exposure end time is the same for all of the exposure periods P1 to P4.
  • the processing device 40 may further include a storage device (not shown), and the storage device may store data indicating the exposure periods P1 to P4 associated with the measurement ranges R1 to R4, respectively.
  • the processing device 40 acquires the data from the storage device as needed.
  • When the measured distance between the photodetector 30 and the living body 10 is within one of the measurement ranges, the processing device 40 selects the exposure period corresponding to that measurement range from the exposure periods P1 to P4 and acquires the biological information of the subject 12.
  • When the measured distance moves into another measurement range, the processing device 40 selects the other exposure period corresponding to that measurement range from the exposure periods P1 to P4 and acquires the biological information of the subject 12, as sketched below.
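  • A minimal sketch of this range-to-exposure-period mapping follows. The numeric bounds are assumptions chosen only to satisfy the overlap relation D11 ≤ D21 ≤ D12 ≤ D31 ≤ D22 ≤ D41 ≤ D32 ≤ D42 and the example widths given above; the closest-center rule used when more than one range contains the distance anticipates the selection described for step S108 below.

```python
# Illustrative measurement ranges in cm and their associated exposure periods.
MEASUREMENT_RANGES = {
    "P1": (20.0, 35.0),   # R1: D11 to D12
    "P2": (30.0, 45.0),   # R2: D21 to D22
    "P3": (40.0, 55.0),   # R3: D31 to D32
    "P4": (50.0, 65.0),   # R4: D41 to D42
}

def select_exposure_period(measured_distance_cm, current_period=None):
    """Pick the exposure period whose measurement range contains the measured distance.

    If the current period's range still contains the distance, it is kept; if the
    distance lies in an overlap of two ranges, the range whose center is closer
    is adopted; if it lies outside all ranges, the current period is kept.
    """
    if current_period is not None:
        low, high = MEASUREMENT_RANGES[current_period]
        if low <= measured_distance_cm <= high:
            return current_period
    candidates = [
        (abs(measured_distance_cm - (low + high) / 2.0), period)
        for period, (low, high) in MEASUREMENT_RANGES.items()
        if low <= measured_distance_cm <= high
    ]
    return min(candidates)[1] if candidates else current_period
```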
  • FIG. 5 is a flow chart schematically showing an example of correction operation performed by the processing device 40 in this embodiment when the measured distance Z0 between the photodetector 30 and the living body 10 changes.
  • the processing device 40 executes the operations of steps S101 to S108 shown in FIG.
  • the photodetector 30 is an image sensor having a plurality of pixels distributed two-dimensionally.
  • the biological information acquired by the processing device 40 is cerebral blood flow information.
  • <Step S101> The processing device 40 acquires luminance image data as follows.
  • the processing device 40 causes the light source 20 to emit a light pulse Ip for irradiating the subject 12 of the living body 10 .
  • The processing device 40 causes the photodetector 30 to detect, during the currently set exposure period, the reflected light pulse generated by the light pulse Ip being reflected by the part 12 to be inspected, and to generate and output luminance image data.
  • More specifically, the processing device 40 causes the photodetector 30 to start detecting the reflected light pulse when the first time elapses after the light pulse Ip is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time has elapsed from the start of the detection.
  • the predetermined exposure time is the duration of the currently set exposure period.
  • the processor 40 causes the photodetector 30 to send the luminance image data to the processor 40 .
  • starting detection of the reflected light pulse means causing the charge accumulation unit to start accumulating charges corresponding to components of the reflected light pulse.
  • Termination of detection of the reflected light pulse means to cause the charge storage unit to terminate the storage of the charge corresponding to the component of the reflected light pulse.
  • <Step S102> The processing device 40 acquires distance data regarding the distance between the photodetector 30 and the living body 10 based on the luminance image data as follows. Calibration is performed before measuring the distance between the photodetector 30 and the living body 10. In the calibration, reflected light pulses generated by irradiating an object positioned in the measurement ranges R1 to R4 with light pulses are detected in the exposure periods P1 to P4, respectively.
  • the object is, for example, a flat plate, and the absorption and scattering coefficients of the flat plate may be close to the absorption and scattering coefficients of humans, respectively.
  • By changing the distance Z, data indicating the relationship between the distance Z and the luminance value I at each position (X, Y) in the luminance image are obtained. From these data, the function represented by Equation (1) is obtained. Interpolation and extrapolation may be applied between data points, and various regression methods may be applied. This data acquisition may be performed each time the cerebral blood flow of the living body 10 is measured, or may be performed only once at the beginning.
  • the function represented by Equation (1) may be stored in the above-described storage device in table format.
  • the processing device 40 can calculate the distance Z at each pixel by inputting the brightness image data acquired in step S101 into the function represented by Equation (1).
  • the processing device 40 detects, for example, the position of the face of the living body 10 in the image, and extracts the forehead from the relative positional relationship with the position of the face.
  • the processing device 40 calculates the average value of the distance Z at the forehead as the measured distance Z0 .
  • the processing unit 40 generates data regarding the measured distance Z0 based on the intensity of the reflected light pulse detected by each of the plurality of pixels.
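  • A sketch of this per-pixel distance estimate follows. It assumes that Equation (1), which is referenced but not reproduced in this text, maps a pixel's luminance to a distance via the calibration data, and it uses simple 1-D interpolation as a stand-in for the interpolation or regression mentioned above.

```python
import numpy as np

def measured_distance_z0(luminance_image, calib_distances, calib_luminances, forehead_mask):
    """Sketch of step S102: per-pixel distance from luminance, averaged over the forehead.

    luminance_image: 2-D array of luminance values from step S101.
    calib_distances: distances Z used during calibration, in increasing order.
    calib_luminances: calibration luminances, shape (len(calib_distances), H, W).
    forehead_mask: boolean 2-D array marking the pixels extracted as the forehead.
    """
    h, w = luminance_image.shape
    distance_map = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            # Luminance decreases as distance increases, so reverse both axes to
            # give np.interp a monotonically increasing abscissa.
            lum_curve = calib_luminances[:, y, x]
            distance_map[y, x] = np.interp(
                luminance_image[y, x], lum_curve[::-1], np.asarray(calib_distances)[::-1]
            )
    # The measured distance Z0 is the average over the forehead pixels.
    return distance_map[forehead_mask].mean()
```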
  • <Step S103> The processing device 40 determines whether or not the measured distance Z0 is within the measurement range corresponding to the currently set exposure period. In this specification, that measurement range is called the "first range". If the determination is Yes, the processing device 40 performs the operation of step S104. If the determination is No, the processing device 40 performs the operation of step S105.
  • <Step S104> If the determination in step S103 is Yes, the processing device 40 generates and outputs cerebral blood flow data indicating the amount of change in cerebral blood flow based on the luminance image data acquired in step S101.
  • the cerebral blood flow data can also be said to be brain activity data relating to brain activity.
  • <Step S105> The point in time when the measured distance Z0 deviates from the first range is defined as the first time point.
  • The processing device 40 determines whether or not the amount of variation in the measured distance Z0 is equal to or less than a predetermined threshold during a predetermined determination period after the first time point. In other words, the processing device 40 determines whether or not the change over time of the measured distance Z0 falls within a predetermined range during the determination period.
  • the amount of variation in distance may be, for example, a reference value determined based on at least one of the variation width, variance, and standard deviation of the variation over time in distance within the determination period.
  • The duration of the non-determination period from the first time point until the determination starts can be, for example, 1 second or more and 10 seconds or less.
  • the time width of the determination period can be, for example, 1 second or more and 10 seconds or less.
  • The predetermined threshold may be, for example, 1 mm or more and 30 mm or less. If the determination is No, the processing device 40 performs the operation of step S106; the non-determination period is also treated as a No determination. If the determination is Yes, the processing device 40 performs the operation of step S107.
  • The point in time at which the amount of variation in the measured distance Z0 becomes equal to or less than the predetermined threshold is defined as the second time point.
  • <Step S107> The processing device 40 initializes the cerebral blood flow value and resets the initial value of the cerebral blood flow to the cerebral blood flow value at the second time point.
  • <Step S108> If the measured distance Z0 at the second time point is outside the first range, the measured distance Z0 at the second time point is in a measurement range different from the first range.
  • a measurement range different from the first range is referred to as a "second range”.
  • The processing device 40 resets the exposure period corresponding to the first range to the exposure period corresponding to the second range. In other words, the processing device 40 changes the first time, which elapses from the emission of the light pulse Ip to the start of detection of the reflected light pulse in step S101, to a second time different from the first time.
  • When selecting the new measurement range, the processing device 40 adopts the measurement range whose center is closer to the measured distance Z0. On the other hand, when the measured distance Z0 at the second time point is again within the first range, the processing device 40 maintains the exposure period corresponding to the first range. As described above, the processing device 40 determines whether or not to change the first time based on the measured distance Z0 at the second time point.
  • the processing device 40 repeatedly executes the operations of steps S101 to S108 for each frame until the measurement of cerebral blood flow is completed.
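  • The generator below is a simplified, non-authoritative sketch of this per-frame loop (steps S101 to S108). Hardware-specific behaviour is injected through the callables get_frame, measuring, change_is_settled and select_range_and_period, which are placeholders rather than names from the patent; the stability check and range selection sketched earlier could serve as the last two.

```python
def correction_loop(get_frame, measuring, change_is_settled, select_range_and_period,
                    first_range, exposure_period):
    """Yield the amount of change in cerebral blood flow, frame by frame.

    get_frame(exposure_period)  -> (z0, cbf)  # steps S101-S102 plus the CBF value
    measuring()                 -> bool       # measurement still running?
    change_is_settled(samples)  -> bool       # the step S105 criterion
    select_range_and_period(z0) -> (measurement range, exposure period)  # step S108
    """
    cbf_initial = None
    samples_after_deviation = []   # distances collected after the first time point
    while measuring():
        z0, cbf = get_frame(exposure_period)                      # S101, S102
        if cbf_initial is None:
            cbf_initial = cbf                                     # initial value at start of measurement
        if not samples_after_deviation and first_range[0] <= z0 <= first_range[1]:  # S103
            yield cbf - cbf_initial                               # S104
            continue
        samples_after_deviation.append(z0)                        # the first time point has passed
        yield cbf - cbf_initial                                   # S106: keep outputting while moving
        if change_is_settled(samples_after_deviation):            # S105
            cbf_initial = cbf                                     # S107: reset at the second time point
            first_range, exposure_period = select_range_and_period(z0)  # S108 (may keep the same)
            samples_after_deviation = []
```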
  • FIG. 6A is a diagram for explaining an example in which the correcting operation of the processing device in this embodiment is applied when the measurement distance between the photodetector 30 and the living body 10 becomes short.
  • (a) of FIG. 6A shows the time change of the measured distance Z0 .
  • (b) of FIG. 6A shows the relationship between the exposure period and time.
  • (c) of FIG. 6A shows temporal changes in the amount of change in cerebral blood flow when the living body 10 is at rest.
  • the amount of change in cerebral blood flow is a value obtained by averaging the amount of change in oxygenated hemoglobin in the subject 12 .
  • the sum of the amount of change in oxygenated hemoglobin and the amount of change in deoxygenated hemoglobin in the test area 12 may be used.
  • the initial measurement distance Z0 is in the measurement range R2. If the measured distance Z0 is shorter than D21 , the determination in step S103 is No. The time when the measured distance Z0 becomes shorter than D21 is the first time. While the measured distance Z0 is changing, the output of the cerebral blood flow data at the first time in step S106 continues, as shown in FIG. 6A(c). If the amount of variation in the measured distance Z0 is less than or equal to the threshold during the hatched determination period, the determination in step S105 is Yes. The second time is the time when the amount of variation in the measured distance Z0 becomes equal to or less than the threshold. In step S107, the cerebral blood flow value at the second time becomes the initial value of cerebral blood flow. In step S108, as shown in (b) of FIG. 6A, the exposure period P2 is reset to the exposure period P1. After the second time, the output of the cerebral blood flow data in step S104 is continued until the determination in step S103 becomes No.
  • FIG. 6B is a diagram for explaining an example in which the correcting operation of the processing device in this embodiment is applied when the measurement distance between the photodetector 30 and the living body 10 decreases and then increases.
  • (a) to (c) of FIG. 6B correspond to (a) to (c) of FIG. 6A, respectively.
  • The time at which the measured distance Z0 deviates from the measurement range R2 is the first time. The measured distance Z0 then increases and becomes longer than D12, and the measured distance Z0 at the second time is in the measurement range R2.
  • In step S107, the cerebral blood flow value at the second time becomes the initial value of cerebral blood flow.
  • In step S108, as shown in (b) of FIG. 6B, the exposure period is maintained as the exposure period P2.
  • After the second time, the output of the cerebral blood flow data in step S104 is continued until the determination in step S103 becomes No.
  • the correction operation performed by the processing device 40 in this embodiment can suppress the influence of the body movement of the living body 10 on the cerebral blood flow data, making it possible to acquire the cerebral blood flow data of the living body 10 stably without contact.
  • the measured distance Z0 is calculated based on the luminance image data. If the measured distance Z0 is instead calculated based on distance image data, the measured distance Z0 can be measured with higher accuracy. As a result, the initial value of cerebral blood flow and the exposure period can be reset with more accurate timing.
  • exposure periods P1A to P4A, each including the rising period of the reflected light pulse, and exposure periods P1B to P4B, each including the falling period of the reflected light pulse, are set for the measurement ranges R1 to R4, respectively. That is, three types of exposure periods are set for each of the measurement ranges R1 to R4. Detection of the reflected light pulse by the three types of exposure periods may be performed for each different pixel, may be performed for each different charge accumulation portion of the same pixel, may be performed by switching the exposure period for each time, or may be performed by a combination of these.
  • FIG. 7 is a diagram for explaining a method of generating distance image data in this embodiment.
  • FIG. 7(a) shows the temporal change of the light emission pulse.
  • (b) of FIG. 7 shows the temporal change of the reflected light pulse.
  • (c) of FIG. 7 shows the relationship between the exposure periods P1 to P4 and time.
  • (d) of FIG. 7 shows the relationship between the exposure periods P1A to P4A and time.
  • (e) of FIG. 7 shows the relationship between the exposure periods P1B to P4B and time.
  • the signal indicating the amount of charge accumulated in the exposure periods P1 to P4 is assumed to be S0 .
  • Signal S0 contains cerebral blood flow information.
  • the processor 40 generates and outputs luminance image data based on the signal S0 .
  • the time width of the light emission pulse and the reflected light pulse is set to T0 .
  • the time from when the light source 20 starts emitting light pulses to when the exposure periods P1A to P4A end is tA .
  • the time from when the light source 20 starts emitting light pulses to when the exposure periods P1B to P4B start is tB .
  • Let SA be a signal indicating the amount of charge accumulated in the exposure periods P1A to P4A, and let SB be a signal indicating the amount of charge accumulated in the exposure periods P1B to P4B.
  • the intensity of each of the signals SA and SB varies with the distance Z.
  • Assuming that the speed of light in air is c (≈ 3.0 × 10⁸ m/s), the distance Z can be calculated using equation (2).
  • the processing device 40 generates and outputs distance image data based on Equation (2). Details of the method of generating distance image data based on Equation (2) are disclosed in Japanese Patent Application No. 2021-012027 (filed on January 28, 2021). For reference, the entire disclosure of Japanese Patent Application No. 2021-012027 is incorporated herein by reference.
  • the processing device 40 acquires range image data in addition to luminance image data for acquiring cerebral blood flow data, and calculates the measured distance Z0 based on the range image data. Since the measurement distance Z0 can be accurately measured, the initial value of the cerebral blood flow and the exposure period can be reset with more accurate timing.
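Equation (2) itself is not reproduced in this text. As an illustration only, the sketch below uses a common indirect time-of-flight formulation that is consistent with the quantities defined above (the pulse width T0, the start time tB of the later windows, and the signals SA and SB); it is an assumed model, not necessarily the patent's equation (2):

```python
C_AIR = 3.0e8  # speed of light in air (m/s)

def tof_distance(s_a: float, s_b: float, t_b: float, pulse_width: float) -> float:
    """Estimate the distance Z from the two charge signals.

    Assumed model: a rectangular reflected pulse of width `pulse_width` straddles
    the boundary between the early window (signal s_a) and the late window starting
    at t_b (signal s_b), so the ratio s_b / (s_a + s_b) locates the arrival time of
    the pulse's leading edge within the pulse width.
    """
    if s_a + s_b == 0:
        raise ValueError("no detected light")
    round_trip = t_b - pulse_width + pulse_width * s_b / (s_a + s_b)
    return C_AIR * round_trip / 2.0

# Example: late windows start 6.0 ns after emission, pulse width 3.0 ns,
# equal charge in both windows -> distance of about 0.675 m.
print(tof_distance(1.0, 1.0, 6.0e-9, 3.0e-9))
```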
  • FIG. 8 is a flowchart schematically showing another example of the correction operation performed by the processing device 40 when the measured distance Z0 between the photodetector 30 and the living body 10 changes.
  • the processing device 40 executes the operations of steps S201 to S212 shown in FIG. 8. In the following, the points different from the correction operation shown in FIG. 5 will be mainly described.
  • the processing device 40 acquires luminance image data and distance image data.
  • the processing device 40 extracts the forehead, which is the subject 12 of the living body 10, from the distance image indicated by the distance image data, and converts the distance values of the pixels that do not correspond to the forehead into missing values. This processing can prevent the distance values of the pixels that do not correspond to the forehead from affecting the calculation of the measured distance Z0 .
  • Extraction of the forehead can be performed, for example, as follows. A distance is estimated for each pixel, and a portion of the distance image that includes pixels whose estimated distance falls within a certain range is extracted as the forehead. Alternatively, the forehead is extracted by face detection processing using an image showing the appearance of the living body 10. These two methods may also be combined.
  • if the living body 10 moves during measurement, a pixel near the edge of the forehead may come to correspond to a region other than the forehead, and the distance value for that pixel should then be treated as a missing value. Therefore, the distance values for the pixels corresponding to the peripheral region of the forehead may be converted to missing values.
  • the peripheral region of the forehead may be, for example, a region 0 pixels or more and 5 pixels or less inward from the edge of the forehead.
  • Converting the distance values of the pixels 0 pixels from the edge of the forehead into missing values means converting the distance values of the pixels corresponding to the edge into missing values. Converting the distance values of the pixels located in the region n pixels (n is an integer from 1 to 5) inward from the edge of the forehead into missing values means repeating the following operation n times: convert the distance values of the pixels corresponding to the edge into missing values, re-detect the edge from the remaining forehead region, and convert the distance values of the pixels corresponding to the re-detected edge into missing values.
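One reasonable reading of the edge-removal procedure above is the following NumPy sketch; the 4-neighbour edge definition and the exact handling of n are assumptions, since the text does not fix them. The remaining (non-missing) distance values can then be averaged to obtain the measured distance Z0, as described below for step S206.

```python
import numpy as np

def mask_forehead_periphery(depth: np.ndarray, forehead: np.ndarray, n: int) -> np.ndarray:
    """Set distance values to NaN (missing) for pixels within n pixels of the
    forehead edge, by repeatedly detecting and removing the current edge.

    depth: 2-D array of distance values; forehead: boolean mask of forehead pixels.
    """
    depth = depth.astype(float).copy()
    mask = forehead.copy()
    depth[~mask] = np.nan                       # non-forehead pixels are missing
    for _ in range(n + 1):                      # n = 0 removes just the edge itself
        # a forehead pixel is an edge pixel if any 4-neighbour lies outside the mask
        padded = np.pad(mask, 1, constant_values=False)
        interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                    padded[1:-1, :-2] & padded[1:-1, 2:])
        edge = mask & ~interior
        depth[edge] = np.nan
        mask &= ~edge                           # re-detect the edge on the next pass
    return depth

def measured_distance(depth_masked: np.ndarray) -> float:
    """Average distance over the remaining forehead pixels (ignoring missing values)."""
    return float(np.nanmean(depth_masked))
```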
  • Step S203 The processing device 40 determines whether or not the number of pixels showing missing values is equal to or less than a predetermined threshold.
  • the threshold can be determined, for example, according to the size of the image of the living body 10 in the image. If the determination is No, the processing device 40 performs the operation of step S204. If the determination is Yes, the processing device 40 performs the operation of step S206.
  • If the determination in step S203 is No, that is, if the number of missing-value pixels exceeds the threshold, the measured distance Z0 is not within the measurement range corresponding to the currently set three types of exposure periods. If the measured distance Z0 is longer than the upper limit of the measurement range, the reflected light pulse reaches the photodetector 30 later, so the amount of light detected during the exposure periods P1A to P4A relatively decreases, while the amount of light detected during the exposure periods P1B to P4B relatively increases. Conversely, when the measured distance Z0 is shorter than the lower limit of the measurement range, the reflected light pulse reaches the photodetector 30 earlier, so the amount of light detected during the exposure periods P1A to P4A relatively increases, while the amount of light detected during the exposure periods P1B to P4B relatively decreases. The accuracy of the distance data can be degraded by such an increase or decrease in the amount of detected light.
  • the processing device 40 resets the initial value of the cerebral blood flow in step S204, and resets the three exposure periods in step S205.
  • the processing device 40 may search the three types of exposure periods corresponding to the measurement ranges by switching in order of the measurement ranges R1 to R4.
  • the measurement ranges R1 to R4 may be switched in order from the measurement range closest to the measurement range corresponding to the measurement distance Z0 . By such switching, it is possible to efficiently search for three types of exposure periods corresponding to appropriate measurement ranges.
  • the processing device 40 executes the operations of steps S201 to S203 each time switching is performed. If the three switched exposure periods are appropriate, the determination in step S203 is Yes. If the three switched exposure periods are not appropriate, the determination in step S203 is No.
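The search by switching measurement ranges can be sketched as follows. The helpers capture_frame and count_missing are hypothetical callbacks standing in for the frame acquisition and for the missing-value count of step S203; they are not part of the described system:

```python
def search_measurement_range(ranges, start_range, capture_frame, count_missing, max_missing):
    """Try measurement ranges in order of proximity to the starting range until a
    frame is captured whose number of missing-value pixels is acceptable.

    ranges: ordered list of range names, e.g. ["R1", "R2", "R3", "R4"].
    capture_frame(name) -> distance image captured with that range's three exposure periods.
    count_missing(image) -> number of missing-value pixels (both are hypothetical helpers).
    """
    start = ranges.index(start_range)
    # candidates sorted by how far (in range indices) they are from the starting range
    candidates = sorted(ranges, key=lambda name: abs(ranges.index(name) - start))
    for name in candidates:
        image = capture_frame(name)
        if count_missing(image) <= max_missing:   # determination in step S203 is Yes
            return name, image
    raise RuntimeError("no measurement range gave an acceptable frame")
```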
  • the processing device 40 acquires distance data from the distance image data.
  • the processing device 40 calculates the average value of the distances on the forehead as the measured distance Z0 .
  • the processing unit 40 generates data regarding the measured distance Z0 based on the intensity of the reflected light pulse detected by each of the plurality of pixels.
  • Steps S207 to S212 The operations from steps S207 to S212 are the same as the operations from steps S103 to S108 shown in FIG.
  • the processing device 40 repeatedly executes the operations of steps S201 to S212 for each frame until the measurement of cerebral blood flow is completed. As a result, the accuracy of the measured distance Z0 can be improved, and the cerebral blood flow data of the living body 10 can be acquired more stably without contact.
  • the living body 10 can move forward and backward, move horizontally, and tilt.
  • An example of the correction operation performed by the processing device 40 according to the present embodiment when the living body 10 makes such movements will be described below.
  • in step S206, the processing device 40 expresses the distance indicated by the distance data as the sum of the initial distance before the body movement of the living body 10 and the deviation ΔZ from that initial distance.
  • the processing device 40 processes the range images before and after the body motion of the living body 10 using a known ICP algorithm, and aligns the range images before and after the body motion using the rotation matrix and the translation vector.
  • the processing device 40 sets the average value of the distances at the forehead before body movement of the living body 10 as the initial distance, and sets the component of the translation vector in the front-rear direction as the distance ⁇ Z.
  • a rotation matrix and a translation vector that define the body motion of the living body 10 can be obtained by the ICP algorithm.
  • the processing device 40 may make the determinations in steps S207 and S209 based not only on the distance but also on the amount of horizontal body movement and/or the amount of face rotation.
  • the amount of body motion in the horizontal direction and the amount of rotation of the face can be known from the above translation vector and rotation matrix, respectively. As described above, fluctuations in the cerebral blood flow data due to body movements other than the back and forth movement of the living body 10 can be suppressed, and the cerebral blood flow data can be stably acquired.
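The text only says that "a known ICP algorithm" is used. As one possible illustration, Open3D's point-to-point ICP could be applied to the forehead point clouds before and after the body motion, with the front-rear component of the resulting translation vector taken as ΔZ; the library choice, the axis convention, and the correspondence distance are assumptions:

```python
import numpy as np
import open3d as o3d  # one possible ICP implementation; the patent does not name a library

def body_motion_from_depth(points_before: np.ndarray, points_after: np.ndarray,
                           max_corr_dist: float = 0.02):
    """Align the forehead point clouds before/after body motion with ICP and
    return (delta_z, horizontal_shift, rotation_matrix).

    points_before / points_after: (N, 3) arrays of 3-D points (metres), with the
    z axis taken as the front-rear direction of the photodetector.
    """
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points_after))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points_before))
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    T = result.transformation
    rotation = T[:3, :3]                                  # face-rotation amount
    translation = T[:3, 3]
    delta_z = translation[2]                              # front-rear component -> ΔZ
    horizontal_shift = np.linalg.norm(translation[:2])    # horizontal body-motion amount
    return delta_z, horizontal_shift, rotation
```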
  • the photodetection system 100 may further include, in addition to the photodetection device 30, another photodetection device for measuring the distance between the photodetection device 30 and the living body 10.
  • the other photodetection device can be, for example, an RGB camera, an omnidirectional camera, or a ToF camera.
  • with the RGB camera, the distance can be calculated from the magnification/reduction ratio of the image of the living body 10 in the acquired image.
  • with an omnidirectional camera, the distance can be calculated by attaching the camera directly above each of the photodetector 30 and the living body 10, for example.
  • with the ToF camera, the distance can be obtained from its distance measurement result.
  • the other photodetector can measure the body motion of the living body 10 from a plurality of viewpoints, and the distance between the photodetector 30 and the living body 10 can be measured more accurately.
  • whether or not the correction operation of the processing device 40 according to this embodiment is being executed can be known by checking the time change of the voltage output by the processing device.
  • the output voltage is zero during the non-exposure period, and becomes non-zero during the exposure period. That is, a voltage pulse corresponding to the exposure period appears in the temporal change of the output voltage.
  • if the living body 10 moves as shown in (a) of FIG. 6A, the voltage pulse shifts along the time axis when the exposure period is reset. If the living body 10 moves as shown in (a) of FIG. 6B, or if the living body 10 does not move, the exposure period is not reset, so no shift of the voltage pulse occurs.
  • whether or not the voltage pulse shifts depends on the body motion of the living body 10 .
  • the items are the configuration of the photodetector 30, the operation of emitting the first light pulse Ip1 and the second light pulse Ip2, the operation of the processing device 40, and the calculation of the amounts of change from the initial values of the concentrations of HbO2 and Hb in blood.
  • FIG. 9 is a diagram showing an example of the configuration of the photodetector 30.
  • a region surrounded by a two-dot chain line frame corresponds to one pixel 201 .
  • Pixel 201 includes one photodiode, not shown. Although eight pixels arranged in two rows and four columns are shown in FIG. 9, more pixels may actually be arranged.
  • Each pixel 201 includes a first floating diffusion layer 204 and a second floating diffusion layer 206 .
  • the wavelength of the first optical pulse Ip1 is 650 nm or more and shorter than 805 nm
  • the wavelength of the second optical pulse Ip2 is longer than 805 nm and 950 nm or less.
  • the first floating diffusion layer 204 accumulates charges generated by receiving the first reflected light pulse from the first light pulse Ip1 .
  • the second floating diffusion layer 206 accumulates charges generated by receiving the second reflected light pulse from the second light pulse Ip2 .
  • the signals accumulated in the first floating diffusion layer 204 and the second floating diffusion layer 206 are treated as if they were two pixel signals of a general CMOS image sensor, and output from the photodetector 30 .
  • Each pixel 201 has four signal detection circuits.
  • Each signal detection circuit includes a source follower transistor 309 , a row select transistor 308 and a reset transistor 310 .
  • Each transistor is, for example, a field effect transistor formed on a semiconductor substrate, but is not limited to this.
  • one of the input and output terminals of source follower transistor 309 is connected to one of the input and output terminals of row select transistor 308 .
  • the one of the input and output terminals of source follower transistor 309 is typically the source.
  • the one of the input and output terminals of row select transistor 308 is typically the drain.
  • the gate, which is the control terminal of the source follower transistor 309, is connected to the photodiode. Signal charges of holes or electrons generated by the photodiode are accumulated in a floating diffusion layer, which is a charge accumulation part between the photodiode and the source follower transistor 309.
  • the first floating diffusion layer 204 and the second floating diffusion layer 206 are connected to photodiodes.
  • a switch may be provided between the photodiode and each of the first floating diffusion layer 204 and the second floating diffusion layer 206 . This switch switches the conduction state between the photodiode and each of the first floating diffusion layer 204 and the second floating diffusion layer 206 in response to the signal accumulation pulse from the processing device 40 . This controls the start and stop of signal charge accumulation in each of the first floating diffusion layer 204 and the second floating diffusion layer 206 .
  • the electronic shutter in this embodiment has a mechanism for such exposure control.
  • the signal charges accumulated in the first floating diffusion layer 204 and the second floating diffusion layer 206 are read out by turning on the gate of the row selection transistor 308 by the row selection circuit 302 .
  • the current flowing from the source follower power supply 305 to the source follower transistor 309 and the source follower load 306 is amplified according to the signal potential of the first floating diffusion layer 204 and the second floating diffusion layer 206 .
  • An analog signal based on this current read out from the vertical signal line 304 is converted into digital signal data by an analog-digital (AD) conversion circuit 307 connected for each column. This digital signal data is read column by column by the column selection circuit 303 and output from the photodetector 30 .
  • the row selection circuit 302 and column selection circuit 303 After reading one row, the row selection circuit 302 and column selection circuit 303 read out the next row, and so on, to read the signal charge information of the floating diffusion layers of all the rows. After reading all the signal charges, the processing device 40 resets all the floating diffusion layers by turning on the gate of the reset transistor 310 . This completes imaging of one frame. Similarly, by repeating high-speed imaging of frames, a series of frame imaging by the photodetector 30 is completed.
  • the photodetector 30 may be another type of imaging device.
  • the photodetector 30 may be, for example, a CCD type, a single photon counting device, or an intensifying image sensor such as an EMCCD or an ICCD.
  • FIG. 10A is a diagram showing an example of the operation of emitting the first optical pulse Ip1 and the second optical pulse Ip2 .
  • the emission of the first optical pulse Ip1 and the emission of the second optical pulse Ip2 may be alternately switched multiple times. As a result, the time difference between the acquisition timings of the detection images at the two wavelengths can be reduced, and imaging using the first optical pulse Ip1 and the second optical pulse Ip2 can be performed almost simultaneously even when the subject 12 is moving.
  • FIG. 10B is a diagram showing another example of the operation of emitting the first optical pulse Ip1 and the second optical pulse Ip2 .
  • the emission of the first optical pulse Ip1 and the emission of the second optical pulse Ip2 may be switched for each frame.
  • detection of the first reflected light pulse by the first light pulse Ip1 and detection of the second reflected light pulse by the second light pulse Ip2 can be switched for each frame.
  • each pixel 201 may have a single charge reservoir. With such a configuration, the number of charge storage units in each pixel 201 can be reduced, so the size of each pixel 201 can be increased, and the sensitivity can be improved.
  • FIG. 11 is a flow chart outlining the operation of the processor 40 with respect to the light source 20 and the photodetector 30.
  • the processor 40 causes the photodetector 30 to detect at least part of the fall period components of each of the first and second reflected light pulses by performing the operation schematically shown in FIG.
  • In step S301, the processing device 40 causes the first light source 20a to emit the first light pulse Ip1 for a predetermined time. At this time, the electronic shutter of the photodetector 30 is in a state of stopping exposure. The processing device 40 keeps the electronic shutter from exposing during the period in which the surface reflection component I1 of the first reflected light pulse reaches the photodetector 30.
  • In step S302, the processing device 40 causes the electronic shutter to start exposure at the timing when the internal scattering component I2 of the first reflected light pulse reaches the photodetector 30.
  • In step S303, the processing device 40 causes the electronic shutter to stop exposure after a predetermined time has elapsed.
  • By steps S302 and S303, signal charges are accumulated in the first floating diffusion layer 204 shown in FIG. 9.
  • the signal charges are called "first signal charges".
  • In step S304, the processing device 40 causes the second light source 20b to emit the second light pulse Ip2 for a predetermined time. At this time, the electronic shutter of the photodetector 30 is in a state of stopping exposure. The processing device 40 keeps the electronic shutter from exposing during the period in which the surface reflection component I1 of the second reflected light pulse reaches the photodetector 30.
  • In step S305, the processing device 40 causes the electronic shutter to start exposure at the timing when the internal scattering component I2 of the second reflected light pulse reaches the photodetector 30.
  • In step S306, the processing device 40 causes the electronic shutter to stop exposure after a predetermined time has elapsed.
  • By steps S305 and S306, signal charges are accumulated in the second floating diffusion layer 206 shown in FIG. 9. The signal charges are called "second signal charges".
  • step S307 the processing device 40 determines whether or not the number of times the above signal accumulation has been performed has reached a predetermined number. If the determination in step S307 is No, the processing device 40 repeats steps S301 to S306 until it determines Yes. If the determination in step S307 is Yes, the processing device 40 performs the operation of step S308.
  • Step S308 the processor 40 causes the photodetector 30 to generate and output a first signal and a second signal based on the first signal charge and the second signal charge, respectively.
  • the first signal and the second signal contain internal information of the test part 12 .
  • the operation shown in FIG. 11 is summarized as follows.
  • the processing device 40 performs a first operation of causing the first light source 20a to emit the first light pulse Ip1 and causing the photodetector device 30 to detect at least part of the falling edge period of the first reflected light pulse.
  • the processing device 40 causes the second light source 20b to emit the second light pulse Ip2 and performs a second operation of causing the photodetector 30 to detect at least part of the fall period component of the second reflected light pulse.
  • the processing device 40 repeats a series of operations including the first operation and the second operation a predetermined number of times. Alternatively, the processing device 40 may repeat the first action a predetermined number of times, and then repeat the second action a predetermined number of times. The first action and the second action may be interchanged.
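The control flow of FIG. 11 can be sketched as below. The light_source and detector objects and their methods are hypothetical placeholders for the hardware interface; they do not correspond to any real driver API:

```python
def acquire_internal_signals(light_source, detector, gate_delay_ns, exposure_ns, repetitions):
    """Sketch of the accumulation sequence of FIG. 11.

    `light_source` and `detector` are hypothetical driver objects whose methods
    stand in for the emission and electronic-shutter control described in the text.
    """
    for _ in range(repetitions):                                  # loop checked in step S307
        # first operation: first light pulse, accumulate in the first floating diffusion layer
        light_source.emit_pulse(wavelength_nm=750)                # step S301
        detector.open_shutter(delay_ns=gate_delay_ns, store="FD1")  # step S302
        detector.close_shutter(after_ns=exposure_ns)              # step S303
        # second operation: second light pulse, accumulate in the second floating diffusion layer
        light_source.emit_pulse(wavelength_nm=850)                # step S304
        detector.open_shutter(delay_ns=gate_delay_ns, store="FD2")  # step S305
        detector.close_shutter(after_ns=exposure_ns)              # step S306
    return detector.read_out()                                    # step S308: first and second signals
```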
  • the internal scattering component I2 can be detected with high sensitivity.
  • the attenuation rate of light inside the living body is very large.
  • the light emerging from the body can be attenuated to about 1/1,000,000 of the incident light. Therefore, in order to detect the internal scattering component I2, the amount of light may be insufficient with a single pulse irradiation. In the case of irradiation at class 1 of the laser safety standards, the amount of light is particularly weak.
  • the light source 20 emits light pulses a plurality of times, and the photodetector 30 is also exposed a plurality of times by the electronic shutter accordingly, thereby integrating detection signals and improving sensitivity.
  • the multiple times of light emission and exposure are not essential, and are performed as necessary.
  • by causing the photodetector 30 to detect at least a portion of the rising period of each of the first and second reflected light pulses, the surface reflection component I1 of each of the first and second reflected light pulses can be detected, making it possible to obtain surface information such as blood flow in the face and scalp.
  • in that case, the first floating diffusion layer 204 and the second floating diffusion layer 206 included in each pixel 201 shown in FIG. 9 can accumulate the charges generated by receiving at least part of the rising period components of the first and second reflected light pulses, respectively.
  • two pixels 201 adjacent to each other in the row direction shown in FIG. 9 may be treated as one pixel.
  • the first floating diffusion layer 204 and the second floating diffusion layer 206 included in one of the two pixels 201 can accumulate the charges generated by receiving at least part of the falling period components of the first and second reflected light pulses, respectively.
  • the first floating diffusion layer 204 and the second floating diffusion layer 206 included in the other pixel 201 can accumulate the charges generated by receiving at least part of the rising period components of the first and second reflected light pulses, respectively.
  • not only the internal information of the living body 10 but also the surface information can be obtained.
  • Equations (3) and (4) below represent examples of simultaneous equations.
  • ⁇ HbO 2 and ⁇ Hb represent the amount of change from the initial values of the concentrations of HbO 2 and Hb in blood, respectively.
  • ⁇ 750 OXY and ⁇ 750 deOXY represent the molar extinction coefficients of HbO 2 and Hb at a wavelength of 750 nm, respectively.
  • ⁇ 850 OXY and ⁇ 850 deOXY represent the molar extinction coefficients of HbO 2 and Hb at a wavelength of 850 nm, respectively.
  • I 750 ini and I 750 now represent the detected intensity at a wavelength of 750 nm at a reference time (initial time) and a certain time, respectively.
  • I 850 ini and I 850 now represent the detected intensity at a wavelength of 850 nm at a reference time (initial time) and at a certain time, respectively. These values represent, for example, the detected intensities in the non-activated state and the activated state of the brain.
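Equations (3) and (4) are not reproduced in this excerpt. A common formulation consistent with the symbols listed above is the modified Beer-Lambert law, in which the change in optical density at each wavelength, ln(I_ini/I_now), is a linear combination of ΔHbO2 and ΔHb weighted by the molar extinction coefficients. The sketch below solves the resulting two-by-two system; the extinction-coefficient values and the unit path length are illustrative assumptions, not the patent's equations (3) and (4):

```python
import numpy as np

# Molar extinction coefficients (illustrative values; units only need to be consistent here)
EPS = np.array([[1.4, 3.6],    # 750 nm: [HbO2, Hb]
                [2.5, 1.8]])   # 850 nm: [HbO2, Hb]

def hemoglobin_changes(i750_ini, i750_now, i850_ini, i850_now, path_length=1.0):
    """Solve for (ΔHbO2, ΔHb) from intensity changes at 750 nm and 850 nm.

    Assumed model (modified Beer-Lambert law):
        ln(I_ini / I_now) = path_length * (eps_oxy * ΔHbO2 + eps_deoxy * ΔHb)
    at each wavelength, giving two linear equations in two unknowns.
    """
    delta_od = np.array([np.log(i750_ini / i750_now),
                         np.log(i850_ini / i850_now)])
    return np.linalg.solve(path_length * EPS, delta_od)  # [ΔHbO2, ΔHb]

# Example: a small intensity drop at both wavelengths between the reference time and now.
d_hbo2, d_hb = hemoglobin_changes(100.0, 98.0, 100.0, 97.0)
print(d_hbo2, d_hb)
```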
  • FIG. 12 is a flowchart for explaining this modification.
  • FIG. 13A is a timing chart for explaining this modification.
  • FIG. 13B is a timing chart for explaining this modification.
  • FIG. 14 is a table for explaining this modified example.
  • This modification may be as shown below.
  • a method performed by one or more processors configured to execute instructions stored in one or more memories, the method including: (a) determining the distance d(i) between the apparatus and the living body based on one or more images taken of the living body including the test site at time t(i), thereby obtaining the distances d(k), ..., d(k+n0), wherein i is a variable that takes integer values of 1 or more, k is an integer of 1 or more, and n0 is a predetermined integer of 1 or more; and (b-1) performing a first process when the distances d(k), ..., d(k+n0) satisfy a predetermined condition.
  • the first process includes (c-1) determining p(k+n0) based on the distances d(k), ..., d(k+n0).
  • the first processing may include generating information Inf(k+n0) on the blood of the living body based on the light r(k+n0).
  • the light r(k+n0) may be light from the subject based on the light o(k+n0).
  • the first processing described above corresponds to S1004 to S1005 in FIG. 12.
  • the photodetection system 100 includes a processor (not shown) and memory 46 .
  • a processor may be one or more processors.
  • Memory 46 may be one or more memories.
  • the light detection system 100 may further include a first camera (not shown) and a second camera (not shown). The memory 46 stores a plurality of instructions. The instructions are executed by the processor. The plurality of instructions include the processing shown in S1001 to S1006 in FIG. 12.
  • the photodetection system 100 accepts an instruction to start processing.
  • the processor sets the value of the timer of the photodetection system 100 to 0, starts the operation of the timer, and executes the processing shown in S1001 to S1006 in FIG. 12.
  • the time indicated in this modified example may be the time indicated by the timer.
  • the processor causes the first camera to image the living body 10 including the subject 12 at time t(i).
  • the first camera thereby generates an image IL(i).
  • Image IL(i) is stored in memory 46 .
  • the processor causes the second camera to image the living body 10 including the subject 12 at time t(i).
  • the second camera thereby produces an image IR(i).
  • Image IR(i) is stored in memory 46 .
  • the time interval for imaging the living body 10 including the subject 12 by the first camera may be constant.
  • a time interval for imaging the living body 10 including the test part 12 by the second camera may be constant.
  • the processor calculates the distance d(i) between the points included in the living body 10 and the points included in the photodetection system 100 based on the images IL(i) and IR(i) stored in the memory 46 .
  • Distance d(i) may be referred to as the distance between detection system 100 and living body 10 .
  • the processor stores the calculated distance d(i) in the memory 46.
  • the points included in the living body 10 may be predetermined points included in the subject 12 .
  • the points included in the photodetection system 100 may be predetermined points included in the photodetection device 30 .
  • the distance is determined based on the three-dimensional coordinates (xi, yi, zi) of points included in the living body 10 .
  • the three-dimensional coordinates (xi, yi, zi) may be determined by providing a stereo camera and having the processor use a ranging technique for points included in the living body 10.
  • the stereo camera includes the above-described first camera and second camera. A single-camera ranging technique may instead be used to determine the distance.
  • the distance d(1), the distance d(2) ⁇ , the distance d(k) ⁇ , and the distance d(k+n0) are determined.
  • Distances d(k), . . . , and distances d(k+n0) are shown in FIGS. 13A and 13B.
  • i is a variable, i takes the value of an integer of 1 or more, k is an integer of 1 or more, and n0 is a predetermined integer of 1 or more.
  • the processor determines d(k+n0)max, which is the maximum value of the distances d(k) to d(k+n0).
  • the processor determines d(k+n0)min, which is the minimum value of the distances d(k) to d(k+n0).
  • the processor determines whether Δd(k+n0), the difference between d(k+n0)max and d(k+n0)min, is less than or equal to a predetermined value d0.
  • If Δd(k+n0) ≤ d0, the processor executes the processing shown in S1004 and the processing shown in S1005. If Δd(k+n0) > d0, the processor executes the processing shown in S1006 without executing the processing shown in S1004 and S1005, then images the living body at time t(k+n0+1) in S1002 and determines the distance d(k+n0+1) based on the one or more captured images.
  • the processor refers to the table shown in FIG. 14 recorded in the memory 46 to determine the mask time p(k+n0) corresponding to d(k+n0)avg. For example, if d(k+n0)avg falls in the interval from L1 to L2, the mask time p(k+n0) is p2. In FIG. 14, p1 ≤ p2 ≤ ... may be satisfied.
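A table lookup of the kind shown in FIG. 14 can be sketched as follows; the boundary values L1 to L4, the mask times p1 to p5, and the treatment of the interval boundaries are hypothetical assumptions:

```python
from bisect import bisect_right

# Hypothetical version of the table in FIG. 14: distance boundaries L1 < L2 < ...
# (metres) and the corresponding mask times p1 <= p2 <= ... (nanoseconds).
BOUNDARIES_M = [0.3, 0.5, 0.7, 0.9]        # L1, L2, L3, L4
MASK_TIMES_NS = [1.0, 2.3, 3.7, 5.0, 6.3]  # p1 ... p5

def mask_time(d_avg_m: float) -> float:
    """Return the mask time p(k+n0) for an average distance d(k+n0)avg.
    Distances below L1 map to p1, those in [L1, L2) to p2, and so on."""
    return MASK_TIMES_NS[bisect_right(BOUNDARIES_M, d_avg_m)]

print(mask_time(0.4))  # falls in [L1, L2) -> p2 = 2.3 ns
```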
  • the processor causes the light source 20 to emit a light pulse Ip to the subject 12 of the living body 10 at time {t(k+n0)+t0}. That is, the processor causes the light source 20 to emit the light o(k+n0) toward the test site 12 of the living body 10 at time {t(k+n0)+t0}.
  • the processor causes the photodetector 30 to start detecting the light r(k+n0) from the test site 12 of the living body 10, based on the light o(k+n0), from time {t(k+n0)+t0+p(k+n0)}.
  • the processor causes the photodetector 30 to finish detecting the light r(k+n0) at time {t(k+n0)+t0+p(k+n0)+(predetermined exposure period)}.
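The timing relations just described can be written out directly; all names are illustrative, and the time unit is arbitrary as long as it is shared by all arguments:

```python
def exposure_schedule(t_frame: float, t0: float, mask_time: float, exposure: float):
    """Return (emission_time, exposure_start, exposure_end) for one frame,
    following the timing described above: the pulse is emitted at t(k+n0)+t0,
    detection starts after the mask time p(k+n0) and lasts for the exposure period."""
    emission = t_frame + t0
    start = emission + mask_time
    end = start + exposure
    return emission, start, end

print(exposure_schedule(t_frame=1.00, t0=0.01, mask_time=5e-9, exposure=10e-9))
```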
  • FIG. 13A will be described.
  • FIG. 13A shows an overview of processing related to the case of Yes in S1003.
  • the processor calculates the distance d(k) based on the image IL(k) and the image IR(k) captured at time t(k), and the image IL(k+n0 ), and the distance d(k+n0) is calculated based on the image IR(k+n0).
  • the processor causes the light source 20 to emit light o(k+n0) toward the subject 12 of the living body 10 at time {t(k+n0)+t0}.
  • the processor causes the photodetector 30 to detect the light r(k+n0) from the subject 12 of the living body 10 at time {t(k+n0)+t0+p(k+n0)}.
  • FIG. 13B will be described.
  • FIG. 13B shows an overview of the processing related to No in S1003.
  • the processor calculates the distance d(k) based on the image IL(k) and the image IR(k) captured at time t(k), and the image IL(k+n0 ), and the distance d(k+n0) is calculated based on the image IR(k+n0).
  • the processor does not cause the light source 20 to emit the light o(k+n0) toward the test site 12 of the living body 10 .
  • the processor does not cause the photodetector 30 to detect the light r(k+n0) from the subject 12 of the living body 10 .
  • Information Inf(k+n0) regarding the blood of the living body may be generated.
  • the information about the blood of the living body may be the concentration of HbO 2 and/or the concentration of Hb in the blood generated using equations (3) and (4).
  • the processor may end the processing shown in FIG. 12.
  • the processing of S1003 may instead be the following processing.
  • the processor calculates σ(k+n0), which is the deviation of the distances d(k) to d(k+n0), and determines whether the deviation is less than or equal to a predetermined value σ0. If σ(k+n0) ≤ σ0, the processor executes the processing shown in S1004 and the processing shown in S1005. If σ(k+n0) > σ0, the processor executes the processing shown in S1006 without executing the processing shown in S1004 and S1005.
  • t0 may be 0.
  • In the case of Yes in S1003: (i) the processor causes the first camera to image the living body 10 including the test part 12 at time t(k+n0); (ii) the processor causes the second camera to image the living body 10 including the test part 12 at time t(k+n0); and (iii) the processor causes the light source 20 to emit light o(k+n0) toward the test site 12 of the living body 10 at time t(k+n0).
  • In addition, the processor causes the photodetector 30 to detect the light r(k+n0) from the test site 12 of the living body 10 from time {t(k+n0)+p(k+n0)}.
  • In the case of No in S1003: (i) the processor causes the first camera to image the living body 10 including the test part 12 at time t(k+n0); (ii) the processor causes the second camera to image the living body 10 including the test part 12 at time t(k+n0); and (iii) the processor causes the light source 20 to emit light o(k+n0) toward the test site 12 of the living body 10 at time t(k+n0).
  • In this case, however, the processor does not cause the photodetector 30 to detect the light r(k+n0) from the subject 12 of the living body 10 from time {t(k+n0)+p(k+n0)}.
  • the distance d(i) is obtained using the first camera and the second camera.
  • light source 20 and photodetector 30 may be used to determine distance d(i).
  • t0 0.
  • the light detection system according to the present disclosure is capable of acquiring biometric information on a subject of a living body.
  • Optical detection systems in the present disclosure are useful, for example, for biosensing.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)

Abstract

A light detecting system (100) comprises: a light source (20) that emits a light pulse to a part to be inspected of a living body; a light detecting device (30) that detects a reflected light pulse generated from the light pulse reflected by the part to be inspected; and a processing circuit (42,44) that controls the light source and the light detecting device and acquires distance data pertaining to the distance between the light detecting device and the living body. The processing circuit causes the light source to emit the light pulse, causes the light detecting device to start detection of the reflected light pulse when a first time period has elapsed since the light pulse was emitted, and causes the light detecting device to finish the detection of the reflected light pulse when a predetermined exposure time has elapsed since the detection of the reflected light pulse was started. When a time point at which the distance deviates from a first range is set to a first time point and a time point at which the temporal change in the distance falls within a predetermined range after the first time point is set to a second time point, whether or not the first time period is to be changed is determined on the basis of the distance at the second time point.

Description

Light detection system, processing device, method for controlling a light detection system, and program

The present disclosure relates to a photodetection system, a processing device, a method of controlling the photodetection system, and a program.

The reflected light generated by irradiating the subject part of a living body with light includes components that pass through the surface and the inside of the subject part. By detecting such reflected light, it is possible to acquire biological information of the subject part, such as surface information and/or internal information. Patent Literature 1 discloses an apparatus for acquiring internal information of a subject part.

JP 2018-54632 A

In an environment where the living body moves, it may not be possible to stably acquire the biological information of the subject part. The present disclosure provides a photodetection system capable of stably acquiring biological information of a subject part in a non-contact manner in an environment in which the living body moves.

A light detection system according to an aspect of the present disclosure includes: a light source that emits a light pulse to a subject part of a living body; a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the subject part; and a processing circuit that controls the light source and the photodetector and acquires distance data regarding the distance between the photodetector and the living body. The processing circuit causes the light source to emit the light pulse, causes the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and causes the photodetector to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection of the reflected light pulse is started. When the time at which the distance deviates from a first range is defined as a first time point and the time after the first time point at which the change over time of the distance falls within a predetermined range is defined as a second time point, the processing circuit determines whether or not to change the first time based on the distance at the second time point.

This general or specific aspect may be implemented as an apparatus, a method, an integrated circuit, a computer program, or a computer-readable recording medium, or as any combination of an apparatus, a method, a system, an integrated circuit, a computer program, and a computer-readable recording medium. Computer-readable recording media include non-volatile recording media such as a CD-ROM (Compact Disc-Read Only Memory).

According to the technology of the present disclosure, it is possible to realize a photodetection system capable of stably acquiring biological information of a subject part without contact in an environment in which the living body moves.
FIG. 1A is a diagram showing the relationship between the time change of the reflected light pulse and a fixed exposure period when the distance between the imaging device and the subject changes.
FIG. 1B is a diagram for explaining an example of resetting the initial value of cerebral blood flow and the exposure period when the distance between the imaging device and the subject falls below a predetermined threshold.
FIG. 2 is a block diagram schematically showing the configuration of a photodetection system according to an exemplary embodiment of the present disclosure.
FIG. 3A is a diagram schematically showing an example of temporal changes in the surface reflection component and the internal scattering component included in a reflected light pulse when the light pulse has an impulse waveform.
FIG. 3B is a diagram schematically showing an example of temporal changes in the surface reflection component and the internal scattering component included in a reflected light pulse when the light pulse has a rectangular waveform.
FIG. 4 is a diagram showing the positional relationship between the photodetector and the living body in this embodiment.
FIG. 5 is a flowchart schematically showing an example of the correction operation performed by the processing device in this embodiment when the measured distance between the photodetector and the living body changes.
FIG. 6A is a diagram for explaining an example of applying the correction operation of the processing device according to this embodiment when the measured distance between the photodetector and the living body becomes short.
FIG. 6B is a diagram for explaining an example of applying the correction operation of the processing device according to this embodiment when the measured distance between the photodetector and the living body decreases and then increases.
FIG. 7 is a diagram for explaining a method of generating distance image data in this embodiment.
FIG. 8 is a flowchart schematically showing another example of the correction operation performed by the processing device according to this embodiment when the measured distance between the photodetector and the living body changes.
FIG. 9 is a diagram showing an example of the configuration of a photodetector.
FIG. 10A is a diagram showing an example of the operation of emitting the first light pulse and the second light pulse.
FIG. 10B is a diagram showing another example of the operation of emitting the first light pulse and the second light pulse.
FIG. 11 is a flowchart outlining the operation of the processing device with respect to the light source and the photodetector.
FIG. 12 is a flowchart for explaining a modification.
FIG. 13A is a timing chart for explaining the modification.
FIG. 13B is a timing chart for explaining the modification.
FIG. 14 is a table for explaining the modification.
All of the embodiments described below are comprehensive or specific examples. The numerical values, shapes, materials, components, arrangement positions and connections of components, steps, and order of steps shown in the following embodiments are examples and are not intended to limit the technology of the present disclosure. Among the components in the following embodiments, components not described in the independent claims representing the highest concept are described as optional components. Each figure is schematic and is not necessarily illustrated strictly. Furthermore, substantially identical or similar components are given the same reference numerals in each figure, and redundant description may be omitted or simplified.

First, before describing the embodiments of the present disclosure, the findings obtained by the inventors will be described with reference to FIGS. 1A and 1B.

The reflected light pulse generated by irradiating the forehead of a subject with a light pulse contains much surface information of the subject part during its rising period and much internal information of the subject part during its falling period. The imaging device disclosed in Patent Literature 1 includes a light receiving element, and the light receiving element detects the falling period component of the reflected light pulse. From the received light intensity, the amounts of change from the initial values of oxygenated hemoglobin (HbO2) and deoxygenated hemoglobin (Hb) in the blood of the brain can be calculated as the amount of change in cerebral blood flow. The initial values are the concentrations of oxygenated hemoglobin and deoxygenated hemoglobin in the blood of the brain at the start of measurement of the subject.

FIG. 1A is a diagram showing the relationship between the time change of the reflected light pulse and a fixed exposure period when the distance between the imaging device and the subject changes. (a) of FIG. 1A shows the temporal change of the intensity of the emission pulse. (b) of FIG. 1A shows the temporal change of the intensity of the reflected light pulse at the light receiving element when the distance between the imaging device and the subject part is appropriate for the exposure period. (c) of FIG. 1A shows the temporal change of the intensity of the reflected light pulse when the distance between the imaging device and the subject is longer than the appropriate distance. (d) of FIG. 1A shows the temporal change of the intensity of the reflected light pulse when the distance between the imaging device and the subject is shorter than the appropriate distance. (e) of FIG. 1A shows the exposure period during which the light receiving element receives the reflected light pulse.

As shown in (b) of FIG. 1A, when the distance between the imaging device and the subject is appropriate, the light receiving element receives a part of the falling period component of the reflected light pulse. As a result, the amount of change in cerebral blood flow can be measured accurately. In contrast, as shown in (c) of FIG. 1A, when the distance between the imaging device and the subject becomes longer, the time it takes for the reflected light pulse to reach the light receiving element is delayed. As a result, the light receiving element receives not only a part of the falling period component of the reflected light pulse but also components of other periods, and the measurement accuracy of the amount of change in cerebral blood flow decreases. As shown in (d) of FIG. 1A, when the distance between the imaging device and the subject becomes shorter, the reflected light pulse reaches the light receiving element earlier. As a result, the falling period component of the reflected light pulse received by the light receiving element decreases, and the measurement accuracy of the amount of change in cerebral blood flow decreases.

Patent Literature 1 discloses a method of resetting the initial value of cerebral blood flow and the exposure period when the distance between the imaging device and the subject changes. FIG. 1B is a diagram for explaining an example of resetting the initial value of cerebral blood flow and the exposure period at the time when the distance between the imaging device and the subject falls below a predetermined threshold. The distance between the imaging device and the subject was measured by a TOF (Time Of Flight) method. (a) of FIG. 1B shows the change over time of the distance between the imaging device and the subject. The dashed line represents the threshold. (b) of FIG. 1B shows the relationship between the exposure period and time. The vertical axis represents a plurality of preset exposure periods P1 to P4. The exposure start times of the exposure periods P1 to P4 are different from one another, and the smaller the number, the earlier the exposure start time. In the exposure periods P1 to P4, the time difference between the exposure start time and the exposure end time is constant. (c) of FIG. 1B shows the temporal change in the amount of change in cerebral blood flow when the subject is at rest. The amount of change in cerebral blood flow is a value obtained by averaging, over the forehead, the amount of change in oxygenated hemoglobin in the forehead of an optical model. The optical model is an object imitating the light absorption characteristics of the human forehead, the light scattering characteristics of the human forehead, and the shape of the human forehead. A specific method for calculating the amount of change in cerebral blood flow will be described in detail later.

At the timing when the distance changed and fell below the threshold as shown in (a) of FIG. 1B, the exposure period was reset from P4 to P3 as shown in (b) of FIG. 1B, and the initial value of cerebral blood flow was reset to the cerebral blood flow value at that timing. When the subject is at rest, it is desirable that the amount of change in cerebral blood flow remains substantially zero even if the distance changes. However, as shown in (c) of FIG. 1B, the amount of change in cerebral blood flow fluctuated greatly while the distance was changing, and even after the change in distance ended, it showed a value greatly deviating from zero. This is because the initial value of cerebral blood flow and the exposure period are reset while the distance is still changing. When the distance changes continuously because the subject changes posture, the initial value of cerebral blood flow and the exposure period are frequently reset, and fluctuations in the amount of change in cerebral blood flow occur frequently. Thus, when the distance between the imaging device and the subject changes, the amount of change in cerebral blood flow fluctuates unintentionally, and it is not easy to accurately estimate the brain activity of the subject. For these reasons, there is room for improvement in the method disclosed in Patent Literature 1.

In the photodetection system according to the embodiment of the present disclosure, when the distance between the photodetector and the living body changes, the exposure period is changed after the change in the distance has subsided to some extent. As a result, unintended fluctuation of the biological information can be suppressed, and it becomes possible to stably acquire the biological information of the subject part in a non-contact manner in an environment in which the living body moves. The biological information includes surface information and/or internal information of the subject part. A photodetection system, a processing device, a method for controlling the photodetection system, and a program according to embodiments of the present disclosure are described below.
 第1の項目に係る光検出システムは、生体の被検部に光パルスを出射する光源と、前記光パルスが前記被検部で反射されて生じる反射光パルスを検出する光検出装置と、前記光源と前記光検出装置を制御し、かつ前記光検出装置と前記生体との距離に関する距離データを取得する処理回路と、を備える。前記処理回路は、前記光源に前記光パルスを出射させ、前記光検出装置に、前記光パルスを出射してから第1時間が経過した時点で、前記反射光パルスの検出を開始させ、前記反射光パルスの検出を開始してから所定の露光時間が経過した時点で、前記反射光パルスの検出を終了させる。前記処理回路は、前記距離が第1の範囲を逸脱する時刻を第1時刻とし、前記第1時刻以降に前記距離の経時変化が所定の範囲に収まる時刻を第2時刻とするとき、前記第2時刻での前記距離に基づいて、前記第1時間を変更するか否かを決定する。 A light detection system according to the first item includes a light source that emits a light pulse to a test site of a living body, a light detection device that detects a reflected light pulse generated by the light pulse being reflected by the test site, and a processing circuit that controls the light source and the photodetector and acquires distance data relating to the distance between the photodetector and the living body. The processing circuit causes the light source to emit the light pulse, causes the photodetector to start detecting the reflected light pulse when a first time elapses after the light pulse is emitted, and causes the reflected light pulse to be detected. The detection of the reflected light pulse is terminated when a predetermined exposure time has passed since the detection of the light pulse was started. The processing circuit sets a time when the distance deviates from the first range as a first time, and sets a time when the change over time of the distance falls within a predetermined range after the first time as a second time. Based on the distance at time 2, it is determined whether to change the first time.
 この光検出システムでは、生体が移動する環境下で、その被検部の生体情報を非接触で安定的に取得することができる。 With this photodetection system, it is possible to stably acquire biometric information of the subject without contact in an environment where the living body moves.
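The following is a minimal sketch, not taken from the publication, of how a processing circuit might implement the decision described in the first item. The helper names, the units, the settling window, and the spread threshold are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional, Sequence

@dataclass
class MeasurementRange:
    d_min: float          # lower bound of the range (e.g. in cm)
    d_max: float          # upper bound of the range
    first_time_ns: float  # detection-start delay ("first time") associated with this range

def find_range(distance: float, ranges: Sequence[MeasurementRange]) -> Optional[MeasurementRange]:
    """Return a range containing the given distance, or None."""
    for r in ranges:
        if r.d_min <= distance <= r.d_max:
            return r
    return None

def is_settled(history: List[float], window: int = 10, max_spread: float = 1.0) -> bool:
    """Illustrative stability test: the last `window` samples vary by at most `max_spread`."""
    recent = history[-window:]
    return len(recent) >= window and (max(recent) - min(recent)) <= max_spread

def next_first_time(current_first_time_ns: float,
                    current_range: MeasurementRange,
                    history: List[float],
                    ranges: Sequence[MeasurementRange]) -> float:
    """Decide the detection-start delay to use from now on.

    The delay is reconsidered only after the distance has left the current range
    (first time point) and its change over time has settled (second time point);
    it is then changed only if the settled distance lies in a different range.
    """
    latest = history[-1]
    if current_range.d_min <= latest <= current_range.d_max:
        return current_first_time_ns   # distance still (or again) within the first range
    if not is_settled(history):
        return current_first_time_ns   # distance still changing: do not reset anything yet
    settled = find_range(latest, ranges)
    return settled.first_time_ns if settled is not None else current_first_time_ns
```

The key design point sketched here is that nothing is reset while the distance is still moving, which is what prevents the unintended fluctuation of the biological information described above.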
A photodetection system according to the second item is the photodetection system according to the first item, wherein the processing circuit changes the first time when the distance at the second time is outside the first range.
 この光検出システムでは、光検出装置と生体との距離に応じて、反射光パルスの検出を開始するタイミングを変更することができる。 In this photodetection system, the timing to start detecting reflected light pulses can be changed according to the distance between the photodetector and the living body.
A photodetection system according to the third item is the photodetection system according to the second item, wherein, when the distance at the second time is within a second range different from the first range, the processing circuit changes the first time to a second time corresponding to the second range.
 この光検出システムでは、光検出装置と前記生体との距離が第2の範囲にある場合に、適切なタイミングで反射光パルスの検出を開始することができる。 In this photodetection system, when the distance between the photodetector and the living body is within the second range, detection of reflected light pulses can be started at appropriate timing.
A photodetection system according to the fourth item is the photodetection system according to the first item, wherein the processing circuit maintains the first time when the distance at the second time is within the first range.
 この光検出システムでは、光検出装置と生体との距離が一時的に第1の範囲を逸脱する場合、反射光パルスの検出を開始するタイミングは変更されない。 In this photodetection system, when the distance between the photodetector and the living body temporarily deviates from the first range, the timing to start detecting the reflected light pulse is not changed.
A photodetection system according to the fifth item is the photodetection system according to any one of the first to fourth items, wherein the processing circuit determines that the change of the distance over time falls within the predetermined range when a reference value, determined based on at least one of the fluctuation width, the variance, and the standard deviation of the change of the distance over time, is equal to or less than a threshold.
 この光検出システムでは、生体の体動後における光検出装置と生体との距離を正確に規定することができる。 With this photodetection system, the distance between the photodetector and the living body can be accurately defined after the body movement of the living body.
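As one concrete reading of the fifth item, the sketch below computes a reference value from recent distance samples and compares it with a threshold. The fifth item allows any of the three statistics; choosing a single one through a parameter, and the specific function name, are illustrative assumptions.

```python
import statistics
from typing import Sequence

def distance_settled(samples: Sequence[float],
                     threshold: float,
                     statistic: str = "range") -> bool:
    """Judge whether the temporal change of the distance has settled.

    samples: recent distance measurements (e.g. the last N frames), in cm.
    threshold: comparison threshold, in the unit of the chosen statistic.
    statistic: which reference value to use:
               "range" (fluctuation width, max - min), "variance", or "stdev".
    """
    if len(samples) < 2:
        return False
    if statistic == "range":
        reference = max(samples) - min(samples)
    elif statistic == "variance":
        reference = statistics.pvariance(samples)
    elif statistic == "stdev":
        reference = statistics.pstdev(samples)
    else:
        raise ValueError(f"unknown statistic: {statistic}")
    return reference <= threshold
```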
A photodetection system according to the sixth item is the photodetection system according to any one of the first to fifth items, wherein the photodetector is an image sensor having a plurality of pixels arranged two-dimensionally, and the processing circuit generates the distance data based on the intensity of the reflected light pulse detected by each of the plurality of pixels.
With this photodetection system, distance data can be generated for a subject part that has a spatial extent.
A photodetection system according to the seventh item is the photodetection system according to the sixth item, wherein the processing circuit determines whether or not to change the first time based on the number of pixels, among the plurality of pixels, that do not correspond to the subject part, or based on the combined number of pixels that do not correspond to the subject part and pixels that correspond to a peripheral region of the subject part.
With this photodetection system, whether the timing for starting detection of the reflected light pulse is appropriate can be determined based on the number of pixels that do not correspond to the subject part, or on the number of pixels that do not correspond to the subject part together with pixels that correspond to the peripheral region of the subject part.
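The seventh item can be read as a pixel-counting test on the image sensor output. The sketch below counts, in a boolean mask of the subject part, how many pixels fall outside it (optionally together with pixels in its peripheral region) and compares the count with a limit; the mask construction, the counting policy, and the limit are assumptions for illustration.

```python
from typing import Optional
import numpy as np

def should_reexamine_first_time(subject_mask: np.ndarray,
                                peripheral_mask: Optional[np.ndarray],
                                max_outside_pixels: int) -> bool:
    """Return True when the pixel counts suggest re-examining the detection-start delay.

    subject_mask: boolean (H, W) array, True where a pixel corresponds to the subject part.
    peripheral_mask: optional boolean (H, W) array, True for pixels in the peripheral
                     region of the subject part; counted together with outside pixels.
    max_outside_pixels: allowed count before a re-examination is triggered (assumed policy).
    """
    count = int(np.count_nonzero(~subject_mask))
    if peripheral_mask is not None:
        count += int(np.count_nonzero(peripheral_mask))
    return count > max_outside_pixels
```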
 第8の項目に係る光検出システムは、第1から第7の項目のいずれかに係る光検出システムにおいて、前記生体が人であり、前記被検部が人の額部である。前記処理回路は、前記光検出装置によって検出される前記反射光パルスの強度に応じた信号に基づいて、前記生体の脳活動に関する脳活動データを生成する。 A photodetection system according to an eighth item is the photodetection system according to any one of the first to seventh items, wherein the living body is a human and the subject is a human forehead. The processing circuit generates brain activity data relating to brain activity of the living body based on a signal corresponding to the intensity of the reflected light pulse detected by the photodetector.
 この光検出システムでは、人の脳活動データを生成することができる。 This light detection system can generate human brain activity data.
A processing device according to the ninth item is a processing device used in a photodetection system that includes a light source that emits a light pulse toward a subject part of a living body and a photodetector that detects a reflected light pulse produced when the light pulse is reflected by the subject part. The processing device includes a processor and a memory storing a computer program to be executed by the processor. The computer program causes the processor to: cause the light source to emit the light pulse; acquire distance data relating to the distance between the subject part and the photodetector; cause the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection of the reflected light pulse is started; and, when a time at which the distance deviates from a first range is defined as a first time and a time, after that first time, at which the change of the distance over time falls within a predetermined range is defined as a second time, determine whether or not to change the first time based on the distance at the second time.
 この処理装置により、生体が移動する環境下で、その被検部の生体情報を非接触で安定的に取得することができる。 With this processing device, it is possible to stably acquire biometric information of the test site without contact in an environment where the living body moves.
A method according to the tenth item is a method for controlling a photodetection system that includes a light source that emits a light pulse toward a subject part of a living body and a photodetector that detects a reflected light pulse produced when the light pulse is reflected by the subject part. The method includes: causing the light source to emit the light pulse; acquiring distance data relating to the distance between the subject part and the photodetector; causing the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection of the reflected light pulse is started; and, when a time at which the distance deviates from a first range is defined as a first time and a time, after that first time, at which the change of the distance over time falls within a predetermined range is defined as a second time, determining whether or not to change the first time based on the distance at the second time.
 この方法により、生体が移動する環境下で、その被検部の生体情報を非接触で安定的に取得することができる。 With this method, it is possible to stably acquire biological information of the subject area without contact in an environment where the living body moves.
A program according to the eleventh item is a program executed by a computer that controls a photodetection system including a light source that emits a light pulse toward a subject part of a living body and a photodetector that detects a reflected light pulse produced when the light pulse is reflected by the subject part. The program causes the computer to: cause the light source to emit the light pulse; acquire distance data relating to the distance between the subject part and the photodetector; cause the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection of the reflected light pulse is started; and, when a time at which the distance deviates from a first range is defined as a first time and a time, after that first time, at which the change of the distance over time falls within a predetermined range is defined as a second time, determine whether or not to change the first time based on the distance at the second time.
 このプログラムにより、生体が移動する環境下で、その被検部の生体情報を非接触で安定的に取得することができる。 With this program, it is possible to stably acquire biological information of the subject area without contact in an environment where the living body moves.
In the present disclosure, all or part of a circuit, unit, device, member, or section, or all or part of the functional blocks in the block diagrams, may be implemented by one or more electronic circuits including, for example, a semiconductor device, a semiconductor integrated circuit (IC), or an LSI (large scale integration). An LSI or IC may be integrated on a single chip, or may be configured by combining a plurality of chips. For example, functional blocks other than storage elements may be integrated on a single chip. Although the terms LSI and IC are used here, the terminology changes depending on the degree of integration, and such circuits may also be called system LSI, VLSI (very large scale integration), or ULSI (ultra large scale integration). A field programmable gate array (FPGA), which is programmed after the LSI is manufactured, or a reconfigurable logic device, in which the connection relationships inside the LSI can be reconfigured or the circuit blocks inside the LSI can be set up, can also be used for the same purpose.
Furthermore, all or part of the functions or operations of a circuit, unit, device, member, or section can be executed by software processing. In this case, the software is recorded on one or more non-transitory recording media such as ROMs, optical discs, or hard disk drives, and when the software is executed by a processor, the functions specified by the software are executed by the processor and peripheral devices. A system or device may include the one or more non-transitory recording media on which the software is recorded, the processor, and any required hardware devices, such as interfaces.
In the present disclosure, "light" means electromagnetic waves including not only visible light (wavelengths of about 400 nm to about 700 nm) but also ultraviolet light (wavelengths of about 10 nm to about 400 nm) and infrared light (wavelengths of about 700 nm to about 1 mm).
 以下、図面を参照しながら、本開示のより具体的な実施形態を説明する。 Hereinafter, more specific embodiments of the present disclosure will be described with reference to the drawings.
 (実施形態)
 [光検出システム]
 まず、図2を参照して、本開示の実施形態による光検出システムの構成を説明する。図2は、本開示の例示的な実施形態による光検出システムの構成を模式的に示すブロック図である。図2には、生体10は人であるとして、人の頭部が示されている。生体10は、例えば仕事中または乗り物を運転中のように常に静止しているわけではなく、前後に移動することがある。生体10は、人ではなく動物であってもよい。図2に示す例において、生体10の被検部12は額部であるが、額部以外の部分であってもよい。
(embodiment)
[Photodetection system]
First, the configuration of the photodetection system according to the embodiment of the present disclosure will be described with reference to FIG. FIG. 2 is a block diagram that schematically illustrates the configuration of a photodetection system according to an exemplary embodiment of the present disclosure; FIG. 2 shows a human head assuming that the living body 10 is a human. The living body 10 is not always stationary, such as when working or driving a vehicle, but may move back and forth. The living body 10 may be an animal instead of a human. In the example shown in FIG. 2, the subject 12 of the living body 10 is the forehead, but it may be a portion other than the forehead.
The photodetection system 100 shown in FIG. 2 includes a first light source 20a, a second light source 20b, a photodetector 30, and a processing device 40. The processing device 40 includes a control circuit 42, a signal processing device 44, and a memory 46. In this specification, the first light source 20a and the second light source 20b are also referred to collectively as the "light source 20" when no distinction is needed.
The light source 20 emits light pulses for irradiating the test part 12 of the living body 10. The photodetector 30 acquires biological information of the test part 12 by detecting, during a predetermined exposure period, the reflected light pulse produced when a light pulse is reflected by the test part 12. The reflected light pulse produced at the test part 12 may include a component reflected at the surface of the test part 12 and a component diffusely reflected inside the test part 12. When the living body 10 moves back and forth, the processing device 40 suppresses fluctuations in the biological information of the test part 12 by appropriately resetting the initial value of the cerebral blood flow and the exposure period. As a result, the biological information of the test part 12 can be acquired stably in a non-contact manner. The biological information may be, for example, blood flow information of the face or scalp of the living body 10, cerebral blood flow information, or both.
 以下に、本実施形態における光検出システム100の各構成要素を詳細に説明する。 Below, each component of the photodetection system 100 in this embodiment will be described in detail.
 <第1光源20aおよび第2光源20b>
 第1光源20aは、図2に示すように、被検部12を照射するための第1光パルスIp1を出射する。第1光パルスIp1は第1波長を有する。同様に、第2光源20bは、図2に示すように、被検部12を照射するための第2光パルスIp2を出射する。第2光パルスIp2は、第1波長よりも長い第2波長を有する。図2に示す例において、第1光源20aの個数は1つであるが、複数であってもよい。第2光源20bの個数についても同様である。用途によっては、第1光源20aおよび第2光源20bの両方を用いる必要はなく、一方を用いてもよい。
<First Light Source 20a and Second Light Source 20b>
The first light source 20a emits a first light pulse Ip1 for irradiating the part 12 to be inspected, as shown in FIG. The first light pulse Ip1 has a first wavelength. Similarly, the second light source 20b emits a second light pulse Ip2 for illuminating the subject 12, as shown in FIG. The second light pulse Ip2 has a second wavelength that is longer than the first wavelength. In the example shown in FIG. 2, the number of first light sources 20a is one, but it may be plural. The same applies to the number of second light sources 20b. Depending on the application, it is not necessary to use both the first light source 20a and the second light source 20b, and either one may be used.
The first light pulse Ip1 and the second light pulse Ip2 are also referred to herein as "light pulses Ip" when no distinction is needed. The light pulse Ip includes a rising portion and a falling portion. The rising portion is the portion of the light pulse Ip from the time its intensity starts to increase until the increase ends. The falling portion is the portion of the light pulse Ip from the time its intensity starts to decrease until the decrease ends.
Part of the light pulse Ip that reaches the test part 12 becomes a surface reflection component I1 reflected at the surface of the test part 12, and another part becomes an internal scattering component I2 that is reflected or scattered once, or multiply scattered, inside the test part 12. The surface reflection component I1 includes three components: a direct reflection component, a diffuse reflection component, and a scattered reflection component. The direct reflection component is a component for which the angle of incidence and the angle of reflection are equal. The diffuse reflection component is a component diffused and reflected by the uneven shape of the surface. The scattered reflection component is a component scattered and reflected by internal tissue near the surface. When the test part 12 is a person's forehead, the scattered reflection component is a component scattered and reflected inside the epidermis. In the following description, the surface reflection component I1 reflected at the surface of the test part 12 is assumed to include these three components, and the internal scattering component I2 is assumed not to include the component scattered and reflected by internal tissue near the surface. The surface reflection component I1 and the internal scattering component I2 are reflected or scattered, their traveling directions change, and part of them reaches the photodetector 30 as a reflected light pulse. The surface reflection component I1 contains surface information of the living body 10, for example, blood flow information of the face and scalp. From the blood flow information of the face and scalp, for example, the facial appearance, skin blood flow, heart rate, or amount of perspiration of the living body 10 can be obtained. The internal scattering component I2 contains internal information of the living body 10, for example, cerebral blood flow information. From the cerebral blood flow information, for example, the cerebral blood flow volume, blood pressure, blood oxygen saturation, or heart rate of the living body 10 can be obtained. A method for detecting the internal scattering component I2 from the reflected light pulse is described in detail later.
 第1光パルスIp1の第1波長および第2光パルスIp2の第2波長は、例えば650nm以上950nm以下の波長範囲に含まれる任意の波長であり得る。この波長範囲は、赤色から近赤外線の波長範囲に含まれる。上記の波長範囲は、「生体の窓」と呼ばれており、生体内の水分および皮膚に比較的吸収されにくいという性質を有する。生体を検出対象にする場合、上記の波長範囲の光を使用することにより、検出感度を高くすることができる。生体10の脳の血流変化を検出する場合、使用される光は、主に酸素化ヘモグロビンおよび脱酸素化ヘモグロビンに吸収されると考えられる。一般に、血流に変化が生じると、酸素化ヘモグロビンの濃度および脱酸素化ヘモグロビンの濃度が変化する。この変化に伴い、光の吸収度合いも変化する。したがって、血流が変化すると、検出される光量も時間的に変化する。 The first wavelength of the first optical pulse I p1 and the second wavelength of the second optical pulse I p2 may be arbitrary wavelengths included in the wavelength range of 650 nm to 950 nm, for example. This wavelength range is included in the red to near-infrared wavelength range. The above wavelength range is called the "window of the body" and has the property of being relatively difficult to be absorbed by moisture and skin in the body. When a living body is to be detected, detection sensitivity can be increased by using light in the above wavelength range. When detecting changes in cerebral blood flow in living body 10, the light used is believed to be absorbed primarily by oxygenated hemoglobin and deoxygenated hemoglobin. In general, changes in blood flow result in changes in the concentration of oxygenated hemoglobin and deoxygenated hemoglobin. Along with this change, the degree of light absorption also changes. Therefore, when the blood flow changes, the amount of detected light also changes with time.
 酸素化ヘモグロビンと脱酸素化ヘモグロビンとでは、光吸収の波長依存性が異なる。波長が650nm以上であり、かつ805nmより短いとき、脱酸素化ヘモグロビンによる光吸収係数の方が、酸素化ヘモグロビンによる光吸収係数よりも大きい。波長805nmでは、脱酸素化ヘモグロビンによる光吸収係数と、酸素化ヘモグロビンによる光吸収係数とは等しい。波長が805nmより長く、かつ950nm以下であるとき、酸素化ヘモグロビンによる光吸収係数の方が、脱酸素化ヘモグロビンによる光吸収係数よりも大きい。 Oxygenated hemoglobin and deoxygenated hemoglobin differ in the wavelength dependence of light absorption. When the wavelength is 650 nm or more and shorter than 805 nm, the light absorption coefficient of deoxygenated hemoglobin is greater than that of oxygenated hemoglobin. At a wavelength of 805 nm, the light absorption coefficient of deoxygenated hemoglobin and the light absorption coefficient of oxygenated hemoglobin are equal. When the wavelength is longer than 805 nm and 950 nm or less, the light absorption coefficient of oxygenated hemoglobin is greater than that of deoxygenated hemoglobin.
Therefore, the first wavelength of the first light pulse Ip1 may be set to 650 nm or more and shorter than 805 nm, and the second wavelength of the second light pulse Ip2 may be set to be longer than 805 nm and 950 nm or less. When the test part 12 is irradiated with the first light pulse Ip1 and the second light pulse Ip2, the concentration of oxygenated hemoglobin and the concentration of deoxygenated hemoglobin in the blood inside the test part 12 can be determined by the processing of the processing device 40 described later. By irradiation with two light pulses having different wavelengths, more detailed internal information of the test part 12 can be acquired.
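The publication describes the concentration calculation in detail later. As background only, a common way to obtain the changes in oxygenated and deoxygenated hemoglobin from two wavelengths (not necessarily the method used in this publication) is the modified Beer-Lambert law, which solves a 2x2 linear system using the extinction coefficients at the two wavelengths. In the sketch below, the extinction coefficients and the path-length factor are placeholders, not values from the publication.

```python
from typing import Tuple
import numpy as np

def hemoglobin_changes(delta_od_750nm: float,
                       delta_od_850nm: float,
                       pathlength_cm: float = 1.0) -> Tuple[float, float]:
    """Estimate changes in oxygenated (HbO2) and deoxygenated (Hb) hemoglobin
    concentration from attenuation changes at two wavelengths, assuming the
    modified Beer-Lambert law:
        delta_OD(lambda) = (eps_HbO2(lambda) * dC_HbO2 + eps_Hb(lambda) * dC_Hb) * L
    """
    # Placeholder extinction coefficients [1/(mM*cm)] at roughly 750 nm and 850 nm.
    eps = np.array([[0.6, 1.6],    # 750 nm: [eps_HbO2, eps_Hb]
                    [1.1, 0.8]])   # 850 nm: [eps_HbO2, eps_Hb]
    delta_od = np.array([delta_od_750nm, delta_od_850nm])
    d_hbo2, d_hb = np.linalg.solve(eps * pathlength_cm, delta_od)
    return float(d_hbo2), float(d_hb)
```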
 本実施形態において、光源20は、ユーザの網膜への影響を考慮して設計され得る。例えば、光源20は、レーザダイオードのようなレーザ光源であり、各国で策定されているレーザ安全基準のクラス1を満足し得る。クラス1が満足されている場合、被検部12が、被爆放出限界(AEL)が1mWを下回るほどの低照度の光で照射される。なお、光源20自体がクラス1を満足する必要はない。例えば、拡散板またはNDフィルタを光源20の前に設置して光を拡散または減衰することにより、レーザ安全基準のクラス1が満たされていてもよい。 In this embodiment, the light source 20 can be designed in consideration of the influence on the user's retina. For example, the light source 20 is a laser light source such as a laser diode, and can satisfy class 1 of the laser safety standards established by various countries. If Class 1 is satisfied, the test area 12 is illuminated with light of such low intensity that the accessible emission limit (AEL) is less than 1 mW. Note that the light source 20 itself does not need to satisfy Class 1. For example, a diffuser plate or neutral density filter may be placed in front of the light source 20 to diffuse or attenuate the light so that class 1 laser safety standards are met.
 <光検出装置30>
 光検出装置30は、光パルスIが被検部12で反射されて生じた反射光パルスの立ち上がり期間の成分の少なくとも一部を検出し、その強度に応じた信号を出力する。当該信号は、被検部12の表面情報を含む。あるいは、光検出装置30は、光パルスIが被検部12で反射されて生じた反射光パルスの立ち下がり期間の成分の少なくとも一部を検出し、その強度に応じた信号を出力する。当該信号は、被検部12の内部情報を含む。
<Photodetector 30>
The photodetector 30 detects at least a part of the rise period component of the reflected light pulse generated by the light pulse Ip being reflected by the part 12 to be inspected, and outputs a signal corresponding to the intensity thereof. The signal includes surface information of the test part 12 . Alternatively, the photodetector 30 detects at least a part of the falling period component of the reflected light pulse generated by the light pulse Ip being reflected by the part 12 to be inspected, and outputs a signal corresponding to the intensity thereof. The signal includes internal information of the test part 12 .
 光パルスの「立ち上がり期間」は、光検出装置30の光検出面において、当該光パルスの強度が増加を開始する時点から増加を終了する時点までの期間を指す。光パルスの「立ち下がり期間」は、光検出装置30の光検出面において、当該光パルスの強度が減少を開始する時点から減少を終了する時点までの期間を指す。より厳密には、「立ち上がり期間」は、当該光パルスの強度が予め設定された下限値を上回った時点から予め設定された上限値に達した時点までの期間を意味する。「立ち下がり期間」は、当該光パルスの強度が予め設定された上限値を下回った時点から予め設定された下限値に達した時点までの期間を意味する。上限値は当該光パルスの強度のピーク値の例えば90%の値に設定され、下限値は当該ピーク値の例えば10%の値に設定され得る。 The "rising period" of the light pulse refers to the period from when the intensity of the light pulse starts increasing to when it ends increasing on the photodetection surface of the photodetector device 30 . The “falling period” of the light pulse refers to the period from when the intensity of the light pulse starts decreasing to when it ends decreasing on the photodetection surface of the photodetector 30 . More precisely, the "rising period" means the period from when the intensity of the light pulse exceeds a preset lower limit to when it reaches a preset upper limit. The “falling period” means a period from when the intensity of the light pulse falls below a preset upper limit to when it reaches a preset lower limit. The upper limit value can be set to a value that is, for example, 90% of the peak value of the intensity of the light pulse, and the lower limit value can be set to a value that is, for example, 10% of the peak value.
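The 10%/90% definition of the rising and falling periods can be expressed directly on a sampled waveform. The sketch below finds the sample indices at which the intensity crosses the lower and upper fractions of the peak; the simple threshold-crossing search and the default fractions are illustrative choices, and the waveform is assumed to rise above the upper limit and return below the lower limit within the record.

```python
from typing import List, Tuple

def rising_and_falling_periods(intensity: List[float],
                               lower_frac: float = 0.1,
                               upper_frac: float = 0.9) -> Tuple[Tuple[int, int], Tuple[int, int]]:
    """Return ((rise_start, rise_end), (fall_start, fall_end)) as sample indices.

    rise_start: first sample exceeding lower_frac * peak
    rise_end:   first sample reaching upper_frac * peak
    fall_start: first sample at or after the peak falling below upper_frac * peak
    fall_end:   first sample after that reaching lower_frac * peak or below
    """
    peak = max(intensity)
    lo, hi = lower_frac * peak, upper_frac * peak
    rise_start = next(i for i, v in enumerate(intensity) if v > lo)
    rise_end = next(i for i, v in enumerate(intensity) if v >= hi)
    peak_idx = intensity.index(peak)
    fall_start = next(i for i in range(peak_idx, len(intensity)) if intensity[i] < hi)
    fall_end = next(i for i in range(fall_start, len(intensity)) if intensity[i] <= lo)
    return (rise_start, rise_end), (fall_start, fall_end)
```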
 光検出装置30は、電子シャッタを備え得る。電子シャッタは、撮像のタイミングを制御する回路である。電子シャッタは、受光した光を有効な電気信号に変換して蓄積する1回の信号蓄積の期間と、信号蓄積を停止する期間とを制御する。信号蓄積期間は、「露光期間」とも称する。以下の説明では、露光期間の幅を、「シャッタ幅」とも称する。1回の露光期間が終了し次の露光期間が開始するまでの時間を、「非露光期間」とも称する。 The photodetector device 30 may be equipped with an electronic shutter. The electronic shutter is a circuit that controls imaging timing. The electronic shutter controls one signal accumulation period during which the received light is converted into an effective electrical signal and accumulated, and a period during which the signal accumulation is stopped. The signal accumulation period is also called an "exposure period". In the following description, the width of the exposure period is also called "shutter width". The time from the end of one exposure period to the start of the next exposure period is also called a "non-exposure period".
 光検出装置30は、電子シャッタにより、露光期間および非露光期間を、サブナノ秒、例えば、30psから1nsの範囲で調整することができる。距離の計測が目的である従来のTOFカメラは、光源20から出射され被写体で反射されて戻ってきた光のすべてを検出する。従来のTOFカメラでは、シャッタ幅が光のパルス幅よりも大きい必要があった。これに対し、本実施形態における光検出システム100では、被写体の光量を補正する必要がない。このため、シャッタ幅がパルス幅よりも大きい必要はない。シャッタ幅を、例えば、1ns以上30ns以下の値に設定することができる。本実施形態における光検出システム100によれば、シャッタ幅を縮小できるため、検出信号に含まれる暗電流の影響を低減することができる。 The photodetection device 30 can adjust the exposure period and the non-exposure period within a sub-nanosecond range, for example, 30 ps to 1 ns, using an electronic shutter. A conventional TOF camera whose purpose is to measure distance detects all of the light emitted from the light source 20 and returned after being reflected by the subject. Conventional TOF cameras require the shutter width to be greater than the light pulse width. On the other hand, in the photodetection system 100 of this embodiment, there is no need to correct the amount of light on the subject. Therefore, the shutter width need not be greater than the pulse width. The shutter width can be set to a value of 1 ns or more and 30 ns or less, for example. According to the photodetection system 100 of this embodiment, since the shutter width can be reduced, the influence of dark current contained in the detection signal can be reduced.
 光検出装置30は、1つ以上の光検出セルを有する。光検出装置30は、光検出面に沿って2次元的に配列された複数の光検出セルを有するイメージセンサであり得る。当該イメージセンサは、CCDイメージセンサまたはCMOSイメージセンサなどの任意のイメージセンサであり得る。各光検出セルは、例えばフォトダイオードなどの光電変換素子32と、1つまたは複数の電荷蓄積部34とを備え得る。本明細書において、光検出セルを「画素」とも称し、光検出セルによって検出される光の強度を「輝度値」とも称する。光検出装置30がイメージセンサである場合、光検出装置30が検出して出力する上記の信号は、2次元的に分布する複数の画素についての輝度値を示す画像信号である。当該画像信号は、画像化された情報を含んでいてもよいし、複数の画素についての輝度値の数値情報を含んでいてもよい。光検出装置30の構成については詳細を後述する。 The photodetection device 30 has one or more photodetection cells. The photodetection device 30 may be an image sensor having a plurality of photodetection cells two-dimensionally arranged along a photodetection surface. The image sensor can be any image sensor such as a CCD image sensor or a CMOS image sensor. Each photodetector cell may comprise a photoelectric conversion element 32 , such as a photodiode, and one or more charge storages 34 . In this specification, the photodetector cells are also referred to as "pixels", and the intensity of light detected by the photodetector cells is also referred to as a "luminance value". When the photodetector 30 is an image sensor, the above-described signal detected and output by the photodetector 30 is an image signal indicating luminance values of a plurality of pixels distributed two-dimensionally. The image signal may contain imaged information, or may contain numerical information of luminance values for a plurality of pixels. The details of the configuration of the photodetector 30 will be described later.
 <処理装置40>
 処理装置40に含まれる制御回路42は、光源20、光検出装置30、および信号処理装置44の動作を制御する。制御回路42は、光源20の光パルスIの出射タイミングと、光検出装置30のシャッタタイミングとの時間差を調整する。本明細書では、当該時間差を「位相差」とも称する。光源20の「出射タイミング」とは、光源20から出射される光パルスが立ち上がりを開始するタイミングである。「シャッタタイミング」とは、露光を開始するタイミングである。制御回路42は、出射タイミングを変化させて位相差を調整してもよいし、シャッタタイミングを変化させて位相差を調整してもよい。
<Processing device 40>
The control circuit 42 included in the processing device 40 controls the operations of the light source 20, the photodetector 30, and the signal processing device 44. The control circuit 42 adjusts the time difference between the emission timing of the light pulse Ip from the light source 20 and the shutter timing of the photodetector 30. In this specification, this time difference is also called the "phase difference". The "emission timing" of the light source 20 is the timing at which the light pulse emitted from the light source 20 starts to rise. The "shutter timing" is the timing at which exposure starts. The control circuit 42 may adjust the phase difference by changing the emission timing, or by changing the shutter timing.
 制御回路42は、光検出装置30の各画素によって検出された信号からオフセット成分を取り除くように構成されてもよい。オフセット成分は、太陽光もしくは照明光などの環境光、または外乱光による信号成分である。光源20の駆動をOFFにして光源20から光が出射されない状態で、光検出装置30によって信号を検出することにより、環境光または外乱光によるオフセット成分が見積もられる。 The control circuit 42 may be configured to remove the offset component from the signal detected by each pixel of the photodetector 30 . The offset component is a signal component due to environmental light such as sunlight or illumination light, or disturbance light. By detecting a signal with the photodetector 30 in a state in which the drive of the light source 20 is turned off and no light is emitted from the light source 20, the offset component due to ambient light or disturbance light can be estimated.
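One way to realize the offset removal described here is to record frames with the light source switched off and subtract their average from subsequent frames. The sketch below assumes frames are available as per-pixel arrays; it is an illustrative sketch, not the publication's implementation.

```python
from typing import Sequence
import numpy as np

def estimate_offset(dark_frames: Sequence[np.ndarray]) -> np.ndarray:
    """Estimate the per-pixel offset (ambient/stray light plus dark level) from
    frames captured while the light source is turned off."""
    return np.mean(np.stack(dark_frames), axis=0)

def remove_offset(frame: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Subtract the estimated offset from a detected frame, clipping negatives to zero."""
    return np.clip(frame.astype(np.float64) - offset, 0.0, None)
```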
 処理装置40に含まれる信号処理装置44は、光検出装置30から出力された信号に基づいて、生体10の被検部12の生体情報を示すデータを生成して出力する。当該データは、被検部12の表面情報および/または内部情報を含む。 A signal processing device 44 included in the processing device 40 generates and outputs data indicating biological information of the subject 12 of the living body 10 based on the signal output from the photodetector 30 . The data includes surface information and/or internal information of the test part 12 .
The signal processing device 44 can also estimate the psychological state and/or physical state of the living body 10 based on the surface information and/or internal information of the test part 12. The signal processing device 44 may generate and output data indicating the psychological state and/or physical state of the living body 10. The psychological state may be, for example, a mood, an emotion, a state of health, or a thermal sensation. The mood may include, for example, a pleasant or unpleasant mood. The emotion may include, for example, relief, anxiety, sadness, or resentment. The state of health may include, for example, a state of well-being or fatigue. The thermal sensation may include, for example, a sensation of being hot, cold, or muggy. Derived from these, indices representing the degree of brain activity, such as the degree of interest, skill level, proficiency, and degree of concentration, may also be included in the psychological state. The physical state may be, for example, the degree of fatigue, drowsiness, or intoxication from drinking.
 制御回路42は、例えばプロセッサおよびメモリの組み合わせ、またはプロセッサおよびメモリを内蔵するマイクロコントローラなどの集積回路であり得る。制御回路42は、例えばプロセッサがメモリ46に記録されたコンピュータプログラムを実行することにより、例えば出射タイミングとシャッタタイミングとの調整を行ったり、信号処理装置44に信号処理を実行させたりする。 The control circuit 42 may be, for example, a combination processor and memory or an integrated circuit such as a microcontroller containing a processor and memory. The control circuit 42 executes a computer program recorded in the memory 46 by the processor, for example, to adjust the emission timing and the shutter timing, and cause the signal processing device 44 to perform signal processing.
The signal processing device 44 can be realized by, for example, a digital signal processor (DSP), a programmable logic device (PLD) such as a field programmable gate array (FPGA), or a combination of a central processing unit (CPU) or an arithmetic processor for image processing (GPU) with a computer program. The signal processing device 44 executes signal processing by having the processor execute a computer program recorded in the memory 46.
 信号処理装置44および制御回路42は、統合された1つの回路であってもよいし、分離された個別の回路であってもよい。本明細書では、信号処理装置44および制御回路42をまとめて「処理回路」とも称する。信号処理装置44、制御回路42、およびメモリ46の少なくとも1つは、例えば遠隔地に設けられたサーバなどの外部の装置の構成要素であってもよい。この場合、サーバなどの外部の装置は、無線通信または有線通信により、残りの構成要素と相互にデータの送受信を行う。 The signal processor 44 and the control circuit 42 may be one integrated circuit or separate individual circuits. The signal processor 44 and control circuitry 42 are also collectively referred to herein as "processing circuitry." At least one of signal processor 44, control circuitry 42, and memory 46 may be components of an external device, such as a remotely located server. In this case, an external device such as a server exchanges data with the rest of the components via wireless or wired communication.
 本明細書において、制御回路42の動作および信号処理装置44の動作をまとめて処理装置40の動作として説明する。 In this specification, the operation of the control circuit 42 and the operation of the signal processing device 44 are collectively described as the operation of the processing device 40 .
 <その他>
 光検出システム100は、被検部12の2次元像を光検出装置30の光検出面上に形成する結像光学系を備えてもよい。結像光学系の光軸は、光検出装置30の光検出面に略直交する。結像光学系は、ズームレンズを含んでいてもよい。ズームレンズの位置が変化すると生体10およびその被検部12の2次元像の拡大率が変更し、光検出装置30上の2次元像の解像度が変化する。したがって、生体10までの距離が遠くても、所望の被検部12を拡大して詳細に観察することが可能である。
<Others>
The photodetection system 100 may include an imaging optical system that forms a two-dimensional image of the subject 12 on the photodetection surface of the photodetection device 30 . The optical axis of the imaging optical system is substantially orthogonal to the photodetection surface of the photodetector 30 . The imaging optics may include a zoom lens. When the position of the zoom lens changes, the magnification of the two-dimensional image of the living body 10 and its subject 12 changes, and the resolution of the two-dimensional image on the photodetector 30 changes. Therefore, even if the living body 10 is far away, it is possible to magnify the desired subject 12 and observe it in detail.
 光検出システム100は、被検部12と光検出装置30との間に、光源20から出射される波長帯域の光、またはその近傍の光を通過させる帯域通過フィルタを備えていてもよい。これにより、環境光などの外乱成分の影響を低減することができる。帯域通過フィルタは、例えば多層膜フィルタまたは吸収フィルタによって構成され得る。光源20の温度変化およびフィルタへの斜入射に伴う帯域シフトを考慮して、帯域通過フィルタの帯域幅は、20nm以上100nm以下程度の幅を有してもよい。 The photodetection system 100 may include a band-pass filter that passes the light in the wavelength band emitted from the light source 20 or light in the vicinity thereof between the subject 12 and the photodetector 30 . As a result, the influence of disturbance components such as ambient light can be reduced. A band-pass filter can be constituted by a multilayer filter or an absorption filter, for example. Considering the temperature change of the light source 20 and the band shift due to oblique incidence on the filter, the bandwidth of the band-pass filter may have a width of about 20 nm or more and 100 nm or less.
When internal information is to be acquired, the photodetection system 100 may include polarizing plates between the test part 12 and the light source 20 and between the test part 12 and the photodetector 30. In this case, the polarization directions of the polarizing plate arranged on the light source 20 side and the polarizing plate arranged on the photodetector 30 side may be in a crossed-Nicols relationship. With these two polarizing plates, the specular reflection component of the surface reflection component I1 of the test part 12, that is, the component whose angle of incidence equals its angle of reflection, can be prevented from reaching the photodetector 30. In other words, the amount of the surface reflection component I1 that reaches the photodetector 30 can be reduced.
 [被検部12の内部情報の取得方法]
 以下に、図3Aおよび図3Bを参照して、被検部12の内部情報の取得方法を説明する。当該内部情報は内部散乱成分Iである。
[Method for Acquiring Internal Information of Subject 12]
A method of acquiring internal information of the subject 12 will be described below with reference to FIGS. 3A and 3B. The internal information is the internal scattering component I2 .
 図3Aは、光パルスIがインパルス波形を有する場合の、反射光パルスに含まれる表面反射成分Iおよび内部散乱成分Iの時間変化の例を模式的に示す図である。図3Bは、光パルスIが矩形形状の波形を有する場合の、反射光パルスに含まれる表面反射成分Iおよび内部散乱成分Iの時間変化の例を模式的に示す図である。各図の左側の図は光源20から出射された光パルスIの波形の例を表し、右側の図は反射光パルスに含まれる表面反射成分Iおよび内部散乱成分Iの波形の例を表す。 FIG. 3A is a diagram schematically showing an example of temporal changes of the surface reflection component I1 and the internal scattering component I2 included in the reflected light pulse when the light pulse Ip has an impulse waveform. FIG. 3B is a diagram schematically showing an example of temporal changes in the surface reflection component I1 and the internal scattering component I2 included in the reflected light pulse when the light pulse Ip has a rectangular waveform. The diagram on the left side of each diagram shows an example of the waveform of the light pulse Ip emitted from the light source 20, and the diagram on the right side shows an example of the waveforms of the surface reflection component I1 and the internal scattering component I2 included in the reflected light pulse. show.
When the light pulse Ip has an impulse waveform, as shown in the right-hand diagram of FIG. 3A, the surface reflection component I1 has a waveform similar to that of the light pulse Ip, whereas the internal scattering component I2 has an impulse-response waveform delayed relative to the surface reflection component I1. This is because the internal scattering component I2 corresponds to a combination of rays that have traveled through various paths inside the test part 12.
When the light pulse Ip has a rectangular waveform, as shown in the right-hand diagram of FIG. 3B, the surface reflection component I1 has a waveform similar to that of the light pulse Ip, and the internal scattering component I2 has a waveform in which a plurality of impulse-response waveforms are superimposed. The present inventors have confirmed that, owing to this superposition of impulse-response waveforms, the amount of light of the internal scattering component I2 detected by the photodetector 30 can be amplified compared with the case where the light pulse Ip has an impulse waveform. By starting the electronic shutter at the falling portion of the reflected light pulse, the internal scattering component I2 can be detected effectively. The region surrounded by the broken line in the right-hand diagram of FIG. 3B represents an example of the exposure period during which the electronic shutter of the photodetector 30 is open. If the pulse width of the rectangular pulse is on the order of 1 ns to 10 ns, the light source 20 can be driven at a low voltage. Therefore, the photodetection system 100 in this embodiment can be made smaller and less expensive.
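The timing relationship described above, opening the shutter only around the falling portion of the reflected pulse so that mainly the delayed internal scattering component I2 is accumulated, can be illustrated numerically. The toy waveform model below (a rectangular surface component plus an exponentially decaying internal component) and all numerical values are assumptions for illustration only, not data from the publication.

```python
import numpy as np

def gated_signal(delay_ns: float, gate_ns: float,
                 pulse_width_ns: float = 10.0,
                 arrival_ns: float = 2.0,
                 tail_tau_ns: float = 2.5,
                 internal_fraction: float = 1e-3,
                 dt_ns: float = 0.01) -> float:
    """Integrate a toy reflected-pulse waveform over an exposure gate.

    The surface reflection component is modeled as a rectangle of width pulse_width_ns
    arriving at arrival_ns; the internal scattering component as that rectangle
    convolved with a unit-area exponential tail (time constant tail_tau_ns), scaled
    by internal_fraction. Returns the intensity collected in [delay_ns, delay_ns + gate_ns).
    """
    t = np.arange(0.0, arrival_ns + pulse_width_ns + 10.0 * tail_tau_ns, dt_ns)
    surface = ((t >= arrival_ns) & (t < arrival_ns + pulse_width_ns)).astype(float)
    kernel = np.exp(-np.arange(0.0, 10.0 * tail_tau_ns, dt_ns) / tail_tau_ns)
    kernel /= kernel.sum()                        # unit-area tail
    internal = internal_fraction * np.convolve(surface, kernel)[: t.size]
    gate = (t >= delay_ns) & (t < delay_ns + gate_ns)
    return float(np.sum((surface + internal)[gate]) * dt_ns)

# A gate opened only after the surface component has ended (about 12 ns here)
# collects mainly the delayed internal component, whereas a gate covering the
# whole pulse is dominated by the surface component.
late_gate = gated_signal(delay_ns=12.0, gate_ns=10.0)
full_gate = gated_signal(delay_ns=0.0, gate_ns=30.0)
```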
 従来、生体内部の深さ方向において異なる箇所での光吸収係数または光散乱係数などの情報を区別して検出するために、ストリークカメラが使用されている。例えば、特開平4―189349号公報は、そのようなストリークカメラの一例を開示している。これらのストリークカメラでは、所望の空間分解能で計測するために、パルス幅がフェムト秒またはピコ秒の極超短光パルスが用いられる。これに対し、本実施形態では、表面反射成分Iと内部散乱成分Iとを区別して検出することができる。したがって、光源20から出射される光パルスは、極超短光パルスである必要はなく、パルス幅を任意に選択することができる。 Conventionally, streak cameras have been used to distinguish and detect information such as light absorption coefficients or light scattering coefficients at different locations in the depth direction inside a living body. For example, JP-A-4-189349 discloses an example of such a streak camera. These streak cameras use ultrashort light pulses with femtosecond or picosecond pulse widths to measure at the desired spatial resolution. On the other hand, in this embodiment, the surface reflection component I1 and the internal scattering component I2 can be detected separately. Therefore, the light pulse emitted from the light source 20 does not have to be an ultra-short light pulse, and the pulse width can be arbitrarily selected.
When the head of the living body 10 is irradiated with light to measure cerebral blood flow, the amount of light of the internal scattering component I2 can be a very small value, roughly one several-thousandth to one several-ten-thousandth of the amount of light of the surface reflection component I1. Furthermore, when laser safety standards are taken into account, the amount of light that can be emitted is extremely small. Detection of the internal scattering component I2 therefore becomes very difficult. Even so, if the light source 20 emits a light pulse Ip with a relatively large pulse width, the integrated amount of the time-delayed internal scattering component I2 can be increased. This increases the amount of detected light and improves the signal-to-noise ratio.
 光源20は、例えば、パルス幅が3ns以上の光パルスIを出射し得る。あるいは、光源20は、パルス幅が5ns以上、さらに10ns以上の光パルスIを出射してもよい。一方、パルス幅が大きすぎても使用しない光が増えて無駄となるため、光源20は、例えば、パルス幅が50ns以下の光パルスIを出射し得る。あるいは、光源20は、パルス幅が30ns以下、さらに20ns以下の光パルスIを出射してもよい。矩形パルスのパルス幅が数nsから数十nsであれば、光源20を低電圧で駆動することができる。したがって、本実施形態における光検出システム100の低コスト化が可能になる。 The light source 20 can emit a light pulse Ip with a pulse width of 3 ns or more, for example. Alternatively, the light source 20 may emit a light pulse Ip with a pulse width of 5 ns or more, or 10 ns or more. On the other hand, if the pulse width is too large, the amount of light that is not used increases and is wasted. Therefore, the light source 20 can emit an optical pulse Ip with a pulse width of 50 ns or less, for example. Alternatively, the light source 20 may emit an optical pulse Ip with a pulse width of 30 ns or less, or even 20 ns or less. If the pulse width of the rectangular pulse is several ns to several tens of ns, the light source 20 can be driven at a low voltage. Therefore, it is possible to reduce the cost of the photodetection system 100 in this embodiment.
The irradiation pattern of the light source 20 may be, for example, a pattern having a uniform intensity distribution within the irradiated area. In this respect, the photodetection system 100 in this embodiment differs from the conventional device disclosed in, for example, JP-A-11-164826. In the device disclosed in that publication, the detector and the light source are separated by about 3 cm so that the surface reflection component is spatially separated from the internal scattering component; consequently, the irradiation pattern has to be a pattern with a discrete intensity distribution. In contrast, in this embodiment, the surface reflection component I1 can be temporally separated from the internal scattering component I2 and thereby reduced. For this reason, a light source 20 with an irradiation pattern having a uniform intensity distribution can be used. An irradiation pattern having a uniform intensity distribution may be formed by diffusing the light emitted from the light source 20 with a diffusion plate.
 本実施形態では、従来技術とは異なり、被検部12の照射点直下でも、内部散乱成分Iを検出することができる。被検部12を空間的に広い範囲にわたって光で照射することにより、計測解像度を高めることもできる。 In the present embodiment, unlike the prior art, the internal scattering component I2 can be detected even just below the irradiation point of the subject 12 . By irradiating the subject 12 with light over a spatially wide range, the measurement resolution can be increased.
Regarding the acquisition of the internal information of the test part 12, the configuration of the photodetector 30, the operation of emitting the first light pulse Ip1 and the second light pulse Ip2, and the calculation of the amounts of change of the HbO2 and Hb concentrations in the blood from their initial values are described in detail later.
 [光検出装置30と生体10との位置関係]
 次に、図4を参照して、本実施形態における光検出装置30と生体10との位置関係を説明する。図4は、本実施形態における光検出装置30と生体10との位置関係を示す図である。光検出装置30と生体10との計測距離Zは、光検出装置30の光検出面と、生体10の被検部12の表面との距離の平均値である。図4に示す例において、4つの計測範囲R1~R4が規定されている。計測範囲R1は、光検出装置30と生体10との距離がD11以上D12以下である範囲である。計測範囲R2は、光検出装置30と生体10との距離がD21以上D22以下である範囲である。計測範囲R3は、光検出装置30と生体10との距離がD31以上D32以下である範囲である。計測範囲R4は、光検出装置30と生体10との距離がD41以上D42以下である範囲である。計測範囲R1と計測範囲R2とは互いに部分的に重なるように設定されている。同様に、計測範囲R2と計測範囲R3とは互いに部分的に重なるように設定されている。計測範囲R3と計測範囲R4とは互いに部分的に重なるように設定されている。すなわち、D11<D21<D12<D31<D22<D41<D32<D42の関係が満たされる。第1から第4の計測範囲R1~R4を連続して互いに重ならないように設定してもよい。すなわち、D11<D12=D21<D22=D31<D32=D41<D42の関係が満たされてもよい。距離D11はゼロ以上の値であり、例えば5cm以上100cm以下であり得る。計測範囲R1~R4の各々の距離幅は、例えば3cm以上24cm以下であり得る。そのような距離幅であれば、計測範囲R1~R4を精度よく区別することができる。互いに隣接する2つの計測範囲が部分的に重なる範囲の距離幅は、例えば0cm以上8cm以下であり得る。計測範囲の数は4である必要はなく、任意の数である。
[Positional relationship between photodetector 30 and living body 10]
Next, the positional relationship between the photodetector 30 and the living body 10 in this embodiment will be described with reference to FIG. FIG. 4 is a diagram showing the positional relationship between the photodetector 30 and the living body 10 in this embodiment. The measured distance Z 0 between the photodetector 30 and the living body 10 is the average value of the distances between the photodetection surface of the photodetector 30 and the surface of the test site 12 of the living body 10 . In the example shown in FIG. 4, four measurement ranges R1-R4 are defined. The measurement range R1 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D11 and less than or equal to D12 . The measurement range R2 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D21 and less than or equal to D22 . The measurement range R3 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D31 and less than or equal to D32 . The measurement range R4 is a range in which the distance between the photodetector 30 and the living body 10 is greater than or equal to D41 and less than or equal to D42 . The measurement range R1 and the measurement range R2 are set so as to partially overlap each other. Similarly, the measurement range R2 and the measurement range R3 are set so as to partially overlap each other. The measurement range R3 and the measurement range R4 are set so as to partially overlap each other. That is, the relationship of D11 < D21 < D12 < D31 < D22 < D41 < D32 < D42 is satisfied. The first to fourth measurement ranges R1 to R4 may be set continuously so as not to overlap each other. That is, the relationship D 11 <D 12 =D 21 <D 22 =D 31 <D 32 =D 41 <D 42 may be satisfied. The distance D11 is a value greater than or equal to zero, and can be, for example, greater than or equal to 5 cm and less than or equal to 100 cm. The distance width of each of the measurement ranges R1 to R4 can be, for example, 3 cm or more and 24 cm or less. With such a distance width, the measurement ranges R1 to R4 can be accurately distinguished. The width of the range in which two adjacent measurement ranges partially overlap can be, for example, 0 cm or more and 8 cm or less. The number of measurement ranges does not have to be four, it can be any number.
Exposure periods P1 to P4 are set for the measurement ranges R1 to R4, respectively. Each exposure period can be determined, for example, using an optical model that simulates changes in human cerebral blood flow. In this optical model, the state of the human brain can be switched between an inactive state, which simulates the absorption and scattering characteristics of the skin or brain of a person at rest, and an active state, which simulates the absorption and scattering characteristics when brain activity is activated and cerebral blood flow increases. The exposure period P1 is the exposure period for which the amount of change in luminance value between the inactive state and the active state is maximized when the optical model is placed at the center of the measurement range R1, that is, at a distance of (D11+D12)/2. The same applies to the exposure periods P2 to P4. The exposure start times of the exposure periods P1 to P4 differ from one another, and the smaller the number, the earlier the exposure start time. For all of the exposure periods P1 to P4, the time difference between the exposure start time and the exposure end time is constant.
 処理装置40は、不図示の記憶装置をさらに備え、当該記憶装置は、計測範囲R1~R4にそれぞれ対応付けられた露光期間P1~P4を示すデータを記憶していてもよい。処理装置40は、記憶装置から必要に応じて当該データを取得する。 The processing device 40 may further include a storage device (not shown), and the storage device may store data indicating the exposure periods P1 to P4 associated with the measurement ranges R1 to R4, respectively. The processing device 40 acquires the data from the storage device as needed.
When the measured distance Z0 is within one of the measurement ranges R1 to R4, the processing device 40 selects, from the exposure periods P1 to P4, the exposure period corresponding to that measurement range and acquires the biological information of the test part 12. When the measured distance Z0 changes and moves from that measurement range to another measurement range, the processing device 40 selects, from the exposure periods P1 to P4, the other exposure period corresponding to the other measurement range and acquires the biological information of the test part 12.
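To make the range-based switching concrete, here is a minimal sketch with placeholder boundary values chosen only so that D11 < D21 < D12 < D31 < D22 < D41 < D32 < D42 holds; the boundary values, the hysteresis policy over the overlap regions, and the function names are illustrative assumptions, not taken from the publication.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class MeasRange:
    name: str
    d_min: float  # cm
    d_max: float  # cm

# Placeholder boundaries satisfying D11 < D21 < D12 < D31 < D22 < D41 < D32 < D42.
RANGES: List[MeasRange] = [
    MeasRange("R1", 20.0, 32.0),
    MeasRange("R2", 28.0, 44.0),
    MeasRange("R3", 40.0, 56.0),
    MeasRange("R4", 52.0, 68.0),
]

# Placeholder mapping from measurement range to exposure period identifier.
EXPOSURE_PERIODS: Dict[str, str] = {"R1": "P1", "R2": "P2", "R3": "P3", "R4": "P4"}

def select_exposure_period(z0: float, current: Optional[str] = None) -> Optional[str]:
    """Return the exposure period to use for the measured distance Z0.

    While Z0 stays inside the range associated with the current exposure period,
    that period is kept, which gives hysteresis across the overlap regions;
    otherwise the first range containing Z0 is used. Returns None if Z0 lies
    outside every measurement range.
    """
    if current is not None:
        current_range = next((r for r in RANGES if EXPOSURE_PERIODS[r.name] == current), None)
        if current_range is not None and current_range.d_min <= z0 <= current_range.d_max:
            return current
    for r in RANGES:
        if r.d_min <= z0 <= r.d_max:
            return EXPOSURE_PERIODS[r.name]
    return None

# Example: with these placeholder values, a subject drifting from 30 cm to 45 cm
# keeps P1 until 32 cm is exceeded, then switches to P2, and switches to P3 only
# after leaving R2 at 44 cm.
```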
 [処理装置40が実行する補正動作の例]
 次に、図5を参照して、光検出装置30と生体10との計測距離Zが変化する場合の、本実施形態における処理装置40が実行する補正動作の例を説明する。図5は、光検出装置30と生体10との計測距離Zが変化する場合の、本実施形態における処理装置40が実行する補正動作の例を概略的に示すフローチャートである。処理装置40は、図5に示すステップS101からS108の動作を実行する。以下の説明において、光検出装置30は、2次元的に分布する複数の画素を有するイメージセンサである。処理装置40が取得する生体情報は脳血流情報である。
[Example of Correction Operation Executed by Processing Device 40]
Next, with reference to FIG. 5, an example of correction operation performed by the processing device 40 in this embodiment when the measured distance Z0 between the photodetector 30 and the living body 10 changes will be described. FIG. 5 is a flow chart schematically showing an example of correction operation performed by the processing device 40 in this embodiment when the measured distance Z0 between the photodetector 30 and the living body 10 changes. The processing device 40 executes the operations of steps S101 to S108 shown in FIG. In the following description, the photodetector 30 is an image sensor having a plurality of pixels distributed two-dimensionally. The biological information acquired by the processing device 40 is cerebral blood flow information.
 <ステップS101>
 処理装置40は、以下のようにして、輝度画像データを取得する。処理装置40は、光源20に、生体10の被検部12を照射するための光パルスIを出射させる。処理装置40は、光検出装置30に、光パルスIか被検部12で反射されて生じる反射光パルスを、現在設定されている露光期間において検出させて輝度画像データを生成させて出力させる。反射光パルスの検出について、処理装置40は、光検出装置30に、光パルスIを出射してから第1時間が経過した時点で、反射光パルスの検出を開始させ、反射光パルスの検出を開始してから所定の露光時間が経過した時点で、反射光パルスの検出を終了させる。所定の露光時間は、現在設定されている露光期間の時間幅である。処理装置40は、光検出装置30に、当該輝度画像データを処理装置40に送らせる。
<Step S101>
The processing device 40 acquires luminance image data as follows. The processing device 40 causes the light source 20 to emit a light pulse Ip for irradiating the test part 12 of the living body 10. The processing device 40 causes the photodetector 30 to detect, during the currently set exposure period, the reflected light pulse produced when the light pulse Ip is reflected by the test part 12, and to generate and output luminance image data. Regarding the detection of the reflected light pulse, the processing device 40 causes the photodetector 30 to start detecting the reflected light pulse when the first time has elapsed after the light pulse Ip is emitted, and to end the detection of the reflected light pulse when a predetermined exposure time has elapsed after the detection of the reflected light pulse is started. The predetermined exposure time is the width of the currently set exposure period. The processing device 40 causes the photodetector 30 to send the luminance image data to the processing device 40.
 本明細書において、反射光パルスの検出の開始とは、電荷蓄積部に、反射光パルスの成分に対応する電荷の蓄積を開始させることを意味する。反射光パルスの検出の終了とは、電荷蓄積部に、反射光パルスの成分に対応する電荷の蓄積を終了させることを意味する。 In this specification, starting detection of the reflected light pulse means causing the charge accumulation unit to start accumulating charges corresponding to components of the reflected light pulse. Termination of detection of the reflected light pulse means to cause the charge storage unit to terminate the storage of the charge corresponding to the component of the reflected light pulse.
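 The following Python sketch illustrates the gated detection of step S101 with a toy timing model. The pulse width, the first time (shutter-open delay), and the exposure time used below are assumed example values; the disclosure defines these quantities but not their numerical values.

# Toy timing model of the gated exposure in step S101 (all times in nanoseconds).
C_M_PER_NS = 0.2998   # speed of light in air, metres per nanosecond

def gate_window(first_time_ns, exposure_ns):
    """Shutter opens first_time_ns after pulse emission and stays open for exposure_ns."""
    return first_time_ns, first_time_ns + exposure_ns

def reflected_pulse(distance_m, pulse_width_ns):
    """Arrival interval of the reflected pulse: round-trip delay 2*Z/c, then the pulse width."""
    delay = 2.0 * distance_m / C_M_PER_NS
    return delay, delay + pulse_width_ns

def detected_fraction(distance_m, pulse_width_ns, first_time_ns, exposure_ns):
    """Fraction of the reflected pulse that falls inside the exposure window."""
    g0, g1 = gate_window(first_time_ns, exposure_ns)
    r0, r1 = reflected_pulse(distance_m, pulse_width_ns)
    overlap = max(0.0, min(g1, r1) - max(g0, r0))
    return overlap / pulse_width_ns

# Example: 10 ns pulse, subject at 0.5 m, shutter opened 5 ns after emission for 10 ns.
print(round(detected_fraction(0.5, 10.0, 5.0, 10.0), 2))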
 <Step S102>
 The processing device 40 acquires distance data on the distance between the photodetector 30 and the living body 10 based on the luminance image data, as follows. Calibration is performed before the distance between the photodetector 30 and the living body 10 is measured. In the calibration, reflected light pulses generated by irradiating an object positioned in the measurement ranges R1 to R4 with light pulses are detected in the exposure periods P1 to P4, respectively. The object is, for example, a flat plate, and the absorption coefficient and scattering coefficient of the flat plate may be close to those of a human. The calibration yields the following equation (1) relating the distance Z from the photodetection surface of the photodetector 30 in the direction perpendicular to that surface, the pixel position (X, Y) in the luminance image, and the luminance value I at the position (X, Y).
 [Equation (1) — rendered as an image in the original publication and not reproduced here; it expresses the calibrated relationship among the distance Z, the pixel position (X, Y), and the luminance value I.]
 Data indicating the relationship between the distance Z and the luminance value I at the position (X, Y) in the luminance image are acquired while the distance Z is varied. The function represented by equation (1) is obtained by using, in addition to the acquired data, data interpolated between the data points. Interpolation and extrapolation methods, as well as various regression methods, may be applied between the data points. This data acquisition may be performed every time the cerebral blood flow of the living body 10 is measured, or may be performed once at the beginning. The function represented by equation (1) may be stored in the above-described storage device in table form. The processing device 40 can calculate the distance Z at each pixel by inputting the luminance image data acquired in step S101 into the function represented by equation (1).
 The processing device 40, for example, detects the position of the face of the living body 10 in the image and extracts the forehead from the positional relationship relative to the face. The processing device 40 calculates the average of the distances Z over the forehead as the measurement distance Z0. In this way, the processing device 40 generates data on the measurement distance Z0 based on the intensity of the reflected light pulse detected by each of the plurality of pixels.
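 The following Python sketch illustrates this calibration-based distance estimate. A single shared calibration curve is used instead of a per-pixel function for brevity, the calibration numbers are made up, and the forehead mask is assumed to come from a separate face-detection step; none of these specifics are taken from the disclosure.

import numpy as np

# Hypothetical calibration data: luminance falls off with distance (values are made up).
calib_distance_mm = np.array([300.0, 400.0, 500.0, 600.0, 700.0])
calib_luminance   = np.array([900.0, 520.0, 330.0, 230.0, 170.0])

def distance_from_luminance(luminance_image, forehead_mask):
    """Estimate the measurement distance Z0 as the mean calibrated distance over the mask.

    np.interp needs increasing x, so interpolate on the reversed luminance axis
    (luminance decreases monotonically with distance in this made-up calibration).
    """
    z = np.interp(luminance_image, calib_luminance[::-1], calib_distance_mm[::-1])
    return float(z[forehead_mask].mean())

# Example: a synthetic 4x4 luminance image with a 2x2 "forehead" region.
img = np.full((4, 4), 330.0)
forehead = np.zeros((4, 4), dtype=bool)
forehead[1:3, 1:3] = True
print(distance_from_luminance(img, forehead))   # ~500 mm for this synthetic data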
 <Step S103>
 The processing device 40 determines whether the measurement distance Z0 is within the measurement range corresponding to the currently set exposure period. In this specification, that measurement range is referred to as the "first range". If the determination is Yes, the processing device 40 executes the operation of step S104. If the determination is No, the processing device 40 executes the operation of step S105.
 <Step S104>
 If the determination in step S103 is Yes, the processing device 40 generates and outputs cerebral blood flow data indicating the amount of change in cerebral blood flow based on the luminance image data acquired in step S101. The cerebral blood flow data can also be called brain activity data relating to brain activity.
 <Step S105>
 If the determination in step S103 is No, the living body 10 has moved back and forth and the measurement distance Z0 has deviated from the first range. The time at which the measurement distance Z0 deviates from the first range is defined as the first time. The processing device 40 determines whether the amount of variation in the measurement distance Z0 is equal to or less than a predetermined threshold during a preset determination period after the first time. In other words, the processing device 40 determines whether the change over time of the measurement distance Z0 falls within a predetermined range during the determination period. The amount of variation in the distance may be, for example, a reference value determined based on at least one of the fluctuation width, the variance, and the standard deviation of the change in the distance over time within the determination period. The duration of the non-determination period from the first time until the determination starts may be, for example, 1 second or more and 10 seconds or less. The duration of the determination period may be, for example, 1 second or more and 10 seconds or less. The predetermined threshold may be, for example, 1 mm or more and 30 mm or less. If the determination is No, the processing device 40 executes the operation of step S106. The non-determination period is also treated as a No determination. If the determination is Yes, the processing device 40 executes the operation of step S107.
 <Step S106>
 If the determination in step S105 is No, the living body 10 is still moving. In that case, the processing device 40 outputs the cerebral blood flow data at the first time as provisional data. The processing device 40 does not need to newly generate and output cerebral blood flow data. The processing device 40 may instead output data indicating a missing value rather than the cerebral blood flow data at the first time.
 <Step S107>
 If the determination in step S105 is Yes, the living body 10 has finished moving. The time at which the variation in the measurement distance Z0 becomes equal to or less than the predetermined threshold is defined as the second time. The processing device 40 initializes the cerebral blood flow value and resets the initial value of the cerebral blood flow to the cerebral blood flow value at the second time.
 <Step S108>
 If the measurement distance Z0 at the second time deviates from the first range, the measurement distance Z0 at the second time is in a measurement range different from the first range. In this specification, a measurement range different from the first range is referred to as the "second range". When the measurement distance Z0 at the second time is within the second range, the processing device 40 resets the exposure period corresponding to the first range to the exposure period corresponding to the second range. In other words, the processing device 40 changes the first time in step S101, from the emission of the light pulse Ip to the start of detection of the reflected light pulse, to a second time different from the first time. When the measurement distance Z0 at the second time lies within more than one measurement range, the processing device 40 selects a measurement range based on a condition such as adopting the measurement range whose center is closer to the measurement distance Z0. In contrast, when the measurement distance Z0 at the second time is again within the first range, the processing device 40 maintains the exposure period corresponding to the first range. As described above, the processing device 40 determines whether to change the first time based on the measurement distance Z0 at the second time.
 The processing device 40 repeatedly executes the operations of steps S101 to S108 for each frame until the measurement of the cerebral blood flow is completed.
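 As a compact, self-contained sketch of the per-frame loop of FIG. 5 (steps S103 to S108), the following Python code tracks the current range, holds a provisional value while the subject is moving, and re-initializes the baseline and re-selects the range once the distance is stable. The range boundaries, thresholds, stability window, and the use of mean luminance change as a stand-in for the cerebral blood flow change are all assumptions for illustration.

RANGES = {"R1": (250.0, 450.0), "R2": (430.0, 650.0)}   # hypothetical, overlapping

def range_of(z0, current=None):
    if current and RANGES[current][0] <= z0 <= RANGES[current][1]:
        return current                                    # keep the current range if it still fits
    inside = [r for r, (lo, hi) in RANGES.items() if lo <= z0 <= hi]
    return min(inside, key=lambda r: abs(z0 - sum(RANGES[r]) / 2)) if inside else None

def run(frames, start_range="R2", stab_win=3, stab_thr=10.0):
    cur, baseline, held, z_hist, out = start_range, None, None, [], []
    for z0, lum in frames:
        lo, hi = RANGES[cur]
        if lo <= z0 <= hi and held is None:               # S103 Yes -> S104
            if baseline is None:
                baseline = lum
            out.append(lum - baseline)                     # stand-in for the blood-flow change
        else:                                              # S103 No -> S105
            if held is None:
                held = out[-1] if out else 0.0             # value at the first time
            z_hist.append(z0)
            if len(z_hist) >= stab_win and max(z_hist[-stab_win:]) - min(z_hist[-stab_win:]) <= stab_thr:
                baseline, held, z_hist = lum, None, []     # S107: re-initialize at the second time
                cur = range_of(z0, cur) or cur             # S108: re-select the measurement range
                out.append(0.0)
            else:
                out.append(held)                           # S106: hold the provisional value
    return cur, out

# Example: the subject leans forward out of R2 and settles inside R1.
frames = [(520, 100.0), (515, 101.0), (470, 102.0), (380, 90.0),
          (330, 95.0), (331, 96.0), (332, 96.5), (333, 97.0)]
print(run(frames))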
 Next, examples in which the processing device 40 executes the above correction operation will be described with reference to FIGS. 6A and 6B. FIG. 6A is a diagram for explaining an example in which the correction operation of the processing device in this embodiment is applied when the measurement distance between the photodetector 30 and the living body 10 becomes shorter. (a) of FIG. 6A shows the change of the measurement distance Z0 over time. (b) of FIG. 6A shows the relationship between the exposure period and time. (c) of FIG. 6A shows the change over time of the amount of change in cerebral blood flow when the living body 10 is at rest. The amount of change in cerebral blood flow is a value obtained by averaging the amount of change in oxygenated hemoglobin over the subject part 12. Instead of the amount of change in oxygenated hemoglobin, the sum of the amount of change in oxygenated hemoglobin and the amount of change in deoxygenated hemoglobin in the subject part 12 may be used.
 In the example shown in (a) of FIG. 6A, the initial measurement distance Z0 is in the measurement range R2. When the measurement distance Z0 becomes shorter than D21, the determination in step S103 becomes No. The time at which the measurement distance Z0 becomes shorter than D21 is the first time. While the measurement distance Z0 is changing, the output of the cerebral blood flow data at the first time in step S106 continues, as shown in (c) of FIG. 6A. When the amount of variation in the measurement distance Z0 becomes equal to or less than the threshold during the hatched determination period, the determination in step S105 becomes Yes. The time at which the amount of variation in the measurement distance Z0 becomes equal to or less than the threshold is the second time. In step S107, the cerebral blood flow value at the second time becomes the initial value of the cerebral blood flow. In step S108, as shown in (b) of FIG. 6A, the exposure period P2 is reset to the exposure period P1. After the second time, the output of the cerebral blood flow data in step S104 continues until the determination in step S103 next becomes No.
 FIG. 6B is a diagram for explaining an example in which the correction operation of the processing device in this embodiment is applied when the measurement distance between the photodetector 30 and the living body 10 becomes shorter and then longer. (a) to (c) of FIG. 6B correspond to (a) to (c) of FIG. 6A, respectively. In the example shown in (a) of FIG. 6B, as in the example shown in (a) of FIG. 6A, the initial measurement distance Z0 is in the measurement range R2, and the time at which the measurement distance Z0 becomes shorter than D12 is the first time. In the example shown in (a) of FIG. 6B, at the second time, when the amount of variation in the measurement distance Z0 becomes equal to or less than the threshold, the measurement distance Z0 has become longer than D12 again, and the measurement distance Z0 at the second time is within the measurement range R2. In step S107, the cerebral blood flow value at the second time becomes the initial value of the cerebral blood flow. In step S108, as shown in (b) of FIG. 6B, the exposure period is maintained as the exposure period P2. After the second time, the output of the cerebral blood flow data in step S104 continues until the determination in step S103 next becomes No.
 As shown in FIGS. 6A and 6B, the correction operation performed by the processing device 40 in this embodiment can suppress the influence of the body movement of the living body 10 on the cerebral blood flow data, making it possible to acquire the cerebral blood flow data of the living body 10 stably and without contact.
 [Another Example of Correction Operation Executed by Processing Device 40]
 In the example of the correction operation shown in FIG. 5, the measurement distance Z0 is calculated based on the luminance image data. If the measurement distance Z0 is calculated based on distance image data instead of the luminance image data, the measurement distance Z0 can be measured with higher accuracy. As a result, the initial value of the cerebral blood flow and the exposure period can be reset at more accurate timing.
 First, a method for generating distance image data in this embodiment will be described with reference to FIG. 7. For the measurement ranges R1 to R4, in addition to the above exposure periods P1 to P4 with which cerebral blood flow data can be measured accurately, exposure periods P1A to P4A including the rising period of the reflected light pulse and exposure periods P1B to P4B including the falling period of the reflected light pulse are set, respectively. That is, three types of exposure periods are set for each of the measurement ranges R1 to R4. Detection of the reflected light pulse with the three types of exposure periods may be performed with different pixels, with different charge accumulation units of the same pixel, or by switching the exposure period over time, or by a combination of these.
 FIG. 7 is a diagram for explaining the method for generating distance image data in this embodiment. (a) of FIG. 7 shows the change of the emitted light pulse over time. (b) of FIG. 7 shows the change of the reflected light pulse over time. (c) of FIG. 7 shows the relationship between the exposure periods P1 to P4 and time. (d) of FIG. 7 shows the relationship between the exposure periods P1A to P4A and time. (e) of FIG. 7 shows the relationship between the exposure periods P1B to P4B and time.
 As shown in (c) of FIG. 7, let S0 be the signal indicating the amount of charge accumulated during the exposure periods P1 to P4. The signal S0 contains cerebral blood flow information. The processing device 40 generates and outputs the luminance image data based on the signal S0.
 As shown in (a) and (b) of FIG. 7, let T0 be the time width of the emitted light pulse and of the reflected light pulse. As shown in (d) of FIG. 7, let tA be the time from when the light source 20 starts emitting the light pulse to when the exposure periods P1A to P4A end. As shown in (e) of FIG. 7, let tB be the time from when the light source 20 starts emitting the light pulse to when the exposure periods P1B to P4B start. As shown in (d) of FIG. 7, let SA be the signal indicating the amount of charge accumulated during the exposure periods P1A to P4A. As shown in (e) of FIG. 7, let SB be the signal indicating the amount of charge accumulated during the exposure periods P1B to P4B. The intensity of each of the signals SA and SB varies according to the distance Z. With c (≈3.0×10^8 m/s) being the speed of light in air, the distance Z can be calculated using the following equation (2).
 [Equation (2) — rendered as an image in the original publication and not reproduced here; it expresses the distance Z in terms of the signals SA and SB, the times tA and tB, the pulse width T0, and the speed of light c.]
 The processing device 40 generates and outputs the distance image data based on equation (2). Details of the method for generating distance image data based on equation (2) are disclosed in Japanese Patent Application No. 2021-012027 (filed January 28, 2021). For reference, the entire disclosure of Japanese Patent Application No. 2021-012027 is incorporated herein by reference.
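 Because equation (2) itself appears only as an image in this extraction, the following Python sketch uses a generic indirect time-of-flight relation built from the quantities defined above (T0, tA, tB, SA, SB). It is an assumed model of the general form of such a calculation, not the patent's equation (2).

# Assumed indirect-ToF model: SA is proportional to the part of the reflected pulse
# arriving before tA, and SB to the part arriving after tB, so SA + SB is constant
# and the round-trip delay can be recovered from the ratio SA / (SA + SB).

C_M_PER_NS = 0.2998   # speed of light in air, metres per nanosecond

def distance_from_gates(s_a, s_b, t_a_ns, t_b_ns, t0_ns):
    """Estimate the distance Z (metres) from the gate signals under the model above."""
    span = t_a_ns - t_b_ns + t0_ns          # constant total overlap in this model
    td = t_a_ns - span * s_a / (s_a + s_b)  # round-trip delay in ns
    return C_M_PER_NS * td / 2.0

# Example with made-up gate settings: T0 = 10 ns, tA = 8 ns, tB = 6 ns.
# A target at 0.6 m gives td ~ 4 ns, so SA ~ 4 and SB ~ 8 in arbitrary units.
print(round(distance_from_gates(4.0, 8.0, 8.0, 6.0, 10.0), 3))   # ~0.6 m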
 Next, another example of the correction operation performed by the processing device 40 in this embodiment when the measurement distance Z0 between the photodetector 30 and the living body 10 changes will be described with reference to FIG. 8. In this other example, the processing device 40 acquires distance image data in addition to the luminance image data used to acquire the cerebral blood flow data, and calculates the measurement distance Z0 based on the distance image data. Because the measurement distance Z0 can be measured accurately, the initial value of the cerebral blood flow and the exposure period can be reset at more accurate timing.
 FIG. 8 is a flowchart schematically showing another example of the correction operation performed by the processing device 40 when the measurement distance Z0 between the photodetector 30 and the living body 10 changes. The processing device 40 executes the operations of steps S201 to S212 shown in FIG. 8. The description below focuses on the points that differ from the correction operation shown in FIG. 5.
 <Step S201>
 The processing device 40 acquires luminance image data and distance image data.
 <Step S202>
 From the distance image indicated by the distance image data, the processing device 40 extracts the forehead, which is the subject part 12 of the living body 10, and converts the distance values of pixels that do not correspond to the forehead into missing values. This processing prevents the distance values of pixels that do not correspond to the forehead from affecting the calculation of the measurement distance Z0. The forehead can be extracted, for example, as follows. A distance is estimated for each pixel, and the portion of the distance image containing pixels whose estimated distance falls within a certain range is extracted as the forehead. Alternatively, the forehead is extracted by face detection processing using an image showing the appearance of the living body 10. These methods may also be combined.
 Even for a pixel within the forehead, the distance value of a pixel corresponding to the peripheral region of the forehead may have to be converted into a missing value because of movement of the living body 10 during the measurement. Accordingly, the distance values of pixels corresponding to the peripheral region of the forehead may be converted into missing values. The peripheral region of the forehead may be, for example, the region extending inward from the edge of the forehead by 0 pixels or more and 5 pixels or less.
 Converting the distance values at 0 pixels from the edge of the forehead into missing values means converting the distance values of the pixels corresponding to the edge into missing values. Converting the distance values of pixels in the region lying n pixels (n being an integer from 1 to 5) inward from the edge of the forehead into missing values means repeating the following operation n times: converting the distance values of the pixels corresponding to the edge into missing values, re-detecting the edge from the remaining forehead region, and converting the distance values of the pixels corresponding to the re-detected edge into missing values.
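 A minimal Python sketch of this masking step is shown below. It implements the peeling of the forehead edge as a morphological erosion; the forehead mask is assumed to come from a separate face-detection or distance-threshold step, and n = 1 in the example is an arbitrary value within the stated 0 to 5 pixel range.

import numpy as np
from scipy.ndimage import binary_erosion

def mask_distance_image(distance_img, forehead_mask, n_edge_pixels=2):
    """Return a copy of distance_img with non-forehead pixels and an
    n_edge_pixels-wide inner border of the forehead replaced by NaN ("missing")."""
    eroded = forehead_mask.copy()
    for _ in range(n_edge_pixels):
        eroded = binary_erosion(eroded)        # peel one edge layer per iteration
    out = distance_img.astype(float).copy()
    out[~eroded] = np.nan
    return out

# Example: a 6x6 distance image with a 4x4 forehead region.
dist = np.full((6, 6), 500.0)
mask = np.zeros((6, 6), dtype=bool)
mask[1:5, 1:5] = True
masked = mask_distance_image(dist, mask, n_edge_pixels=1)
print(np.nanmean(masked))   # mean distance over the remaining central pixels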
 <Step S203>
 The processing device 40 determines whether the number of pixels having missing values is equal to or less than a predetermined threshold. The threshold can be determined, for example, according to the size of the image of the living body 10 within the whole image. If the determination is No, the processing device 40 executes the operation of step S204. If the determination is Yes, the processing device 40 executes the operation of step S206.
 <Steps S204 and S205>
 If the determination in step S203 is No, that is, if the number of pixels having missing values exceeds the threshold, the measurement distance Z0 is not within the measurement range corresponding to the currently set three types of exposure periods. If the measurement distance Z0 is longer than the upper limit of that measurement range, the reflected light pulse reaches the photodetector 30 later, so the amount of light detected during the exposure periods P1A to P4A relatively decreases and the amount of light detected during the exposure periods P1B to P4B relatively increases. Conversely, if the measurement distance Z0 is shorter than the lower limit of that measurement range, the reflected light pulse reaches the photodetector 30 earlier, so the amount of light detected during the exposure periods P1A to P4A relatively increases and the amount of light detected during the exposure periods P1B to P4B relatively decreases. Such an increase or decrease in the detected light amount can reduce the accuracy of the distance data.
 Therefore, the processing device 40 resets the initial value of the cerebral blood flow in step S204 and resets the three types of exposure periods in step S205. The processing device 40 may, for example, search for the appropriate three types of exposure periods by switching through the measurement ranges in the order R1 to R4. Alternatively, if the measurement distance Z0 was measured accurately in the immediately preceding frame, the measurement ranges R1 to R4 may be switched in order starting from the measurement range closest to the range corresponding to that measurement distance Z0. Such switching makes it possible to efficiently search for the three types of exposure periods corresponding to an appropriate measurement range.
 Each time the exposure periods are switched, the processing device 40 executes the operations of steps S201 to S203. If the three switched exposure periods are appropriate, the determination in step S203 becomes Yes; if they are not appropriate, the determination in step S203 becomes No.
 <Step S206>
 Even if the determination in step S203 is Yes, the measurement distance Z0 may not be within the measurement range corresponding to the currently set three types of exposure periods. For example, even if the measurement distance Z0 is within the measurement range R1, the three types of exposure periods are reset when the measurement distance Z0 is equal to or greater than D21 and equal to or less than D12.
 The processing device 40 acquires distance data from the distance image data. The processing device 40 calculates the average of the distances over the forehead as the measurement distance Z0. In this way, the processing device 40 generates data on the measurement distance Z0 based on the intensity of the reflected light pulse detected by each of the plurality of pixels.
 <Steps S207 to S212>
 The operations of steps S207 to S212 are the same as the operations of steps S103 to S108 shown in FIG. 5.
 The processing device 40 repeatedly executes the operations of steps S201 to S212 for each frame until the measurement of the cerebral blood flow is completed. As a result, the accuracy of the measurement distance Z0 can be improved, and the cerebral blood flow data of the living body 10 can be acquired contactlessly and more stably.
 In practice, the living body 10 moves in the front-rear direction and can also move horizontally or tilt. An example of the correction operation performed by the processing device 40 according to this embodiment when the living body 10 makes such movements is described below.
 In step S206, the processing device 40 expresses the distance indicated by the distance data as the sum of the initial distance before the body movement of the living body 10 and the deviation ΔZ from it, as follows. The processing device 40 processes the distance images before and after the body movement of the living body 10 with a known ICP algorithm, thereby aligning the distance images before and after the body movement with a rotation matrix and a translation vector. The processing device 40 sets the average of the distances over the forehead before the body movement of the living body 10 as the initial distance, and sets the front-rear component of the translation vector as the distance ΔZ. The ICP algorithm yields the rotation matrix and the translation vector that describe the body movement of the living body 10.
 The processing device 40 may make the determinations in steps S207 and S209 based not only on the distance but also on the amount of horizontal body movement and/or the amount of rotation of the face. The amount of horizontal body movement and the amount of rotation of the face can be obtained from the translation vector and the rotation matrix, respectively. In this way, fluctuations in the cerebral blood flow data caused by body movements of the living body 10 other than front-rear movement can be suppressed, and the cerebral blood flow data can be acquired stably.
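 The following Python sketch shows a generic, textbook point-to-point ICP (nearest-neighbour correspondences plus a Kabsch rigid fit), applied to forehead point clouds derived from the distance images before and after the movement, with ΔZ read off as the front-rear component of the recovered translation. It is not the specific implementation of this disclosure; the synthetic patch geometry and the assumed camera axis convention (z = front-rear) are illustrative.

import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Kabsch: rotation R and translation t minimizing ||R @ p + t - q|| over point pairs."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, iterations=20):
    """Align src onto dst; return the total rotation matrix R and translation vector t."""
    tree = cKDTree(dst)
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iterations):
        _, idx = tree.query(cur)                    # nearest-neighbour correspondences
        R, t = best_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Example: a gently tilted synthetic "forehead" patch that moves 3 mm toward the camera.
xs, ys = np.meshgrid(np.arange(0.0, 61.0, 4.0), np.arange(0.0, 41.0, 4.0))
zs = 500.0 + 0.2 * xs + 0.1 * ys
before = np.column_stack([xs.ravel(), ys.ravel(), zs.ravel()])
after = before + np.array([0.0, 0.0, -3.0])
R, t = icp(before, after)
print(np.round(t, 2))    # translation ~ [0, 0, -3]; its z component corresponds to ΔZ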
 Note that the photodetection system 100 according to this embodiment may further include, in addition to the photodetector 30, another photodetection device for measuring the distance between the photodetector 30 and the living body 10. The other photodetection device can be, for example, an RGB camera, an omnidirectional camera, or a ToF camera. With an RGB camera, the distance can be calculated from the magnification or reduction ratio of the image of the living body 10 in the acquired image. With an omnidirectional camera, the distance can be calculated by, for example, mounting such a camera directly above each of the photodetector 30 and the living body 10. With a ToF camera, the distance can be calculated from the distance measurement result. The other photodetection device makes it possible to measure the body movement of the living body 10 from a plurality of viewpoints, so that the distance between the photodetector 30 and the living body 10 can be measured more accurately.
 Whether the correction operation of the processing device 40 according to this embodiment is being executed in a given photodetection system can be determined by examining the change over time of a voltage output by the processing device. The output voltage is zero during the non-exposure period and non-zero during the exposure period; that is, a voltage pulse corresponding to the exposure period appears in the change of the output voltage over time. When the living body 10 moves as shown in (a) of FIG. 6A and the exposure period is reset, the voltage pulse shifts along the time axis. When the living body 10 moves as shown in (a) of FIG. 6B, or when the living body 10 does not move, the exposure period is not reset, so the voltage pulse does not shift. Whether the voltage pulse shifts thus depends on the body movement of the living body 10.
 Matters relating to the acquisition of internal information of the subject part 12 are described below: the configuration of the photodetector 30, the operation of emitting the first light pulse Ip1 and the second light pulse Ip2, the operation of the processing device 40, and the calculation of the amounts of change from their initial values of the HbO2 and Hb concentrations in blood.
 [Configuration of Photodetector 30]
 An example of the configuration of the photodetector 30 will be described with reference to FIG. 9. FIG. 9 is a diagram showing an example of the configuration of the photodetector 30. In FIG. 9, the region surrounded by the two-dot chain line frame corresponds to one pixel 201. The pixel 201 includes one photodiode, which is not shown. Although FIG. 9 shows eight pixels arranged in two rows and four columns, a larger number of pixels may actually be arranged. Each pixel 201 includes a first floating diffusion layer 204 and a second floating diffusion layer 206. Here, it is assumed that the wavelength of the first light pulse Ip1 is 650 nm or more and shorter than 805 nm, and that the wavelength of the second light pulse Ip2 is longer than 805 nm and 950 nm or less. The first floating diffusion layer 204 accumulates the charge generated by receiving the first reflected light pulse originating from the first light pulse Ip1. The second floating diffusion layer 206 accumulates the charge generated by receiving the second reflected light pulse originating from the second light pulse Ip2. The signals accumulated in the first floating diffusion layer 204 and the second floating diffusion layer 206 are handled as if they were the signals of two pixels of a general CMOS image sensor and are output from the photodetector 30.
 Each pixel 201 has four signal detection circuits. Each signal detection circuit includes a source follower transistor 309, a row selection transistor 308, and a reset transistor 310. Each transistor is, for example, a field-effect transistor formed on a semiconductor substrate, but is not limited to this. As illustrated, one of the input terminal and the output terminal of the source follower transistor 309 is connected to one of the input terminal and the output terminal of the row selection transistor 308. The former terminal of the source follower transistor 309 is typically the source, and the former terminal of the row selection transistor 308 is typically the drain. The gate, which is the control terminal of the source follower transistor 309, is connected to the photodiode. The signal charge of holes or electrons generated by the photodiode is accumulated in a floating diffusion layer, which is the charge accumulation unit between the photodiode and the source follower transistor 309.
 Although not shown in FIG. 9, the first floating diffusion layer 204 and the second floating diffusion layer 206 are connected to the photodiode. A switch may be provided between the photodiode and each of the first floating diffusion layer 204 and the second floating diffusion layer 206. This switch switches the conduction state between the photodiode and each of the first floating diffusion layer 204 and the second floating diffusion layer 206 in response to a signal accumulation pulse from the processing device 40. This controls the start and stop of signal charge accumulation in each of the first floating diffusion layer 204 and the second floating diffusion layer 206. The electronic shutter in this embodiment has such a mechanism for exposure control.
 The signal charges accumulated in the first floating diffusion layer 204 and the second floating diffusion layer 206 are read out when the row selection circuit 302 turns on the gate of the row selection transistor 308. At this time, the current flowing from the source follower power supply 305 into the source follower transistor 309 and the source follower load 306 is amplified according to the signal potentials of the first floating diffusion layer 204 and the second floating diffusion layer 206. The analog signal based on this current, read out via the vertical signal line 304, is converted into digital signal data by an analog-to-digital (AD) conversion circuit 307 connected to each column. This digital signal data is read out column by column by the column selection circuit 303 and output from the photodetector 30. After reading out one row, the row selection circuit 302 and the column selection circuit 303 read out the next row, and in the same manner read out the signal charge information of the floating diffusion layers of all rows. After all the signal charges have been read out, the processing device 40 resets all the floating diffusion layers by turning on the gates of the reset transistors 310. This completes the imaging of one frame. By repeating high-speed imaging of frames in the same manner, the photodetector 30 completes the imaging of a series of frames.
 Although a CMOS-type photodetector 30 has been described in this embodiment, the photodetector 30 may be another type of imaging element. The photodetector 30 may be, for example, a CCD type, a single-photon counting element, or an amplified image sensor such as an EMCCD or an ICCD.
 [Operation of Emitting First Light Pulse Ip1 and Second Light Pulse Ip2]
 Next, the operation of emitting the first light pulse Ip1 and the second light pulse Ip2 will be described with reference to FIGS. 10A and 10B. FIG. 10A is a diagram showing an example of the operation of emitting the first light pulse Ip1 and the second light pulse Ip2. As shown in FIG. 10A, the emission of the first light pulse Ip1 and the emission of the second light pulse Ip2 may be switched alternately a plurality of times within one frame. As a result, the time difference between the acquisition timings of the detection images at the two wavelengths can be reduced, and imaging with the first light pulse Ip1 and the second light pulse Ip2 is possible substantially simultaneously even when the subject part 12 is moving.
 FIG. 10B is a diagram showing another example of the operation of emitting the first light pulse Ip1 and the second light pulse Ip2. As shown in FIG. 10B, the emission of the first light pulse Ip1 and the emission of the second light pulse Ip2 may be switched frame by frame. As a result, the detection of the first reflected light pulse originating from the first light pulse Ip1 and the detection of the second reflected light pulse originating from the second light pulse Ip2 can be switched frame by frame. In that case, each pixel 201 may include a single charge accumulation unit. With such a configuration, the number of charge accumulation units in each pixel 201 can be reduced, so the size of each pixel 201 can be increased and the sensitivity can be improved.
 [Operation of Processing Device 40]
 Next, an example of the operation of the processing device 40 with respect to the light source 20 and the photodetector 30 will be described with reference to FIG. 11. FIG. 11 is a flowchart outlining the operation of the processing device 40 with respect to the light source 20 and the photodetector 30. By executing the operations schematically shown in FIG. 11, the processing device 40 causes the photodetector 30 to detect at least part of the component of the falling period of each of the first and second reflected light pulses.
 <Step S301>
 In step S301, the processing device 40 causes the first light source 20a to emit the first light pulse Ip1 for a predetermined time. At this time, the electronic shutter of the photodetector 30 is in a state in which exposure is stopped. The processing device 40 keeps the electronic shutter from exposing until the period in which the surface reflection component I1 of the first reflected light pulse reaches the photodetector 30 has ended.
 <Step S302>
 In step S302, the processing device 40 causes the electronic shutter to start exposure at the timing when the internal scattering component I2 of the first reflected light pulse reaches the photodetector 30.
 <Step S303>
 In step S303, the processing device 40 causes the electronic shutter to stop exposure after a predetermined time has elapsed. Through steps S302 and S303, signal charge is accumulated in the first floating diffusion layer 204 shown in FIG. 9. This signal charge is referred to as the "first signal charge".
 <Step S304>
 In step S304, the processing device 40 causes the second light source 20b to emit the second light pulse Ip2 for a predetermined time. At this time, the electronic shutter of the photodetector 30 is in a state in which exposure is stopped. The processing device 40 keeps the electronic shutter from exposing until the period in which the surface reflection component I1 of the second reflected light pulse reaches the photodetector 30 has ended.
 <Step S305>
 In step S305, the processing device 40 causes the electronic shutter to start exposure at the timing when the internal scattering component I2 of the second reflected light pulse reaches the photodetector 30.
 <Step S306>
 In step S306, the processing device 40 causes the electronic shutter to stop exposure after a predetermined time has elapsed. Through steps S305 and S306, signal charge is accumulated in the second floating diffusion layer 206 shown in FIG. 9. This signal charge is referred to as the "second signal charge".
 <Step S307>
 In step S307, the processing device 40 determines whether the number of times the above signal accumulation has been performed has reached a predetermined number. If the determination in step S307 is No, the processing device 40 repeats steps S301 to S306 until the determination becomes Yes. If the determination in step S307 is Yes, the processing device 40 executes the operation of step S308.
 <Step S308>
 In step S308, the processing device 40 causes the photodetector 30 to generate and output a first signal and a second signal based on the first signal charge and the second signal charge, respectively. The first signal and the second signal contain internal information of the subject part 12.
 The operations shown in FIG. 11 can be summarized as follows. The processing device 40 executes a first operation of causing the first light source 20a to emit the first light pulse Ip1 and causing the photodetector 30 to detect at least part of the component of the falling period of the first reflected light pulse. The processing device 40 executes a second operation of causing the second light source 20b to emit the second light pulse Ip2 and causing the photodetector 30 to detect at least part of the component of the falling period of the second reflected light pulse. The processing device 40 repeats a series of operations including the first operation and the second operation a predetermined number of times. Alternatively, the processing device 40 may repeat the first operation a predetermined number of times and then repeat the second operation a predetermined number of times. The first operation and the second operation may also be interchanged.
 The operations shown in FIG. 11 make it possible to detect the internal scattering component I2 with high sensitivity. When the head of the living body 10 is irradiated with light to acquire internal information such as cerebral blood flow, the attenuation of light inside the head is very large; for example, the emitted light can be attenuated to about one millionth of the incident light. For this reason, the amount of light from a single pulse may be insufficient to detect the internal scattering component I2. The amount of light is particularly weak for irradiation at class 1 of the laser safety standards. In this case, the light source 20 emits the light pulse a plurality of times and the photodetector 30 is accordingly exposed a plurality of times by the electronic shutter, so that the detection signals are integrated and the sensitivity is improved. Note that multiple light emissions and exposures are not essential and are performed as needed.
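 The following Python sketch illustrates only the control flow just described: alternating the two light pulses and integrating the gated charge over many repetitions before readout. The per-pulse "charge" function is a toy stand-in for the detector, and the pulse count and signal levels are assumed values, not figures from this disclosure.

import random

def detect_falling_edge_charge(wavelength_nm):
    # Stand-in for one gated exposure of the reflected pulse's falling edge:
    # a tiny mean signal plus shot-noise-like fluctuation (arbitrary units).
    return random.gauss(mu=0.002 if wavelength_nm < 805 else 0.0015, sigma=0.001)

def accumulate_frame(num_repetitions=1000):
    """Repeat the first operation (750 nm) and the second operation (850 nm)
    num_repetitions times and return the two integrated signals."""
    first_signal, second_signal = 0.0, 0.0
    for _ in range(num_repetitions):
        first_signal += detect_falling_edge_charge(750)    # first operation
        second_signal += detect_falling_edge_charge(850)   # second operation
    return first_signal, second_signal

random.seed(0)
print(accumulate_frame())   # integrated signals grow roughly linearly with the repetitions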
 Furthermore, in the above example, by causing the photodetector 30 to detect at least part of the component of the rising period of each of the first and second reflected light pulses, the surface reflection component I1 of each of the first and second reflected light pulses can be detected, and surface information such as the blood flow of the face and scalp can be acquired. The first floating diffusion layer 204 and the second floating diffusion layer 206 included in each pixel 201 shown in FIG. 9 can accumulate the charge generated by receiving at least part of the component of the rising period of the first and second reflected light pulses, respectively.
 Alternatively, two pixels 201 adjacent to each other in the row direction in FIG. 9 may be treated as one pixel. For example, the first floating diffusion layer 204 and the second floating diffusion layer 206 included in one of the two pixels 201 can accumulate the charge generated by receiving at least part of the component of the falling period of the first and second reflected light pulses, respectively. The first floating diffusion layer 204 and the second floating diffusion layer 206 included in the other pixel 201 can accumulate the charge generated by receiving at least part of the component of the rising period of the first and second reflected light pulses, respectively. With such a configuration, not only the internal information but also the surface information of the living body 10 can be acquired.
 [Calculation of Changes from Initial Values of HbO2 and Hb Concentrations in Blood]
 When the first wavelength of the first light pulse Ip1 is 650 nm or more and shorter than 805 nm and the second wavelength of the second light pulse Ip2 is longer than 805 nm and 950 nm or less, the amounts of change from the initial values of the HbO2 and Hb concentrations in blood can be obtained by solving predetermined simultaneous equations using the first signal and the second signal. The following equations (3) and (4) represent an example of such simultaneous equations.
 [Equation (3) — rendered as an image in the original publication and not reproduced here.]
 [Equation (4) — rendered as an image in the original publication and not reproduced here.]
 ΔHbO2 and ΔHb represent the amounts of change from the initial values of the HbO2 and Hb concentrations in blood, respectively. ε750_OXY and ε750_deOXY represent the molar absorption coefficients of HbO2 and Hb at a wavelength of 750 nm, respectively. ε850_OXY and ε850_deOXY represent the molar absorption coefficients of HbO2 and Hb at a wavelength of 850 nm, respectively. I750_ini and I750_now represent the detected intensities at a wavelength of 750 nm at a reference time (initial time) and at a given time, respectively; for example, they represent the detected intensities in the non-activated state and the activated state of the brain. I850_ini and I850_now represent the detected intensities at a wavelength of 850 nm at a reference time (initial time) and at a given time, respectively; for example, they represent the detected intensities in the non-activated state and the activated state of the brain.
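 Since equations (3) and (4) appear only as images in this extraction, the following Python sketch assumes the standard two-wavelength modified Beer-Lambert formulation: the change in absorbance ln(I_ini / I_now) at each wavelength is the weighted sum of the concentration changes, with the (unknown) effective path length folded into the units of the result. The coefficient values are placeholders, not tabulated molar absorption coefficients.

import numpy as np

# Placeholder molar absorption coefficients [arbitrary units]:
# rows = wavelengths (750 nm, 850 nm), columns = (HbO2, Hb).
EPSILON = np.array([[0.35, 0.60],    # 750 nm: HbO2 absorbs less than Hb
                    [0.55, 0.30]])   # 850 nm: HbO2 absorbs more than Hb

def delta_hb(i750_ini, i750_now, i850_ini, i850_now):
    """Return (dHbO2, dHb) relative to the reference time, in path-length-scaled units."""
    delta_absorbance = np.array([np.log(i750_ini / i750_now),
                                 np.log(i850_ini / i850_now)])
    return np.linalg.solve(EPSILON, delta_absorbance)

# Example: intensity at 850 nm drops while 750 nm rises slightly, a pattern
# consistent with an increase in HbO2 and a decrease in Hb.
print(delta_hb(100.0, 102.0, 100.0, 96.0))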
 (Other 1)
 The present disclosure is not limited to the embodiments described above. Forms obtained by applying various modifications conceivable to those skilled in the art to the present embodiment, and forms constructed by combining components of different embodiments, are also included within the scope of the present disclosure as long as they do not depart from the spirit of the present disclosure.
 (Other 2)
 A modification of the embodiment of the present disclosure will be described with reference to FIGS. 12, 13A, 13B, and 14. FIG. 12 is a flowchart for explaining this modification. FIG. 13A and FIG. 13B are timing charts for explaining this modification. FIG. 14 is a table for explaining this modification.
 This modification may be as described below.
 A method performed by one or more processors configured to execute a plurality of instructions stored in one or more memories, the method comprising:
 (a) determining a distance d(i) between a device and a living body based on one or more images of the living body including a subject part captured at a time t(i), whereby distances d(k), ..., d(k+n0) are determined, where i is a variable taking integer values of 1 or more, k is an integer of 1 or more, and n0 is a predetermined integer of 1 or more;
 (b-1) performing a first process when the distances d(k), ..., d(k+n0) satisfy a predetermined condition; and
 (b-2) when the distances d(k), ..., d(k+n0) do not satisfy the predetermined condition, not performing the first process and determining a distance d(k+n0+1) based on one or more images of the living body captured at a time t(k+n0+1),
 wherein the predetermined condition is that {d(k+n0)max − d(k+n0)min}, which is the difference Δd(k+n0) between the maximum value d(k+n0)max of the distances d(k), ..., d(k+n0) and the minimum value d(k+n0)min of the distances d(k), ..., d(k+n0), is equal to or less than a predetermined value, and
 the first process includes (c-1) determining p(k+n0) based on the distances d(k), ..., d(k+n0), and (c-2) causing a detector included in the device to detect light r(k+n0) from the subject part of the living body from a time {t(k+n0)+t0+p(k+n0)}, where the time {t(k+n0)+t0} is the time at which a light source included in the device emits light o(k+n0) toward the subject part.
 The first process may include generating information Inf(k+n0) regarding the blood of the living body based on the light r(k+n0). The light r(k+n0) may be light from the subject part resulting from the light o(k+n0).
 Item (a) above corresponds to S1002 in FIG. 12.
 Item (b-1) above corresponds to Yes in S1003 in FIG. 12.
 The first process described above corresponds to S1004 and S1005 in FIG. 12.
 Item (b-2) above corresponds to No in S1003 in FIG. 12.
 Item (c-1) above corresponds to S1004 in FIG. 12.
 Item (c-2) above corresponds to S1005 in FIG. 12.
 The details of the above method are described below with reference to FIG. 12.
 The photodetection system 100 includes a processor (not shown) and a memory 46. The processor may be one or more processors, and the memory 46 may be one or more memories.
 The photodetection system 100 may further include a first camera (not shown) and a second camera (not shown).
 The memory 46 stores a plurality of instructions, which are executed by the processor. The plurality of instructions include the processes shown in S1001 to S1006 of FIG. 12.
 The photodetection system 100 accepts an instruction to start processing. In response to this instruction, the processor resets a timer of the photodetection system 100 to 0, starts the timer, and executes the processes shown in S1001 to S1006 of FIG. 12. The times referred to in this modification may be the times indicated by this timer.
 (S1001) The processor sets a variable i to an initial value of 1.
 (S1002) At time t(i), the processor causes the first camera to image the living body 10 including the subject part 12. The first camera thereby generates an image IL(i), which is stored in the memory 46.
 At time t(i), the processor also causes the second camera to image the living body 10 including the subject part 12. The second camera thereby generates an image IR(i), which is stored in the memory 46.
 The interval at which the first camera images the living body 10 including the subject part 12 may be constant, and likewise for the second camera.
 Based on the images IL(i) and IR(i) stored in the memory 46, the processor calculates the distance d(i) between a point included in the living body 10 and a point included in the photodetection system 100. The distance d(i) may be referred to as the distance between the photodetection system 100 and the living body 10.
 The processor stores the calculated distance d(i) in the memory 46. The point included in the living body 10 may be a predetermined point included in the subject part 12, and the point included in the photodetection system 100 may be a predetermined point included in the photodetector 30. The distance is determined based on the three-dimensional coordinates (xi, yi, zi) of the point included in the living body 10. These coordinates may be determined by providing a stereo camera system and having the processor apply a distance measurement technique to the point included in the living body 10; the stereo camera system includes the first and second cameras described above. Alternatively, the distance may be determined using a single-camera ranging technique. By repeating the processes of S1002 and S1006, the distances d(1), d(2), ..., d(k), ..., d(k+n0) are determined. The distances d(k), ..., d(k+n0) are shown in FIGS. 13A and 13B. A sketch of this distance calculation is given below.
 Here, i is a variable taking integer values of 1 or more, k is an integer of 1 or more, and n0 is a predetermined integer of 1 or more.
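 The following is a minimal sketch of one way S1002 could compute d(i) from a rectified stereo pair. The triangulation details, the calibration parameters (focal_px, baseline_m, cx, cy), and the reference point on the photodetector are assumptions made for illustration; the text only states that d(i) is derived from the three-dimensional coordinates (xi, yi, zi) obtained by distance measurement.

    import math

    def distance_from_stereo(x_left_px, x_right_px, y_px, focal_px, baseline_m,
                             cx, cy, sensor_point=(0.0, 0.0, 0.0)):
        # Triangulate the 3-D position (xi, yi, zi) of a predetermined point on
        # the subject from a rectified stereo pair, then take the Euclidean
        # distance d(i) to a predetermined point on the photodetector.
        disparity = x_left_px - x_right_px
        if disparity <= 0:
            raise ValueError("point must have positive disparity in a rectified pair")
        z = focal_px * baseline_m / disparity      # depth from the cameras
        x = (x_left_px - cx) * z / focal_px        # lateral offset
        y = (y_px - cy) * z / focal_px             # vertical offset
        sx, sy, sz = sensor_point
        return math.sqrt((x - sx) ** 2 + (y - sy) ** 2 + (z - sz) ** 2)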
 (S1003) The processor determines d(k+n0)max, the maximum of the distances d(k), ..., d(k+n0), and d(k+n0)min, the minimum of those distances. The processor then calculates their difference Δd(k+n0) = d(k+n0)max - d(k+n0)min and determines whether Δd(k+n0) is equal to or less than a predetermined value d0. If Δd(k+n0) ≤ d0, the processor executes the processes shown in S1004 and S1005. If Δd(k+n0) > d0, the processor executes the process shown in S1006 without executing S1004 and S1005, and then, in S1002, determines the distance d(k+n0+1) based on one or more images of the living body captured at time t(k+n0+1).
 (S1004) The processor calculates the average of the distances d(k), ..., d(k+n0), that is, d(k+n0)avg = (d(k) + ... + d(k+n0)) / (n0 + 1).
 The processor then refers to the table shown in FIG. 14, recorded in the memory 46, to determine the mask time p(k+n0) corresponding to d(k+n0)avg. For example, when L1 ≤ d(k+n0)avg < L2, the mask time p(k+n0) is p2. In FIG. 14, p1 < p2 < ... < pα may hold. A sketch of this lookup follows.
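 The following is a minimal sketch of the S1004 lookup under the convention stated above (a value below L1 maps to p1, a value in [L1, L2) maps to p2, and so on). The boundary values and mask times are hypothetical placeholders; FIG. 14 itself is not reproduced here.

    # Hypothetical boundaries L1 < L2 < ... and mask times p1 < p2 < ... standing
    # in for the table of FIG. 14; the actual values are device-specific.
    BOUNDARIES_M = [0.20, 0.30, 0.40, 0.50]                    # L1, L2, L3, L4
    MASK_TIMES_S = [1.0e-9, 1.5e-9, 2.0e-9, 2.5e-9, 3.0e-9]    # p1 ... p5

    def mask_time(d_avg):
        # Return p(k+n0) for the averaged distance d(k+n0)avg.
        for idx, boundary in enumerate(BOUNDARIES_M):
            if d_avg < boundary:
                return MASK_TIMES_S[idx]
        return MASK_TIMES_S[-1]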
 (S1005) The processor causes the light source 20 to emit a light pulse Ip toward the subject part 12 of the living body 10 at time {t(k+n0)+t0}; that is, it causes the light source 20 to emit the light o(k+n0) toward the subject part 12 at time {t(k+n0)+t0}. The processor causes the photodetector 30 to start detecting the light r(k+n0) from the subject part 12, resulting from the light o(k+n0), at time {t(k+n0)+t0+p(k+n0)}, and to end the detection at time {t(k+n0)+t0+p(k+n0)+(predetermined exposure period)}.
 (S1006) The processor increments the variable i by 1 and executes the process shown in S1002. The steps S1001 to S1006 can be combined as in the sketch below.
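 Putting the steps together, the following is a minimal sketch of the S1001 to S1006 loop. The driver objects (camera1, camera2, light_source, photodetector), their method names, and the use of a sliding window of the latest n0+1 distances are assumptions made for illustration; estimate_distance and mask_time stand in for the stereo calculation and the FIG. 14 lookup sketched above.

    def run_measurement_loop(camera1, camera2, light_source, photodetector,
                             estimate_distance, mask_time, n0, d0, t0,
                             exposure, should_stop):
        distances = []                      # d(1), d(2), ...
        i = 1                               # S1001
        while not should_stop():
            img_l = camera1.capture()       # S1002: image IL(i) at time t(i)
            img_r = camera2.capture()       #        image IR(i) at time t(i)
            distances.append(estimate_distance(img_l, img_r))
            if len(distances) >= n0 + 1:
                window = distances[-(n0 + 1):]           # d(k), ..., d(k+n0)
                if max(window) - min(window) <= d0:      # S1003
                    d_avg = sum(window) / len(window)    # S1004
                    p = mask_time(d_avg)
                    light_source.emit_pulse(delay=t0)                    # S1005
                    photodetector.detect(start=t0 + p, duration=exposure)
            i += 1                          # S1006, then back to S1002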
 Next, FIG. 13A will be described.
 FIG. 13A outlines the processing related to the Yes branch of S1003.
 The processor calculates the distance d(k) based on the images IL(k) and IR(k) captured at time t(k), and so on, up to the distance d(k+n0), which is calculated based on the images IL(k+n0) and IR(k+n0) captured at time t(k+n0).
 The processor calculates Δd(k+n0) = d(k+n0)max - d(k+n0)min, the difference between the maximum d(k+n0)max and the minimum d(k+n0)min of the distances d(k), ..., d(k+n0).
 When Δd(k+n0) ≤ d0 (Yes in S1003 of FIG. 12), the processor causes the light source 20 to emit the light o(k+n0) toward the subject part 12 of the living body 10 from time {t(k+n0)+t0}.
 When Δd(k+n0) ≤ d0 (Yes in S1003 of FIG. 12), the processor causes the photodetector 30 to detect the light r(k+n0) from the subject part 12 of the living body 10 from time {t(k+n0)+t0+p(k+n0)}.
 This completes the description of FIG. 13A.
 Next, FIG. 13B will be described.
 FIG. 13B outlines the processing related to the No branch of S1003.
 The processor calculates the distance d(k) based on the images IL(k) and IR(k) captured at time t(k), and so on, up to the distance d(k+n0), which is calculated based on the images IL(k+n0) and IR(k+n0) captured at time t(k+n0).
 The processor calculates Δd(k+n0) = d(k+n0)max - d(k+n0)min, the difference between the maximum d(k+n0)max and the minimum d(k+n0)min of the distances d(k), ..., d(k+n0).
 When Δd(k+n0) > d0 (No in S1003 of FIG. 12), the processor does not cause the light source 20 to emit the light o(k+n0) toward the subject part 12 of the living body 10, and does not cause the photodetector 30 to detect the light r(k+n0) from the subject part 12.
 This completes the description of FIG. 13B.
 The processor may generate information Inf(k+n0) regarding the blood of the living body based on the light r(k+n0) detected by the photodetector 30 from time {t(k+n0)+t0+p(k+n0)} to time {t(k+n0)+t0+p(k+n0)+(predetermined exposure period)}.
 The information regarding the blood of the living body may be the HbO2 concentration and/or the Hb concentration in blood generated using equations (3) and (4).
 When the photodetection system 100 receives an instruction to end processing, the processor may end the processing shown in FIG. 12.
 (S1003) may instead be the following process. The processor calculates σ(k+n0), the deviation of the distances d(k), ..., d(k+n0), and determines whether this deviation is equal to or less than a predetermined value σ0. If σ(k+n0) ≤ σ0, the processor executes the processes shown in S1004 and S1005. If σ(k+n0) > σ0, the processor executes the process shown in S1006 without executing S1004 and S1005. A sketch of this variant follows.
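 The following is a minimal sketch of this alternative check. The text only says "deviation"; the population standard deviation is used here as one possible interpretation, and the function name is an assumption.

    import statistics

    def window_is_stable(window, sigma0):
        # Alternative S1003 check: treat the window d(k), ..., d(k+n0) as stable
        # when its deviation sigma(k+n0) is at most sigma0.
        return statistics.pstdev(window) <= sigma0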
 t0 may be 0.
 When t0 = 0, in FIG. 13A, (i) the processor causes the first camera to image the living body 10 including the subject part 12 at time t(k+n0), (ii) the processor causes the second camera to image the living body 10 including the subject part 12 at time t(k+n0), (iii) the processor causes the light source 20 to emit the light o(k+n0) toward the subject part 12 of the living body 10 from time t(k+n0), and (iv) the processor causes the photodetector 30 to detect the light r(k+n0) from the subject part 12 of the living body 10 from time {t(k+n0)+p(k+n0)}.
 When t0 = 0, in FIG. 13B, (i) the processor causes the first camera to image the living body 10 including the subject part 12 at time t(k+n0), (ii) the processor causes the second camera to image the living body 10 including the subject part 12 at time t(k+n0), (iii) the processor does not cause the light source 20 to emit the light o(k+n0) toward the subject part 12 from time t(k+n0), and (iv) the processor does not cause the photodetector 30 to detect the light r(k+n0) from the subject part 12 from time {t(k+n0)+p(k+n0)}.
 In this modification, the distance d(i) is obtained using the first camera and the second camera. Alternatively, the distance d(i) may be obtained using the light source 20 and the photodetector 30; in that case, t0 = 0.
 The photodetection system according to the present disclosure can acquire biometric information of a subject part of a living body, and is useful, for example, for biosensing.
REFERENCE SIGNS LIST
  10   living body
  12   subject part
  20   light source
  20a  first light source
  20b  second light source
  30   photodetector
  32   photoelectric conversion element
  34   charge storage unit
  40   processing device
  42   control circuit
  44   signal processing device
  46   memory
  100  photodetection system

Claims (11)

  1.  A photodetection system comprising:
      a light source that emits a light pulse toward a subject part of a living body;
      a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the subject part; and
      a processing circuit that controls the light source and the photodetector and acquires distance data regarding a distance between the photodetector and the living body,
      wherein the processing circuit
      causes the light source to emit the light pulse,
      causes the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end detecting the reflected light pulse when a predetermined exposure time has elapsed after the detection of the reflected light pulse is started, and
      when a time point at which the distance deviates from a first range is defined as a first time point and a time point after the first time point at which a change over time of the distance falls within a predetermined range is defined as a second time point, determines whether to change the first time based on the distance at the second time point.
  2.  The photodetection system according to claim 1, wherein the processing circuit changes the first time when the distance at the second time point deviates from the first range.
  3.  The photodetection system according to claim 2, wherein the processing circuit changes the first time to a second time corresponding to a second range different from the first range when the distance at the second time point is within the second range.
  4.  The photodetection system according to claim 1, wherein the processing circuit maintains the first time when the distance at the second time point is within the first range.
  5.  The photodetection system according to any one of claims 1 to 4, wherein the processing circuit determines that the change over time of the distance falls within the predetermined range when a reference value determined based on at least one of a variation width, a variance, and a standard deviation of the change over time of the distance is equal to or less than a threshold.
  6.  The photodetection system according to any one of claims 1 to 5, wherein the photodetector is an image sensor having a plurality of pixels arranged two-dimensionally, and the processing circuit generates the distance data based on an intensity of the reflected light pulse detected by each of the plurality of pixels.
  7.  The photodetection system according to claim 6, wherein the processing circuit determines whether to change the first time based on the number of pixels, among the plurality of pixels, that do not correspond to the subject part, or the number of pixels that do not correspond to the subject part and pixels that correspond to a peripheral region of the subject part.
  8.  The photodetection system according to any one of claims 1 to 7, wherein the living body is a person, the subject part is the person's forehead, and the processing circuit generates brain activity data regarding brain activity of the living body based on a signal corresponding to an intensity of the reflected light pulse detected by the photodetector.
  9.  A processing device for use in a photodetection system including a light source that emits a light pulse toward a subject part of a living body and a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the subject part, the processing device comprising:
      a processor; and
      a memory storing a computer program executed by the processor,
      wherein the computer program causes the processor to execute:
      causing the light source to emit the light pulse;
      acquiring distance data regarding a distance between the subject part and the photodetector;
      causing the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end detecting the reflected light pulse when a predetermined exposure time has elapsed after the detection of the reflected light pulse is started; and
      when a time point at which the distance deviates from a first range is defined as a first time point and a time point after the first time point at which a change over time of the distance falls within a predetermined range is defined as a second time point, determining whether to change the first time based on the distance at the second time point.
  10.  A method for controlling a photodetection system including a light source that emits a light pulse toward a subject part of a living body and a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the subject part, the method comprising:
      causing the light source to emit the light pulse;
      acquiring distance data regarding a distance between the subject part and the photodetector;
      causing the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end detecting the reflected light pulse when a predetermined exposure time has elapsed after the detection of the reflected light pulse is started; and
      when a time point at which the distance deviates from a first range is defined as a first time point and a time point after the first time point at which a change over time of the distance falls within a predetermined range is defined as a second time point, determining whether to change the first time based on the distance at the second time point.
  11.  A program executed by a computer that controls a photodetection system including a light source that emits a light pulse toward a subject part of a living body and a photodetector that detects a reflected light pulse generated by the light pulse being reflected by the subject part, the program causing the computer to execute:
      causing the light source to emit the light pulse;
      acquiring distance data regarding a distance between the subject part and the photodetector;
      causing the photodetector to start detecting the reflected light pulse when a first time has elapsed after the light pulse is emitted, and to end detecting the reflected light pulse when a predetermined exposure time has elapsed after the detection of the reflected light pulse is started; and
      when a time point at which the distance deviates from a first range is defined as a first time point and a time point after the first time point at which a change over time of the distance falls within a predetermined range is defined as a second time point, determining whether to change the first time based on the distance at the second time point.
PCT/JP2022/041391 2021-11-22 2022-11-07 Light detecting system, processing device, method for controlling light detecting system, and program WO2023090188A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021189218 2021-11-22
JP2021-189218 2021-11-22

Publications (1)

Publication Number Publication Date
WO2023090188A1 true WO2023090188A1 (en) 2023-05-25

Family

ID=86396947

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/041391 WO2023090188A1 (en) 2021-11-22 2022-11-07 Light detecting system, processing device, method for controlling light detecting system, and program

Country Status (1)

Country Link
WO (1) WO2023090188A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017124153A (en) * 2016-01-07 2017-07-20 パナソニックIpマネジメント株式会社 Biological information measuring device
JP2017185200A (en) * 2016-03-31 2017-10-12 パナソニックIpマネジメント株式会社 Imaging device having light source, photo-detector, and control circuit


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22895470

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023561536

Country of ref document: JP

Kind code of ref document: A