WO2015075894A1 - Imaging device, pupil imaging device, pupil diameter measuring device, pupil state detecting device, and pupil imaging method - Google Patents

Imaging device, pupil imaging device, pupil diameter measuring device, pupil state detecting device, and pupil imaging method

Info

Publication number
WO2015075894A1
Authority
WO
WIPO (PCT)
Prior art keywords
pupil
light
image
imaging
wavelength
Prior art date
Application number
PCT/JP2014/005642
Other languages
English (en)
Japanese (ja)
Inventor
浩 今井
雅雄 今井
Original Assignee
日本電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社
Priority to JP2015548974A
Publication of WO2015075894A1

Links

Images

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/08: Measuring arrangements characterised by the use of optical techniques for measuring diameters
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for measuring interpupillary distance or diameter of pupils
    • A61B3/112: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for measuring diameter of pupils
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14: Arrangements specially adapted for eye photography

Definitions

  • The present invention relates to an imaging device, a pupil imaging device, a pupil diameter measuring device, a pupil state detecting device, and a pupil imaging method, and in particular to such devices and methods that use a single imaging element.
  • Patent Documents 1 to 3 describe methods that prepare two sets of imaging devices and measure position and the like by triangulation in order to obtain accurate three-dimensional coordinates when an object moves back and forth.
  • The methods of Patent Documents 1 to 3 require at least two cameras, which increases the size and cost of the apparatus.
  • An object of the present invention is to provide an imaging apparatus that solves the above-described problems.
  • The image pickup apparatus of the present invention includes one image pickup device and an image forming unit that forms, on that one image pickup device, a plurality of images of a common portion of the subject viewed from different viewpoints while the subject is irradiated with light from the light source.
  • According to the present invention, an imaging apparatus that suppresses increases in size and cost can be provided.
  • A top view of the imaging device in the first embodiment.
  • A top view of the imaging device in the first embodiment when it has an arbitrary light source.
  • A top view of the imaging device in the first embodiment when it has an arbitrary dividing means.
  • A top view of the imaging device in the first embodiment when it has an imaging means.
  • A top view of the pupil imaging device in the second embodiment.
  • A top view of the pupil diameter measuring apparatus in the third embodiment.
  • A top view of the pupil diameter measuring apparatus in the third embodiment when the pupil diameter estimating means includes a pupil extracting means and a pupil diameter calculating means.
  • A flowchart showing the flow for calculating the three-dimensional coordinates and dimensions of one or more pupils from one captured image.
  • For the pupil diameter measuring apparatus according to the third embodiment: (a) an example of an image captured by the imaging device 8; (b) the intensity distribution of the image along the line segment PQ in (a); (c) an example of obtaining the circumference of the extracted pupil portion 21 and the coordinates of the circumference. Also, a diagram explaining the method of calculating the three-dimensional coordinates of the pupil in the third embodiment.
  • A top view of the pupil state detection apparatus in the third embodiment, and a flowchart showing the processing when a pupil state determining means is provided in the third embodiment.
  • A top view of the pupil imaging device in the fourth embodiment; a diagram showing an example of the specific structure of the pupil imaging device in the fourth embodiment; diagrams showing the optical paths of the light of the first wavelength and of the second wavelength in the pupil imaging device of the fourth embodiment; and a diagram showing the structure of a prism body provided with a total reflection surface and a wavelength selection film.
  • A top view of the pupil diameter measuring apparatus in the fifth embodiment, and a flowchart for the case where the pupil diameter estimating means includes a pupil extracting means and a pupil diameter calculating means in the fifth embodiment.
  • A flowchart showing the operation of the fifth embodiment.
  • For the pupil diameter measuring apparatus in the fifth embodiment: (a) the intensity distribution of the image captured by the first image pickup element 44, and (b) the intensity distribution of the image captured by the second image pickup element 45. Also, a diagram explaining the effect of the fifth embodiment.
  • FIG. 1 is a top view of the imaging apparatus 100 according to the present embodiment.
  • The imaging apparatus 100 according to the present embodiment includes one light source 2a, a half mirror 2ba, mirrors 2bb to 2bd, one imaging element 5, and a lens 6b.
  • The light source 2a emits light toward the half mirror 2ba.
  • The half mirror 2ba splits the incident light from the light source 2a into light that passes through the half mirror 2ba and travels toward the mirror 2bb, and light that is reflected by the half mirror 2ba and travels toward the mirror 2bc.
  • The mirror 2bc reflects the incident light from the half mirror 2ba toward the mirror 2bd.
  • The mirrors 2bb and 2bd direct the incident light onto the subject 1 from different positions.
  • FIG. 1 shows the optical axis A and the optical axis B as the two optical axes along which the incident light strikes the subject 1 from the mirrors 2bb and 2bd.
  • In FIG. 1, the optical axis A and the optical axis B are parallel; however, they do not necessarily have to be parallel as long as they do not intersect and the incident light can strike different positions on the subject 1.
  • Light from the subject 1 on the optical axis A is reflected by the mirror 2bb and the half mirror 2ba, enters the lens 6b, and forms an image on the optical axis A (for example, of the first viewpoint 3) on the image sensor 5.
  • The mirror 2bd and the mirror 2bc reflect light from the subject 1 on the optical axis B and make it incident on the half mirror 2ba.
  • The half mirror 2ba directs the light from the mirror 2bc into the lens 6b.
  • This light forms an image on the optical axis B (for example, of the second viewpoint 4) on the image sensor 5.
  • On the image sensor 5, light from the subject at least at the first viewpoint 3 and the second viewpoint 4, which are two different viewpoints, is simultaneously imaged by the lens 6b.
  • The light reflected by the half mirror 2ba toward the mirror 2bc is reflected by the mirror 2bc onto the mirror 2bd; it travels parallel to the light that passed through the half mirror 2ba toward the mirror 2bb, and strikes the subject 1 at a different position.
  • The images at the first viewpoint 3 and the second viewpoint 4 each include at least the image of the feature point 1a, a common part of the subject 1. That is, at least an image of the feature point 1a, a common part of the subject 1, is formed on the image sensor 5.
  • When the configuration of the imaging apparatus 100 according to the present embodiment is adopted, light from the subject 1 at different viewpoints can be imaged onto the one imaging element 5 by the optical system, so the apparatus has a simple configuration and the imaging device 100 can be made small.
  • In addition, the exact position of the feature point 1a can be obtained from the captured image by triangulation.
  • The example described above uses the single light source 2a, the half mirror 2ba, and the mirrors 2bb, 2bc, and 2bd, but the present embodiment is not necessarily limited to these configurations.
  • The light source 2a shown in FIG. 1 may be replaced by an arbitrary light source 2 at any position and angle, such as natural light or indoor lighting.
  • An arbitrary dividing means 2b may be used instead of the half mirror 2ba of FIG. 1.
  • In the above description, light from the subject 1 at the first viewpoint 3 and the second viewpoint 4 is imaged on the image sensor 5, but it suffices to image light from the subject 1 at least at two different viewpoints. If light from two or more viewpoints is imaged, three-dimensional coordinates can be calculated by triangulation.
  • The imaging device 5 may be a known device such as a CMOS (complementary metal oxide semiconductor) sensor or a CCD (charge-coupled device).
  • The optical path length between the half mirror 2ba and the mirror 2bb is L, and the optical path lengths between the half mirror 2ba and the mirror 2bc and between the mirror 2bc and the mirror 2bd are each L/2. That is, the sum of the optical path length between the half mirror 2ba and the mirror 2bc and the optical path length between the mirror 2bc and the mirror 2bd is L.
  • In this way, the optical path lengths of the light traveling along the path to the optical axis A and along the path to the optical axis B, both emitted toward the subject 1, are arranged to be equal.
  • Accordingly, light from the subject 1 along the optical axis A and light from the subject 1 along the optical axis B are imaged on the image sensor 5 with the same optical path length.
  • Since each beam is imaged with the same optical path length, accurate three-dimensional coordinates can be calculated from the captured image by triangulation.
  • In the present embodiment, processing for obtaining three-dimensional coordinates from the formed image is not performed.
  • Triangulation will be described in detail in the description of the configuration and operation of the third embodiment. As long as their sum is L, the optical path length between the half mirror 2ba and the mirror 2bc and the optical path length between the mirror 2bc and the mirror 2bd may be set to arbitrary values.
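  • The equal-path condition described above can be checked with a short sketch. The absolute value of L is not given in the text; L = 30 mm below is a hypothetical value used only to illustrate that the two paths match.

```python
# Path toward optical axis A: half mirror 2ba -> mirror 2bb (length L).
# Path toward optical axis B: half mirror 2ba -> mirror 2bc (L/2),
# then mirror 2bc -> mirror 2bd (L/2).
L = 30.0                      # hypothetical value in mm; not stated in the text
path_a = L                    # 2ba -> 2bb
path_b = L / 2 + L / 2        # 2ba -> 2bc -> 2bd
assert path_a == path_b       # both viewpoints image with equal optical path length
print(path_a)
```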
  • The configuration that images light from the subject 1 at the first viewpoint 3 and the second viewpoint 4 on the image sensor 5 does not necessarily have to use the numbers of mirrors, half mirrors, and lenses described above.
  • The above-described configuration can be replaced by an imaging means 6 comprising an optical system of arbitrary mirrors, half mirrors, lenses, and the like that can image light from the subject 1 at the first viewpoint 3 and the second viewpoint 4 on the image sensor 5.
  • The arbitrary light source 2 shown in FIG. 2 may be configured by an optical system such as the light source 2a, the half mirror 2ba, and a mirror 2aa.
  • If the optical path shared with the optical system constituting the imaging means 6 is built from these optical components, the imaging apparatus 100 can be made smaller.
  • Moreover, since the internal configuration of the apparatus can be simplified, the imaging apparatus 100 can be manufactured at lower cost.
  • FIG. 5 is a top view showing the configuration of the pupil imaging device 200 according to the present embodiment.
  • As shown in FIG. 5, the pupil imaging device 200 of the present embodiment includes a light source 11, a wavelength selection means 12, and an imaging means 20.
  • The light source 11 emits light having a predetermined wavelength.
  • The wavelength selection means 12 selectively transmits the light of the predetermined wavelength emitted from the light source 11.
  • The imaging means 20 forms, on one imaging element 13 via the wavelength selection means 12, images at different viewpoints of at least a part of the face, including at least one pupil 14a, of the detected person 14 irradiated with the light of the predetermined wavelength from the light source 11 via the wavelength selection means 12.
  • The different viewpoints lie on different optical axes substantially parallel to the direction in which the light source 11 emits light, and these optical axes are parallel to each other.
  • The pupil imaging device 200 of the second embodiment includes the light source 11, the wavelength selection means 12, the single imaging element 13, the first viewpoint 15, the second viewpoint 16, and the imaging means 20.
  • A case will be described in which the imaging means 20 includes a half mirror 20a, mirrors 20b to 20d, and a lens 20e.
  • Next, the operation of the pupil imaging device 200 of the present embodiment will be described.
  • First, the process by which the incident light from the light source 11 irradiates the detected person 14 will be described.
  • The light emitted from the light source 11 enters the half mirror 20a.
  • The incident light at the half mirror 20a is split into light that passes through the half mirror 20a and travels toward the mirror 20b, and light that is reflected by the half mirror 20a and travels toward the mirror 20c.
  • The light reflected by the half mirror 20a toward the mirror 20c is reflected by the mirror 20c onto the mirror 20d, travels parallel to the light that passed through the half mirror 20a toward the mirror 20b, and strikes the detected person 14 at a different position.
  • The light from the detected person 14 at the first viewpoint 15 and the second viewpoint 16 includes, at each viewpoint, light from at least a part of the face including at least one pupil 14a of the detected person 14. That is, an image of at least a part of the face including one or more pupils 14a of the detected person 14 is formed on the imaging element 13. Since the optical axes A and B are substantially parallel to the light incident from the detected person 14, fundus reflection light is captured at the pupil portion of the formed image.
  • In the pupil imaging device 200 according to the present embodiment, at least a part of the face including one or more pupils 14a of the detected person 14, viewed from two different viewpoints, can be imaged onto one imaging element 13 by the optical system, so the apparatus has a simple configuration and can be made small. Furthermore, with such an apparatus, an image of the pupil 14a can be captured from which the three-dimensional coordinates around the pupil and the pupil diameter can be accurately obtained using triangulation.
  • The pupil imaging device 200 may also be used to calculate and measure human feature quantities such as the intensity of the pupil image of the detected person and the reflected-light distribution of the face, in addition to the three-dimensional coordinates around the pupil.
  • The pupil diameter estimating means 21 includes a pupil extracting means 21a and a pupil diameter calculating means 21b.
  • The pupil extracting means 21a extracts at least two regions corresponding to one or more pupils 14a from one image formed on the imaging element 13.
  • The pupil diameter calculating means 21b calculates the three-dimensional coordinates and dimensions of one or more pupils 14a from the at least two extracted regions. Since there are usually two pupils 14a, a total of four regions are extracted by the pupil extracting means 21a.
  • The flow in which the pupil extracting means 21a extracts regions from one image captured on the single imaging element 13 and the pupil diameter calculating means 21b calculates the three-dimensional coordinates and dimensions of one or more pupils 14a will be described with reference to the flowchart of FIG. 8.
  • First, the pupil extracting means 21a captures one image formed on the imaging element 13 (S2; image capture). The pupil extracting means 21a then binarizes the captured image, extracts the four regions corresponding to the pupils 14a in the image, and outputs them to the pupil diameter calculating means 21b (S3; pupil part extraction). From the four regions extracted by the pupil extracting means 21a, the pupil diameter calculating means 21b acquires the in-plane coordinates (X and Y coordinates) of the imaging element 13 around each pupil 14a in the image (S4; coordinate acquisition).
  • Next, using the obtained X and Y coordinates together with the calculation methods shown in Equations 1 to 3 described later, the pupil diameter calculating means 21b obtains by triangulation the perpendicular distance (the Z coordinate) from the imaging element 13 to the real image including the pupil 14a (S5; acquisition of the three-dimensional coordinates of the pupil).
  • The pupil diameter calculating means 21b then calculates the pupil diameter from the obtained three-dimensional coordinates (S6; pupil diameter calculation) and records the calculated pupil diameter (S7; pupil diameter recording).
  • In the flowchart of FIG. 8, the process returns to step S1 after S7 is completed.
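  • The binarization and coordinate-acquisition steps (S3 and S4) can be sketched along a single scan line such as the line segment PQ of FIG. 9. The threshold value and the run-detection approach below are assumptions; the text specifies only "binarization processing".

```python
import numpy as np

def pupil_x_ranges(scanline, threshold):
    """Return (Xmin, Xmax) for each above-threshold run on one scan line.

    The fundus-reflected pupils appear as high-intensity runs (FIG. 9(b)),
    and their minimum/maximum X coordinates (e.g. XARmin/XARmax) are
    recorded. The threshold value is an assumption.
    """
    above = scanline > threshold                 # S3: binarization
    edges = np.diff(above.astype(int))           # +1 where a run starts, -1 where it ends
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1)
    return list(zip(starts.tolist(), ends.tolist()))  # S4: coordinate acquisition

# Synthetic 1920-pixel scan line with four bright pupil runs:
# two per eye, one per optical axis (viewpoints A and B).
line = np.zeros(1920)
for x0, x1 in [(400, 420), (700, 720), (1200, 1220), (1500, 1520)]:
    line[x0:x1 + 1] = 1.0
print(pupil_x_ranges(line, 0.5))  # [(400, 420), (700, 720), (1200, 1220), (1500, 1520)]
```

In practice the same run detection would be applied in the Y direction as well, giving the circumference coordinates of each extracted pupil portion.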
  • The image captured in S2 of FIG. 8, that is, the image captured by the image sensor 8, is a facial image of the detected person 14 as shown in FIG. 9(a) as an example.
  • This image is one in which the facial images of the detected person 14 viewed from the directions of the optical axis A and the optical axis B in FIG. 5 overlap.
  • FIG. 9(b) shows the intensity distribution of the image along the line segment PQ in FIG. 9(a); the pupil portions 22 of the detected person 14 viewed from the directions of the optical axes A and B are captured with high intensity values.
  • As shown in FIG. 9(c), the pupil extracting means 21a acquires and records the coordinates of the circumference of each extracted pupil portion 21.
  • For the X component of the coordinates recorded by the pupil extracting means 21a (the direction of the line segment PQ in FIG. 9(a)), the minimum coordinate XARmin and the maximum coordinate XARmax of the right-eye image for the optical axis A are recorded.
  • Similarly, the minimum coordinate XALmin and the maximum coordinate XALmax of the left-eye image for the optical axis A are recorded.
  • The minimum coordinate XBRmin and the maximum coordinate XBRmax of the right-eye image for the optical axis B are recorded.
  • The minimum coordinate XBLmin and the maximum coordinate XBLmax of the left-eye image for the optical axis B are recorded.
  • The Y coordinates can be recorded in the same way.
  • FIG. 10 is a diagram for explaining the operation of the pupil diameter calculating means 21b of the pupil diameter measuring apparatus 300 of the present embodiment.
  • In FIG. 10, the position of the principal point of the lens 20e is the origin, and the direction of the optical axis A of the lens 20e is the Z direction. The direction perpendicular to the optical axis A and horizontal on the page of FIG. 10 is the X direction, and the depth direction in FIG. 10 is the Y direction.
  • The optical axes A and B in FIG. 10 are the same as the optical axes A and B in FIG. 5.
  • FIG. 10 re-expresses the optical axes A and B of FIG. 5 linearly, together with the lens 20e and the imaging element 13, for each optical axis.
  • The images viewed along the respective optical axes in FIG. 5 correspond to the viewpoints A and B (the first viewpoint 15 and the second viewpoint 16) on the imaging element 13 and the optically equivalent imaging element 13a in FIG. 10.
  • The minimum X coordinate XARmin of the right end of the right-eye pupil at the viewpoint A and the minimum X coordinate XBRmin of the right end of the right-eye pupil at the viewpoint B, recorded by the pupil extracting means 21a, are in the positional relationship shown in FIG. 10 with respect to distance.
  • A method of measuring the three-dimensional coordinates E(XRmin, YRmin, ZRmin) will be described, taking the right end of the right-eye pupil 14a shown in FIG. 10 as an example.
  • From FIG. 5, the distance d between the optical axes A and B is L + L/2. Here, f is the focal length of the lens 20e.
  • The lens 20e and the imaging element 13 for the optical axis B in FIG. 5 are represented as the optically equivalent lens 20ea and the optically equivalent imaging element 13a in FIG. 10.
  • The description below uses this optically equivalent diagram.
  • The detected person 14, the optical axes A and B, and the viewpoints A and B have the same relationship as in triangulation (stereo measurement); therefore, the three-dimensional coordinates E(XRmin, YRmin, ZRmin) of the right end of the right-eye pupil 14a are expressed by Equations 1 to 3.
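  • Equations 1 to 3 themselves are not reproduced in this text. Assuming they follow the standard parallel-axis stereo (triangulation) model with baseline d between the optical axes and lens focal length f, the computation can be sketched as follows; the specific numbers (f = 15 mm, L = 30 mm) are hypothetical values for illustration only.

```python
def triangulate(x_a, y_a, x_b, d, f):
    """Recover the 3D coordinates of a point, e.g. the right edge of the
    right-eye pupil E(XRmin, YRmin, ZRmin), from its image coordinates
    at viewpoints A and B.

    This is the standard parallel-axis stereo model (an assumption; the
    patent's Equations 1-3 are not reproduced here). Image coordinates
    are measured from each viewpoint's optical axis.
    """
    disparity = x_a - x_b          # image shift between the two viewpoints
    z = d * f / disparity          # assumed form of Eq. 3: depth from disparity
    x = x_a * z / f                # assumed form of Eq. 1
    y = y_a * z / f                # assumed form of Eq. 2
    return x, y, z

d, f = 45.0, 15.0                  # mm; d = L + L/2 with hypothetical L = 30 mm
x, y, z = triangulate(x_a=1.5, y_a=0.3, x_b=0.15, d=d, f=f)
print(round(z, 1))                 # -> 500.0 (depth in mm)
```

Applying this to the pairs (XARmin, XBRmin) and (XARmax, XBRmax) gives the 3D coordinates of the pupil edges, from which the pupil diameter follows as their Euclidean distance.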
  • Up to this point the pupil diameter measuring device 300 has been described; by further including a pupil state determining means 24, a pupil state detecting device 400 can be obtained.
  • The differences between the pupil state detecting device 400, which has the pupil state determining means 24, and the pupil diameter measuring device 300 are described using FIG. 11.
  • The processing when the pupil state determining means 24 is provided is described with reference to the flowchart of FIG. 12.
  • The pupil state determining means 24 determines the psychological and physiological state of the detected person 14 from the three-dimensional coordinates and dimensions of one or more pupils 14a output from the pupil diameter calculating means 21b.
  • FIG. 12 shows a flowchart for the case where the pupil state determining means 24 performs frequency analysis on the three-dimensional coordinates and dimensions of one or more pupils 14a and determines the psychological and physiological state of the detected person 14 from the result.
  • The pupil diameter calculating means 21b calculates the pupil diameter at a predetermined cycle in S6.
  • The pupil diameter calculated in S6 is recorded in S7; in S8, the pupil state determining means 24 analyzes the pupil diameter data collected in time series and determines the psychological and physiological state of the detected person 14 from the analysis result.
  • After that, the process returns to step S1. Step S8 and subsequent steps may be performed in parallel with the processing of S1 to S7, or may be performed separately.
  • In the present embodiment, the pupil diameter calculating means 21b calculates the pupil diameter data as time-series data every 1/60 seconds.
  • The calculated pupil diameter data can be recorded by the pupil state determining means 24.
  • The time-series data recorded in the pupil state determining means 24 can be subjected to a wavelet transform as frequency analysis by the pupil state determining means 24.
  • The frequency band of the wavelet transform is divided, with 0.15 Hz as the boundary, into a low-frequency band of 0.04 to 0.15 Hz and a high-frequency band of 0.16 to 0.5 Hz. The psychological state can then be determined from the amplitude ratio between the low-frequency band and the high-frequency band.
  • In general, the parasympathetic nervous system is dominant in the low-frequency band, and the sympathetic nervous system is dominant in the high-frequency band. For example, if the value obtained by dividing the power in the low-frequency band by the power in the high-frequency band is 4 or more, the parasympathetic nervous system is dominant and the detected person is determined to be relaxed; otherwise, the detected person is determined to be excited.
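  • The band-power comparison can be sketched as follows. The embodiment uses a wavelet transform, but the text notes that a Fourier transform may be used instead, which is what this sketch does; the FFT-based power estimate is an assumption, while the band edges, 60 Hz sampling rate, and threshold ratio of 4 come from the text.

```python
import numpy as np

def lf_hf_state(pupil_diameters, fs=60.0, split=0.15, lo=0.04, hi=0.5, ratio=4.0):
    """Classify relaxed vs. excited from time-series pupil diameters
    via low-frequency / high-frequency band power."""
    x = np.asarray(pupil_diameters, dtype=float)
    x = x - x.mean()                          # remove the DC component
    spectrum = np.abs(np.fft.rfft(x)) ** 2    # power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    lf_power = spectrum[(freqs >= lo) & (freqs < split)].sum()
    hf_power = spectrum[(freqs >= split) & (freqs <= hi)].sum()
    # Parasympathetic (LF) dominance by a factor of 4 or more -> relaxed
    return "relaxed" if lf_power >= ratio * hf_power else "excited"

# 60 s of pupil diameter data at 60 Hz, dominated by a 0.1 Hz (LF) oscillation
t = np.arange(0, 60, 1 / 60.0)
diam = 4.0 + 0.5 * np.sin(2 * np.pi * 0.1 * t)
print(lf_hf_state(diam))  # -> relaxed
```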
  • As described above, the pupil diameter measuring apparatus 300 has a simple configuration and can be made small.
  • Moreover, the pupil diameter can be obtained accurately using triangulation.
  • With the pupil state detecting device 400, it is possible to provide a small, low-cost device that objectively and quantitatively evaluates the mental state of the detected person 14 in a free posture, without restraining the body.
  • The frequency band of the wavelet transform may be other than 0.04 to 0.5 Hz as long as the psychological state can be distinguished, and the frequency dividing the low-frequency band from the high-frequency band may be other than 0.15 Hz.
  • A Fourier transform or other methods may be used instead of the wavelet transform.
  • In the present embodiment, the image pickup device 13 was a CMOS sensor with 1920 horizontal pixels and 1080 vertical pixels, and the pixel pitch in the X and Y directions was 3 µm. Other pixel counts may be used depending on the required image definition.
  • The image update frequency of the image sensor 13 was 60 times per second; that is, the pupil diameter data is calculated by the pupil diameter calculating means 21b as time-series data every 1/60 seconds, but other time intervals may be used.
  • The measurement resolution of the pupil 14a in the X and Y directions at an imaging distance (Z coordinate) of 500 mm was 0.1 mm. In the Z direction, the imaging distance could be corrected with a resolution of 1 mm.
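  • The stated X/Y resolution is consistent with simple pinhole magnification: one pixel of pitch p on the sensor spans p * Z / f on the subject. The focal length is not stated in the text; f = 15 mm is the hypothetical value implied by the stated numbers.

```python
# Lateral resolution from pixel pitch and thin-lens magnification.
p = 3e-3      # pixel pitch in mm (3 um, from the embodiment)
Z = 500.0     # imaging distance in mm (from the embodiment)
f = 15.0      # focal length in mm; hypothetical, implied rather than stated
resolution_xy = p * Z / f
print(resolution_xy)   # -> 0.1 (mm), matching the stated X/Y resolution
```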
  • An LED having a wavelength of 850 nm was used as the light source.
  • Light of a different wavelength may be used as long as the same effect can be obtained.
  • A sensor for measuring ambient light may be provided to correct for the pupillary light reflex.
  • LED is an abbreviation for "Light Emitting Diode".
  • FIG. 13 is a top view showing the configuration of the pupil imaging device 500 according to the present embodiment.
  • The pupil imaging device 500 of the present embodiment differs from the second embodiment in that it includes a second light source 42 and a second imaging means 49.
  • The pupil imaging device 500 of the fourth embodiment includes a first light source 41, a second light source 42, a wavelength selection means 43, a first imaging element 44, a second imaging element 45, a first imaging means 48, and a second imaging means 49.
  • The first light source 41 emits light having a first wavelength.
  • The second light source 42 emits light of a second wavelength from a position different from that of the first light source 41 and at a different angle.
  • The wavelength selection means 43 selectively transmits the light of the first wavelength and the light of the second wavelength.
  • The first imaging element 44 receives, via the wavelength selection means 43, the light of the first wavelength at the first viewpoint 46 and the second viewpoint 47 from at least a part of the face including one or more pupils 14a of the detected person 14.
  • The second imaging element 45 receives, via the wavelength selection means 43, the light of the second wavelength at the first viewpoint 46 from at least a part of the face including one or more pupils 14a of the detected person 14.
  • The first imaging means 48 receives, via the wavelength selection means 43, light from the detected person 14 that is substantially parallel to the optical axes A and B along which the light from the first light source 41 is emitted toward the detected person 14. The first imaging means 48 also forms, on the first imaging element 44 via the wavelength selection means 43, images of the light of the first wavelength at each of the first viewpoint 46 and the second viewpoint 47 from at least a part of the face including one or more pupils 14a of the irradiated detected person 14.
  • The second imaging means 49 forms, on the second imaging element 45 via the wavelength selection means 43, an image of the light of the second wavelength at the first viewpoint from at least a part of the face including one or more pupils of the detected person.
  • The first viewpoint 46 and the second viewpoint 47 lie on the optical axes A and B, the two optical axes substantially parallel to the direction in which the first light source 41 emits light, and the axes are parallel to each other.
  • Next, the operation will be described. The first light source 41 emits light having the first wavelength.
  • The light of the first wavelength is irradiated onto the detected person 14 via the wavelength selection means 43.
  • The second light source 42 emits light of the second wavelength from a position different from that of the first light source 41 and at a different angle.
  • The light of the second wavelength is irradiated onto the detected person 14.
  • The light of the first wavelength from at least a part of the face including one or more pupils 14a of the detected person 14, irradiated from the first light source 41, enters the first imaging means 48 via the wavelength selection means 43.
  • The incident light of the first wavelength is imaged on the first imaging element 44 by the first imaging means 48.
  • The light of the second wavelength from at least a part of the face including one or more pupils 14a of the detected person 14 enters the second imaging means 49 via the wavelength selection means 43.
  • The incident light of the second wavelength is imaged on the second imaging element 45 by the second imaging means 49.
  • FIG. 14 is a diagram illustrating an example of a specific configuration of the pupil imaging device 500.
  • FIG. 15A is a diagram showing an optical system of the optical path of the first wavelength from the first light source 41.
  • FIG. 15B is a diagram showing an optical system of the optical path of the second wavelength from the second light source 42.
  • The configuration of FIG. 15A is an optical system having optically the same action as that of the third embodiment: as in FIG. 9A, the face images of the detected person 14 corresponding to each of the directions of the optical axes A and B are captured in an overlapping manner on the first image sensor 44.
  • The first imaging means 48 includes a first wavelength selection mirror 48a, a second wavelength selection mirror 48b, a third wavelength selection mirror 48c, a first mirror 48d, a second mirror 48e, a half mirror 48f, and a lens 48g.
  • the second imaging means 49 includes a first wavelength selection mirror 48a, a mirror 49a, a second wavelength selection mirror 48b, a lens 48g, and a third wavelength selection mirror 48c.
  • The first wavelength selection mirror 48a, the second wavelength selection mirror 48b, and the third wavelength selection mirror 48c selectively reflect the light of the first wavelength emitted from the first light source 41 and selectively transmit the light of the second wavelength emitted from the second light source 42.
  • the light emitted from the first light source 41 enters the half mirror 48f.
  • The light incident on the half mirror 48f is split into light that passes through the half mirror 48f and travels toward the first wavelength selection mirror 48a, and light that is reflected by the half mirror 48f and travels toward the second mirror 48e.
  • the light reflected by the half mirror 48f is reflected by the second mirror 48e, reflected by the first mirror 48d, and travels in the direction of the person to be detected 14 via the wavelength selection means.
  • The light transmitted through the half mirror 48f is reflected by the first wavelength selection mirror 48a and travels toward the detected person 14 through the wavelength selection means. That is, the light transmitted through the half mirror 48f is parallel to the light reflected by the half mirror 48f and is incident on a different position of the detected person 14.
  • the second light source 42 emits light of the second wavelength from the position different from that of the first light source 41 in the direction of the person to be detected 14.
  • Next, the paths of the light incident on the first image sensor 44 and the second image sensor 45 will be described.
  • First, light from at least a part of the face including one or more pupils 14a of the detected person 14 enters the first wavelength selection mirror 48a via the wavelength selection means 43, parallel to the optical axis A. Of the light that entered via the wavelength selection means 43, the light of the first wavelength is reflected by the first wavelength selection mirror 48a.
  • The light of the first wavelength is then partially reflected by the half mirror 48f, reflected by the second wavelength selection mirror 48b, incident on the lens 48g, reflected by the third wavelength selection mirror 48c, and imaged on the first image sensor 44.
  • Similarly, light from at least a part of the face including one or more pupils 14a of the detected person 14 enters the first mirror 48d via the wavelength selection means 43, parallel to the optical axis B. Of the light that entered via the wavelength selection means 43, the light of the first wavelength is reflected by the first mirror 48d, then reflected by the second mirror 48e, partially transmitted through the half mirror 48f, and thereafter imaged on the first image sensor 44 through the same path as the light incident parallel to the optical axis A. That is, images of the detected person 14 from the viewpoints in both the optical axis A direction and the optical axis B direction are formed on the first image sensor 44.
  • For the pupil 14a, the optical axes (optical axis A and optical axis B) along which the light emitted from the first light source 41 irradiates the detected person 14 and the optical axes (optical axis A and optical axis B) along which the light reflected from the detected person 14 enters the first image sensor 44 are coaxial, so fundus reflection light is imaged. Therefore, a so-called bright pupil image is captured.
  • Meanwhile, light from at least a part of the face including one or more pupils 14a of the detected person 14 enters the wavelength selection means 43. Of this light, the light of the second wavelength is transmitted through the first wavelength selection mirror 48a.
  • The transmitted light is reflected by the mirror 49a, transmitted through the second wavelength selection mirror 48b, incident on the lens 48g, transmitted through the third wavelength selection mirror 48c, and imaged on the second image sensor 45. That is, unlike on the first image sensor 44, only an image of at least a part of the face including one or more pupils 14a of the detected person 14 from the direction of the optical axis A is formed on the second image sensor 45.
  • For the pupil 14a, the optical axis along which the light emitted from the second light source 42 irradiates the detected person 14 and the optical axes (optical axis A and optical axis B) along which the light reflected from the detected person 14 enters the image sensors are non-coaxial, so fundus reflection light is not imaged. For this reason, a so-called dark pupil image is captured.
  • Note that the light of the second wavelength from the direction of the optical axis B is transmitted through the second wavelength selection mirror 48b in FIG. 15B and is therefore not imaged on the second image sensor 45.
  • In the pupil imaging device 500 according to the present embodiment, one of the light sources of different wavelengths is arranged coaxially with the imaging optical axes and the other at a non-coaxial position and angle, and the image of the pupil 14a is formed on a different imaging element for each wavelength, so that both bright pupil and dark pupil images can be captured.
  • Since the light incident from both the optical axis A and the optical axis B is imaged on the first image sensor 44, it is preferable to use, as the half mirror 48f, one having a transmittance of 50% (a reflectance of 50%) at the first wavelength.
  • The first wavelength selection mirror 48a, the second wavelength selection mirror 48b, and the third wavelength selection mirror 48c may each be replaced by an optically equivalent configuration such as the prism body 52 shown in FIG. 15C. The prism body 52 includes a total reflection surface 50 and a wavelength selection film 51. In this case, the prism body 52 is arranged so that its total reflection surface 50 serves as the surface of the mirror described above. The other mirrors may likewise be replaced by other optically equivalent configurations.
  • FIG. 16 is a top view showing a pupil diameter measuring apparatus 600 of this embodiment.
  • The pupil diameter measuring apparatus 600 includes pupil diameter estimating means 53 for estimating the pupil diameter by triangulation, using the region in which the pupil 14a is imaged in one image formed on the first image sensor 44 and the region in which the same pupil 14a is imaged in one image formed on the second image sensor 45.
  • the pupil diameter estimating means 53 includes a pupil extracting means 53a and a pupil diameter calculating means 53b as shown in FIG.
  • The pupil extracting means 53a extracts the region in which the pupil 14a is imaged in the image formed on the first image sensor 44 and the region in which the same pupil 14a is imaged in the image formed on the second image sensor 45. Usually, since there are two pupils, a total of four regions are extracted.
  • the pupil diameter calculating means 53b calculates the three-dimensional coordinates and dimensions of the pupil 14a from the extracted four regions.
  • The flow from the pupil extracting means 53a extracting regions from the single image formed on the first image sensor 44 and the single image formed on the second image sensor 45, until the pupil diameter calculating means 53b calculates the three-dimensional coordinates and dimensions of the one or more pupils, will be described with reference to the flowchart of FIG.
  • the pupil extracting means 53a captures the respective images formed on the first image sensor 44 and the second image sensor 45 (S102; image capture).
  • the pupil extracting means 53a calculates the difference between the image formed on the first image sensor 44 and the image formed on the second image sensor 45 (S103; image difference).
  • the pupil extracting means 53a performs binarization processing on the image after the difference calculation, and extracts two regions in which the pupil 14a is imaged (S104; optical axis A image pupil partial extraction).
  • Next, the pupil diameter calculating means 53b acquires the coordinates (X and Y coordinates) in the imaging element plane around the pupil 14a from the two extracted regions, and calculates at least one pupil diameter (S105; optical axis A image pupil diameter calculation).
  • the pupil diameter calculating means 53b sets a differential filter from the obtained pupil diameter (S106; differential filter setting). Further, the pupil diameter calculating means 53b applies the above-described differential filter to the image formed on the first image sensor 44 (S107; image differential processing).
  • the pupil extracting unit 53a performs binarization processing on the image to which the above-described image differentiation processing is applied, extracts four regions corresponding to the pupil portion 22 in the image, and outputs them to the pupil diameter calculating unit 53b (S108; Pupil part extraction).
  • the pupil diameter calculating means 53b calculates the X and Y coordinates in the first imaging element 44 plane of the four regions corresponding to the pupil portion 22 in the image (S109; coordinate acquisition).
  • Next, the pupil diameter calculating means 53b obtains, by triangulation using the calculation methods shown in Equations 1 to 3, the distance (Z coordinate) in the direction perpendicular to the first image sensor 44 of the real image including the pupil portion 22 imaged on the first image sensor 44 (S110; pupil three-dimensional coordinate acquisition).
  • the pupil diameter calculating means 53b calculates the pupil diameters of the four pupil portions 22 from the obtained three-dimensional coordinates (S111; pupil diameter calculation), and records the calculated pupil diameter (S112; pupil diameter recording).
  • the process returns to step S101 after the process of S112 is completed in the flowchart of FIG.
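The difference, binarization, and diameter-calculation steps above (S103 to S105) can be sketched as follows; this is a minimal illustration in Python/NumPy, and the threshold value and function names are illustrative assumptions rather than the patent's implementation:

```python
import numpy as np

def extract_pupil_mask(bright_img, dark_img, threshold=50):
    """S103 (image difference) and S104 (binarization): the pupil is bright
    only in the bright-pupil image, so the difference isolates it."""
    diff = bright_img.astype(np.int32) - dark_img.astype(np.int32)
    return diff > threshold

def pupil_diameter_px(mask):
    """S105: horizontal extent (Xmax - Xmin + 1) of the binarized pupil region."""
    cols = np.where(mask.any(axis=0))[0]
    return int(cols[-1] - cols[0] + 1) if cols.size else 0
```

For a synthetic bright-pupil image containing a 6-pixel-wide bright spot absent from the dark-pupil image, the mask isolates the spot and the diameter in pixels follows directly.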
  • FIG. 19A is a diagram illustrating the intensity distribution of an image captured by the first image sensor 44, and FIG. 19B is a diagram illustrating the intensity distribution of an image captured by the second image sensor 45.
  • By taking the difference between FIG. 19A, which is a bright pupil image, and FIG. 19B, which is a dark pupil image, only the pupil portion 22 as viewed in the direction of the optical axis A can be extracted, as shown in FIG. 19C.
  • Then, as S105, the right and left pupil diameters DR and DL of the pupil portion 22 in the optical axis A direction can be obtained from the extracted pupil portion 22.
  • the minimum coordinate of the right eye image in the optical axis A direction is recorded as XARmin and the maximum coordinate is recorded as XARmax, as in FIG. 9C.
  • the minimum coordinate of the left eye image in the direction of the optical axis A is recorded as XALmin, and the maximum coordinate is recorded as XALmax.
  • the minimum coordinate of the right eye image in the optical axis B direction is recorded as XBRmin, and the maximum coordinate is recorded as XBRmax.
  • the minimum coordinate of the left eye image in the optical axis B direction is recorded as XBLmin, and the maximum coordinate as XBLmax.
  • Similarly, the Y components can be recorded.
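The recorded X extents are what feed the triangulation of S110 and S111. Since Equations 1 to 3 are not reproduced in this excerpt, the sketch below uses the standard parallel-axis pinhole-stereo formulas in their place; the function name, focal length, and baseline values are illustrative assumptions:

```python
def triangulate_right_pupil(xar_min, xar_max, xbr_min, xbr_max,
                            focal_px, baseline_mm):
    """Parallel-stereo triangulation of the right pupil from its recorded
    X extents in the optical-axis-A and optical-axis-B images."""
    center_a = 0.5 * (xar_min + xar_max)         # pupil center, viewpoint A (px)
    center_b = 0.5 * (xbr_min + xbr_max)         # pupil center, viewpoint B (px)
    disparity = center_a - center_b              # shift between the two viewpoints
    z_mm = focal_px * baseline_mm / disparity    # Z coordinate (S110)
    diameter_px = xar_max - xar_min              # apparent diameter in image A
    diameter_mm = diameter_px * z_mm / focal_px  # real pupil diameter (S111)
    return z_mm, diameter_mm
```

With a 1000-pixel focal length and a 60 mm baseline between the optical axes, a 100-pixel disparity places the pupil at 600 mm, and a 20-pixel apparent diameter then corresponds to a 12 mm real extent.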
  • FIG. 20 is a diagram for explaining the operation of the feature extraction filter.
  • Here, a second-order differential filter represented by Equation 4 is exemplified as the feature extraction filter. The differential interval DR is the right pupil diameter of the pupil portion 22 in the optical axis A direction described above; either of the right and left pupil diameters DR and DL of the pupil portion 22 in the direction of the optical axis A may be applied to Equation 4.
  • By applying the second-order differential filter of FIG. 20B to the image of FIG. 20A as S107 and performing the binarization processing as S108, only features having the diameter DR can be selectively extracted, as shown in FIG. 20C.
  • the pupil portion 22 when viewed in the direction of the optical axis B can also be extracted.
  • the extracted pupil portion 22 is output to the pupil diameter calculating means 53b.
  • I'(x, y) = 2 × I(x, y) − I(x − DR, y) − I(x + DR, y)   (Equation 4)
  • For example, even when the imaging distance varies and the value of DR increases as shown in FIG. 20E, the value of DR can always be obtained from the extracted image of the pupil 14a in the direction of the optical axis A. Using this value DR as the differential interval, the second-order differential filter can be optimized as shown in FIG. 20F, and the pupil 14a can be selectively extracted as shown in FIG. 20G.
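The Equation 4 filter, with the differential interval DR taken from the measured pupil diameter as described above, can be rendered directly; the function name and array handling below are illustrative assumptions:

```python
import numpy as np

def second_order_diff_filter(img, dr):
    """Equation 4 with differential interval DR:
    I'(x, y) = 2*I(x, y) - I(x - DR, y) - I(x + DR, y).
    Border columns within DR of the edge are left at zero."""
    img = img.astype(np.int32)
    out = np.zeros_like(img)
    out[:, dr:-dr] = 2 * img[:, dr:-dr] - img[:, :-2 * dr] - img[:, 2 * dr:]
    return out
```

When DR matches the width of a bright band, the filter responds at twice the band's amplitude; a mismatched DR yields a weaker response, which is what makes the subsequent binarization diameter-selective.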
  • The pupil diameter measuring apparatus 600 may further include pupil state determining means 54 as shown in FIG. FIG. 21 shows a flowchart for the case where the pupil state determining means 54 is provided. The psychological state and physiological state of the detected person 14 are determined from the value of the pupil diameter calculated in FIG. 18 (S111; pupil diameter calculation) and recorded (S112; pupil diameter recording), and from the result of the frequency analysis (S8 in FIG. 12) shown in the third embodiment. As a method for determining the psychological state and physiological state of the detected person 14, as described in the third embodiment, the frequency band of the wavelet transform is divided into a low frequency band of 0.04 to 0.15 Hz and a high frequency band of 0.16 to 0.5 Hz, and the psychological state can be determined from the amplitude ratio between the low frequency band amplitude and the high frequency band amplitude. When the pupil state determining means 54 is provided, the flowchart of FIG. 21 returns to the step of S201 after the processing of S211 is completed. The steps after S212 may be performed in parallel with the processing of S201 to S211 or may be performed separately.
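The low/high band amplitude-ratio criterion described above can be sketched as follows. An FFT band split is used here as a stand-in for the wavelet transform named in the text, and the sampling rate and function name are illustrative assumptions:

```python
import numpy as np

def band_amplitude_ratio(pupil_diam_series, fs):
    """Amplitude ratio between the 0.04-0.15 Hz low band and the
    0.16-0.5 Hz high band of a pupil-diameter time series."""
    x = np.asarray(pupil_diam_series, dtype=float)
    spec = np.abs(np.fft.rfft(x - x.mean()))     # amplitude spectrum, mean removed
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    low = spec[(freqs >= 0.04) & (freqs <= 0.15)].sum()
    high = spec[(freqs >= 0.16) & (freqs <= 0.5)].sum()
    return low / high
```

A pupil-diameter record dominated by slow (e.g. 0.1 Hz) fluctuation yields a ratio well above 1, while dominant fast fluctuation pushes it below 1; the decision threshold itself is not specified in this excerpt.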
  • With the pupil diameter measuring apparatus 600 of this embodiment, even when the difference in intensity level between the face portion 23 and the pupil portion 22 is small, an accurate pupil diameter can be calculated by the binarization processing. Moreover, the pupil state detection apparatus 700 of this embodiment can provide a small and low-cost apparatus that objectively and quantitatively evaluates the mental state in a free posture without restraining the body of the detected person 14.
  • Note that the feature extraction filter may be a two-dimensional distribution filter represented by Equation 5 instead of the second-order differential filter.
  • the filtering process is as shown in FIGS. 20D and 20H.
  • I'(x, y) = 4 × I(x, y) − I(x − D, y) − I(x + D, y) − I(x, y − D) − I(x, y + D)   (Equation 5)
  • the left pupil diameter DL may be used, or the right and left pupil diameters DR and DL may be used alternately.
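Equation 5 can be rendered analogously to the one-dimensional case; again the function name and array handling are illustrative assumptions:

```python
import numpy as np

def feature_filter_2d(img, d):
    """Equation 5: I'(x, y) = 4*I(x, y) - I(x-D, y) - I(x+D, y)
                              - I(x, y-D) - I(x, y+D).
    Borders within D of the edge are left at zero."""
    img = img.astype(np.int32)
    out = np.zeros_like(img)
    core = img[d:-d, d:-d]
    out[d:-d, d:-d] = (4 * core
                       - img[d:-d, :-2 * d] - img[d:-d, 2 * d:]   # x - D, x + D
                       - img[:-2 * d, d:-d] - img[2 * d:, d:-d])  # y - D, y + D
    return out
```

Compared with Equation 4, the response now depends on the intensity distribution in both X and Y, so the filter selects roughly circular features of size D rather than horizontal extent alone.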
  • An LED having a wavelength of 850 nm was used as the first light source 41, and an LED having a wavelength of 940 nm was used as the second light source 42. The two light sources may have other wavelengths as long as the same effect can be obtained.
  • As the wavelength selection filter, a dichroic mirror using a dielectric multilayer film can be used, but other known wavelength selection filters may also be used.
  • In the above description, the optical axis A and the optical axis B are parallel, but convergence imaging, in which the angle between the optical axis A and the optical axis B is changed by changing the angle of each mirror, may also be used.
  • FIG. 22 is a top view showing a configuration of a pupil state detection / display apparatus 800 according to the present embodiment.
  • The pupil state detection display device 800 of the present embodiment differs from the third or fifth embodiment in that a display device 61 is provided between the detected person 14 and the wavelength selection means 43. Other configurations and configuration variations are the same as those in the third and fifth embodiments.
  • FIG. 22A shows a case where the display device 61 is an optical filter 61a; FIG. 22B shows a case with an optical filter 61a, a display unit 61b, and an information processing device 61c; and FIG. 22C shows a case with an optical filter 61a, a display unit 61b, and a half mirror 61d.
  • the pupil state detection device 101 is the pupil state detection device 400 or the pupil state detection device 700 of the third and fifth embodiments of the present invention.
  • Since the optical filter 61a in FIG. 22A totally reflects visible light and totally transmits infrared light, it looks like a mirror when viewed from the detected person 14. Alternatively, if the optical filter 61a is made to absorb visible light and transmit infrared light, it appears as a black plane when viewed from the detected person 14.
  • FIG. 22B shows a configuration in which the result determined by the information processing device 61c is displayed on the display unit 61b based on the determination result of the pupil state detection device 101.
  • Based on the determination result, the detected person 14 may be notified of the fatigue state or mental state, or a visual stimulus for measuring a reaction of the detected person 14, such as the light reflex, may be given.
  • For example, the frequency band of the wavelet transform is divided into a low frequency band of 0.04 to 0.15 Hz and a high frequency band of 0.16 to 0.5 Hz. Then, the psychological state can be determined from the amplitude ratio between the low frequency band amplitude and the high frequency band amplitude.
  • FIG. 22C is a configuration in which a half mirror 61d is added to the configuration shown in FIG. 22B. When there is no display on the display unit 61b, the half mirror 61d acts as a mirror, and when there is a display on the display unit 61b, the mirror image of the detected person 14 and the displayed image can be optically fused.
  • With these configurations, a more accurate psychological state can be evaluated without the detected person 14 feeling that he or she is being measured. Furthermore, in the case of the configuration of FIG. 22B, a visual stimulus can be given to the detected person 14 according to the evaluation result of the psychological state and physiological state by the pupil state detection device 400 or the pupil state detection device 700, which makes further effects on stress management and treatment of mental illness possible. In the case of the configuration of FIG. 22C, the effects of the configurations of FIGS. 22A and 22B described above can be obtained according to the display content.
  • a liquid crystal display can be used as the display unit 61b.
  • an organic EL display or other display may be used.
  • EL is an abbreviation for “Electroluminescence”.
  • the display unit 61b may be a stereoscopic display using a lenticular lens or a parallax barrier, in addition to a normal two-dimensional display.
  • In this case, the left and right pupil positions obtained by the pupil state detection device 101, the convergence angle obtained from the pupil positions, and the focus adjustment state of the eyeball obtained from the pupil diameter may be used to display a parallax image appropriate for observing the stereoscopic display.
  • (Appendix 1) An imaging apparatus comprising: one imaging element; and imaging means for forming, on the one imaging element, a plurality of images at different viewpoints of a common portion of a subject irradiated with light from a light source.
  • The imaging apparatus according to appendix 1, wherein the at least two different viewpoints are a first viewpoint and a second viewpoint, and the imaging means reflects the light from the subject at the first viewpoint by a mirror and makes it incident on a lens to form an image on the one imaging element, and reflects the light from the subject at the second viewpoint by a mirror and makes it incident on the lens, so that images of the subject at the first viewpoint and the second viewpoint are formed at the same time on the one imaging element.
  • The imaging apparatus according to appendix 1 or 2, wherein the light source includes one light source, a dividing unit that divides the light emitted from the one light source, and mirrors that direct the light divided by the dividing unit onto the subject from two directions that are parallel or converge at an angle.
  • The imaging apparatus according to the above appendix, wherein the dividing unit is a half mirror.
  • (Appendix 5) A pupil imaging device comprising: a light source that emits light of a predetermined wavelength; wavelength selection means for selectively transmitting light of the predetermined wavelength; one imaging element; and imaging means for forming, on the one imaging element via the wavelength selection means, respective images at different viewpoints of at least a part of the face including one or more pupils of a detected person irradiated with light of the predetermined wavelength from the light source via the wavelength selection means, wherein the different viewpoints are on different optical axes substantially parallel to the direction in which the light source emits light, and the different optical axes are arranged parallel or converging to each other.
  • (Appendix 6) A pupil diameter measuring device comprising: the pupil imaging device of appendix 5; and pupil diameter estimating means for estimating the pupil diameter by triangulation using at least two regions corresponding to the one or more pupils in one image formed on the one imaging element.
  • (Appendix 8) A pupil state detection device comprising: the pupil diameter measuring device according to appendix 6 or 7; and pupil state determining means for determining the psychological state and physiological state of the detected person from the three-dimensional coordinates and dimensions of the one or more pupils output from the pupil diameter calculating means.
  • (Appendix 9) A pupil imaging device comprising: a first light source that emits light of a first wavelength; a second light source that emits light of a second wavelength from a position different from that of the first light source and at a different angle; wavelength selection means for selectively transmitting the light of the first wavelength and the light of the second wavelength; a first imaging element; a second imaging element; first imaging means, coaxial in optical axis with the first light source, for forming, on the first imaging element via the wavelength selection means, images by the light of the first wavelength at a first viewpoint and a second viewpoint of at least a part of the face including one or more pupils of a detected person irradiated with the light of the first wavelength via the wavelength selection means; and second imaging means for forming, on the second imaging element via the wavelength selection means, an image by the light of the second wavelength at the first viewpoint of at least a part of the face including the one or more pupils of the detected person irradiated with the light of the second wavelength, wherein the first viewpoint and the second viewpoint are on two optical axes substantially parallel to the direction in which the first light source emits light, and the two optical axes are arranged parallel to each other or converging.
  • (Appendix 10) A pupil diameter measuring device comprising: the pupil imaging device according to appendix 9; and pupil diameter estimating means for estimating a pupil diameter by triangulation using at least two regions corresponding to the one or more pupils in one image formed on the first imaging element and one image formed on the second imaging element.
  • The pupil diameter measuring device according to appendix 10, wherein the pupil diameter estimating means further comprises: pupil extracting means for extracting the regions of the one or more pupils from the one image formed on the first imaging element and the one image formed on the second imaging element; and pupil diameter calculating means for calculating and outputting the three-dimensional coordinates and dimensions of the one or more pupils.
  • The pupil imaging device according to appendix 9, wherein the first imaging means includes a first wavelength selection mirror that selectively reflects the light of the first wavelength from at least a part of the face including the one or more pupils at the first viewpoint, and the second imaging means reflects light from at least a part of the face including the one or more pupils at the first viewpoint by a mirror and makes it incident on a lens to form an image on the second imaging element.
  • A pupil state detection display device comprising: the pupil imaging device according to appendix 15; and a display device further provided between the detected person and the wavelength selection means.
  • (Appendix 17) A pupil imaging method comprising: emitting light of a predetermined wavelength from a light source to at least a part of the face including one or more pupils of a detected person; and imaging, on one imaging element, light from at least a part of the face including the one or more pupils at at least two different viewpoints on respective ones of two optical axes substantially parallel to the direction of emitting the light.
  • A pupil imaging method in which: a first light source emits light of a first wavelength to at least a part of the face including one or more pupils of a detected person; a second light source emits light of a second wavelength to at least a part of the face including the one or more pupils of the detected person from a position different from that of the first light source and at a different angle; the light of the first wavelength and the light of the second wavelength entering from at least a part of the face including the one or more pupils of the detected person are incident on first imaging means and second imaging means via wavelength selection means that selectively transmits the light of the first wavelength and the light of the second wavelength; the first imaging means, whose optical axis is coaxial with the first light source, images the light of the first wavelength at a first viewpoint and a second viewpoint of at least a part of the face including the one or more pupils of the detected person on a first imaging element; and the second imaging means images the light of the second wavelength at the first viewpoint of at least a part of the face including the one or more pupils of the detected person on a second imaging element.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • Eye Examination Apparatus (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The problem addressed by the present invention is to provide an imaging device that solves the problems of increased device size and cost. The solution according to the invention is an imaging device comprising an imaging element and image forming means for forming, on said imaging element, a plurality of images from different viewpoints of a shared site on an imaging subject exposed to light from a light source.
PCT/JP2014/005642 2013-11-19 2014-11-10 Dispositif d'imagerie, dispositif d'imagerie de pupille, dispositif de mesure de diamètre de pupille, dispositif de détection d'état de pupille, et procédé d'imagerie de pupille WO2015075894A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015548974A JPWO2015075894A1 (ja) 2013-11-19 2014-11-10 撮像装置、瞳孔撮像装置、瞳孔径測定装置、瞳孔状態検出装置、及び瞳孔撮像方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013238493 2013-11-19
JP2013-238493 2013-11-19

Publications (1)

Publication Number Publication Date
WO2015075894A1 true WO2015075894A1 (fr) 2015-05-28

Family

ID=53179185

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/005642 WO2015075894A1 (fr) 2013-11-19 2014-11-10 Dispositif d'imagerie, dispositif d'imagerie de pupille, dispositif de mesure de diamètre de pupille, dispositif de détection d'état de pupille, et procédé d'imagerie de pupille

Country Status (2)

Country Link
JP (1) JPWO2015075894A1 (fr)
WO (1) WO2015075894A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017179938A1 (fr) * 2016-04-15 2017-10-19 이문기 Dispositif de photographie de l'œil

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0870474A (ja) * 1994-08-29 1996-03-12 Sanyo Electric Co Ltd 立体撮像装置及び立体画像記録再生装置
JP2004261598A (ja) * 2003-02-28 2004-09-24 Agilent Technol Inc 瞳孔検知装置及び方法
WO2007125794A1 (fr) * 2006-04-27 2007-11-08 Konica Minolta Holdings, Inc. Dispositif de mesure de donnees et procede de mesure de donnees
JP2008246013A (ja) * 2007-03-30 2008-10-16 National Univ Corp Shizuoka Univ 眠気検知装置
JP2012068937A (ja) * 2010-09-24 2012-04-05 Panasonic Corp 瞳孔検出装置及び瞳孔検出方法


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017179938A1 (fr) * 2016-04-15 2017-10-19 이문기 Dispositif de photographie de l'œil
KR20170118618A (ko) * 2016-04-15 2017-10-25 이문기 눈 촬영 장치

Also Published As

Publication number Publication date
JPWO2015075894A1 (ja) 2017-03-16

Similar Documents

Publication Publication Date Title
EP2731490B1 (fr) Système et procédé de mesure à distance de la focalisation optique
CN106062665B (zh) 基于用户的眼睛运动和位置的光学感测和跟踪的用户界面
US10204262B2 (en) Infrared imaging recognition enhanced by 3D verification
US8371693B2 (en) Autism diagnosis support apparatus
JP4884777B2 (ja) 眼底観察装置
JP2021045600A (ja) 生体情報検出装置
JP6346525B2 (ja) 視線検出装置
WO2018031249A1 (fr) Capteur d'imagerie hybride permettant la capture d'objets à lumière structurée
JP6631951B2 (ja) 視線検出装置及び視線検出方法
JP4971532B1 (ja) 立体画像撮影装置および内視鏡
CN109964230B (zh) 用于眼睛度量采集的方法和设备
KR20130027671A (ko) 깊이 정보 획득장치 및 이를 포함하는 3차원 정보 획득 시스템
CN103654699A (zh) 激发荧光双目内窥系统
CN106767526A (zh) 一种基于激光mems振镜投影的彩色多线激光三维测量方法
JP2012143363A (ja) 画像処理装置
JP2020515346A (ja) 眼科撮像装置およびシステム
JP2015231498A (ja) 内視鏡装置
JP2015197358A (ja) 路面状態検知システム
WO2015075894A1 (fr) Dispositif d'imagerie, dispositif d'imagerie de pupille, dispositif de mesure de diamètre de pupille, dispositif de détection d'état de pupille, et procédé d'imagerie de pupille
JP5587756B2 (ja) 光学式距離計測装置、光学式距離計測装置の距離計測方法および距離計測用プログラム
KR20130068851A (ko) 입체영상 깊이감 측정 장치 및 방법
JP2019052857A (ja) 撮像装置
WO2014051274A2 (fr) Sonde de balayage portable intégrée à un moniteur et appareil de tomographie en cohérence optique l'utilisant
JP2018082424A (ja) 画像形成装置
US9289120B2 (en) Temporal structured-illumination motion-detection system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14863686

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015548974

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14863686

Country of ref document: EP

Kind code of ref document: A1