WO2021210235A1 - Electronic Device - Google Patents

Electronic Device

Info

Publication number
WO2021210235A1
Authority
WO
WIPO (PCT)
Prior art keywords
eyepiece, line, detection, light, sight
Application number
PCT/JP2021/002814
Other languages
English (en)
Japanese (ja)
Inventor
Hideaki Yamamoto (山本 英明)
Original Assignee
Canon Inc. (キヤノン株式会社)
Application filed by Canon Inc. (キヤノン株式会社)
Priority to CN202180028226.0A (publication CN115427883A)
Publication of WO2021210235A1
Priority to US17/954,470 (publication US20230013134A1)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/04: Systems determining the presence of a target
    • G01S17/06: Systems determining position data of a target
    • G01S17/08: Systems determining position data of a target for measuring distance only
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00: Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/02: Viewfinders
    • G03B13/06: Viewfinders with lenses with or without reflectors
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013: Eye tracking input arrangements
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50: Constructional details
    • H04N23/53: Constructional details of electronic viewfinders, e.g. rotatable or detachable
    • H04N23/56: Cameras or camera modules provided with illuminating means
    • G03B2213/00: Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B2213/02: Viewfinders
    • G03B2213/025: Sightline detection

Definitions

  • The present invention relates to an electronic device having a line-of-sight detection function.
  • Cameras (including video cameras) that can detect the user's line of sight (line-of-sight direction) with a line-of-sight detection function and select a focus detection point based on the line-of-sight detection result have been put into practical use. Cameras having an eyepiece detection function have also been put into practical use, so that the line-of-sight detection function is active only while the user's eye is close to the finder (eyepiece).
  • Patent Document 1 discloses a technique for realizing a line-of-sight detection function and an eyepiece detection function by providing a light-emitting diode for eyepiece detection and an eyepiece detection sensor in addition to a light-emitting diode for line-of-sight detection and a line-of-sight detection sensor.
  • Patent Document 2 discloses a technique for performing line-of-sight detection and eyepiece detection with the same sensor.
  • An object of the present invention is to provide an electronic device that realizes an eyepiece detection function and a line-of-sight detection function with low power consumption at low cost.
  • The electronic device of the present invention is an electronic device capable of performing eyepiece detection for detecting an eye approaching an eyepiece portion and line-of-sight detection for detecting the user's line of sight.
  • It includes an eyepiece detection sensor that receives light for the eyepiece detection, a line-of-sight detection sensor that is different from the eyepiece detection sensor and receives light for the line-of-sight detection, and one or more light sources including a light source that is used for both the eyepiece detection and the line-of-sight detection.
  • According to the present invention, it is possible to provide an electronic device that realizes an eyepiece detection function and a line-of-sight detection function at low cost and with low power consumption.
  • <Explanation of Configuration> FIGS. 1(a) and 1(b) show the appearance of the camera 1 (a digital still camera; an interchangeable-lens camera) according to the present embodiment.
  • The present invention is also applicable to devices that display information such as images and characters, and to any electronic device that can detect the line of sight of a user viewing an optical image through an eyepiece optical system.
  • Such electronic devices may include, for example, mobile phones, game consoles, tablet terminals, personal computers, watch-type and eyeglass-type information terminals, head-mounted displays, binoculars, and the like.
  • FIG. 1(a) is a front perspective view, and FIG. 1(b) is a rear perspective view.
  • The camera 1 has a photographing lens unit 1A and a camera housing 1B.
  • A release button 34, which is an operating member that receives an imaging operation from the user (photographer), is arranged on the camera housing 1B.
  • An eyepiece window frame 121, through which the user looks into the display panel 6 (described later) contained in the camera housing 1B, is arranged on the back surface of the camera housing 1B.
  • The eyepiece window frame 121 forms a viewing port 12 and projects outward (toward the back side) from the camera housing 1B.
  • The operation member 41 is a touch panel that accepts touch operations.
  • The operation member 42 is an operation lever that can be pushed down in each direction.
  • The operation member 43 is a four-direction key that can be pushed down in each of the four directions.
  • The operation member 41 (touch panel) includes a display panel such as a liquid crystal panel and has a function of displaying images on that display panel.
  • FIG. 2 is a block diagram showing the configuration inside the camera 1.
  • The image pickup element 2 is, for example, a CCD or CMOS sensor.
  • The optical image formed on the image pickup surface of the image pickup element 2 by the optical system of the photographing lens unit 1A is photoelectrically converted, and the obtained analog image signal is output to an A/D conversion unit (not shown).
  • The photographing lens unit 1A is composed of an optical system including a zoom lens, a focus lens, an aperture, and the like. While mounted on the camera housing 1B, it guides light from the subject to the image pickup element 2 and forms a subject image on the image pickup surface of the image pickup element 2.
  • The aperture control unit 118, the focus adjustment unit 119, and the zoom control unit 120 each receive an instruction signal from the CPU 3 via the mount contact 117 and drive and control the aperture, the focus lens, and the zoom lens according to the instruction signal.
  • The CPU 3 included in the camera housing 1B reads a control program for each block in the camera housing 1B from the ROM of the memory unit 4, expands it into the RAM of the memory unit 4, and executes it. The CPU 3 thereby controls the operation of each block in the camera housing 1B.
  • The line-of-sight detection unit 201, the photometric unit 202, the automatic focus detection unit 203, the signal input unit 204, the eyepiece detection unit 208, the display device drive unit 210, the light source driving unit 205, and the like are connected to the CPU 3. The CPU 3 also transmits signals, via the mount contact 117, to the aperture control unit 118, the focus adjustment unit 119, and the zoom control unit 120 arranged in the photographing lens unit 1A.
  • The memory unit 4 has a function of storing image pickup signals from the image pickup element 2 and the line-of-sight detection sensor 30.
  • The CPU 3 extracts the feature points required for line-of-sight detection from the eye image according to a predetermined algorithm described later, and calculates the user's line of sight (the viewpoint in the visual recognition image) from the positions of the feature points.
  • The eyepiece detection unit 208 transmits the output of the eyepiece detection sensor 50 to the CPU 3.
  • The CPU 3 determines, according to a predetermined algorithm described later, whether or not the user has brought an eye close to the eyepiece portion (finder; the portion of the viewing port 12).
  • The photometric unit 202 performs amplification, logarithmic compression, A/D conversion, and the like on a signal obtained from the image pickup element 2, which also serves as a photometric sensor (specifically, a luminance signal corresponding to the brightness of the field of view), and sends the result to the CPU 3 as photometric brightness information.
  • The CPU 3 calculates the distance to the subject corresponding to each focus detection point from the signals of the plurality of detection elements.
  • This is a known technique known as imaging surface phase-difference AF.
  • The light source driving unit 205 drives the infrared LEDs 18, 19, 22 to 27, which will be described later, based on a signal (instruction) from the CPU 3. Specifically, the light source driving unit 205 controls the emission intensity (emission amount; emission brightness) of the infrared LEDs 18, 19, 22 to 27 individually or collectively based on the signal from the CPU 3.
  • The control of the emission intensity includes switching the LEDs between lit and unlit states.
  • The image processing unit 206 performs various kinds of image processing on the image data stored in the RAM: for example, pixel-defect correction for the optical system and the image pickup element, demosaicing, white balance correction, color interpolation, and gamma processing, which develop digital image data for display and recording.
  • Switch SW1 and switch SW2 are connected to the signal input unit 204.
  • The switch SW1 is a switch for starting photometry, distance measurement, the line-of-sight detection operation, and the like of the camera 1, and is turned on by the first stroke of the release button 34.
  • The switch SW2 is a switch for starting the shooting operation, and is turned on by the second stroke of the release button 34.
  • The ON signals from the switches SW1 and SW2 are input to the signal input unit 204 and transmitted to the CPU 3.
  • The signal input unit 204 also receives operation inputs from the operation member 41 (touch panel), the operation member 42 (operation lever), and the operation member 43 (four-direction key) shown in FIG. 1(b).
  • The recording / output unit 207 records data including image data on a recording medium such as a removable memory card, or outputs these data to an external device via an external interface.
  • The display device drive unit 210 drives the display device 209 based on the signal from the CPU 3.
  • The display device 209 includes the display panels 5 and 6, which will be described later.
  • FIG. 3 is a cross-sectional view of the camera 1 cut along the YZ plane formed by the Y-axis and the Z-axis shown in FIG. 1A, and is a diagram conceptually showing the configuration of the camera 1.
  • The shutter 32 and the image pickup element 2 are arranged in order in the optical axis direction of the photographing lens unit 1A.
  • A display panel 5 is provided on the back surface of the camera housing 1B; the display panel 5 displays menus and images for operating the camera 1 and for viewing and editing the images obtained by the camera 1.
  • The display panel 5 is composed of a backlit liquid crystal panel, an organic EL panel, or the like.
  • The EVF provided in the camera housing 1B can display menus and images like the display panel 5, as a normal EVF does, and is also structured so that it can detect the line of sight of the user looking into the EVF and reflect the detection result in the control of the camera 1.
  • While the user is looking into the finder, the display panel 6 shows the same display as the display panel 5 (menus and images for operating the camera 1 and for viewing and editing the images obtained by the camera 1).
  • The display panel 6 is composed of a backlit liquid crystal panel, an organic EL panel, or the like.
  • Like an image captured by a general camera, the display panel 6 is a rectangle whose size in the X-axis direction (horizontal direction) is longer than its size in the Y-axis direction (vertical direction), with an aspect ratio such as 3:2, 4:3, or 16:9.
  • The panel holder 7 holds the display panel 6; the display panel 6 and the panel holder 7 are bonded and fixed to each other to form the display panel unit 8.
  • The first optical path dividing prism 9 and the second optical path dividing prism 10 are attached and bonded to each other to form the optical path dividing prism unit 11 (optical path dividing member).
  • The optical path dividing prism unit 11 guides the light from the display panel 6 to the eyepiece window 17 provided in the viewing port 12 and, conversely, guides the reflected light from the eye (pupil) led in from the eyepiece window 17 to the line-of-sight detection sensor 30.
  • The display panel unit 8 and the optical path dividing prism unit 11 are fixed and integrally formed with the mask 33 interposed between them.
  • The eyepiece optical system 16 is composed of the G1 lens 13, the G2 lens 14, and the G3 lens 15.
  • The eyepiece window 17 is a transparent member that transmits visible light.
  • The image displayed on the display panel unit 8 is observed through the optical path dividing prism unit 11, the eyepiece optical system 16, and the eyepiece window 17.
  • The illumination windows 20 and 21 are windows for hiding the infrared LEDs 18, 19, 22 to 27 from outside view, and are made of a resin that absorbs visible light and transmits infrared light.
  • FIG. 4A is a perspective view showing the configuration of the EVF portion of the camera 1, and FIG. 4B is a cross-sectional view along the optical axis of the EVF portion.
  • The infrared LEDs 18, 19, 22, 23, 24, 25, 26, and 27 are arranged at different positions and orientations so that each irradiates infrared light toward the viewing port 12.
  • The infrared LEDs 18, 19, 23, and 25 are infrared LEDs (light sources) for short-range illumination.
  • The infrared LEDs 22, 24, 26, and 27 are infrared LEDs (light sources) for long-range illumination. A light source other than an infrared LED may be used.
  • The line-of-sight detection optical system, which includes the diaphragm 28 and the line-of-sight imaging lens 29, guides the infrared reflected light led from the eyepiece window 17 by the optical path dividing prism unit 11 to the line-of-sight detection sensor 30.
  • The line-of-sight detection sensor 30 is composed of a solid-state image sensor such as a CCD or CMOS.
  • The infrared LEDs 18, 19, 22 to 27 irradiate the user's eyeball with light, and the line-of-sight detection sensor 30 receives the reflected light from the user's eyeball (light emitted from the infrared LEDs 18, 19, 22 to 27 and reflected by the eyeball).
  • The eyepiece detection sensor 50 is composed of a photodiode or the like that can be driven with lower power than the line-of-sight detection sensor 30.
  • The infrared LED 22 also serves as an infrared LED for eyepiece detection; that is, the infrared LED 22 is used for both line-of-sight detection and eyepiece detection.
  • The infrared LED 22 irradiates the user with light, and the eyepiece detection sensor 50 receives the diffusely reflected light from the user (light emitted from the infrared LED 22 and diffusely reflected by the user).
  • The optical image (eyeball image) of the illuminated eyeball passes through the eyepiece window 17, the G3 lens 15, the G2 lens 14, and the G1 lens 13, and enters the second optical path dividing prism 10 from its second surface 10a.
  • A dichroic film that reflects infrared light is formed on the first surface 10b of the second optical path dividing prism 10. As shown by the reflected optical path 31b, the eyeball image that has entered the second optical path dividing prism 10 is reflected by the first surface 10b toward the second surface 10a. Then, as shown by the imaging optical path 31c, the reflected eyeball image is totally reflected by the second surface 10a, exits the second optical path dividing prism 10 from its third surface 10c, passes through the aperture 28, and is imaged on the line-of-sight detection sensor 30 by the line-of-sight imaging lens 29. For line-of-sight detection, a corneal reflection image, formed by the light emitted from the infrared LED being specularly reflected by the cornea, is used together with such an eyeball image.
  • FIG. 5 shows an example of an optical path in which light emitted from infrared LEDs 18, 19, 23, and 25 for short-range illumination is specularly reflected by the cornea 37 of the eyeball and received by the line-of-sight detection sensor 30.
  • FIG. 6 is a diagram for explaining the principle of the line-of-sight detection method, and is a schematic view of an optical system for performing line-of-sight detection. As shown in FIG. 6, the infrared LEDs 51a and 51b irradiate the user's eyeball 140 with infrared light.
  • A part of the infrared light emitted from the infrared LEDs 51a and 51b and reflected by the eyeball 140 forms an image near the line-of-sight detection sensor 30 through the line-of-sight imaging lens 29.
  • In FIG. 6, the positions of the infrared LEDs 51a and 51b, the line-of-sight imaging lens 29, and the line-of-sight detection sensor 30 are drawn so that the principle of the line-of-sight detection method is easy to understand.
  • FIG. 7(a) is a schematic view of the eye image captured by the line-of-sight detection sensor 30 (the eyeball image projected onto the line-of-sight detection sensor 30), and FIG. 7(b) is a diagram showing the output intensity of the line-of-sight detection sensor 30 (for example, a CCD).
  • FIG. 8 shows a schematic flowchart of the line-of-sight detection operation.
  • In step S801, the infrared LEDs 51a and 51b emit infrared light toward the user's eyeball 140 at the emission intensity E2 for line-of-sight detection, according to the instruction from the light source driving unit 205.
  • The user's eyeball image illuminated by the infrared light is formed on the line-of-sight detection sensor 30 through the line-of-sight imaging lens 29 (light receiving lens) and is photoelectrically converted by the line-of-sight detection sensor 30, yielding a processable electrical signal of the eye image.
  • In step S802, the line-of-sight detection unit 201 (line-of-sight detection circuit) sends the eye image (eye image signal; an electrical signal of the eye image) obtained from the line-of-sight detection sensor 30 to the CPU 3.
  • In step S803, the CPU 3 obtains, from the eye image obtained in step S802, the coordinates of the points corresponding to the corneal reflex images Pd and Pe of the infrared LEDs 51a and 51b and to the pupil center c.
  • The infrared light emitted from the infrared LEDs 51a and 51b illuminates the cornea 142 of the user's eyeball 140.
  • The corneal reflex images Pd and Pe, formed by a part of the infrared light reflected on the surface of the cornea 142, are focused by the line-of-sight imaging lens 29 and imaged on the line-of-sight detection sensor 30, becoming the corneal reflex images Pd' and Pe' in the eye image.
  • Similarly, the light from the ends a and b of the pupil 141 is imaged on the line-of-sight detection sensor 30, becoming the pupil end images a' and b' in the eye image.
  • FIG. 7(b) shows the luminance information (luminance distribution) of a region α' in the eye image of FIG. 7(a). Here, the horizontal direction of the eye image is the X-axis direction, the vertical direction is the Y-axis direction, and the brightness distribution in the X-axis direction is shown.
  • Let Xd and Xe be the X-axis (horizontal) coordinates of the corneal reflex images Pd' and Pe', and let Xa and Xb be the X-axis coordinates of the pupil end images a' and b'.
  • An extremely high level of brightness is obtained at the coordinates Xd and Xe of the corneal reflex images Pd' and Pe', an extremely low level is obtained in the region of the pupil image (from Xa to Xb, except at Xd and Xe), and a brightness intermediate between these two levels is obtained elsewhere.
  • From such a brightness distribution, the X coordinates Xd and Xe of the corneal reflex images Pd' and Pe' and the X coordinates Xa and Xb of the pupil end images a' and b' can be obtained: the coordinates of extremely high brightness are obtained as the coordinates of the corneal reflex images Pd' and Pe', and the coordinates of extremely low brightness as the coordinates of the pupil end images a' and b'.
  • The coordinate Xc of the pupil center image c' (the center of the pupil image) can be expressed as Xc ≈ (Xa + Xb) / 2; that is, Xc can be calculated from the X coordinates Xa and Xb of the pupil end images a' and b'. In this way, the coordinates of the corneal reflex images Pd' and Pe' and the coordinates of the pupil center image c' can be estimated.
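  • The feature extraction described above can be sketched as follows. This is a minimal illustration, not the patent's algorithm; the threshold values and the synthetic brightness profile are assumptions. Coordinates of extremely high brightness are taken as the corneal reflex images, the ends of the extremely dark region as the pupil end images, and the pupil center as their midpoint:

```python
def extract_features(profile, hi=200, lo=30):
    """From a 1-D brightness profile (one row of the eye image), find the
    corneal-reflection coordinates Xd, Xe (extremely high brightness),
    the pupil-edge coordinates Xa, Xb (ends of the extremely dark region),
    and the pupil center Xc = (Xa + Xb) / 2.
    The thresholds `hi` and `lo` are illustrative assumptions."""
    bright = [x for x, v in enumerate(profile) if v >= hi]
    dark = [x for x, v in enumerate(profile) if v <= lo]
    xd, xe = min(bright), max(bright)   # corneal reflex images Pd', Pe'
    xa, xb = min(dark), max(dark)       # pupil end images a', b'
    xc = (xa + xb) / 2                  # pupil center image c'
    return xd, xe, xa, xb, xc

# Synthetic profile: mid-brightness surroundings, a dark pupil region,
# and two bright corneal-reflection spikes inside it
profile = [100] * 5 + [10, 10, 10, 250, 10, 10, 250, 10, 10, 10] + [100] * 5
```

A production implementation would of course work on the full two-dimensional eye image and handle noise, but the thresholding idea is the same.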
  • In step S804, the CPU 3 calculates the imaging magnification β of the eyeball image.
  • The imaging magnification β is determined by the position of the eyeball 140 with respect to the line-of-sight imaging lens 29, and can be obtained as a function of the interval (Xd − Xe) between the corneal reflex images Pd' and Pe'.
  • In step S805, the CPU 3 calculates the rotation angle of the optical axis of the eyeball 140 with respect to the optical axis of the line-of-sight imaging lens 29.
  • The X coordinate of the midpoint between the corneal reflex images Pd and Pe substantially coincides with the X coordinate of the center of curvature O of the cornea 142. Therefore, where Oc is the standard distance from the center of curvature O of the cornea 142 to the center c of the pupil 141, the rotation angle θx of the eyeball 140 in the ZX plane (the plane perpendicular to the Y axis) can be calculated by Equation 1.
  • The rotation angle θy of the eyeball 140 in the ZY plane can also be calculated by the same method as for the rotation angle θx.
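  • Equation 1 itself is not reproduced in this text, but in line-of-sight detection methods of this type the rotation angle is commonly derived from the offset between the corneal-reflex midpoint (Xd + Xe)/2 and the pupil center Xc, scaled by β·Oc. The sketch below assumes that standard relation, β·Oc·sin θx ≈ (Xd + Xe)/2 − Xc; treat it as an illustration, not the patent's exact Equation 1:

```python
import math

def rotation_angle_x(xd, xe, xc, beta, oc):
    """Rotation angle θx (radians) of the eyeball in the ZX plane, assuming
    beta * oc * sin(θx) ≈ (Xd + Xe) / 2 - Xc, where `beta` is the imaging
    magnification from step S804 and `oc` is the standard distance from the
    corneal center of curvature O to the pupil center c (scaled into
    sensor-pixel units). This relation is an assumption; the patent's
    Equation 1 is not reproduced in the text."""
    offset = (xd + xe) / 2 - xc
    return math.asin(offset / (beta * oc))
```

When the corneal-reflex midpoint coincides with the pupil center the offset is zero and the eyeball is looking straight down the optical axis (θx = 0).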
  • In step S806, the CPU 3 obtains (estimates) the user's viewpoint (the position the line of sight is directed at; the position the user is looking at) in the visual recognition image displayed on the display panel 6, using the rotation angles θx and θy calculated in step S805.
  • Assuming that the coordinates (Hx, Hy) of the viewpoint are the coordinates corresponding to the pupil center c, they can be calculated by the following Equations 2 and 3.
  • Hx = m × (Ax × θx + Bx)  (Equation 2)
  • Hy = m × (Ay × θy + By)  (Equation 3)
  • The parameter m in Equations 2 and 3 is a constant determined by the configuration of the finder optical system (the line-of-sight imaging lens 29, etc.) of the camera 1; it is a conversion coefficient for converting the rotation angles θx and θy into the coordinates corresponding to the pupil center c in the visual recognition image, and is determined in advance and stored in the memory unit 4.
  • The parameters Ax, Bx, Ay, and By are line-of-sight correction parameters for correcting individual differences in the line of sight; they are acquired by performing a known calibration operation and are stored in the memory unit 4 before the line-of-sight detection operation starts.
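  • Equations 2 and 3 translate directly into code. The sketch below is illustrative; in the actual camera, m, Ax, Bx, Ay, and By would be read from the memory unit 4:

```python
def viewpoint(theta_x, theta_y, m, ax, bx, ay, by):
    """Viewpoint coordinates (Hx, Hy) on the display panel 6:
    Hx = m * (Ax * θx + Bx)   (Equation 2)
    Hy = m * (Ay * θy + By)   (Equation 3)
    `m` converts rotation angles into display coordinates; Ax, Bx, Ay, By
    are the per-user line-of-sight correction (calibration) parameters."""
    return m * (ax * theta_x + bx), m * (ay * theta_y + by)
```

With Ax = Ay = 1 and Bx = By = 0 (a user needing no correction), the viewpoint is simply the rotation angle scaled by the finder-optics constant m.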
  • In step S807, the CPU 3 stores the coordinates (Hx, Hy) of the viewpoint in the memory unit 4 and ends the line-of-sight detection operation.
  • The line-of-sight detection method is not limited to the method described above.
  • The number of infrared LEDs used for line-of-sight detection may be more or fewer than two; line-of-sight detection is performed using one or more infrared LEDs, including the infrared LED 22 that is used for both line-of-sight detection and eyepiece detection.
  • The number of infrared LEDs used for both line-of-sight detection and eyepiece detection is likewise not limited; all of the infrared LEDs used for line-of-sight detection may also be used for eyepiece detection.
  • FIG. 9 shows a schematic flowchart of the operation of the camera 1 including eyepiece detection.
  • In step S901, the infrared LED 22 lights up at the emission intensity E1 for eyepiece detection, according to the instruction from the light source driving unit 205.
  • At this time, the infrared LEDs 18, 19, 23 to 27 are preferably turned off from the viewpoint of reducing power consumption, but may be turned on.
  • The emission intensity E1 for eyepiece detection and the emission intensity E2 for line-of-sight detection may be the same or different. In the present embodiment, the emission intensity E1 is assumed to be set stronger than the emission intensity E2.
  • The infrared light from the infrared LED 22 irradiates the user, and the diffusely reflected light from the user is received by the eyepiece detection sensor 50.
  • In step S902, the CPU 3 determines whether or not the amount of reflected light received by the eyepiece detection sensor 50, that is, the light receiving amount (light receiving intensity; light receiving brightness) of the eyepiece detection sensor 50, exceeds the eyepiece determination threshold Th1.
  • The eyepiece determination threshold Th1 is stored in the memory unit 4 in advance.
  • If the received light amount exceeds the eyepiece determination threshold Th1, it is determined that the user has brought an eye close to the eyepiece portion (finder; the portion of the viewing port 12), and the process proceeds to step S903.
  • If the received light amount does not exceed the eyepiece determination threshold Th1, the process of step S902 is repeated until it does.
  • In step S903, the line-of-sight detection operation described with reference to FIG. 8 is performed.
  • At this time, the emission intensity of the infrared LED 22 is switched from the emission intensity E1 for eyepiece detection to the emission intensity E2 for line-of-sight detection.
  • At least one of the infrared LEDs 18, 19, 23 to 27 may also be lit.
  • As described above, the emission intensity E1 is stronger than the emission intensity E2; that is, the emission intensity of the infrared LED 22 is weakened after the eyepiece is detected.
  • In step S904, the CPU 3 determines whether or not the light receiving amount (light receiving intensity; light receiving brightness) of the eyepiece detection sensor 50 exceeds the eyepiece determination threshold Th2.
  • The eyepiece determination threshold Th2 is stored in the memory unit 4 in advance.
  • The eyepiece determination thresholds Th1 and Th2 are determined based on the emission intensity of the infrared LED 22. Since the emission intensity E2 is weaker than the emission intensity E1 in the present embodiment, the eyepiece determination threshold Th2 is set smaller than the eyepiece determination threshold Th1; that is, the eyepiece determination threshold is reduced after the eyepiece is detected.
  • If the received light amount exceeds the eyepiece determination threshold Th2, it is determined that the user's eye is still on the eyepiece portion (finder; the portion of the viewing port 12), and the process proceeds to step S903. On the other hand, if the received light amount does not exceed the eyepiece determination threshold Th2, it is determined that the user has taken the eye away from the eyepiece, and the operation of FIG. 9 ends, or alternatively the process returns to step S901.
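  • The control flow of FIG. 9 (steps S901 to S904) can be sketched as a small state machine. The numeric values of E1, E2, Th1, and Th2 below are placeholders, and the loop structure is an illustration of the text, not the patent's firmware:

```python
E1, E2 = 1.0, 0.6    # emission intensities for eyepiece / line-of-sight detection (E1 > E2)
TH1, TH2 = 0.5, 0.3  # eyepiece determination thresholds (Th2 < Th1, matching E2 < E1)

def eyepiece_states(received_light):
    """For each received-light sample, return (led_intensity, eye_detected):
    wait at intensity E1 until the light amount exceeds Th1 (step S902),
    then switch to E2 and keep running line-of-sight detection (step S903)
    while the light amount stays above Th2 (step S904)."""
    eye_on = False
    states = []
    for amount in received_light:
        if not eye_on and amount > TH1:    # eye has approached the eyepiece
            eye_on = True
        elif eye_on and amount <= TH2:     # eye has moved away from the eyepiece
            eye_on = False
        states.append((E2 if eye_on else E1, eye_on))
    return states
```

The two thresholds give the loop hysteresis: because the LED is dimmed to E2 once an eye is detected, the exit comparison must use the smaller threshold Th2, exactly as the text explains.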
  • As described above, the infrared LED 22 also serves as an infrared LED for eyepiece detection.
  • Therefore, the eyepiece detection function and the line-of-sight detection function can be realized with fewer light sources than in a configuration in which a light source for eyepiece detection and a light source for line-of-sight detection are provided separately.
  • Since a two-dimensional eyeball image is used for line-of-sight detection, a solid-state image sensor such as a CCD or CMOS must be used as the line-of-sight detection sensor 30. For eyepiece detection, on the other hand, it is sufficient to determine whether or not the amount of reflected light from the user has reached a predetermined amount, so a sensor that can be driven with low power, such as a photodiode, can be used as the eyepiece detection sensor 50. In the present embodiment, the power consumption of the electronic device is further reduced by using, as the eyepiece detection sensor 50, a sensor that can be driven with lower power than the line-of-sight detection sensor 30.
  • The number, arrangement, and type of infrared LEDs are not particularly limited, but in the present embodiment, by providing an infrared LED that illuminates a short distance and an infrared LED that illuminates a long distance, highly accurate line-of-sight detection is possible regardless of whether the eyeball is close to or far from the eyepiece. Further, by providing a plurality of infrared LEDs, the line of sight can be detected more reliably. Specifically, even if the light from one infrared LED is blocked by an eyelid or the like and does not irradiate the cornea, the line of sight can still be detected because another infrared LED illuminates the cornea.
  • The diffused light from the user is used for eyepiece detection, while the specularly reflected light from the cornea is used for line-of-sight detection. Therefore, the light source for eyepiece detection can be arranged with a high degree of freedom, whereas there are many restrictions on the arrangement of the light source for line-of-sight detection. As an example, consider a case where a light source for line-of-sight detection is arranged on the right side of the eyepiece optical system 16 (the positive side of the X axis perpendicular to the optical axis of the eyepiece optical system 16) or on the left side (the negative side of the X axis).
  • In that case, since the eyepiece optical system 16 has a shape that is longer in the X-axis direction than in the Y-axis direction, the light source is arranged at a position far from the optical axis of the eyepiece optical system 16. Although the light specularly reflected outside the eyeball is focused on the line-of-sight detection sensor 30, the light from the light source is easily vignetted (easily blocked) by the eyelids. In particular, such vignetting is likely to occur when the user rotates the camera 1 by 90° around the Z axis to look into the eyepiece optical system 16 and shoots in a vertical (portrait) position.
  • Therefore, in the present embodiment, the infrared LED 22, which is used for both eyepiece detection and line-of-sight detection, is arranged on the upper side of the eyepiece optical system 16 (the positive side of the Y axis perpendicular to the optical axis of the eyepiece optical system 16).
  • As a result, the occurrence of vignetting is suppressed without impairing the comfort of the eyepiece.
  • The same effect can be obtained by arranging the infrared LED 22 on the lower side of the eyepiece optical system 16 (the negative side of the Y axis). The same applies to the infrared LEDs 18, 19, and 23 to 27.
  • The period during which the line-of-sight detection sensor 30 receives light is not particularly limited, but it is preferable that the line-of-sight detection sensor 30 start receiving light (start operating) after the eyepiece is detected. In this way, the line-of-sight detection sensor 30 is not driven until the eyepiece is detected, so the power consumption of the electronic device can be further reduced.
  • The emission intensity of the infrared LEDs 18, 19, and 22 to 27 is not particularly limited, but in the present embodiment their emission intensity can be controlled. Therefore, the infrared LEDs 18, 19, and 22 to 27 can be lit with emission intensities suitable for each of line-of-sight detection and eyepiece detection. Specifically, the emission intensity of the infrared LED 22, which is used for both eyepiece detection and line-of-sight detection, is controlled so as to be weakened after the eyepiece is detected. As a result, during eyepiece detection the eye can be detected even when it is farther away, and during line-of-sight detection the line of sight can be detected with less power consumption.
  • Conversely, the emission intensity of the infrared LED 22 may be controlled so as to be increased after the eyepiece is detected. In that case, eyepiece detection can be performed with less power consumption, and the corneal reflection image is strengthened during line-of-sight detection, so line-of-sight detection becomes more resistant to disturbance. The emission intensity for eyepiece detection and the emission intensity for line-of-sight detection are preferably set according to the performance required of the electronic device.
  • The emission intensity of the infrared LEDs 18, 19, and 23 to 27 may also be controlled as appropriate.
  • For example, the infrared LEDs 18, 19, and 23 to 27 may be turned off during eyepiece detection (before the eyepiece is detected), and during line-of-sight detection (after the eyepiece is detected) their emission intensity may be controlled to be the same as that of the infrared LED 22.
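The per-phase LED control described above (LEDs 18, 19, and 23 to 27 off before the eyepiece is detected, then lit at the same weakened intensity as the infrared LED 22) can be written out as a simple schedule. The intensity values and the dictionary layout below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical per-phase LED intensity schedule. E1 and E2 stand for the
# pre- and post-detection intensities of LED 22 described in the text;
# the concrete numbers are placeholders for illustration.
E1, E2 = 1.0, 0.6

LED_SCHEDULE = {
    # Before eyepiece detection: only LED 22 is lit (it doubles as the
    # eyepiece-detection light source); the line-of-sight-only LEDs are off.
    "before_eye_detected": {22: E1, 18: 0.0, 19: 0.0, 23: 0.0, 24: 0.0,
                            25: 0.0, 26: 0.0, 27: 0.0},
    # After eyepiece detection: all line-of-sight LEDs run at the same
    # weakened intensity as LED 22.
    "after_eye_detected": {led: E2 for led in (18, 19, 22, 23, 24, 25, 26, 27)},
}
```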
  • The eyepiece determination threshold is not particularly limited, but in the present embodiment, the eyepiece determination threshold is controlled in accordance with the control of the emission intensity of the light source (infrared LED). Specifically, since the emission intensity of the infrared LED 22 is weakened after the eyepiece is detected, the eyepiece determination threshold is also reduced after the eyepiece is detected. As a result, eyepiece detection can be performed suitably (with high accuracy) both before and after the eyepiece is detected.
  • Alternatively, the emission intensity of the infrared LED 22 may be increased after the eyepiece is detected; in that case, it is preferable that the eyepiece determination threshold also be increased after the eyepiece is detected.
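One simple way to realize the coupling between emission intensity and determination threshold described in the bullets above is to scale the threshold in proportion to the intensity, on the assumption that the amount of received reflected light is roughly proportional to the emission intensity. The proportional rule itself is an assumption for illustration; the patent only states that the threshold is moved in the same direction as the intensity.

```python
def eyepiece_threshold(base_threshold: float,
                       base_intensity: float,
                       current_intensity: float) -> float:
    """Scale the eyepiece determination threshold with LED emission intensity,
    assuming received reflected light is roughly proportional to emission.
    Illustrative rule only; not taken from the patent."""
    return base_threshold * (current_intensity / base_intensity)
```

With the weakened post-detection intensity E2 < E1, this yields Th2 < Th1 as in the embodiment; with an increased post-detection intensity, it yields the larger threshold of the modified example.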
  • The above-described embodiment (including the modified examples) is merely an example, and the present invention also includes configurations obtained by appropriately modifying or changing the above-described configurations within the scope of the gist of the present invention.
  • The present invention also includes configurations obtained by appropriately combining the above-described configurations.
  • The present invention can also be realized by a process in which a program that implements one or more functions of the above-described embodiment is supplied to a system or device via a network or a storage medium, and one or more processors in a computer of the system or device read and execute the program. It can also be realized by a circuit (for example, an ASIC) that implements one or more functions.
  • 1: Camera, 30: Line-of-sight detection sensor, 50: Eyepiece detection sensor, 18, 19, 22-27: Infrared LED

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Eye Examination Apparatus (AREA)
  • Automatic Focus Adjustment (AREA)
  • Details Of Cameras Including Film Mechanisms (AREA)
  • Exposure Control For Cameras (AREA)
  • Viewfinders (AREA)
  • Studio Devices (AREA)

Abstract

An electronic device according to the present invention is capable of executing eyepiece detection for detecting eye contact with an eyepiece and line-of-sight detection for detecting a user's line of sight. The electronic device comprises the following elements: an eyepiece detection sensor that receives light for eyepiece detection; a line-of-sight detection sensor that is provided separately from the eyepiece detection sensor and receives light for line-of-sight detection; and one or more light sources including a light source used for both eyepiece detection and line-of-sight detection.
PCT/JP2021/002814 2020-04-14 2021-01-27 Electronic device WO2021210235A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180028226.0A CN115427883A (zh) 2020-04-14 2021-01-27 电子装置
US17/954,470 US20230013134A1 (en) 2020-04-14 2022-09-28 Electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020072186A JP2021170699A (ja) 2020-04-14 2020-04-14 電子機器
JP2020-072186 2020-04-14

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/954,470 Continuation US20230013134A1 (en) 2020-04-14 2022-09-28 Electronic device

Publications (1)

Publication Number Publication Date
WO2021210235A1 true WO2021210235A1 (fr) 2021-10-21

Family

ID=78084105

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/002814 WO2021210235A1 (fr) 2020-04-14 2021-01-27 Dispositif électronique

Country Status (4)

Country Link
US (1) US20230013134A1 (fr)
JP (1) JP2021170699A (fr)
CN (1) CN115427883A (fr)
WO (1) WO2021210235A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023062979A1 (fr) * 2021-10-14 2023-04-20 Canon Inc Electronic apparatus and imaging device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05257059A (ja) * 1992-03-13 1993-10-08 Olympus Optical Co Ltd Line-of-sight detection device
JPH07199047A (ja) * 1993-12-30 1995-08-04 Canon Inc Camera with line-of-sight detection function
JPH09253050A (ja) * 1996-03-26 1997-09-30 Canon Inc Line-of-sight detection device and imaging device with line-of-sight detection function
JPH09262209A (ja) * 1996-03-28 1997-10-07 Canon Inc Imaging device provided with line-of-sight detection means and eyepiece detection means
JP2004012503A (ja) * 2002-06-03 2004-01-15 Canon Inc Camera
JP2004215062A (ja) * 2003-01-07 2004-07-29 Minolta Co Ltd Imaging device

Also Published As

Publication number Publication date
CN115427883A (zh) 2022-12-02
JP2021170699A (ja) 2021-10-28
US20230013134A1 (en) 2023-01-19

Similar Documents

Publication Publication Date Title
US11650660B2 (en) Electronic device, control method, and non-transitory computer readable medium
JP2004326118A (ja) アイスタート能力を組み込んだ機器
WO2021210235A1 (fr) Dispositif électronique
US11822714B2 (en) Electronic device and control method for capturing an image of an eye
US20220329740A1 (en) Electronic apparatus, method for controlling electronic apparatus, and non-transitory computer readable storage medium
WO2021210225A1 (fr) Dispositif électronique
US11971552B2 (en) Electronic device, method of controlling the same, and storage medium
JP2021182736A (ja) 電子機器
JP2022124778A (ja) 電子機器
WO2021085541A1 (fr) Dispositif d'imagerie, appareil électronique et unité de recherche
US20230224561A1 (en) Display apparatus, finder apparatus, and imaging apparatus
US20230092593A1 (en) Detection device detecting gaze point of user, control method therefor, and storage medium storing control program therefor
US20240114228A1 (en) Line-of-sight detecting apparatus, image pickup apparatus, line-of-sight detecting method, and storage medium
JP2021076832A (ja) 撮像装置、電子機器、ファインダーユニット
JP2023083695A (ja) 電子機器
JP2008083078A (ja) カメラ
JP2024003432A (ja) 電子機器
JP2023087377A (ja) 視線検出装置
JP2023063023A (ja) 電子機器及び電子機器の制御方法
JP2022096819A (ja) 視線検出装置
JP2022059450A (ja) 表示装置、光電変換装置、電子機器、照明装置、移動体およびウェアラブルデバイス
JP2022187913A (ja) 表示装置、撮像装置、表示装置の制御方法、プログラム、および記録媒体
JPH0990200A (ja) 視線検出手段を有するカメラ

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21788071

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21788071

Country of ref document: EP

Kind code of ref document: A1