EP4054169A1 - Imaging device, electronic apparatus, and finder unit - Google Patents
Imaging device, electronic apparatus, and finder unit
- Publication number
- EP4054169A1 (application EP20881105.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- unit
- sensor
- line
- display panel
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H04N23/53—Constructional details of electronic viewfinders, e.g. rotatable or detachable
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N23/56—Cameras or camera modules provided with illuminating means
- H04N23/62—Control of parameters via user interfaces
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/71—Circuitry for evaluating the brightness variation
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
- H04N23/811—Camera processing pipelines for suppressing or minimising disturbance in the image signal generation by dust removal
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- G02B7/28—Systems for automatic generation of focusing signals
- G03B13/02—Viewfinders
- G03B13/06—Viewfinders with lenses with or without reflectors
- G03B17/02—Camera bodies
- G03B17/12—Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
- G03B17/20—Signals indicating condition of a camera member or suitability of light visible in viewfinder
Definitions
- the present invention relates to an imaging apparatus with a line-of-sight detection function, an electronic device, and a finder unit.
- Electronic viewfinders (EVFs) are known; a user moves an eye close to an EVF to view a display image.
- PTL1 discusses a technique that uses a common optical path for both the optical path of an electronic viewfinder (EVF) display and the optical path of a line-of-sight input sensor, and provides an optical path split mirror on the common optical path to detect a line-of-sight direction (position) of an eye that views the EVF display.
- PTL1 does not discuss an optical arrangement of an electronic viewfinder (EVF) in an imaging apparatus with a line-of-sight input function.
- In particular, a suitable structure for preventing effects of dust in the finder, and the arrangement of the operation members and built-in members of the imaging apparatus relative to the finder and the line-of-sight detection mechanism, are not considered.
- the present invention is directed to a suitable EVF arrangement in an imaging apparatus including an EVF and a line-of-sight input function that prevents a decrease in operability of the imaging apparatus.
- In the present invention, a display panel unit (19) including a display panel (18) is fixed to an optical path split unit (16) configured to split off an optical path to a line-of-sight detection sensor in an electronic viewfinder (EVF) having a line-of-sight detection function.
- the present invention prevents a decrease in operability of an imaging apparatus including an electronic viewfinder (EVF) and a line-of-sight input function.
- Fig. 1 is a diagram illustrating external views of a camera body 101 as an electronic device according to a first exemplary embodiment of the present invention.
- the camera body 101 and a lens unit 102 described below will collectively be referred to as an imaging apparatus 100.
- An electronic device according to an aspect of the present invention is applicable to any electronic device that includes a device displaying information, such as images and characters, and that can detect the line of sight of a user viewing the displayed information. Examples of such electronic devices include mobile phones, game machines, tablet terminals, personal computers, watch- or glasses-type information terminals, and head-mounted displays.
- Fig. 1A is a front perspective view
- Fig. 1B is a rear perspective view
- the camera body 101 includes a release button 103 disposed thereon.
- the release button 103 is an operation member for receiving an imaging operation from a user (image capturing person).
- an eyepiece portion 107 is disposed on a rear side of the camera body 101, and the user can look into an electronic viewfinder (EVF) unit 1 included in the camera body 101 through the eyepiece portion 107.
- The eyepiece portion 107 forms an eyehole and protrudes rearward from the camera body 101.
- An operation member 106 is also disposed on the rear side of the camera body 101. The operation member 106 receives various operations from the user.
- a first operation member 106A is an operation lever that can be pressed in each direction.
- a second operation member 106B is a four-direction key that can be pressed in four directions.
- A display monitor 104 is a touch panel that includes a display panel, such as a liquid crystal panel, and has a function of displaying images on the display panel.
- Fig. 2 is a block diagram illustrating an internal configuration of the camera body 101 according to an aspect of the present invention.
- The image sensor 105 is, for example, a charge-coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
- An optical image formed on an image plane of the image sensor 105 by an optical system of the lens unit 102 is photoelectrically converted by the image sensor 105 into an analog image signal; the analog image signal is output to an analog/digital (A/D) conversion unit, and image data is output from the A/D conversion unit.
- the lens unit 102 consists of the optical system including a zoom lens (not illustrated), a focus lens (not illustrated), and a diaphragm (not illustrated). Mounted on the camera body 101, the lens unit 102 guides light beams from a subject to the image sensor 105 and forms a subject image on the image plane of the image sensor 105.
- A diaphragm control unit 118, a focal point adjustment unit 119, and a zoom control unit 120 each receive instruction signals from a central processing unit (CPU) 111 via a mount contact portion 108B provided on a mount portion 108 and control driving of the diaphragm, the focus lens, and the zoom lens based on the instruction signals.
- the CPU 111 as a control unit of the camera body 101 reads a control program for blocks included in the camera body 101 from a read-only memory (ROM) of a memory unit 112, develops the read program to a random access memory (RAM) of the memory unit 112, and executes the developed program. By doing so, the CPU 111 controls operations of the blocks of the camera body 101 and comprehensively controls the camera body 101 and the lens unit 102.
- a line-of-sight detection unit 4, a light measurement unit 113, an automatic focal point detection unit 114, a signal input unit 115, an EVF driving unit 2, and a light source driving unit 3 are connected to the CPU 111.
- the CPU 111 transmits signals to the focal point adjustment unit 119 and the diaphragm control unit 118 disposed in the lens unit 102 via the mount contact portion 108B.
- The memory unit 112 has a function of storing image signals from the image sensor 105 and the line-of-sight detection sensor 53.
- the line-of-sight detection unit 4 performs A/D conversion on the output (eye image obtained by imaging an eye) from the line-of-sight detection sensor 53 and acquires a result of the A/D conversion.
- the CPU 111 extracts feature points necessary for line-of-sight detection from the eye image according to a predetermined algorithm described below and calculates a line of sight (gaze point on an image for viewing) of the user from positions of the feature points.
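The feature-point-based calculation is only named here, not specified. As a rough illustration of a corneal-reflection gaze estimate of this kind, the following sketch maps the pupil center and two corneal reflections (Purkinje images) to a gaze offset; `pixel_pitch`, `ocular_factor`, and the linear mapping are hypothetical placeholders, not values from the patent:

```python
import numpy as np

def estimate_gaze_point(pupil_center, reflection_a, reflection_b,
                        pixel_pitch=0.005, ocular_factor=0.45):
    """Illustrative gaze estimate from eye-image feature points.

    pupil_center, reflection_a, reflection_b: (x, y) pixel coordinates
    of the pupil image and the two corneal reflections. pixel_pitch and
    ocular_factor are invented calibration constants.
    """
    # The midpoint of the two corneal reflections approximates the
    # projection of the cornea curvature center onto the sensor.
    cornea_center = (np.asarray(reflection_a) + np.asarray(reflection_b)) / 2.0
    # The offset of the pupil image from that midpoint scales with the
    # rotation angle of the eyeball.
    offset = (np.asarray(pupil_center) - cornea_center) * pixel_pitch
    # Map the offset linearly to a gaze point on the display plane.
    return offset / ocular_factor
```

When the pupil sits exactly between the two reflections, the eye is looking straight ahead and the estimated offset is zero.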
- The light measurement unit 113 performs amplification, logarithmic compression, and A/D conversion on signals acquired from the image sensor 105, which also serves as a light measurement sensor (specifically, luminance signals corresponding to the brightness of the field), and acquires the result as field luminance information.
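That chain (amplify, logarithmically compress, quantize) can be sketched as follows; the gain value and bit depth are invented for illustration and are not taken from the patent:

```python
import numpy as np

def measure_field_luminance(raw_signal, gain=4.0, bits=10):
    """Illustrative light-measurement chain: amplification, logarithmic
    compression, and quantization standing in for A/D conversion."""
    amplified = np.asarray(raw_signal, dtype=float) * gain
    # Logarithmic compression maps a wide brightness range into a
    # manageable signal range (log1p avoids log(0)).
    compressed = np.log1p(amplified)
    # Quantize to integer codes, imitating the A/D conversion step.
    full_scale = np.log1p(np.max(amplified)) or 1.0
    return np.round(compressed / full_scale * (2 ** bits - 1)).astype(int)
```

The log compression is why a scene spanning several decades of brightness still fits into a 10-bit code range in this sketch.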
- the automatic focal point detection unit 114 performs A/D conversion on signal voltages from a plurality of detection elements (plurality of pixels) that is included in pixels of the image sensor 105 and is used for phase difference detection.
- the CPU 111 then calculates a distance to a subject corresponding to a focal point detection point based on the converted signals of the plurality of detection elements.
- This is a publicly known technique called image plane phase difference autofocusing (image plane phase difference AF).
- The field image (image for viewing) in the finder is split into 180 portions on the image plane, each of which includes a focal point detection point.
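The core of phase difference detection is finding the lateral shift between two signals from paired detection pixels. A generic sketch of that search (not the patent's actual algorithm; signal names and the search range are illustrative) might look like:

```python
import numpy as np

def phase_shift(signal_a, signal_b, max_shift=4):
    """Find the lateral shift between two detection-pixel signals by
    minimizing the mean absolute difference over overlapping regions.
    The sign and magnitude of the shift indicate defocus direction and
    amount (after calibration)."""
    a = np.asarray(signal_a, dtype=float)
    b = np.asarray(signal_b, dtype=float)
    max_shift = min(max_shift, len(a) - 1)  # keep a nonempty overlap
    best_shift, best_score = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # Compare the overlapping regions of the two signals.
        if s >= 0:
            diff = a[s:] - b[:len(a) - s]
        else:
            diff = a[:len(a) + s] - b[-s:]
        score = np.abs(diff).mean()
        # Prefer the smaller shift when scores tie (large shifts have
        # tiny overlaps and can match trivially).
        if score < best_score or (score == best_score
                                  and abs(s) < abs(best_shift)):
            best_shift, best_score = s, score
    return best_shift
```

A real implementation would interpolate to sub-pixel precision and convert the shift to a subject distance via the lens geometry; this sketch only shows the correlation search.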
- An image processing unit 116 performs various types of image processing on image data stored in the RAM. Specifically, the image processing unit 116 applies various types of image processing for developing digital image data and for displaying and recording the developed data. Examples of the various types of image processing include correction processing for correcting pixel defects originating from the optical system or the image sensor, demosaicing processing, white balance correction processing, color interpolation processing, and gamma processing.
- Switches SW1 and SW2 are connected to the signal input unit 115.
- the switch SW1 is turned on by a first stroke of the release button 103, and the switch SW2 is turned on by a second stroke of the release button 103.
- An instruction to start a light measurement, distance measurement, or line-of-sight detection operation of the camera body 101 is issued with the switch SW1, and an instruction to start an imaging operation is issued with the switch SW2.
- ON signals from the switches SW1 and SW2 are input to the signal input unit 115 and transmitted to the CPU 111.
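The two-stage release logic described above can be summarized in a short sketch; the action names are invented for illustration:

```python
def release_actions(sw1_on, sw2_on):
    """Map the release-button switch states to started operations:
    SW1 (first stroke) starts light measurement, distance measurement,
    and line-of-sight detection; SW2 (second stroke) starts imaging."""
    actions = []
    if sw1_on:
        actions += ["light_measurement", "distance_measurement",
                    "line_of_sight_detection"]
    if sw2_on:
        actions.append("imaging")
    return actions
```

Since SW2 engages on the deeper second stroke, SW1 is normally already on when SW2 turns on, so a full press triggers all four operations.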
- the signal input unit 115 also receives operation input from the operation members 106A and 106B and the display monitor 104 illustrated in Fig. 1B .
- A recording/outputting unit 117 records data, including image data, in a recording medium such as a removable memory card, or outputs such data to external devices via an external interface.
- a basic structure of the imaging apparatus 100 including the camera body 101 and the lens unit 102 has been described above.
- Fig. 3 is a cross-sectional view illustrating the camera body 101 according to the first exemplary embodiment of the present invention and is a cross section cut along a YZ-plane formed by Y- and Z-axes illustrated in Fig. 1A .
- a shutter 109 is a focal-plane mechanical shutter, and the shutter 109 and the image sensor 105 are arranged in this order from the side where the lens unit 102 is disposed.
- a main substrate (not illustrated) is provided between the image sensor 105 and the display monitor 104.
- the lens unit 102 is attachable to and detachable from the camera body 101 via a mount 108A. In a state illustrated in Fig. 3 , the lens unit 102 is mounted on the camera body 101.
- The EVF unit 1 is a display unit provided at an upper portion of the camera body 101 and has a line-of-sight detection function described below.
- the upper portion of the camera body 101 is a side where the EVF unit 1 is provided inside an exterior cover 110, whereas a lower portion of the camera body 101 is a side where not the EVF unit 1 but the display monitor 104 and the image sensor 105 are provided.
- the EVF unit (finder unit) 1 can display menus and images as a typical EVF similarly to the display monitor 104 and, furthermore, has the line-of-sight detection function of detecting a line of sight of the user looking into the EVF. Further, the EVF unit 1 is configured to reflect line-of-sight detection results in the control of the camera body 101. Details of the line-of-sight detection function will be described below.
- The user can look into the EVF unit 1 through the eyepiece portion 107, which includes a removable eyecup 9 attached to an eyecup attachment frame 8.
- Fig. 4 is a cross-sectional view illustrating the EVF unit 1 according to the first exemplary embodiment of the present invention and illustrates a longitudinal cross section with respect to an optical axis of the EVF unit 1 (i.e., an imaging optical axis of an EVF lens unit 29).
- the position of the EVF lens unit 29 illustrated in Fig. 4 indicates a case where a diopter is -1.
- An optical path split prism unit 16 is bonded and fixed to a fixed lens-barrel 12 with a prism mask 13 therebetween.
- the optical path split prism unit 16 is an optical path split unit including a first optical path split prism 14 and a second optical path split prism 15 bonded together.
- a panel holder 17 is a holding mechanism for holding a display panel.
- a display panel 18 and the panel holder 17 are bonded together and fixed to form a display panel unit 19 of the EVF unit 1.
- the display panel unit 19 and the optical path split prism unit 16 are fixed with a panel mask 20 therebetween.
- the panel mask 20 is a mask unit.
- the EVF lens unit (eyepiece optical system) 29 of the EVF unit 1 includes a G1 lens 21, a G2 lens 25, and a finder lens (G3 lens) 26.
- the G1 lens 21 is held by a G1 lens holder 22.
- a G1-G2 mask 23 is also held by the G1 lens holder 22.
- a G2-G3 holder 24 holds the G2 lens 25 and the G3 lens as the finder lens 26.
- the G1 lens holder 22 is fixed to the G2-G3 holder 24 with screws (not illustrated).
- a G3 mask 27 is fixed to the G2-G3 holder 24 with a double-sided masking tape 28.
- the foregoing members 21 to 28 will collectively be referred to as the EVF lens unit 29.
- a light emitting diode (LED) holder 30, an eyepiece window holder 31, and an eyepiece window 32 are provided in the eyepiece portion 107 of the EVF unit 1.
- a structure of the eyepiece window 32 will be described with reference to Fig. 5.
- Fig. 5 is a diagram illustrating the eyepiece window 32 according to the first exemplary embodiment of the present invention, viewed from the EVF lens unit 29 side (inside of the camera).
- the eyepiece window 32 is an optical member having R1 and R2 surfaces.
- the R1 surface is on a side facing the inside of the camera, and the R2 surface is on the opposite side (outside the camera).
- A base material of the eyepiece window 32 is a transparent plate glass.
- a transparent region 32a is a blank region, and a visible light cut mask 32b illustrated as a cross-hatched region surrounds the transparent region. Specifically, the transparent region 32a is located at a portion corresponding to an imaging optical path of the EVF unit 1, and the visible light cut mask 32b is disposed at a region to avoid the imaging optical path.
- The visible light cut mask 32b is printed with a material (ink) that absorbs visible light and transmits infrared light; its appearance is black under visible wavelength light.
- the visible light cut mask 32b may be configured to have a lower visible wavelength light transmittance than an infrared wavelength light transmittance.
- Fig. 6 is a cross-sectional view illustrating the eyepiece window 32 according to an aspect of the present invention along a cross section A-A illustrated in Fig. 5 .
- a plate glass 32c is the base material of the eyepiece window 32.
- An antireflection coat 32d is a coating disposed on a side of the eyepiece window 32 that faces the inside of the camera. The antireflection coat 32d prevents reflection of unnecessary light.
- an antireflection (AR) film 32e is provided on the opposite side to the antireflection coat 32d across the plate glass 32c.
- the AR film 32e has a function of a hard coat, an antireflection coat, and anti-scattering. With the AR film 32e having the anti-scattering function on the side of the eyepiece window 32 that faces the outside of the camera, safety is improved in a situation where the eyepiece window 32 is damaged.
- the user can observe display images displayed on the display panel unit 19 through the optical path split prism unit 16, the EVF lens unit 29, and the transparent region 32a of the eyepiece window 32.
- Fig. 7 is a diagram illustrating a structure of the EVF unit 1 viewed from the eyepiece portion 107 side with the eyepiece window 32 removed. Specifically, Fig. 7 is a diagram illustrating details of the EVF unit 1 in a case where the camera body 101 is viewed from the rear side.
- a display opening mask 31a is provided at a center of the eyepiece window holder 31.
- The display opening mask 31a is horizontally elongated to correspond to the aspect ratio of the EVF and has substantially the same shape as the transparent region 32a of the eyepiece window 32.
- a diopter adjustment dial 33 is rotatably held on a side surface (right side surface) of the EVF unit 1 with a shaft screw 34.
- the diopter adjustment dial is a diopter adjustment unit that moves a lens group of the EVF lens unit 29 in a direction parallel to the optical axis to adjust a diopter when the user looks into the EVF unit 1.
- a right side of the camera body 101 is the side where the release button 103, the first operation member 106A, and the second operation member 106B are located in a direction parallel to an X-axis direction illustrated in Fig. 1A .
- Fixing screws 35 fix the eyepiece window holder 31 and the LED holder 30 together to the fixed lens-barrel 12.
- Infrared LEDs 36, 37, 38, and 39 for line of sight are illumination units for line of sight mainly for short-range illumination and are respectively disposed in opening masks 31b, 31c, 31d, and 31e for LEDs of the eyepiece window holder 31.
- Infrared LEDs 40, 41, 42, and 43 for line of sight are illumination units for line of sight mainly for long-range illumination and are respectively disposed in opening masks 31f, 31g, 31h, and 31i for LEDs of the eyepiece window holder 31.
- the foregoing eight infrared LEDs for line of sight are disposed such that illumination light beams are narrowed by the opening masks for LEDs to emit infrared light toward different positions (emission directions).
- An infrared LED 44 for proximity detection is an illumination unit of a proximity detection unit that detects an approach of an object (mainly the user) to the EVF unit 1 together with a sensor 45 for proximity detection, which will be described below.
- the infrared LED 44 for proximity detection is disposed in an opening mask 31j for LEDs.
- The sensor 45 for proximity detection is a detection unit that receives light beams emitted from the infrared LED for proximity detection in the predetermined emission direction and reflected by an object.
- the sensor 45 for proximity detection is disposed in an opening mask 31k for sensors.
- the four infrared LEDs for line of sight are disposed at positions along the upper long side of the display opening mask 31a, and the other four infrared LEDs for line of sight (eight infrared LEDs for line of sight in total) are disposed at positions along the lower long side of the display opening mask 31a.
- the infrared LED for proximity detection, the sensor for proximity detection, and the diopter adjustment mechanism are disposed at positions along the shorter sides of the display opening mask 31a.
- The eight infrared LEDs 36 to 43 for line of sight, the infrared LED 44 for proximity detection, and the sensor 45 for proximity detection are disposed outside the display opening mask 31a with respect to the optical axis of the EVF unit 1.
- The illumination units and the detection units are disposed in (i.e., overlap on the two-dimensional projection plane with) the region where the visible light cut mask 32b of the eyepiece window 32 is located when the EVF unit 1 is viewed from the rear side of the camera body 101, so that the illumination units and the detection units are not easily visible to the user.
- the illumination units and the detection units are invisible from the outside.
- FIG. 8 is a cross-sectional view illustrating a main portion of the EVF unit 1 and illustrates a longitudinal cross section at a position including the infrared LED 39 for line of sight mainly for short-range illumination in the direction parallel to the optical axis of the EVF unit 1.
- the state illustrated in Fig. 8 indicates a case where the diopter of the EVF unit 1 is +2.0 as the position of the EVF lens unit 29.
- the infrared LED 39 for line of sight for short-range illumination is pressed against the eyepiece window holder 31 by a cushion member 46 attached to the LED holder 30 and is fixed to the opening mask 31e for LEDs.
- a light-shielding wall 31m is a light-shielding member that prevents infrared light emitted from the infrared LED 39 for line of sight for short-range illumination from directly entering the finder lens 26.
- the finder lens 26 includes an R2 surface 26a.
- The R2 surface 26a is a convex surface, and the space between the finder lens 26 and the eyepiece window 32 increases with distance from the lens center toward the periphery. Since the light-shielding wall 31m is disposed in this enlarged region, illumination light from the infrared LED 39 for line of sight for short-range illumination can illuminate even a region at a short distance from the eyepiece window 32, for example.
- Since the eyepiece window 32 includes the visible light cut mask 32b and the transparent region 32a as a single continuous member, no separate connection portion for connecting them is needed. Thus, vignetting of light beams at a connection portion does not occur, and the light distribution characteristics of the EVF unit 1 toward nearby regions improve.
- Although the infrared LED 39 for line of sight is mainly described above, the infrared LEDs 36, 37, and 38 for line of sight and the infrared LEDs 40, 41, 42, and 43 for line of sight have a similar structure and produce the above-described advantages.
- the infrared LED 44 for proximity detection and the sensor 45 for proximity detection are packaged into a single unit as a proximity detection unit 47.
- Fig. 9 is a cross-sectional (horizontal sectional) view illustrating a main portion of the EVF unit 1 at a position including the infrared LED 44 for proximity detection according to the first exemplary embodiment of the present invention.
- Fig. 10 is a cross-sectional (horizontal sectional) view illustrating a main portion of the EVF unit 1 at a position including the sensor 45 for proximity detection according to the first exemplary embodiment of the present invention.
- the positions of the infrared LED 44 for proximity detection and the sensor 45 for proximity detection as the proximity detection unit 47 in the EVF unit 1 are determined. Specifically, the proximity detection unit 47 is pressed against the eyepiece window holder 31 via a second cushion member 48 to determine the position of the proximity detection unit 47 in the EVF unit 1. As illustrated in Fig. 9, the proximity detection unit 47 is disposed such that an illumination surface of the infrared LED 44 for proximity detection is inclined at an angle of 10 degrees with respect to a line parallel to the optical axis of the EVF unit 1.
- similarly, a sensor surface of the sensor 45 for proximity detection faces a direction inclined at an angle of 10 degrees with respect to the line parallel to the optical axis of the EVF unit 1.
- Fig. 11 is a cross-sectional view of a position including the proximity detection unit 47 and is a cross-sectional view that includes the infrared LED 44 for proximity detection and the sensor 45 for proximity detection and is in the direction parallel to the optical axis of the EVF unit 1.
- Fig. 12 is a graph schematically illustrating light transmission/reception characteristics of the proximity detection unit 47.
- a horizontal axis represents angles from a normal direction of the proximity detection unit 47
- a vertical axis represents intensities of light transmission/reception characteristics using the normal direction as a reference.
- as illustrated in Fig. 9, a region shown as cross-hatched portion 49 in an inner wall portion of the eyecup attachment frame 8 and the removable eyecup 9 is illuminated with infrared light emitted from the infrared LED 44 for proximity detection.
- the sensor 45 for proximity detection captures light in a region shown as cross-hatched portion 50 in Fig. 10 in the inner wall portion of the eyecup attachment frame 8 and the removable eyecup 9.
- a signal level for detection with respect to a front direction is not adversely affected because a detection direction is directed in the direction parallel to the optical axis of the EVF unit 1.
- a crosstalk optical path is an optical path 51 via which infrared light emitted from the infrared LED 44 for proximity detection passes through the R1 surface of the eyepiece window 32, is reflected once by the R2 surface, passes through the R1 surface again, and then reaches the sensor 45 for proximity detection. While paths involving a plurality of reflections also exist, only the single-reflection case is described below, because the intensity decreases significantly with each additional reflection. Specifically, since the reflectance of the R2 surface is approximately 2%, the intensity is 2% after one reflection, 0.04% after two reflections, and 0.0008% after three reflections.
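The stated intensity falloff is simply the R2-surface reflectance raised to the number of reflections. As an illustrative sketch (the function name and structure are assumptions, not part of the patent disclosure):

```python
# Relative crosstalk intensity after n reflections off the R2 surface,
# assuming the approximately 2% reflectance stated above.
R2_REFLECTANCE = 0.02  # ~2% reflectance of the R2 surface


def crosstalk_intensity(reflections: int) -> float:
    """Fraction of the emitted infrared light remaining after the
    given number of reflections off the R2 surface."""
    return R2_REFLECTANCE ** reflections


for n in (1, 2, 3):
    print(f"{n} reflection(s): {crosstalk_intensity(n):.6%}")
```

This reproduces the figures above (2% for one reflection, 0.04% for two, 0.0008% for three), which is why only the single-reflection path needs to be considered.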
- the intensity of infrared light in the direction of 29 degrees illustrated in Fig. 11 is approximately 25% with respect to light emitted from a center of the infrared LED 44 for proximity detection.
- Fig. 13 is a cross-sectional view illustrating a main portion of the proximity detection unit 47 according to the first exemplary embodiment of the present invention and illustrates a normal cross section of the proximity detection unit 47 through the center of the infrared LED 44 for proximity detection and a point P.
- the proximity detection unit 47 is inclined at a predetermined angle with respect to the eyepiece window 32, and this effectively reduces crosstalk through a wall surface of the eyecup attachment frame 8 and crosstalk through in-plane reflection by the eyepiece window 32.
- the line-of-sight detection unit 4 includes at least a line-of-sight imaging lens 52, the line-of-sight detection sensor 53, and a diaphragm 56 for line-of-sight detection and may include the infrared LEDs 36 to 43 for line of sight and the optical path split prism unit 16.
- Fig. 14 is a perspective view schematically illustrating mainly a portion relating to the line-of-sight detection function in the EVF unit 1 according to the first exemplary embodiment of the present invention.
- Fig. 15 is a cross-sectional view illustrating mainly the portion relating to the line-of-sight detection function in the EVF unit 1 according to the first exemplary embodiment of the present invention and is a transverse cross-sectional view with respect to the direction parallel to the optical axis of the EVF unit 1.
- Fig. 16 is a diagram schematically illustrating an optical path for line-of-sight detection in the EVF unit 1 according to an aspect of the present invention with respect to the perspective view illustrated in Fig. 14.
- the infrared LEDs 36 to 43 for line of sight are provided at different positions in different orientations to emit infrared light in different directions to the eyeball of the user.
- the infrared LEDs 36 to 43 for line of sight are disposed in different orientations with respect to a glass surface of the plate glass 32c of the eyepiece window 32.
- the infrared LEDs 36, 37, 38, and 39 for line of sight are illumination devices that emit infrared wavelength light mainly for short-range illumination and use a LED as a light source.
- the infrared LEDs 40, 41, 42, and 43 for line of sight are illumination devices that emit infrared wavelength light mainly for long-range illumination and use a LED as a light source.
- the line-of-sight imaging lens 52 is an optical system for line-of-sight detection and forms an image of light emitted from the infrared LEDs for line of sight and reflected by the eye of the user on the line-of-sight detection sensor 53.
- the line-of-sight detection sensor 53 is a sensor for line-of-sight detection using a solid-state image sensor, such as a CCD image sensor, and is a detection unit capable of detecting a line of sight of the eye of the user near the eyepiece portion 107. Since the line-of-sight detection sensor 53 images infrared wavelength reflection light for line-of-sight detection, the line-of-sight detection sensor 53 can be configured to acquire either color images or monochrome images. An opening size of the diaphragm 56 for line-of-sight detection is adjusted to adjust the amount of light entering the line-of-sight detection sensor 53 and to adjust a depth of field such that images of the eyeball of the user are not blurred.
- infrared light emitted from a predetermined infrared LED for line of sight travels as an eyeball image reflected by the eyeball of the user through the eyepiece window 32, the finder lens 26, the G2 lens 25, and the G1 lens 21, and enters a second surface 14a of the first optical path split prism 14.
- An incidence optical path of the incident light is illustrated as an optical path 54 in Fig. 15 .
- a dichroic layer that reflects infrared light is formed on a first surface 14b of the first optical path split prism 14.
- An optical path of the reflection light is illustrated as a reflection optical path 55a in Fig. 15 .
- the reflection light traveling through the reflection optical path 55a is totally reflected by the second surface 14a, travels through an imaging optical path 55b, and forms an image on the line-of-sight detection sensor 53 through the line-of-sight imaging lens 52 via a diaphragm 56.
- the camera body 101 uses a cornea reflection image together with a pupil image of the eyeball for line-of-sight detection.
- the cornea reflection image is formed by specular reflection of illumination from the infrared LEDs for line of sight by a cornea 142 of the eyeball of the user.
- Fig. 17 is a diagram schematically illustrating an arrangement of the eyeball of the user and reflection images.
- Fig. 17 illustrates a case where an eyeball image and cornea reflection images have a short eyeball distance.
- the eyeball of the user includes a pupil 141 and an iris 143, and cornea reflection images 144 to 147 are formed by light emitted from the infrared LEDs for line of sight.
- a reflection image corresponding to the infrared LED 36 for line of sight is the cornea reflection image 144.
- a reflection image corresponding to the infrared LED 37 for line of sight is the cornea reflection image 145.
- a reflection image corresponding to the infrared LED 38 for line of sight is the cornea reflection image 146.
- a reflection image corresponding to the infrared LED 39 for line of sight is the cornea reflection image 147.
- the line-of-sight detection detects a line of sight based on a relative relationship between a pupil center and cornea reflection images.
- the line-of-sight detection can use, for example, a method discussed in Japanese Patent No. 3186072 , and detailed descriptions of the line-of-sight detection method are omitted.
- Fig. 18 is a diagram schematically illustrating a principle of the line-of-sight detection method and is a schematic diagram illustrating an optical system for performing line-of-sight detection.
- light sources 131a and 131b as a virtual light source 131 are disposed substantially symmetrically with respect to an optical axis of a light receiving lens 130 and illuminate an eyeball 140 of the user. Part of the light emitted from the light sources 131a and 131b and reflected by the eyeball 140 is focused on the line-of-sight detection sensor 53 by the light receiving lens 130.
- Fig. 19 is a diagram schematically illustrating an eye image on the line-of-sight detection sensor.
- Fig. 19A is a schematic diagram illustrating an eye image (an eyeball image projected to the line-of-sight detection sensor 53) imaged by the line-of-sight detection sensor 53
- Fig. 19B is a diagram illustrating CCD output intensities of the line-of-sight detection sensor 53.
- Fig. 20 is a schematic flowchart illustrating a line-of-sight detection operation according to the first exemplary embodiment of the present invention.
- step S801 in Fig. 20 the light sources 131a and 131b emit infrared light toward the eyeball 140 of the user.
- An eyeball image of the user illuminated with the infrared light travels through the light receiving lens 130 and is formed on the line-of-sight detection sensor 53, and the line-of-sight detection sensor 53 photoelectrically converts the formed image. A processible electric signal of the eye image is thereby acquired.
- step S802 the line-of-sight detection unit 4 transmits the eye image (eye image signal; electric signal of the eye image) acquired from the line-of-sight detection sensor 53 to the CPU 111.
- step S803 the CPU 111 calculates, from the eye image acquired in step S802, coordinates of points corresponding to cornea reflection images Pd and Pe of the light sources 131a and 131b and a pupil center c.
- the infrared light emitted from the light sources 131a and 131b illuminates the cornea 142 of the eyeball 140 of the user.
- the cornea reflection images Pd and Pe formed by part of the infrared light reflected by a surface of the cornea 142 are focused by the light receiving lens 130, and form images on the line-of-sight detection sensor 53 to obtain cornea reflection images Pd' and Pe' on the eye image.
- light beams from end portions a and b of the pupil 141 form images on the line-of-sight detection sensor 53 to obtain pupil end images a' and b' on the eye image.
- Fig. 19B illustrates luminance information (luminance distribution) about a region ⁇ ' of an eye image in Fig. 19A.
- Fig. 19B illustrates a luminance distribution in an X-axis direction, where the X-axis direction is a horizontal direction of the eye image and a Y-axis direction is a vertical direction of the eye image.
- coordinates Xd and Xe are coordinates of the cornea reflection images Pd' and Pe' in the X-axis direction (horizontal direction)
- coordinates Xa and Xb are coordinates of the pupil end images a' and b' in the X-axis direction.
- intermediate luminances between the above-described two types of luminances are obtained in a region of X-coordinates (coordinates in the X-axis direction) smaller than the coordinate Xa and a region of X-coordinates greater than the coordinate Xb.
- the X-coordinates Xd and Xe of the cornea reflection images Pd' and Pe' and the X-coordinates Xa and Xb of the pupil end images a' and b' are obtained from the luminance distribution illustrated in Fig. 19B . Specifically, coordinates with extremely high luminances are obtained as coordinates of the cornea reflection images Pd' and Pe', and coordinates with extremely low luminances are obtained as coordinates of the pupil end images a' and b'.
- a coordinate Xc of a pupil center image c' (pupil image center) formed on the line-of-sight detection sensor 53 by light beams from the pupil center c is expressed as Xc ≈ (Xa + Xb)/2.
- the coordinate Xc of the pupil center image c' is calculated from the X-coordinates Xa and Xb of the pupil end images a' and b'.
- the coordinates of the cornea reflection images Pd' and Pe' and the pupil center image c' are estimated as described above.
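The coordinate estimation described above — extremely bright cornea reflections, extremely dark pupil ends, and the pupil center as their midpoint — can be sketched as a thresholding pass over a one-dimensional luminance profile. The profile values and thresholds below are illustrative assumptions, not data from the patent:

```python
import numpy as np

# Illustrative X-axis luminance profile of an eye image: intermediate
# iris luminance, an extremely dark pupil span, and two extremely
# bright cornea reflections Pd' and Pe' inside it.
profile = np.full(40, 120)   # iris region: intermediate luminance
profile[12:28] = 20          # pupil region: extremely low luminance
profile[15] = 250            # cornea reflection Pd'
profile[24] = 250            # cornea reflection Pe'

bright = np.where(profile > 200)[0]  # cornea reflection candidates
dark = np.where(profile < 50)[0]     # pupil candidates

xd, xe = int(bright[0]), int(bright[-1])  # cornea reflection coordinates Xd, Xe
xa, xb = int(dark[0]), int(dark[-1])      # pupil end coordinates Xa, Xb
xc = (xa + xb) / 2                        # pupil center: Xc ≈ (Xa + Xb) / 2

print(xd, xe, xa, xb, xc)  # → 15 24 12 27 19.5
```

In a real eye image the same extraction is performed per row of the two-dimensional sensor output, but the bright/dark separation principle is the one shown here.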
- step S804 the CPU 111 calculates an imaging magnification ⁇ of the eyeball image.
- the imaging magnification β is a magnification that is determined based on the position of the eyeball 140 with respect to the light receiving lens 130 and is obtained as a function of the interval (Xd − Xe) between the cornea reflection images Pd' and Pe'.
- step S805 the CPU 111 calculates rotation angles of the optical axis of the eyeball 140 with respect to the optical axis of the light receiving lens 130.
- An X-coordinate of a midpoint between the cornea reflection images Pd and Pe substantially matches an X-coordinate of a center of curvature O of the cornea 142.
- a rotation angle θx of the eyeball 140 in a Z-X plane is calculated using formula (1) below, where Oc is a standard distance from the center of curvature O of the cornea 142 to the center c of the pupil 141.
- a rotation angle θy of the eyeball 140 in a Z-Y plane is calculated using a method similar to the method for calculating the rotation angle θx.
- step S806 the CPU 111 calculates (estimates) a gaze point (a position to which a line of sight is directed; a position at which the user is looking) of the user on an image for viewing that is displayed on the EVF unit 1 using the rotation angles ⁇ x and ⁇ y calculated in step S805.
- the coordinates (Hx, Hy) of the gaze point are calculated using formulas (2) and (3) below.
- Hx = m × (Ax × θx + Bx) ... (2)
- Hy = m × (Ay × θy + By) ... (3)
- the parameter m in formulas (2) and (3) is a constant determined based on a structure of the finder optical system (e.g., the light receiving lens 130) of the camera body 101.
- the parameter m is also a conversion coefficient for converting the rotation angles ⁇ x and ⁇ y into coordinates corresponding to the pupil center c on the image for viewing. These are predetermined and stored in the memory unit 112.
- the parameters Ax, Bx, Ay, and By are line-of-sight correction parameters for correcting individual differences in line of sight.
- the parameters Ax, Bx, Ay, and By are acquired by performing a calibration operation described below and stored in the memory unit 112 before the line-of-sight detection operation is initiated.
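Formulas (2) and (3) can be sketched directly in code. The numeric values below are placeholders for illustration only; in practice m is fixed by the finder optics and Ax, Bx, Ay, By come from the calibration operation:

```python
def gaze_point(theta_x: float, theta_y: float,
               m: float, ax: float, bx: float,
               ay: float, by: float) -> tuple[float, float]:
    """Gaze-point coordinates per Hx = m*(Ax*θx + Bx), Hy = m*(Ay*θy + By)."""
    hx = m * (ax * theta_x + bx)
    hy = m * (ay * theta_y + by)
    return hx, hy


# Placeholder rotation angles (radians) and parameters, illustrative only.
hx, hy = gaze_point(theta_x=0.1, theta_y=-0.05,
                    m=100.0, ax=1.2, bx=0.01, ay=1.1, by=-0.02)
print(hx, hy)
```

The linear form makes the roles of the parameters clear: Ax and Ay scale the rotation angles to account for individual differences, Bx and By shift the result, and m converts the corrected angles into display coordinates.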
- step S807 the CPU 111 stores the coordinates (Hx, Hy) of the gaze point into the memory unit 112, and the line-of-sight detection operation ends.
- Fig. 21 is a diagram schematically illustrating a structure of the optical path split prism unit 16 according to the first exemplary embodiment of the present invention.
- Fig. 21A is a cross-sectional view illustrating details of a portion C specified in Fig. 15 .
- the display panel 18 includes a semiconductor chip 18a and a glass plate 18b.
- the semiconductor chip 18a includes an organic electroluminescent (organic EL) element and a circuit for operating the organic EL element (display surface).
- the display panel 18 is abutted against a display panel abutment surface 17a of the panel holder 17 and fixed by bonding with, for example, an adhesive.
- the panel holder 17 is made of resin.
- the panel holder 17 includes a prism attachment surface 17b on the opposite side to the display panel abutment surface 17a.
- a first double-sided tape 57, a panel mask 20, and a second double-sided tape 58 are arranged in this order between the prism attachment surface 17b and the optical path split prism unit 16 and are each fixed by bonding with a double-sided tape.
- the bonding can use a method using, for example, an adhesive.
- a feature of the present exemplary embodiment is that the optical path split prism unit 16 is substantially sealed and attached without using a protection member for preventing entrance of dust into the display panel 18 (display panel unit 19).
- a conventional EVF includes a protection glass for preventing entrance of dust on a front surface of the display panel unit 19 so that no dust is visible in viewing a screen of a display panel.
- dust may adhere to the outside of the protection glass, and in a case where the protection glass is excessively close to the display surface of the display panel, the dust adhering to the outside of the protection glass may be imaged and included in formed images. The protection glass is therefore disposed at a predetermined distance or longer along the optical axis of the display panel.
- the thickness of the optical path split prism unit 16 in the optical axis direction provides a distance for preventing dust on the outside of the optical path split prism unit 16 from forming an image on the display surface of the display panel.
- the space of the entire finder is made more compact than a case where a conventional display panel unit and an optical system for line-of-sight detection are simply arranged.
- a second opening portion 20a of the panel mask 20 passes light from the display panel 18 and light beams from the eyehole (reflection light from the pupils of the user) while a portion other than the second opening portion 20a of the panel mask 20 cuts stray light entering through end portions of the optical path split prism unit 16.
- the panel mask 20 is made of a material with high heat resistance so as to withstand heat in a case where light beams from the eyehole form an image nearby. According to the present exemplary embodiment, the panel mask 20 is a metal plate with black plating.
- the panel holder 17 includes a first opening portion 17c that passes light from the display panel 18 and light beams from the eyehole (including reflection light from the pupil of the user), while a portion other than the first opening portion 17c provides a mask that makes the circuit portion of the semiconductor chip 18a invisible from the eyehole side. While the first opening portion 17c is opened to be outside the second opening portion 20a when viewed from the EVF lens unit 29 side according to the present exemplary embodiment, the present invention is not limited to that described above.
- Fig. 21B is an exploded perspective view illustrating the display panel unit 19 and the optical path split prism unit 16, and each portion corresponding to Fig. 21A is given the same reference number as in Fig. 21A .
- the display panel unit is attached to the optical path split prism unit 16 by positioning the panel mask 20 and the optical path split prism unit 16 with respect to the display panel 18 using jigs.
- the panel holder 17 includes Y-direction references 17d and 17e to determine a level and a Y-dimension and includes an X-direction reference 17f to determine an X-dimension.
- the panel mask 20 and the optical path split prism unit 16 are attached using the references 17d, 17e, and 17f.
- the optical path split prism unit 16 is substantially sealed and attached without a protection member for preventing entrance of dust into the display panel unit 19 in the EVF unit 1 having the line-of-sight detection function.
- the display panel unit 19 and the optical path split prism unit 16 of the EVF unit 1 according to the present exemplary embodiment are attached together and integrally formed. This makes it unnecessary to provide a separate protection glass for preventing entrance of dust into the display panel unit 19.
- an optical path length from the eyehole of the finder to the display surface of the display panel is reduced, and while the structure (space) of the entire EVF is made compact, a wide viewing angle with respect to the display panel 18 is obtained.
- use of the above-described structure facilitates positioning of three optical axes that are optical axes of the display surface of the display panel unit 19, the optical path split prism unit 16, and the EVF lens unit 29 in assembling the EVF unit 1.
- the optical path split prism unit 16 has a greater interval (space, optical path length) between the display panel 18 and the optical path split prism unit 16 than a predetermined value. With this structure, foreign matter on the optical path split prism unit 16 is prevented from being viewed by the user.
- the EVF unit according to the present exemplary embodiment is applicable to the camera body 101, which is a similar electronic device to that described in the first exemplary embodiment, and a difference from the EVF unit 1 according to the first exemplary embodiment is a structure of an optical path split prism unit.
- Fig. 22 is an exploded perspective view illustrating an optical path split prism unit 216 and a display panel 218 according to the second exemplary embodiment of the present invention.
- the optical path split prism unit 216 according to the present exemplary embodiment includes a first optical path split prism 214, a second optical path split prism 215, and a black mask 201, and a dichroic layer is formed on an attachment surface of the two prisms as in the first exemplary embodiment.
- the black mask 201 is formed by sputtering on an incidence surface of the second optical path split prism 215 on the display unit side.
- the black mask 201 is a shaded portion illustrated in Fig. 22 .
- the black mask 201 includes an opening portion 201a.
- the display panel 218 includes a semiconductor chip 218a and a glass plate 218b as in the first exemplary embodiment.
- the semiconductor chip 218a includes an organic EL element (display surface) and a circuit for operating the organic EL element.
- the display panel 218 is attached directly to the second optical path split prism 215 with a double-sided tape 202.
- a feature is that the black mask 201 is provided to prevent unnecessary matter from being visible on a field when the user looks into the eyepiece portion 107 of the camera body 101.
- the display panel 218 is attached directly to the second optical path split prism 215, and a separate protection glass for preventing entrance of dust into the display panel 218 is unnecessary, so that the optical path length from the eyehole of the finder to the display surface of the display panel is reduced.
- the EVF unit according to the present exemplary embodiment is applicable to the camera body 101, which is a similar electronic device to that described in the first exemplary embodiment, and a difference from the EVF unit 1 according to the first exemplary embodiment is a structure of an optical path split prism unit.
- Fig. 23 is an exploded perspective view illustrating an optical path split prism 315 and a semiconductor chip 318a for display according to the third exemplary embodiment of the present invention.
- the optical path split prism 315 according to the present exemplary embodiment is a prism similar to the second optical path split prism 15 according to the first exemplary embodiment and is attached directly to the semiconductor chip 318a including an organic EL.
- a black mask 301 is formed by sputtering on the optical path split prism 315 as in the second exemplary embodiment.
- the black mask 301 is a shaded portion illustrated in Fig. 23 .
- the black mask 301 includes an opening portion 301a.
- the semiconductor chip 318a is attached directly to the optical path split prism 315 with an adhesive for tight sealing.
- the optical path split prism 315 is attached directly to the semiconductor chip 318a including an organic EL element (display surface) and a circuit for operating the organic EL element.
- FIG. 24 is an external perspective view illustrating a camera body 400 as an electronic device according to the fourth exemplary embodiment of the present invention.
- An EVF unit 401 of the camera body 400 according to the present exemplary embodiment is substantially the same as the EVF unit 1 according to the first exemplary embodiment, so that an arrangement of components of the electronic device and an arrangement and a structure of the EVF unit 401 according to the present exemplary embodiment will be described.
- the camera body 400 according to the present exemplary embodiment and the camera body 101 according to the first exemplary embodiment have basically the same arrangement of the components.
- a positional relationship between the EVF unit 401 and a diopter adjustment portion 416 and a release button 405 according to the present exemplary embodiment is substantially the same as that of the camera body 101 according to the first exemplary embodiment.
- an arrangement of components of the camera body 400 will be described in more detail.
- a mount 402 is disposed on a front surface of the camera body 400, and a camera accessory such as an interchangeable lens is attachable and detachable.
- an accessory shoe 403 is disposed at an upper portion of the camera body 400 and is a connection portion to and from which external devices, such as a flash and a microphone, are attachable and detachable.
- An eyepiece portion 404 is disposed on a rear surface of the camera body 400 and at an upper portion of the camera body 400.
- a grip portion 419 is disposed at a right portion of the camera body 400 viewed from the rear side from which the user looks into the eyepiece portion 404 of the EVF unit 401.
- the user can hold the grip portion 419 with a hand.
- operation units that are manually operable by the user holding the camera body 400 are concentrated at the right side of the camera body 400 according to the present exemplary embodiment.
- the release button 405 and a first operation dial 407, a second operation dial 408, a first setting button 409, and a second setting button 410 for adjusting various parameters relating to imaging conditions and modes are located at an upper right portion of the camera body 400.
- an information display portion 406 is disposed on the right side of the EVF unit 401 (eyepiece portion 404) at an upper portion of the camera body 400 according to the present exemplary embodiment.
- the information display portion 406 is a display unit that can display various types of information, such as a shutter speed and an aperture value relating to exposure conditions, a current imaging mode, and information about whether continuous imaging is on.
- Fig. 25 is a perspective view schematically illustrating an internal structure of the camera body 400 according to the fourth exemplary embodiment of the present invention.
- Fig. 26 is a top external view schematically illustrating an internal structure of the camera body 400 according to the fourth exemplary embodiment of the present invention.
- the diopter adjustment portion 416 is disposed to the right of the EVF unit 401.
- the grip portion 419, the various operation units (406 to 410), and the diopter adjustment portion 416 are concentrated mainly at the right side of the camera body 400. This structure improves the operability of the camera body 400 for the user since users most commonly hold the camera body 400 with the right hand.
- the line-of-sight detection sensor unit 415 of the EVF unit 401 is disposed at the side different from the side that is to be held by the user with respect to an optical path (or optical axis) of the EVF unit 401 in the camera body 400.
- the line-of-sight detection sensor unit 415 of the EVF unit 401 is disposed at an opposite side to the side where the grip portion 419 or various operation units or the diopter adjustment portion 416 is disposed with respect to a lens unit of the EVF unit 401.
- a shutter 411, an image sensor unit 412 including an image sensor, and a display monitor 414 are provided at a lower portion of the EVF unit 401 in the camera body 400.
- the EVF unit 401 and the display monitor 414 are disposed to overlap on a plane (two-dimensional plane) perpendicular to an imaging optical axis of the camera body 400 in order to reduce the size and thickness of the camera body 400.
- the EVF unit 401 is disposed to overlap the central axis (imaging optical axis) of the mount 402 on the plane perpendicular to the central axis.
- the EVF unit 401 is disposed to overlap the imaging optical axis of the camera body 400 and a lens unit attachable to and detachable from the camera body 400 on the two-dimensional plane perpendicular to the imaging optical axis of the lens unit of the EVF unit 401.
- a positioning unit such as a global positioning system (GPS) unit 417 of the camera body 400 and a measurement unit such as a measurement unit 418 for detecting an orientation and a movement of the camera body 400 are disposed at a front side of the EVF unit 401.
- the accessory shoe 403 described above is disposed at an upper portion of the EVF unit 401.
- the foregoing units are also connected to a main substrate 413 similarly to the various operation units described above.
- disposing the line-of-sight detection sensor unit 415 between the units and the EVF unit 401 may lead to an increase in size of the camera body 400 and complicated wiring from the main substrate 413 to the units.
- a layout of the EVF unit 401 for accommodating a line-of-sight detection mechanism while preventing an increase in size of the camera body 400 desirably avoids a grip region of the camera body 400 and a neighborhood of the grip region where other members are less likely to be disposed.
- the line-of-sight detection sensor unit 415 is desirably disposed at the opposite side to the grip region disposed at the right side of the camera body 400 where the grip portion 419 and various operation units are disposed with respect to the optical axis of the EVF unit 401.
- the present invention is not limited to the exemplary embodiments, and various modifications and changes can be made without departing from the scope of the invention.
- the lens unit and the camera body can be integrally provided.
- while imaging apparatuses are illustratively described above in the exemplary embodiments as electronic devices to which the present invention is applied, the present invention is not limited to those described above.
- the exemplary embodiments are applicable to a device, such as a head-mount display that includes the line-of-sight detection function and performs control based on feedback from the line-of-sight detection function.
- a (computer) program according to the flow illustrated in Fig. 20 is stored in advance in a memory unit of the camera body. The components of the camera body 101 having the configuration illustrated in Fig. 2 then cooperate to execute the program and control operations of the entire imaging system.
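As a loose, hypothetical illustration of how such a stored program might coordinate the components during one line-of-sight detection cycle (all class and step names below are assumptions for illustration, not taken from Fig. 20 or the patent):

```python
# Loose illustration of a control program like the one stored in advance in
# the memory unit: the CPU coordinates illumination, imaging, and
# line-of-sight calculation each cycle. Class and step names are assumed.
class InfraredLeds:
    def illuminate(self):
        self.lit = True          # infrared LEDs light the user's eye

class SightSensor:
    def capture(self):
        return "eye_image"       # stand-in for the sensor readout

def line_of_sight_cycle(leds, sensor, estimate_gaze):
    leds.illuminate()
    eye_image = sensor.capture()
    return estimate_gaze(eye_image)   # gaze point on the display

gaze = line_of_sight_cycle(InfraredLeds(), SightSensor(),
                           estimate_gaze=lambda img: (320, 240))
```

The stub callables stand in for the hardware blocks connected to the CPU; the actual control flow is defined by the flowchart of Fig. 20.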
- the program can be in any form that has a program function, such as object code, a program to be executed by an interpreter, or script data to be fed to an operating system (OS).
- a recording medium for feeding the program can be, for example, a hard disk, a magnetic recording medium such as a magnetic tape, or an optical or magneto-optical recording medium.
- the present invention is also realizable by the following process. Specifically, a program for realizing one or more functions of the above-described exemplary embodiments is fed to a system or an apparatus via a network or a storage medium, and one or more processors of a computer of the system or the apparatus read the program and execute the read program. Further, the present invention is also realizable by a circuit (e.g., application-specific integrated circuit (ASIC)) for realizing one or more functions.
Abstract
Description
- The present invention relates to an imaging apparatus with a line-of-sight detection function, an electronic device, and a finder unit.
- Electronic viewfinders (EVFs) have been known as display devices of digital cameras. A user views a display image by bringing an eye close to an EVF.
- Further, cameras and video cameras with a line-of-sight detection function that realize a function of selecting autofocus points by detecting a line-of-sight direction of a user have been put to practical use.
- For example, PTL1 discusses a technique that uses a common optical path as both an optical path of an electronic viewfinder (EVF) display and an optical path of a line-of-sight input sensor and provides an optical path split mirror on the common optical path to detect a line-of-sight direction (position) of an eye that views the EVF display.
- PTL1:
Japanese Patent Application Laid-Open No. 5-130463
- However, PTL1 does not discuss an optical arrangement of an electronic viewfinder (EVF) in an imaging apparatus with a line-of-sight input function. For example, neither a structure suitable for preventing effects of dust in a finder nor the arrangement of a finder and a line-of-sight detection mechanism relative to an operation member and a built-in member of an imaging apparatus is considered.
- The present invention is directed to a suitable EVF arrangement, in an imaging apparatus including an EVF and a line-of-sight input function, that prevents a decrease in operability of the imaging apparatus.
- According to an aspect of the present invention, in order to achieve the above-described object, a display panel unit (19) including a display panel (18) is fixed to an optical path split unit (16) configured to split an optical path to a line-of-sight detection sensor in an electronic viewfinder (EVF) having a line-of-sight detection function.
- The present invention prevents a decrease in operability of an imaging apparatus including an electronic viewfinder (EVF) and a line-of-sight input function.
-
Fig. 1A is a diagram illustrating an exterior view of a camera body 101 as an electronic device according to a first exemplary embodiment of the present invention. -
Fig. 1B is a diagram illustrating an exterior view of the camera body 101 as an electronic device according to the first exemplary embodiment of the present invention. -
Fig. 2 is a block diagram illustrating an internal configuration of the camera body 101 according to an aspect of the present invention. -
Fig. 3 is a cross-sectional view illustrating the camera body 101 according to the first exemplary embodiment of the present invention. -
Fig. 4 is a cross-sectional view illustrating an electronic viewfinder (EVF) unit 1 according to the first exemplary embodiment of the present invention. -
Fig. 5 is a diagram illustrating an eyepiece window 32 according to the first exemplary embodiment of the present invention. -
Fig. 6 is a cross-sectional view illustrating the eyepiece window 32 according to an aspect of the present invention. -
Fig. 7 is a diagram illustrating a structure of the EVF unit 1 according to an aspect of the present invention. -
Fig. 8 is a cross-sectional view illustrating a main portion of the EVF unit 1 according to an aspect of the present invention. -
Fig. 9 is a cross-sectional (horizontal sectional) view illustrating a main portion of the EVF unit 1 at a position including an infrared light emitting diode (LED) 44 for proximity detection according to the first exemplary embodiment of the present invention. -
Fig. 10 is a cross-sectional (horizontal sectional) view illustrating a main portion of the EVF unit 1 at a position including a sensor 45 for proximity detection according to the first exemplary embodiment of the present invention. -
Fig. 11 is a cross-sectional view illustrating the EVF unit 1 at a position including a proximity detection unit 47 according to an aspect of the present invention. -
Fig. 12 is a graph illustrating light transmission/reception characteristics of the proximity detection unit 47. -
Fig. 13 is a cross-sectional (normal cross-sectional) view illustrating a main portion of the proximity detection unit 47 according to the first exemplary embodiment of the present invention. -
Fig. 14 is a perspective view mainly illustrating a portion relating to a line-of-sight detection function in the EVF unit 1 according to the first exemplary embodiment of the present invention. -
Fig. 15 is a cross-sectional view mainly illustrating the portion relating to the line-of-sight detection function in the EVF unit 1 according to the first exemplary embodiment of the present invention. -
Fig. 16 is a diagram illustrating an optical path for line-of-sight detection in the EVF unit 1 according to an aspect of the present invention. -
Fig. 17 is a diagram illustrating an arrangement of an eyeball of a user and reflection images. -
Fig. 18 is a diagram illustrating a principle of a line-of-sight detection method. -
Figs. 19 are diagrams illustrating an eye image on a line-of-sight detection sensor. -
Fig. 20 is a schematic flowchart illustrating a line-of-sight detection operation according to the first exemplary embodiment of the present invention. -
Fig. 21A is a diagram illustrating a structure of an optical path split prism unit 16 according to the first exemplary embodiment of the present invention. -
Fig. 21B is a diagram illustrating the structure of the optical path split prism unit 16 according to the first exemplary embodiment of the present invention. -
Fig. 22 is an exploded perspective view illustrating an optical path split prism unit 216 and a display panel 218 according to a second exemplary embodiment of the present invention. -
Fig. 23 is an exploded perspective view illustrating an optical path split prism 315 and a semiconductor chip 318a for display according to a third exemplary embodiment of the present invention. -
Fig. 24 is an external perspective view illustrating a camera body 400 as an electronic device according to a fourth exemplary embodiment of the present invention. -
Fig. 25 is a perspective view illustrating an internal structure of the camera body 400 according to the fourth exemplary embodiment of the present invention. -
Fig. 26 is a top external view illustrating an internal structure of the camera body 400 according to the fourth exemplary embodiment of the present invention.

Description of Embodiments
- Various exemplary embodiments of the present invention will be described below with reference to the attached drawings.
-
Fig. 1 is a diagram illustrating external views of a camera body 101 as an electronic device according to a first exemplary embodiment of the present invention. According to the present exemplary embodiment, the camera body 101 and a lens unit 102 described below will collectively be referred to as an imaging apparatus 100. Further, the present invention is applicable to any electronic device that displays information, such as images and characters, and that can detect a line of sight of a user viewing the displayed information. Examples of such electronic devices include mobile phones, game machines, tablet terminals, personal computers, watch- or glasses-type information terminals, and head-mount displays. -
Fig. 1A is a front perspective view, and Fig. 1B is a rear perspective view. The camera body 101 includes a release button 103 disposed thereon. The release button 103 is an operation member for receiving an imaging operation from a user (image capturing person). As illustrated in Fig. 1B, an eyepiece portion 107 is disposed on a rear side of the camera body 101, and the user can look into an electronic viewfinder (EVF) unit 1 included in the camera body 101 through the eyepiece portion 107. The eyepiece portion 107 forms an eyehole and protrudes externally (rearward) from the camera body 101. An operation member 106 is also disposed on the rear side of the camera body 101. The operation member 106 receives various operations from the user. For example, a first operation member 106A is an operation lever that can be pressed in each direction. A second operation member 106B is a four-direction key that can be pressed in four directions. A display monitor 104 (touch panel) includes a display panel, such as a liquid crystal panel, and has a function of displaying images on the display panel. -
Fig. 2 is a block diagram illustrating an internal configuration of the camera body 101 according to an aspect of the present invention. An image sensor 105 is, for example, a charge-coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor. An optical image formed on an image plane of the image sensor 105 by an optical system of the lens unit 102 is photoelectrically converted by the image sensor 105 to obtain an analog image signal, the obtained analog image signal is output to an analog/digital (A/D) conversion unit, and image data is output from the A/D conversion unit.
- The lens unit 102 consists of the optical system including a zoom lens (not illustrated), a focus lens (not illustrated), and a diaphragm (not illustrated). Mounted on the camera body 101, the lens unit 102 guides light beams from a subject to the image sensor 105 and forms a subject image on the image plane of the image sensor 105. A diaphragm control unit 118, a focal point adjustment unit 119, and a zoom control unit 120 each receive instruction signals from a central processing unit (CPU) 111 via a mount contact portion 108B provided on a mount portion 108 and control driving of the diaphragm, the focus lens, and the zoom lens based on the instruction signals.
- The CPU 111 as a control unit of the camera body 101 reads a control program for the blocks included in the camera body 101 from a read-only memory (ROM) of a memory unit 112, develops the read program to a random access memory (RAM) of the memory unit 112, and executes the developed program. By doing so, the CPU 111 controls operations of the blocks of the camera body 101 and comprehensively controls the camera body 101 and the lens unit 102. A line-of-sight detection unit 4, a light measurement unit 113, an automatic focal point detection unit 114, a signal input unit 115, an EVF driving unit 2, and a light source driving unit 3 are connected to the CPU 111. Further, the CPU 111 transmits signals to the focal point adjustment unit 119 and the diaphragm control unit 118 disposed in the lens unit 102 via the mount contact portion 108B. According to the present exemplary embodiment, the memory unit 112 has a function of storing imaged signals from the image sensor 105 and a line-of-sight detection sensor 53.
- With an eyeball image formed on the line-of-sight detection sensor 53 (CCD-EYE) described below, the line-of-sight detection unit 4 performs A/D conversion on the output (an eye image obtained by imaging an eye) from the line-of-sight detection sensor 53 and acquires a result of the A/D conversion. The CPU 111 extracts feature points necessary for line-of-sight detection from the eye image according to a predetermined algorithm described below and calculates a line of sight (a gaze point on an image for viewing) of the user from positions of the feature points.
- The light measurement unit 113 performs amplification, logarithmic compression, and A/D conversion on signals acquired from the image sensor 105, which also serves as a light measurement sensor, specifically luminance signals corresponding to the brightness of a field, and acquires the result as field luminance information.
- The automatic focal point detection unit 114 performs A/D conversion on signal voltages from a plurality of detection elements (a plurality of pixels) that are included in the pixels of the image sensor 105 and are used for phase difference detection. The CPU 111 then calculates a distance to a subject corresponding to a focal point detection point based on the converted signals of the plurality of detection elements. This is a publicly known technique known as image plane phase difference autofocusing (image plane phase difference AF). According to the present exemplary embodiment, for example, a field image (an image for viewing) in the finder is split, and each of 180 split portions on the image plane includes a focal point detection point.
- An image processing unit 116 performs various types of image processing on image data stored in the RAM. Specifically, the image processing unit 116 applies various types of image processing for developing digital image data and for displaying and recording the developed digital image data. Examples of the various types of image processing include correction processing for correcting pixel defects originating from an optical system or an image sensor, demosaicing processing, white balance correction processing, color interpolation processing, and gamma processing.
- Switches SW1 and SW2 are connected to the signal input unit 115. The switch SW1 is turned on by a first stroke of the release button 103, and the switch SW2 is turned on by a second stroke of the release button 103. An instruction to start a light measurement, distance measurement, or line-of-sight detection operation of the camera body 101 is issued with the switch SW1, and an instruction to start an imaging operation is issued with the switch SW2. ON signals from the switches SW1 and SW2 are input to the signal input unit 115 and transmitted to the CPU 111. The signal input unit 115 also receives operation input from the operation members illustrated in Fig. 1B.
- A recording/outputting unit 117 records data including image data in a recording medium such as a removable memory card, or outputs such data to external devices via an external interface. A basic structure of the imaging apparatus 100 including the camera body 101 and the lens unit 102 has been described above. -
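As a toy, purely illustrative sketch of the kind of feature-point calculation described above for the line-of-sight detection unit 4 and the CPU 111 (not the patent's actual algorithm, which is described later with reference to the line-of-sight detection principle): the dark pupil and the bright corneal reflections of the infrared LEDs are located in the digitized eye image, and their offset is mapped to a gaze point. The thresholds, gain, and offset below are assumed values.

```python
# Toy pupil-center / corneal-reflection sketch. The eye image is a 2-D grid
# of brightness values; the pupil is the darkest region and the infrared LED
# reflections (glints) are the brightest. All parameters are assumptions.
def centroid(image, predicate):
    pts = [(x, y) for y, row in enumerate(image)
           for x, v in enumerate(row) if predicate(v)]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def estimate_gaze(eye_image, pupil_thresh=30, glint_thresh=220,
                  gain=5.0, offset=(320.0, 240.0)):
    px, py = centroid(eye_image, lambda v: v < pupil_thresh)   # dark pupil
    gx, gy = centroid(eye_image, lambda v: v > glint_thresh)   # LED glints
    # Pupil-to-glint vector, linearly mapped to display coordinates.
    return (offset[0] + gain * (px - gx), offset[1] + gain * (py - gy))
```

The pupil-to-glint vector is largely insensitive to small head movements, which is why corneal-reflection schemes pair an illumination unit with the line-of-sight detection sensor.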
Fig. 3 is a cross-sectional view illustrating the camera body 101 according to the first exemplary embodiment of the present invention and is a cross section cut along a YZ-plane formed by the Y- and Z-axes illustrated in Fig. 1A. - As illustrated in Fig. 3, a shutter 109 is a focal-plane mechanical shutter, and the shutter 109 and the image sensor 105 are arranged in this order from the side where the lens unit 102 is disposed. In the cross-sectional view illustrated in Fig. 3, a main substrate (not illustrated) is provided between the image sensor 105 and the display monitor 104. The lens unit 102 is attachable to and detachable from the camera body 101 via a mount 108A. In the state illustrated in Fig. 3, the lens unit 102 is mounted on the camera body 101. - The
EVF unit 1 is a display unit provided at an upper portion of the camera body 101 and has a line-of-sight detection function described below. The upper portion of the camera body 101 is the side where the EVF unit 1 is provided inside an exterior cover 110, whereas the lower portion of the camera body 101 is the side where not the EVF unit 1 but the display monitor 104 and the image sensor 105 are provided. - The EVF unit (finder unit) 1 can display menus and images as a typical EVF similarly to the display monitor 104 and, furthermore, has the line-of-sight detection function of detecting a line of sight of the user looking into the EVF. Further, the EVF unit 1 is configured to reflect line-of-sight detection results in the control of the camera body 101. Details of the line-of-sight detection function will be described below. - The user can look into the EVF unit 1 through the eyepiece portion 107 including an eyecup 9, which is removable, attached to an eyecup attachment frame 8. - Details of the structure of the
EVF unit 1 will now be described with reference to Fig. 4. Fig. 4 is a cross-sectional view illustrating the EVF unit 1 according to the first exemplary embodiment of the present invention and illustrates a longitudinal cross section with respect to an optical axis of the EVF unit 1 (i.e., an imaging optical axis of an EVF lens unit 29). The position of the EVF lens unit 29 illustrated in Fig. 4 indicates a case where the diopter is -1. - An optical path split prism unit 16 is bonded and fixed to a fixed lens-barrel 12 with a prism mask 13 therebetween. - The optical path split prism unit 16 is an optical path split unit including a first optical path split prism 14 and a second optical path split prism 15 bonded together. - A panel holder 17 is a holding mechanism for holding a display panel. A display panel 18 and the panel holder 17 are bonded and fixed together to form a display panel unit 19 of the EVF unit 1. - The display panel unit 19 and the optical path split prism unit 16 are fixed with a panel mask 20 therebetween. The panel mask 20 is a mask unit. - The EVF lens unit (eyepiece optical system) 29 of the EVF unit 1 includes a G1 lens 21, a G2 lens 25, and a finder lens (G3 lens) 26. The G1 lens 21 is held by a G1 lens holder 22. Further, a G1-G2 mask 23 is also held by the G1 lens holder 22. A G2-G3 holder 24 holds the G2 lens 25 and the G3 lens as the finder lens 26. The G1 lens holder 22 is fixed to the G2-G3 holder 24 with screws (not illustrated). A G3 mask 27 is fixed to the G2-G3 holder 24 with a double-sided masking tape 28. The foregoing members 21 to 28 will collectively be referred to as the EVF lens unit 29. - In the
eyepiece portion 107 of the EVF unit 1, a light emitting diode (LED) holder 30, an eyepiece window holder 31, and an eyepiece window 32 are provided. Next, details of a structure of the eyepiece window 32 will be described with reference to Fig. 5. Fig. 5 is a diagram illustrating the eyepiece window 32 according to the first exemplary embodiment of the present invention and illustrates the eyepiece window 32 viewed from the EVF lens unit 29 side (inside of the camera). - The eyepiece window 32 is an optical member having R1 and R2 surfaces. The R1 surface is on the side facing the inside of the camera, and the R2 surface is on the opposite side (outside the camera). A base material of the eyepiece window 32 is a transparent plate glass. A transparent region 32a is a blank region, and a visible light cut mask 32b illustrated as a cross-hatched region surrounds the transparent region. Specifically, the transparent region 32a is located at a portion corresponding to an imaging optical path of the EVF unit 1, and the visible light cut mask 32b is disposed in a region that avoids the imaging optical path. - The visible light cut mask 32b is a printed material printed with a material (ink) that absorbs visible light and transmits infrared light. The appearance of the visible light cut mask 32b is black under visible wavelength light. The visible light cut mask 32b may be configured to have a lower transmittance for visible wavelength light than for infrared wavelength light. -
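The stated transmittance condition can be expressed as a simple check; the spectral values below are illustrative assumptions, not measured data from the patent:

```python
# Hypothetical spectral sketch of the visible light cut mask: the printed
# ink absorbs visible wavelengths (so the mask looks black) while passing
# infrared light for the LEDs and sensors behind it. Values are assumed.
mask_transmittance = {
    550: 0.02,   # green (visible): almost fully absorbed
    650: 0.03,   # red (visible): almost fully absorbed
    850: 0.90,   # near infrared: transmitted
}

def satisfies_mask_condition(t, visible_nm=(550, 650), ir_nm=850):
    """Check the stated condition: visible transmittance < IR transmittance."""
    return all(t[v] < t[ir_nm] for v in visible_nm)
```

This is why the mask can hide the illumination and detection units from the user (visible light) without blocking the infrared optical paths they rely on.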
Fig. 6 is a cross-sectional view illustrating the eyepiece window 32 according to an aspect of the present invention along a cross section A-A illustrated in Fig. 5. As described above, a plate glass 32c is the base material of the eyepiece window 32. An antireflection coat 32d is a coating disposed on the side of the eyepiece window 32 that faces the inside of the camera. The antireflection coat 32d prevents reflection of unnecessary light. Furthermore, an antireflection (AR) film 32e is provided on the opposite side to the antireflection coat 32d across the plate glass 32c. The AR film 32e functions as a hard coat, an antireflection coat, and an anti-scattering layer. With the AR film 32e having the anti-scattering function on the side of the eyepiece window 32 that faces the outside of the camera, safety is improved in a situation where the eyepiece window 32 is damaged. - As described above, the user can observe display images displayed on the display panel unit 19 through the optical path split prism unit 16, the EVF lens unit 29, and the transparent region 32a of the eyepiece window 32. - Next,
Fig. 7 is a diagram illustrating a structure of the EVF unit 1 viewed from the eyepiece portion 107 side with the eyepiece window 32 removed. Specifically, Fig. 7 is a diagram illustrating details of the EVF unit 1 in a case where the camera body 101 is viewed from the rear side. - As illustrated in Fig. 7, a display opening mask 31a is provided at a center of the eyepiece window holder 31. The display opening mask 31a is horizontally long, corresponding to an aspect ratio of the EVF, and has substantially the same shape as the transparent region 32a of the eyepiece window 32. - A diopter adjustment dial 33 is rotatably held on a side surface (right side surface) of the EVF unit 1 with a shaft screw 34. The diopter adjustment dial is a diopter adjustment unit that moves a lens group of the EVF lens unit 29 in a direction parallel to the optical axis to adjust the diopter when the user looks into the EVF unit 1. The right side of the camera body 101 is the side where the release button 103, the first operation member 106A, and the second operation member 106B are located in a direction parallel to the X-axis direction illustrated in Fig. 1A. - Fixing screws 35 fix the eyepiece window holder 31 and the LED holder 30 together to the fixed lens-barrel 12. -
Infrared LEDs for line of sight are disposed in opening masks for LEDs of the eyepiece window holder 31. - Further infrared LEDs for line of sight are likewise disposed in opening masks for LEDs of the eyepiece window holder 31. The foregoing eight infrared LEDs for line of sight are disposed such that illumination light beams are narrowed by the opening masks for LEDs to emit infrared light toward different positions (emission directions). - An
infrared LED 44 for proximity detection is an illumination unit of a proximity detection unit that, together with a sensor 45 for proximity detection described below, detects an approach of an object (mainly the user) to the EVF unit 1. The infrared LED 44 for proximity detection is disposed in an opening mask 31j for LEDs. - The sensor 45 for proximity detection is a detection unit that receives light beams emitted from the infrared LED for proximity detection in the predetermined emission direction and reflected via an object. The sensor 45 for proximity detection is disposed in an opening mask 31k for sensors. - As illustrated in
Fig. 7, the four infrared LEDs for line of sight are disposed at positions along the upper long side of the display opening mask 31a, and the other four infrared LEDs for line of sight (eight infrared LEDs for line of sight in total) are disposed at positions along the lower long side of the display opening mask 31a. The infrared LED for proximity detection, the sensor for proximity detection, and the diopter adjustment mechanism are disposed at positions along the shorter sides of the display opening mask 31a. With such a structure, the size of the EVF unit 1 on a two-dimensional projection plane can be reduced while the line-of-sight detection function and the diopter adjustment function are included. - Further, as illustrated in Fig. 7, the eight infrared LEDs 36 to 39 for line of sight, the infrared LED 44 for proximity detection, and the sensor 45 for proximity detection are disposed outside the display opening mask 31a with respect to the optical axis of the EVF unit 1. Thus, in a case where the EVF unit 1 is viewed from the rear side of the camera body 101, the illumination units and the detection units are disposed at (overlap on the two-dimensional projection plane) the region where the visible light cut mask 32b of the eyepiece window 32 is located, so that the illumination units and the detection units are not easily visible to the user. With such a structure, the illumination units and the detection units are invisible from the outside.
- Fig. 8 is a cross-sectional view illustrating a main portion of the EVF unit 1 and illustrates a longitudinal cross section at a position including the infrared LED 39 for line of sight mainly for short-range illumination in the direction parallel to the optical axis of the EVF unit 1. The state illustrated in Fig. 8 indicates a case where the diopter of the EVF unit 1 is +2.0 as the position of the EVF lens unit 29. - As illustrated in
Fig. 8, the infrared LED 39 for line of sight for short-range illumination is pressed against the eyepiece window holder 31 by a cushion member 46 attached to the LED holder 30 and is fixed to the opening mask 31e for LEDs. - A light-shielding wall 31m is a light-shielding member that prevents infrared light emitted from the infrared LED 39 for line of sight for short-range illumination from directly entering the finder lens 26. - The finder lens 26 includes an R2 surface 26a. The R2 surface 26a is a convex lens surface, and the space between the finder lens 26 and the eyepiece window 32 increases with distance from the lens center toward the periphery. Since the light-shielding wall 31m is disposed in this increased region, illumination light from the infrared LED 39 for line of sight for short-range illumination can illuminate even a region at a short distance from the eyepiece window 32, for example. - Furthermore, since the eyepiece window 32 includes the visible light cut mask 32b and the transparent region 32a as a single continuous member, there is no need for a separate connection portion for connecting the visible light cut mask 32b and the transparent region 32a together. Thus, vignetting of light beams in a connection portion does not occur, and light distribution characteristics of the EVF unit 1 to nearby regions improve. - While the
infrared LED 39 for line of sight is mainly described above, and the same applies to the other infrared LEDs for line of sight. - Next, details of the relative arrangement of the
infrared LED 44 for proximity detection and the sensor 45 for proximity detection will be described. According to the present exemplary embodiment, the infrared LED 44 for proximity detection and the sensor 45 for proximity detection are packaged into a single unit as a proximity detection unit 47. -
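A minimal sketch of how such a packaged LED/sensor pair can be used for proximity detection, assuming a toy inverse-square reflection model and an arbitrary threshold (neither is specified in the text):

```python
# Hedged sketch of the proximity detection scheme: the infrared LED emits
# light, the sensor measures the intensity reflected back by an object, and
# an approach is reported when that intensity exceeds a threshold. The
# reflection model, albedo, and threshold are illustrative assumptions.
def reflected_intensity(distance_mm, emitted=1.0, albedo=0.5):
    """Toy inverse-square model of light returned from an object."""
    return emitted * albedo / (distance_mm ** 2)

def object_is_near(distance_mm, threshold=1e-4):
    return reflected_intensity(distance_mm) > threshold
```

An eye a few tens of millimetres from the eyepiece returns far more light than a distant object, so a simple threshold separates the two cases; the arrangement described next aims to keep internal reflections (crosstalk) from masquerading as such a nearby object.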
Fig. 9 is a cross-sectional (horizontal sectional) view illustrating a main portion of the EVF unit 1 at a position including the infrared LED 44 for proximity detection according to the first exemplary embodiment of the present invention. Fig. 10 is a cross-sectional (horizontal sectional) view illustrating a main portion of the EVF unit 1 at a position including the sensor 45 for proximity detection according to the first exemplary embodiment of the present invention. -
infrared LED 44 for proximity detection and thesensor 45 for proximity detection as theproximity detection unit 47 in theEVF unit 1 are determined. Specifically, theproximity detection unit 47 is pressed against theeyepiece window holder 31 via asecond cushion member 48 to determine the position of theproximity detection unit 47 in theEVF unit 1. As illustrated inFig. 9 , theproximity detection unit 47 is disposed such that an illumination surface of theinfrared LED 44 for proximity detection is inclined at an angle of 10 degrees with respect to a line parallel to the optical axis of theEVF unit 1. Similarly, - A sensor surface of the
sensor 45 for proximity detection faces a direction inclined at an angle of 10 degrees with respect to the line parallel to the optical axis of theEVF unit 1. -
Fig. 11 is a cross-sectional view at a position including the proximity detection unit 47; the cross section includes the infrared LED 44 for proximity detection and the sensor 45 for proximity detection and is parallel to the optical axis of the EVF unit 1. Fig. 12 is a graph illustrating light transmission/reception characteristics of the proximity detection unit 47. In Fig. 12, the horizontal axis represents angles from a normal direction of the proximity detection unit 47, and the vertical axis represents intensities of the light transmission/reception characteristics using the normal direction as a reference. - Crosstalk as a cause of detection errors in the
proximity detection unit 47 will now be described with reference to Figs. 9 to 12. In Fig. 9, a region illustrated as a cross-hatched portion 49 in an inner wall portion of the eyecup attachment frame 8 and the removable eyecup 9 is illuminated with infrared light emitted from the infrared LED 44 for proximity detection. At this time, consider the ratio between an integral value of infrared light intensities in the cross-hatched portion 49 in the case where the infrared LED 44 for proximity detection is inclined at an angle of 10 degrees (region from 16 degrees to 30 degrees) and an integral value of infrared light intensities in the cross-hatched portion 49 in the case where the infrared LED 44 is not inclined and faces the vertical direction (corresponding to about 6 degrees to 20 degrees because the infrared LED 44 is not inclined at an angle of 10 degrees): the ratio of the 10-degree incline to vertical is approximately 0.6:1. - As illustrated in
Fig. 10, the sensor 45 for proximity detection captures light from a region illustrated as a cross-hatched portion 50 in the inner wall portion of the eyecup attachment frame 8 and the removable eyecup 9. At this time, the ratio between an integral value of detection sensitivities in the cross-hatched portion 50 in the case where the sensor 45 for proximity detection is inclined at an angle of 10 degrees (region from 16 degrees to 30 degrees) and an integral value of detection sensitivities in the cross-hatched portion 50 in the case where the sensor 45 for proximity detection is not inclined and faces the vertical direction (corresponding to about 6 degrees to 20 degrees), i.e., the ratio of the 10-degree incline to vertical, is likewise approximately 0.6:1. - Specifically, the combined ratio of the light transmission side illustrated in
Fig. 9 and the light reception side illustrated in Fig. 10, i.e., the combined ratio of the 10-degree incline to vertical, is approximately 0.6^2:1 = 0.36:1, and the crosstalk originating from the effect of the inner walls of the eyecup attachment frame 8 and the removable eyecup 9 is reduced by approximately 64%. At this time, the signal level for detection in the front direction is not adversely affected because the detection direction is parallel to the optical axis of the EVF unit 1. - Next, crosstalk caused by in-plane reflection in the
eyepiece window 32 will be described with reference to Fig. 11. A crosstalk optical path is an optical path via which infrared light emitted from the infrared LED 44 for proximity detection travels through an optical path 51 with a single reflection, by traveling through the R1 surface of the eyepiece window 32, being reflected by the R2 surface, traveling through the R1 surface again, and then reaching the sensor 45 for proximity detection. While there are cases where a plurality of reflections is involved, only the case of a single reflection will be described below, because the intensity decreases significantly with the number of reflections. Specifically, since the reflectance of the R2 surface is approximately 2%, the intensity is 2% with one reflection, 0.04% with two reflections, and 0.0008% with three reflections. - In a case where the
proximity detection unit 47 is opposite the eyepiece window 32, the intensity of infrared light in the direction of 29 degrees illustrated in Fig. 11 is approximately 25% with respect to light emitted from a center of the infrared LED 44 for proximity detection. - A method of calculating an angle with respect to a normal line in a case where the
proximity detection unit 47 is inclined at an angle of approximately 10 degrees as illustrated in Fig. 9 will be described below. Fig. 13 is a cross-sectional (normal cross-sectional) view illustrating a main portion of the proximity detection unit 47 according to the first exemplary embodiment of the present invention and illustrates a normal cross section of the proximity detection unit 47 through the center of the infrared LED 44 for proximity detection and a point P. - As illustrated in
Fig. 13, an angle formed by the normal line of the infrared LED 44 for proximity detection and the line passing through the point P, which is 29 degrees before reflection, changes to 31 degrees as a result of a single reflection. The intensity of infrared light in the above-described case is about 22% (the ratio of the 10-degree incline to vertical is approximately 0.22 : 0.25 = 0.88 : 1) with respect to the center of the infrared LED 44 for proximity detection, based on the graph of light transmission/reception characteristics illustrated in Fig. 12. - The ratio of the 10-degree incline to vertical is approximately 0.88² : 1 = 0.77 : 1 in the
proximity detection unit 47 in total, and crosstalk originating from an effect of in-plane reflection in the eyepiece window 32 is reduced by approximately 23%. - As described above, the
proximity detection unit 47 is inclined at a predetermined angle with respect to the eyepiece window 32, and this effectively reduces crosstalk through a wall surface of the eyecup attachment frame 8 and crosstalk through in-plane reflection by the eyepiece window 32. - Next, a structure of the optical path split
prism unit 16 and the line-of-sight detection unit 4 in the EVF unit 1 will be described. The line-of-sight detection unit 4 according to the present exemplary embodiment includes at least a line-of-sight imaging lens 52, the line-of-sight detection sensor 53, and a diaphragm 56 for line-of-sight detection and may include the infrared LEDs 36 to 43 for line of sight and the optical path split prism unit 16. -
Fig. 14 is a perspective view schematically illustrating mainly a portion relating to the line-of-sight detection function in the EVF unit 1 according to the first exemplary embodiment of the present invention. Fig. 15 is a cross-sectional view illustrating mainly the portion relating to the line-of-sight detection function in the EVF unit 1 according to the first exemplary embodiment of the present invention and is a transverse cross-sectional view with respect to the direction parallel to the optical axis of the EVF unit 1. Fig. 16 is a diagram schematically illustrating an optical path for line-of-sight detection in the EVF unit 1 according to an aspect of the present invention with respect to the perspective view illustrated in Fig. 14. - The
infrared LEDs 36 to 43 for line of sight are provided at different positions and in different orientations to emit infrared light in different directions to the eyeball of the user. For example, the infrared LEDs 36 to 43 for line of sight are disposed in different orientations with respect to a glass surface of the plate glass 32c of the eyepiece window 32. - The
infrared LEDs infrared LEDs - The line-of-
sight imaging lens 52 is an optical system for line-of-sight detection and forms, on the line-of-sight detection sensor 53, an image of light emitted from the infrared LEDs for line of sight and reflected by the eye of the user. The line-of-sight detection sensor 53 is a sensor for line-of-sight detection using a solid-state image sensor, such as a CCD image sensor, and is a detection unit capable of detecting a line of sight of the eye of the user near the eyepiece portion 107. Since the line-of-sight detection sensor 53 images infrared-wavelength reflection light for line-of-sight detection, the line-of-sight detection sensor 53 can be configured to acquire either color images or monochrome images. An opening size of the diaphragm 56 for line-of-sight detection is adjusted to adjust the amount of light entering the line-of-sight detection sensor 53 and to adjust a depth of field such that images of the eyeball of the user are not blurred. - As illustrated in
Fig. 15, infrared light emitted from a predetermined infrared LED for line of sight travels, as an eyeball image reflected by the eyeball of the user, through the eyepiece window 32, the finder lens 26, the G2 lens 25, and the G1 lens 21, and enters a second surface 14a of the first optical path split prism 14. An incidence optical path of the incident light is illustrated as an optical path 54 in Fig. 15. - On a
first surface 14b of the first optical path split prism 14, a dichroic layer that reflects infrared light is formed. Thus, the eyeball image of the user after entering the EVF unit 1 is reflected by the first surface 14b and travels toward the second surface 14a. An optical path of the reflection light is illustrated as a reflection optical path 55a in Fig. 15. - The reflection light traveling through the reflection
optical path 55a is totally reflected by the second surface 14a, travels through an imaging optical path 55b, and forms an image on the line-of-sight detection sensor 53 through the line-of-sight imaging lens 52 via the diaphragm 56. - As illustrated in
Fig. 16, the camera body 101 according to the present exemplary embodiment uses a cornea reflection image together with a pupil image of the eyeball for line-of-sight detection. The cornea reflection image is formed by specular reflection of illumination from the infrared LEDs for line of sight by a cornea 142 of the eyeball of the user. -
Fig. 17 is a diagram schematically illustrating an arrangement of the eyeball of the user and reflection images. Fig. 17 illustrates an eyeball image and cornea reflection images in a case where the eyeball distance is short. In Fig. 17, the eyeball of the user includes a pupil 141 and an iris 143, and cornea reflection images 144 to 147 are cornea reflection images formed by light emitted from the infrared LEDs for line of sight. - A reflection image corresponding to the
infrared LED 36 for line of sight is the cornea reflection image 144. A reflection image corresponding to the infrared LED 37 for line of sight is the cornea reflection image 145. A reflection image corresponding to the infrared LED 38 for line of sight is the cornea reflection image 146. A reflection image corresponding to the infrared LED 39 for line of sight is the cornea reflection image 147. - The line-of-sight detection (detection of a line-of-sight direction) according to the present exemplary embodiment detects a line of sight based on a relative relationship between a pupil center and cornea reflection images. The line-of-sight detection can use, for example, a method discussed in
Japanese Patent No. 3186072. - A line-of-sight detection method will be described with reference to
Figs. 18, 19A, 19B, and 20. Fig. 18 is a diagram schematically illustrating a principle of the line-of-sight detection method and is a schematic diagram illustrating an optical system for performing line-of-sight detection. As illustrated in Fig. 18, light sources are disposed near a light receiving lens 130 and illuminate an eyeball 140 of the user. Part of the light emitted from the light sources and reflected by the eyeball 140 is focused on the line-of-sight detection sensor 53 by the light receiving lens 130. Figs. 19A and 19B are diagrams schematically illustrating an eye image on the line-of-sight detection sensor. Fig. 19A is a schematic diagram illustrating an eye image (an eyeball image projected onto the line-of-sight detection sensor 53) imaged by the line-of-sight detection sensor 53, and Fig. 19B is a diagram illustrating CCD output intensities of the line-of-sight detection sensor 53. Fig. 20 is a schematic flowchart illustrating a line-of-sight detection operation according to the first exemplary embodiment of the present invention. - If the line-of-sight detection operation is started, in step S801 in
Fig. 20, the light sources emit infrared light toward the eyeball 140 of the user. An eyeball image of the user illuminated with the infrared light travels through the light receiving lens 130 and is formed on the line-of-sight detection sensor 53, and the line-of-sight detection sensor 53 photoelectrically converts the formed image. An electric signal of the eye image that can be processed is thereby acquired. - In step S802, the line-of-
sight detection unit 4 transmits the eye image (an eye image signal; an electric signal of the eye image) acquired from the line-of-sight detection sensor 53 to the CPU 111. - In step S803, the
CPU 111 calculates, from the eye image acquired in step S802, coordinates of points corresponding to cornea reflection images Pd and Pe of the light sources and a pupil center c. - The infrared light emitted from the
light sources illuminates the cornea 142 of the eyeball 140 of the user. At this time, the cornea reflection images Pd and Pe formed by part of the infrared light reflected by a surface of the cornea 142 are focused by the light receiving lens 130 and form images on the line-of-sight detection sensor 53 to obtain cornea reflection images Pd' and Pe' on the eye image. Similarly, light beams from end portions a and b of the pupil 141 form images on the line-of-sight detection sensor 53 to obtain pupil end images a' and b' on the eye image. -
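The extraction of these image coordinates can be sketched on a synthetic one-dimensional luminance profile. All values and thresholds below are illustrative assumptions, not values from the description; the actual processing works on the two-dimensional eye image:

```python
# Sketch: locate the cornea reflection images (very bright peaks) and the
# pupil region (very dark run) in a 1-D luminance profile, then estimate
# the pupil center coordinate as the midpoint of the pupil end coordinates.
# Synthetic profile: iris ~50, pupil ~5, cornea reflections ~255.
profile = [50] * 10 + [5] * 5 + [255] + [5] * 8 + [255] + [5] * 5 + [50] * 10

BRIGHT, DARK = 200, 20  # hypothetical detection thresholds

# Coordinates of the bright cornea reflection images (Xd, Xe).
reflections = [x for x, v in enumerate(profile) if v >= BRIGHT]
# Dark pupil region; its end coordinates correspond to Xa and Xb.
pupil = [x for x, v in enumerate(profile) if v <= DARK]
xa, xb = min(pupil), max(pupil)
xc = (xa + xb) / 2  # pupil center estimate

print(reflections)  # [15, 24]
print(xa, xb, xc)   # 10 29 19.5
```

The midpoint estimate for the pupil center is the same approximation used later in the description (Xc ≈ (Xa + Xb)/2), valid when the eyeball rotation angle is small.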
Fig. 19B illustrates luminance information (a luminance distribution) about a region α' of the eye image in Fig. 19A. Fig. 19B illustrates a luminance distribution in an X-axis direction, where the X-axis direction is a horizontal direction of the eye image and a Y-axis direction is a vertical direction of the eye image. According to the present exemplary embodiment, coordinates Xd and Xe are coordinates of the cornea reflection images Pd' and Pe' in the X-axis direction (horizontal direction), and coordinates Xa and Xb are coordinates of the pupil end images a' and b' in the X-axis direction. As illustrated in Fig. 19B, extremely high luminances are obtained at the coordinates Xd and Xe of the cornea reflection images Pd' and Pe'. Extremely low luminances are obtained in the region from the coordinate Xa to the coordinate Xb, except at the coordinates Xd and Xe. The region from the coordinate Xa to the coordinate Xb corresponds to the region of the pupil 141 (a region of pupil images formed on the line-of-sight detection sensor 53 by light beams from the pupil 141). Intermediate luminances between the above-described two types of luminances are obtained in a region of the iris 143 outside the pupil 141 (a region of iris images, outside the pupil images, formed by light beams from the iris 143). Specifically, the intermediate luminances are obtained in a region of X-coordinates (coordinates in the X-axis direction) smaller than the coordinate Xa and a region of X-coordinates greater than the coordinate Xb. - The X-coordinates Xd and Xe of the cornea reflection images Pd' and Pe' and the X-coordinates Xa and Xb of the pupil end images a' and b' are obtained from the luminance distribution illustrated in
Fig. 19B. Specifically, coordinates with extremely high luminances are obtained as the coordinates of the cornea reflection images Pd' and Pe', and coordinates with extremely low luminances are obtained as the coordinates of the pupil end images a' and b'. Further, in a case where a rotation angle θx of an optical axis of the eyeball 140 with respect to an optical axis of the light receiving lens 130 is small, a coordinate Xc of a pupil center image c' (a pupil image center) formed on the line-of-sight detection sensor 53 by light beams from the pupil center c is expressed as Xc ≈ (Xa + Xb)/2. Specifically, the coordinate Xc of the pupil center image c' is calculated from the X-coordinates Xa and Xb of the pupil end images a' and b'. The coordinates of the cornea reflection images Pd' and Pe' and the pupil center image c' are estimated as described above. - In step S804, the
CPU 111 calculates an imaging magnification β of the eyeball image. The imaging magnification β is a magnification that is determined based on the position of the eyeball 140 with respect to the light receiving lens 130 and is obtained using a function of the interval (Xd − Xe) between the cornea reflection images Pd' and Pe'. - In step S805, the
CPU 111 calculates rotation angles of the optical axis of the eyeball 140 with respect to the optical axis of the light receiving lens 130. An X-coordinate of a midpoint between the cornea reflection images Pd and Pe substantially matches an X-coordinate of a center of curvature O of the cornea 142. Thus, a rotation angle θx of the eyeball 140 in a Z-X plane (a plane perpendicular to the Y-axis) is calculated using formula (1) below, where Oc is a standard distance from the center of curvature O of the cornea 142 to the center c of the pupil 141. A rotation angle θy of the eyeball 140 in a Z-Y plane (a plane perpendicular to the X-axis) is calculated using a method similar to the method for calculating the rotation angle θx.
β × Oc × sin θx ≈ {(Xd + Xe)/2} − Xc (1).
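Formula (1) can be solved for the rotation angle θx; a short sketch follows, in which the coordinate values, β, and Oc are illustrative assumptions rather than values from the description:

```python
import math

def rotation_angle_x(xd, xe, xc, beta, oc):
    """Solve formula (1) for the eyeball rotation angle θx (radians):
    β × Oc × sin θx ≈ {(Xd + Xe)/2} − Xc."""
    return math.asin(((xd + xe) / 2 - xc) / (beta * oc))

# Illustrative values: reflection image coordinates Xd, Xe, pupil center
# coordinate Xc (in pixels), imaging magnification β, and distance Oc.
theta_x = rotation_angle_x(xd=120.0, xe=140.0, xc=126.0, beta=5.0, oc=4.0)
print(math.degrees(theta_x))  # asin(0.2), about 11.5 degrees
```

The rotation angle θy is obtained the same way from the Y-coordinates.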
- In step S806, the
CPU 111 calculates (estimates) a gaze point (a position to which a line of sight is directed; a position at which the user is looking) of the user on an image for viewing that is displayed on the EVF unit 1, using the rotation angles θx and θy calculated in step S805. In a case where coordinates (Hx, Hy) of the gaze point are coordinates corresponding to the pupil center c, the coordinates (Hx, Hy) of the gaze point are calculated using formulas (2) and (3) below.
Hx = m × (Ax × θx + Bx) (2).
Hy = m × (Ay × θy + By) (3).
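Formulas (2) and (3) can be written directly in code. The numeric values below are illustrative assumptions; in the actual device the parameters come from the finder optical system and the calibration operation described in the surrounding text:

```python
def gaze_point(theta_x, theta_y, m, ax, bx, ay, by):
    """Formulas (2) and (3): map the eyeball rotation angles θx, θy to
    gaze-point coordinates (Hx, Hy) on the image for viewing, using the
    conversion coefficient m and the correction parameters Ax, Bx, Ay, By."""
    hx = m * (ax * theta_x + bx)  # formula (2)
    hy = m * (ay * theta_y + by)  # formula (3)
    return hx, hy

# Illustrative parameter values (hypothetical, not from the description).
hx, hy = gaze_point(theta_x=0.10, theta_y=-0.05, m=1000.0,
                    ax=1.2, bx=0.01, ay=1.1, by=-0.02)
print(hx, hy)  # approximately 130.0 and -75.0
```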
- The parameter m in
formulas (2) and (3) is a constant determined by the finder optical system of the camera body 101. The parameter m is also a conversion coefficient for converting the rotation angles θx and θy into coordinates corresponding to the pupil center c on the image for viewing. The parameter m is determined in advance and stored in the memory unit 112. The parameters Ax, Bx, Ay, and By are line-of-sight correction parameters for correcting individual differences in line of sight. The parameters Ax, Bx, Ay, and By are acquired by performing a calibration operation described below and are stored in the memory unit 112 before the line-of-sight detection operation is initiated. - In step S807, the
CPU 111 stores the coordinates (Hx, Hy) of the gaze point into the memory unit 112, and the line-of-sight detection operation ends. - Next, a detailed structure of the optical path split
prism unit 16 will be described with reference to Fig. 21. Fig. 21 is a diagram schematically illustrating a structure of the optical path split prism unit 16 according to the first exemplary embodiment of the present invention. Fig. 21A is a cross-sectional view illustrating details of a portion C specified in Fig. 15. The display panel 18 includes a semiconductor chip 18a and a glass plate 18b. The semiconductor chip 18a includes an organic electroluminescent (organic EL) element (a display surface) and a circuit for operating the organic EL element. The display panel 18 is abutted against a display panel abutment surface 17a of the panel holder 17 and fixed by bonding with, for example, an adhesive. The panel holder 17 is made of resin. The panel holder 17 includes a prism attachment surface 17b on the opposite side to the display panel abutment surface 17a. A first double-sided tape 57, a panel mask 20, and a second double-sided tape 58 are arranged in this order between the prism attachment surface 17b and the optical path split prism unit 16, and each is fixed by bonding with the double-sided tapes. Alternatively, the bonding can use, for example, an adhesive. - As described above, a feature of the present exemplary embodiment is that the optical path split
prism unit 16 is substantially sealed and attached without using a protection member for preventing entrance of dust into the display panel 18 (the display panel unit 19). A conventional EVF includes a protection glass for preventing entrance of dust on a front surface of the display panel unit 19 so that no dust is visible in viewing a screen of a display panel. However, dust may adhere to the outside of the protection glass, and in a case where the protection glass is excessively close to a display surface of the display panel, the dust adhering to the outside of the protection glass may be imaged and included in formed images, so that the protection glass is disposed at a predetermined distance or longer along an optical axis of the display panel. In contrast, according to the present exemplary embodiment, the thickness of the optical path split prism unit 16 in the optical axis direction provides a distance that prevents dust on the outside of the optical path split prism unit 16 from forming an image on the display surface of the display panel. Thus, while accommodating the line-of-sight detection function, the space of the entire finder is made more compact than in a case where a conventional display panel unit and an optical system for line-of-sight detection are simply arranged. - A
second opening portion 20a of the panel mask 20 passes light from the display panel 18 and light beams from the eyehole (reflection light from the pupils of the user), while a portion other than the second opening portion 20a of the panel mask 20 cuts stray light entering through end portions of the optical path split prism unit 16. The panel mask 20 contains a material with high heat resistance so as to be resistant to heat in a case where light beams from the eyehole are imaged nearby. According to the present exemplary embodiment, the panel mask 20 contains a metal plated with black. The panel holder 17 includes a first opening portion 17c so that the first opening portion 17c passes light from the display panel 18 and light beams (including reflection light from the pupil of the user) from the eyehole, while a portion other than the first opening portion 17c provides a mask that makes the circuit portion of the semiconductor chip 18a invisible from the eyehole side. While the first opening portion 17c is opened to be outside the second opening portion 20a when viewed from the EVF lens unit 29 side according to the present exemplary embodiment, the present invention is not limited to that described above. -
Fig. 21B is an exploded perspective view illustrating the display panel unit 19 and the optical path split prism unit 16, and each portion corresponding to Fig. 21A is given the same reference number as in Fig. 21A. - According to the present exemplary embodiment, the display panel unit is attached to the optical path split
prism unit 16 by positioning the panel mask 20 and the optical path split prism unit 16 with respect to the display panel 18 using jigs. For the positioning, the panel holder 17 includes Y-direction references and an X-direction reference 17f to determine an X-dimension. The panel mask 20 and the optical path split prism unit 16 are attached using these references. - As described above, according to the present exemplary embodiment, the optical path split
prism unit 16 is substantially sealed and attached without a protection member for preventing entrance of dust into the display panel unit 19 in the EVF unit 1 having the line-of-sight detection function. Specifically, the display panel unit 19 and the optical path split prism unit 16 of the EVF unit 1 according to the present exemplary embodiment are attached together and integrally formed. This makes it unnecessary to provide a separate protection glass for preventing entrance of dust into the display panel unit 19. Thus, an optical path length from the eyehole of the finder to the display surface of the display panel is reduced, and while the structure (space) of the entire EVF is made compact, a wide viewing angle with respect to the display panel 18 is obtained. Further, use of the above-described structure facilitates positioning of three optical axes, that is, the optical axes of the display surface of the display panel unit 19, the optical path split prism unit 16, and the EVF lens unit 29, in assembling the EVF unit 1. - Further, the optical path split
prism unit 16 according to the present exemplary embodiment has an interval (a space, an optical path length) between the display panel 18 and the optical path split prism unit 16 that is greater than a predetermined value. With this structure, foreign matter on the optical path split prism unit 16 is prevented from being viewed by the user. - An EVF unit having the line-of-sight detection function according to a second exemplary embodiment of the present invention will be described below with reference to
Fig. 22. The EVF unit according to the present exemplary embodiment is applicable to the camera body 101, which is an electronic device similar to that described in the first exemplary embodiment, and a difference from the EVF unit 1 according to the first exemplary embodiment is a structure of an optical path split prism unit. - Thus, redundant descriptions of components similar to those of the
EVF unit 1 according to the first exemplary embodiment are omitted, and the optical path split prism unit according to the present exemplary embodiment will be described in detail. -
Fig. 22 is an exploded perspective view illustrating an optical path split prism unit 216 and a display panel 218 according to the second exemplary embodiment of the present invention. As illustrated in Fig. 22, the optical path split prism unit 216 according to the present exemplary embodiment includes a first optical path split prism 214, a second optical path split prism 215, and a black mask 201, and a dichroic layer is formed on an attachment surface of the two prisms as in the first exemplary embodiment. - The
black mask 201 is formed by sputtering on an incidence surface of the second optical path split prism 215 on the display unit side. The black mask 201 is a shaded portion illustrated in Fig. 22. The black mask 201 includes an opening portion 201a. - The
display panel 218 includes a semiconductor chip 218a and a glass plate 218b as in the first exemplary embodiment. The semiconductor chip 218a includes an organic EL element (a display surface) and a circuit for operating the organic EL element. - As described above, according to the present exemplary embodiment, the
display panel 218 is attached directly to the second optical path split prism 215 with a double-sided tape 202. Specifically, a feature is that the black mask 201 is provided to prevent unnecessary matter from being visible in the field when the user looks into the eyepiece portion 107 of the camera body 101. - With the above-described structure, the
display panel 218 is attached directly to the second optical path split prism 215, and a separate protection glass for preventing entrance of dust into the display panel 218 is unnecessary, so that the optical path length from the eyehole of the finder to the display surface of the display panel is reduced. - An EVF unit having the line-of-sight detection function according to a third exemplary embodiment of the present invention will be described below with reference to
Fig. 23. The EVF unit according to the present exemplary embodiment is applicable to the camera body 101, which is an electronic device similar to that described in the first exemplary embodiment, and a difference from the EVF unit 1 according to the first exemplary embodiment is a structure of an optical path split prism unit. - Thus, redundant descriptions of components similar to those of the
EVF unit 1 according to the first exemplary embodiment are omitted, and the optical path split prism unit according to the present exemplary embodiment will be described in detail. Fig. 23 is an exploded perspective view illustrating an optical path split prism 315 and a semiconductor chip 318a for display according to the third exemplary embodiment of the present invention. As illustrated in Fig. 23, the optical path split prism 315 according to the present exemplary embodiment is a prism similar to the second optical path split prism 15 according to the first exemplary embodiment and is attached directly to the semiconductor chip 318a including an organic EL element. - A
black mask 301 is formed by sputtering on the optical path split prism 315 as in the second exemplary embodiment. The black mask 301 is a shaded portion illustrated in Fig. 23. The black mask 301 includes an opening portion 301a. - The
semiconductor chip 318a is attached directly to the optical path split prism 315 with an adhesive for tight sealing. - As described above, according to the present exemplary embodiment, the optical path split
prism 315 is attached directly to the semiconductor chip 318a including an organic EL element (a display surface) and a circuit for operating the organic EL element. - With the above-described structure, it is unnecessary to provide a separate glass for element protection or a separate protection glass for preventing entrance of dust to the
semiconductor chip 318a, and the optical path length from the eyepiece portion 107 to the display surface of the display panel (the semiconductor chip 318a) is reduced further than in the above-described exemplary embodiments. - A structure of a camera body that is an electronic device including an EVF unit according to a fourth exemplary embodiment of the present invention will be described below with reference to
Figs. 24 to 26. Fig. 24 is an external perspective view illustrating a camera body 400 as an electronic device according to the fourth exemplary embodiment of the present invention. An EVF unit 401 of the camera body 400 according to the present exemplary embodiment is substantially the same as the EVF unit 1 according to the first exemplary embodiment, so that an arrangement of components of the electronic device and an arrangement and a structure of the EVF unit 401 according to the present exemplary embodiment will be described. Further, the camera body 400 according to the present exemplary embodiment and the camera body 101 according to the first exemplary embodiment have basically the same arrangement of components. For example, a positional relationship among the EVF unit 401, a diopter adjustment portion 416, and a release button 405 according to the present exemplary embodiment is substantially the same as that of the camera body 101 according to the first exemplary embodiment. Thus, an arrangement of components of the camera body 400 will be described in more detail. - As illustrated in
Fig. 24, a mount 402 is disposed on a front surface of the camera body 400, and a camera accessory such as an interchangeable lens is attachable thereto and detachable therefrom. Further, an accessory shoe 403 is disposed at an upper portion of the camera body 400 and is a connection portion to and from which external devices, such as a flash and a microphone, are attachable and detachable. An eyepiece portion 404 is disposed on a rear surface of the camera body 400 and at an upper portion of the camera body 400. - A
grip portion 419 is disposed at a right portion of the camera body 400 as viewed from the rear side from which the user looks into the eyepiece portion 404 of the EVF unit 401. The user can hold the grip portion 419 with a hand. Thus, operation units that are manually operable by the user holding the camera body 400 are concentrated at the right side of the camera body 400 according to the present exemplary embodiment. For example, the release button 405 and a first operation dial 407, a second operation dial 408, a first setting button 409, and a second setting button 410 for adjusting various parameters relating to imaging conditions and modes are located at an upper right portion of the camera body 400. Further, an information display portion 406 is disposed on the right side of the EVF unit 401 (the eyepiece portion 404) at an upper portion of the camera body 400 according to the present exemplary embodiment. The information display portion 406 is a display unit that can display various types of information, such as a shutter speed and an aperture value relating to exposure conditions, a current imaging mode, and information about whether continuous imaging is on. -
Fig. 25 is a perspective view schematically illustrating an internal structure of the camera body 400 according to the fourth exemplary embodiment of the present invention. Fig. 26 is a top external view schematically illustrating the internal structure of the camera body 400 according to the fourth exemplary embodiment of the present invention. As illustrated in Figs. 25 and 26, the diopter adjustment portion 416 is disposed to the right of the EVF unit 401. - As described above, the
grip portion 419, the various operation units (406 to 410), and the diopter adjustment portion 416 are concentrated mainly at the right side of the camera body 400. This structure improves the operability of the camera body 400 for the user holding the camera body 400, since users most commonly hold the camera body 400 with the right hand. - As described above, various operation units and the diopter adjustment portion are concentrated at the right side of the
camera body 400, so that disposing, for example, a line-of-sight detection sensor unit 415 for line-of-sight detection at the right side of the camera body 400 may increase the size of the camera body 400. Thus, the line-of-sight detection sensor unit 415 of the EVF unit 401 according to the present exemplary embodiment is disposed at the side different from the side that is to be held by the user with respect to an optical path (or optical axis) of the EVF unit 401 in the camera body 400. Specifically, the line-of-sight detection sensor unit 415 of the EVF unit 401 is disposed at the opposite side to the side where the grip portion 419, the various operation units, and the diopter adjustment portion 416 are disposed with respect to a lens unit of the EVF unit 401. - As illustrated in
Figs. 25 and 26, a shutter 411, an image sensor unit 412 including an image sensor, and a display monitor 414 are provided at a lower portion of the EVF unit 401 in the camera body 400. According to the present exemplary embodiment, the EVF unit 401 and the display monitor 414 are disposed to overlap on a plane (a two-dimensional plane) perpendicular to an imaging optical axis of the camera body 400 in order to reduce the size and thickness of the camera body 400. - Further, it is known that disposing the finder immediately above a central axis that passes through a center of a diameter of the
mount 402 and is the imaging optical axis of the camera body 400 typically reduces the sense of incongruity the user may feel in framing a subject. Thus, according to the present exemplary embodiment, the EVF unit 401 is disposed to overlap the central axis (imaging optical axis) of the mount 402 on the plane perpendicular to the central axis. In other words, the EVF unit 401 is disposed to overlap the imaging optical axis of the camera body 400 and a lens unit attachable to and detachable from the camera body 400 on the two-dimensional plane perpendicular to the imaging optical axis of the lens unit of the EVF unit 401. - Further, a positioning unit such as a global positioning system (GPS)
unit 417 of the camera body 400 and a measurement unit 418 for detecting an orientation and a movement of the camera body 400 are disposed at a front side of the EVF unit 401. Further, the accessory shoe 403 described above is disposed at an upper portion of the EVF unit 401. The foregoing units are also connected to a main substrate 413, similarly to the various operation units described above. Thus, disposing the line-of-sight detection sensor unit 415 between these units and the EVF unit 401 may lead to an increase in size of the camera body 400 and complicated wiring from the main substrate 413 to the units. - Thus, a layout of the
EVF unit 401 for accommodating a line-of-sight detection mechanism while preventing an increase in size of the camera body 400 desirably avoids the grip region of the camera body 400 and the neighborhood of the grip region, where other members are difficult to dispose. For example, as described above, the line-of-sight detection sensor unit 415 is desirably disposed, with respect to the optical axis of the EVF unit 401, at the opposite side to the grip region at the right side of the camera body 400 where the grip portion 419 and the various operation units are disposed. - While various exemplary embodiments of the present invention have been described above, the present invention is not limited to the exemplary embodiments, and various modifications and changes can be made without departing from the scope of the invention. For example, while so-called interchangeable-lens imaging apparatuses with a camera body to and from which a lens unit is attachable and detachable are described above in the exemplary embodiments, the present invention is not limited to those described above. For example, the lens unit and the camera body can be integrally provided.
- Further, while imaging apparatuses are illustratively described above in the exemplary embodiments as electronic devices to which the present invention is applied, the present invention is not limited to those described above. For example, the exemplary embodiments are applicable to a device, such as a head-mounted display, that includes the line-of-sight detection function and performs control based on feedback from the line-of-sight detection function.
- Further, while the configuration in which the
CPU 111 comprehensively controls the camera body and the lens unit is described above in the exemplary embodiments, the present invention is not limited to those described above. In another configuration, for example, a (computer) program according to the flow illustrated in Fig. 20 is stored in advance in a memory unit of the camera body. Then, the components of the camera body 101 having the configuration illustrated in Fig. 2 cooperate to execute the program and control operations of the entire imaging system. Further, the program can take any form that has a program function, such as object code, a program to be executed by an interpreter, or script data to be fed to an operating system (OS). Further, a recording medium for feeding the program can be, for example, a hard disk, a magnetic recording medium such as a magnetic tape, or an optical or magneto-optical recording medium. - Further, the present invention is also realizable by the following process. Specifically, a program for realizing one or more functions of the above-described exemplary embodiments is fed to a system or an apparatus via a network or a storage medium, and one or more processors of a computer of the system or the apparatus read and execute the program. Further, the present invention is also realizable by a circuit (e.g., an application-specific integrated circuit (ASIC)) for realizing one or more functions.
- The present invention is not limited to the above-described exemplary embodiments, and various changes and modifications can be made without departing from the spirit and scope of the present invention. Therefore, the following claims are appended to set forth the scope of the present invention.
- This application claims the benefit of
Japanese Patent Applications No. 2019-199104, filed October 31, 2019, and No. 2020-176363, filed October 20, 2020.
Claims (18)
- An imaging apparatus including a first sensor configured to image an optical subject image incident via a first lens unit, the imaging apparatus comprising:
a display panel;
an eyepiece portion;
a second sensor configured to image an eye of a user viewing the display panel by looking into the eyepiece portion, the second sensor being different from the first sensor;
an optical path split unit configured to guide light from the display panel to outside the imaging apparatus and to guide external light to the second sensor; and
a second lens unit disposed between the optical path split unit and the eyepiece portion,
wherein the second sensor is disposed at a different side from a side where a grip region of the imaging apparatus is disposed with respect to an optical axis of the second lens unit.
- The imaging apparatus according to Claim 1, further comprising a grip portion to be gripped by the user in gripping the imaging apparatus,
wherein the grip region of the imaging apparatus includes the grip portion.
- The imaging apparatus according to Claim 1, further comprising an operation unit to be operated manually by the user in a state of gripping the imaging apparatus,
wherein the grip region of the imaging apparatus is located at a side where the operation unit is disposed with respect to the optical axis of the second lens unit, and
wherein the second sensor is disposed at an opposite side to the side where the operation unit is disposed with respect to the optical axis of the second lens unit.
- The imaging apparatus according to Claim 3, wherein the operation unit includes a release button for issuing an instruction to start imaging.
- The imaging apparatus according to Claim 3, wherein the operation unit includes an operation dial or a setting button for adjusting an imaging condition.
- The imaging apparatus according to Claim 1, further comprising a diopter adjustment portion for adjusting a diopter by moving the second lens unit,
wherein the grip region of the imaging apparatus is located at a side where the diopter adjustment portion is disposed with respect to the optical axis of the second lens unit, and
wherein the second sensor is disposed at an opposite side to the side where the diopter adjustment portion is disposed with respect to the optical axis of the second lens unit.
- The imaging apparatus according to Claim 1, further comprising:
an image sensor unit including the first sensor;
a display monitor configured to display image data acquired using the first sensor, the display monitor being different from the display panel; and
a substrate connected to the image sensor unit and the display monitor,
wherein a finder unit including at least the display panel, the second sensor, the optical path split unit, and the second lens unit overlaps at least one of the image sensor unit, the display monitor, and the substrate on a two-dimensional plane perpendicular to the optical axis of the second lens unit.
- The imaging apparatus according to Claim 7, wherein the finder unit overlaps an imaging optical axis of the first lens unit on the two-dimensional plane perpendicular to the optical axis of the second lens unit.
- An electronic device including a display panel and having a line-of-sight detection function to detect a line of sight of a user viewing a display on the display panel, the electronic device comprising:
a display unit including the display panel and a frame holding the display panel;
a line-of-sight detection sensor configured to receive light from a pupil of the user; and
an optical path split unit configured to guide light from the display panel to outside and to guide external light to the line-of-sight detection sensor,
wherein the optical path split unit is fixed to the display unit to be integrally formed with the display unit.
- The electronic device according to Claim 9, further comprising a lens unit configured to guide light from the outside to the optical path split unit.
- The electronic device according to Claim 9, further comprising a light source configured to emit light for detecting the line of sight of the user to the outside.
- The electronic device according to Claim 11,
wherein the light from the light source includes infrared light, and
wherein the line-of-sight detection sensor receives reflected infrared light from the outside.
- The electronic device according to Claim 9, further comprising a detection unit configured to detect the line of sight of the user based on a signal from the line-of-sight detection sensor.
- A finder unit comprising:
a display panel;
a sensor configured to image an eye of a user viewing the display panel;
an optical path split unit configured to guide light from the display panel to outside and to guide external light to the sensor;
a lens unit disposed at a position facing the optical path split unit;
an optical member located between the lens unit and the eye of the user viewing the display panel;
an illumination unit configured to emit infrared wavelength light toward an opposite side across the optical member, the illumination unit being located between the optical member and the lens unit in a direction parallel to an optical axis of the lens unit; and
a mask having a transmittance in visible wavelength light and a transmittance in infrared wavelength light, the transmittance in visible wavelength light being lower than the transmittance in infrared wavelength light in the direction parallel to the optical axis of the lens unit,
wherein the mask is disposed at a position that is different from an optical path from the display panel to the optical member and closer to the user than the illumination unit when the user views the display panel.
- The finder unit according to Claim 14, wherein the mask and the optical member are integrally provided, and the mask is a printed material printed with a material that transmits infrared wavelength light.
- The finder unit according to Claim 14, further comprising a line-of-sight detection unit,
wherein the sensor images an optical image in a case where the infrared wavelength light emitted from the illumination unit is reflected by the eye of the user, and
wherein the line-of-sight detection unit detects a line of sight of the user based on image data corresponding to the optical image imaged by the sensor.
- The finder unit according to Claim 14, further comprising:
a proximity detection unit configured to detect an approach of an object; and
a diopter adjustment unit configured to adjust a diopter by moving the lens unit,
wherein the illumination unit is disposed at a position along a long side, and the proximity detection unit and the diopter adjustment unit are disposed at a position along a short side, with respect to the optical path from the display panel to the optical member, and
wherein the proximity detection unit is located at an opposite side to a side where the diopter adjustment unit is located with respect to the optical axis of the lens unit.
- The finder unit according to Claim 17,
wherein the proximity detection unit includes a second illumination unit different from the illumination unit, and
wherein a light emission direction of the second illumination unit is inclined with respect to the optical member toward the optical axis of the lens unit.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019199104 | 2019-10-31 | ||
JP2020176363A JP2021076832A (en) | 2019-10-31 | 2020-10-20 | Imaging apparatus, electronic apparatus, and finder unit |
PCT/JP2020/040662 WO2021085541A1 (en) | 2019-10-31 | 2020-10-29 | Imaging device, electronic apparatus, and finder unit |
Publications (2)
Publication Number | Publication Date |
---|---|
EP4054169A1 true EP4054169A1 (en) | 2022-09-07 |
EP4054169A4 EP4054169A4 (en) | 2024-03-13 |
Family
ID=75716360
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20881105.9A Pending EP4054169A4 (en) | 2019-10-31 | 2020-10-29 | Imaging device, electronic apparatus, and finder unit |
Country Status (4)
Country | Link |
---|---|
US (1) | US11831975B2 (en) |
EP (1) | EP4054169A4 (en) |
CN (1) | CN114631304A (en) |
WO (1) | WO2021085541A1 (en) |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3186072B2 (en) | 1991-01-08 | 2001-07-11 | キヤノン株式会社 | Equipment with gaze detection device |
JPH05130463A (en) | 1991-11-06 | 1993-05-25 | Canon Inc | Magnetic recording image pickup device |
JPH06138369A (en) * | 1992-10-29 | 1994-05-20 | Canon Inc | Sight line detecting device |
US5761543A (en) * | 1992-10-29 | 1998-06-02 | Canon Kabushiki Kaisha | Apparatus for measuring anterior eye portion |
JPH06148505A (en) * | 1992-10-30 | 1994-05-27 | Nikon Corp | Camera with line-of-sight detector |
JPH07191382A (en) * | 1993-12-27 | 1995-07-28 | Asahi Optical Co Ltd | Finder device for camera |
JPH07299038A (en) * | 1994-05-06 | 1995-11-14 | Nikon Corp | Camera having visual axis detecting function |
JPH11237562A (en) * | 1998-02-24 | 1999-08-31 | Olympus Optical Co Ltd | Finder for single-lens reflex type digital camera |
JP4910989B2 (en) * | 2007-10-26 | 2012-04-04 | ソニー株式会社 | Imaging device |
JP6175945B2 (en) * | 2013-07-05 | 2017-08-09 | ソニー株式会社 | Gaze detection apparatus and gaze detection method |
KR102471370B1 (en) * | 2015-02-23 | 2022-11-28 | 소니그룹주식회사 | Information processing apparatus, information processing method, and program |
JP7122862B2 (en) | 2018-05-14 | 2022-08-22 | ヤマハ発動機株式会社 | Outboard motor |
CN110169613A (en) | 2019-04-22 | 2019-08-27 | 武汉金皖苏医疗器械有限公司 | A kind of mask of anti-lens atomization |
2020
- 2020-10-29 CN CN202080076281.2A patent/CN114631304A/en active Pending
- 2020-10-29 WO PCT/JP2020/040662 patent/WO2021085541A1/en unknown
- 2020-10-29 EP EP20881105.9A patent/EP4054169A4/en active Pending
2022
- 2022-04-20 US US17/725,343 patent/US11831975B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US11831975B2 (en) | 2023-11-28 |
CN114631304A (en) | 2022-06-14 |
WO2021085541A1 (en) | 2021-05-06 |
US20220247933A1 (en) | 2022-08-04 |
EP4054169A4 (en) | 2024-03-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
TW200815891A (en) | Imaging device and light shielding member | |
US20230013134A1 (en) | Electronic device | |
JP5144006B2 (en) | Camera with display screen | |
EP4054169A1 (en) | Imaging device, electronic apparatus, and finder unit | |
JP2007163724A (en) | Image coincidence type range finder using ccd, camera with range finder, and photographing system | |
US11822714B2 (en) | Electronic device and control method for capturing an image of an eye | |
JP2021076832A (en) | Imaging apparatus, electronic apparatus, and finder unit | |
JP2021125867A (en) | Image processing device, imaging device, control method of image processing device, and program | |
US11971552B2 (en) | Electronic device, method of controlling the same, and storage medium | |
WO2021210225A1 (en) | Electronic device | |
US20230403450A1 (en) | Viewfinder unit with line-of-sight detection function, image capturing apparatus, and attachment accessory | |
US20230092593A1 (en) | Detection device detecting gaze point of user, control method therefor, and storage medium storing control program therefor | |
CN116437180A (en) | Display apparatus, viewfinder apparatus, and image pickup apparatus | |
JP2021182736A (en) | Electronic apparatus | |
JP2022124778A (en) | Electronic apparatus | |
JP2024003432A (en) | Electronic device | |
JP6504970B2 (en) | Prism dustproof structure of optical equipment | |
JP2023083695A (en) | Electronic apparatus | |
JP2002131805A (en) | Electronic viewfinder device of camera | |
JP2022119697A (en) | Finder unit having line-of-sight detection function, imaging apparatus, and attachment unit | |
JP2019193175A (en) | Imaging apparatus | |
JP2003107560A (en) | Optical system for display light projection for camera | |
JPH07333689A (en) | Observing device and camera provided with the same | |
JP2004252264A (en) | Camera | |
JP2001197336A (en) | Electronic finder system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed |
Effective date: 20220531 |
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Free format text: PREVIOUS MAIN CLASS: H04N0005225000 Ipc: G02B0007280000 |
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 23/67 20230101ALI20230928BHEP Ipc: H04N 23/63 20230101ALI20230928BHEP Ipc: H04N 23/62 20230101ALI20230928BHEP Ipc: H04N 23/56 20230101ALI20230928BHEP Ipc: H04N 23/55 20230101ALI20230928BHEP Ipc: H04N 23/54 20230101ALI20230928BHEP Ipc: G03B 17/02 20210101ALI20230928BHEP Ipc: G03B 13/02 20210101ALI20230928BHEP Ipc: G02B 7/28 20210101AFI20230928BHEP |
A4 | Supplementary search report drawn up and despatched |
Effective date: 20240209 |
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 23/67 20230101ALI20240205BHEP Ipc: H04N 23/63 20230101ALI20240205BHEP Ipc: H04N 23/62 20230101ALI20240205BHEP Ipc: H04N 23/56 20230101ALI20240205BHEP Ipc: H04N 23/55 20230101ALI20240205BHEP Ipc: H04N 23/54 20230101ALI20240205BHEP Ipc: G03B 17/02 20210101ALI20240205BHEP Ipc: G03B 13/02 20210101ALI20240205BHEP Ipc: G02B 7/28 20210101AFI20240205BHEP |