WO2022163080A1 - Observation device - Google Patents

Observation device

Info

Publication number
WO2022163080A1
WO2022163080A1 (PCT/JP2021/042380)
Authority
WO
WIPO (PCT)
Prior art keywords
image
light shielding
optical
display
area
Prior art date
Application number
PCT/JP2021/042380
Other languages
English (en)
Japanese (ja)
Inventor
宏輔 髙橋
峻介 宮城島
広樹 斉藤
智貴 大槻
Original Assignee
富士フイルム株式会社 (FUJIFILM Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社 (FUJIFILM Corporation)
Priority to JP2022578068A (JPWO2022163080A1)
Publication of WO2022163080A1
Priority to US18/360,663 (US20230370712A1)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/02Viewfinders
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/18Signals indicating condition of a camera member or suitability of light
    • G03B17/20Signals indicating condition of a camera member or suitability of light visible in viewfinder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635Region indicators; Field of view indicators

Definitions

  • One embodiment of the present invention relates to an observation device for an imaging device, and more particularly to an observation device capable of observing both an optical image of a subject and a captured image.
  • A user of an imaging device uses a viewfinder of the camera, that is, an observation device for the imaging device, for purposes such as setting the imaging range (angle of view).
  • One such viewfinder is a hybrid viewfinder (HVF), a hybrid observation device combining optical and electronic viewing.
  • an optical image of a subject is formed by an optical system within the HVF while displaying a captured image on a display within the HVF (see, for example, Patent Document 1).
  • the captured image is superimposed on the optical image of the subject, and the user of the HVF can observe both the captured image and the optical image of the subject through a single eyepiece.
  • The visibility of the image displayed over the optical image of the subject may change depending on the intensity of light in the shooting environment (ambient light). For example, outdoors in the daytime, the light incident on the HVF makes the optical image appear brighter, which makes the displayed image difficult to see.
  • One embodiment of the present invention has been made in view of the above circumstances, and has the purpose of providing an observation device capable of adjusting the visibility of an image and an optical image when the image is displayed superimposed on the optical image of a subject.
  • One embodiment of the present invention is an observation device for an imaging apparatus that has an imaging element and is controlled by a control unit, with which a user observes an optical image of a subject; the optical image has a first optical region and a second optical region.
  • The image is an image based on the signal generated by the imaging element, and the display mechanism superimposes the image on the first optical region and displays both the image and the optical image so that the user can observe them.
  • The light shielding mechanism is arranged in the optical path between the subject and the display mechanism, and has a first light shielding region overlapping the first optical region and a second light shielding region overlapping the second optical region. The control unit executes, for the light shielding mechanism, control processing that controls the light shielding rates of the first light shielding region and the second light shielding region based on the intensity of the ambient light.
  • The control processing may be executed for the light shielding mechanism based on the output signal of a sensor that outputs a signal corresponding to the intensity of the ambient light.
  • The imaging element may generate the signal by capturing a part of the subject that is observed as the optical image.
  • The light shielding rate of the first light shielding region may be controlled based on the signal generated by the imaging element and the output signal of the sensor.
  • the edge of the first light shielding area may be located outside the edge of the image display area.
  • the display mechanism may have a light-transmissive display.
  • the image is displayed in the portion corresponding to the first optical region in the display, with the optical image transmitted through the display.
  • the image may include an image of the in-focus area and an image of the out-of-focus area other than the in-focus area.
  • The light shielding rate of the portion of the first light shielding region that overlaps the image of the in-focus area may be controlled to be higher than the light shielding rate of the portion that overlaps the image of the out-of-focus area.
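The control processing described in the bullets above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the function names, the linear mapping from ambient intensity to shielding rate, and the specific rate values are assumptions.

```python
def shielding_rates(ambient_intensity, max_intensity=100_000.0):
    """Map the detected ambient-light intensity (e.g. illuminance in lux) to
    light-shielding rates (0.0 = fully transmissive, 1.0 = fully shielded).
    The brighter the environment, the more the first region (behind the
    superimposed image) is shielded so the image remains visible, while the
    second region is shielded less so the optical image stays observable."""
    level = min(max(ambient_intensity / max_intensity, 0.0), 1.0)
    first_region = 0.2 + 0.8 * level   # overlaps the displayed image
    second_region = 0.5 * level        # overlaps the remaining optical image
    return first_region, second_region


def region_rate_with_focus(base_rate, overlaps_in_focus, boost=0.15):
    """The portion of the first region overlapping the in-focus image may be
    shielded more strongly than the portion over the out-of-focus image."""
    return min(base_rate + boost, 1.0) if overlaps_in_focus else base_rate
```

In this sketch the first region is always shielded at least as strongly as the second, which keeps the superimposed image legible while leaving the surrounding optical image visible.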
  • The display mode of the image of the in-focus area may be different from the display mode of the image of the out-of-focus area.
  • A correction process may be performed to correct the gradation value according to the signal generated by the imaging element; the display mechanism displays an image based on the corrected gradation value, and the correction amount for the gradation value in the correction process may be set according to the intensity of the ambient light.
  • The display mechanism may display a marker indicating the shooting range of the imaging element together with the image.
  • One of the marker and the image may be displayed preferentially over the other.
  • The one of the marker and the image selected by the user may be displayed preferentially over the other.
  • The observation device may be for an imaging apparatus that includes the control unit.
  • Another embodiment of the present invention is an observation device for an imaging apparatus that has an imaging element and is controlled by a control unit, comprising an optical system that forms an optical image of a subject so that a user can observe it, a display mechanism that displays an image so that the user can observe it, and a light shielding mechanism whose light shielding rate is varied by an electric signal; the optical image has a first optical region and a second optical region.
  • The display mechanism superimposes the image on the first optical region to display both the image and the optical image so that the user can observe them.
  • The light shielding mechanism is disposed in the optical path between the subject and the display mechanism, and has a plurality of light shielding regions superimposed on the first optical region and the second optical region.
  • The control unit executes control processing for the display mechanism and the light shielding mechanism. In the control processing, the display size of the image in the display mechanism and the light shielding rates corresponding to the plurality of light shielding regions that correspond to a user's input operation are read out from a storage device in which display sizes and corresponding light shielding rates are stored.
  • The display size and the light shielding rates corresponding to the plurality of light shielding regions may be stored in the storage device in association with one of a plurality of modes.
  • The input operation may be an operation by which the user specifies one of the plurality of modes, and the display size and the light shielding rates corresponding to the mode specified by the user may be read from the storage device.
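The mode-based readout described above can be sketched as follows. The mode names, table layout, and numeric values are hypothetical, not taken from the patent; only the idea of associating a display size and per-region shielding rates with each mode comes from the text.

```python
# Hypothetical stored settings: each mode associates a display size for the
# image with the light-shielding rates of the individual shielding regions.
MODE_TABLE = {
    "bright-field": {"display_size": (640, 480),
                     "shield": {"first": 0.9, "second": 0.4}},
    "dark-field":   {"display_size": (800, 600),
                     "shield": {"first": 0.5, "second": 0.1}},
}


def apply_mode(mode_name, storage=MODE_TABLE):
    """Read out the display size and the shielding rates associated with the
    mode specified by the user's input operation."""
    entry = storage[mode_name]
    return entry["display_size"], entry["shield"]
```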
  • Another embodiment of the present invention is an observation device for an imaging apparatus that has an imaging element and is controlled by a control unit, comprising an optical system that forms an optical image of a subject so that a user can observe it, and a display mechanism that displays an image so that the user can observe it. The optical system has a lens, the optical image has a first optical region and a second optical region, and the display mechanism superimposes the image on the first optical region and displays both the image and the optical image so that the user can observe them. The image is displayed after being corrected according to the distortion caused by the lens, while the mark is displayed in an uncorrected state, or in a state corrected more weakly than the image.
  • FIG. 3 is an explanatory diagram of an optical image observed by an observation device
  • FIG. 4 is a diagram showing the correspondence between input gradation values and output values in γ correction.
  • FIG. 4 is a diagram showing an example of the first gradation correction.
  • FIG. 11 is a diagram showing another example of the second gradation correction.
  • FIG. 10 is a diagram showing an image and an optical image when the display size is changed;
  • FIG. 10 is a diagram showing an image and an optical image when the display position is changed;
  • FIG. 10 is a diagram showing an image and an optical image when the display area of the image and the display area of the sign are overlapped;
  • FIG. 10 is an explanatory diagram of a case in which the observation device is used in a situation where the intensity of ambient light is high.
  • The upper diagram in the figure shows each light shielding region of the light shielding mechanism, and the lower diagram shows the optical image and images that can be observed within the observation device.
  • FIG. 10 is an explanatory diagram of the effect of controlling the light shielding rate of each light shielding region of the light shielding mechanism; the observable optical image and images are shown.
  • FIG. 10 is an explanatory diagram of a light shielding rate of a second light shielding region;
  • FIG. 4 is an explanatory diagram of the effect of increasing the light shielding rate of the light shielding region overlapping the image of the in-focus area; the observable optical image and images are shown.
  • FIG. 10 is a diagram showing an image that can be observed within the observation device when the imaging environment is dark; a further figure shows the corresponding operation.
  • FIG. 10 is an explanatory diagram of the display size of an image and the light shielding rate of each light shielding region of the light shielding mechanism, which are set for each mode.
  • FIG. 3 shows an optical image observed with distortion, a corrected image, and an uncorrected mark.
  • A figure shows a modification of the imaging device and the observation device according to one embodiment of the present invention.
  • A figure shows a modification of the configuration of the imaging device and the observation device according to one embodiment of the present invention.
  • A figure shows a modification of the internal structure of the observation device.
  • a first embodiment of the present invention relates to an observation device for an imaging device.
  • An imaging device 10 according to the first embodiment constitutes, for example, a digital camera shown in FIG.
  • The observation device 30 according to the first embodiment is configured as a camera viewfinder. In the configuration shown in FIG. 1, the observation device 30 is built into the imaging device 10.
  • The imaging device 10 means the portion of the digital camera excluding the observation device 30.
  • The user of the observation device 30 also corresponds to the user of the imaging device 10.
  • The imaging device 10 includes an imaging lens 12, an aperture 16, a shutter 18, an imaging element 20, a rear display 22, an operation unit 24, a lens drive mechanism 28, a control unit 40, an internal memory 50, and the like.
  • The imaging device 10 is a lens-integrated model or a lens-interchangeable model, and captures images at an angle of view corresponding to the imaging lens 12.
  • Light transmitted through the imaging lens 12 is incident on the imaging element 20 during imaging.
  • The amount of light incident on the imaging element 20 is controlled by adjusting the aperture value of the aperture 16.
  • The exposure time during imaging is controlled by adjusting the shutter speed of the shutter 18. Exposure conditions such as aperture value, shutter speed, and ISO sensitivity are controlled by the control unit 40.
  • The imaging lens 12 may be a telephoto lens. Also, the focus lens 12a included in the imaging lens 12 can be moved in its optical axis direction by the lens drive mechanism 28, as shown in FIG. That is, the imaging device 10 has a variable focus (in-focus position).
  • The imaging element 20 is configured by a known image sensor, such as a CCD (Charge-Coupled Device), a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, or an organic imaging element.
  • The imaging element 20 receives light (not limited to visible light) from a subject within the angle of view, converts the received optical image into an electrical signal, and generates and outputs the converted signal.
  • the back display 22 is provided on the back of the imaging device 10 and displays images and various types of information. For example, a live view image is displayed during imaging.
  • a live view image is an image (captured image) based on a signal generated by the imaging device 20, and is a real-time image of a subject being captured.
  • the operation unit 24 is provided on the outer surface of the imaging device 10 and receives user operations.
  • the operation unit 24 includes, as shown in FIG. 2, a release button 25, a cross key type or control wheel type selection button 26, a touch panel 27 provided on the rear display 22, and the like.
  • the release button 25 is pressed when the user instructs storage of the captured image.
  • the selection button 26 and the touch panel 27 are operated, for example, when the user selects a mode or sets conditions.
  • The observation device 30 is an observation device for the imaging device 10, and is a look-in type finder used by the user to set the angle of view and confirm the subject during imaging.
  • the observation device 30 is a hybrid finder (HVF) having both the function of an optical view finder (OVF) and the function of an electronic view finder (EVF).
  • the optical image of the subject is formed by the function of the OVF so that the user can observe it, and the photographed image is displayed by the function of the EVF so that the user can observe it.
  • Forming an optical image, and displaying an image, "so that the user can observe it" means forming the optical image and displaying the image so that they fit within the user's field of view when the user looks into the observation device 30.
  • The image P (more specifically, a live view image) displayed in the observation device 30 is, for example, an image obtained by imaging a distant subject at a narrow angle of view.
  • With the observation device 30, an optical image of the subject can be observed over a relatively wide range, and the subject shown in the image P is included in the subject observed as the optical image. That is, the imaging element 20 captures a part of the subject observed as the optical image by the observation device 30 and generates a signal.
  • multiple images P may be displayed.
  • When a plurality of images P are displayed, it is preferable to display, in addition to the image of the entire angle of view of the imaging lens 12, an enlarged image of the focus position of the imaging lens 12 and/or an enlarged image of a human face detected by the imaging device 10.
  • This improves operability when the user images a distant moving object, such as a person, an animal, or a vehicle, as the subject. Specifically, when the moving object is out of the angle of view, the image P does not show the moving object. Even in this case, by confirming the optical image superimposed on the image P in the observation device 30, that is, the subject over a wider range, the moving object outside the angle of view can easily be tracked. Note that the observation device 30 will be described in detail later.
  • the control unit 40 is configured to control each unit of the imaging device 10 and execute various processes including imaging, image recording, and image display.
  • the control unit 40 is configured by a processor.
  • A processor is one or more hardware devices, for example, a CPU (Central Processing Unit), FPGA (Field Programmable Gate Array), DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit), GPU (Graphics Processing Unit), or other IC (Integrated Circuit). Alternatively, a processor may be configured by combining these.
  • the processor forming the control unit 40 may be configured with a single IC (Integrated Circuit) chip for the entire function of the control unit 40, as typified by SoC (System on Chip).
  • the hardware configuration of the processor described above may be realized by an electric circuit (Circuitry) in which circuit elements such as semiconductor elements are combined.
  • The internal memory 50 is an example of a storage device, and stores programs executed by the control unit 40.
  • The processor functions as the control unit 40 by executing these programs.
  • the programs executed by the processor are not necessarily stored in the internal memory 50, and may be stored in the memory card 52, which is an example of a storage device.
  • the memory card 52 is used by being inserted into a card slot (not shown) provided in the imaging device 10 .
  • The storage device also stores data necessary for the control unit 40 to execute various processes, such as the control patterns for correction described later.
  • The above data may be stored in a storage device consisting of the internal memory 50 and/or the memory card 52.
  • If the processor constituting the control unit 40 can communicate with an external server (for example, a cloud service server) via the Internet or a mobile communication line, the above data may be stored in the external server.
  • the control unit 40 has a control processing unit 42 and an image generation unit 44 as shown in FIG.
  • the control processing section 42 is configured to control each section of the imaging device 10 according to a user's operation received via the operation unit 24 or according to a predetermined control rule.
  • the control processing unit 42 automatically changes exposure conditions by controlling the diaphragm 16, shutter 18, and image sensor 20 according to the intensity of light in the shooting environment (hereinafter referred to as ambient light).
  • the control processing unit 42 causes the internal memory 50 or the like to record the data of the image captured at that time.
  • The control processing unit 42 causes the rear display 22 or the display mechanism 34 of the observation device 30 to display an image based on the image data created by the image creation unit 44 during imaging. Which of the rear display 22 and the display mechanism 34 displays the image may be determined by the user, or may be determined automatically by the control unit 40. For example, when the distance between the user and the imaging device 10 is equal to or less than a predetermined distance, the control processing unit 42 automatically sets the display mechanism 34 as the image display destination.
  • Image data recorded in the internal memory 50 or the like will be referred to as "recorded image data", and image data displayed on the rear display 22 or the display mechanism 34 will be referred to as "display image data".
  • control processing unit 42 automatically adjusts the focus (in-focus position) by driving the lens drive mechanism 28 to move the focus lens 12a.
  • For autofocus processing, for example, contrast autofocus, image plane phase difference autofocus, laser autofocus, directional light autofocus such as the Time-of-Flight method, or depth-from-defocus (DFD method) autofocus can be used.
  • The control processing unit 42 calculates the focus position (in-focus position) in the image based on autofocus technology, and can distinguish, in the captured image, the image of the in-focus area from the image of the out-of-focus area other than the in-focus area.
  • the image of the in-focus area is a partial image existing in the captured image, and is an image of the focused area.
  • control processing unit 42 can detect the intensity of ambient light.
  • Ambient light includes light emitted from a subject and light of the entire environment in which the imaging device 10 exists, such as external light that illuminates the surroundings of the imaging device 10 .
  • The intensity of the ambient light is detected based on the output signal of the photometric sensor 48, which will be described later, and the signal generated by the imaging element 20, which is an image sensor.
  • When the intensity of the ambient light is detected based on the signal generated by the imaging element 20, an exposure amount is calculated from the signal; specifically, an integrated value of the exposure amounts calculated for automatic exposure control or automatic white balance control is computed.
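One way to compute such an integrated value from the sensor signal is sketched below. The frame format (rows of RGB tuples) and the Rec. 601 luma weights are illustrative assumptions; the patent only states that exposure amounts computed for automatic exposure or white-balance control are integrated.

```python
def ambient_intensity(frame, weights=(0.299, 0.587, 0.114)):
    """Estimate ambient-light intensity from the sensor signal by converting
    each RGB sample to luminance and integrating (summing) over the frame,
    analogous to the exposure statistics gathered for automatic exposure or
    automatic white-balance control. `frame` is a list of rows of (R, G, B)
    tuples; the Rec. 601 luma weights are one illustrative choice."""
    total = 0.0
    for row in frame:
        for r, g, b in row:
            total += weights[0] * r + weights[1] * g + weights[2] * b
    return total
```

In practice a camera would use hardware statistics blocks rather than per-pixel Python loops, but the integrated quantity is the same.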
  • the image creation unit 44 is configured to create recorded image data and display image data of the captured image.
  • the display image data may also be used as recording image data.
  • The image creation unit 44 has an A/D (Analog/Digital) converter 45, an image data creation unit 46, and a correction unit 47, as shown in FIG.
  • The A/D converter 45 converts the signal generated by the imaging element 20 from an analog signal to a digital signal.
  • the image data creation unit 46 performs image processing such as white balance correction on the converted digital signal, and creates image data by compressing the processed signal according to a predetermined standard.
  • the image data is data indicating the gradation value of each part of the angle of view at the time of imaging, more specifically, the gradation value of three colors of RGB (hereinafter referred to as input gradation value) for each pixel.
  • The input gradation value is defined within a numerical range that includes a lower limit, an upper limit, and the intermediate values between them (for example, 0 to 255).
  • the correction unit 47 executes correction processing to create display image data from the image data created by the image data creation unit 46 .
  • The correction unit 47 performs γ correction to obtain an output value corresponding to the input gradation value indicated by the image data.
  • The output value corresponding to the input gradation value is the gradation value corresponding to the signal generated by the imaging element 20, and is defined within the same numerical range as the input gradation value.
  • the correction processing may be other gradation correction processing such as knee correction.
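As a concrete illustration of the γ correction just described, mapping an input gradation value to an output value in the same range, the following sketch can be used. The exponent 2.2 and the rounding step are illustrative choices, not values from the patent.

```python
def gamma_correct(value, gamma=2.2, vmax=255):
    """γ correction: map an input gradation value in 0..vmax to an output
    value in the same range. gamma=2.2 is an illustrative encoding exponent."""
    normalized = value / vmax
    return round((normalized ** (1.0 / gamma)) * vmax)
```

A knee correction, mentioned above as an alternative, would instead apply a piecewise-linear curve that compresses only the highlights; the lookup structure is the same.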
  • The correction unit 47 can perform additional correction on the gradation value (output value) after γ correction.
  • The additional correction is a correction that changes the output value from the value obtained when only γ correction is performed, and is performed, for example, when the output value satisfies a predetermined condition.
  • the additional correction includes a first additional correction and a second additional correction, and the amount of correction in each additional correction is set according to the intensity of the ambient light detected by the control processing section 42 .
  • the first additional correction is a so-called shadow correction.
  • In the first additional correction, for a low gradation portion (dark portion) where the input gradation value indicated by the image data is less than a first reference value, the output value is changed from the normal value (the value when only γ correction is performed).
  • the first reference value (Va in FIG. 6) is set to a numerical value corresponding to 1/3 to 1/4 of the intermediate value in the numerical range (0 to 255, for example) that defines the input tone value.
  • the output value of the low gradation portion is corrected from the value on the broken line to the value on the solid line in FIG. 7, resulting in a gradation value higher than the normal value. That is, the brightness of the low gradation portion after the first additional correction is performed becomes brighter than the brightness when the first additional correction is not performed. As a result, for example, the visibility of a dark portion that is too dark to be visually recognized with a normal output value is improved.
  • If the correction amount is increased when performing the first additional correction in a region where the input gradation value is near the lower limit of the low gradation portion, a region that actually appears dark will look bright in the image. As a result, the brightness differences in the displayed image deviate from the actual appearance. Therefore, considering the balance with the actual appearance, it is better to gradually reduce the correction amount of the first additional correction in the region where the input gradation value is near the lower limit, as shown in FIG. 7. However, the correction is not limited to this; as shown in FIG., the correction amount may instead be increased as the difference between the input gradation value and the first reference value increases, over the entire low gradation portion including the region near the lower limit.
  • the second additional correction is a so-called highlight correction.
  • In the second additional correction, for a high gradation portion (bright portion) where the input gradation value indicated by the image data is equal to or higher than a second reference value, the output value is changed from the normal value (the value when only γ correction is performed).
  • the second reference value (Vb in FIG. 8) is set to a numerical value corresponding to 2/3 to 3/4 of the intermediate value in the numerical range (0 to 255, for example) that defines the input tone value.
  • the output value of the high gradation portion is corrected from the value on the broken line to the value on the solid line in FIG. 8, resulting in a gradation value lower than the normal value. That is, the brightness of the high gradation portion after the second additional correction is performed becomes darker than the brightness when the second additional correction is not performed. As a result, for example, the visibility of a bright portion that is too bright to be visually recognized with a normal output value is improved.
  • In the region where the input gradation value is near the upper limit, it is preferable to gradually decrease the correction amount of the second additional correction as the input gradation value increases, as shown in FIG.
  • However, the correction is not limited to this; as shown in FIG. 9, the correction amount may instead be increased as the difference between the input gradation value and the second reference value Vb increases, over the entire high gradation portion including the region where the input gradation value is near the upper limit.
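A combined sketch of the two additional corrections described above. The reference values (Va = 40, Vb = 200), the maximum correction amount, and the triangular taper profile are illustrative assumptions, not values from the patent; the taper follows the FIG. 7 style behavior in which the correction shrinks again near the range limits.

```python
def shadow_correction(output, input_value, va=40, max_boost=30, vmax=255):
    """First additional correction (shadow correction): brighten output values
    whose input gradation value lies below the first reference value Va. The
    boost tapers toward zero both at Va and near the lower limit, so that
    near-black regions are not rendered unnaturally bright."""
    if input_value >= va:
        return output
    boost = max_boost * min(input_value, va - input_value) / (va / 2)
    return min(round(output + boost), vmax)


def highlight_correction(output, input_value, vb=200, max_cut=30, vmax=255):
    """Second additional correction (highlight correction): darken output
    values whose input gradation value is at or above the second reference
    value Vb, with the correction tapering near the upper limit."""
    if input_value < vb:
        return output
    cut = max_cut * min(input_value - vb, vmax - input_value) / ((vmax - vb) / 2)
    return max(round(output - cut), 0)
```

Selecting a stronger or weaker control pattern (larger or smaller `max_boost` / `max_cut`) according to the detected ambient-light intensity corresponds to reading one of the stored control patterns mentioned below.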
  • a plurality of control patterns for additional correction are prepared and stored in a storage device such as the internal memory 50, the memory card 52, or an external server.
  • the correction unit 47 reads a control pattern corresponding to the intensity of the ambient light among the plurality of control patterns, and performs additional correction based on the read control pattern.
  • When a captured image includes a low gradation portion (dark portion) or a high gradation portion (bright portion), the additional correction is performed in addition to the γ correction, so that display image data can be created with improved visibility.
  • The correction pattern for γ correction, that is, the conversion pattern from input gradation values to output values, may be changed according to the intensity of the ambient light.
  • The corrections regarding the output value, that is, the γ correction and the additional correction, are not limited to being performed by the control unit 40 consisting of a processor.
  • The above corrections may instead be performed by a control circuit within the module.
  • The display image data need not be recorded; only the image data before correction (image data created by performing only γ correction) may be recorded as the recording image data. Alternatively, the display image data created by performing the additional correction may be recorded as recording image data together with the image data before correction.
  • In the following, the operations and processing of the control processing unit 42 and the image creation unit 44 will be described as the operations and processing of the control unit 40 (processor).
  • the observation device 30 is an observation device (finder) for the imaging device 10 including the control unit 40, and is housed in the housing of the imaging device 10, for example.
  • the observation device 30 has an introduction window 31, an optical system 32, an eyepiece window 33, a display mechanism 34, a light blocking mechanism 36, and a sensor 48 for photometry, as shown in FIG.
  • The display mechanism 34 and the light shielding mechanism 36 are controlled by the control unit 40, and the output signal of the sensor 48 is transmitted to the control unit 40.
  • the introduction window 31 is provided on the front surface of the imaging device 10 to introduce the light (luminous flux) from the subject into the observation device 30 .
  • An eyepiece window 33 is provided on the rear surface of the imaging device 10 so that the user can look into the observation device 30 .
  • the optical system 32 forms an optical image of a subject so that the user can observe it.
  • the optical system 32 has a plurality of lenses 32a and 32b, as shown in FIG.
• One lens 32a (hereinafter referred to as the upstream lens 32a) is arranged at a position closer to the subject and is provided to form an optical image of the subject on the display mechanism 34.
• The other lens 32b (hereinafter referred to as the downstream lens 32b) is arranged at a position closer to the eyepiece window 33, and magnifies the optical image formed on the display mechanism 34 and the image displayed on the display mechanism 34.
• The optical system 32 may further include a reflecting mirror or a prism for changing the optical path, a half mirror that transmits light traveling in a predetermined direction, a focusing plate (reticle) on which an optical image is formed, and the like.
  • the display mechanism 34 includes the display 35 shown in FIG. 2, and displays an image and various information on the display 35 during imaging so that the user can observe it.
• The display 35 is arranged between the upstream lens 32a and the downstream lens 32b in the optical path of the light (luminous flux) from the subject introduced into the observation device 30.
  • the display 35 is optically transmissive and is configured by, for example, a transmissive display.
• As the transmissive display, a thin self-luminous organic EL (Electro Luminescence) panel or the like can be used.
  • the brightness (luminance) of each area of the display 35 is variable and can be controlled for each area.
• An area of the display 35 corresponds to a range that occupies part of the display screen formed by the display 35.
  • the light shielding mechanism 36 is a dimming member provided to shield the light (luminous flux) from the subject introduced into the observation device 30 .
  • the light shielding mechanism 36 is composed of, for example, a polymer-dispersed liquid crystal panel, an electrochromic sheet, or an ND (Neutral Density) filter, which is a neutral density filter. As shown in FIG. 2, the light shielding mechanism 36 is arranged in the optical path between the subject and the display mechanism 34, more specifically, between the upstream lens 32a and the display 35.
• The light shielding rate of the light shielding mechanism 36 is variable and changes according to the electric signal input to the light shielding mechanism 36. Furthermore, the light shielding rate of the light shielding mechanism 36 can be controlled for each area.
• Each region (light shielding region) in the light shielding mechanism 36 is a portion of the liquid crystal panel, electrochromic sheet, or ND filter that constitutes the light shielding mechanism 36, and corresponds to an area of the display 35.
• The planar size of the light shielding mechanism 36 is larger than the planar size of the display 35.
• Therefore, the light introduced into the observation device 30 and directed toward the eyepiece window 33 can be substantially completely blocked by the light shielding mechanism 36 at a position in front of the display 35.
• The photometric sensor 48 is installed in the observation device 30 and outputs a signal corresponding to the intensity of the ambient light. Note that the photometric sensor 48 is not limited to being provided inside the observation device 30, and may be provided inside the imaging device 10.
• The observation device 30 configured as described above operates under the control of the control unit 40 during imaging. Specifically, the control unit 40 controls the display mechanism 34 based on the display image data to display the image P (specifically, the live view image) indicated by the display image data on the display 35. At this time, when the above-described additional correction is performed on the image P, the control unit 40 causes the display 35 to display the corrected image P, that is, the image in which the output values (luminance) of the bright portion or dark portion have been corrected.
• Light (luminous flux) from the subject is introduced into the observation device 30 through the introduction window 31.
• The luminous flux is condensed by the upstream lens 32a to form an optical image on the display 35.
  • the light of each of the optical image and the image P is guided to the eyepiece window 33 through the lens 32b on the downstream side.
  • the display mechanism 34 displays the image P on the display 35 so that the user can observe the image P together with the optical image.
  • An image (inside viewfinder field) observable by the observation device 30 will be specifically described with reference to FIGS.
  • An optical image observable by the observation device 30 has a first optical area OP1 and a second optical area OP2, as shown in FIG.
  • the display mechanism 34 displays the image P superimposed on the first optical area OP1, as shown in FIG.
  • the first optical area OP1 corresponds to an area that overlaps the image P in the optical image.
  • the second optical area OP2 is an area that does not overlap the image P in the optical image, that is, an area other than the first optical area OP1.
  • the user can observe both the optical image of the first optical area OP1 and the image P by transmitting the light of each area of the optical image through the display device 35 .
  • the area of the display 35 through which the light of the first optical area OP1 passes corresponds to the first optical area OP1, and is hereinafter referred to as the first display area 35a. That is, the control unit 40 displays the image P in the first display area 35a.
  • the area of the display 35 through which the light of the second optical area OP2 passes corresponds to the second optical area OP2, and is hereinafter referred to as the second display area 35b.
  • the control unit 40 causes the display 35 to display setting information EI including exposure conditions such as the F-number, shutter speed, and ISO sensitivity together with the image P (see FIG. 3).
  • the setting information EI is superimposed on the optical image at a position away from the image P and displayed.
  • control unit 40 causes the display 35 to display a mark F indicating the imaging range (angle of view) of the imaging element 20 (see FIG. 3).
  • the mark F is superimposed and displayed on the optical image, and is, for example, a frame surrounding the area corresponding to the angle of view in the optical image, or an L-shaped mark indicating the boundary position of the area.
• The control unit 40 specifies the positional relationship between the optical image of the subject and the angle of view based on the specifications of the imaging lens 12 and information about each part of the observation device 30, and displays the mark F on the display 35 according to the specified positional relationship.
  • the display size and display position when displaying the image P on the display 35 are variable as shown in FIGS.
• Specifically, the control unit 40 sets the display size and display position of the image P based on the operation content, and when receiving a user's change instruction, changes the display size or display position according to the instruction content.
  • control unit 40 sets or changes the display size and display position of the image P so that the display area of the image P and the display area of the setting information EI do not overlap on the display mechanism 34 (more specifically, the display device 35).
• That is, the display size and display position of the image P are set or changed while avoiding the display position of the setting information EI. As a result, the image P is displayed without interfering with the setting information EI, so that the user can appropriately observe the image P, that is, the subject within the angle of view.
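The non-overlap constraint between the display area of the image P and that of the setting information EI can be expressed as a simple rectangle test. The coordinate convention, the shift-left placement policy, and the function names below are illustrative assumptions; the description requires only that the two areas not overlap.

```python
def rects_overlap(a, b):
    # Axis-aligned overlap test for (x, y, width, height) rectangles.
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_image_area(requested, info_area, screen_width):
    # If the requested image area would overlap the setting-information
    # area, shift the image to its left (one possible policy; reducing
    # the display size would satisfy the constraint equally well).
    x, y, w, h = requested
    ix, _, _, _ = info_area
    if rects_overlap(requested, info_area):
        x = max(0, min(ix - w, screen_width - w))
    return (x, y, w, h)
```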
• When the display area of the mark F and that of the image P overlap on the display 35, the control unit 40 causes one of them to be displayed preferentially over the other in the overlapping area.
  • "Preferentially display” means that, for example, when the display areas of the indicator F and the image P overlap on the display 35, the indicator F and the image P are preferentially displayed on the indicator 35. It is to display in pixels. In the case shown in FIG. 12, the image P is preferentially displayed with respect to the marker F in the overlapping area.
• However, the display is not limited to this; the one selected by the control unit 40 from the mark F and the image P according to a predetermined rule may be displayed preferentially.
• When the observation device 30 is used in a situation where the intensity of the ambient light is high, the optical image looks bright and the visibility of the image P is reduced.
• Suppose that, when the observation device 30 is used in a photographing environment with a high intensity of ambient light, the light shielding rate of the light shielding mechanism 36 as a whole is controlled to a low value, as shown in the upper diagram of FIG.
• In this case, the image P superimposed and displayed on the first optical area OP1 is difficult to see due to the brightness of the first optical area OP1, as shown in the lower diagram of FIG.
• Moreover, since the first optical area OP1 can be seen through the image P, the visibility of the image P is further reduced.
• Note that illustration of the setting information EI and the mark F is omitted for convenience of explanation.
  • the control unit 40 controls the light shielding rate of each of the plurality of regions in the light shielding mechanism 36 based on the intensity of the ambient light. Specifically, the control unit 40 executes detection processing for detecting the intensity of ambient light, and executes control processing for the light shielding mechanism 36 according to the detection result.
  • control unit 40 detects the intensity of ambient light based on the output signal of the sensor 48 and the signal (image signal) generated by the imaging device 20 . In the control process, the control unit 40 controls the light shielding rate of each of the first light shielding region 36a and the second light shielding region 36b in the light shielding mechanism 36 based on the intensity of the detected ambient light.
• The first light shielding area 36a is an area of the light shielding mechanism 36 that overlaps the first optical area OP1, and exists at a position through which the light of the first optical area OP1 passes in the optical path of the light incident on the observation device 30.
  • the first light shielding area 36a is positioned upstream of the display 35 so as to overlap the first display area 35a, that is, at a position that shields light passing through the first display area 35a.
• The second light shielding area 36b is an area of the light shielding mechanism 36 that overlaps the second optical area OP2, and exists at a position through which the light of the second optical area OP2 passes in the optical path of the light incident on the observation device 30.
  • the second light blocking region 36b is located upstream of the display 35 and overlaps the second display region 35b, that is, at a position that blocks light passing through the second display region 35b.
• By this control, the visibility of the optical image and the image observed by the observation device 30 can be improved in consideration of the intensity of the ambient light.
  • the present embodiment can more appropriately adjust the visibility of the optical image and the image compared to the invention described in Japanese Patent Application Laid-Open No. 2002-200005, which adjusts the light amount by driving the lens aperture.
  • the light shielding rate of each of the first light shielding region 36a and the second light shielding region 36b can be individually controlled. That is, the brightness of each image and optical image can be individually adjusted.
  • the light shielding mechanism 36 makes the light shielding rate of the first light shielding area 36a sufficiently higher than that of the second light shielding area 36b (for example, the light shielding rate is 100%). This makes it possible to improve the visibility of the image P while maintaining the brightness of the optical image even when the intensity of the ambient light is extremely high, as shown in the lower diagram of FIG. 14 .
• In cases where complete shielding is not required, the light shielding rate may be set to, for example, about 50 to 90%.
  • the light shielding rate control in the control process will be described in detail.
• In the control process, the light shielding rate of the second light shielding area 36b is controlled based on the output signal of the sensor 48. This is because the second light shielding area 36b overlaps the second optical area OP2, and the second optical area OP2 is observed with a brightness according to the amount of light incident on the observation device 30.
• That is, by detecting the intensity of the ambient light from the output signal of the sensor 48 and controlling the light shielding rate of the second light shielding area 36b based on the detection result, the light shielding rate can be controlled so as to improve the visibility of the second optical area OP2.
  • the light shielding rate of the first light shielding region 36a is controlled based on the output signal of the sensor 48 and the signal (image signal) generated by the imaging element 20. This is to consider the brightness balance between the optical image and the image P.
  • the brightness of the image P is determined according to the intensity of the subject (the inside of the mark F in FIG. 3) within the shooting angle of view.
  • the intensity is detected based on the signal generated by the imaging device 20, or more precisely, the integrated value of the exposure amount obtained from the signal.
• The brightness of the optical image is determined according to the amount of light incident on the observation device 30, that is, the intensity of the light in the entire imaging environment including the angle of view of the observation device 30, and is detected based on the output signal of the sensor 48.
• The intensity of the ambient light within the photographing angle of view and the intensity of the light over the entire angle of view of the observation device 30 may differ from each other.
• For the above reasons, the light shielding rate of the first light shielding area 36a is controlled based on the output signal of the sensor 48 and the signal generated by the imaging element 20. This makes it possible to balance the brightness between the optical image and the image P and, for example, to display the image P clearly against the optical image in the first display area 35a.
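The two control rules above can be sketched as follows. The mapping functions, the /100 scale, and the target brightness ratio are assumptions made for illustration; the description specifies only which signals drive which area, not the exact formulas.

```python
def shield_rate_second(ambient):
    # Second light shielding area: driven by the photometric sensor 48
    # alone. Brighter surroundings lead to a higher shielding rate,
    # clamped to [0.0, 1.0] (1.0 = fully shielded). The /100 scale is
    # an arbitrary placeholder for the sensor calibration.
    return min(1.0, max(0.0, ambient / 100.0))

def shield_rate_first(ambient, exposure_integral, target_ratio=2.0):
    # First light shielding area: combines the sensor output with the
    # integrated exposure amount obtained from the image sensor, so the
    # image P keeps a visible brightness margin over the optical image.
    base = shield_rate_second(ambient)
    # When the surroundings are much brighter than the scene within the
    # angle of view, shield more strongly behind the display.
    balance = min(1.0, ambient / (max(exposure_integral, 1e-6) * target_ratio))
    return min(1.0, base + (1.0 - base) * balance)
```

Because the first-area rate never falls below the second-area rate in this sketch, the image P is always at least as strongly backed by shielding as the surrounding optical image.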
• The first light shielding area 36a of the light shielding mechanism 36 is preferably configured to cover the entire first display area 35a on the upstream side (subject side) of the display 35 so as to sufficiently block the light passing through the first display area 35a. Therefore, the area of the first light shielding area 36a (the area when the light shielding mechanism 36 is viewed from the front) is equal to or larger than the area of the first display area 35a (the area when the display 35 is viewed from the front). In addition, since the light shielding mechanism 36 is arranged upstream of the display 35, increasing the light shielding rate of the light shielding mechanism 36 reduces the amount of light incident on the display 35, thereby suppressing deterioration of the display 35.
• Preferably, the area of the first light shielding area 36a is somewhat larger than the area of the first display area 35a.
• That is, it is preferable that the end of the first light shielding area 36a be positioned outside the end of the first display area 35a.
  • “outside” means the outside when the center position of the image P displayed in the first display area 35a (that is, the center position of the first display area 35a) is used as a reference, that is, the side far from the center position.
  • the optical image is observed through the second display area 35b of the display 35, and its brightness is determined according to the light shielding rate of the second light shielding area 36b.
  • the brightness of the edge regions of the optical image tends to be lower than the brightness of the central region.
  • the user may be able to manually change the light shielding rate of each light shielding area in the light shielding mechanism 36 . That is, after controlling the light shielding rate of each of the first light shielding area 36a and the second light shielding area 36b based on the intensity of the detected ambient light, the control unit 40 controls the light shielding rate of each light shielding area according to the user's input operation. The shading rate may be re-controlled.
  • control unit 40 may perform control processing so that the image in the focus area is displayed more clearly in the image displayed in the first display area 35a.
• For example, the light shielding rate of the area of the first light shielding area 36a that overlaps the image of the in-focus area (the area surrounded by the dashed line in the upper diagram of FIG. 16) may be controlled to be higher than the light shielding rate of the area overlapping the image of the out-of-focus area.
• That is, in the first light shielding area 36a, the light shielding rate of the area overlapping the image of the in-focus area may be increased, or the light shielding rate of the area overlapping the image of the out-of-focus area may be decreased.
• As a result, the image in the in-focus area becomes conspicuous.
  • the method of displaying the image of the in-focus area in a conspicuous manner is not limited to the control of the light shielding rate described above, and other methods are also conceivable.
  • the display mode of the image in the in-focus area may be changed to a different mode from the display mode of the image in the out-of-focus area.
  • the display mode of an image includes the presence/absence of luminance and hue change, the presence/absence of blinking display or highlight display, and the presence/absence of display of pointing objects such as pointers and cursors.
• The ambient light intensity detected by the detection processing may fall below a reference value, for example, when the shooting environment is dark, such as outdoors at night.
  • the reference value is, for example, a value corresponding to the intensity of ambient light when the shooting environment is dark.
• In such a case, the control unit 40 may increase the light shielding rate of each of the first light shielding area 36a and the second light shielding area 36b, for example, controlling it to a value near the upper limit. This is because the need to check the optical image of the subject is reduced when the shooting environment is dark.
• In this case, the light shielding rate of each light shielding area of the light shielding mechanism 36 may be increased to shield the light from the subject, and the angle of view of the image P displayed on the display mechanism 34 may be expanded as shown in FIG. Thereby, the user can concentrate on confirming the image P without observing the optical image.
• In that case, the control unit 40 increases the light shielding rate of each light shielding area at the time when the corresponding user operation is received.
  • step S002 is an optional step and can be omitted.
• The control unit 40 then executes the processes from step S003 onward. Specifically, the control unit 40 executes the detection processing to detect the intensity of the ambient light based on the signal generated by the imaging element 20 and the output signal of the sensor 48 (S003).
• The control unit 40 determines whether or not additional correction is necessary for the output values of the image based on the image data created during imaging (S004). If additional correction is necessary, the control unit 40 corrects the output values with a correction amount corresponding to the intensity detected in step S003 (S005). If no additional correction is required, step S005 is omitted.
• After that, the display image data is created (S006), and the control unit 40 causes the display 35 provided in the display mechanism 34 to display the image P indicated by the display image data (S007).
• The display area of the image P, that is, the first display area 35a, is at the initially set position or a position preset by the user.
• When the additional correction is performed in step S005, the image P is displayed in the first display area 35a based on the corrected gradation values (more specifically, the corrected output values).
  • the control unit 40 causes the display 35 to display setting information EI regarding exposure conditions and the like, and a mark F indicating the angle of view (see FIG. 3).
• Meanwhile, the optical image is formed by the optical system 32 and passes through the display 35.
  • the user looks into the observation device 30 to observe both the image P displayed in the first display area 35 a and the optical image transmitted through the display 35 .
  • control unit 40 executes control processing and controls the light shielding rate of each of the plurality of light shielding regions in the light shielding mechanism 36 based on the intensity of the ambient light detected in the detection processing (S008).
• Specifically, the light shielding rate of the first light shielding area 36a is controlled based on the output signal of the sensor 48 and the signal generated by the imaging element 20, and the light shielding rate of the second light shielding area 36b is controlled based on the output signal of the sensor 48.
• In the above description, the control process is executed after the image is displayed, but the control process may be executed at the same time as the image is displayed on the display 35, or may be executed before the image is displayed.
  • the control unit 40 accepts the change operation and changes the display size or the display position of the first display area 35a in which the image P is displayed according to the content of the change operation (S010).
  • the control unit 40 executes the control process again with the change of the first display area 35a (S011).
• Specifically, the light shielding rate is controlled for each of the area that overlaps the changed first display area 35a and newly becomes the first light shielding area 36a, and the area that overlaps the changed second display area 35b and newly becomes the second light shielding area 36b.
• The display size or display position of the first display area 35a may be changed such that the first display area 35a and the display area of the mark F overlap on the display 35.
• In that case, the control unit 40 may display one of the image P and the mark F preferentially over the other in the overlap region where the first display area 35a and the display area of the mark F overlap (see FIG. 12).
• The user can use the observation device 30 through the above series of steps. The use of the observation device 30 ends when a predetermined termination condition is met (S012), for example, when the imaging device 10 is powered off.
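The series of steps above (S003 through S012) can be condensed into a control-loop sketch. The function, its parameters, and the change-operation placeholders are stand-ins invented here; only the ordering of the steps follows the description.

```python
def observation_cycle(needs_additional, change_operations):
    # Returns the ordered list of executed steps for one use of the
    # observation device, mirroring the flow in the description.
    steps = ["S003"]                 # detect ambient-light intensity
    steps.append("S004")             # decide whether additional correction is needed
    if needs_additional:
        steps.append("S005")         # correct output values by detected intensity
    steps += ["S006", "S007"]        # create display data, display image P
    steps.append("S008")             # control the light shielding rates
    for _ in change_operations:
        steps += ["S010", "S011"]    # change display area, re-run control process
    steps.append("S012")             # termination condition met
    return steps
```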
  • the control unit 40 controls the light shielding rate of each of the plurality of light shielding regions in the light shielding mechanism 36 based on the intensity of the ambient light.
  • other forms of controlling the light shielding rate of each light shielding area are conceivable, and one example thereof will be described below as a second embodiment.
• Regarding the second embodiment, the points that differ from the first embodiment will mainly be described, and description of the points common to the first embodiment will be omitted.
  • the light shielding rate of each light shielding area is controlled by reflecting the user's will. That is, in the second embodiment, the user directly or indirectly designates the light shielding rate of each light shielding area, and in the control process, the light shielding rate of each light shielding area is set to the light shielding rate designated by the user. Control.
• The control unit 40 causes the rear display 22 or the display mechanism 34 to display the mode designation screen shown in FIG. 19.
  • the user performs an input operation to designate a mode through the mode designation screen.
  • the input operation at this time is performed via, for example, a command dial, selection button 26, or touch panel 27 provided on the imaging device 10 .
  • these input operations may be performed through a command dial, a selection button, or a touch panel provided on the observation device 30, for example.
• On the mode designation screen shown in FIG. 19, "mode A", "mode B", and "mode C" can be designated as an example of the plurality of modes.
• The number of modes, their contents, and the like are not particularly limited.
• Each of the plurality of modes is associated with a display size of the image P on the display mechanism 34 (the display size of the image P on the display 35) and light shielding rates corresponding to the plurality of light shielding areas of the light shielding mechanism 36. More specifically, the display size of the image and the light shielding rate of each light shielding area are set for each mode. As shown in FIG. 20, these set values are stored in a storage device such as the internal memory 50, the memory card 52, or an external server while being associated with one of the plurality of modes.
• When the user designates one of the modes by the input operation, the control unit 40 accepts the input operation and executes the control process according to the contents of the input operation. In the control process according to the second embodiment, the control unit 40 controls the light shielding rate of each light shielding area and also controls the display size of the image P on the display mechanism 34.
  • the control unit 40 reads the display size and the light shielding rate of each light shielding area according to the user's input operation from the storage device. Specifically, the display size corresponding to the mode designated by the user and the light shielding rate of each light shielding area are read out. Then, the control unit 40 controls the display size of the image P and the light shielding rate of each light shielding area according to each read value.
  • the light shielding rate of each light shielding area can be controlled by reflecting the user's intention.
• In the second embodiment, the user only has to designate one of the plurality of modes, which improves usability (it saves the trouble of directly inputting the light shielding rates).
  • the light shielding rate of each light shielding area is read out together with the display size of the image P, and these values are controlled as a set. Therefore, efficient control is realized.
  • the control of the display size of the image P and the control of the light shielding rate of each light shielding area may be performed separately.
  • the light shielding rate of each light shielding area may be adjusted based on the intensity of the ambient light.
  • the control unit 40 reads the light shielding rate of each light shielding area according to the user's input operation from the storage device, corrects the read light shielding rate based on the intensity of the ambient light, and obtains the corrected light shielding rate. It is preferable to control the light shielding rate of each light shielding area as follows.
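The mode table of FIG. 20 can be sketched as a lookup keyed by the designated mode. The concrete sizes, rates, and the linear ambient-light correction below are invented placeholders; the description states only that a display size and per-area shielding rates are stored per mode and that the read-out rates may be corrected for the ambient-light intensity.

```python
# Hypothetical stored set values: display size of the image P and the
# shielding rates of the first and second light shielding areas.
MODE_TABLE = {
    "A": {"display_size": (640, 480), "first": 1.00, "second": 0.30},
    "B": {"display_size": (480, 360), "first": 0.85, "second": 0.50},
    "C": {"display_size": (320, 240), "first": 0.70, "second": 0.70},
}

def apply_mode(mode, ambient=None):
    # Read the set values for the designated mode; optionally correct the
    # read-out shielding rates for the current ambient-light intensity
    # (a simple linear scale clamped to [0.5, 1.0] of the stored value).
    cfg = dict(MODE_TABLE[mode])
    if ambient is not None:
        scale = min(1.0, max(0.5, ambient / 100.0))
        cfg["first"] = min(1.0, cfg["first"] * scale)
        cfg["second"] = min(1.0, cfg["second"] * scale)
    return cfg
```

Reading the display size and the shielding rates as one record is what makes the set-based control efficient: a single lookup configures both the display mechanism and the light shielding mechanism.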
• An optical system 32 is arranged in the observation device 30 as shown in FIG. 2, and the optical system 32 includes the lens 32a. If the lens 32a is a lens that produces distortion, such as a fisheye lens, the optical image and the image observed through the lens appear distorted over the wide angle of view. While the optical image may be allowed to appear distorted, the image P needs to have its distortion corrected so that the recorded image can be checked.
  • An embodiment in which an image is displayed in consideration of lens distortion in the observation device 30 will be described below as a third embodiment.
• Regarding the third embodiment, the points that differ from the first embodiment will mainly be described, and description of the points common to the first embodiment will be omitted.
  • the control unit 40 (strictly speaking, the correction unit 47) corrects the image P according to the distortion aberration.
  • control unit 40 reads out information about the distortion aberration of the lens 32a from the internal memory 50 or the like, and based on the information, performs known aberration reduction correction on the image data to create display image data.
• The corrected image P displayed based on the created display image data is observed through the lens 32a, but with the distortion due to aberration reduced, as shown in FIG.
  • the correction for reducing aberration is not limited to the configuration performed by the control unit 40, and may be performed, for example, in a module provided in the display 35 by a control circuit within the module.
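The aberration-reducing correction conceptually remaps image coordinates to cancel the lens's radial distortion. A first-order radial model is used below purely as an illustration; the actual correction would use the distortion data of the lens 32a read from the internal memory 50, and the coefficient k1 here is an assumed value.

```python
def correct_radial_distortion(x, y, k1):
    # Cancel first-order radial (barrel/pincushion) distortion for a
    # normalized coordinate (x, y) with the image center at the origin.
    # k1 is an assumed distortion coefficient of the lens.
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

def correct_points(points, k1):
    # Apply the correction to every sample point of the display image,
    # leaving the mark F untouched (it is drawn distorted on purpose).
    return [correct_radial_distortion(x, y, k1) for x, y in points]
```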
• In the third embodiment, the mark F consisting of a frame surrounding the imaging range (angle of view) of the imaging element 20 is superimposed on the optical image together with the image P and displayed on the display 35 of the display mechanism 34.
  • This mark F is observed through the lens 32a, and essentially has a rectangular shape when there is no distortion caused by the lens 32a.
• The mark F indicates the position of the angle of view in the optical image. Therefore, when the optical image is observed with distortion, the mark F must also be observed with the same distortion.
• The control unit 40 reads information about the optical system 32 including the lens 32a and information about the imaging range from the internal memory 50 or the like, and specifies the positional relationship between the optical image and the imaging range based on these pieces of information. After that, based on the specified positional relationship, the control unit 40 displays the mark F in the area of the display 35 corresponding to the imaging range (strictly speaking, the area corresponding to the boundary of the imaging range). At this time, the mark F is displayed without the aberration-reducing correction, and is therefore distorted in the same manner as the optical image, as shown in FIG. 21.
• As a result, the image P can be observed with its distortion suppressed.
  • the mark F is displayed in a distorted state according to the optical image, it is possible to correctly grasp the shooting range in the optical image.
• In the above description, the mark F is displayed without the aberration-reducing correction, but the present invention is not limited to this.
• For example, the mark F may be displayed with weaker correction than the image. Weaker correction means that the degree to which distortion is reduced by the correction, that is, the correction amount for cancelling the distortion, is smaller.
  • an external observation device 130 may be detachably connected to the upper portion of the main body of the imaging device 10 via the connection portion 14 (see FIG. 1).
• In the above-described embodiments, the control unit 40 that controls each part of the observation device 30 is configured by the processor provided in the imaging device 10. That is, the observation device 30 for the imaging device 10 including the control unit 40 has been described, but the present invention is not limited to this.
• The external observation device 130 may have a control unit 30Y that controls each part of the observation device main body 30X, separately from the control unit of the imaging device main body (hereinafter referred to as the main body side control unit 140).
• That is, a processor built into the external observation device 130 may constitute the control unit 30Y, and the control unit 30Y may execute processing related to image display in the observation device 130, such as the detection processing and the control processing.
  • both the body-side control unit 140 and the control unit 30Y of the observation device 130 may execute processing related to image display in the observation device 130, such as detection processing and control processing.
  • In the embodiments described above, an HVF for a digital camera has been described, but the present invention can also be applied to an HVF used in imaging devices other than digital cameras (for example, video cameras and smartphones).
  • In the embodiments described above, the observation device 30 including the light-transmissive display 35 was given as an example of an observation device that displays an image superimposed on an optical image, but the present invention is not limited to this.
  • For example, a prism 231 constituting a beam splitter may be placed in the optical path of the optical image, and the image on the display 233, after passing through the lens 232, may be reflected at right angles by the prism 231 so that the observation device 230 superimposes the image on the optical image.
  • [Appendix 1] An observation device for an imaging device that has an imaging element and is controlled by a processor, the observation device comprising: a lens that forms an optical image of a subject so that a user can observe it; a transmissive display that displays an image so that the user can observe it; and a light shielding mechanism whose light shielding rate is changed by an electric signal, wherein the optical image has a first optical area and a second optical area; the image is an image based on a signal generated by the imaging element; the transmissive display superimposes the image on the first optical area and displays the image and the optical image so that the user can observe both; the light shielding mechanism is arranged in an optical path between the subject and the transmissive display and includes a first light shielding area overlapping the first optical area and a second light shielding area overlapping the second optical area; and the processor controls the light shielding rates of the first light shielding area and the second light shielding area of the light shielding mechanism based on the intensity of ambient light.
  • [Appendix 2] The observation device according to Appendix 1, wherein the processor controls the light shielding rates of the first light shielding area and the second light shielding area of the light shielding mechanism based on an output signal of a sensor that outputs a signal corresponding to the intensity of the ambient light.
  • [Appendix 3] The observation device according to Appendix 2, wherein the imaging element generates the signal by capturing a part of the subject observed as the optical image, and the processor controls the light shielding rate of the first light shielding area based on the signal generated by the imaging element and the output signal of the sensor.
  • [Appendix 4] The observation device according to any one of Appendices 1 to 3, wherein an end portion of the first light shielding area is located outside an end portion of the display area of the image.
  • [Appendix 5] The observation device according to any one of Appendices 1 to 4, wherein the processor controls the light shielding rate of the end areas of the second light shielding area so that it is lower than the light shielding rate of the central area of the second light shielding area.
  • [Appendix 6] The observation device according to any one of Appendices 1 to 5, wherein, in order for the user to observe the image and the optical image together, the image is displayed on a part of the transmissive display corresponding to the first optical area while the optical image is transmitted through the transmissive display.
  • [Appendix 7] The observation device according to any one of Appendices 1 to 6, wherein the image includes an image of an in-focus area and an image of an out-of-focus area other than the in-focus area, and the processor controls the light shielding rate of the area of the first light shielding area overlapping the image of the in-focus area so that it is higher than the light shielding rate of the area overlapping the image of the out-of-focus area.
  • [Appendix 8] The observation device according to any one of Appendices 1 to 6, wherein the image includes an image of an in-focus area and an image of an out-of-focus area other than the in-focus area, and a display mode of the image of the in-focus area is different from a display mode of the image of the out-of-focus area.
  • [Appendix 9] The observation device according to any one of Appendices 1 to 8, wherein the processor corrects a gradation value according to the signal generated by the imaging element, the transmissive display displays the image based on the corrected gradation value, and the correction amount for the gradation value is set according to the intensity of the ambient light.
  • [Appendix 10] The observation device according to any one of Appendices 1 to 9, wherein the transmissive display displays a mark indicating an imaging range of the imaging element together with the image, and in an overlap area of the transmissive display where the display area of the mark and the display area of the image overlap, one of the mark and the image is displayed with priority over the other.
  • [Appendix 11] The observation device according to Appendix 10, wherein the one of the mark and the image selected by the user is displayed with priority over the other in the overlap area.
  • [Appendix 12] The observation device according to any one of Appendices 1 to 11, for use with the imaging device that includes the processor.
  • [Appendix 13] An observation device for an imaging device that has an imaging element and is controlled by a processor, the observation device comprising: a lens that forms an optical image of a subject so that a user can observe it; a transmissive display that displays an image so that the user can observe it; and a light shielding mechanism whose light shielding rate is changed by an electric signal, wherein the optical image has a first optical area and a second optical area; the transmissive display superimposes the image on the first optical area and displays the image and the optical image so that the user can observe both; the light shielding mechanism is arranged in an optical path between the subject and the transmissive display and has a plurality of light shielding areas overlapping the first optical area and the second optical area; and, in response to the user's input operation, the processor reads the display size of the image on the transmissive display and the light shielding rates corresponding to the plurality of light shielding areas from a storage device that stores the display size and the light shielding rates, and controls the transmissive display and the light shielding mechanism accordingly.
  • [Appendix 14] The observation device according to Appendix 13, wherein the display size and the light shielding rates corresponding to the plurality of light shielding areas are stored in the storage device in association with each of a plurality of modes, and the input operation is an operation by which the user specifies one of the plurality of modes.
  • [Appendix 15] An observation device for an imaging device that has an imaging element and is controlled by a processor, the observation device comprising: a lens that forms an optical image of a subject so that a user can observe it; and a transmissive display that displays an image so that the user can observe it, wherein the optical image has a first optical area and a second optical area; the transmissive display superimposes the image on the first optical area, displays the image and the optical image so that the user can observe both, and displays a mark surrounding the imaging range of the imaging element; the image is displayed after being corrected according to the distortion caused by the lens; and the mark is displayed without the correction, or with correction weaker than that applied to the image, based on the positional relationship between the optical image observed with the distortion and the imaging range.
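The control described in Appendices 1 to 3 can be illustrated with a small sketch. The specific mapping from the sensor output and the image signal to the light shielding rates below is a hypothetical example for illustration, not a rule stated in this disclosure:

```python
def light_shielding_rates(sensor_lux, image_mean_level, lux_max=10000.0):
    """Return hypothetical shielding rates (first_area, second_area) in [0, 1].

    sensor_lux:       ambient-light intensity reported by the sensor (Appendix 2)
    image_mean_level: mean level of the imaging-element signal in [0, 1],
                      considered for the first area only (Appendix 3)
    """
    # Normalize the ambient-light intensity to [0, 1].
    ambient = min(max(sensor_lux / lux_max, 0.0), 1.0)
    # The second light shielding area overlaps only the optical image, so its
    # rate here follows ambient intensity alone: the brighter the scene, the
    # more it is dimmed.
    second = ambient
    # The first light shielding area overlaps the superimposed image, so its
    # rate also considers the image signal: a darker image needs stronger
    # shielding to remain visible over a bright optical image.
    first = min(1.0, ambient * (1.5 - image_mean_level))
    return first, second
```

For example, in the dark (`sensor_lux = 0`) both areas pass the optical image unshielded, while in bright light a dark superimposed image drives the first area toward full shielding.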
  • 10 Imaging device, 12 Imaging lens, 12a Focus lens, 14 Connection portion, 16 Aperture, 18 Shutter, 20 Imaging element, 22 Rear display, 24 Operation unit, 26 Selection button, 27 Touch panel, 28 Lens drive mechanism, 30 Observation device, 30X Observation device main body, 30Y Control unit, 32 Optical system, 32a/32b Lens, 34 Display mechanism, 35 Display, 35a First display area, 35b Second display area, 36 Light shielding mechanism, 36a First light shielding area, 36b Second light shielding area, 40 Control unit, 42 Control processing unit, 44 Image creating unit, 45 A/D conversion unit, 46 Image data creation unit, 47 Correction unit, 48 Sensor, 50 Internal memory, 52 Memory card, 130 Observation device, 140 Main body side control unit, 230 Observation device, 231 Prism, 232 Lens, 233 Display, EI Setting information, F Mark, OP1 First optical area, OP2 Second optical area, P Image


Abstract

The present invention provides an observation device that, when an image is displayed superimposed on an optical image of a subject, can adjust the visibility of the image and the optical image according to the intensity of ambient light. The observation device (30) comprises an optical system (32), a display mechanism (34) that displays the image based on a signal generated by an imaging element (20), and a light shielding mechanism (36) having a variable light shielding rate. The optical image has a first optical area (OP1) and a second optical area (OP2). The display mechanism (34) superimposes the image on the first optical area (OP1) and displays it so that a user can observe the image together with the optical image. The light shielding mechanism (36) is positioned in the optical path between the subject and the display mechanism (34) and has a first light shielding area (36a) overlapping the first optical area (OP1) and a second light shielding area (36b) overlapping the second optical area (OP2). A control process is executed on the light shielding mechanism (36) by a control unit (40) to control the light shielding rates of the first light shielding area (36a) and the second light shielding area (36b) based on the intensity of the ambient light.
PCT/JP2021/042380 2021-01-28 2021-11-18 Observation device WO2022163080A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022578068A JPWO2022163080A1 (fr) 2021-01-28 2021-11-18
US18/360,663 US20230370712A1 (en) 2021-01-28 2023-07-27 Observation apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-011985 2021-01-28
JP2021011985 2021-01-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/360,663 Continuation US20230370712A1 (en) 2021-01-28 2023-07-27 Observation apparatus

Publications (1)

Publication Number Publication Date
WO2022163080A1 true WO2022163080A1 (fr) 2022-08-04

Family

ID=82653170

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/042380 WO2022163080A1 (fr) Observation device

Country Status (3)

Country Link
US (1) US20230370712A1 (fr)
JP (1) JPWO2022163080A1 (fr)
WO (1) WO2022163080A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003078785A (ja) * 2001-08-30 2003-03-14 Olympus Optical Co Ltd カメラのファインダ装置
JP2012032622A (ja) * 2010-07-30 2012-02-16 Nikon Corp 表示装置およびカメラ
WO2013136907A1 (fr) * 2012-03-15 2013-09-19 富士フイルム株式会社 Dispositif de formation d'image et procédé pour l'affichage d'un viseur électronique
JP2015232665A (ja) * 2014-06-11 2015-12-24 キヤノン株式会社 撮影装置
JP2018011152A (ja) * 2016-07-12 2018-01-18 キヤノン株式会社 表示装置およびその制御方法
WO2019087713A1 (fr) * 2017-10-31 2019-05-09 富士フイルム株式会社 Dispositif de viseur, dispositif d'imagerie et procédé de commande de dispositif de viseur


Also Published As

Publication number Publication date
JPWO2022163080A1 (fr) 2022-08-04
US20230370712A1 (en) 2023-11-16

Similar Documents

Publication Publication Date Title
JP5512878B2 (ja) Imaging device and display control method
JP5723488B2 (ja) Camera and method of controlling operation of same
US8471935B2 (en) Imaging apparatus having an image correction function and method for controlling the same
JP2007214964A (ja) Video display device
JPWO2012035822A1 (ja) Display control method for finder device and finder device
US8078049B2 (en) Imaging apparatus
US9143694B2 (en) Camera and method of controlling operation of same
US8169529B2 (en) Apparatus and methods for performing light metering in an imaging apparatus
US20200404135A1 (en) Image capturing device
JP2010016613A (ja) Imaging device
WO2022163080A1 (fr) Observation device
JP2014048491A (ja) Imaging device
JP5153441B2 (ja) Imaging device, control method therefor, and program
JP2008154158A (ja) Digital single-lens reflex camera
JP2006311126A (ja) Imaging device
JP2016063391A (ja) Imaging device, display device, and electronic apparatus
JP2014048492A (ja) Imaging device
JP5053942B2 (ja) Imaging device
US10567662B2 (en) Imaging device and control method therefor using shift direction calculation
JP5078779B2 (ja) Imaging device
JP4518104B2 (ja) Camera
JP2009037031A (ja) Display device and camera equipped with same
JP2002023245A (ja) Camera
JP2005006217A (ja) Digital single-lens reflex camera
GB2614592A (en) Electronic device, method of controlling the same, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21923085

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022578068

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21923085

Country of ref document: EP

Kind code of ref document: A1