JP2010166281A - Imaging apparatus, photometry method and program - Google Patents

Imaging apparatus, photometry method and program

Info

Publication number
JP2010166281A
Authority
JP
Japan
Prior art keywords
imaging
exposure
image
signal
photometric
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2009006289A
Other languages
Japanese (ja)
Inventor
Shinichi Fujii
Hirotaka Ui
Motoyuki Yamaguchi
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Priority to JP2009006289A
Publication of JP2010166281A
Legal status: Pending

Abstract

Provided is a technique for performing photometry efficiently, eliminating wasted time in a photometric operation that uses an image sensor.
An imaging apparatus includes: an imaging element that generates an image signal of a subject image; a signal adjustment unit 622 that optimizes the cycle of a control signal governing the operation of the imaging element according to the imaging element's exposure time; a shooting control unit 621 that sequentially executes a plurality of exposures having different exposure times, one per cycle of the control signal optimized by the signal adjustment unit 622; and photometric control means that performs photometry based on the image signals acquired by the respective exposures.
[Selected figure] FIG. 5

Description

  The present invention relates to a photometric technique.
  In general, an image pickup apparatus (digital camera) including an image pickup device has a live view function that displays, as a moving image on a display unit (monitor), time-series images of a subject acquired by the image pickup device.
  In an imaging apparatus having such a live view function, photometry is often performed using an image signal acquired by exposure by an imaging element.
  The image sensor can acquire a good image signal for light amounts within its dynamic range. However, when subjects of widely varying luminance are to be photographed, the required photometric range becomes correspondingly wide, and it has become difficult for the dynamic range of the image sensor to cover the entire required photometric range.
  Thus, for example, a technique has been proposed that covers the entire required photometric range by performing multiple exposures under different exposure conditions during photometry (for example, Patent Document 1).
Japanese Patent Laid-Open No. 2004-221585
  However, in the technique of Patent Document 1, all of the multiple exposures for photometry are executed sequentially within frame periods of a constant interval, according to the vertical synchronization signal that defines the frame period for live view display. Consequently, when an exposure time is short relative to the frame period, wasted time occurs in the photometric operation.
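The inefficiency described above can be illustrated numerically. In this sketch, the 1/60 s frame period follows the text, while the short exposure time is an assumed example; an exposure much shorter than the fixed frame period leaves most of that frame idle.

```python
FRAME_PERIOD = 1 / 60  # live-view frame period (s)

def idle_fraction(exposure_time: float) -> float:
    """Fraction of a fixed frame period left unused by one exposure."""
    return max(0.0, 1.0 - exposure_time / FRAME_PERIOD)

for t in (1/60, 1/480):
    print(f"exposure {t:.5f}s: {idle_fraction(t):.0%} of the frame idle")
```

With a 1/480 s exposure, seven eighths of the frame period is spent waiting, which is exactly the waste the invention targets.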
  Therefore, an object of the present invention is to provide a technique for efficiently performing photometry with an image sensor while eliminating wasted time in the photometric operation.
  The present invention provides an imaging apparatus including: an imaging element that generates an image signal of a subject image; optimization means that optimizes the cycle of a control signal controlling the operation of the imaging element according to the exposure time of the imaging element; shooting control means that sequentially executes a plurality of exposures having different exposure times, one per cycle of the control signal optimized by the optimization means; and photometric control means that performs photometry based on the image signals acquired by the imaging element in the respective exposures.
  According to the present invention, it is possible to efficiently perform a photometric operation using an image sensor.
FIG. 1 is a diagram showing the external configuration (front) of an imaging apparatus according to an embodiment of the present invention.
FIG. 2 is a diagram showing the external configuration (rear) of the imaging apparatus.
FIG. 3 is a longitudinal sectional view of the imaging apparatus.
FIG. 4 is a longitudinal sectional view of the imaging apparatus.
FIG. 5 is a block diagram showing the electrical configuration of the imaging apparatus.
FIG. 6 is a time chart for explaining the photometric operation in the live view mode of the imaging apparatus.
FIG. 7 is an enlarged view of a part of FIG. 6.
FIG. 8 is a diagram for explaining three-pattern photometric exposure.
FIG. 9 is an enlarged view showing signal transmission paths in a partial region of the image sensor.
FIG. 10 is a diagram showing readout regions in the image sensor.
FIG. 11 is a time chart for explaining the photometric operation in the live view mode of an imaging apparatus according to a comparative example.
FIG. 12 is a flowchart of the imaging operation of the imaging apparatus.
FIG. 13 is a flowchart of the imaging operation of the imaging apparatus.
  Embodiments of the present invention will be described below with reference to the drawings.
<Embodiment>
[Appearance Configuration of Imaging Device 1]
FIGS. 1 and 2 are diagrams showing the external configuration of an imaging apparatus 1 according to the embodiment of the present invention; FIG. 1 is a front view and FIG. 2 is a rear view.
  The imaging device 1 is configured as, for example, a single-lens reflex digital still camera, and includes a camera body 10 and a photographing lens unit 2 as an interchangeable lens that is detachable from the camera body 10.
  Specifically, as shown in FIG. 1, the front side of the camera body 10 is provided with: a mount portion 301 at the front center, on which the photographing lens unit 2 is mounted; a lens exchange button 302 disposed on the right side of the mount portion 301; a grip portion 303 for gripping the apparatus; a mode setting dial 305 disposed at the upper left of the front surface; a control value setting dial 306 disposed at the upper right of the front surface; and a shutter button (release button) 307 on the upper surface of the grip portion 303.
  The photographing lens unit 2 functions as a lens window that takes in light forming a subject image (subject light), and as a photographing optical system that guides the subject light to the image sensor 101 disposed inside the camera body 10.
  More specifically, the photographing lens unit 2 includes a lens group 21 consisting of a plurality of lenses arranged in series along the optical axis LT (see FIG. 5). The lens group 21 includes a focus lens 211 (FIG. 5) for adjusting the focus and a zoom lens 212 (FIG. 5) for zooming, each of which is driven in the direction of the optical axis LT to perform focus adjustment or zooming. In addition, the photographing lens unit 2 is provided, at an appropriate position on the outer periphery of the lens barrel, with an operation ring that can rotate along the barrel's outer peripheral surface; for example, the focus lens 211 can be moved in the direction of the optical axis LT according to the rotation direction and rotation amount of the operation ring.
  The mount 301 is provided with a connector Ec (see FIG. 5) for electrical connection with the mounted photographic lens unit 2 and a coupler 75 (FIG. 5) for mechanical connection.
  The lens exchange button 302 is a button that is pressed when the photographing lens unit 2 attached to the mount unit 301 is removed.
  The grip portion 303 is the portion where the photographer (user) holds the imaging device 1 during photographing, and its surface is provided with unevenness matching the shape of the fingers to improve the fit. A battery storage chamber and a card storage chamber (not shown) are provided inside the grip portion 303. The battery storage chamber stores a battery 69B (see FIG. 5) as the power source of the imaging apparatus 1, and the card storage chamber detachably stores a memory card 67 (FIG. 5) for recording image data of photographed images. Note that the grip portion 303 may be provided with a grip sensor for detecting whether or not the user is gripping it.
  The mode setting dial 305 and the control value setting dial 306 are made of substantially disk-shaped members that can rotate in a plane substantially parallel to the upper surface of the camera body 10. The mode setting dial 305 is for selecting among various modes: shooting modes (portrait shooting mode, landscape shooting mode, full-auto shooting mode, etc.), a playback mode for playing back captured images, and a communication mode for performing data communication with external devices. The control value setting dial 306 is for setting control values for the various functions installed in the imaging apparatus 1.
  The shutter button (release button) 307 is a two-stage detection button that can detect a “half-pressed state”, in which it is pressed halfway, and a “fully pressed state”, in which it is pressed further. When the shutter button 307 is half-pressed (S1) in the shooting mode, preparatory operations for photographing a still image of the subject (exposure preparation operations such as setting an exposure value and focus detection) are executed. When the shutter button 307 is fully pressed (S2), the photographing operation is executed: the image sensor 101 (see FIG. 4) is exposed, predetermined image processing is applied to the image signal obtained by the exposure, and the result is recorded on the memory card 67.
  As shown in FIG. 2, the back side of the camera body 10 is provided with: an LCD (Liquid Crystal Display) 311 that functions as a display unit; a finder window 316 disposed above the LCD 311; an eye cup 321 surrounding the finder window 316; a main switch 317 disposed on the left side of the finder window 316; and a flash unit 318 and a connection terminal unit 319 disposed above the finder window 316. The back side of the camera body 10 is further provided with a setting button group 312 disposed on the left side of the LCD 311, a direction selection key 314 disposed on the right side of the LCD 311, a push button 315 disposed at the center of the direction selection key 314, and a display changeover switch 85 disposed at the lower right of the direction selection key 314.
  The LCD 311 includes a color liquid crystal panel capable of displaying images, and displays images captured by the image sensor 101 (see FIG. 3), reproduces and displays recorded images, and displays setting screens for functions and modes. Note that an organic EL display device or a plasma display device may be used instead of the LCD 311.
  The viewfinder window (eyepiece window) 316 constitutes an optical viewfinder (OVF), and subject light that has passed through the photographing lens unit 2 is guided to it. By looking through the finder window 316, the user can observe the subject image that is to be captured by the image sensor 101.
  The main switch 317 is a two-contact slide switch that slides to the left and right. When the switch is set to the left, the power supply of the imaging device 1 is turned on. When the switch is set to the right, the power supply of the imaging device 1 is turned off.
  The flash unit 318 is configured as a pop-up built-in flash. The external flash can be attached to the camera body 10 using the connection terminal portion 319.
  The eye cup 321 functions as a light shielding member that suppresses the entry of external light into the finder window 316.
  The setting button group 312 is a button for performing operations on various functions installed in the imaging apparatus 1. The setting button group 312 includes, for example, a menu button for displaying a menu screen on the LCD 311 and a menu switching button for switching the contents of the menu screen.
  The direction selection key 314 has an annular member with a plurality of pressing portions (the triangular marks in the drawing) arranged at regular intervals in the circumferential direction, and pressing operations on the pressing portions are detected by contacts (switches, not shown) provided corresponding to each of them. The push button 315 is disposed at the center of the direction selection key 314. The direction selection key 314 and the push button 315 are used to change the shooting magnification (moving the zoom lens 212 (see FIG. 5) in the wide or tele direction), to advance frames of recorded images reproduced on the LCD 311, and to set shooting conditions (aperture value, shutter speed, presence or absence of flash emission, etc.).
  The display changeover switch 85 is a two-position slide switch. When its contact is set to the upper “optical” position, the optical finder mode (also referred to as “OVF mode”) is selected and the subject image is displayed in the optical finder field of view. Accordingly, the user can view the subject image in the optical viewfinder field through the viewfinder window 316 and perform composition determination (framing).
  On the other hand, when the contact of the display changeover switch 85 is set to the lower “monitor” position, the electronic finder mode (also referred to as “EVF mode” or “live view mode”) is selected, and a live view image of the subject is displayed on the LCD 311 as a moving image. The user can then perform framing by viewing the live view image displayed on the LCD 311.
  As described above, the user can switch the finder mode by operating the display changeover switch 85, and the composition of the subject can be determined in the imaging apparatus 1 using either the electronic finder, on which the live view display is performed, or the optical finder.
[Internal Configuration of Imaging Device 1]
Next, the internal configuration of the imaging apparatus 1 will be described. FIGS. 3 and 4 are longitudinal sectional views of the imaging apparatus 1.
  As shown in FIG. 3, the interior of the camera body 10 is provided with an image sensor 101, a finder unit (also referred to as a “finder optical system”) 102, a mirror mechanism 8, a phase difference AF module (also simply referred to as an “AF module”) 107, and the like.
  The imaging element 101 is disposed on the optical axis LT of the photographing lens unit 2, perpendicular to the optical axis LT, when the photographing lens unit 2 is attached to the camera body 10. As the image sensor 101, for example, a CMOS color area sensor (CMOS-type image sensor) is used in which a plurality of pixels configured with photodiodes are two-dimensionally arranged in a matrix. The image sensor 101 generates analog electrical signals (image signals) of the R (red), G (green), and B (blue) color components of the subject image formed through the photographing lens unit 2, and outputs them as R, G, and B image signals.
  A shutter unit 40 is disposed on the imaging surface side of the imaging element 101. The shutter unit 40 includes a curtain body that moves in the vertical direction, and is configured as a mechanical focal plane shutter that performs an optical path opening operation and an optical path blocking operation of subject light guided to the image sensor 101 along the optical axis LT. The shutter unit 40 can be omitted when the image sensor 101 has a function as a complete electronic shutter.
  As shown in FIG. 3, a mirror mechanism 8 is provided on the optical path (also referred to as “imaging optical path”) from the imaging lens unit 2 to the image sensor 101.
  The mirror mechanism 8 has a main mirror 81 (main reflection surface) that reflects light from the photographing optical system upward. For example, a part or all of the main mirror 81 is configured as a half mirror, and transmits a part of light from the photographing optical system. The mirror mechanism 8 also includes a sub mirror 82 (sub reflective surface) that reflects light transmitted through the main mirror 81 downward.
  Further, the mirror mechanism 8 is configured as a so-called quick return mirror, and the posture can be switched between a mirror-down state and a mirror-up state.
  Specifically, until the shutter button 307 is fully pressed (S2) in the shooting mode, in other words while the composition is being determined, the mirror mechanism 8 is placed in the mirror-down state (see FIG. 3). In the mirror-down state, the subject light from the photographing lens unit 2 is reflected upward by the main mirror 81, enters the finder optical system 102 as an observation light beam, and is guided to the finder window 316 along the finder optical path PA.
  The viewfinder optical system 102 includes a pentaprism 105, an eyepiece lens 106, an eyepiece shutter 108, a photometric element 109, and a viewfinder window 316.
  The pentaprism 105 has a pentagonal cross section and converts the subject image incident from its lower surface into an erect image by internal reflection, inverting it top-to-bottom and left-to-right.
  The eyepiece 106 guides the subject image made erect by the pentaprism 105 through the finder window 316 to the outside.
  The eyepiece shutter 108 is provided between the eyepiece 106 and the finder window 316 and functions as light-shielding (shutter) means that can switch between a light-shielding state, in which outside light entering the imaging apparatus 1 from the finder window 316 is blocked, and a non-light-shielding state, in which it is not. Details will be described later.
  The photometric element 109 is disposed above the pentaprism 105, receives a part of the observation light beam guided to the finder optical system 102, and generates (outputs) a photometric signal corresponding to the brightness on the subject side, that is, the brightness of the subject (also referred to as “subject brightness”).
  The finder optical system 102 having the above-described members functions as an optical finder for confirming the object field at the time of shooting standby before actual shooting.
  A part of the subject light is transmitted through the main mirror 81, reflected downward by the sub mirror 82, and guided to the AF module 107.
  The AF module 107 is configured by a line sensor or the like that detects focus information of a subject, and functions as a so-called AF sensor. The AF module 107 is disposed at the bottom of the mirror mechanism 8 and has a phase difference detection function for generating a phase difference detection signal corresponding to the degree of focus of the subject image. That is, in the mirror-down state at the time of shooting standby, the phase difference detection signal is output from the AF module 107 based on the subject light guided to the AF module 107.
  On the other hand, when the shutter button 307 is fully pressed (S2), the mirror mechanism 8 is driven into the mirror-up state (see FIG. 4), and the exposure operation is started.
  Specifically, as shown in FIG. 4, at the time of exposure the mirror mechanism 8 pivots upward about the rotating shaft 83 and retracts from the photographing optical path. That is, the main mirror 81 and the sub mirror 82 are retracted upward so as not to block the light from the photographing optical system, and the light from the photographing lens unit 2 reaches the image sensor 101 in accordance with the opening timing of the shutter unit 40. The image sensor 101 generates an image signal of the subject image from the received light flux by photoelectric conversion. In this way, the light from the subject is guided to the image sensor 101 via the photographing lens unit 2, and a photographed image (photographed image data) of the subject is obtained.
[Electrical Configuration of Imaging Device 1]
FIG. 5 is a block diagram illustrating an electrical configuration of the imaging apparatus 1. In FIG. 5, the same members as those shown in FIGS. 1 to 4 are denoted by the same reference numerals. For convenience of explanation, the electrical configuration of the photographic lens unit 2 will be described first.
  The photographic lens unit 2 includes a lens driving mechanism 24, a lens position detection unit 25, a lens control unit 26, and an aperture driving mechanism 27 in addition to the lens group 21 constituting the above-described photographic optical system.
  The lens group 21 holds the focus lens 211, the zoom lens 212, and a diaphragm 23 for adjusting the amount of light incident on the image sensor 101 in the lens barrel along the optical axis LT, and forms the captured subject light into an image on the image sensor 101. In automatic focusing (AF) control, the focus lens 211 is driven in the direction of the optical axis LT by an AF actuator 71M in the photographing lens unit 2 to perform focus adjustment.
  The focus drive control unit 71A generates, based on an AF control signal given from the overall control unit 62 via the lens control unit 26, the drive control signal necessary for moving the focus lens 211 to the in-focus position, and controls the AF actuator 71M with that signal. The AF actuator 71M is composed of a stepping motor or the like and applies a lens driving force to the lens driving mechanism 24.
  The lens driving mechanism 24 includes, for example, a helicoid and a gear (not shown) that rotates the helicoid, and drives the focus lens 211 and the like in a direction parallel to the optical axis LT upon receiving the driving force of the AF actuator 71M. The moving direction and moving amount of the focus lens 211 correspond to the rotation direction and rotation speed of the AF actuator 71M, respectively.
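The proportionality just described, between motor rotation and lens travel, can be sketched as follows; the step pitch and the sign convention are illustrative assumptions, not values from the patent.

```python
STEP_PITCH_UM = 2.0  # assumed lens travel per motor step (micrometres)

def focus_displacement(steps: int, direction: int) -> float:
    """Signed lens travel in micrometres; direction is +1 or -1
    (which sign maps to the near vs infinity side is illustrative)."""
    if direction not in (+1, -1):
        raise ValueError("direction must be +1 or -1")
    return direction * steps * STEP_PITCH_UM

print(focus_displacement(150, +1))  # → 300.0
```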
  The lens position detection unit 25 includes an encoder plate on which a plurality of code patterns are formed at a predetermined pitch in the optical axis LT direction within the movement range of the lens group 21, and an encoder brush that moves integrally with the lens while in sliding contact with the encoder plate, and it detects the amount of movement of the lens group 21 during focus adjustment. The lens position detected by the lens position detection unit 25 is output as, for example, a number of pulses.
  The lens control unit 26 is composed of, for example, a microcomputer with built-in memory, such as a ROM that stores a control program and a flash memory that stores data on status information.
  The lens control unit 26 has a communication function for communicating with the overall control unit 62 of the camera body 10 via the connector Ec. This allows, for example, status information data such as the focal length, aperture value, focusing distance, and peripheral light amount state of the lens group 21, as well as the position information of the focus lens 211 detected by the lens position detection unit 25, to be transmitted to the overall control unit 62. Conversely, for example, data on the driving amount of the focus lens 211 can be received from the overall control unit 62.
  The aperture drive mechanism 27 receives the driving force from the aperture drive actuator 76M via the coupler 75 and changes the aperture diameter of the aperture 23.
  Next, the electrical configuration of the camera body 10 will be described. In addition to the above-described imaging device 101 and shutter unit 40, the camera body 10 includes an AFE (analog front end) 5, an image processing unit 61, an image memory 614, an overall control unit 62, a flash circuit 63, an operation unit 64, a VRAM 65, a card I/F 66, a memory card 67, a communication I/F 68, a power supply circuit 69, a battery 69B, a mirror drive control unit 72A, a shutter drive control unit 73A, and an aperture drive control unit 76A.
  As described above, a CMOS color area sensor is employed as the image sensor 101, and its operations, such as the start and end of exposure, output selection of each pixel, and readout of pixel signals (charge signals), are controlled by a shooting control unit 621 described later.
  The AFE 5 supplies timing pulses that cause the image sensor 101 to perform predetermined operations, performs predetermined signal processing on the image signal output from the image sensor 101, converts it into a digital signal, and outputs it to the image processing unit 61. The AFE 5 includes a timing control circuit 51, a signal processing unit 52, an A/D conversion unit 53, and the like.
  The timing control circuit 51 generates predetermined timing pulses (pulses for generating a vertical scanning pulse φVn, a horizontal scanning pulse φVm, a reset signal φVr, and the like) based on the reference clock output from the overall control unit 62, outputs them to the image sensor 101, and thereby controls the operation of the image sensor 101.
  For example, when live view display is performed, the timing control circuit 51 generates a pulse-wave (square-wave) timing control signal (also simply referred to as a “control signal”) NS corresponding to the vertical synchronization signal (VD) that defines the frame period of live view display (see FIG. 6 described later). During live view display, exposure for live view image acquisition is performed repeatedly according to each pulse of the control signal NS, which is generated at a constant cycle (also referred to as the “reference cycle”) (for example, 1/60 second) corresponding to the frame period of the live view display.
  Further, for example, at the time of exposure for photometry, the pulse period of the control signal NS is changed, and a photometry control signal NSL including signals whose periods (for example, 1/480 second) differ from the reference period is generated; exposure for photometry is executed based on the signal NSL.
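The two signal schedules described above can be sketched as follows. The 1/60 s reference period and the 1/480 s photometric period come from the text; treating each NSL pulse interval as exactly the corresponding exposure time is a simplifying assumption.

```python
REFERENCE_PERIOD = 1 / 60  # live-view frame period (s)

def ns_pulse_times(n_frames: int) -> list:
    """Pulse times of the constant-period live-view control signal NS."""
    return [i * REFERENCE_PERIOD for i in range(n_frames)]

def nsl_pulse_times(exposure_times: list) -> list:
    """Pulse times of the photometry control signal NSL: each interval
    is stretched or shrunk to match the exposure it triggers."""
    t, times = 0.0, [0.0]
    for exp in exposure_times:
        t += exp  # next pulse fires as soon as this exposure is done
        times.append(t)
    return times

print(nsl_pulse_times([1/480, 1/60]))
```

A short photometric exposure thus advances the next pulse by only 1/480 s rather than a full 1/60 s frame.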
  Further, the timing control circuit 51 controls the operations of the signal processing unit 52 and the A / D conversion unit 53 by outputting predetermined timing pulses to the signal processing unit 52 and the A / D conversion unit 53, respectively.
  The signal processing unit 52 performs predetermined analog signal processing on the analog image signal output from the image sensor 101, and includes a CDS (correlated double sampling) circuit, an AGC (auto gain control) circuit, a clamp circuit, and the like. The A/D conversion unit 53 converts the analog R, G, and B image signals output from the signal processing unit 52 into digital image signals of a plurality of bits (for example, 12 bits) based on the timing pulses output from the timing control circuit 51.
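The 12-bit conversion performed by the A/D conversion unit 53 can be sketched as a simple quantisation step; the analog full-scale level here is an assumed value, not one given in the text.

```python
FULL_SCALE_V = 1.0  # assumed analog full-scale input level
BITS = 12           # bit depth stated in the text

def quantize(voltage: float) -> int:
    """Map an analog level in [0, FULL_SCALE_V] to a 12-bit code."""
    v = min(max(voltage, 0.0), FULL_SCALE_V)  # clamp out-of-range input
    return round(v / FULL_SCALE_V * (2**BITS - 1))

print(quantize(1.0))  # → 4095
```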
  The image processing unit 61 performs predetermined signal processing on the image data output from the AFE 5 to create an image file, and includes a black level correction circuit 611, a white balance control circuit 612, a gamma correction circuit 613, and the like. The image data taken into the image processing unit 61 is temporarily written to the image memory 614 in synchronization with the readout of the image sensor 101; thereafter, the image data written in the image memory 614 is accessed and processed in each block of the image processing unit 61.
  The black level correction circuit 611 corrects the black level of each of the R, G, and B digital image signals A / D converted by the A / D conversion unit 53 to a reference black level.
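A minimal sketch of the black-level correction just described, assuming 12-bit samples and an illustrative reference black level (the patent gives no specific value):

```python
BLACK_LEVEL = 64  # assumed reference black level in 12-bit codes

def correct_black_level(samples: list) -> list:
    """Subtract the reference black level from each R, G, or B sample,
    clamping negative results to zero."""
    return [max(s - BLACK_LEVEL, 0) for s in samples]

print(correct_black_level([60, 64, 300, 4095]))  # → [0, 0, 236, 4031]
```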
  The white balance control circuit 612 performs level conversion (white balance (WB) adjustment) of the digital signals of the R (red), G (green), and B (blue) color components based on a white reference corresponding to the light source. Specifically, based on WB adjustment data provided from the overall control unit 62, the white balance control circuit 612 identifies, from luminance and saturation data, a portion of the photographic subject estimated to be originally white, obtains the averages of the R, G, and B color components of that portion together with the G/R ratio and the G/B ratio, and corrects the levels using these ratios as R and B correction gains.
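The gain computation described above can be sketched as follows, with illustrative channel averages for the region estimated to be white (the numbers are assumptions):

```python
def wb_gains(r_avg: float, g_avg: float, b_avg: float):
    """Return the (R gain, B gain) pair: the G/R and G/B ratios of the
    region estimated to be white."""
    return g_avg / r_avg, g_avg / b_avg

def apply_wb(pixel, gains):
    """Scale the R and B components of one (R, G, B) pixel by the gains."""
    r_gain, b_gain = gains
    r, g, b = pixel
    return r * r_gain, g, b * b_gain

gains = wb_gains(r_avg=200.0, g_avg=220.0, b_avg=176.0)
print(apply_wb((200.0, 220.0, 176.0), gains))  # the white region becomes neutral
```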
  The gamma correction circuit 613 corrects the gradation characteristics of the image data subjected to WB adjustment. Specifically, the gamma correction circuit 613 performs non-linear conversion of the image data level for each color component and offset adjustment using a preset gamma correction table.
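A minimal sketch of the table-based gamma correction just described; the gamma value of 2.2 and the 12-bit-in / 8-bit-out ranges are assumptions, not values from the patent.

```python
GAMMA = 2.2  # assumed display gamma

# Preset lookup table mapping 12-bit linear samples to 8-bit outputs,
# mirroring the "preset gamma correction table" in the text.
TABLE = [round(255 * (i / 4095) ** (1 / GAMMA)) for i in range(4096)]

def gamma_correct(sample_12bit: int) -> int:
    """Non-linear level conversion of one 12-bit sample via the table."""
    return TABLE[sample_12bit]

print(gamma_correct(0), gamma_correct(4095))
```

A lookup table trades a little memory for avoiding a per-pixel power computation, which is why hardware pipelines typically take this form.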
  The image memory 614 temporarily stores the image data output from the image processing unit 61 in the shooting mode and is used as a work area when the overall control unit 62 performs predetermined processing on the image data. In the playback mode, it temporarily stores image data read from the memory card 67.
  The overall control unit 62 is configured as a microcomputer and mainly includes a CPU, a RAM 62A, a ROM 62B, and the like. The overall control unit 62 implements various functions of the imaging apparatus 1 by reading a program stored in the ROM 62B and executing the program by the CPU.
  By executing the above-described program, the overall control unit 62 functionally realizes the shooting control unit 621, the signal adjustment unit 622, the exposure control unit 623, the display control unit 624, and the phase difference AF control unit 625.
  The shooting control unit 621 has a function of controlling various shooting operations for acquiring a shot image related to the subject image.
  For example, when the finder mode is the live view mode, the shooting control unit 621 sequentially executes exposure of the image sensor 101 according to the control signal NS.
  In addition, when the live view mode is selected, the shooting control unit 621 also causes the image sensor 101 to execute exposure for acquiring the subject luminance before the actual shooting (referred to as “exposure for photometry”, or simply “photometric exposure”). In photometric exposure, exposures with different exposure conditions are executed in multiple steps. Details will be described later.
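The idea behind multi-step photometric exposure can be sketched as follows: each exposure covers a different slice of the luminance range, and a reading that is neither saturated nor buried in noise is scaled back to a common luminance estimate. The thresholds and exposure times below are assumptions, not values from the patent.

```python
SAT, FLOOR = 4000, 64  # assumed usable 12-bit signal range

def usable(signal: int) -> bool:
    """True if the signal is neither clipped nor lost in the noise floor."""
    return FLOOR < signal < SAT

def luminance_from_exposures(readings):
    """readings: list of (exposure_time_s, mean_signal) pairs, one per
    photometric exposure. Use the first usable reading and normalise it
    by its exposure time to get a luminance estimate."""
    for t, s in readings:
        if usable(s):
            return s / t  # signal per second ~ scene luminance
    return None  # no exposure landed inside the usable range

# Bright scene: the long exposure clips, but the short one is usable.
print(luminance_from_exposures([(1/60, 4095), (1/480, 3200)]))
```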
  In the imaging apparatus 1, a photometric signal is acquired by the photometric element 109 in a state in which subject light is guided to the finder optical system 102, that is, in the OVF mode. On the other hand, in a state where the subject light is not guided to the finder optical system 102, that is, in the live view mode, a photometric signal is acquired by photometric exposure by the image sensor 101.
  The signal adjustment unit 622 has a function of adjusting the cycle (output interval) of the signals generated by the timing control circuit 51. For example, for live view display, the signal adjustment unit 622 causes the timing control circuit 51 to output the control signal NS with the constant period (the “reference period”) corresponding to the frame period of live view display.
  In addition, the signal adjustment unit 622 functions as an optimization unit that optimizes the cycle of the control signal NS in accordance with the exposure time of each photometric exposure during the photometric operation in the live view mode, and causes the timing control circuit 51 to output the optimized control signal NS as a photometric control signal NSL.
  The exposure control unit 623 performs exposure control for adjusting the shutter speed and the aperture value.
  Specifically, the exposure control unit 623 has a photometric control function that acquires luminance information (a photometric value) of a subject based on a photometric signal acquired by the photometric element 109 or the image sensor 101. As the luminance information of the subject, a BV value expressing the subject luminance in the APEX system is acquired. Furthermore, the exposure control unit 623 has an exposure value determination function that determines an exposure value based on the acquired luminance information of the subject and adjusts the shutter speed and the aperture value based on the exposure value.
  The display control unit 624 controls display contents on the LCD 311. For example, in the live view mode, the display control unit 624 sequentially displays each of a plurality of images continuously acquired by the image sensor 101 on the LCD 311 as a live view image.
  The phase difference AF control unit 625 detects an in-focus position by a phase difference detection method and executes an automatic focusing operation (also referred to as a “phase difference AF operation”). Specifically, based on the phase difference detection signal acquired by the AF module 107, the phase difference AF control unit 625 performs an operation (also referred to as a “focus detection operation”) that specifies the position of the photographing lens (more specifically, the focus lens) at which the subject is in focus (the in-focus lens position).
  The flash circuit 63 controls the light emission amount of the external flash connected to the flash unit 318 or the connection terminal unit 319 to the light emission amount set by the overall control unit 62 in the flash photographing mode.
  The operation unit 64 includes the mode setting dial 305, the control value setting dial 306, the shutter button 307, the setting button group 312, the direction selection key 314, the main switch 317, and the like, and is used to input operation information to the overall control unit 62.
  The VRAM 65 has an image signal storage capacity corresponding to the number of pixels of the LCD 311 and is a buffer memory between the overall control unit 62 and the LCD 311.
  The card I / F 66 is an interface for enabling transmission / reception of signals between the memory card 67 and the overall control unit 62. The memory card 67 is a recording medium that stores image data generated by the overall control unit 62.
  The communication I / F 68 is an interface for enabling transmission of image data and the like to a personal computer or other external device.
  The power supply circuit 69 includes, for example, a constant voltage circuit, and generates voltages for driving the entire imaging apparatus 1, including the overall control unit 62, the imaging device 101, and the other various drive control units. The battery 69B is a primary battery such as an alkaline battery or a secondary battery such as a nickel-metal hydride rechargeable battery, and is a power source that supplies power to the entire imaging apparatus 1.
  The mirror drive control unit 72A generates a drive signal for driving the mirror drive actuator 72M in accordance with the switching of the finder mode or the timing of the photographing operation. The mirror drive actuator 72M is an actuator that rotates the mirror mechanism 8 to a horizontal posture or an inclined posture.
  The shutter drive control unit 73A generates a drive control signal for the shutter drive actuator 73M based on a signal from the overall control unit 62. The shutter drive actuator 73M is an actuator that opens and closes the shutter unit 40.
  The aperture drive control unit 76A generates a drive control signal for the aperture drive actuator 76M based on the signal from the overall control unit 62. The aperture driving actuator 76M applies a driving force to the aperture driving mechanism 27 via the coupler 75.
[About metering]
Next, the photometric operation executed when the live view mode is selected in the imaging apparatus 1 will be described in detail. FIG. 6 is a time chart for explaining the photometric operation in the live view mode in the imaging apparatus 1. FIG. 7 is an enlarged view of the exposure Pc in FIG. 6.
  When the live view mode is selected, the subject light is not guided to the finder optical system 102 and the photometric operation using the photometric element 109 cannot be executed. For this reason, the imaging apparatus 1 performs a photometric operation using the imaging element 101 in the live view mode.
  In the photometric operation using the image sensor 101, as shown in FIG. 6, three exposures (photometric exposures) Pa to Pc with different exposure conditions are sequentially performed over the entire imaging surface of the image sensor 101, one per cycle of the control signal NS generated by the timing control circuit 51.
  Specifically, in the imaging apparatus 1, an exposure (first exposure) Pa with a relatively short exposure time (for example, 1/64000 second) assuming a high-luminance subject (for example, a BV value of 8 to 15) is executed in the period (also referred to as the “pulse period” or “single pulse period”) Tf1 of the first pulse PS1. Then, after readout Ra of the charge signal generated in each pixel of the image sensor 101 by the first exposure Pa is executed in the pulse period Tf2 of the next pulse PS2, an AE calculation Ma based on the read signal is performed in the pulse period Tf3 of the following pulse PS3.
  Next, in the pulse period Tf2 of the second pulse PS2 after the live view mode is selected, an exposure (second exposure) Pb with a moderate exposure time (for example, 1/1000 second) assuming a medium-luminance subject (for example, a BV value of 2 to 9) is executed. Then, after readout Rb of the charge signal generated in each pixel of the image sensor 101 by the second exposure Pb is executed in the pulse period Tf3 of the next pulse PS3, an AE calculation Mb is performed in the pulse period Tf4 of the following pulse PS4.
  Finally, in the pulse period Tf3 of the third pulse PS3 after the live view mode is selected, an exposure (third exposure) Pc with a relatively long exposure time (for example, 1/60 second) assuming a low-luminance subject (for example, a BV value of (-4) to 3) is executed. Then, after readout Rc of the charge signal generated in each pixel of the image sensor 101 by the third exposure Pc is executed in the pulse period Tf4 of the next pulse PS4, an AE calculation Mc is performed in the pulse period Tf5 of the following pulse PS5.
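  The staggered schedule described above, in which each pulse period carries one step's exposure together with the previous step's readout and the AE calculation of the step before that, can be sketched as follows. This is an illustrative model of the timing chart only, not device firmware; the stage names (Pa to Pc, Ra to Rc, Ma to Mc) follow the text.

```python
# Model of the pipelined photometric schedule of FIG. 6: in pulse period
# Tf(n), the n-th exposure runs while the (n-1)-th readout and the
# (n-2)-th AE calculation are completed.
EXPOSURES = ["Pa", "Pb", "Pc"]

def schedule(exposures):
    """Return, for each pulse period, the stages executed in that period."""
    periods = []
    n = len(exposures)
    for t in range(n + 2):          # n exposures need n + 2 pulse periods
        stages = []
        if t < n:
            stages.append(exposures[t])                  # exposure P
        if 1 <= t <= n:
            stages.append("R" + exposures[t - 1][1])     # readout R
        if t >= 2:
            stages.append("M" + exposures[t - 2][1])     # AE calculation M
        periods.append(stages)
    return periods

for i, stages in enumerate(schedule(EXPOSURES), start=1):
    print(f"Tf{i}: {' + '.join(stages)}")
```

Running this reproduces the five-period sequence of FIG. 6: Tf1 carries Pa; Tf2 carries Pb and Ra; Tf3 carries Pc, Rb, and Ma; Tf4 carries Rc and Mb; Tf5 carries Mc.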
  In FIG. 6, the exposures Pa to Pc are represented by parallelograms; this indicates that the exposure starts later the lower the horizontal pixel line (also referred to as a “horizontal line”) of the image sensor 101. Specifically, as shown in FIG. 7, which enlarges the exposure Pc of FIG. 6, the vertical direction of the parallelogram represents the horizontal pixel line position in the image sensor 101, and the horizontal direction of the parallelogram represents the exposure time. For example, the upper side LU of the parallelogram indicates that exposure is being performed on the first pixel line (the uppermost horizontal pixel line) TP, and the lower side LD indicates that exposure is being performed on the last pixel line (the lowermost horizontal line) BM.
  That is, in the exposure of the imaging apparatus 1, every line from the first pixel line TP to the last pixel line BM is exposed uniformly for the same exposure time, while the exposure start timing (reset timing) and the exposure end timing (pixel signal readout timing) are delayed progressively from the first line TP to the last line BM. The exposure timing is shifted for each horizontal pixel line in this way because a CMOS sensor with a so-called rolling shutter function is used as the image sensor 101.
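  This rolling-shutter behavior can be sketched numerically: every line receives the same exposure time, but its start and end are offset by a per-line delay, which is why the exposures appear as parallelograms. The line-delay value below is an illustrative assumption, not a figure from the document.

```python
# Rolling-shutter sketch: constant exposure time per line, with reset
# (exposure start) and readout (exposure end) delayed line by line.
def line_timings(num_lines, exposure_s, line_delay_s):
    """Return (start, end) exposure times for each horizontal pixel line."""
    return [(i * line_delay_s, i * line_delay_s + exposure_s)
            for i in range(num_lines)]

t = line_timings(num_lines=4, exposure_s=1 / 60, line_delay_s=1e-5)
# Uniform exposure on every line...
assert all(abs((end - start) - 1 / 60) < 1e-12 for start, end in t)
# ...but each line starts (and therefore ends) later than the one above it.
assert all(t[i + 1][0] > t[i][0] for i in range(3))
```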
  By performing the exposures Pa to Pc with the three exposure-time patterns described above, the imaging apparatus 1, which photographs subjects of widely varying luminance, can cover the entire required photometric range. FIG. 8 is a diagram for explaining the three patterns of photometric exposure.
  For example, with an interchangeable lens having an F value (F-number) of 1.4, the exposure condition RJ1, set to ISO sensitivity 100 and an exposure time (shutter speed (ss)) of 1/64000 second as shown in FIG. 8, is the optimum exposure condition for subjects in the cross-hatched area. The photometric exposure Pa performed under the exposure condition RJ1 can place subjects with a BV value of 8 to 15 (parallel hatched portion) within the photometric range. The exposure condition RJ2, set to ISO sensitivity 100 and an exposure time of 1/1000 second, is the optimum exposure condition for a subject with a BV value of 7, and the photometric exposure Pb performed under the exposure condition RJ2 can place subjects with a BV value of 2 to 9 (parallel hatched portion) within the photometric range. The exposure condition RJ3, set to ISO sensitivity 400 and an exposure time of 1/60 second, is the optimum exposure condition for a subject with a BV value of 1, and the photometric exposure Pc performed under the exposure condition RJ3 can place subjects with a BV value of (-4) to 3 (parallel hatched portion) within the photometric range.
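  The BV values above can be cross-checked with the standard APEX relation Av + Tv = Bv + Sv (so Bv = Av + Tv - Sv). The document's exact metering calibration is not given and may round differently, so this sketch only verifies that each exposure condition's computed Bv lands inside the photometric range stated in the text.

```python
import math

# APEX sketch: Av = 2*log2(F-number), Tv = log2(1/t), Sv = log2(ISO/3.125).
# Bv = Av + Tv - Sv.  Calibration details of the actual device may differ.
def bv(f_number, exposure_s, iso):
    av = 2 * math.log2(f_number)
    tv = math.log2(1 / exposure_s)
    sv = math.log2(iso / 3.125)
    return av + tv - sv

conditions = {          # (F-number, exposure time, ISO) -> stated Bv range
    "RJ1": ((1.4, 1 / 64000, 100), (8, 15)),
    "RJ2": ((1.4, 1 / 1000, 100), (2, 9)),
    "RJ3": ((1.4, 1 / 60, 400), (-4, 3)),
}
for name, (params, (lo, hi)) in conditions.items():
    value = bv(*params)
    assert lo <= value <= hi, name
    print(f"{name}: Bv = {value:.1f} (stated range {lo}..{hi})")
```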
  As described above, by performing three types of photometric exposures with different exposure conditions, it is possible to perform photometry that covers the entire photometric range necessary for photographing a subject.
  Further, in the imaging apparatus 1 of the present embodiment, the pulse period of the control signal NS, in other words, the pulse width of the control signal NS, is optimized according to the exposure time of each photometric exposure Pa to Pc, which shortens the time required for the photometric operation.
  Specifically, since the exposure time of the first exposure Pa is shorter than the reference period corresponding to the frame period of live view display, the imaging apparatus 1 sets the pulse period Tf1 of the pulse PS1 that defines the first exposure Pa shorter than the reference period. Note that a pulse period of the control signal NS set shorter than the reference period is also referred to as a short period.
  Similarly, since the exposure time of the second exposure Pb is shorter than the reference period, the imaging apparatus 1 sets the pulse period Tf2 of the pulse PS2 that defines the second exposure Pb shorter than the reference period.
  By optimizing, according to the exposure time, the pulse period of the control signal NS that governs the execution of the photometric exposures in this way, the photometric operation using the image sensor can be shortened and performed at high speed.
  In the imaging apparatus 1, to realize the above-described optimization of the pulse period of the control signal NS according to the exposure time, the readout processing of the charge signal accumulated in the imaging element 101 is sped up.
  Specifically, as described above, the series of processes relating to photometric exposure, that is, the exposure, the readout processing, and the AE calculation processing, are each executed within one cycle of the control signal NS; the readout processing and the AE calculation processing for a single exposure must each be completed within one cycle of the control signal NS. Therefore, in the imaging apparatus 1 of the present embodiment, even when the pulse period of the control signal NS is shortened, the readout processing for the photometric exposure is sped up so that it is completed within one cycle of the control signal NS.
  The speedup of the readout processing is realized by having the shooting control unit 621 limit the pixels from which signals are read (also referred to as “readout pixels”), thereby reducing the data transfer amount. FIG. 9 is an enlarged view of the signal transmission paths in the partial region RK1 of the image sensor 101. FIG. 10 is a diagram showing the readout region RK2 in the image sensor 101.
  As a method for limiting the readout pixels, for example, a method of thinning out the signal readout lines (readout lines) can be employed. Specifically, as shown in FIG. 9, among the vertical readout lines serving as signal transmission paths, readout lines GR1 and non-readout lines GR2 are set alternately, and signals are read out from every other line.
  Further, as a method for limiting the readout pixels, a method of reducing (limiting) the signal readout region within the light receiving region of the image sensor 101 can be employed. Specifically, instead of reading out the signals acquired by all the pixels in the light receiving region of the image sensor 101, only the signals acquired by the pixels contained in the central region RK2 (enclosed by a broken line in FIG. 10) of the light receiving region may be read out.
  Further, as a readout pixel limiting method, a method combining the above-described thinning and region limitation may be employed. That is, after the read area is limited, the read lines may be thinned out in the read area.
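  The three limitation methods above, thinning, region limitation, and their combination, can be sketched on a dummy frame as follows. The 16x16 frame size and the half-size central crop are illustrative choices, not figures from the document; the point is only how each method reduces the data transfer amount.

```python
# Readout-pixel limiting sketch on a dummy 16x16 "sensor frame".
frame = [[r * 16 + c for c in range(16)] for r in range(16)]

def pixel_count(lines):
    """Number of pixel values that would be transferred."""
    return sum(len(line) for line in lines)

thinned = frame[::2]                              # keep every other line (GR1)
center = [row[4:12] for row in frame[4:12]]       # central region (like RK2)
combined = center[::2]                            # region limit, then thinning

assert pixel_count(thinned) == pixel_count(frame) // 2
assert pixel_count(center) == pixel_count(frame) // 4
assert pixel_count(combined) == pixel_count(frame) // 8
```

Halving the lines halves the transfer; a half-size central crop quarters it; combining both reduces it to one eighth, which is what lets the readout complete within a shortened pulse period.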
  As described above, in the imaging apparatus 1, the readout pixels are limited by thinning out the readout lines and/or limiting the readout region, thereby speeding up the readout processing for the photometric exposures. Note that the readout pixels are limited only to an extent that does not affect the photometric accuracy.
  Here, the photometric exposure of the imaging apparatus 1 according to the present embodiment is compared with the photometric exposure of the imaging apparatus 1F according to the comparative example in which the pulse period of the control signal NS is not optimized according to the exposure time. FIG. 11 is a time chart for explaining the photometric operation in the live view mode in the imaging apparatus 1F.
  As shown in FIG. 11, the photometric exposures of the imaging apparatus 1F according to the comparative example are performed in synchronization with a control signal NS of the reference period corresponding to the frame period of live view display, so the photometric operation of the imaging apparatus 1F is executed over a period corresponding to five pulses of the reference period.
  On the other hand, as shown in FIG. 6, the photometric operation of the imaging apparatus 1 according to the present embodiment is executed over a period combining four pulses of a short period shorter than the reference period and one pulse of the reference period. Compared with the imaging apparatus 1F according to the comparative example, the time required for the photometric operation can therefore be reduced by the amount the pulse periods are shortened.
  In other words, in the imaging apparatus 1 of the present embodiment, when a photometric operation using the imaging element 101 is performed, a short-period control signal NS is generated as a rule, and a control signal with a longer period (here, the reference period) is generated only when an exposure with an exposure time longer than the short period is performed. That is, when the photometric operation is performed in the live view mode, the shooting control unit 621 speeds up the readout processing, and the signal adjustment unit 622 generates the short-period control signal NS by default, switching to the long-period control signal NS only when an exposure with a long exposure time is performed.
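  The time saving of the embodiment over the comparative example can be sketched with concrete numbers. The reference period is taken here as a 60 fps live-view frame period and the short period as 1/240 second; both are illustrative assumptions, since the document does not give absolute period lengths.

```python
# Duration comparison: FIG. 11 (comparative example) vs. FIG. 6 (embodiment).
REFERENCE_PERIOD = 1 / 60        # assumed live-view frame period
SHORT_PERIOD = 1 / 240           # assumed shortened pulse period

# Comparative example: five pulses of the reference period.
t_comparative = 5 * REFERENCE_PERIOD
# Embodiment: four short pulses plus one reference-period pulse (per FIG. 6).
t_embodiment = 4 * SHORT_PERIOD + 1 * REFERENCE_PERIOD

assert t_embodiment < t_comparative
print(f"saved: {(t_comparative - t_embodiment) * 1000:.1f} ms")
```

Under these assumed values the photometric operation drops from about 83 ms to about 33 ms, saving roughly 50 ms per half-press of the shutter button.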
[Operation of Imaging Device 1]
Next, the operation of the imaging apparatus 1 will be described. 12 and 13 are flowcharts of the photographing operation of the imaging apparatus 1.
  When a shooting mode is selected by a dial operation using the mode setting dial 305 of the imaging apparatus 1, the shooting operation shown in FIGS. 12 and 13 is executed.
  Specifically, as shown in FIG. 12, first, in step SP1, it is determined whether or not the EVF mode is selected by operating the display changeover switch 85. If it is determined that the EVF mode is not selected, the process proceeds to step SP21 (FIG. 13), and the composition determining operation in the OVF mode can be executed. Each process after step SP21 will be described later.
  On the other hand, if it is determined in step SP1 that the EVF mode is selected, the process proceeds to step SP2, and the composition determination operation in the EVF mode can be executed.
  Specifically, in step SP2, the mirror unit 103 is driven, and the mirror unit 103 is retracted from the optical path of the subject light passing through the interchangeable lens 2.
  In step SP3, the shutter unit 40 is opened, and in step SP4, the image sensor 101 is activated, and an image signal can be generated by exposure.
  In step SP5, power supply to the LCD 311 is started, and display of a live view image on the LCD 311 is started by the display control unit 624 based on image signals sequentially generated by the image sensor 101.
  In the next step SP6, it is determined whether or not the finder mode has been switched. Specifically, the contact position of the display changeover switch 85 is detected, and when the display changeover switch 85 is set to the OVF mode (the contact position is “optical”), the process proceeds to step SP11 and the finder mode is changed (transitioned) from the EVF mode to the OVF mode. Details will be described later.
  On the other hand, when the display changeover switch 85 is set to the EVF mode (the contact position is “monitor”), the process proceeds to step SP7.
  In step SP7, it is detected whether or not the shutter button 307 is half pressed. When the half-pressed state is not detected, the process proceeds to step SP5, and the process of step SP5 is executed again. If a half-pressed state is detected, the process proceeds to step SP8.
  In step SP8, a photometric operation using the image sensor 101 is performed. Specifically, the photometric exposures Pa to Pc are performed under the three patterns of exposure conditions, and the photometric result (here, a photometric value) is obtained using the optimum exposure data among the exposure data (exposure results) acquired by the photometric exposures Pa to Pc. As a method for determining the optimum exposure data, for example, the average brightness value of each set of exposure data is calculated, and the exposure data whose average brightness is closest to the median pixel output (128 when the maximum pixel output is 255) is determined to be the optimum exposure data.
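  The optimum-exposure selection just described can be sketched directly: average each exposure's pixel data and pick the exposure whose average lies closest to the mid-tone value 128. The dummy pixel values are illustrative only.

```python
# Sketch of the optimum-exposure-data selection of step SP8.
def select_optimum(exposure_data, median=128):
    """exposure_data: list of flat pixel-value lists, one per exposure."""
    averages = [sum(d) / len(d) for d in exposure_data]
    best = min(range(len(averages)), key=lambda i: abs(averages[i] - median))
    return best, averages

pa = [20] * 16     # short exposure on a dim scene: mostly dark
pb = [120] * 16    # near mid-tone: this one should be chosen
pc = [250] * 16    # long exposure: almost saturated

best, averages = select_optimum([pa, pb, pc])
assert best == 1   # Pb's average (120) is closest to the median value 128
```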
  In step SP9, it is detected whether or not the shutter button 307 is fully pressed. If the fully pressed state is not detected, the process proceeds to step SP5, and the process of step SP5 is executed again. When the fully pressed state is detected, the process proceeds to step SP10.
  In step SP10, shooting (exposure) is performed. Specifically, exposure by the image sensor 101 is started in a mirror-up state where subject light is incident on the image sensor 101. The image signal acquired by the image sensor 101 is subjected to predetermined image processing and recorded in the memory card 67 or the like.
  When the process of step SP10 ends, the process proceeds to step SP6, and the process of step SP6 is executed again.
  Next, a process when it is determined in step SP6 that the display changeover switch 85 is set to the OVF mode will be described.
  In this case, the process proceeds to step SP11, and the finder mode transitions (mode transition) to the OVF mode.
  Specifically, in step SP11, the mirror unit 103 is driven, and the mirror unit 103 enters a mirror-down state arranged in the optical path of the subject light.
  In the next step SP12, the shutter unit 40 is closed, and in step SP13, the image sensor 101 is stopped. In step SP14, the LCD 311 is turned off, and the process proceeds to step SP22 (FIG. 13).
  After the transition to step SP22, when switching of the finder mode is detected in step SP21 without the shutter button 307 being operated, the process proceeds to step SP2 and the finder mode transitions from the OVF mode to the EVF mode (described later).
  As described above, in the EVF mode, a photometric operation using the image sensor 101 is performed, and a photometric value is acquired.
  Next, a case will be described in which it is determined in step SP1 that the EVF mode is not selected (OVF mode is selected) by operating the display changeover switch 85, and the process proceeds to step SP21 (FIG. 13).
  In this case, first, in step SP21, as in step SP6 described above, the contact position of the display changeover switch 85 is detected and it is determined whether or not the finder mode has been switched. If the display changeover switch 85 is set to the OVF mode (the contact position is “optical”), it is determined that the finder mode has not been switched and the process proceeds to step SP22; if the switch is set to the EVF mode (the contact position is “monitor”), the process proceeds to step SP2.
  In the next step SP22, as in step SP7 described above, it is detected whether or not the shutter button 307 is half pressed. When the half-pressed state is not detected, the process proceeds to step SP21, and the process of step SP21 is executed again. On the other hand, when the half-pressed state is detected, the process proceeds to step SP23.
  In step SP23, the phase difference AF operation by the AF module 107 is performed.
  In the next step SP24, it is detected whether or not the half-pressed state of the shutter button 307 is released. When the release of the half-pressed state is detected, the process proceeds to step SP21, and the process of step SP21 is executed again. On the other hand, if the release of the half-pressed state is not detected, the process proceeds to step SP25.
  In step SP25, as in step SP9 described above, it is detected whether or not the shutter button 307 is fully pressed. If the fully pressed state of the shutter button 307 is not detected, the process proceeds to step SP24, and the process of step SP24 is executed again. On the other hand, when the fully-pressed state of the shutter button 307 is detected, the process proceeds to step SP10 described above and photographing is performed.
  As described above, in the imaging apparatus 1, when the live view mode is selected, a photometric operation using the imaging element 101 is executed in response to the shutter button 307 being half pressed. In the photometric operation, three photometric exposures with different exposure conditions are performed, and the pulse period of the control signal NS is optimized according to the exposure time of each photometric exposure. Since wasted time is thereby eliminated from the photometric operation, the photometric operation can be performed efficiently in terms of time.
<Modification>
Although the embodiments of the present invention have been described above, the present invention is not limited to the contents described above.
  For example, in the above embodiment, when the live view mode is selected, the photometric operation using the image sensor 101 is executed in response to half-pressing of the shutter button 307, but the present invention is not limited to this.
  Specifically, the photometric operation using the image sensor 101 may be executed in response to the selection of the live view mode (in other words, the start of live view display) by operating the display changeover switch 85.
  Further, when the subject brightness changes suddenly during execution of live view display, a photometric operation using the image sensor 101 may be executed. Note that a sudden change in subject brightness can be detected by monitoring whether or not the brightness average value of image data continuously acquired by the image sensor 101 has changed by a predetermined amount or more during live view display.
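  The sudden-brightness-change check described above can be sketched as follows: during live view, the average brightness of each new frame is compared with that of the previous frame, and a new photometric operation is triggered when the change meets a threshold. The threshold value stands in for the "predetermined amount" and is an illustrative assumption.

```python
# Sketch of the sudden-brightness-change monitor for live view.
THRESHOLD = 32                   # assumed "predetermined amount" of change

def needs_remetering(prev_avg, new_avg, threshold=THRESHOLD):
    """True when the average brightness jumped by the threshold or more."""
    return abs(new_avg - prev_avg) >= threshold

# Frame-average brightness over time, e.g. a room light is switched on.
frame_averages = [118, 121, 119, 180, 178]
triggers = [needs_remetering(a, b)
            for a, b in zip(frame_averages, frame_averages[1:])]
assert triggers == [False, False, True, False]
```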
  Although the present invention has been described in detail, the above description is illustrative in all aspects, and the present invention is not limited thereto. It is understood that countless variations that are not illustrated can be envisaged without departing from the scope of the present invention.
DESCRIPTION OF SYMBOLS
1 Imaging apparatus
51 Timing control circuit
101 Imaging element
307 Shutter button
621 Shooting control unit
622 Signal adjustment unit
623 Exposure control unit
624 Display control unit
NS Control signal
NSL Photometric control signal
PS1 to PS5 Pulses
Pa to Pc Photometric exposures
Tf1 to Tf5 Pulse periods

Claims (6)

  1. An image sensor for generating an image signal related to a subject image;
    Optimization means for optimizing the period of a control signal for controlling the operation of the image sensor in accordance with the exposure time of the image sensor;
    Photographing control means for sequentially executing a plurality of exposures having different exposure times for each cycle of the control signal optimized by the optimization means;
    A photometric control means for performing photometry based on each image signal acquired by the imaging device in each of the plurality of exposures;
    An imaging apparatus comprising:
  2.   The imaging apparatus according to claim 1, wherein the imaging control means speeds up the readout processing by restricting the readout pixels of the image signal when reading out, from the image sensor within one cycle of the control signal after each exposure, each image signal acquired by each of the plurality of exposures.
  3.   The imaging apparatus according to claim 2, wherein the imaging control unit limits the readout pixels by thinning out readout lines in the imaging element.
  4.   The imaging apparatus according to claim 2, wherein the imaging control unit limits the readout pixels by reducing a readout area of the image signal in the imaging element.
  5. A photometric method comprising:
    a) optimizing the period of a control signal for controlling the operation of an image sensor according to the exposure time of the image sensor;
    b) sequentially executing a plurality of exposures having different exposure times, one for each cycle of the control signal optimized in step a); and
    c) performing photometry based on each image signal acquired by the image sensor in each of the plurality of exposures.
  6. A program for causing a computer built into an imaging apparatus to execute:
    a) optimizing the period of a control signal for controlling the operation of an image sensor according to the exposure time of the image sensor;
    b) sequentially executing a plurality of exposures having different exposure times, one for each cycle of the control signal optimized in step a); and
    c) performing photometry based on each image signal acquired by the image sensor in each of the plurality of exposures.
JP2009006289A 2009-01-15 2009-01-15 Imaging apparatus, photometry method and program Pending JP2010166281A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009006289A JP2010166281A (en) 2009-01-15 2009-01-15 Imaging apparatus, photometry method and program

Publications (1)

Publication Number Publication Date
JP2010166281A true JP2010166281A (en) 2010-07-29

Family

ID=42582107

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009006289A Pending JP2010166281A (en) 2009-01-15 2009-01-15 Imaging apparatus, photometry method and program

Country Status (1)

Country Link
JP (1) JP2010166281A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012257126A (en) * 2011-06-09 2012-12-27 Canon Inc Imaging device and method of controlling the same
JP2015133689A (en) * 2013-12-11 2015-07-23 株式会社リコー Imaging device, image forming apparatus, and drive control method of two-dimensional image sensor
US10904505B2 (en) 2015-05-01 2021-01-26 Duelight Llc Systems and methods for generating a digital image
US10924688B2 (en) 2014-11-06 2021-02-16 Duelight Llc Image sensor apparatus and method for obtaining low-noise, high-speed captures of a photographic scene
US10931897B2 (en) 2013-03-15 2021-02-23 Duelight Llc Systems and methods for a digital image sensor

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08163430A (en) * 1994-12-02 1996-06-21 Ricoh Co Ltd High speed fetch device for photometry data, exposure condition arithmetic unit and image pickup device

Similar Documents

Publication Publication Date Title
US8970759B2 (en) Digital camera
JP5919543B2 (en) Digital camera
JP4321579B2 (en) Imaging device
US8063944B2 (en) Imaging apparatus
US8675121B2 (en) Camera and camera system
JP4349407B2 (en) Imaging device
JP4884417B2 (en) Portable electronic device and control method thereof
US7817915B2 (en) Image taking system
JP3697256B2 (en) Imaging device and lens device
CN101540848B (en) Image pickup device and image pickup apparatus
US8218962B2 (en) Image-capturing apparatus
TWI383672B (en) Image capturing apparatus and image processing method
EP2317380B1 (en) Imaging apparatus and imaging apparatus control method
US8036521B2 (en) Image pickup apparatus and focus control method
US7822334B2 (en) Imaging device and in-focus control method
US7839448B2 (en) Camera apparatus having a plurality of image pickup elements
US7817201B2 (en) Control in a digital camera having a preview function
JP5003132B2 (en) Imaging device and imaging apparatus
JP2008187615A (en) Imaging device, imaging apparatus, control method, and program
US8274598B2 (en) Image capturing apparatus and control method therefor
US7920782B2 (en) Imaging device
US7609294B2 (en) Image pick-up apparatus capable of taking moving images and still images and image picking-up method
JP4466400B2 (en) Imaging apparatus and program thereof
JP5914886B2 (en) Imaging device
JP4315206B2 (en) Imaging system and imaging apparatus

Legal Events

Date Code Title Description
20111124 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
20121109 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
20121113 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
20130312 A02 Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)