WO2018181610A1 - Electronic device - Google Patents

Electronic device

Info

Publication number
WO2018181610A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
image
unit
area
region
Prior art date
Application number
PCT/JP2018/013036
Other languages
English (en)
Japanese (ja)
Inventor
昌也 高橋
敏之 神原
直樹 關口
孝 塩野谷
Original Assignee
株式会社ニコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ニコン
Publication of WO2018181610A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B5/00Adjustment of optical system relative to image or object surface other than for focusing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith

Definitions

  • the present invention relates to an electronic device.
  • An imaging device equipped with an image processing technique for generating an image based on a signal from an imaging element is known (see Patent Document 1). Conventionally, improvement of image quality has been required.
  • According to one aspect, the electronic device includes an imaging element having a plurality of imaging areas for capturing a subject image; a changing unit that changes the imaging condition set in a second imaging area when the imaging element moves relative to the subject image and the subject image moves across the plurality of imaging areas; and a generating unit that generates an image of the subject image captured under the imaging condition changed by the changing unit.
  • According to another aspect, the electronic device includes an imaging element having a plurality of imaging areas for capturing a subject image; a changing unit capable of changing the imaging condition set in a second imaging area when the imaging element moves relative to the subject image and the subject image moves across the plurality of imaging areas; and a generating unit that generates an image of the subject image captured under the changed imaging condition.
  • FIG. 7A is a diagram illustrating a recording image
  • FIG. 7B is a diagram illustrating a detection image. FIG. 8 is a diagram illustrating subject areas on the imaging screen.
  • FIGS. 12A and 12B are schematic diagrams explaining image blur correction.
  • FIGS. 13A to 13E are schematic diagrams explaining image blur correction. A diagram illustrating a kernel. A diagram illustrating the positions of focus detection pixels on the imaging surface. An enlarged view of a partial area of the imaging surface.
  • FIG. 18A is a diagram illustrating a template image representing an object to be detected
  • FIG. 18B is a diagram illustrating a monitor image and a search range. A flowchart explaining the flow of processing according to the first embodiment.
  • A flowchart illustrating the flow of image blur correction processing.
  • FIG. 21A is a diagram illustrating still images
  • FIGS. 21B and 21C are diagrams illustrating an overview of still-image blur correction according to the second embodiment. A diagram illustrating the imaging timing of the recording image and the display timing of the blur-corrected image.
  • A flowchart illustrating the flow of image blur processing.
  • FIG. 27A is a still image based on the first recording image
  • FIG. 27B is a still image based on the second recording image.
  • A block diagram illustrating the configuration of an imaging system according to Modification 2. A diagram explaining the supply of the program to a mobile device.
  • the digital camera 1 in FIG. 1 (hereinafter referred to as camera 1) is an example of an electronic device.
  • the camera 1 may be an interchangeable lens camera or a lens-integrated camera.
  • A camera mounted in a portable terminal such as a smartphone may also be used.
  • The camera 1 may also be configured as an imaging device such as a video camera or a mobile camera.
  • the camera 1 is equipped with an image sensor 32a as an example of an image sensor.
  • the imaging element 32a performs imaging under the imaging conditions set in the imaging area.
  • the imaging element 32a is configured to be able to perform imaging under different imaging conditions for each imaging region.
  • The image processing unit 33 of the camera 1 performs processing for suppressing the influence of image blur caused by shake occurring in the camera 1 (referred to as shake of the camera 1) while the camera 1 is capturing a moving image.
  • Image blur refers to the movement of the subject image on the imaging surface of the image sensor 32a. Details of the camera 1 will be described with reference to the drawings. Note that image blur can be said to be that the image sensor 32a moves relative to the subject image.
  • FIG. 1 is a block diagram illustrating the configuration of a camera 1 according to an embodiment.
  • The camera 1 includes an imaging optical system 31, an imaging unit 32, an image processing unit 33, a control unit 34, a display unit 35, an operation member 36, a recording unit 37, a photometric sensor 38, and a shake sensor 39.
  • the imaging optical system 31 guides the light flux from the subject to the imaging unit 32.
  • the imaging unit 32 includes an imaging element 32a and a driving unit 32b, and photoelectrically converts an object image formed by the imaging optical system 31.
  • the imaging unit 32 can capture images with the same imaging condition in the entire imaging region of the imaging element 32a, or can capture images with different imaging conditions for each imaging region. Details of the imaging unit 32 will be described later.
  • The drive unit 32b generates the drive signals necessary for causing the image sensor 32a to perform accumulation control. Imaging instructions such as the charge accumulation time (exposure time), ISO sensitivity (gain), and frame rate are transmitted from the control unit 34 to the drive unit 32b.
  • the image processing unit 33 includes an input unit 33a, a correction unit 33b, and a generation unit 33c.
  • the image data generated by the imaging unit 32 is input to the input unit 33a.
  • the correction unit 33b performs a correction process on the input image data with respect to image blur caused by camera shake. Details of the correction processing will be described later.
  • the generation unit 33c performs image processing on the input image data and the corrected image data to generate an image.
  • the image processing includes, for example, color interpolation processing, pixel defect correction processing, contour enhancement processing, noise reduction processing, white balance adjustment processing, gamma correction processing, display luminance adjustment processing, saturation adjustment processing, and the like. Further, the generation unit 33c generates an image to be displayed by the display unit 35 and an image to be recorded.
  • The control unit 34 is constituted by, for example, a CPU and controls the overall operation of the camera 1. For example, the control unit 34 performs a predetermined exposure calculation based on the photoelectric conversion signals generated by the imaging unit 32, determines exposure conditions such as the charge accumulation time of the imaging element 32a necessary for proper exposure, the aperture value of the imaging optical system 31, and the ISO sensitivity, and instructs the drive unit 32b accordingly. In addition, the control unit 34 determines image processing conditions for adjusting saturation, contrast, sharpness, and the like according to the imaging scene mode set in the camera 1 and the type of the detected subject element, and instructs the image processing unit 33. The detection of the subject element will be described later.
  • the control unit 34 includes an object detection unit 34a, a setting unit 34b, an imaging control unit 34c, and a lens movement control unit 34d. These are realized as software by the control unit 34 executing a program stored in a nonvolatile memory (not shown). However, these may be configured by an ASIC or the like.
  • The object detection unit 34a performs known object recognition processing and detects, from the image data generated by the imaging unit 32, subject elements such as a person (a person's face), an animal such as a dog or a cat (an animal's face), a plant, a vehicle such as a bicycle, an automobile, or a train, a building, a stationary object, a landscape element such as a mountain or a cloud, or a predetermined specific object.
  • the setting unit 34b sets imaging conditions for the imaging area of the imaging device 32a.
  • The imaging conditions include exposure conditions (charge accumulation time, gain, ISO sensitivity, frame rate, etc.) and image processing conditions (for example, white balance adjustment parameters, gamma correction curves, display brightness adjustment parameters, and saturation adjustment parameters).
  • the setting unit 34b can set the same imaging condition for a plurality of imaging areas, or can set different imaging conditions for each of the plurality of imaging areas.
  • the imaging control unit 34c controls the imaging unit 32 (imaging element 32a) and the image processing unit 33 by applying the imaging conditions set in the imaging region by the setting unit 34b.
  • the number of pixels arranged in the imaging region may be singular or plural. Further, the number of pixels arranged between the plurality of imaging regions may be different.
  • the lens movement control unit 34d controls an automatic focus adjustment (autofocus: AF) operation for focusing on a corresponding subject at a predetermined position (called a focus point) on the imaging screen.
  • The lens movement control unit 34d sends a drive signal for moving the focus lens of the imaging optical system 31 to the in-focus position based on the calculation result, for example a signal for focusing the subject image with the focus lens of the imaging optical system 31, to the lens driving mechanism 31m of the imaging optical system 31.
  • the lens movement control unit 34d functions as a moving unit that moves the focus lens of the imaging optical system 31 in the optical axis direction based on the calculation result.
  • the process performed by the lens movement control unit 34d for the AF operation is also referred to as a focus detection process. Details of the focus detection process will be described later.
  • The display unit 35 displays images generated by the image processing unit 33, images subjected to image processing, images read out by the recording unit 37, and the like.
  • the display unit 35 also displays an operation menu screen, a setting screen for setting imaging conditions, and the like.
  • the operation member 36 includes various operation members such as a recording button, a shutter button, and a menu button.
  • the operation member 36 sends an operation signal corresponding to each operation to the control unit 34.
  • the operation member 36 includes a touch operation member provided on the display surface of the display unit 35. In the present embodiment, recording a moving image is referred to as recording.
  • the recording unit 37 records image data or the like on a recording medium including a memory card (not shown) in response to an instruction from the control unit 34.
  • the recording unit 37 reads image data recorded on the recording medium in response to an instruction from the control unit 34.
  • the photometric sensor 38 detects the brightness of the subject and outputs a detection signal.
  • the shake sensor 39 is constituted by, for example, an angular velocity sensor and an acceleration sensor.
  • the shake sensor 39 detects the shake of the camera 1 and outputs a detection signal.
  • the shake of the camera 1 is also referred to as camera shake.
  • FIG. 2 is a cross-sectional view of the image sensor 100.
  • the imaging element 100 includes an imaging chip 111, a signal processing chip 112, and a memory chip 113.
  • the imaging chip 111 is stacked on the signal processing chip 112.
  • the signal processing chip 112 is stacked on the memory chip 113.
  • The imaging chip 111, the signal processing chip 112, and the memory chip 113 are electrically connected by connection units 109.
  • the connection unit 109 is, for example, a bump or an electrode.
  • the imaging chip 111 captures a subject image and generates image data.
  • the imaging chip 111 outputs image data from the imaging chip 111 to the signal processing chip 112.
  • the signal processing chip 112 performs signal processing on the image data output from the imaging chip 111.
  • the memory chip 113 has a plurality of memories and stores image data.
  • The imaging element 100 may alternatively be configured with the imaging chip 111 and the signal processing chip 112.
  • a storage unit for storing image data may be provided in the signal processing chip or may be provided separately from the imaging device 100.
  • the imaging element 100 may have a configuration in which the imaging chip 111 is stacked on the memory chip 113 and the memory chip 113 is stacked on the signal processing chip 112.
  • the incident light is incident mainly in the positive direction of the Z axis indicated by the white arrow.
  • the left direction of the paper orthogonal to the Z axis is the X axis plus direction
  • the front side of the paper orthogonal to the Z axis and X axis is the Y axis plus direction.
  • In the following figures, coordinate axes are shown so that the orientation of each figure can be understood with reference to these coordinate axes.
  • the imaging chip 111 is, for example, a CMOS image sensor. Specifically, the imaging chip 111 is a backside illumination type CMOS image sensor.
  • the imaging chip 111 includes a microlens layer 101, a color filter layer 102, a passivation layer 103, a semiconductor layer 106, and a wiring layer 108.
  • the imaging chip 111 is arranged in the order of the microlens layer 101, the color filter layer 102, the passivation layer 103, the semiconductor layer 106, and the wiring layer 108 in the positive Z-axis direction.
  • the microlens layer 101 has a plurality of microlenses L.
  • the microlens L condenses incident light on the photoelectric conversion unit 104 described later.
  • the color filter layer 102 has a plurality of types of color filters F having different spectral characteristics.
  • The color filter layer 102 includes a first filter (R) having a spectral characteristic that mainly transmits red component light, second filters (Gb, Gr) having a spectral characteristic that mainly transmits green component light, and a third filter (B) having a spectral characteristic that mainly transmits blue component light.
  • a first filter, a second filter, and a third filter are arranged in a Bayer arrangement.
  • the passivation layer 103 is made of a nitride film or an oxide film, and protects the semiconductor layer 106.
  • the semiconductor layer 106 includes a photoelectric conversion unit 104 and a readout circuit 105.
  • the semiconductor layer 106 includes a plurality of photoelectric conversion units 104 between a first surface 106a that is a light incident surface and a second surface 106b opposite to the first surface 106a.
  • the semiconductor layer 106 includes a plurality of photoelectric conversion units 104 arranged in the X-axis direction and the Y-axis direction.
  • The photoelectric conversion unit 104 has a photoelectric conversion function of converting light into electric charge, and accumulates the photoelectrically converted charge.
  • the photoelectric conversion unit 104 is, for example, a photodiode.
  • the semiconductor layer 106 includes a readout circuit 105 on the second surface 106b side of the photoelectric conversion unit 104.
  • a plurality of readout circuits 105 are arranged in the X-axis direction and the Y-axis direction.
  • the readout circuit 105 includes a plurality of transistors, reads out image data generated by the electric charges photoelectrically converted by the photoelectric conversion unit 104, and outputs the image data to the wiring layer 108.
  • the wiring layer 108 has a plurality of metal layers.
  • the metal layer is, for example, an Al wiring, a Cu wiring, or the like.
  • the wiring layer 108 outputs the image data read by the reading circuit 105.
  • the image data is output from the wiring layer 108 to the signal processing chip 112 via the connection unit 109.
  • connection unit 109 may be provided for each photoelectric conversion unit 104. Further, the connection unit 109 may be provided for each of the plurality of photoelectric conversion units 104. When the connection unit 109 is provided for each of the plurality of photoelectric conversion units 104, the pitch of the connection units 109 may be larger than the pitch of the photoelectric conversion units 104. In addition, the connection unit 109 may be provided in a peripheral region of the region where the photoelectric conversion unit 104 is disposed.
  • the signal processing chip 112 has a plurality of signal processing circuits.
  • the signal processing circuit performs signal processing on the image data output from the imaging chip 111.
  • The signal processing circuit includes, for example, an amplifier circuit that amplifies the signal value of the image data, a correlated double sampling circuit that performs noise reduction processing on the image data, and an analog-to-digital (A/D) conversion circuit that converts an analog signal into a digital signal.
  • a signal processing circuit may be provided for each photoelectric conversion unit 104.
  • a signal processing circuit may be provided for each of the plurality of photoelectric conversion units 104.
  • the signal processing chip 112 has a plurality of through electrodes 110.
  • the through electrode 110 is, for example, a silicon through electrode.
  • the through electrode 110 connects circuits provided in the signal processing chip 112 to each other.
  • the through electrode 110 may also be provided in the peripheral region of the imaging chip 111 and the memory chip 113.
  • some elements constituting the signal processing circuit may be provided in the imaging chip 111.
  • a comparator that compares an input voltage with a reference voltage may be provided in the imaging chip 111, and circuits such as a counter circuit and a latch circuit may be provided in the signal processing chip 112.
  • the memory chip 113 has a plurality of storage units.
  • the storage unit stores image data that has been subjected to signal processing by the signal processing chip 112.
  • the storage unit is a volatile memory such as a DRAM, for example.
  • a storage unit may be provided for each photoelectric conversion unit 104.
  • the storage unit may be provided for each of the plurality of photoelectric conversion units 104.
  • the image data stored in the storage unit is output to the subsequent image processing unit.
  • FIG. 3 is a diagram for explaining the pixel array and the unit area 131 of the imaging chip 111.
  • a state where the imaging chip 111 is observed from the back surface (imaging surface) side is shown.
  • 20 million or more pixels are arranged in a matrix in the pixel region.
  • Four adjacent pixels of 2 pixels × 2 pixels form one unit region 131.
  • the grid lines in the figure indicate the concept that adjacent pixels are grouped to form a unit region 131.
  • The number of pixels forming the unit region 131 is not limited to this; it may be, for example, about 1000 pixels such as 32 pixels × 32 pixels, more or fewer, or even a single pixel.
  • the unit area 131 in FIG. 3 includes a so-called Bayer array composed of four pixels of green pixels Gb, Gr, blue pixels B, and red pixels R.
  • the green pixels Gb and Gr are pixels having a green filter as the color filter F, and receive light in the green wavelength band of incident light.
  • the blue pixel B is a pixel having a blue filter as the color filter F and receives light in the blue wavelength band
  • The red pixel R is a pixel having a red filter as the color filter F and receives light in the red wavelength band.
  • a plurality of blocks are defined so as to include at least one unit region 131 per block. That is, the minimum unit of one block is one unit area 131. As described above, of the possible values for the number of pixels forming one unit region 131, the smallest number of pixels is one pixel. Therefore, when one block is defined in units of pixels, the minimum number of pixels among the number of pixels that can define one block is one pixel.
  • Each block can control pixels included in each block with different control parameters.
  • The unit regions 131 in a block, that is, the pixels in the block, are controlled under the same imaging conditions. In other words, photoelectric conversion signals obtained under different imaging conditions can be acquired from a pixel group included in one block and a pixel group included in another block.
  • Examples of the control parameters include a frame rate, a gain, a thinning rate, the number of addition rows or addition columns over which photoelectric conversion signals are added, the charge accumulation time or accumulation count, and the digitization bit number (word length).
  • the imaging device 100 can freely perform not only thinning in the row direction (X-axis direction of the imaging chip 111) but also thinning in the column direction (Y-axis direction of the imaging chip 111).
  • the control parameter may be a parameter in image processing.
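  • As a purely illustrative sketch (not part of the patent disclosure), the per-block control parameters listed above could be modeled as a simple data structure; all field names and default values below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class BlockControlParams:
    """Per-block control parameters (illustrative; names and values are assumptions)."""
    frame_rate_fps: float = 60.0          # frames captured per second for this block
    gain: float = 1.0                     # analog gain (relates to ISO sensitivity)
    thinning_rate: float = 0.0            # fraction of rows/columns skipped on readout
    accumulation_time_s: float = 1 / 60   # charge accumulation (exposure) time
    digitization_bits: int = 12           # A/D word length

# Example: a detection block running at 4x the recording frame rate with higher gain.
recording_block = BlockControlParams(frame_rate_fps=60.0, gain=1.0)
detection_block = BlockControlParams(frame_rate_fps=240.0, gain=4.0,
                                     accumulation_time_s=1 / 240)
print(recording_block)
print(detection_block)
```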
  • FIG. 4 is a diagram for explaining a circuit in the unit region 131.
  • One unit region 131 is formed by four adjacent pixels of 2 pixels × 2 pixels.
  • the number of pixels included in the unit region 131 is not limited to this, and may be 1000 pixels or more, or may be a minimum of 1 pixel.
  • The two-dimensional positions of the pixels in the unit region 131 are indicated by reference signs A to D.
  • the reset transistor (RST) of the pixel included in the unit region 131 is configured to be turned on and off individually for each pixel.
  • a reset wiring 300 for turning on / off the reset transistor of the pixel A is provided, and a reset wiring 310 for turning on / off the reset transistor of the pixel B is provided separately from the reset wiring 300.
  • a reset line 320 for turning on and off the reset transistor of the pixel C is provided separately from the reset lines 300 and 310.
  • a dedicated reset wiring 330 for turning on and off the reset transistor is also provided for the other pixels D.
  • the pixel transfer transistor (TX) included in the unit region 131 is also configured to be turned on and off individually for each pixel.
  • a transfer wiring 302 for turning on / off the transfer transistor of the pixel A, a transfer wiring 312 for turning on / off the transfer transistor of the pixel B, and a transfer wiring 322 for turning on / off the transfer transistor of the pixel C are separately provided.
  • a dedicated transfer wiring 332 for turning on / off the transfer transistor is provided for the other pixels D.
  • the pixel selection transistor (SEL) included in the unit region 131 is also configured to be turned on and off individually for each pixel.
  • a selection wiring 306 for turning on / off the selection transistor of the pixel A, a selection wiring 316 for turning on / off the selection transistor of the pixel B, and a selection wiring 326 for turning on / off the selection transistor of the pixel C are separately provided.
  • a dedicated selection wiring 336 for turning on and off the selection transistor is provided for the other pixels D.
  • the power supply wiring 304 is commonly connected from the pixel A to the pixel D included in the unit region 131.
  • the output wiring 308 is commonly connected to the pixel D from the pixel A included in the unit region 131.
  • the power supply wiring 304 is commonly connected between a plurality of unit regions, but the output wiring 308 is provided for each unit region 131 individually.
  • the load current source 309 supplies current to the output wiring 308.
  • the load current source 309 may be provided on the imaging chip 111 side or may be provided on the signal processing chip 112 side.
  • With this wiring configuration, the charge accumulation, including the charge accumulation start time, the accumulation end time, and the transfer timing, can be controlled individually for each of the pixels A to D included in the unit region 131.
  • the photoelectric conversion signals of the pixels A to D can be output via the common output wiring 308.
  • In a so-called rolling shutter system, in which charge accumulation is controlled in a regular order with respect to rows and columns for the pixels A to D included in the unit region 131, photoelectric conversion signals are output in the order "ABCD" in the example of FIG. 4.
  • the charge accumulation time can be controlled for each unit region 131.
  • photoelectric conversion signals with different frame rates between the unit regions 131 can be output.
  • While the unit regions 131 included in some blocks of the imaging chip 111 perform charge accumulation (imaging), the unit regions 131 included in other blocks can be paused, so that imaging is performed and photoelectric conversion signals are output only from the former blocks.
  • FIG. 5 is a block diagram illustrating a functional configuration of the image sensor 100 corresponding to the circuit illustrated in FIG.
  • the multiplexer 411 sequentially selects the four PDs 104 that form the unit region 131 and outputs each pixel signal to the output wiring 308 provided corresponding to the unit region 131.
  • the multiplexer 411 is formed in the imaging chip 111 together with the PD 104.
  • The pixel signal output through the multiplexer 411 is supplied to a signal processing circuit 412, formed in the signal processing chip 112, which performs correlated double sampling (CDS) and analog-to-digital (A/D) conversion.
  • a / D converted pixel signal is transferred to the demultiplexer 413 and stored in the pixel memory 414 corresponding to each pixel.
  • the demultiplexer 413 and the pixel memory 414 are formed in the memory chip 113.
  • the arithmetic circuit 415 formed in the memory chip 113 processes the pixel signal stored in the pixel memory 414 and passes it to the subsequent image processing unit.
  • The arithmetic circuit 415 may instead be provided in the signal processing chip 112. Note that FIG. 5 shows the connections for one unit region 131, but in practice these exist for each unit region 131 and operate in parallel. However, the arithmetic circuit 415 does not have to exist for each unit region 131; for example, one arithmetic circuit 415 may perform sequential processing while referring in turn to the values in the pixel memories 414 corresponding to the respective unit regions 131.
  • the arithmetic circuit 415 may be configured to include functions of a control unit, an image processing unit, and the like at the subsequent stage.
  • The output wiring 308 is provided corresponding to each of the unit regions 131. Since the imaging element 100 is formed by stacking the imaging chip 111, the signal processing chip 112, and the memory chip 113, using the inter-chip electrical connections through the connection units 109 for the output wiring 308 allows the wiring to be routed without enlarging each chip in the plane direction.
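  • The readout path just described (multiplexer 411 → CDS and A/D conversion in the signal processing circuit 412 → demultiplexer 413 → pixel memory 414 → arithmetic circuit 415) might be sketched as follows; this is a rough model, and every function name and numeric value is an assumption rather than the patent's implementation.

```python
import numpy as np

def read_unit_region(pixel_values, reset_levels, bits=12):
    """Illustrative model of the per-unit-region readout path (names are assumptions).

    pixel_values: analog samples of pixels A-D selected sequentially by the multiplexer.
    reset_levels: reset-level samples used for correlated double sampling (CDS).
    """
    pixel_memory = []
    for signal, reset in zip(pixel_values, reset_levels):       # multiplexer: one pixel at a time
        cds = signal - reset                                    # CDS: subtract the reset level
        digital = int(np.clip(round(cds), 0, 2**bits - 1))      # A/D conversion to a 12-bit word
        pixel_memory.append(digital)                            # demultiplexer -> pixel memory
    return np.array(pixel_memory)                               # handed to the arithmetic circuit

# Pixels A, B, C, D of one unit region, output in the order "ABCD".
print(read_unit_region([120.3, 400.9, 80.1, 900.4], [10.0, 11.2, 9.8, 10.5]))
```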
  • an imaging condition can be set for each of a plurality of blocks in the imaging device 32a.
  • The imaging control unit 34c of the control unit 34 associates the plurality of imaging regions with these blocks and causes imaging to be performed under the imaging conditions set for each block.
  • the number of pixels constituting the block may be singular or plural.
  • the camera 1 repeats imaging at a predetermined frame rate (for example, 60 fps) when capturing a moving image to be recorded.
  • the monitor moving image is a moving image that is captured before the recording button is operated or before the shutter button is operated.
  • a moving image to be recorded when the recording button is operated is referred to as a recording image
  • a monitoring moving image is referred to as a monitor image.
  • FIG. 6A is a diagram illustrating the arrangement of the first imaging region B1 and the second imaging region B2 set on the imaging surface of the imaging element 32a when a recording image or a monitor image is captured.
  • the first imaging region B1 is composed of blocks of even rows in odd columns of blocks and blocks of odd rows in even columns of blocks.
  • the second imaging region B2 is composed of even-numbered blocks in even-numbered columns of blocks and odd-numbered blocks in odd-numbered columns of blocks.
  • the imaging surface of the imaging element 32a is divided into a checkered pattern by a plurality of blocks belonging to the first imaging region B1 and a plurality of blocks belonging to the second imaging region B2.
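  • A minimal sketch of the checkered division of blocks into the first imaging region B1 and the second imaging region B2 shown in FIG. 6A; the block-grid size, the NumPy representation, and the index convention are assumptions.

```python
import numpy as np

def checkerboard_regions(block_rows, block_cols):
    """Return boolean block masks for B1 and B2 arranged in a checkered pattern.

    B1 collects blocks of even rows in odd columns and odd rows in even columns,
    which corresponds to one parity of (row + column); B2 is the complement.
    """
    rows, cols = np.indices((block_rows, block_cols))
    b1 = (rows + cols) % 2 == 1
    b2 = ~b1
    return b1, b2

b1_mask, b2_mask = checkerboard_regions(4, 6)
print(b1_mask.astype(int))   # 1 where a block belongs to the first imaging region B1
print(b2_mask.astype(int))   # 1 where a block belongs to the second imaging region B2
```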
  • FIG. 7A and FIG. 7B are diagrams illustrating the recording image 51 and the detection image 52.
  • the control unit 34 generates the recording image 51 based on the photoelectric conversion signal read from the first imaging region B1 of the imaging element 32a that has captured one frame.
  • the control unit 34 generates the detection image 52 based on the photoelectric conversion signal read from the second imaging region B2 of the imaging element 32a.
  • the photoelectric conversion signal is also referred to as image data.
  • An image used for subject detection, focus detection, imaging condition setting, and image generation is referred to as a detection image 52.
  • the recording image 51 is a moving image to be recorded.
  • the recording image 51 is also used to generate a monitor image to be displayed on the display unit 35.
  • the detection image 52 is used for information acquisition for subject detection, information acquisition for focus detection, information acquisition for imaging condition setting, and information acquisition for image generation.
  • The control unit 34 causes the object detection unit 34a to detect subject elements based on image data of a region of interest (described later) in the detection image 52, causes the lens movement control unit 34d to perform focus detection processing based on image data of the region of interest in the detection image 52, causes the setting unit 34b to perform exposure calculation processing based on image data of the region of interest in the detection image 52, and causes the image processing unit 33 to perform image generation processing based on image data of the region of interest in the detection image 52.
  • The control unit 34 sets the density of the image data read from the first imaging region B1 of the imaging element 32a for the recording image 51 to a value necessary for the recording image 51. For example, the image data may be read out from fewer pixels than all the pixels included in the first imaging region B1.
  • control unit 34 sets the density of image data read from the second imaging region B2 of the imaging element 32a for the detection image 52 to a value necessary for the above-described information acquisition.
  • the image data may be read out from a smaller number of pixels than the pixels included in the second imaging region B2.
  • the control unit 34 can vary the number of readout signals per block in the first imaging region B1 of the imaging device 32a and the second imaging region B2 of the imaging device 32a. For example, the density of the image data read per predetermined number of pixels may be different between the first imaging area B1 and the second imaging area B2 of the imaging element 32a. When the number of pixels constituting the block is 1, the number of readout signals per block is zero or one.
  • The display unit 35 of the camera 1 has a lower display resolution than the number of pixels of the imaging element 32a.
  • Therefore, the control unit 34 thins out the image data of the recording image 51 at every predetermined number of data, or adds the data at every predetermined number of data, thereby generating a smaller amount of image data corresponding to the display resolution of the display unit 35.
  • the above addition processing for each predetermined number of data may be performed by the arithmetic circuit 415 provided in the signal processing chip 112 or the memory chip 113, for example.
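  • A hedged sketch of the two data-reduction approaches mentioned above, thinning the data at every predetermined interval or adding adjacent data, to match the display resolution; the array sizes and reduction factor are assumptions.

```python
import numpy as np

def thin(image, step):
    """Keep every `step`-th pixel in both directions (thinning readout)."""
    return image[::step, ::step]

def add_blocks(image, step):
    """Sum each step x step neighborhood (addition of adjacent data)."""
    h, w = image.shape
    trimmed = image[:h - h % step, :w - w % step]
    return trimmed.reshape(h // step, step, w // step, step).sum(axis=(1, 3))

full = np.arange(16, dtype=float).reshape(4, 4)   # stand-in for recording-image data
print(thin(full, 2))        # 2x2 result: one pixel kept out of every 2x2 group
print(add_blocks(full, 2))  # 2x2 result: each value is the sum of a 2x2 group
```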
  • control unit 34 generates the recording image 51 and the detection image 52 based on the image data read from the imaging element 32a that has captured one frame.
  • the recording image 51 and the detection image 52 are captured at the same angle of view and include a common subject image. Acquisition of the recording image 51 and acquisition of the detection image 52 can be performed in parallel as shown in FIG.
  • The first imaging area B1 and the second imaging area B2 on the imaging surface may alternatively be divided as shown in FIG. 6B or FIG. 6C.
  • the first imaging region B1 is configured by even columns of blocks.
  • The second imaging region B2 is configured by odd-numbered columns of blocks.
  • the first imaging region B1 is configured by odd-numbered rows of blocks.
  • the second imaging region B2 is configured by even rows of blocks.
  • In these cases as well, the control unit 34 generates the recording image 51 based on the image data read from the first imaging region B1 of the imaging element 32a that has captured one frame.
  • the control unit 34 generates the detection image 52 based on the image data read from the second imaging region B2 of the imaging element 32a.
  • the recording image 51 and the detection image 52 are captured at the same angle of view and include a common subject image. Acquisition of the recording image 51 and acquisition of the detection image 52 can be performed in parallel as shown in FIG.
  • the above-described recording image 51 may be used for focus detection processing or the like.
  • the monitor image may be transmitted from the camera 1 to a monitor outside the camera 1, and the monitor image may be displayed on the external monitor.
  • The imaging condition set in the first imaging area B1 for capturing the recording image 51 will be referred to as the first imaging condition, and the imaging condition set in the second imaging area B2 for capturing the detection image 52 will be referred to as the second imaging condition.
  • the control unit 34 may set the first imaging condition and the second imaging condition to the same condition or different conditions.
  • the control unit 34 sets the first imaging condition set in the first imaging area B1 to a condition suitable for the recording image 51. At this time, the same first imaging condition is set uniformly throughout the first imaging region B1.
  • the control unit 34 sets the second imaging condition to be set in the second imaging area B2 to a condition suitable for the information acquisition or the like.
  • As the second imaging condition, the same condition may be set uniformly over the entire second imaging area B2, or different second imaging conditions may be set for each area within the second imaging area B2.
  • The control unit 34 may set, as the second imaging condition of the second imaging area B2, a condition suitable for subject detection, a condition suitable for focus detection, a condition suitable for imaging condition setting, and a condition suitable for image generation, each for a respective area within the second imaging area B2.
  • control unit 34 may change the second imaging condition set in the second imaging area B2 for each frame.
  • For example, the second imaging condition set in the second imaging area B2 in the first frame of the detection image 52 may be a condition suitable for subject detection, the second imaging condition set in the second frame may be a condition suitable for focus detection, the second imaging condition set in the third frame may be a condition suitable for imaging condition setting, and the second imaging condition set in the fourth frame may be a condition suitable for image generation. In these cases, the same second imaging condition may be set uniformly throughout the second imaging area B2 in each frame, or different second imaging conditions may be set for each area.
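  • A simple sketch, assuming the four per-frame purposes cycle in a fixed order, of how a second imaging condition could be chosen frame by frame; the condition values are invented placeholders, not values from the patent.

```python
from itertools import cycle

# Purpose of each successive detection frame, as described above (fixed order assumed).
DETECTION_PURPOSES = ["subject_detection", "focus_detection",
                      "condition_setting", "image_generation"]

# Hypothetical second imaging conditions per purpose (placeholder values).
SECOND_CONDITIONS = {
    "subject_detection": {"gain": 8.0, "accumulation_time_s": 1 / 240},
    "focus_detection":   {"gain": 4.0, "accumulation_time_s": 1 / 240},
    "condition_setting": {"gain": 2.0, "accumulation_time_s": 1 / 240},
    "image_generation":  {"gain": 1.0, "accumulation_time_s": 1 / 240},
}

purpose_of_frame = cycle(DETECTION_PURPOSES)
for frame in range(1, 9):   # two recording-frame periods = eight detection frames
    purpose = next(purpose_of_frame)
    condition = SECOND_CONDITIONS[purpose]
    print(f"detection frame {frame}: {purpose} -> {condition}")
```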
  • the area ratio between the first imaging region B1 and the second imaging region B2 may be different.
  • Based on a user operation or a determination of the control unit 34, the control unit 34 sets the ratio occupied by the first imaging area B1 on the imaging surface higher than the ratio occupied by the second imaging area B2, sets the two ratios equal as illustrated in FIGS. 6A to 6C, or sets the ratio occupied by the first imaging area B1 lower than the ratio occupied by the second imaging area B2.
  • The control unit 34 can set the operation rate of the blocks included in the first imaging area B1 and the operation rate of the blocks included in the second imaging area B2 independently. In the example illustrated in FIGS. 6(a) to 6(c), for instance, imaging is performed with 80% of the blocks included in the first imaging area B1 and with 80% of the blocks included in the second imaging area B2. To drive half of the blocks and stop driving the remaining half, every other block is driven.
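  • As a rough illustration of setting an operation rate per region (for example, driving 80% of the blocks, or driving every other block for a 50% rate); the even-spacing selection policy below is an assumption.

```python
import numpy as np

def drive_mask(num_blocks, operation_rate):
    """Return a boolean mask of blocks to drive, spread evenly across the region."""
    drive = np.zeros(num_blocks, dtype=bool)
    if operation_rate > 0:
        step = 1.0 / operation_rate
        indices = np.unique(np.arange(0, num_blocks, step).astype(int))
        drive[indices[indices < num_blocks]] = True
    return drive

print(drive_mask(10, 0.8))  # drive 8 of 10 blocks
print(drive_mask(10, 0.5))  # drive every other block (half driven, half stopped)
```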
  • FIG. 8 is a diagram illustrating a subject area on the imaging screen of the camera 1.
  • the subject area includes a person 61, a car 62, a bag 63, a mountain 64, a cloud 65, and a cloud 66.
  • the person 61 holds the bag 63 with both hands.
  • the automobile 62 stops at the right rear side of the person 61.
  • A recording image 51 as shown in FIG. 7A is obtained from the image data read from the first imaging region B1.
  • a detection image 52 as shown in FIG. 7B is obtained from the image data read from the second imaging region B2.
  • Imaging conditions can be set for each block corresponding to each subject area. For example, imaging conditions are set respectively for the blocks included in the area of the person 61, the blocks included in the area of the car 62, the blocks included in the area of the bag 63, the blocks included in the area of the mountain 64, the blocks included in the area of the cloud 65, and the blocks included in the area of the cloud 66. In order to set these imaging conditions appropriately, the above-described detection image 52 must be captured under an appropriate exposure condition so that an image without overexposure or underexposure is obtained.
  • Whiteout means that the gradation of data in high-luminance portions of the image is lost due to overexposure, and blackout means that the gradation of data in low-luminance portions of the image is lost due to underexposure. This is because, in a detection image 52 in which whiteout or blackout occurs, for example, the outline (edge) of each subject area does not appear, making it difficult to detect the subject elements based on the detection image 52.
  • By appropriately setting, for the second imaging region B2 of the imaging element 32a, the second imaging condition under which the detection image 52 is acquired, subject detection, focus detection, imaging condition setting, and image generation based on the detection image 52 can all be performed appropriately.
  • the second imaging condition is set as follows.
  • the control unit 34 makes the second gain as the second imaging condition for obtaining the detection image 52 higher than the first gain as the first imaging condition for obtaining the recording image 51.
  • As a result, even when the recording image 51 is a dark image, the object detection unit 34a can accurately detect the subject based on the bright detection image 52. If the subject can be detected accurately, the image can be appropriately divided for each subject.
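  • A minimal numerical sketch of the gain relationship described above, in which the second gain applied to the detection region B2 is higher than the first gain of the recording region B1 so that the detection image stays bright enough for subject detection; the toy sensor model and numbers are assumptions.

```python
import numpy as np

def simulate_exposure(scene_luminance, gain, full_scale=4095):
    """Toy model: sensor output = luminance * gain, clipped to the A/D full scale."""
    return np.clip(scene_luminance * gain, 0, full_scale)

scene = np.array([5.0, 20.0, 60.0])      # dim scene luminances (arbitrary units)

first_gain = 4.0                          # first imaging condition (recording image 51)
second_gain = 16.0                        # second imaging condition (detection image 52), higher

recording_signal = simulate_exposure(scene, first_gain)   # darker image for recording
detection_signal = simulate_exposure(scene, second_gain)  # brighter image, easier subject/edge detection
print(recording_signal)
print(detection_signal)
```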
  • the control unit 34 sets the first imaging area B1 and the second imaging area B2 in the imaging element 32a.
  • the first imaging region B1 is a region where the recording image 51 is captured.
  • the second imaging region B2 is a region where a detection image 52 for detection is captured.
  • the control unit 34 sets a first gain for recording in the first imaging region B1 of the imaging device 32a, and a second gain higher than the first gain in the second imaging region B2 of the imaging device 32a.
  • The control unit 34 that has performed the gain setting causes imaging for the detection image 52 to be performed in the second imaging region B2 of the imaging element 32a, and generates the detection image 52 based on the image data read from the second imaging region B2 of the imaging element 32a.
  • the control unit 34 causes the object detection unit 34 a to detect the subject element based on the detection image 52.
  • the control unit 34 causes the recording image 51 to be captured in the first imaging region B1 of the imaging element 32a.
  • the recording image 51 is not recorded by the recording unit 37, but the recording image 51 is captured in order to generate a monitor image based on the recording image 51.
  • the setting unit 34b divides the recording image 51 based on the subject element detected by the object detection unit 34a.
  • The recording image 51 acquired by the imaging unit 32 is divided into, for example, a region of the person 61, a region of the automobile 62, a region of the bag 63, a region of the mountain 64, a region of the cloud 65, a region of the cloud 66, and other regions.
  • The control unit 34 causes the display unit 35 to display the setting screen illustrated in FIG. 9. In FIG. 9, a monitor image 60a is displayed on the display unit 35, and an imaging condition setting screen 70 is displayed to the right of the monitor image 60a.
  • the setting screen 70 lists frame rate, shutter speed (TV), and gain (ISO sensitivity) in order from the top as an example of setting items for imaging conditions.
  • the frame rate is the number of frames of a moving image captured by the camera 1 per second.
  • the shutter speed corresponds to the exposure time.
  • the gain corresponds to the ISO sensitivity.
  • Setting items for imaging conditions other than those illustrated in FIG. 9 may be added as appropriate. When all the setting items do not fit on the setting screen 70, the other setting items may be displayed by scrolling the setting items up and down.
  • The control unit 34 can set an area selected by a user operation, among the areas divided by the setting unit 34b, as the target for setting (changing) the imaging condition. For example, in the camera 1, which accepts touch operations, the user touches the display position of the subject whose imaging condition is to be set (changed) on the display surface of the display unit 35 on which the monitor image 60a is displayed. When, for example, the display position of the person 61 is touched, the control unit 34 sets the area corresponding to the person 61 in the monitor image 60a as the target area for setting (changing) the imaging condition and highlights the outline of the area corresponding to the person 61.
  • an area to be displayed with an emphasized outline indicates an area to be set (changed) for imaging conditions.
  • the highlighted display is, for example, a thick display, a bright display, a display with a different color, a broken line display, a blinking display, or the like.
  • the monitor image 60 a in which the outline of the region corresponding to the person 61 is emphasized is displayed.
  • the highlighted area is a target for setting (changing) the imaging condition.
  • The control unit 34 displays the currently set shutter speed value for the highlighted area (person 61) on the screen (reference numeral 68).
  • the camera 1 is described on the premise of a touch operation.
  • the imaging condition may be set (changed) by operating a button or the like constituting the operation member 36.
  • The setting unit 34b increases or decreases the shutter speed display 68 from the current setting value according to the touch operation, and sends an instruction to the imaging unit 32 (FIG. 1) to change the imaging condition of the unit regions 131 (FIG. 3) of the imaging element 32a corresponding to the displayed area (person 61) in accordance with the touch operation.
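  • A hypothetical sketch of this flow, in which a touched subject area is mapped to its unit regions and the changed shutter speed is applied only there; the data structures and function names are assumptions, not the patent's implementation.

```python
# Hypothetical per-area imaging-condition store keyed by subject area.
imaging_conditions = {
    "person_61": {"shutter_s": 1 / 125, "gain": 2.0},
    "car_62":    {"shutter_s": 1 / 250, "gain": 1.0},
}

# Hypothetical mapping from a subject area to the unit regions (blocks) covering it.
unit_regions_of_area = {
    "person_61": [(10, 4), (10, 5), (11, 4), (11, 5)],
    "car_62":    [(9, 12), (9, 13)],
}

def change_shutter(area, new_shutter_s, per_block_settings):
    """Apply the changed shutter speed only to the unit regions of the selected area."""
    imaging_conditions[area]["shutter_s"] = new_shutter_s
    for block in unit_regions_of_area[area]:
        per_block_settings[block] = dict(imaging_conditions[area])

per_block_settings = {}
change_shutter("person_61", 1 / 500, per_block_settings)   # user raised the shutter speed by touch
print(per_block_settings)
```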
  • the decision icon 72 is an operation icon for confirming the set imaging condition.
  • the setting unit 34b performs the setting (change) of the frame rate and gain (ISO) in the same manner as the setting (change) of the shutter speed (TV).
  • The setting unit 34b may also set imaging conditions based on the determination of the control unit 34 rather than on a user operation. For example, when overexposure or underexposure occurs in an area including the subject with the maximum or minimum luminance in the image, the setting unit 34b may, based on the determination of the control unit 34, set imaging conditions that eliminate the overexposure or underexposure. For the areas that are not highlighted (areas other than the person 61), the already set imaging conditions are maintained.
  • Instead of highlighting the outline of the target area, the control unit 34 may display the entire target area brightly, increase the contrast of the entire target area, or display the entire target area blinking.
  • the target area may be surrounded by a frame.
  • the display of the frame surrounding the target area may be a double frame or a single frame, and the display mode such as the line type, color, and brightness of the surrounding frame may be appropriately changed.
  • the control unit 34 may display an indication of an area for which an imaging condition is set, such as an arrow, in the vicinity of the target area.
  • the control unit 34 may darkly display a region other than the target region for which the imaging condition is set (changed), or may display a low contrast other than the target region.
  • When a recording button (not shown) constituting the operation member 36, or a display instructing recording start (for example, the release icon 74 in FIG. 9), is operated, the control unit 34 causes the recording unit 37 to start recording the recording image 51.
  • As the imaging conditions of the recording image 51, different imaging conditions may be applied to each of the above divided areas (person 61, car 62, bag 63, mountain 64, cloud 65, cloud 66), or a common imaging condition may be applied to all of the areas.
  • the image processing unit 33 performs image processing on the image data acquired by the imaging unit 32. Image processing can also be performed under different image processing conditions for each of the divided areas.
  • The control unit 34 sets the first to sixth conditions as the imaging conditions for the areas divided as described above.
  • The area of the person 61 for which the first condition is set is called the first region 61, the area of the automobile 62 for which the second condition is set is called the second region 62, the area of the bag 63 for which the third condition is set is called the third region 63, the area of the mountain 64 for which the fourth condition is set is called the fourth region 64, the area of the cloud 65 for which the fifth condition is set is called the fifth region 65, and the area of the cloud 66 for which the sixth condition is set is called the sixth region 66.
  • the image data of the recording image 51 set and acquired in this way is recorded by the recording unit 37.
  • When a recording button (not shown) constituting the operation member 36 is operated again, or a display instructing the end of recording is operated, the control unit 34 ends the recording of the recording image 51 by the recording unit 37.
  • FIG. 10 is a diagram illustrating the imaging timing of the recording image 51 (Dv1, Dv2, Dv3, ...), the imaging timing of the detection image 52 (Di, Dii, Diii, Div, ...), and the display timing of the monitor images (LV1, LV2, LV3, ...).
  • the imaging control unit 34c causes the imaging unit 32 to capture the recording image 51 and the detection image 52 under the imaging conditions set by the setting unit 34b.
  • the imaging control unit 34c performs imaging for the recording image 51 in the first imaging area B1 of the imaging element 32a.
  • the image sensor 32a sequentially captures the first frame recording image Dv1, the second frame recording image Dv2, the third frame recording image Dv3,.
  • the imaging of the recording image 51 is repeated by the first imaging area B1 of the imaging element 32a.
  • the imaging control unit 34c causes the second imaging region B2 of the imaging element 32a to perform imaging for the detection image 52 four times during one frame period of imaging of the recording image 51.
  • the image sensor 32a sequentially captures the detection image Di for the first frame, the detection image Dii for the second frame, the detection image Diii for the third frame, and the detection image Div for the fourth frame.
  • four frames of detection images Di, Dii, Diii, and Div are captured as the detection image 52 for one frame period of the recording image 51.
  • the detection image 52 of 4 frames is captured by the second imaging region B2 in parallel with the capturing of the recording image 51 of 1 frame by the first imaging region B1.
  • the detection image Di is, for example, for subject detection
  • the detection image Dii is, for example, for focus detection
  • The detection image Diii is used, for example, for imaging condition setting, and the detection image Div is used, for example, for image generation.
  • The control unit 34 causes the display unit 35 to display the monitor image LV1 of the first frame based on the recording image Dv1 of the first frame. The imaging control unit 34c then controls the imaging unit 32 so that the subject position detected based on the detection images Di, Dii, Diii, and Div captured in parallel with the recording image Dv1 of the first frame, and the imaging conditions determined from them, are reflected in the imaging of the recording image Dv2 of the second frame.
  • The control unit 34 causes the display unit 35 to display the monitor image LV2 of the second frame based on the recording image Dv2 of the second frame. The imaging control unit 34c then controls the imaging unit 32 so that the subject position detected based on the detection images Di, Dii, Diii, and Div captured in parallel with the recording image Dv2 of the second frame, and the determined imaging conditions, are reflected in the imaging of the recording image Dv3 of the third frame.
  • the control unit 34 repeats the same processing thereafter.
  • the control unit 34 causes the recording unit 37 to record the image data of the recording image 51 (recording image Dv3 and later) captured after that time.
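  • A schematic sketch (not actual firmware) of the timing in FIG. 10: four detection frames are captured in region B2 during each recording frame in region B1, the monitor image is displayed from the recording frame, and the detection results are reflected in the next recording frame; all function bodies are placeholders.

```python
def capture_recording_frame(settings):
    """Placeholder for imaging one recording frame in the first imaging region B1."""
    return {"frame": settings["frame"], "settings": dict(settings)}

def capture_detection_frames():
    """Placeholder for the four detection frames Di, Dii, Diii, Div in region B2."""
    return {"subject": "detected", "focus": "calculated",
            "conditions": {"gain": 2.0}}

settings = {"frame": 1, "gain": 1.0}
for n in range(1, 4):                          # recording frames Dv1, Dv2, Dv3, ...
    settings["frame"] = n
    recording = capture_recording_frame(settings)   # imaging of Dvn in region B1
    detection = capture_detection_frames()          # Di..Div captured in parallel in B2
    print(f"display monitor image LV{n} from recording frame Dv{n}")
    # Results from this frame's detection images are reflected in the next recording frame.
    settings.update(detection["conditions"])
```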
  • The imaging control unit 34c may capture the detection image Di, the detection image Dii, the detection image Diii, and the detection image Div in the second imaging region B2 of the imaging element 32a before starting the imaging of the recording image Dv1 of the first frame in the first imaging region B1 of the imaging element 32a. For example, after the user turns on the camera 1, the detection images Di, Dii, Diii, and Div are captured in the second imaging region B2 before the imaging of the recording image Dv1 of the first frame is started in the first imaging region B1.
  • The imaging control unit 34c controls the imaging unit 32 so that the subject position detected based on the detection images Di, Dii, Diii, and Div and the determined imaging conditions are reflected in the imaging of the recording image Dv1 of the first frame.
  • Alternatively, the imaging control unit 34c may not start imaging the detection images Di, Dii, Diii, and Div automatically, but may start imaging them after waiting for one of the following operations.
  • When an operation related to imaging is performed, the imaging control unit 34c starts imaging of the detection images Di, Dii, Diii, and Div in the second imaging region B2 of the imaging element 32a. Operations related to imaging include, for example, an operation for changing the imaging magnification, an operation for changing the aperture, and an operation related to focus adjustment (for example, selection of a focus point).
  • In this way, the imaging control unit 34c captures the detection images Di, Dii, Diii, and Div in the second imaging region B2 of the imaging element 32a, so that the detection images Di, Dii, Diii, and Div can be generated before the imaging of the recording image Dv1 of the first frame is performed.
  • When an operation related to imaging is performed from the menu screen, the imaging control unit 34c may capture the detection images Di, Dii, Diii, and Div in the second imaging region B2 of the imaging element 32a. This is because there is a high possibility that a new setting will be made when an operation related to imaging is performed from the menu screen. In this case, the imaging control unit 34c repeatedly captures the detection images Di, Dii, Diii, and Div in the second imaging region B2 of the imaging element 32a during the period in which the user is operating from the menu screen.
  • When the detection images Di, Dii, Diii, and Div are captured before the imaging of the recording image Dv1 of the first frame is started, the recording image Dv1 can be captured reflecting the subject position detected based on the detection images Di, Dii, Diii, and Div and the determined imaging conditions, and the monitor image LV1 can be displayed based on the recording image Dv1.
  • When an operation not related to imaging is performed, the imaging control unit 34c need not capture the detection images Di, Dii, Diii, and Div in the second imaging region B2 of the imaging element 32a. This is so that the detection images Di, Dii, Diii, and Div are not captured wastefully when there is little possibility that new settings for imaging will be made.
  • When a dedicated button is operated, the imaging control unit 34c may capture the detection images Di, Dii, Diii, and Div in the second imaging region B2 of the imaging element 32a. The imaging control unit 34c may capture the detection images Di, Dii, Diii, and Div in the second imaging region B2 of the imaging element 32a at every predetermined period while the dedicated button is being operated, or may capture them in the second imaging region B2 of the imaging element 32a when the operation of the dedicated button ends. As a result, the detection images Di, Dii, Diii, and Div can be captured at a timing desired by the user.
  • <Image blur correction> The recording image 51 read out from the first imaging region B1 of the imaging element 32a is trimmed based on the shake of the camera 1 caused by camera shake, thereby reducing image blur in the recorded recording image 51. Such suppression of image blur is also referred to as image blur correction. Details of the image blur correction are described below.
  • image blur generated in the camera 1 due to camera shake is divided into image blur (also referred to as angle blur) accompanying the rotational movement of the camera 1 and image blur (also referred to as translation blur) accompanying the translational movement of the camera 1.
  • the control unit 34 calculates image blur due to rotational movement of the camera 1 and image blur due to translational movement of the camera 1.
  • FIG. 11 is a diagram illustrating the control unit 34 that functions as a shake correction unit.
  • the control unit 34 includes a shake amount calculation unit 34e and a target movement amount calculation unit 34f.
  • the shake amount calculation unit 34e calculates an image shake in the Y-axis direction due to the rotational motion using a detection signal around the axis (Pitch direction) parallel to the X axis (FIG. 3) by the shake sensor 39. Further, the shake amount calculation unit 34e calculates an image shake in the X-axis direction due to the rotational motion using a detection signal around the axis (Yaw direction) parallel to the Y-axis (FIG. 3) by the shake sensor 39.
  • the shake amount calculation unit 34e further calculates an image shake in the X-axis direction due to translational motion using a detection signal in the X-axis direction by the shake sensor 39. Furthermore, the shake amount calculation unit 34e calculates the image shake in the Y-axis direction due to translational motion using the detection signal in the Y-axis direction from the shake sensor 39.
  • the target movement amount calculation unit 34f adds, for each axis, the image blur in the X-axis and Y-axis directions due to the rotational motion and the image blur in the X-axis and Y-axis directions due to the translational motion calculated by the blur amount calculation unit 34e, and thereby obtains the total image blur in the X-axis and Y-axis directions. For example, when the image blur due to the rotational motion and the image blur due to the translational motion calculated for the same axis have the same direction, the image blur increases by the addition, whereas when the two directions differ, the image blur decreases by the addition. In this way, the addition is performed with a positive or negative sign according to the image blur direction on each axis.
  • the target movement amount calculation unit 34f calculates the image blur amount on the image plane (the imaging surface of the imaging element 32a) with respect to the subject image, based on the image blur in the X-axis and Y-axis directions after the addition, the photographing magnification (calculated based on the position of the zoom lens of the imaging optical system 31), and the distance from the camera 1 to the subject (calculated based on the position of the focus lens of the imaging optical system 31).
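  • As a minimal sketch of this calculation (the patent gives no explicit formulas, so the small-angle relations, parameter names, and units below are assumptions), the signed per-axis combination and the conversion to an image-plane blur amount could look as follows in Python:

import math

def image_plane_blur(rot_blur_rad, trans_blur_mm, focal_length_mm, magnification):
    # Angle blur maps to the image plane roughly as f * tan(theta).
    angle_component = focal_length_mm * math.tan(rot_blur_rad)
    # Translation blur maps to the image plane via the imaging magnification.
    translation_component = magnification * trans_blur_mm
    # Signed addition: same directions add up, opposite directions cancel.
    return angle_component + translation_component

# Example: blur on the X axis from a small yaw rotation plus a small camera shift.
blur_x_mm = image_plane_blur(rot_blur_rad=0.001, trans_blur_mm=0.2,
                             focal_length_mm=50.0, magnification=0.05)
print(f"image-plane blur on X: {blur_x_mm:.3f} mm")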
  • the control unit 34 sets a trimming range W1 smaller than the recording image 51 in the recording image 51.
  • the size of the trimming range W1 is, for example, 60% of the size of the recording image 51.
  • the size of the trimming range W1 may be determined by the average size of the detection signal from the shake sensor 39.
  • the control unit 34 decreases the trimming range W1 as the average value of the detection signal from the shake sensor 39 in the most recent predetermined time increases, and increases the trimming range W1 as the average value of the detection signal from the shake sensor 39 decreases.
  • the control unit 34 performs image blur correction on the recording image 51 by moving the trimming range W1 set as described above within the recording image 51 in the direction opposite to the shake direction of the camera 1. Therefore, the target movement amount calculation unit 34f calculates, as the target movement amount, the movement direction and movement amount of the trimming range W1 necessary to cancel the above-described image blur amount on the image plane, and outputs the calculated target movement amount to the correction unit 33b.
  • FIG. 12B illustrates a state in which the trimming range W1 is moved in the recording image 51. That is, since the camera 1 is shaken upward from the state of FIG. 12A and the subject image in the recording image 51 is blurred downward, the trimming range W1 is moved downward by the correction unit 33b. By such image blur correction, even if the position of the subject in the recording image 51 is moved due to camera shake, the subject stays at substantially the same position in the trimming range W1.
  • the area hatched in FIGS. 12A and 12B corresponds to the movement allowance of the trimming range W1.
  • as with the image 51-1 and the image 51-2 illustrated in FIGS. 12A and 12B, the correction unit 33b extracts the image data corresponding to the trimming range W1 from the image data of the recording image 51 and uses the extracted image data as the image data of the recording image 51 after image blur correction.
  • the control unit 34 performs such image blur correction for each frame constituting the recording image 51.
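  • A minimal sketch of this trimming-based correction is shown below (the window size, coordinate convention, and helper names are assumptions; the target shift is taken as already computed by the target movement amount calculation unit 34f and already points opposite to the shake direction):

def move_trimming_range(top_left, size, image_size, target_shift):
    """Move the trimming window by the target shift, clamped to the image bounds."""
    x, y = top_left
    w, h = size
    img_w, img_h = image_size
    # Clamp so the window stays inside the image; it cannot move past the border.
    new_x = min(max(x + target_shift[0], 0), img_w - w)
    new_y = min(max(y + target_shift[1], 0), img_h - h)
    return (new_x, new_y)

def crop(image, top_left, size):
    """Extract the image data corresponding to the trimming range (image as list of rows)."""
    x, y = top_left
    w, h = size
    return [row[x:x + w] for row in image[y:y + h]]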
  • the control unit 34 performs image blur correction similar to that for the recording image 51 not only on the recording image 51 but also on the detection image 52.
  • the control unit 34 also moves a later-described attention area set for the detection image 52 in the detection image 52. This will be described in detail below with reference to FIG.
  • FIG. 13A is a diagram explaining image blur correction for the recording image 51, and FIG. 13B is a diagram explaining image blur correction for the detection image 52 corresponding to the recording image 51 of FIG. 13A.
  • the control unit 34 sets a trimming range W2 smaller than the detection image 52 in the detection image 52, as for the corresponding recording image 51.
  • the size of the trimming range W2 is set to 60% of the size of the detection image 52 as in the case of the recording image 51.
  • the size of the trimming range W2 may be determined by the average magnitude of the detection signal from the shake sensor 39, as in the case of the recording image 51.
  • the control unit 34 performs image blur correction on the detection image 52 by moving the trimming range W2 set as described above within the detection image 52 in the direction opposite to the shake direction of the camera 1.
  • as with the image 52-2 illustrated in FIG. 13B, the correction unit 33b extracts the image data corresponding to the trimming range W2 from the image data of the detection image 52 and uses the extracted image data as the image data of the detection image 52 after image blur correction.
  • the control unit 34 also moves the attention range 61-2 set for the detection image 52 in the direction opposite to the shake direction of the camera 1 together with the trimming range W2 in the detection image 52.
  • the distance between the trimming range W2 before the movement and the trimming range W2 after the movement is the same as the distance between the attention range 61-2 before the movement and the attention range 61-2 after the movement. The distance between the trimming range W2 before the movement and the trimming range W2 after the movement is, for example, the distance between the center of gravity of the trimming range W2 before the movement and the center of gravity of the trimming range W2 after the movement.
  • the distance corresponds to the amount of movement of the trimming range W2. The same applies to the distance between the attention range 61-2 before the movement and the attention range 61-2 after the movement.
  • the attention range 61-2 corresponds to a region of the detection image 52 in which the object detection unit 34a detects subject elements, a region of the detection image 52 in which the lens movement control unit 34d performs the focus detection process, a region of the detection image 52 in which the setting unit 34b performs the exposure calculation process, and a region of the detection image 52 referred to by the image processing unit 33 during the image generation process.
  • the shape of the attention range 61-2 may be a square or a rectangle, or may be a circle or an ellipse. The size of the attention range 61-2 may be changed as appropriate according to the size of the region for detecting subject elements, the region for performing the focus detection process, the region for performing the exposure calculation process, and the region referred to during the image generation process. The attention range 61-2 after the movement may partially overlap the attention range 61-2 before the movement, may include the attention range 61-2 before the movement, or may not overlap it at all.
  • in the second imaging condition for obtaining the detection image 52, the control unit 34 can set different imaging conditions for the attention range 61-2 and for the other region of the second imaging region B2.
  • the control unit 34 changes the setting of the second imaging condition as the attention range 61-2 moves. That is, the second imaging condition for the second imaging region B2 is reset according to the position of the attention range 61-2 after movement.
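  • A minimal sketch of such per-block resetting (the block geometry, the condition objects, and the overlap test are assumptions introduced for illustration):

def reset_second_imaging_condition(blocks, attention_rect,
                                   inside_condition, outside_condition):
    """Assign conditions per block after the attention range has moved.

    blocks         : dict of block id -> (x, y, w, h) rectangle on the sensor.
    attention_rect : (x, y, w, h) of the moved attention range.
    """
    ax, ay, aw, ah = attention_rect
    conditions = {}
    for block_id, (bx, by, bw, bh) in blocks.items():
        overlaps = not (bx + bw <= ax or ax + aw <= bx or
                        by + bh <= ay or ay + ah <= by)
        conditions[block_id] = inside_condition if overlaps else outside_condition
    return conditions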
  • FIG. 13C is a diagram for explaining image blur correction for the recording image 51, and illustrates a state in which the trimming range W1 is moved to the lower limit in the recording image 51.
  • FIG. 13D is a diagram for explaining image blur correction for the detection image 52, and illustrates a state in which the trimming range W2 is moved to the lower limit in the detection image 52. Since the trimming range W1 in FIG. 13C moves within the recording image 51, it cannot move below the position shown in FIG. 13C. Similarly, since the trimming range W2 in FIG. 13D moves within the detection image 52, it cannot move below the position shown in FIG. 13D.
  • in this case, as shown in FIG. 13D, the control unit 34 continues to move the attention range 61-3 within the trimming range W2 in the direction opposite to the shake direction of the camera 1.
  • the moving direction and moving amount of the attention range 61-3 are the target moving amounts output from the target moving amount calculating unit 34f to the correcting unit 33b.
  • when the subject of interest (for example, the head of a person) moves further downward, the control unit 34 moves the attention range 61-3 within the trimming range W2. This is so that the head of the person falls within the attention range 61-3 after the movement.
  • the distance between the trimming range W2 before the movement and the trimming range W2 after the movement is different from the distance between the attention range 61-3 before the movement and the attention range 61-3 after the movement.
  • the distance between the trimming range W2 before the movement and the trimming range W2 after the movement is, for example, the distance between the center of gravity of the trimming range W2 before the movement and the center of gravity of the trimming range W2 after the movement.
  • the distance here corresponds to the amount of movement of the trimming range W2.
  • as a result, the same subject (in this example, the head of the person) exists in the region of the detection image 52 in which the object detection unit 34a detects subject elements, the region of the detection image 52 in which the lens movement control unit 34d performs the focus detection process, the region of the detection image 52 in which the setting unit 34b performs the exposure calculation process, and the region of the detection image 52 referred to by the image processing unit 33 during the image generation process.
  • the detection image 52 imaged as described above and subjected to image blur correction is used for image processing, focus detection processing, subject detection processing, and exposure condition setting processing.
  • the image processing unit 33 (the generation unit 33c) performs image processing using a kernel having a predetermined size centered on the target pixel P (processing target pixel) in the image data of the detection image 52.
  • FIG. 14 is a diagram illustrating a kernel and corresponds to the attention region 90 in the monitor image 60a. In FIG. 14, the pixels around the target pixel P (eight pixels in this example) included in the target region 90 (for example, 3 × 3 pixels) centered on the target pixel P are set as reference pixels Pr1 to Pr8.
  • the position of the target pixel P is the target position, and the positions of the reference pixels Pr1 to Pr8 surrounding the target pixel P are reference positions.
  • the pixel defect correction process is one of image processes performed on a captured image.
  • the image pickup element 32a which is a solid-state image pickup element, may produce pixel defects in the manufacturing process or after manufacturing, and output abnormal level image data. Therefore, the generation unit 33c of the image processing unit 33 corrects the image data output from the pixel in which the pixel defect has occurred, thereby making the image data in the pixel position in which the pixel defect has occurred inconspicuous.
  • the generation unit 33c of the image processing unit 33 takes, for example, a pixel at the position of a pixel defect recorded in advance in a nonvolatile memory (not shown) as the target pixel P (processing target pixel) in an image of one frame, and sets the pixels around the target pixel P included in the target region 90 centered on the target pixel P as the reference pixels Pr1 to Pr8.
  • the generation unit 33c of the image processing unit 33 calculates the maximum value and the minimum value of the image data in the reference pixels Pr1 to Pr8 and, when the image data output from the target pixel P exceeds the maximum value or falls below the minimum value, performs Max/Min filter processing that replaces the image data output from the target pixel P with that maximum value or minimum value. Such processing is performed for all pixel defects whose position information is recorded in the nonvolatile memory (not shown).
  • the generation unit 33c of the image processing unit 33 performs the above-described pixel defect correction processing on the image data of the detection image 52, so that the influence of pixel defects on the subject detection process, the focus detection process, and the exposure calculation process based on the detection image 52 can be prevented.
  • the pixel defect correction processing is not limited to the detection image 52, and may be performed on the recording image 51.
  • the generation unit 33c of the image processing unit 33 performs the above-described pixel defect correction processing on the image data of the recording image 51, thereby preventing the influence of pixel defects in the recording image 51.
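  • A minimal sketch of such Max/Min clamping (the 3 × 3 neighbourhood, the array layout, and the list of defect positions are assumptions for illustration):

import numpy as np

def correct_pixel_defects(image, defect_positions):
    """Clamp each defective pixel to the max/min of its surrounding reference pixels."""
    corrected = image.copy()
    h, w = image.shape
    for y, x in defect_positions:
        y0, y1 = max(y - 1, 0), min(y + 2, h)
        x0, x1 = max(x - 1, 0), min(x + 2, w)
        neighbourhood = image[y0:y1, x0:x1].astype(np.int64).flatten()
        # Exclude the target pixel P itself; the remaining pixels are the reference pixels.
        ref = np.delete(neighbourhood, (y - y0) * (x1 - x0) + (x - x0))
        corrected[y, x] = np.clip(image[y, x], ref.min(), ref.max())
    return corrected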
  • Color interpolation processing is one of the image processes performed on a captured image. As illustrated in FIG. 3, in the imaging chip 111 of the imaging element 100, green pixels Gb and Gr, blue pixels B, and red pixels R are arranged in a Bayer array. Since image data of the color components other than the color component of the color filter F arranged at each pixel position is lacking at that position, the generation unit 33c of the image processing unit 33 performs a known color interpolation process to generate the image data of the missing color components with reference to the image data of surrounding pixel positions.
  • the generation unit 33c of the image processing unit 33 performs the color interpolation processing on the image data of the detection image 52, so that the subject detection process, the focus detection process, and the exposure calculation process can be performed appropriately based on the color-interpolated detection image 52. Note that the color interpolation process may be performed not only on the detection image 52 but also on the recording image 51.
  • the generation unit 33c of the image processing unit 33 performs the above-described color interpolation processing on the image data of the recording image 51, whereby the recording image 51 subjected to color interpolation can be obtained.
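  • As a minimal sketch of one step of such a known color interpolation (an RGGB Bayer layout and simple bilinear averaging are assumed; the red and blue planes are interpolated analogously and image borders are skipped):

import numpy as np

def interpolate_green(raw):
    """Fill in the green value at R/B sites of an RGGB Bayer frame by averaging
    the four green neighbours; green sample sites are left untouched."""
    h, w = raw.shape
    green = raw.astype(np.float64).copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if (y + x) % 2 == 0:  # R or B site in an RGGB layout
                green[y, x] = (green[y - 1, x] + green[y + 1, x] +
                               green[y, x - 1] + green[y, x + 1]) / 4.0
    return green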
  • the outline enhancement process is one of image processes performed on a captured image.
  • the generation unit 33c of the image processing unit 33 performs, for example, a known linear filter calculation using a kernel of a predetermined size centered on the pixel of interest P (processing target pixel) in an image of one frame.
  • when the kernel size of the sharpening filter, which is an example of the linear filter, is N × N pixels, the position of the target pixel P is the target position, and the positions of the (N² − 1) reference pixels Pr surrounding the target pixel P are the reference positions.
  • the kernel size may be N ⁇ M pixels.
  • the generation unit 33c of the image processing unit 33 performs a filter process that replaces the image data of the target pixel P with the result of the linear filter calculation on each horizontal line, for example, from the upper horizontal line to the lower horizontal line of the frame image, while shifting the target pixel from left to right.
  • the generation unit 33c of the image processing unit 33 performs the above-described contour enhancement processing on the image data of the detection image 52, so that the subject detection process, the focus detection process, and the exposure calculation process can be performed appropriately based on the contour-enhanced detection image 52.
  • the contour enhancement processing is not limited to the detection image 52 but may be performed on the recording image 51.
  • the generation unit 33c of the image processing unit 33 performs the above-described contour enhancement processing on the image data of the recording image 51, whereby the recording image 51 with the contour enhanced can be obtained.
  • the strength of contour emphasis may be different between the case where it is performed on the image data of the detection image 52 and the case where it is performed on the image data of the recording image 51.
  • Noise reduction processing is one type of image processing performed on a captured image.
  • the generation unit 33c of the image processing unit 33 performs, for example, a known linear filter calculation using a kernel of a predetermined size centered on the pixel of interest P (processing target pixel) in an image of one frame.
  • when the kernel size of the smoothing filter, which is an example of the linear filter, is N × N pixels, the position of the target pixel P is the target position, and the positions of the (N² − 1) reference pixels Pr surrounding the target pixel P are the reference positions.
  • the kernel size may be N ⁇ M pixels.
  • the generation unit 33c of the image processing unit 33 performs a filter process that replaces the image data of the target pixel P with the result of the linear filter calculation on each horizontal line, for example, from the upper horizontal line to the lower horizontal line of the frame image, while shifting the target pixel from left to right.
  • the generation unit 33c of the image processing unit 33 performs the above-described noise reduction processing on the image data of the detection image 52, so that the influence of noise on the subject detection process, the focus detection process, and the exposure calculation process based on the detection image 52 can be prevented.
  • the noise reduction process may be performed not only on the detection image 52 but also on the recording image 51.
  • the generation unit 33c of the image processing unit 33 performs the above-described noise reduction processing on the image data of the recording image 51, whereby the recording image 51 with reduced noise can be obtained.
  • the degree of noise reduction may be different between the case where it is performed on the image data of the detection image 52 and the case where it is performed on the image data of the recording image 51.
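  • A minimal sketch of the linear filter calculation used for both the contour enhancement and the noise reduction above (the kernel coefficients are common textbook examples rather than values from this document, and image borders are left unprocessed):

import numpy as np

def apply_linear_filter(image, kernel):
    """Replace each target pixel P with the weighted sum over its N x N reference pixels,
    scanning horizontal lines from top to bottom and pixels from left to right."""
    n = kernel.shape[0]
    r = n // 2
    out = image.astype(np.float64).copy()
    h, w = image.shape
    for y in range(r, h - r):
        for x in range(r, w - r):
            window = image[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)
            out[y, x] = np.sum(window * kernel)
    return out

sharpen_kernel = np.array([[0, -1, 0],
                           [-1, 5, -1],
                           [0, -1, 0]], dtype=np.float64)   # contour enhancement
smooth_kernel = np.full((3, 3), 1.0 / 9.0)                  # noise reduction (averaging)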
  • the lens movement control unit 34d of the control unit 34 performs focus detection processing using signal data (image data) corresponding to a predetermined position (focus point) on the imaging screen.
  • the lens movement control unit 34d of the control unit 34 detects the image shift amounts (phase differences) of a plurality of subject images formed by light beams that have passed through different pupil regions of the imaging optical system 31, and thereby calculates the defocus amount of the imaging optical system 31.
  • the lens movement control unit 34d of the control unit 34 adjusts the focus of the imaging optical system 31 by moving the focus lens of the imaging optical system 31 to a position where the defocus amount is zero (or equal to or less than an allowable value), that is, to the in-focus position.
  • FIG. 15 is a diagram illustrating the position of the focus detection pixel on the imaging surface of the imaging device 32a.
  • focus detection pixels are discretely arranged along the X-axis direction (horizontal direction) of the imaging chip 111.
  • fifteen focus detection pixel lines 160 are provided at predetermined intervals.
  • the focus detection pixels constituting the focus detection pixel line 160 output signals for focus detection as well as image data of the recording image 51 and the detection image 52.
  • normal imaging pixels are provided at pixel positions other than the focus detection pixel line 160.
  • the imaging pixels output image data of the recording image 51 and the detection image 52.
  • FIG. 16 is an enlarged view of a part of the focus detection pixel line 160 corresponding to the focus point 80A shown in FIG.
  • a red pixel R, a green pixel G (Gb, Gr), a blue pixel B, and a focus detection pixel are illustrated.
  • the red pixel R, the green pixel G (Gb, Gr), and the blue pixel B are arranged according to the rules of the Bayer arrangement described above.
  • the square area illustrated inside (behind) the microlens L and a color filter (not shown) is a photoelectric conversion unit of the imaging pixel.
  • Each imaging pixel receives a light beam passing through the exit pupil of the imaging optical system 31 (FIG. 1). That is, the red pixel R, the green pixel G (Gb, Gr), and the blue pixel B each have a square-shaped mask opening, and light passing through these mask openings reaches the photoelectric conversion unit of the imaging pixel.
  • the shape of the photoelectric conversion part (mask opening part) of the red pixel R, the green pixel G (Gb, Gr), and the blue pixel B is not limited to a quadrangle, and may be, for example, a circle.
  • the focus detection pixel has two photoelectric conversion units S1 and S2 inside (behind) a microlens L and a color filter (not shown).
  • a first photoelectric conversion unit S1 disposed on the left side of the pixel position and a second photoelectric conversion unit S2 disposed on the right side of the pixel position are included.
  • a first light beam passing through a first region of the exit pupil of the imaging optical system 31 (FIG. 1) is incident on the first photoelectric conversion unit S1, and a second light beam passing through a second region of the exit pupil of the imaging optical system 31 (FIG. 1) is incident on the second photoelectric conversion unit S2.
  • a photoelectric conversion unit and a readout circuit 105 that reads out a photoelectric conversion signal from the photoelectric conversion unit are referred to as “pixels”.
  • the read circuit 105 will be described with an example including a transfer transistor (TX), an amplification transistor (AMP), a reset transistor (RST), and a selection transistor (SEL).
  • however, the configuration of the readout circuit 105 is not necessarily limited to this example.
  • the position of the focus detection pixel line 160 in the imaging chip 111 is not limited to the position illustrated in FIG. Also, the number of focus detection pixel lines 160 is not limited to the example of FIG. For example, focus detection pixels may be arranged at all pixel positions.
  • the focus detection pixel line 160 in the imaging chip 111 may be a line in which focus detection pixels are arranged along the Y-axis direction (vertical direction) of the imaging chip 111.
  • An imaging element in which imaging pixels and focus detection pixels are two-dimensionally arranged as shown in FIG. 16 is known, and detailed illustration and description of these pixels are omitted.
  • the focus detection pixel may be configured to receive one of the first and second light beams.
  • the lens movement control unit 34d of the control unit 34 detects, based on the focus detection photoelectric conversion signals output from the photoelectric conversion units S1 and S2 of the focus detection pixels, the image shift amount (phase difference) between the pair of images formed by the pair of light beams that pass through different regions of the imaging optical system 31 (FIG. 1). The defocus amount is then calculated based on the image shift amount (phase difference). Such defocus amount calculation by the pupil division phase difference method is well known in the field of cameras, and thus a detailed description thereof is omitted.
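  • A minimal sketch of such an image-shift (phase-difference) estimate and its conversion to a defocus amount (the SAD search and the conversion factor k_conversion are assumptions; the actual conversion depends on the sensor and optics and is treated as known in the text):

import numpy as np

def phase_difference_defocus(signal_s1, signal_s2, max_shift, k_conversion):
    """Find the relative shift that best aligns the S1 and S2 signals (assumed equal length),
    then scale it by an assumed sensor/optics-dependent factor to get a defocus amount."""
    s1 = np.asarray(signal_s1, dtype=np.float64)
    s2 = np.asarray(signal_s2, dtype=np.float64)
    best_shift, best_cost = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        if shift >= 0:
            a, b = s1[shift:], s2[:len(s2) - shift]
        else:
            a, b = s1[:shift], s2[-shift:]
        cost = np.mean(np.abs(a - b))       # normalised sum of absolute differences
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    return best_shift * k_conversion         # signed defocus amount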
  • FIG. 17 is an enlarged view of the focus point 80A.
  • the position surrounded by the frame 170 corresponds to the focus detection pixel line 160 (FIG. 15).
  • the lens movement control unit 34d of the control unit 34 performs focus detection processing using signal data from the focus detection pixels indicated by the frame 170 in the detection image 52.
  • the lens movement control unit 34d sets the attention ranges 61-2 and 61-3 in the detection image 52 of FIGS. 13B and 13D in correspondence with the range indicated by the frame 170 in FIG. 17.
  • the lens movement control unit 34d can appropriately perform the focus detection process by using the signal data of the focus detection pixels of the detection image 52 that has undergone the image blur correction process. Further, the focus detection process can be performed appropriately by using, for example, the signal data of the focus detection pixels of a detection image 52 for which the gain is set high or for which image processing suitable for detecting the image shift amount (phase difference) has been performed.
  • in the above description, the focus detection process using the pupil division phase difference method is exemplified, but a contrast detection method, in which the focus lens of the imaging optical system 31 is moved to the in-focus position based on the contrast of the subject image, may be performed as follows.
  • the control unit 34 moves the focus lens of the imaging optical system 31 and, at each position of the focus lens, performs a known focus evaluation value calculation based on the signal data output from the imaging pixels in the second imaging region B2 of the imaging element 32a corresponding to the focus point. The position of the focus lens that maximizes the focus evaluation value is then obtained as the in-focus position. That is, the lens movement control unit 34d of the control unit 34 performs the focus evaluation value calculation using the signal data of the imaging pixels corresponding to the focus point 80A in the detection image 52.
  • the lens movement control unit 34d sets the attention ranges 61-2 and 61-3 in the detection image 52 of FIGS. 13B and 13D in correspondence with the focus point.
  • the lens movement control unit 34d can appropriately perform the focus detection process by using the signal data of the detection image 52 that has undergone the image blur correction process. Further, the focus detection process can be performed appropriately by using, for example, the signal data of a detection image 52 for which the gain is set high or for which image processing suitable for the detection has been performed.
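  • A minimal sketch of such a contrast-detection scan (the focus evaluation metric and the capture_region_at helper are hypothetical, introduced only for illustration):

import numpy as np

def focus_evaluation_value(region):
    """A common contrast measure: sum of absolute horizontal pixel differences."""
    region = np.asarray(region, dtype=np.float64)
    return float(np.sum(np.abs(np.diff(region, axis=1))))

def contrast_af(capture_region_at, lens_positions):
    """Evaluate the contrast at each focus lens position and return the best position.
    capture_region_at(pos) is assumed to return the detection-image region
    corresponding to the focus point with the lens at position pos."""
    values = {pos: focus_evaluation_value(capture_region_at(pos))
              for pos in lens_positions}
    return max(values, key=values.get)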
  • FIG. 18A is a diagram illustrating a template image representing an object to be detected
  • FIG. 18B is a diagram illustrating a monitor image 60a and a search range 190.
  • the object detection unit 34a of the control unit 34 detects an object (for example, the bag 63 which is one of the subject elements in FIG. 9) from the monitor image 60a.
  • the object detection unit 34a of the control unit 34 may set the entire range of the monitor image 60a as the range in which the object is detected, but may use a part of the monitor image 60a as the search range 190 in order to reduce the detection processing.
  • the object detection unit 34a of the control unit 34 sets the search range 190 in the vicinity of the region including the person 61. Note that the region including the person 61 may itself be set as the search range.
  • the object detection unit 34a of the control unit 34 performs the subject detection process using the image data constituting the search range 190 in the detection image 52.
  • the object detection unit 34a sets the attention ranges 61-2 and 61-3 in the detection image 52 in FIGS. 13B and 13D in correspondence with the search range 190 in FIG.
  • the object detection unit 34a can appropriately perform the subject detection process by using the image data of the detection image 52 subjected to the image blur correction process. Further, for example, by using the image data of the detection image 52 in which the gain is set high or image processing suitable for detection of the subject element is used, subject detection processing can be performed appropriately.
  • the subject detection process is not limited to the detection image 52 but may be performed on the recording image 51.
  • Exposure calculation processing is performed using image data constituting the photometric range.
  • the setting unit 34b sets the imaging condition based on the exposure calculation result as follows. For example, when an overexposure or underexposure occurs in an area including a subject having the maximum luminance or the minimum luminance in the image, the setting unit 34b sets an imaging condition so as to eliminate overexposure or underexposure.
  • the setting unit 34b of the control unit 34 performs the exposure calculation process using the image data constituting the photometric range in the detection image 52.
  • the setting unit 34b sets the attention ranges 61-2 and 61-3 in the detection image 52 in FIGS. 13B and 13D in correspondence with the photometric range.
  • the setting unit 34b can appropriately perform the exposure calculation process by using the image data of the detection image 52 that has undergone the image blur correction process. Further, the exposure calculation process can be performed appropriately by using, for example, the image data of a detection image 52 for which the gain is set low.
  • the same applies not only to the photometric range used when performing the exposure calculation process described above, but also to the photometric (colorimetric) range used when determining the white balance adjustment value and to the photometric range used when determining whether emission of auxiliary photographing light by a light source that emits auxiliary photographing light is necessary.
  • the process for setting the imaging conditions is not limited to the detection image 52 but may be performed on the recording image 51.
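  • A minimal sketch of an exposure adjustment driven by such a photometric range (the clipping thresholds, the 1% tolerance, and the adjustment step are assumed values for illustration, not values from this document):

import numpy as np

def adjust_exposure(photometric_region, exposure_time,
                    high_clip=250, low_clip=5, step=1.25):
    """Shorten the exposure if too many pixels are overexposed, lengthen it if
    too many pixels are underexposed, otherwise keep it unchanged."""
    region = np.asarray(photometric_region)
    if np.mean(region >= high_clip) > 0.01:   # whiteout in the photometric range
        return exposure_time / step
    if np.mean(region <= low_clip) > 0.01:    # blackout in the photometric range
        return exposure_time * step
    return exposure_time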
  • FIG. 19 is a flowchart for explaining the flow of processing of the camera 1 that captures an image by setting an imaging condition for each region.
  • the control unit 34 activates a program that executes the process shown in FIG. 19.
  • the control unit 34 sets the first imaging area B1 and the second imaging area B2 in the imaging element 32a, and proceeds to step S20.
  • the first imaging region B1 is a region where the recording image 51 is captured
  • the second imaging region B2 is a region where the detection image 52 is captured.
  • In step S20, the control unit 34 causes the image sensor 32a to start capturing a moving image, and proceeds to step S30.
  • the imaging element 32a repeats the imaging of the recording image 51 in the first imaging area B1, and repeats the imaging of the detection image 52 in the second imaging area B2.
  • In step S30, the control unit 34 determines whether the camera 1 is shaken. When shake of the camera 1 is detected, the control unit 34 makes a positive determination in step S30 and proceeds to step S40. When shake of the camera 1 is not detected, the control unit 34 makes a negative determination in step S30 and proceeds to step S50.
  • In step S40, the control unit 34 performs image blur correction processing on the moving image.
  • FIG. 20 is a diagram for explaining the flow of image blur correction processing.
  • the control unit 34 calculates the amount of image blur.
  • the target movement amount calculation unit 34f calculates the image blur amount on the imaging surface of the imaging element 32a based on the image blur calculated by the blur amount calculation unit 34e, the photographing magnification (calculated based on the position of the zoom lens of the imaging optical system 31), and the distance from the camera 1 to the subject (calculated based on the position of the focus lens of the imaging optical system 31), and the process proceeds to step S44.
  • In step S44, the control unit 34 performs image blur correction based on the image blur amount. Specifically, the target movement amount of the trimming range W1 in the recording image 51 is calculated so as to cancel the image blur amount calculated by the target movement amount calculation unit 34f. Then, as illustrated in FIG. 13A, the correction unit 33b moves the trimming range W1 within the recording image 51. By such image blur correction, even if the position of the subject in the recording image 51 moves due to camera shake, the subject stays at substantially the same position in the trimming range W1.
  • the target movement amount calculation unit 34f also applies the target movement amount to the trimming range W2 in the detection image 52. Then, as shown in FIG. 13B, the correction unit 33b moves the trimming range W2 in the detection image 52. By such image blur correction, even if the position of the subject in the detection image 52 is moved due to camera shake, the subject stays at substantially the same position in the trimming range W2.
  • In step S46, the control unit 34 sets the attention range 61-2 for the detection image 52 and, as shown in FIG. 13B, moves the attention range 61-2 in the detection image 52 together with the trimming range W2. When the trimming range W2 cannot be moved any further, the control unit 34 moves only the attention range 61-3 downward in the detection image 52 as shown in FIG. 13D.
  • In step S50 of FIG. 19, the control unit 34 causes the display unit 35 to start displaying the monitor image, and proceeds to step S60.
  • In step S60, the control unit 34 causes the object detection unit 34a to start the process of detecting subjects based on the detection image 52 captured in the second imaging region B2, and proceeds to step S70.
  • the monitor image based on the recording image 51 imaged in the first imaging area B1 is sequentially displayed on the display unit 35.
  • When a setting for performing the AF operation while the monitor image is displayed has been made, the lens movement control unit 34d of the control unit 34 performs the focus detection process based on the detection image 52 captured in the second imaging region B2 and controls the AF operation for focusing on the subject element corresponding to the predetermined focus point. When that setting has not been made, the lens movement control unit 34d of the control unit 34 performs the AF operation when the AF operation is instructed later.
  • In step S70, the setting unit 34b of the control unit 34 divides the imaging screen captured by the imaging element 32a into a plurality of regions including subject elements, and proceeds to step S80.
  • In step S80, the control unit 34 displays the regions on the display unit 35. As illustrated in FIG. 9, the control unit 34 highlights the region for which the imaging condition is to be set (changed) in the monitor image 60a. The control unit 34 determines the region to be highlighted based on the position of the user's touch operation. In addition, the control unit 34 displays the imaging condition setting screen 70 on the display unit 35 and proceeds to step S90. When the display position of another subject on the display screen is touched with the user's finger, the control unit 34 changes the region including that subject to the region for which the imaging condition is to be set (changed) and highlights it.
  • In step S90, the control unit 34 determines whether an AF operation is necessary. For example, when the focus adjustment state changes due to movement of the subject, when the position of the focus point is changed by a user operation, or when execution of the AF operation is instructed by a user operation, the control unit 34 makes a positive determination in step S90 and proceeds to step S100. When the focus adjustment state does not change, the position of the focus point is not changed by a user operation, and execution of the AF operation is not instructed by a user operation, the control unit 34 makes a negative determination in step S90 and proceeds to step S110.
  • In step S100, the control unit 34 performs the AF operation and proceeds to step S110.
  • the lens movement control unit 34d of the control unit 34 controls the AF operation for focusing on the subject corresponding to the predetermined focus point by performing the focus detection process based on the detection image 52 captured in the second imaging region B2.
  • In step S110, the setting unit 34b of the control unit 34 sets (changes) the imaging condition for the highlighted region in accordance with a user operation, and proceeds to step S120.
  • the setting unit 34b sets (changes) an imaging condition for the first imaging region B1.
  • the setting unit 34b may set (change) the imaging condition for the second imaging region B2.
  • the setting unit 34b continuously sets the initial imaging conditions for the first imaging area B1 and the second imaging area B2.
  • When determining the imaging condition by newly performing photometry under the set (changed) imaging condition, the setting unit 34b performs the exposure calculation process based on the detection image 52 captured in the second imaging region B2.
  • In step S120, the imaging control unit 34c of the control unit 34 sends an instruction to the image processing unit 33 to perform image processing on the image data of the recording image, and proceeds to step S130.
  • Image processing includes the pixel defect correction processing, color interpolation processing, contour enhancement processing, and noise reduction processing.
  • In step S130, the control unit 34 determines whether or not the recording button has been operated. When the recording button is operated, the control unit 34 makes a positive determination in step S130 and proceeds to step S140. When the recording button is not operated, the control unit 34 makes a negative determination in step S130 and returns to step S30. When returning to step S30, the control unit 34 repeats the above-described processing.
  • In step S140, the control unit 34 sends an instruction to the recording unit 37 to start the process of recording the image data after image processing on a recording medium (not shown), and proceeds to step S150.
  • In step S150, the control unit 34 determines whether an end operation has been performed. For example, when the recording button is operated again, the control unit 34 makes a positive determination in step S150 and ends the process of FIG. 19. When the end operation is not performed, the control unit 34 makes a negative determination in step S150 and returns to step S30. When returning to step S30, the control unit 34 repeats the above-described processing. If the recording process has been started, the above-described processing is repeated while the recording process continues.
  • As the imaging element 32a, the multilayer imaging element 100 in which imaging conditions can be set for each of a plurality of blocks in the imaging element (imaging chip 111) has been illustrated, but the imaging element 32a does not necessarily have to be configured as a multilayer imaging element.
  • the camera 1 changes the imaging condition of the region in the direction opposite to the shake direction, and performs imaging under the changed imaging condition. Thereby, an image of the subject image can be generated appropriately.
  • the subject element can be detected appropriately.
  • focus detection can be performed appropriately.
  • exposure calculation can be performed appropriately.
  • the camera 1 moves the attention range to an area in the direction opposite to the shake direction, and changes the imaging condition set in the attention range. Accordingly, an image of the subject image can be appropriately generated based on the subject image captured in the attention range.
  • the subject element can be detected appropriately. Furthermore, focus detection can be performed appropriately. Moreover, exposure calculation can be performed appropriately.
  • the camera 1 moves the trimming range so that the subject image exists at the same position in the trimming range even if the camera 1 is shaken. Thereby, the image blur of the subject image can be appropriately suppressed.
  • the camera 1 moves the range of interest included in the trimming range along with the movement of the trimming range. This makes it possible to focus on the same subject image even when the camera 1 is shaken.
  • the camera 1 moves only the attention range even if the trimming range cannot be moved. Thereby, even when the image blur of the subject image due to the shake of the camera 1 cannot be suppressed, it is possible to focus on the same subject image.
  • FIG. 13 (e) is a diagram illustrating image blur correction for the detection image 52 in Modification 1 of the first embodiment.
  • the control unit 34 gradually expands the area of the attention range 61-2 of FIG. 13B set for the detection image 52 as the trimming range W2 moves in the detection image 52.
  • the direction in which the area of the attention range 61-2 is expanded is the same as the direction in which the trimming range W2 is moved, and is the direction opposite to the shake of the camera 1.
  • the area of the attention range 61-3 in FIG. 13 (e) is wider downward than the area of the attention range 61-2 in FIG. 13 (b).
  • When the subject of interest (for example, the head of a person) moves further downward, the control unit 34 gradually expands the area of the attention range 61-3 downward within the trimming range W2. This is so that the head of the person can be accommodated in the attention range 61-3 with the expanded area. As a result, the same subject (in this example, the head of the person) exists in the region of the detection image 52 in which subject elements are detected, the region in which the focus detection process is performed, the region in which the exposure calculation process is performed, and the region referred to by the image processing unit 33 during the image generation process.
  • control unit 34 changes the setting of the second imaging condition in accordance with the change of the areas of the attention ranges 61-2 and 61-3. That is, the second imaging condition for the second imaging region B2 is reset according to the positions and areas of the attention ranges 61-2 and 61-3 after changing the area.
  • the image processing unit 33 of the camera 1 performs a process of suppressing the influence of image blur due to the shake of the camera 1 that is capturing a still image. Details of such a camera 1 will be described below.
  • The camera 1 in the second embodiment may be either an interchangeable lens type or a non-interchangeable lens type, and may also be configured as an imaging device such as a smartphone or a video camera.
  • the still image is captured by setting only the first imaging region B1 on the imaging surface of the image sensor 32a.
  • the operating rate of the blocks included in the first imaging area B1 may be set as appropriate, as in the case of the first embodiment. For example, the operating rate may be set to 100%, or the operating rate may be set to 70% or 50%.
  • Alternatively, of the first imaging region B1 and the second imaging region B2 set on the imaging surface, only the first imaging region B1 may be operated and the second imaging region B2 may be paused.
  • the still image to be recorded is a still image captured when the shutter button is operated.
  • a still image that is recorded when the shutter button is operated is referred to as a recording image.
  • a recording image may be captured when a display for instructing capturing of a still image (for example, the release icon 74 in FIG. 9) is operated.
  • FIG. 21 (a) is a diagram illustrating still images taken indoors by the camera 1 without using flash light.
  • the still image in FIG. 21A is taken by setting different imaging conditions on the left and right of the screen of the imaging element 32a. For example, when the woman's outfit on the right side of the screen is white and brighter than the man's outfit on the left side of the screen, an exposure time longer than that of the region of the imaging element 32a corresponding to the right side of the screen (the right side of the first imaging region B1) is set for the region of the imaging element 32a corresponding to the left side of the screen (the left side of the first imaging region B1).
  • if the exposure time of the region of the imaging element 32a corresponding to the left side of the screen is the first exposure time and the exposure time of the region of the imaging element 32a corresponding to the right side of the screen is the second exposure time, then first exposure time > second exposure time. Thereby, the man's costume on the left side of the screen can be imaged without underexposure while the woman's costume on the right side of the screen is prevented from being overexposed.
  • when the amount of illumination light is reduced by stage effects at the venue where the still image of FIG. 21A is taken, the first exposure time and the second exposure time set in the imaging element 32a become longer than the exposure time called the camera shake limit (for example, 1/60 seconds). For this reason, the camera 1 performs processing for suppressing the influence of image blur due to the shake of the camera 1 while it is capturing a still image.
  • FIG. 21B and FIG. 21C are schematic diagrams for explaining the outline of still image blur correction according to the second embodiment.
  • <Image blur correction> When image blur correction of a still image is performed, an exposure time shorter than the camera-shake-limit exposure time (1/60 seconds), referred to below as the shortened exposure time (for example, 1/300 seconds), is used. A plurality (n) of first recording images 51L are captured with the shortened exposure time by the region of the imaging element 32a corresponding to the left side of the screen, and a plurality (m) of second recording images 51R are captured with the shortened exposure time by the region of the imaging element 32a corresponding to the right side of the screen.
  • the control unit 34 determines the number n of first recording images so that the sum of the shortened exposure times of the n first recording images 51L (n × shortened exposure time) equals the first exposure time. It also determines the number m of second recording images so that the sum of the shortened exposure times of the m second recording images 51R (m × shortened exposure time) equals the second exposure time.
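  • A minimal sketch of this bookkeeping (the example exposure times are assumed values, and rounding up when the division is not exact is an assumption; the text only requires that the sums match the original exposure times):

import math

def number_of_short_exposures(exposure_time, shortened_exposure_time):
    """Frames needed so that n x shortened exposure covers the original exposure."""
    return math.ceil(exposure_time / shortened_exposure_time)

# Assumed example: first exposure 1/15 s, second exposure 1/30 s, shortened 1/300 s.
n = number_of_short_exposures(1 / 15, 1 / 300)   # 20 first recording images 51L
m = number_of_short_exposures(1 / 30, 1 / 300)   # 10 second recording images 51R
print(n, m)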
  • the control unit 34 and the correction unit 33b subject the n first recording images 51L (DL-1 to DL-n) captured by the region of the imaging element 32a corresponding to the left side of the screen to a first synthesis process in which the positions of the feature points in the images are aligned and the images are superimposed. Since such a synthesis process is known, a detailed description thereof is omitted.
  • similarly, the control unit 34 and the correction unit 33b subject the m second recording images 51R (DR-1 to DR-m) captured by the region of the imaging element 32a corresponding to the right side of the screen to a second synthesis process in which the positions of the feature points in the images are aligned and the images are superimposed.
  • the correction unit 33b further combines the first image after the first synthesis process and the second image after the second synthesis process to obtain a composite image as shown in FIG. 21A.
  • the correction unit 33b sets the image data of the composite image as image data of a recording image after image blur correction. In this way, the control unit 34 and the correction unit 33b perform image blur correction on the still image.
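  • A minimal sketch of such an align-and-superimpose synthesis (feature-point matching is replaced by a brute-force integer translation search and the superimposition by averaging; both are assumptions, since the text treats the synthesis as a known process):

import numpy as np

def estimate_translation(reference, image, max_shift=8):
    """Integer (dy, dx) shift that best aligns `image` onto `reference` (SAD search)."""
    ref = reference.astype(np.float64)
    img = image.astype(np.float64)
    best, best_cost = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)  # wrap-around ignored
            cost = np.mean(np.abs(ref - shifted))
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best

def synthesize(frames):
    """Align every frame onto the first one and average them."""
    acc = frames[0].astype(np.float64).copy()
    for frame in frames[1:]:
        dy, dx = estimate_translation(frames[0], frame)
        acc += np.roll(np.roll(frame.astype(np.float64), dy, axis=0), dx, axis=1)
    return acc / len(frames)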
  • FIG. 22 is a diagram illustrating the imaging timing of the recording images 51L (DL-1, DL-2, DL-3, ...), the imaging timing of the recording images 51R (DR-1, DR-2, DR-3, ...), and the display timing of the display (blur-corrected image) based on the recording images after blur correction.
  • the imaging control unit 34c performs the processing described in the first embodiment for the period before the recording button is operated (before time t1), until the release operation (for example, pressing of the shutter button) is performed at time t2.
  • When the release operation is performed at time t2, the imaging control unit 34c performs imaging at the timing of FIG. 22 with the first imaging region B1 set on the imaging surface of the imaging element 32a. In other words, the imaging control unit 34c causes the imaging unit 32 to capture the recording image 51 under the imaging conditions set for the first imaging region B1 by the setting unit 34b.
  • It is assumed that the object detection unit 34a detects the man on the left side of the screen and the woman on the right side of the screen based on the detection image Di in FIG. 10 described above, and that the setting unit 34b divides the first imaging region B1 into the left side of the screen and the right side of the screen based on the detection result. It is also assumed that the setting unit 34b sets different imaging conditions (for example, the first exposure time and the second exposure time) for the left side of the screen and the right side of the screen by performing an exposure calculation based on the detection image Diii in FIG. 10 described above.
  • the imaging control unit 34c performs imaging for image blur correction. That is, the first recording images 51L are captured with the shortened exposure time by the region of the imaging element 32a corresponding to the left side of the screen in the first imaging region B1, while the second recording images 51R are captured with the shortened exposure time by the region of the imaging element 32a corresponding to the right side of the screen in the first imaging region B1.
  • Even after the capture of the second recording images 51R is completed, the first recording images 51L continue to be captured by the region of the imaging element 32a corresponding to the left side of the screen. This is because n > m.
  • the size of the first recording images DL-(n−m+1) to DL-n is larger than the size of the first recording images DL-1 to DL-m (in this example, wider in the right direction), so that these images include a part of the woman located on the right side of the screen. As a result, even if the shake of the camera 1 increases after the capture of the second recording images 51R is completed (even if, as in FIG. 21B, camera shake shifts the region of the imaging element 32a to the left with respect to the person), a composite image as shown in FIG. 21A can be obtained without a shortage of image data when the first image after the first synthesis process (FIG. 21B) and the second image after the second synthesis process (FIG. 21C) are combined.
  • the correction unit 33b sets the image data of the composite image as image data of a recording image after image blur correction.
  • When image blur correction is not required, the imaging control unit 34c performs normal imaging. That is, the first exposure time is set for the region of the imaging element 32a corresponding to the left side of the screen in the first imaging region B1, the second exposure time is set for the region of the imaging element 32a corresponding to the right side of the screen in the first imaging region B1, and one recording image 51 (51L and 51R) is captured.
  • The processing from step S10 to step S110 is the same as the processing of FIG. 19. In step S210, which follows step S110, the control unit 34 determines whether or not the shutter button has been operated. When the shutter button is operated, the control unit 34 makes a positive determination in step S210 and proceeds to step S220. When the shutter button is not operated, the control unit 34 makes a negative determination in step S210 and returns to step S30. When returning to step S30, the control unit 34 repeats the processing from step S30 to step S210.
  • In step S220, the control unit 34 determines whether image blur correction is necessary for the still image. When at least one of the first exposure time and the second exposure time described above is longer than the exposure time set in advance as the camera shake limit, the control unit 34 makes a positive determination in step S220 and proceeds to step S230. When both the first exposure time and the second exposure time described above are shorter than the exposure time set in advance as the camera shake limit, the control unit 34 makes a negative determination in step S220 and proceeds to step S250.
  • In step S230, the control unit 34 captures a plurality of images with the shortened exposure time. Specifically, n first recording images 51L are captured with the shortened exposure time by the region of the imaging element 32a corresponding to the left side of the screen in the first imaging region B1, and m second recording images 51R are captured with the shortened exposure time by the region of the imaging element 32a corresponding to the right side of the screen in the first imaging region B1. The region of the imaging element 32a corresponding to the left side of the screen in the first imaging region B1 images the first recording images 51L from a time before the second recording images 51R are imaged by the region corresponding to the right side of the screen.
  • In step S240, the control unit 34 and the correction unit 33b perform image blur correction on the recording images. Specifically, a synthesis process is performed in which the positions of the feature points in the plurality of recording images are aligned and the images are superimposed.
  • In step S250, the control unit 34 performs normal imaging. That is, the first exposure time is set for the region of the imaging element 32a corresponding to the left side of the screen in the first imaging region B1, the second exposure time is set for the region of the imaging element 32a corresponding to the right side of the screen in the first imaging region B1, and a single recording image 51L, 51R is captured.
  • In step S260, the imaging control unit 34c of the control unit 34 sends an instruction to the image processing unit 33 to perform image processing on the image data of the recording image, and proceeds to step S270.
  • Image processing includes the pixel defect correction processing, color interpolation processing, contour enhancement processing, and noise reduction processing.
  • In step S270, the control unit 34 sends an instruction to the recording unit 37 to record the image data after image processing on a recording medium (not shown), and proceeds to step S280.
  • In step S280, the control unit 34 determines whether an end operation has been performed. When the end operation is performed, the control unit 34 makes a positive determination in step S280 and ends the process of this flowchart. When the end operation is not performed, the control unit 34 makes a negative determination in step S280 and returns to step S10. When returning to step S10, the control unit 34 repeats the above-described processing.
  • the camera 1 can appropriately perform image blur correction that suppresses the influence of image blur due to camera shake on a subject image captured in a plurality of imaging regions under different imaging conditions. .
  • Even when image blur occurs, for example because different exposure times are set for the left side of the screen for which the first imaging condition is set and the right side of the screen for which the second imaging condition is set, image blur correction can be appropriately performed on the left side and the right side of the screen.
  • The camera 1 increases the size of the first recording image 51L, for example, by expanding the area on the left side of the screen for which the first imaging condition is set. Therefore, when the first image after image blur correction (FIG. 21B) is combined with the second image after image blur correction (FIG. 21C), a composite image as shown in FIG. 21A can be obtained without a lack of image data.
  • the camera 1 enlarges the left range of the first imaging area B1 in the right direction. By enlarging the left range of the first imaging area B1, a wider range of subject images can be included in the first recording image 51L.
  • the camera 1 performs composition by shifting the positions of the plurality of first recording images 51L so as to match the positions of the other first recording images 51L. As a result, it is possible to obtain a composite image that is aligned between the plurality of first recording images 51L.
  • The camera 1 changes the position of the first recording image 51L so that the subject image of the first recording image 51L overlaps the position of the subject image of the other first recording images 51L (alignment). Thereby, even if the position of the subject image is shifted between the plurality of first recording images 51L, a composite image in which the shift of the subject image, that is, the image blur, is suppressed can be obtained.
  • The third embodiment suppresses, by a method different from that of the second embodiment, the influence of image blur due to shake of the camera 1 while a still image is being captured.
  • The camera 1 in the third embodiment may be either an interchangeable-lens type or a non-interchangeable-lens type, and may also be configured as an imaging device such as a smartphone or a video camera.
  • The third embodiment differs from the second embodiment in that the first shortened exposure time is set in the area of the image sensor 32a corresponding to the left side of the screen, the second shortened exposure time (first shortened exposure time > second shortened exposure time) is set in the area of the image sensor 32a corresponding to the right side of the screen, and n recording images 51-1 to 51-n are captured.
  • That is, the number of first recording images 51L captured by the area of the image sensor 32a corresponding to the left side of the screen and the number of second recording images 51R captured by the area of the image sensor 32a corresponding to the right side of the screen are the same (n).
  • the difference from the second embodiment will be mainly described with reference to the drawings.
  • FIG. 24 is a schematic diagram for explaining an outline of still image blur correction according to the third embodiment.
  • The control unit 34 determines the number n of recording images and the first shortened exposure time such that the sum of the first shortened exposure times of the n first recording images 51L (n × first shortened exposure time) equals the first exposure time. Similarly, the control unit 34 determines the second shortened exposure time such that the sum of the second shortened exposure times of the n second recording images 51R (n × second shortened exposure time) equals the second exposure time.
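  • The relationship between the full exposure times and the shortened exposure times can be written as a small calculation. The sketch below assumes that n is chosen as the smallest count that brings the longer exposure time below a camera-shake-limit exposure; that selection rule, like the numeric values, is only an illustrative assumption, since the patent does not specify how n is chosen.
```python
import math


def split_exposures(first_exposure, second_exposure, shake_limit):
    """Choose a common frame count n and per-frame (shortened) exposure
    times such that n * first_short == first_exposure and
    n * second_short == second_exposure."""
    longer = max(first_exposure, second_exposure)
    n = max(1, math.ceil(longer / shake_limit))  # assumed selection rule
    first_short = first_exposure / n
    second_short = second_exposure / n
    return n, first_short, second_short


# Example: 1/8 s and 1/30 s exposures with a 1/60 s camera-shake limit
n, t1, t2 = split_exposures(1 / 8, 1 / 30, 1 / 60)
# n == 8, t1 == 1/64 s, t2 ≈ 1/240 s; t1 > t2, as in the text
```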
  • The control unit 34 and the correction unit 33b perform a composition process on the n recording images 51-1 to 51-n so that the positions of the feature points in each image are aligned and superimposed. Since such a composition process is known, a detailed description thereof is omitted.
  • the correction unit 33b sets the image data of the composite image as image data of a recording image after image blur correction. In this way, the control unit 34 and the correction unit 33b perform image blur correction on the still image.
  • FIG. 25 shows the imaging timing of the recording images 51L (DL1, DL2, DL3, ...) in the area of the image sensor 32a corresponding to the left side of the screen, the imaging timing of the recording images 51R (DR1, DR2, DR3, ...) in the area of the image sensor 32a corresponding to the right side of the screen, and the display timing of the display (blur-corrected image) based on the recording images after blur correction.
  • The imaging control unit 34c performs the processing before the recording button is operated (before time t1) described in the first embodiment until a release operation (for example, a shutter button pressing operation) is performed at time t3.
  • When the release operation is performed at time t3, the imaging control unit 34c performs imaging at the timing of FIG. 25 with the first imaging area B1 set on the imaging surface of the image sensor 32a. In other words, the imaging control unit 34c causes the imaging unit 32 to capture the recording image 51 under the imaging conditions set in the first imaging area B1 by the setting unit 34b. As in the second embodiment, it is assumed that the object detection unit 34a detects a male subject on the left side of the screen and a female subject on the right side of the screen based on the detection image Di of FIG. 10 described above, and that the setting unit 34b divides the screen into the left side and the right side based on the detection result.
  • The setting unit 34b performs an exposure calculation based on the detection image Diii of FIG. 10 described above, so that different imaging conditions (for example, the above-described first exposure time and second exposure time) are set.
  • The imaging control unit 34c performs imaging for image blur correction. That is, the first shortened exposure time is set in the area of the image sensor 32a corresponding to the left side of the screen in the first imaging area B1, the second shortened exposure time is set in the area of the image sensor 32a corresponding to the right side of the screen in the first imaging area B1, and n recording images 51 (51L and 51R) are captured.
  • The number n of recording images 51 to be captured is the same on the left and right sides of the screen; what differs is the shortened exposure time per image, the second shortened exposure time being shorter than the first shortened exposure time.
  • Even if the camera 1 shakes greatly, a composite image as shown in FIG. 24 can be obtained without a lack of image data between the left side and the right side of the screen.
  • the correction unit 33b sets the image data of the composite image as image data of a recording image after image blur correction.
  • the camera 1 can appropriately perform image blur correction that suppresses the influence of image blur due to the shake of the camera 1 on an image of a subject image captured under different imaging conditions in a plurality of imaging regions.
  • Even when image blur occurs, for example because different exposure times are set for the left side of the screen for which the first imaging condition is set and the right side of the screen for which the second imaging condition is set, image blur correction can be appropriately performed.
  • The camera 1 can appropriately perform image blur correction on the subject image captured at different exposure times on the left side of the screen for which the first imaging condition is set and the right side of the screen for which the second imaging condition is set.
  • the camera 1 can maintain imaging simultaneity on the left side of the screen for setting the first imaging condition and the right side of the screen for setting the second imaging condition.
  • In the fourth embodiment, the image blur correction of a moving image according to the first embodiment is performed in parallel under two conditions (for example, with different trimming range widths).
  • The camera 1 in the fourth embodiment may be either an interchangeable-lens type or a non-interchangeable-lens type, and may also be configured as an imaging device such as a smartphone or a video camera.
  • the control unit 34 sets four imaging areas on the imaging surface of the imaging element 32a when imaging a recording image or a monitor image.
  • FIG. 26 is a diagram illustrating the arrangement of the first imaging area B1, the second imaging area B2, the third imaging area C1, and the fourth imaging area C2 set on the imaging surface of the image sensor 32a. According to the partially enlarged view of FIG. 26, a block unit U, which takes four of the plurality of blocks defined in the image sensor 32a as one unit, is repeatedly arranged in the horizontal direction and the vertical direction of the pixel region.
  • the block unit U includes a block belonging to the first imaging area B1, a block belonging to the second imaging area B2, a block belonging to the third imaging area C1, and a block belonging to the fourth imaging area C2.
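  • A minimal sketch of how blocks could be labelled according to the repeating block unit U of FIG. 26, with each group of four blocks contributing one block to each of the four imaging areas; the 2×2 layout inside the unit and the coordinate convention are assumptions made only for illustration.
```python
def assign_imaging_areas(blocks_x, blocks_y):
    """Return a map from block coordinates to one of the four imaging
    areas, repeating a 2x2 block unit U over the pixel region."""
    # Assumed placement of the four areas inside one unit U
    unit = {(0, 0): "B1",   # first imaging area (recording)
            (1, 0): "B2",   # second imaging area (detection)
            (0, 1): "C1",   # third imaging area (recording)
            (1, 1): "C2"}   # fourth imaging area (detection)
    return {(x, y): unit[(x % 2, y % 2)]
            for y in range(blocks_y) for x in range(blocks_x)}


areas = assign_imaging_areas(8, 6)
# Blocks belonging to the first imaging area B1
b1_blocks = [pos for pos, area in areas.items() if area == "B1"]
```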
  • the control unit 34 generates the first recording image 51A based on the photoelectric conversion signal read from the first imaging region B1 of the imaging element 32a that has captured one frame. Further, the control unit 34 generates the first detection image 52A based on the photoelectric conversion signal read from the second imaging region B2 of the imaging element 32a. The control unit 34 further generates a second recording image 51B based on the photoelectric conversion signal read from the third imaging region C1 of the imaging element 32a that has captured one frame. In addition, the control unit 34 generates the second detection image 52B based on the photoelectric conversion signal read from the fourth imaging region C2 of the imaging element 32a.
  • The first recording image 51A, the first detection image 52A, the second recording image 51B, and the second detection image 52B are all captured at the same angle of view and include a common subject image. These images can be captured in parallel.
  • The imaging condition set in the first imaging area B1 for capturing the first recording image 51A is referred to as the first imaging condition, and the imaging condition set in the second imaging area B2 for capturing the first detection image 52A is referred to as the second imaging condition.
  • the imaging condition set in the third imaging area C1 for capturing the second recording image 51B is referred to as a third imaging condition, and the imaging condition set in the fourth imaging area C2 for capturing the second detection image 52B. Will be referred to as the fourth imaging condition.
  • The control unit 34 may set the first imaging condition and the second imaging condition, and the third imaging condition and the fourth imaging condition, to the same condition, or may set them to different conditions.
  • The control unit 34 performs different image blur corrections on the moving images captured in parallel as described above. That is, image blur correction is performed under different conditions for the first recording image 51A and the first detection image 52A, and for the second recording image 51B and the second detection image 52B.
  • the control unit 34 sets a trimming range W1 that is 90% of the size of the first recording image 51A in the first recording image 51A.
  • the image blur correction processing after setting the trimming ranges W1 and W2 is the same as that in the first embodiment.
  • control unit 34 sets a trimming range W3 that is 60% of the size of the second recording image 51B in the second recording image 51B.
  • the image blur correction processing after setting the trimming ranges W3 and W4 is the same as that in the first embodiment.
  • the trimming ranges W1 to W4 are often set to be narrow in order to sufficiently secure the movement allowance of the trimming ranges W1 to W4 when the camera 1 is largely shaken. For this reason, the screen size of a moving image after image blur correction tends to be small.
  • For example, it is possible to normally employ the image blur correction with a wide trimming range, and to employ the image blur correction with a narrow trimming range when the camera 1 shakes greatly.
  • In this way, image blur correction of a moving image can be performed in parallel under two conditions (for example, with different trimming range widths).
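  • As a sketch of how two trimming widths can be applied in parallel to the same frames, the code below crops each frame around a window whose centre is shifted against the estimated shake; the shake-estimation step itself is omitted, the 90% and 60% window sizes follow the example in the text, and everything else (NumPy frame arrays, clamping rule) is an assumption for illustration.
```python
def trim_frame(frame, shake_dx, shake_dy, ratio):
    """Crop a trimming window of the given size ratio from a NumPy image
    array, shifted opposite to the detected shake (electronic image
    stabilization by trimming)."""
    h, w = frame.shape[:2]
    tw, th = int(w * ratio), int(h * ratio)
    # Centre of the trimming range, moved against the shake but clamped
    # so the window stays inside the frame
    margin_x, margin_y = (w - tw) // 2, (h - th) // 2
    cx = w // 2 - max(-margin_x, min(margin_x, shake_dx))
    cy = h // 2 - max(-margin_y, min(margin_y, shake_dy))
    x0, y0 = cx - tw // 2, cy - th // 2
    return frame[y0:y0 + th, x0:x0 + tw]


def stabilize_two_ways(frames, shakes):
    """Run wide (W1, 90%) and narrow (W3, 60%) trimming in parallel."""
    wide = [trim_frame(f, dx, dy, 0.9) for f, (dx, dy) in zip(frames, shakes)]
    narrow = [trim_frame(f, dx, dy, 0.6) for f, (dx, dy) in zip(frames, shakes)]
    return wide, narrow
```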
  • the image blur correction of a moving image according to the first embodiment and the image blur correction of a still image according to the second or third embodiment are performed in parallel.
  • the still image is captured by the third imaging region C1 (FIG. 26) set on the imaging surface of the imaging element 32a.
  • the operating rate of the blocks included in the third imaging region C1 may be set as appropriate, as in the case of the first embodiment. For example, the operating rate may be set to 100%, or the operating rate may be set to 70% or 50%.
  • the fourth imaging region C2 (FIG. 26) set on the imaging surface of the imaging element 32a is paused. In FIG. 26, the position of the third imaging area C1 and the position of the fourth imaging area C2 may be combined and set as the third imaging area C1 without setting the fourth imaging area C2.
  • the first recording image 51A is generated based on the photoelectric conversion signal read from the first imaging region B1 of the imaging element 32a that has captured one frame.
  • the control unit 34 generates the first detection image 52A based on the photoelectric conversion signal read from the second imaging region B2 of the imaging element 32a.
  • the control unit 34 further generates a second recording image 51 based on the photoelectric conversion signal read from the third imaging region C1 of the imaging element 32a that has captured one frame.
  • the first recording image 51A and the first detection image 52A are moving images.
  • the second recording image 51 is a still image.
  • the control unit 34 performs image blur correction on the moving image on the first recording image 51A and the first detection image 52A, as in the first embodiment. In addition, the control unit 34 performs image blur correction on the still image on the second recording image 51 as in the second embodiment or the third embodiment.
  • image blur correction for a moving image and image blur correction for a still image can be performed in parallel.
  • When the first still image to be recorded is captured, the first still image is captured by the first imaging area B1 (FIG. 26) set on the imaging surface of the image sensor 32a.
  • The operating rate of the blocks included in the first imaging area B1 may be set as appropriate, as in the case of the first embodiment. For example, the operating rate may be set to 100%, or the operating rate may be set to 70% or 50%.
  • the second imaging region B2 (FIG. 26) set on the imaging surface of the imaging element 32a is paused. In FIG. 26, the position of the first imaging area B1 and the position of the second imaging area B2 may be combined and set as the first imaging area B1 without setting the second imaging area B2.
  • When the second still image to be recorded is captured, the second still image is captured by the third imaging area C1 (FIG. 26) set on the imaging surface of the image sensor 32a.
  • the operating rate of the blocks included in the third imaging region C1 may be set as appropriate, as in the case of the first embodiment. For example, the operating rate may be set to 100%, or the operating rate may be set to 70% or 50%.
  • the fourth imaging region C2 (FIG. 26) set on the imaging surface of the imaging element 32a is paused. In FIG. 26, the position of the third imaging area C1 and the position of the fourth imaging area C2 may be combined and set as the third imaging area C1 without setting the fourth imaging area C2.
  • the first recording image 51A is generated based on the photoelectric conversion signal read from the first imaging region B1 of the imaging element 32a that has captured one frame.
  • the control unit 34 generates the second recording image 51B based on the photoelectric conversion signal read from the third imaging region C1 of the imaging element 32a that has captured one frame.
  • the first recording image 51A is a first still image.
  • the second recording image 51B is a second still image.
  • The control unit 34 performs still image blur correction on each of the first recording image 51A and the second recording image 51B, as in the second embodiment or the third embodiment.
  • image blur correction is performed under different conditions for the first recording image 51A and the second recording image 51B.
  • FIG. 27 is a diagram illustrating still images obtained by capturing waterfalls under different conditions.
  • FIG. 27A is a still image based on the first recording image 51A, and FIG. 27B is a still image based on the second recording image 51B.
  • the still images in FIGS. 27 (a) and 27 (b) are taken by setting different imaging conditions for the flow of water and the background of a rock or the like.
  • the imaging control unit 34c makes the shortened exposure time of the first recording image 51A shorter than the shortened exposure time of the second recording image 51B.
  • the imaging control unit 34c makes the number of first recording images 51A smaller than the number of second recording images 51B.
  • still image blur correction can be performed in parallel under two conditions (for example, different shortened exposure times).
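  • To make the relationship described above concrete, the sketch below defines one illustrative pair of capture settings in which the first recording image 51A uses both a shorter per-frame exposure and fewer frames than the second recording image 51B; the numeric values and field names are arbitrary examples, not values from the patent.
```python
from dataclasses import dataclass


@dataclass
class StillCaptureConfig:
    region: str              # imaging area used on the sensor
    frame_count: int         # number of short-exposure frames to composite
    short_exposure_s: float  # shortened exposure time per frame

    @property
    def total_exposure_s(self) -> float:
        return self.frame_count * self.short_exposure_s


# Illustrative settings: 51A freezes the water flow, 51B renders it smoothly
cfg_51a = StillCaptureConfig(region="B1", frame_count=4, short_exposure_s=1 / 500)
cfg_51b = StillCaptureConfig(region="C1", frame_count=8, short_exposure_s=1 / 60)

assert cfg_51a.short_exposure_s < cfg_51b.short_exposure_s
assert cfg_51a.frame_count < cfg_51b.frame_count
# Each stream is then composited independently, as in the second or third embodiment.
```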
  • When the first still image to be recorded is captured, the first still image is captured by the first imaging area B1 (FIG. 26) set on the imaging surface of the image sensor 32a.
  • When the second still image to be recorded is captured, the second still image is captured by the second imaging area B2 (FIG. 26) set on the imaging surface of the image sensor 32a.
  • When the third still image to be recorded is captured, the third still image is captured by the third imaging area C1 (FIG. 26) set on the imaging surface of the image sensor 32a.
  • When the fourth still image to be recorded is captured, the fourth still image is captured by the fourth imaging area C2 (FIG. 26) set on the imaging surface of the image sensor 32a.
  • The operating rates of the blocks included in the first imaging area B1, the second imaging area B2, the third imaging area C1, and the fourth imaging area C2 may be set as appropriate, as in the case of the first embodiment.
  • the first recording image 51A is generated based on the photoelectric conversion signal read from the first imaging region B1 of the imaging device 32a that has captured one frame.
  • the control unit 34 generates the second recording image 51B based on the photoelectric conversion signal read from the second imaging region B2 of the imaging element 32a that has captured one frame.
  • the control unit 34 generates a third recording image 51C based on the photoelectric conversion signal read from the third imaging region C1 of the imaging element 32a that has captured one frame.
  • the control unit 34 generates a fourth recording image 51D based on the photoelectric conversion signal read from the fourth imaging region C2 of the imaging device 32a that has captured one frame.
  • the first recording image 51A is a first still image.
  • the second recording image 51B is a second still image.
  • the third recording image 51C is a third still image.
  • the fourth recording image 51D is a fourth still image.
  • The control unit 34 performs still image blur correction on each of the first recording image 51A, the second recording image 51B, the third recording image 51C, and the fourth recording image 51D, as in the second embodiment or the third embodiment.
  • image blur correction is performed for each of the recording images 51A to 51D under different conditions.
  • still image blur correction can be performed in parallel under four conditions (for example, different shortened exposure times).
  • the camera 1 has been described as an example of an electronic device.
  • The electronic device may be a mobile device having the image sensor 32a, such as a high-function mobile phone like a smartphone with a camera function, a tablet terminal, or a wearable device.
  • the camera 1 in which the imaging unit 32 and the control unit 34 are configured as a single electronic device has been described as an example.
  • the imaging unit 32 and the control unit 34 may be provided separately, and the imaging system 1A that controls the imaging unit 32 from the control unit 34 via communication may be configured.
  • An example in which the imaging device 1001 including the imaging unit 32 is controlled from the display device 1002 including the control unit 34 will be described with reference to FIG. 28.
  • FIG. 28 is a block diagram illustrating the configuration of an imaging system 1A according to the second modification.
  • The imaging system 1A includes an imaging device 1001 and a display device 1002.
  • the imaging apparatus 1001 includes a first communication unit 1003 in addition to the imaging optical system 31, the imaging unit 32, and the photometric sensor 38 described in the above embodiment.
  • the display device 1002 includes a second communication unit 1004 in addition to the image processing unit 33, the control unit 34, the display unit 35, the operation member 36, and the recording unit 37 described in the above embodiment.
  • the first communication unit 1003 and the second communication unit 1004 can perform bidirectional image data communication using, for example, a well-known wireless communication technology or optical communication technology. Note that the imaging device 1001 and the display device 1002 may be connected by a wired cable, and the first communication unit 1003 and the second communication unit 1004 may perform bidirectional image data communication.
  • The control unit 34 controls the imaging unit 32 by performing data communication via the second communication unit 1004 and the first communication unit 1003. For example, by transmitting and receiving predetermined control data between the imaging device 1001 and the display device 1002, the display device 1002 divides the screen into a plurality of areas based on the images as described above, sets a different imaging condition for each of the divided areas, and reads out the photoelectric conversion signal photoelectrically converted in each area.
  • The monitor image acquired on the imaging device 1001 side and transmitted to the display device 1002 is displayed on the display unit 35 of the display device 1002, so that the user can perform remote control from the display device 1002 located away from the imaging device 1001.
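  • A minimal sketch of how the display device 1002 might send control data to the imaging device 1001 over the first and second communication units: the JSON message layout, the field names, and the use of a length-prefixed TCP socket are purely hypothetical, since the patent does not define the control-data format.
```python
import json
import socket


def send_imaging_conditions(host, port, regions):
    """Hypothetical control-data exchange: the display device sends
    per-region imaging conditions and receives an acknowledgement.
    `regions` maps an area name to its settings, e.g.
    {"B1": {"exposure_s": 1/60, "gain_iso": 200}}."""
    msg = json.dumps({"command": "set_imaging_conditions",
                      "regions": regions}).encode("utf-8")
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(len(msg).to_bytes(4, "big") + msg)  # length-prefixed
        # Simplified: assumes each recv returns the full chunk
        reply_len = int.from_bytes(sock.recv(4), "big")
        return json.loads(sock.recv(reply_len).decode("utf-8"))


# Example (hypothetical addresses and values): different exposure times
# for the left/right areas, set remotely from the display device
# ack = send_imaging_conditions("192.168.0.10", 5000,
#                               {"B1": {"exposure_s": 1 / 30},
#                                "B2": {"exposure_s": 1 / 125}})
```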
  • the display device 1002 can be configured by a high-function mobile phone 250 such as a smartphone, for example.
  • the imaging device 1001 can be configured by an electronic device including the above-described stacked imaging element 100.
  • Although an example has been described in which the object detection unit 34a, the setting unit 34b, the imaging control unit 34c, and the lens movement control unit 34d are provided in the control unit 34 of the display device 1002, a part of the object detection unit 34a, the setting unit 34b, the imaging control unit 34c, and the lens movement control unit 34d may be provided in the imaging device 1001.
  • The program can be supplied to a mobile device such as the camera 1, the high-function mobile phone 250, or the tablet terminal described above by, for example, infrared communication or short-range wireless communication from the personal computer 205 storing the program, as illustrated in FIG. 29.
  • The program may be supplied to the personal computer 205 by loading a recording medium 204 such as a CD-ROM storing the program into the personal computer 205, or via a communication line 201 such as a network. When supplied via the communication line 201, the program is stored in the storage device 203 of the server 202 connected to the communication line.
  • the program can be directly transmitted to the mobile device via a wireless LAN access point (not shown) connected to the communication line 201.
  • a recording medium 204B such as a memory card storing the program may be set in the mobile device.
  • the program can be supplied as various forms of computer program products, such as provision via a recording medium or a communication line.
  • the object detection unit 34a of the control unit 34 detects the subject element based on the detection image Di, and the setting unit 34b divides the recording image 51 into regions including the subject element.
  • Instead of dividing the recording image 51 on the basis of the detection image Di, the control unit 34 may divide the recording image 51 on the basis of the output signal from a photometric sensor 38 provided separately from the image sensor 32a.
  • The control unit 34 divides the recording image 51 into a foreground and a background based on the output signal from the photometric sensor 38. Specifically, the recording image 51 acquired by the image sensor 32b is divided into a foreground area corresponding to the area determined to be the foreground from the output signal of the photometric sensor 38 and a background area corresponding to the area determined to be the background from the output signal of the photometric sensor 38.
  • The control unit 34 further sets the first imaging area B1 and the second imaging area B2 at the position of the imaging surface of the image sensor 32a corresponding to the foreground area, and sets only the first imaging area B1 at the position of the imaging surface of the image sensor 32a corresponding to the background area.
  • The control unit 34 uses the recording image 51 captured in the first imaging area B1 for display of the monitor image, and uses the detection image 52 captured in the second imaging area B2 for subject element detection, focus detection, and exposure calculation.
  • Alternatively, of the first imaging area B1 and the second imaging area B2 set on the imaging surface, only the first imaging area B1 may be operated while the second imaging area B2 is paused.
  • According to the fourth modification, the monitor image acquired by the image sensor 32b can be divided into areas by using the output signal from the photometric sensor 38. Further, the recording image 51 and the detection image 52 can be obtained for the foreground area, and only the recording image 51 can be obtained for the background area.
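  • A rough sketch of the fourth modification's division into foreground and background: the photometric sensor output is thresholded into a coarse foreground mask, which is then mapped onto the sensor blocks so that B1 and B2 are assigned to foreground blocks and only B1 to background blocks. The threshold rule, array shapes, and block mapping are assumptions made for illustration.
```python
import numpy as np


def divide_by_photometry(photometry, threshold):
    """Return a boolean foreground mask from the photometric sensor
    output (a small 2D array of luminance values). The rule 'brighter
    than threshold = foreground' is only an illustrative assumption."""
    return np.asarray(photometry) > threshold


def assign_regions(fg_mask, blocks_x, blocks_y):
    """Map the coarse mask onto sensor blocks: foreground blocks get both
    B1 (recording) and B2 (detection); background blocks get only B1."""
    mh, mw = fg_mask.shape
    layout = {}
    for by in range(blocks_y):
        for bx in range(blocks_x):
            # Nearest photometry cell for this block
            fg = fg_mask[by * mh // blocks_y, bx * mw // blocks_x]
            layout[(bx, by)] = ("B1", "B2") if fg else ("B1",)
    return layout
```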
  • the imaging optical system 31 described above may include a zoom lens or tilt lens.
  • The lens movement control unit 34d adjusts the angle of view of the imaging optical system 31 by moving the zoom lens in the optical axis direction. That is, by moving the zoom lens, the image formed by the imaging optical system 31 can be adjusted, for example, to obtain an image of a wide range of subjects or to obtain a large image of a distant subject. Further, the lens movement control unit 34d can adjust distortion of the image formed by the imaging optical system 31 by moving the tilt lens in a direction orthogonal to the optical axis. Based on the idea that it is preferable to use the preprocessed image data described above in order to adjust the state of the image formed by the imaging optical system 31 (for example, the state of the angle of view or the state of distortion of the image), the above-described preprocessing may be performed.
  • Although the image sensor 32a described above has been described as being divided in advance into a plurality of blocks belonging to the first imaging area B1 and a plurality of blocks belonging to the second imaging area B2, as shown in the figure, the present invention is not limited to this.
  • The positions of the first imaging area B1 and the second imaging area B2 in the image sensor 32a may be set according to the brightness, type, shape, and the like of the subject.
  • The imaging areas in the image sensor 32a are not limited to the first imaging area B1 and the second imaging area B2; an imaging area for which imaging conditions different from the imaging conditions set for the first imaging area B1 and the second imaging area B2 are set may also be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Adjustment Of Camera Lenses (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The invention relates to an electronic device comprising: an imaging element that has a plurality of imaging areas and captures a subject image; a changing unit that changes the imaging conditions set for a second imaging area when, as a result of the imaging element being moved relative to the subject image, the subject image moves from a first imaging area among the plurality of imaging areas to the second imaging area, which lies in the direction opposite to the direction of movement of the imaging element; and a generating unit that generates an image of the subject image captured under the imaging conditions changed by the changing unit.
PCT/JP2018/013036 2017-03-31 2018-03-28 Dispositif électronique WO2018181610A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-072668 2017-03-31
JP2017072668A JP2020096208A (ja) 2017-03-31 2017-03-31 電子機器

Publications (1)

Publication Number Publication Date
WO2018181610A1 true WO2018181610A1 (fr) 2018-10-04

Family

ID=63678154

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/013036 WO2018181610A1 (fr) 2017-03-31 2018-03-28 Dispositif électronique

Country Status (2)

Country Link
JP (1) JP2020096208A (fr)
WO (1) WO2018181610A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09163215A (ja) * 1995-10-05 1997-06-20 Sony Corp 撮像装置
JP2011155492A (ja) * 2010-01-27 2011-08-11 Nikon Corp 画像処理装置
JP2015033036A (ja) * 2013-08-05 2015-02-16 株式会社ニコン 撮像装置、撮像装置の制御方法、及び制御プログラム
JP2016192605A (ja) * 2015-03-30 2016-11-10 株式会社ニコン 電子機器、記録媒体およびプログラム

Also Published As

Publication number Publication date
JP2020096208A (ja) 2020-06-18

Similar Documents

Publication Publication Date Title
JP6604384B2 (ja) 撮像装置
WO2017057279A1 (fr) Dispositif de formation d'image, dispositif de traitement d'image et dispositif d'affichage
JP6516014B2 (ja) 撮像装置および画像処理装置
WO2017170716A1 (fr) Dispositif de capture d'image, dispositif de traitement d'image et appareil électronique
JP2024045553A (ja) 撮像装置
WO2018181612A1 (fr) Dispositif électronique
WO2018181615A1 (fr) Dispositif électronique
WO2017170717A1 (fr) Dispositif de capture d'image, dispositif de mise au point, et appareil électronique
JP2018056944A (ja) 撮像装置および撮像素子
JP6516016B2 (ja) 撮像装置および画像処理装置
WO2018181610A1 (fr) Dispositif électronique
WO2018181613A1 (fr) Dispositif électronique
WO2018181614A1 (fr) Dispositif électronique
WO2018181611A1 (fr) Dispositif électronique
JP2018056945A (ja) 撮像装置および撮像素子
JP6516015B2 (ja) 撮像装置および画像処理装置
JP6589989B2 (ja) 撮像装置
JP6604385B2 (ja) 撮像装置
JP6551533B2 (ja) 撮像装置および画像処理装置
JP6589988B2 (ja) 撮像装置
WO2017170719A1 (fr) Dispositif de capture d'image, et appareil électronique
WO2017170718A1 (fr) Dispositif de capture d'image, dispositif de détection de sujet et appareil électronique
WO2017057268A1 (fr) Dispositif d'imagerie et son procédé de commande
WO2017057280A1 (fr) Dispositif d'imagerie et dispositif de détection de sujet
WO2017057267A1 (fr) Dispositif d'imagerie et dispositif de détection de mise au point

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18776020

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18776020

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP