WO2015025740A1 - Control device, control method, and electronic device - Google Patents
- Publication number
- WO2015025740A1 (PCT/JP2014/071033)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- exposure amount
- mode
- control
- signal
- Prior art date
Classifications
- H04N23/555—Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
- H04N23/672—Focus control based on electronic image sensor signals based on the phase difference signals
- H04N23/673—Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
- H04N23/743—Bracketing, i.e. taking a series of images with varying exposure conditions
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
- H04N25/583—Control of the dynamic range involving two or more exposures acquired simultaneously with different integration times
- H04N25/704—Pixels specially adapted for focusing, e.g. phase difference pixel sets
Definitions
- the present disclosure relates to a control device, a control method, and an electronic device, and more particularly, to a control device, a control method, and an electronic device that can set an appropriate exposure amount at high speed.
- the pixel group includes pixels with a large exposure amount and pixels with a small exposure amount
- a pixel with a large exposure amount is likely to saturate in a bright part, and when it saturates, only the pixels with a small exposure amount can produce an output
- the number of pixels that can be used for image creation is then reduced, so the resolution is lowered
- imaging devices having a touch panel, as represented by smartphones, have become widespread. For example, when performing an operation such as autofocusing at the touched position in the image being shot, if the touched position is saturated, there is no information on how much the exposure must be reduced for that location to stop being saturated; it therefore took time to set the exposure amount to an appropriate value.
- the present disclosure has been made in view of such a situation, and makes it possible to set an appropriate exposure amount at high speed.
- a control device according to one aspect of the present disclosure includes a control unit that controls the exposure amount of a pixel group in which a plurality of pixels are two-dimensionally arranged. In a first mode before recording of a captured image is started, the control unit sets a plurality of types of exposure amounts in the pixel group; in a second mode in which the captured image is recorded, the control unit sets fewer types of exposure amounts in the pixel group than in the first mode.
- in a control method according to one aspect of the present disclosure, a control device that controls the exposure amount of a pixel group in which a plurality of pixels are two-dimensionally arranged sets a plurality of types of exposure amounts in the pixel group in a first mode before recording of a captured image is started, and sets fewer types of exposure amounts in the pixel group than in the first mode in a second mode in which the captured image is recorded.
- an electronic device according to one aspect of the present disclosure includes a solid-state imaging device including a pixel group in which a plurality of pixels are two-dimensionally arranged, and a control unit that sets a plurality of types of exposure amounts in the pixel group in a first mode before recording of a captured image is started, and sets fewer types of exposure amounts in the pixel group than in the first mode in a second mode in which the captured image is recorded.
- control device and the electronic device may be independent devices, or may be internal blocks constituting one device.
- an appropriate exposure amount can be set at high speed.
- FIG. 5 is a diagram illustrating a comparison between the image pickup apparatus of FIG. 4 and the general normal shooting mode and HDR shooting mode; subsequent figures show the flow of captured image data.
- the imaging device 1 (FIG. 4) described in detail below has two shooting modes (shooting methods), a normal shooting mode and an HDR shooting mode, which are used selectively according to the operating state.
- Figure 1 summarizes the features of the normal shooting mode and the HDR shooting mode.
- in the normal shooting mode, a uniform exposure amount is set for each pixel arranged in the Bayer array, and an output signal of each pixel is obtained based on the set exposure amount.
- the normal shooting mode has the advantage of high resolution, but the disadvantage that there is no dynamic range expansion function. In the figure, "R1" represents an R pixel that receives red light, "G1" a G pixel that receives green light, and "B1" a B pixel that receives blue light.
- the HDR shooting mode is a mode in which an output signal having a high dynamic range (HDR) is obtained; pixels set with a large exposure amount and pixels set with a small exposure amount are mixed in the pixel group arranged in a Bayer array.
- a pixel having a large exposure amount is also referred to as a long pixel
- a pixel having a small exposure amount is also referred to as a short pixel.
- white background pixels on which “R 1 ”, “G 1 ”, and “B 1 ” are displayed indicate long-lived pixels
- FIG. 1 shows two types of HDR shooting modes that differ in the arrangement of long pixels and short pixels.
- the array P1 in the HDR shooting mode is an arrangement in which all odd lines are set as long pixels and all even lines are set as short pixels.
- the array P2 in the HDR shooting mode is an arrangement in which the long pixels and the short pixels are switched in units of 3 pixels in the vertical and horizontal directions, so that each forms diagonal lines as a whole.
- in the HDR shooting mode, even if the long pixels are saturated, signals can still be obtained from the short pixels and used to give gradation to bright parts. On the other hand, in a scene where the long pixels are saturated, no signal can be obtained from the long pixels, so the resolution is lowered.
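As a rough illustration (not from the patent), array P1 can be modeled as a boolean mask over the pixel grid, with True marking a long pixel; the function name and the mask representation are assumptions for illustration only:

```python
import numpy as np

def exposure_pattern_p1(rows, cols):
    """Sketch of array P1: alternate lines are long pixels (rows 0, 2, ...
    when 0-indexed) and the remaining lines are short pixels.
    Returns True where the pixel is a long pixel."""
    pattern = np.zeros((rows, cols), dtype=bool)
    pattern[0::2, :] = True  # long-pixel rows
    return pattern

mask = exposure_pattern_p1(4, 4)
# Rows 0 and 2 are long pixels, rows 1 and 3 are short pixels.
```

Array P2 would instead interleave long and short pixels diagonally; only P1 is sketched here.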
- an example of captured images further illustrates the difference between the normal shooting mode and the HDR shooting mode.
- FIG. 3A shows an image obtained by photographing the subject shown in FIG. 2 in a general normal photographing mode
- FIG. 3B shows an image obtained by photographing the subject shown in FIG. 2 in a general HDR photographing mode.
- the optimum exposure amount is set in the lower area which is a dark part, and the upper area which is a bright part is saturated because the dynamic range is not sufficient.
- both bright and dark gradations are secured.
- in B of FIG. 3, the long pixels in the upper region, which is a bright portion, are saturated and become missing pixels; that region is represented only by the short pixels, so the resolution is lowered.
- in the lower area, which is a dark part, noise is increased in the short pixels; therefore, when only the long pixels are used without the short pixels, the resolution is likewise lowered. In general, it is thus preferable to use the HDR shooting mode if priority is given to dynamic range, and the normal shooting mode if priority is given to resolution.
- the imaging device 1 therefore executes the HDR shooting mode during preview shooting before recording of the shot image is started, and executes the normal shooting mode during recording shooting, using the exposure amount information acquired in the HDR shooting mode.
- FIG. 4 is a block diagram illustrating a configuration example of the imaging apparatus according to the present disclosure.
- the optical lens 11 adjusts the focal length of the subject light incident on the solid-state image sensor 13.
- a stop (not shown) that adjusts the amount of subject light incident on the solid-state imaging device 13 is provided at the subsequent stage of the optical lens 11.
- the specific configuration of the optical lens 11 is arbitrary.
- the optical lens 11 may be configured by a plurality of lenses.
- the subject light transmitted through the optical lens 11 is incident on the solid-state imaging device 13 via the optical filter 12, which is configured as, for example, an IR cut filter that transmits light other than infrared light.
- the solid-state imaging device 13 includes a pixel group (pixel array unit) in which a plurality of pixels are two-dimensionally arranged, converts subject light into an electrical signal in units of pixels, and supplies the electrical signal to the A/D conversion unit 14.
- at the time of preview shooting, the solid-state imaging device 13 operates after being set to the HDR shooting mode by the 3A control unit 19 described later. At the time of recording shooting, on the other hand, the solid-state imaging device 13 operates after being set to the normal shooting mode by the 3A control unit 19.
- FIG. 5A is a diagram illustrating a state of the solid-state imaging device 13 when operating in the HDR imaging mode
- FIG. 5B is a diagram illustrating a state of the solid-state imaging device 13 when operating in the normal imaging mode.
- in the HDR shooting mode, the exposure amount is set for each pixel arranged in the Bayer array so that long and short pixels are alternately arranged in units of two rows.
- in the normal shooting mode, the exposure amount set for the long pixels in the HDR shooting mode is set uniformly for all the pixels.
- the A / D conversion unit 14 converts an analog electrical signal (pixel signal) supplied from the solid-state imaging device 13 into a digital pixel signal.
- the A / D conversion unit 14 supplies the digital pixel signal after A / D conversion to the clamp unit 15.
- the clamp unit 15 subtracts a black level, which is a level determined to be black, from the pixel signal output from the A / D conversion unit 14. Then, the clamp unit 15 outputs the pixel signal after the black level subtraction to the memory unit 16.
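The clamp step can be sketched as follows; the black level value of 64 is an assumed figure for illustration, not one given in the patent:

```python
import numpy as np

def clamp_black_level(pixel_data, black_level=64):
    # Subtract the black level and floor at zero so that dark pixels
    # do not become negative after the subtraction.
    return np.maximum(pixel_data.astype(np.int32) - black_level, 0)

raw = np.array([60, 64, 100, 1023])
clamped = clamp_black_level(raw)  # -> [0, 0, 36, 959]
```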
- the memory unit 16 temporarily stores the pixel signals (pixel data) from the clamp unit 15. The memory unit 16 then supplies the pixel data of the long pixels (hereinafter also referred to as long exposure data) to the long/short synthesis unit 18 and the demosaic unit 21, and supplies the pixel data of the short pixels (hereinafter also referred to as short exposure data) to the magnification calculation unit 17.
- the magnification calculation unit 17 corrects the magnification of the short exposure data to match the sensitivity of the long exposure data by multiplying the short exposure data supplied from the memory unit 16 by the exposure ratio. The magnification calculation unit 17 then supplies the magnification-corrected short exposure data to the long/short synthesis unit 18.
- FIG. 6 is a diagram for explaining the magnification correction processing by the magnification calculation unit 17.
- A of FIG. 6 shows the relationship between long exposure data and short exposure data. Even in a case where the long pixels are saturated, the short pixels are not saturated, because their exposure amount is small and they have not reached the saturation level.
- the magnification calculation unit 17 multiplies the short exposure data by the exposure ratio between the long and short pixels to generate the magnification-corrected short exposure data shown in B of FIG. 6. As a result, a signal having a high dynamic range is generated. Note that the slope of the magnification-corrected short exposure data is equal to the slope of the long exposure data.
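A minimal sketch of the magnification correction, assuming an illustrative long/short exposure ratio of 4 (the patent does not fix a value):

```python
import numpy as np

def correct_magnification(short_data, exposure_ratio=4.0):
    # Multiplying by the exposure ratio matches the slope of the
    # short exposure data to that of the long exposure data (B of FIG. 6).
    return short_data * exposure_ratio

short = np.array([10.0, 50.0, 200.0])
corrected = correct_magnification(short)  # -> [40.0, 200.0, 800.0]
```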
- in the HDR imaging mode, the long/short synthesis unit 18 synthesizes the magnification-corrected short exposure data supplied from the magnification calculation unit 17 with the long exposure data supplied from the memory unit 16. The long/short synthesis unit 18 then supplies the composite data obtained as a result to the 3A control unit 19, the moving subject detection unit 20, and the pattern detection unit 28.
- in the normal shooting mode, the pixel values of all the pixels are supplied from the memory unit 16 to the long/short synthesis unit 18 as long exposure data.
- the long/short synthesis unit 18 then supplies the long exposure data as-is to the 3A control unit 19, the moving subject detection unit 20, and the pattern detection unit 28.
- the composite data and the long exposure data supplied from the long/short synthesis unit 18 to the 3A control unit 19, the moving subject detection unit 20, and the pattern detection unit 28 may be pixel-unit data,
- or the pixel group may be divided into a plurality of blocks, and block-unit pixel data, such as the average value of the pixel data in each block, may be used.
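The synthesis performed by the long/short synthesis unit 18 can be sketched as below; replacing long pixel values only at saturation is a simplification of whatever weighting the actual unit uses:

```python
import numpy as np

SATURATION_LEVEL = 1023  # 10-bit signal: 1024 gradations

def synthesize(long_data, short_corrected):
    # Where the long pixel is saturated, use the magnification-corrected
    # short pixel value; otherwise keep the long pixel value.
    return np.where(long_data >= SATURATION_LEVEL, short_corrected, long_data)

long_px = np.array([500, 1023])
short_px = np.array([480, 2000])     # already magnification-corrected
hdr = synthesize(long_px, short_px)  # -> [500, 2000]
```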
- in the HDR imaging mode, the 3A control unit 19 performs 3A control based on the composite data, a high dynamic range signal supplied from the long/short synthesis unit 18. In the normal shooting mode, the 3A control unit 19 performs 3A control based on the long exposure data supplied from the long/short synthesis unit 18.
- 3A control means control of auto exposure (AE), auto focus (AF), and auto white balance (AWB).
- for example, the 3A control unit 19 determines whether or not the image is in focus based on contrast information of the composite data supplied from the long/short synthesis unit 18, and controls driving of the optical lens 11 based on the determination result. The 3A control unit 19 also determines the shutter time and the gain of each pixel of the solid-state imaging device 13 based on the composite data, and sets them in the solid-state imaging device 13. Furthermore, the 3A control unit 19 generates color control information for performing color processing such as white balance based on the composite data, and supplies it to the LM/WB/gamma correction unit 22.
- the moving subject detection unit 20 executes processing for detecting a moving subject in the captured image, and supplies the detection result to the control unit 30.
- the demosaic unit 21 performs demosaic processing on the long exposure data from the memory unit 16, interpolating the missing color information to convert the data into RGB data.
- the demosaic unit 21 supplies the image data after the demosaic process to the LM / WB / gamma correction unit 22.
- the LM/WB/gamma correction unit 22 corrects the color characteristics of the image data from the demosaic unit 21 using the color control information from the 3A control unit 19. Specifically, the LM/WB/gamma correction unit 22 corrects each color signal of the image data using matrix coefficients, to compensate for the difference between the chromaticity points of the primary colors (RGB) defined in the standard and the chromaticity points of the actual camera, thereby improving color reproducibility. The LM/WB/gamma correction unit 22 also adjusts the white balance by setting a gain for white for each channel of the image data.
- the LM / WB / gamma correction unit 22 adjusts the relative relationship between the color of the image data and the output device characteristics to perform gamma correction for obtaining a display closer to the original.
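White balance and gamma correction as described above can be sketched like this; the per-channel gains and the 2.2 gamma are assumed illustrative values, not parameters from the patent:

```python
import numpy as np

def wb_gamma(rgb, gains=(1.8, 1.0, 1.5), gamma=2.2):
    # Per-channel white balance gains, clipped to the valid range,
    # followed by a simple display gamma curve.
    balanced = np.clip(rgb * np.array(gains), 0.0, 1.0)
    return balanced ** (1.0 / gamma)

pixel = np.array([0.2, 0.5, 0.3])  # normalized RGB
out = wb_gamma(pixel)              # white-balanced, gamma-encoded output
```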
- the LM / WB / gamma correction unit 22 supplies the corrected image data to the luminance chroma signal generation unit 23.
- the luminance chroma signal generation unit 23 generates a luminance signal (Y) and color difference signals (Cr, Cb) from the image data supplied from the LM/WB/gamma correction unit 22.
- the luminance chroma signal generation unit 23 supplies the generated luminance chroma signals (Y, Cr, Cb) to the display 24 and the recording control unit 26.
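Luminance/chroma generation can be sketched with the ITU-R BT.601 coefficients; the patent does not name a standard, so BT.601 is an assumption:

```python
def rgb_to_ycbcr(r, g, b):
    # ITU-R BT.601 luma and color-difference coefficients.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

y, cb, cr = rgb_to_ycbcr(1.0, 1.0, 1.0)  # white: y ~ 1, cb ~ 0, cr ~ 0
```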
- the display 24 includes, for example, a display such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display, and a display driver that drives the display.
- the display 24 displays an image based on the image data (luminance chroma signal) supplied from the luminance chroma signal generation unit 23. In other words, the display 24 displays a moving image or a still image captured by the solid-state image sensor 13. A touch panel 25 is superimposed on the display 24, and an operation input to the display 24 by a user's finger or the like can be detected. Touch input information detected by the touch panel 25 is supplied to the control unit 30.
- the recording control unit 26 compresses the image data supplied from the luminance chroma signal generation unit 23 using a predetermined encoding method, and records it on the recording medium 27.
- the recording medium 27 is composed of, for example, a semiconductor memory, a magnetic recording medium, a magneto-optical recording medium, or the like, and stores (records) image data of a captured image that has been compression-encoded.
- the memory unit 29 stores a pattern image of a face or an object necessary for pattern detection by the pattern detection unit 28.
- the control unit 30 controls the operation of the entire imaging apparatus 1. For example, the control unit 30 determines whether preview shooting or recording shooting is being performed based on the operation of a recording button (not shown) that turns recording on and off, and supplies a control signal for switching between the normal shooting mode and the HDR shooting mode to the 3A control unit 19. Based on the control signal from the control unit 30, the 3A control unit 19 sets the exposure amount corresponding to the normal shooting mode or the HDR shooting mode for each pixel of the solid-state imaging device 13.
- the control unit 30 generates a 3A control area signal indicating the target area of 3A control based on the moving subject information supplied from the moving subject detection unit 20, the touch input information supplied from the touch panel 25, the pattern detection information supplied from the pattern detection unit 28, and the like, and supplies it to the 3A control unit 19.
- for example, the control unit 30 supplies, to the 3A control unit 19, information indicating the area designated by the user in the captured image, obtained from the touch input information supplied from the touch panel 25, as the target area of 3A control for adjusting the focus, the exposure amount, and the like.
- the control unit 30 also supplies, to the 3A control unit 19, information indicating the area where a predetermined moving object detected in the captured image exists, based on the moving subject information supplied from the moving subject detection unit 20, as the target area of 3A control for adjusting the focus, the exposure amount, and the like.
- similarly, the control unit 30 supplies, to the 3A control unit 19, information indicating the face area detected in the captured image, based on the pattern detection information supplied from the pattern detection unit 28, as the target area of 3A control for adjusting the focus, the exposure amount, and the like.
- the moving subject detection unit 20, the touch panel 25, and the pattern detection unit 28 thus function as a region specifying unit that specifies a predetermined region of the pixel group as the target region of 3A control (for example, determination of the exposure amount).
- the control unit 30 selects or combines the moving subject information, the touch input information, and the pattern detection information as necessary.
- the 3A control unit 19 and the control unit 30 can be realized, for example, by a CPU (Central Processing Unit) reading and executing a program stored in a ROM (Read Only Memory). It is also possible to provide the 3A control unit 19 and the control unit 30 as one control unit (control device).
- the 3A control unit 19 and the control unit 30 can be configured as a single chip together with the solid-state imaging device 13, or as an imaging module packaged together with the optical lens 11 and the optical filter 12.
- the 3A control unit 19 and the control unit 30 can also be configured as a single control device together with the magnification calculation unit 17 and the long/short synthesis unit 18. That is, each component of the imaging device 1 can be divided or integrated in arbitrary units and configured as a single device, module, chip, or the like.
- the imaging device 1 is configured as described above.
- FIG. 7 shows a flow of captured image data when the imaging apparatus 1 operates in the HDR imaging mode.
- pixel data of each pixel of the solid-state imaging device 13, with each pixel set as either a long pixel or a short pixel, is supplied to and stored in the memory unit 16.
- short exposure data, which is the pixel data of the short pixels, is supplied from the memory unit 16 to the magnification calculation unit 17, magnification-corrected there, and then supplied to the long/short synthesis unit 18.
- long exposure data, which is the pixel data of the long pixels, is supplied from the memory unit 16 to the long/short synthesis unit 18 and the demosaic unit 21.
- the magnification-corrected short exposure data and the long exposure data are synthesized, and the resulting composite data is supplied to the 3A control unit 19. Therefore, in the HDR imaging mode, 3A control is executed based on the high dynamic range composite data.
- the high dynamic range composite data is also supplied to the moving subject detection unit 20 and the pattern detection unit 28, which can therefore perform moving subject detection and pattern detection on it.
- the image displayed on the display 24 at the time of preview shooting may be an image based on the thinned long pixel signal as shown in A of FIG. 8, or, of course, an image in which the pixel data at the missing short pixel positions is interpolated as shown in B of FIG. 8 may be displayed.
- alternatively, the long/short synthesis unit 18 may supply the demosaic unit 21 with composite data obtained by synthesizing the magnification-corrected short exposure data and the long exposure data.
- in that case, unlike the composite data supplied to the 3A control unit 19 and the like, pixel data above the saturation level is not supplied; as shown in FIG. 9, composite data clipped to the same output range as the long pixels is supplied as the display signal to the demosaic unit 21.
- the magnification-corrected data of the short pixels is thus also expressed with 1024 gradations. Magnification-corrected short pixel data could express brighter areas as shown in B of FIG. 6, but the camera is assumed to operate in the normal shooting mode during recording, and clipping prevents a difference in the dynamic range of the display signal between preview shooting and recording shooting. Accordingly, there is no difference between the image displayed on the display 24 during preview shooting and the image displayed during recording shooting, so the video the user intends at recording time can be shot and recorded as seen.
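The display clip described above can be sketched as follows, using the 10-bit (1024-gradation) range from the patent's example:

```python
import numpy as np

MAX_CODE = 1023  # 10-bit pipeline: gradations 0..1023

def clip_for_display(composite):
    # The composite HDR data can exceed 1024 gradations; the display
    # signal is clipped to the long pixels' output range so that the
    # preview image matches what the normal shooting mode will record.
    return np.minimum(composite, MAX_CODE)

hdr = np.array([300, 1023, 4000])
display = clip_for_display(hdr)  # -> [300, 1023, 1023]
```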
- FIG. 10 shows a conceptual diagram of image data in the HDR photographing mode of the imaging apparatus 1.
- FIG. 10 shows a conceptual diagram of the image data of the short pixels and the long pixels input to the memory unit 16.
- since the number of calculation bits of the pixel signal is 10 bits, both the short pixels and the long pixels are expressed with 1024 gradations.
- FIG. 10 also shows a conceptual diagram of the composite data after synthesis by the long/short synthesis unit 18.
- the composite data after the synthesis processing is expressed in a high dynamic range exceeding 1024 gradations.
- the 3A control unit 19 receives the composite data having a high dynamic range exceeding 1024 gradations, while the display 24 displays an image in which the composite data is clipped at 1024 gradations.
- therefore, even when an area that is clipped (saturated) on the display 24 is designated as the 3A control target area, 3A control can be executed immediately so that the area is optimally exposed.
- FIG. 11 shows a flow of photographed image data when the imaging apparatus 1 operates in the normal photographing mode.
- all pixels of the solid-state image sensor 13 are set to the same exposure amount.
- the long exposure data is stored in the memory unit 16. Is memorized.
- the long exposure data stored in the memory unit 16 is supplied to the long and short livestock synthesizing unit 18 and the demosaic unit 21.
- the long and short livestock synthesizing unit 18 supplies the long exposure data supplied from the memory unit 16 to the 3A control unit 19, the moving subject detection unit 20, and the pattern detection unit 28 as they are. Therefore, in the normal photographing mode, 3A control is executed based on image data obtained by unifying the exposure amount in all pixels.
- since the image data obtained by setting all pixels to the exposure amount of the long-accumulation pixels is supplied to the demosaic unit 21, a high-resolution image set to the optimum exposure amount is displayed on the display 24 and, at the same time, recorded on the recording medium 27.
- in the general normal shooting mode, the display image shown on the display 24 during preview shooting, the signal used for 3A control, and the display image shown on the display 24 during recording shooting (the same applies to the recorded image) all have high resolution but a narrow dynamic range.
- in the general HDR shooting mode, the dynamic range of the display image shown on the display 24 during preview shooting, the signal used for 3A control, and the display image shown on the display 24 during recording shooting is uniformly wide, but at the expense of resolution.
- in contrast, during preview shooting the imaging apparatus 1 drives the solid-state imaging device 13 in the same way as the general HDR shooting mode and uses the same high-dynamic-range signal for 3A control, but only the signal of the long-accumulation pixels is used for the display image shown on the display 24. Further, since an image based on the long-accumulation pixel signal is displayed on the display 24 during recording as well, the imaging apparatus 1 has the feature that the appearance of the preview display image and of the recording display image does not change.
- since the imaging apparatus 1 uses a high-dynamic-range signal for 3A control, even when an area that would be saturated in the general normal shooting mode is touch-input by the user as the 3A control target area, an appropriate exposure amount at the touch position can be calculated and controlled immediately.
- accordingly, even when shooting a scene with strong contrast, the imaging apparatus 1 can instantaneously set the exposure amount, focus, and the like to optimum values during preview shooting, and can solve the problem that the recorded image generated in the general HDR shooting mode has low resolution.
- in other words, the imaging apparatus 1 performs control using a pixel signal with a wide dynamic range in preview shooting, which leaves no image, and performs control that increases resolution by setting a single exposure amount in recording shooting, which leaves an image.
- FIG. 13 shows the flow of imaging processing by the imaging apparatus 1.
- a narrow dynamic range (clipped) image 51 is displayed on the display 24.
- the touch position detected by the touch panel 25 is supplied to the 3A control unit 19 via the control unit 30.
- the 3A control unit 19 calculates an appropriate 3A control amount for the region 52, which was not properly rendered (was clipped) on the display 24.
- the touch position is controlled so as to have an appropriate exposure amount, focus position, and white balance. As a result, an image 53 in which the exposure amount, focus position, and the like are appropriately set according to the touch position is displayed on the display 24.
- a moving image (images 54-1, 54-2, 54-3, ...) in which the appropriately set exposure amount is applied to all pixels is shot, displayed on the display 24, and at the same time recorded on the recording medium 27.
- in the imaging device 1 of the present disclosure, since a high-dynamic-range signal based on the composite data is used for 3A control, an appropriate exposure amount can be set quickly and accurately, and the focus adjusted, even when the touch position is saturated on the display 24.
- an appropriate exposure amount can also be set quickly and accurately, and the focus adjusted, by applying the high-dynamic-range signal based on the composite data only to the 3A control of preview shooting.
- in step S1, the 3A control unit 19 drives the solid-state imaging device 13 in the HDR imaging mode based on a command from the control unit 30.
- the solid-state image sensor 13 starts driving in the HDR imaging mode.
- the exposure amount of each pixel is set to that of either a long-accumulation pixel or a short-accumulation pixel, and the captured pixel data is supplied to the memory unit 16.
- in step S2, the memory unit 16 supplies the short exposure data (pixel data of the short-accumulation pixels) to the magnification calculation unit 17, and the long exposure data (pixel data of the long-accumulation pixels) to the long/short-accumulation synthesizing unit 18 and the demosaic unit 21.
- in step S3, the magnification calculator 17 multiplies the short exposure data supplied from the memory unit 16 by the exposure ratio, and supplies the magnification-corrected short exposure data to the long/short-accumulation synthesizer 18.
- in step S4, the long/short-accumulation synthesizing unit 18 synthesizes the magnification-corrected short exposure data supplied from the magnification calculating unit 17 with the long exposure data supplied from the memory unit 16.
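Steps S3 and S4 can be sketched per pixel as follows. This is a minimal illustration: the saturation threshold, the per-pixel selection rule, and all names are assumptions; the embodiment does not specify how the two signals are combined.

```python
# Sketch of magnification correction (step S3) and long/short synthesis
# (step S4): the short-accumulation value is multiplied by the exposure
# ratio, and the corrected value is used where the long-accumulation
# pixel is saturated. Threshold and names are illustrative assumptions.

SATURATION = 1023  # 10-bit saturation level of a long-accumulation pixel

def synthesize(long_px, short_px, exposure_ratio):
    corrected = short_px * exposure_ratio  # magnification correction (step S3)
    if long_px >= SATURATION:
        return corrected   # long pixel saturated: use the corrected short pixel
    return long_px         # otherwise the long pixel has the better S/N

print(synthesize(500, 40, 16))    # 500  (long pixel valid)
print(synthesize(1023, 200, 16))  # 3200 (saturated: 200 * 16, beyond 1024 gradations)
```

The second result shows how the composite data comes to exceed 1024 gradations, as described for FIG. 10.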
- in step S5, the long/short-accumulation synthesizing unit 18 supplies the composite data obtained as a result of the synthesis to the 3A control unit 19, the moving subject detection unit 20, and the pattern detection unit 28.
- in step S6, the 3A control unit 19 performs 3A control based on the composite data, which is a high-dynamic-range signal.
- in step S7, the display 24 displays an image based on the long exposure data. Note that the process of step S7 can be executed in parallel with the processes of steps S3 to S6 described above.
- in step S8, the control unit 30 determines whether or not the user has touched the display 24 based on touch input information from the touch panel 25.
- if it is determined in step S8 that there is no user touch, the process returns to step S1 and the subsequent processes are executed again. That is, 3A control based on the composite data and display of the image based on the long exposure data on the display 24 continue.
- if it is determined in step S8 that the user has touched, the process proceeds to step S9, and the control unit 30 generates a 3A control target area signal indicating the touch position based on the touch input information from the touch panel 25 and supplies it to the 3A control unit 19.
- in step S10, the 3A control unit 19 determines, based on the acquired 3A control target area signal, whether the touch position is an area where the long-accumulation pixels are saturated.
- if it is determined in step S10 that the touch position is an area where the long-accumulation pixels are saturated, the process proceeds to step S11, and the 3A control unit 19 performs 3A control using the short exposure data around the touch position.
- the 3A control unit 19 can instantaneously calculate the optimum exposure amount using the pixel data of the short-accumulation pixels, for example, according to the following calculation formula.
- Optimum exposure amount Y < saturation signal value a ÷ (short-accumulation pixel value b × exposure ratio c) × long-accumulation exposure amount d
- the formula for calculating the optimum exposure amount is not limited to the above, and any formula can be adopted.
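The relation above can be evaluated numerically as follows. This is a minimal sketch: the function computes the saturation-avoiding upper bound d × a ÷ (b × c), below which the optimum exposure Y is chosen; the function name, the 10-bit saturation value, and the sample numbers are illustrative assumptions.

```python
# Sketch of the optimum-exposure calculation of step S11, following
# "optimum exposure Y < saturation value a / (short pixel value b *
# exposure ratio c) * long exposure d". The product b * c estimates what
# the saturated long-accumulation pixel would have read without clipping.

def exposure_bound(saturation_a, short_value_b, exposure_ratio_c, long_exposure_d):
    estimated_long = short_value_b * exposure_ratio_c  # unclipped estimate
    return long_exposure_d * saturation_a / estimated_long

# Long pixel saturated at 1023; short pixel read 200 at 1/16 the exposure:
bound = exposure_bound(1023, 200, 16, 1/30)
print(bound < 1/30)  # True: the exposure must be reduced below the current setting
```

Because the short-accumulation pixel is unsaturated, this bound is available in a single frame, which is why the control is instantaneous.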
- if it is determined in step S10 that the touch position is not an area where the long-accumulation pixels are saturated, the process proceeds to step S12, and the 3A control unit 19 performs 3A control using the long exposure data at the touch position.
- the exposure amount of the long-accumulation pixels is set to the optimum exposure amount.
- the exposure amount of the short-accumulation pixels can be an arbitrary (fixed) exposure amount set in advance, or an exposure amount obtained at a predetermined exposure ratio with respect to the exposure amount of the long-accumulation pixels set to the optimum value.
- in step S13, the control unit 30 determines whether or not the recording button has been pressed. If it is determined in step S13 that the recording button has not yet been pressed, the process returns to step S1 and the above-described processing is repeated.
- if it is determined in step S13 that the recording button has been pressed, the process proceeds to step S14, where the control unit 30 changes the shooting mode from the HDR shooting mode to the normal shooting mode, the solid-state imaging device 13 starts driving in the normal shooting mode, and the preview shooting process ends.
- the exposure amount set for the long-accumulation pixels of the solid-state imaging device 13 in the HDR shooting mode is set for all the pixels.
- in the example described above, the pixel data of the long-accumulation pixels is supplied to the display 24 in the HDR shooting mode during preview shooting, and the exposure amount of all pixels is unified with that of the long-accumulation pixels in the normal shooting mode during recording shooting; however, pixel data of the short-accumulation pixels may be used instead. That is, in the HDR shooting mode during preview shooting, the pixel data of the short-accumulation pixels may be supplied to the display 24, and in the normal shooting mode during recording shooting, the exposure amount of all pixels may be unified with that of the short-accumulation pixels. In this case, it is the short-accumulation pixels that are set to the optimum exposure amount in the processes of steps S11 and S12 described above.
- in the example described above, there are two types of exposure amount in the HDR shooting mode during preview shooting, short accumulation and long accumulation, but three or more types of exposure amount can also be used in the HDR shooting mode.
- in that case as well, the number of exposure-amount types in the normal shooting mode during recording shooting is smaller than in the HDR shooting mode during preview shooting.
- suppose, for example, that the imaging apparatus 1 performs 3A control using four types of exposure amount, exposure amount +, exposure amount ++, exposure amount +++, and exposure amount ++++, in the HDR shooting mode during preview shooting.
- “+” represents a constant exposure amount, and the greater the number of “+”, the greater the exposure amount.
- in that case, the imaging apparatus 1 displays on the display 24 an image using the two types of pixel data with the larger exposure amounts, exposure amount +++ and exposure amount ++++. The imaging apparatus 1 then performs 3A control so that the pixels of exposure amount +++ and exposure amount ++++ have optimum exposure amounts. Thereafter, when changing from the HDR shooting mode to the normal shooting mode, the imaging apparatus 1 records on the recording medium 27 an image obtained by setting the exposure amount of all pixels to exposure amount +++ or exposure amount ++++.
- in other words, the imaging apparatus 1 can set a plurality of types of exposure amount in the pixel group in the HDR shooting mode during preview shooting, and in the normal shooting mode during recording shooting can set in the pixel group at least one type of exposure amount, fewer than in the HDR shooting mode during preview shooting. Even in such a case, since the number of exposure-amount types is reduced and the resolution improved during recording shooting compared with preview shooting, the sense of resolution can be enhanced relative to preview shooting.
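The mode switch described above can be sketched as follows. This is a minimal illustration: the row-interleaved arrangement and the concrete "+" labels are assumptions for demonstration; the embodiment leaves the actual arrangement arbitrary.

```python
# Sketch: preview (HDR mode) assigns multiple exposure types over the
# pixel array; recording (normal mode) unifies all pixels to one of the
# types already used in preview. A simple row interleave is assumed.

def preview_pattern(rows, cols, types):
    # Interleave exposure types by row (illustrative; cf. FIG. 5).
    return [[types[r % len(types)] for _ in range(cols)] for r in range(rows)]

def recording_pattern(rows, cols, keep):
    # Unify the whole pixel group to a single exposure type.
    return [[keep] * cols for _ in range(rows)]

p = preview_pattern(4, 4, ["+", "++", "+++", "++++"])
r = recording_pattern(4, 4, "+++")
print(len({e for row in p for e in row}))  # 4 exposure types in preview
print(len({e for row in r for e in row}))  # 1 exposure type in recording
```

Fewer exposure types at recording time means more pixels share one exposure, which is what restores the sense of resolution.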
- the arrangement of the short-accumulation pixels and long-accumulation pixels in the HDR shooting mode during preview shooting is not limited to the arrangement shown in A of FIG. 5 and is arbitrary.
- an arrangement method such as the arrangement P1 or P2 shown in FIG. 1 may be used.
- an arrangement in which 2×2 pixels are regarded as one unit pixel, as described in JP 2010-28423 A, may also be used.
- in the above description, the exposure amount (AE) control has mainly been described among the 3A controls, but it goes without saying that autofocus control and auto white balance control can be performed similarly.
- for example, the autofocus control may be performed using the contrast information of the short-accumulation pixels, or of a part of the pixels arranged in the central portion of the 3×3 pixels shown in FIG.
- the phase difference pixels may be mixed in the pixel group of the solid-state imaging device 13, and the 3A control unit 19 may perform autofocus control using a signal from the phase difference pixels.
- since phase difference pixels are partially shielded from light, they are less likely to saturate than normal pixels even when given the exposure amount of a long-accumulation pixel; if they do saturate, the autofocus control may be performed using phase difference pixels given the exposure amount of a short-accumulation pixel. Note that an autofocus control method using phase difference pixel signals is described in, for example, Japanese Patent Application Laid-Open No. 2010-160313.
- a long-accumulation pixel with a large exposure amount will saturate regardless of whether the light source is a fluorescent lamp or the sun, but a short-accumulation pixel can be set so as not to saturate under a certain level of brightness (for example, a fluorescent lamp).
- therefore, the 3A control unit 19 can appropriately control the auto white balance based on the light source estimation result. For example, if the light source is a warm-colored incandescent bulb, control such as slightly strengthening the red in the white balance to bring the image closer to the memory color is possible.
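The light-source estimation described above can be sketched as follows. This is a minimal illustration: the 10-bit saturation threshold and the category labels are assumptions; the embodiment only states that the differing saturation behavior of long- and short-accumulation pixels can separate light sources.

```python
# Sketch: a long-accumulation pixel saturates on any bright source, so
# whether the short-accumulation pixel ALSO saturates distinguishes, e.g.,
# the sun from a fluorescent lamp. Threshold and labels are illustrative.

SATURATION = 1023  # 10-bit saturation level

def estimate_source(long_px, short_px):
    if long_px < SATURATION:
        return "dim"  # not even the long-accumulation pixel saturates
    if short_px >= SATURATION:
        return "very bright (e.g. sun)"       # both pixel types saturate
    return "bright (e.g. fluorescent lamp)"   # only the long pixel saturates

print(estimate_source(1023, 1023))  # very bright (e.g. sun)
print(estimate_source(1023, 400))   # bright (e.g. fluorescent lamp)
print(estimate_source(700, 40))     # dim
```

A white-balance gain table indexed by such an estimate would then implement the incandescent-bulb example above.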
- the imaging function of the imaging apparatus 1 described above can be applied to electronic devices in general: for example, digital still cameras, video cameras, portable terminals such as smartphones (multi-function mobile phones) having an imaging function, capsule endoscopes, and spectacle-type imaging devices in which an imaging function is added to glasses.
- here, the imaging function of the imaging device 1 means the function of generating the pixel signal of the solid-state imaging device 13, and the function of performing control using the generated pixel signal by the magnification calculator 17, the long/short-accumulation synthesizer 18, the 3A controller 19, the demosaic unit 21, the controller 30, and the like.
- FIG. 16 shows a configuration example in which the imaging function of the imaging apparatus 1 is in the form of a chip. That is, the imaging function of the imaging device 1 can be configured in a chip form as shown as the imaging devices 101 to 103 in FIG.
- the imaging device 101 shown in the upper part of FIG. 16 includes, in one semiconductor chip 121, a pixel region 122 in which the pixel group is formed, a control circuit 123 that supplies control signals to the pixel group, and a logic circuit 124 including the signal processing circuits of the 3A control unit 19 and the control unit 30 described above.
- the imaging device 102 shown in the middle of FIG. 16 includes a first semiconductor chip part 131 and a second semiconductor chip part 132.
- the pixel region 133 and the control circuit 134 are mounted on the first semiconductor chip unit 131, and the logic circuit 135 including the signal processing circuits of the 3A control unit 19 and the control unit 30 described above is mounted on the second semiconductor chip unit 132.
- the first semiconductor chip portion 131 and the second semiconductor chip portion 132 are electrically connected to each other, whereby the imaging device 102 as one semiconductor chip is configured.
- the imaging device 103 shown in the lower part of FIG. 16 includes a first semiconductor chip unit 141 and a second semiconductor chip unit 142.
- a pixel region 143 is mounted on the first semiconductor chip portion 141, and a logic circuit 145 including the control circuit 144 and the signal processing circuits of the 3A control unit 19 and the control unit 30 described above is mounted on the second semiconductor chip portion 142.
- the first semiconductor chip portion 141 and the second semiconductor chip portion 142 are electrically connected to each other, whereby the imaging device 103 as one semiconductor chip is configured.
- FIG. 17 is a diagram illustrating a cross-sectional configuration of a capsule endoscope in which the imaging function of the imaging device 1 is mounted.
- the capsule endoscope 200 shown in FIG. 17 includes, in a case 210 having, for example, hemispherical end faces and a cylindrical center part, a camera (micro-miniature camera) 211 for photographing an image of a body cavity, a memory 212 for recording image data photographed by the camera 211, a wireless transmitter 213 and an antenna 214 for transmitting the recorded image data to the outside, a CPU 215, and a coil (magnetic-force/current conversion coil) 216.
- the CPU 215 controls photographing by the camera 211 and data accumulation operation in the memory 212, and controls data transmission from the memory 212 to a data receiving device (not shown) outside the housing 210 by the wireless transmitter 213.
- the coil 216 supplies power to the camera 211, the memory 212, the wireless transmitter 213, the antenna 214, and a light source 211b described later.
- the casing 210 is provided with a reed (magnetic) switch 217 for detecting when the capsule endoscope 200 is set in the data receiving apparatus.
- when the reed switch 217 detects that the capsule endoscope has been set in the data receiving device and data transmission becomes possible, power supply from the coil 216 to the wireless transmitter 213 starts.
- the camera 211 includes, for example, a solid-state imaging device 211a including an objective optical system for taking an image of a body cavity, and a plurality (here, two) of light sources 211b that illuminate the inside of the body cavity; the light sources 211b are composed of, for example, LEDs (Light Emitting Diodes).
- the solid-state imaging device 211a corresponds to the solid-state imaging device 13 in FIG. 4, and the CPU 215 performs control corresponding to the 3A control unit 19 and the control unit 30 in FIG.
- FIG. 18 is a diagram illustrating an external configuration of a smartphone equipped with the imaging function of the imaging device 1.
- the smartphone 300 includes a speaker 311, a display 312, operation buttons 313, a microphone 314, an imaging unit 315, and the like.
- the transmitted voice acquired by the microphone 314 is sent to a base station via a communication unit (not shown), and the received voice from the other party is supplied from the communication unit to the speaker 311 and reproduced.
- the display 312 includes, for example, an LCD (Liquid Crystal Display), and displays a predetermined screen such as a telephone standby screen.
- a touch panel is superimposed on the display 312, and an operation input to the display 312 by a user's finger or the like can be detected.
- the smartphone 300 can perform predetermined processing, for example, execution of a shooting function, in accordance with the detected user operation input.
- the imaging unit 315 includes a solid-state imaging device and an optical lens, and captures an image of a subject and stores image data obtained as a result in an internal memory or the like.
- the solid-state imaging device of the imaging unit 315 corresponds to the solid-state imaging device 13 in FIG. 4, and the 3A control unit 19 and the control unit 30 are realized by a CPU provided in the smartphone 300.
- FIG. 19 shows a configuration example of a spectacle-type imaging device equipped with the imaging function of the imaging device 1.
- the spectacle-type imaging device 400 of FIG. 19 includes a solid-state image sensor 412 attached to the center of a frame 411 to which the spectacle lenses 413 are fixed, and a housing 414 in which an image signal processing circuit for driving and controlling the solid-state image sensor 412 is housed.
- the solid-state image sensor 412 corresponds to the solid-state image sensor 13 in FIG. 4, and the image signal processing circuit provided in the housing 414 has the control functions of the 3A control unit 19 and the control unit 30 in FIG. 4; for example, 3A control is performed in accordance with the direction (movement) of the eyeball.
- the image data captured by the solid-state image sensor 412 is transmitted to an external circuit via the communication cable 415.
- the eyeglass-type imaging device 400 may have a wireless communication function and transmit image data by wireless communication.
- an image photographed by the solid-state image sensor 412 may be projected onto the eyeglass lens 413.
- in the above description, the pixel group of the solid-state imaging device 13 has been described as having a Bayer arrangement, but another arrangement such as a ClearVid arrangement may be used.
- the color filter may include not only R, G, and B but also a white filter (W), an infrared filter (IR), and the like.
- the solid-state imaging device 13 may be a back-side-illuminated or front-side-illuminated solid-state imaging device, and may also be a stacked, vertical-spectral-type solid-state imaging device in which an organic photoelectric conversion film and inorganic photoelectric conversion layers are stacked in the vertical direction, as disclosed in JP 2011-29337 A.
- the present disclosure can also be configured as follows.
- (1) A control device including a control unit that controls an exposure amount of a pixel group in which a plurality of pixels are two-dimensionally arranged, wherein the control unit sets a plurality of types of exposure amount in the pixel group in a first mode before starting recording of a captured image, and sets fewer types of exposure amount than in the first mode in the pixel group in a second mode in which a captured image is recorded.
- (2) The control device according to (1), wherein the control unit sets, in the pixel group in the second mode, at least one of the plurality of types of exposure amount set in the pixel group in the first mode.
- (3) The control device according to (1) or (2), wherein, in the first mode, the control unit sets in the pixel group the exposure amount obtained using pixel signals of unsaturated pixels, without using pixel signals of saturated pixels.
- (4) The control device according to (3), wherein the control unit sets in the pixel group the exposure amount obtained using the pixel signal of the unsaturated pixel having the larger S/N ratio among the pixel signals of the plurality of types of exposure amount.
- (5) The control device according to any one of (1) to (4), wherein, in the first mode, the control unit sets the exposure amount obtained using a high-dynamic-range pixel signal among the pixel signals of the plurality of types of exposure amount.
- (6) The control device according to (5), wherein the control unit sets the exposure amount obtained using the pixel signal having a higher dynamic range than the pixel signal for display.
- (7) The control device according to (5), further including a region specifying unit that specifies a predetermined region of the pixel group as a target region for determining the exposure amount, wherein the control unit sets the exposure amount obtained using the high-dynamic-range pixel signal of the region specified by the region specifying unit.
- (8) The control device according to (7), wherein the region specifying unit specifies, as the predetermined region serving as the target region for determining the exposure amount, a region of a user's face in the captured image.
- (9) The control device according to (7), wherein the region specifying unit specifies, as the predetermined region serving as the target region for determining the exposure amount, a region designated by the user in the captured image.
- (10) The control device according to (7), wherein the region specifying unit specifies, as the predetermined region serving as the target region for determining the exposure amount, a region in which a predetermined object exists in the captured image.
- (11) The control device according to any one of (1) to (10), wherein the control unit sets two types of exposure amount in the pixel group in the first mode, and sets one type of exposure amount in the pixel group in the second mode.
- (12) The control device according to (11), further including a synthesizing unit that synthesizes a pixel signal magnification-corrected by multiplying the pixel signal of a pixel with a small exposure amount by the exposure ratio with the pixel signal of a pixel with a large exposure amount, wherein, in the first mode, the synthesizing unit synthesizes the magnification-corrected pixel signal with the pixel signal of the pixel with the large exposure amount and supplies the result to the control unit, and in the second mode supplies a pixel signal of one type of exposure amount to the control unit.
- (13) The control device according to any one of (1) to (12), further including a synthesizing unit that synthesizes a pixel signal magnification-corrected by multiplying the pixel signal of a pixel with a small exposure amount by the exposure ratio with the pixel signal of a pixel with a large exposure amount, wherein, in the first mode, the synthesizing unit outputs, as a signal for display, a signal obtained by clipping the signal synthesized from the magnification-corrected pixel signal and the pixel signal of the pixel with the large exposure amount at the display gradation of the second mode.
- (14) The control device according to (12), wherein, in the first mode, the control unit also performs autofocus control based on the signal synthesized by the synthesizing unit.
- (15) The control device according to (12) or (14), wherein, in the first mode, the control unit also generates color control information for white balance based on the signal synthesized by the synthesizing unit.
- (16) The control device according to any one of (1) to (15), wherein the pixel group includes phase difference pixels, and the control unit also performs autofocus control using pixel signals of the phase difference pixels.
- (17) A control method in which a control device that controls an exposure amount of a pixel group in which a plurality of pixels are two-dimensionally arranged sets a plurality of types of exposure amount in the pixel group in a first mode before starting recording of a captured image, and sets fewer types of exposure amount than in the first mode in the pixel group in a second mode in which a captured image is recorded.
- (18) An electronic device including: a solid-state imaging device including a pixel group in which a plurality of pixels are two-dimensionally arranged; and a control unit that sets a plurality of types of exposure amount in the pixel group in a first mode before starting recording of a captured image, and sets fewer types of exposure amount than in the first mode in the pixel group in a second mode in which a captured image is recorded.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Studio Devices (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Power Engineering (AREA)
- Color Television Image Signal Generators (AREA)
- Electromagnetism (AREA)
- Condensed Matter Physics & Semiconductors (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Microelectronics & Electronic Packaging (AREA)
Abstract
Description
1. Description of the Normal Shooting Mode and the HDR Shooting Mode
2. Configuration Example of the Imaging Apparatus
3. Flow of Captured Image Data in the HDR Shooting Mode
4. Flow of Captured Image Data in the Normal Shooting Mode
5. Flow of the Shooting Process
6. Processing Flow of the Preview Shooting Process
7. Modifications
8. Application Examples to Electronic Devices
The imaging apparatus 1 (FIG. 4) described in detail below has two shooting modes (shooting methods), the normal shooting mode and the HDR shooting mode, and uses them selectively according to the operating state.
FIG. 4 is a block diagram showing a configuration example of an imaging apparatus according to the present disclosure.
FIG. 7 shows the flow of captured image data when the imaging apparatus 1 operates in the HDR shooting mode.
FIG. 11 shows the flow of captured image data when the imaging apparatus 1 operates in the normal shooting mode.
With reference to FIG. 12, the imaging apparatus 1 of the present disclosure is further compared with the general normal shooting mode and the general HDR shooting mode.
FIG. 13 shows the flow of the shooting process by the imaging apparatus 1.
Next, the preview shooting process, which is the processing performed by the imaging apparatus 1 during preview shooting, will be described with reference to the flowchart of FIG. 14. For example, when the imaging apparatus 1 is powered on, the process of FIG. 14 starts.
Optimum exposure amount Y < saturation signal value a ÷ (short-accumulation pixel value b × exposure ratio c) × long-accumulation exposure amount d
In the example described above, in the HDR shooting mode during preview shooting, the pixel data of the long-accumulation pixels is supplied to the display 24, and in the normal shooting mode during recording shooting, the exposure amount of all pixels is unified with that of the long-accumulation pixels; however, pixel data of the short-accumulation pixels may be used instead. That is, in the HDR shooting mode during preview shooting, the pixel data of the short-accumulation pixels may be supplied to the display 24, and in the normal shooting mode during recording shooting, the exposure amount of all pixels may be unified with that of the short-accumulation pixels. In this case, it is the short-accumulation pixels that are set to the optimum exposure amount in the processing of steps S11 and S12 described above.
The imaging function of the imaging apparatus 1 described above can be applied to electronic devices in general: for example, digital still cameras, video cameras, portable terminals such as smartphones (multi-function mobile phones) having an imaging function, capsule endoscopes, and spectacle-type imaging devices in which an imaging function is added to glasses.
FIG. 16 shows a configuration example in which the imaging function of the imaging apparatus 1 is in chip form. That is, the imaging function of the imaging apparatus 1 can be configured in chip form as shown as imaging devices 101 to 103 in FIG. 16.
FIG. 17 is a diagram showing a cross-sectional configuration of a capsule endoscope equipped with the imaging function of the imaging apparatus 1.
FIG. 18 is a diagram showing the external configuration of a smartphone equipped with the imaging function of the imaging apparatus 1.
FIG. 19 shows a configuration example of a spectacle-type imaging device equipped with the imaging function of the imaging apparatus 1.
(1)
A control device including a control unit that controls an exposure amount of a pixel group in which a plurality of pixels are two-dimensionally arranged,
wherein the control unit sets a plurality of types of exposure amount in the pixel group in a first mode before starting recording of a captured image, and sets fewer types of exposure amount than in the first mode in the pixel group in a second mode in which a captured image is recorded.
(2)
The control device according to (1), wherein the control unit sets, in the pixel group in the second mode, at least one of the plurality of types of exposure amount set in the pixel group in the first mode.
(3)
The control device according to (1) or (2), wherein, in the first mode, the control unit sets in the pixel group the exposure amount obtained using pixel signals of unsaturated pixels, without using pixel signals of saturated pixels.
(4)
The control device according to (3), wherein the control unit sets in the pixel group the exposure amount obtained using the pixel signal of the unsaturated pixel having the larger S/N ratio among the pixel signals of the plurality of types of exposure amount.
(5)
The control device according to any one of (1) to (4), wherein, in the first mode, the control unit sets the exposure amount obtained using a high-dynamic-range pixel signal among the pixel signals of the plurality of types of exposure amount.
(6)
The control device according to (5), wherein the control unit sets the exposure amount obtained using the pixel signal having a higher dynamic range than the pixel signal for display.
(7)
The control device according to (5), further including a region specifying unit that specifies a predetermined region of the pixel group as a target region for determining the exposure amount, wherein the control unit sets the exposure amount obtained using the high-dynamic-range pixel signal of the region specified by the region specifying unit.
(8)
The control device according to (7), wherein the region specifying unit specifies, as the predetermined region serving as the target region for determining the exposure amount, a region of a user's face in the captured image.
(9)
The control device according to (7), wherein the region specifying unit specifies, as the predetermined region serving as the target region for determining the exposure amount, a region designated by the user in the captured image.
(10)
The control device according to (7), wherein the region specifying unit specifies, as the predetermined region serving as the target region for determining the exposure amount, a region in which a predetermined object exists in the captured image.
(11)
The control device according to any one of (1) to (10), wherein the control unit sets two types of exposure amount in the pixel group in the first mode, and sets one type of exposure amount in the pixel group in the second mode.
(12)
The control device according to (11), further including a synthesizing unit that synthesizes a pixel signal magnification-corrected by multiplying the pixel signal of a pixel with a small exposure amount by the exposure ratio with the pixel signal of a pixel with a large exposure amount, wherein, in the first mode, the synthesizing unit synthesizes the magnification-corrected pixel signal with the pixel signal of the pixel with the large exposure amount and supplies the result to the control unit, and in the second mode supplies a pixel signal of one type of exposure amount to the control unit.
(13)
The control device according to any one of (1) to (12), further including a synthesizing unit that synthesizes a pixel signal magnification-corrected by multiplying the pixel signal of a pixel with a small exposure amount by the exposure ratio with the pixel signal of a pixel with a large exposure amount, wherein, in the first mode, the synthesizing unit outputs, as a signal for display, a signal obtained by clipping the signal synthesized from the magnification-corrected pixel signal and the pixel signal of the pixel with the large exposure amount at the display gradation of the second mode.
(14)
The control device according to (12), wherein, in the first mode, the control unit also performs autofocus control based on the signal synthesized by the synthesizing unit.
(15)
The control device according to (12) or (14), wherein, in the first mode, the control unit also generates color control information for white balance based on the signal synthesized by the synthesizing unit.
(16)
The control device according to any one of (1) to (15), wherein the pixel group includes phase difference pixels, and the control unit also performs autofocus control using pixel signals of the phase difference pixels.
(17)
A control method in which a control device that controls an exposure amount of a pixel group in which a plurality of pixels are two-dimensionally arranged sets a plurality of types of exposure amount in the pixel group in a first mode before starting recording of a captured image, and sets fewer types of exposure amount than in the first mode in the pixel group in a second mode in which a captured image is recorded.
(18)
An electronic device including: a solid-state imaging device including a pixel group in which a plurality of pixels are two-dimensionally arranged; and a control unit that sets a plurality of types of exposure amount in the pixel group in a first mode before starting recording of a captured image, and sets fewer types of exposure amount than in the first mode in the pixel group in a second mode in which a captured image is recorded.
Claims (18)
- A control device comprising a control unit that controls an exposure amount of a pixel group in which a plurality of pixels are two-dimensionally arranged, wherein the control unit sets a plurality of types of exposure amount in the pixel group in a first mode before starting recording of a captured image, and sets fewer types of exposure amount than in the first mode in the pixel group in a second mode in which a captured image is recorded.
- The control device according to claim 1, wherein the control unit sets, in the pixel group in the second mode, at least one of the plurality of types of exposure amount set in the pixel group in the first mode.
- The control device according to claim 1, wherein, in the first mode, the control unit sets in the pixel group the exposure amount obtained using pixel signals of unsaturated pixels, without using pixel signals of saturated pixels.
- The control device according to claim 3, wherein the control unit sets in the pixel group the exposure amount obtained using the pixel signal of the unsaturated pixel having the larger S/N ratio among the pixel signals of the plurality of types of exposure amount.
- The control device according to claim 1, wherein, in the first mode, the control unit sets the exposure amount obtained using a high-dynamic-range pixel signal among the pixel signals of the plurality of types of exposure amount.
- The control device according to claim 5, wherein the control unit sets the exposure amount obtained using the pixel signal having a higher dynamic range than the pixel signal for display.
- The control device according to claim 5, further comprising a region specifying unit that specifies a predetermined region of the pixel group as a target region for determining the exposure amount, wherein the control unit sets the exposure amount obtained using the high-dynamic-range pixel signal of the region specified by the region specifying unit.
- The control device according to claim 7, wherein the region specifying unit specifies, as the predetermined region serving as the target region for determining the exposure amount, a region of a user's face in the captured image.
- The control device according to claim 7, wherein the region specifying unit specifies, as the predetermined region serving as the target region for determining the exposure amount, a region designated by the user in the captured image.
- The control device according to claim 7, wherein the region specifying unit specifies, as the predetermined region serving as the target region for determining the exposure amount, a region in which a predetermined object exists in the captured image.
- The control device according to claim 1, wherein the control unit sets two types of exposure amount in the pixel group in the first mode, and sets one type of exposure amount in the pixel group in the second mode.
- The control device according to claim 11, further comprising a synthesizing unit that synthesizes a pixel signal magnification-corrected by multiplying the pixel signal of a pixel with a small exposure amount by the exposure ratio with the pixel signal of a pixel with a large exposure amount, wherein, in the first mode, the synthesizing unit synthesizes the magnification-corrected pixel signal with the pixel signal of the pixel with the large exposure amount and supplies the result to the control unit, and in the second mode supplies a pixel signal of one type of exposure amount to the control unit.
- The control device according to claim 11, further comprising a synthesizing unit that synthesizes a pixel signal magnification-corrected by multiplying the pixel signal of a pixel with a small exposure amount by the exposure ratio with the pixel signal of a pixel with a large exposure amount, wherein, in the first mode, the synthesizing unit outputs, as a signal for display, a signal obtained by clipping the signal synthesized from the magnification-corrected pixel signal and the pixel signal of the pixel with the large exposure amount at the display gradation of the second mode.
- The control device according to claim 12, wherein, in the first mode, the control unit also performs autofocus control based on the signal synthesized by the synthesizing unit.
- The control device according to claim 12, wherein, in the first mode, the control unit also generates color control information for white balance based on the signal synthesized by the synthesizing unit.
- The control device according to claim 1, wherein the pixel group includes phase difference pixels, and the control unit also performs autofocus control using pixel signals of the phase difference pixels.
- A control method in which a control device that controls an exposure amount of a pixel group in which a plurality of pixels are two-dimensionally arranged sets a plurality of types of exposure amount in the pixel group in a first mode before starting recording of a captured image, and sets fewer types of exposure amount than in the first mode in the pixel group in a second mode in which a captured image is recorded.
- An electronic device comprising: a solid-state imaging device including a pixel group in which a plurality of pixels are two-dimensionally arranged; and a control unit that sets a plurality of types of exposure amount in the pixel group in a first mode before starting recording of a captured image, and sets fewer types of exposure amount than in the first mode in the pixel group in a second mode in which a captured image is recorded.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201480039968.3A CN105379248B (zh) | 2013-08-22 | 2014-08-08 | 控制器件、控制方法及电子装置 |
KR1020167000163A KR102184916B1 (ko) | 2013-08-22 | 2014-08-08 | 제어 장치, 제어 방법 및 전자 기기 |
US14/907,431 US9998679B2 (en) | 2013-08-22 | 2014-08-08 | Control device, control method, and electronic device to control an exposure amount of a pixel group |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-171966 | 2013-08-22 | ||
JP2013171966A JP2015041890A (ja) | 2013-08-22 | 2013-08-22 | 制御装置、制御方法、および電子機器 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015025740A1 true WO2015025740A1 (ja) | 2015-02-26 |
Family
ID=52483518
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/071033 WO2015025740A1 (ja) | 2013-08-22 | 2014-08-08 | 制御装置、制御方法、および電子機器 |
Country Status (6)
Country | Link |
---|---|
US (1) | US9998679B2 (ja) |
JP (1) | JP2015041890A (ja) |
KR (1) | KR102184916B1 (ja) |
CN (1) | CN105379248B (ja) |
TW (1) | TWI661727B (ja) |
WO (1) | WO2015025740A1 (ja) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150024098A (ko) * | 2013-08-26 | 2015-03-06 | 삼성전자주식회사 | 디지털 카메라에서 사진 합성 방법 및 장치 |
JP6146293B2 (ja) * | 2013-12-25 | 2017-06-14 | ソニー株式会社 | 制御装置、制御方法および制御システム |
CN104243827A (zh) * | 2014-09-23 | 2014-12-24 | 深圳市中兴移动通信有限公司 | 拍摄方法和装置 |
US9467632B1 (en) | 2015-07-13 | 2016-10-11 | Himax Imaging Limited | Dual exposure control circuit and associated method |
US9848137B2 (en) * | 2015-11-24 | 2017-12-19 | Samsung Electronics Co., Ltd. | CMOS image sensors having grid array exposure control |
EP3424403B1 (en) | 2016-03-03 | 2024-04-24 | Sony Group Corporation | Medical image processing device, system, method, and program |
CN108702467B (zh) | 2016-03-24 | 2020-12-08 | 富士胶片株式会社 | 固体电子摄像装置的控制装置及其控制方法 |
JP6675917B2 (ja) * | 2016-04-19 | 2020-04-08 | オリンパス株式会社 | 撮像装置及び撮像方法 |
US10097766B2 (en) * | 2016-08-31 | 2018-10-09 | Microsoft Technology Licensing, Llc | Provision of exposure times for a multi-exposure image |
GB2561163B (en) * | 2017-03-30 | 2021-05-12 | Apical Ltd | Control systems and image sensors |
WO2019035245A1 (ja) * | 2017-08-18 | 2019-02-21 | 富士フイルム株式会社 | 撮像装置、撮像装置の制御方法、及び撮像装置の制御プログラム |
JP2020043522A (ja) * | 2018-09-12 | 2020-03-19 | キヤノン株式会社 | 撮像装置及びその制御方法、プログラム、記憶媒体 |
JP2020198470A (ja) * | 2019-05-30 | 2020-12-10 | ソニーセミコンダクタソリューションズ株式会社 | 画像認識装置および画像認識方法 |
JP6628925B2 (ja) * | 2019-07-01 | 2020-01-15 | キヤノン株式会社 | 画像表示装置及びその制御方法 |
WO2021034311A1 (en) * | 2019-08-19 | 2021-02-25 | Google Llc | Dual exposure control in a camera system |
JP7457473B2 (ja) * | 2019-09-12 | 2024-03-28 | キヤノン株式会社 | 撮像装置及びその制御方法 |
KR20230032359A (ko) * | 2021-08-30 | 2023-03-07 | 에스케이하이닉스 주식회사 | 전자 장치 및 그 동작 방법 |
US12096129B2 (en) * | 2022-02-24 | 2024-09-17 | Qualcomm Incorporated | Multi-frame auto exposure control (AEC) |
CN117241134B (zh) * | 2023-11-15 | 2024-03-08 | 杭州海康威视数字技术股份有限公司 | 用于摄像机的拍摄模式切换方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002135648A (ja) * | 2000-10-26 | 2002-05-10 | Olympus Optical Co Ltd | 撮像装置 |
JP2002232777A (ja) * | 2001-02-06 | 2002-08-16 | Olympus Optical Co Ltd | 撮像システム |
JP2007158544A (ja) * | 2005-12-01 | 2007-06-21 | Olympus Corp | カメラシステム |
JP2011059337A (ja) * | 2009-09-09 | 2011-03-24 | Fujifilm Corp | 撮像装置 |
JP2012205244A (ja) * | 2011-03-28 | 2012-10-22 | Canon Inc | 画像処理装置、及びその制御方法 |
JP2013009105A (ja) * | 2011-06-23 | 2013-01-10 | Jvc Kenwood Corp | 画像処理装置及び画像処理方法 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009053296A (ja) * | 2007-08-24 | 2009-03-12 | Olympus Imaging Corp | 撮影装置および撮影装置の制御方法 |
JP4661912B2 (ja) | 2008-07-18 | 2011-03-30 | ソニー株式会社 | 固体撮像素子およびカメラシステム |
TW201108728A (en) * | 2009-08-25 | 2011-03-01 | Hon Hai Prec Ind Co Ltd | Method and system for exposing photograph by separated regions in camera devices |
US8970719B2 (en) * | 2011-06-23 | 2015-03-03 | JVC Kenwood Corporation | Image processing apparatus and image processing method |
JP6019692B2 (ja) * | 2012-04-16 | 2016-11-02 | ソニー株式会社 | 撮像素子、撮像素子の制御方法、および、撮像装置 |
- 2013
  - 2013-08-22 JP JP2013171966A patent/JP2015041890A/ja active Pending
- 2014
  - 2014-07-24 TW TW103125392A patent/TWI661727B/zh active
  - 2014-08-08 WO PCT/JP2014/071033 patent/WO2015025740A1/ja active Application Filing
  - 2014-08-08 US US14/907,431 patent/US9998679B2/en active Active
  - 2014-08-08 KR KR1020167000163A patent/KR102184916B1/ko active IP Right Grant
  - 2014-08-08 CN CN201480039968.3A patent/CN105379248B/zh active Active
Also Published As
Publication number | Publication date |
---|---|
KR20160045051A (ko) | 2016-04-26 |
TW201515466A (zh) | 2015-04-16 |
TWI661727B (zh) | 2019-06-01 |
KR102184916B1 (ko) | 2020-12-01 |
CN105379248B (zh) | 2019-08-09 |
US9998679B2 (en) | 2018-06-12 |
CN105379248A (zh) | 2016-03-02 |
US20160173751A1 (en) | 2016-06-16 |
JP2015041890A (ja) | 2015-03-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015025740A1 (ja) | 制御装置、制御方法、および電子機器 | |
JP5701664B2 (ja) | 撮像装置 | |
US8749653B2 (en) | Apparatus and method of blurring background of image in digital image processing device | |
US20170171446A1 (en) | Image capturing apparatus, control method therefor, program, and recording medium | |
US8947575B2 (en) | Image pickup apparatus having warning region detection related to dynamic range expansion | |
JP5123137B2 (ja) | 撮像装置および撮像方法 | |
US8885078B2 (en) | Image processing apparatus, image processing method, and recording medium storing image processing program | |
JP2010147786A (ja) | 撮像装置及び画像処理方法 | |
JP5223686B2 (ja) | 撮像装置および撮像方法 | |
KR20220064170A (ko) | 이미지 센서를 포함하는 전자 장치 및 그 동작 방법 | |
US8743239B2 (en) | Image processing apparatus, control method thereof, and image-capturing apparatus | |
JP2011239157A (ja) | 画像処理装置およびその制御方法、撮像装置 | |
JP5277863B2 (ja) | 撮像装置および撮像方法 | |
US11196938B2 (en) | Image processing apparatus and control method for same | |
JP2013192121A (ja) | 撮像装置及び撮像方法 | |
JP5310331B2 (ja) | 撮像装置および撮像方法 | |
JP7397640B2 (ja) | 画像処理装置および画像処理方法 | |
US20100259606A1 (en) | Imaging device and method for controlling imaging device | |
JP6300514B2 (ja) | 撮像装置、および撮像装置の制御方法 | |
JP2005167465A (ja) | デジタルカメラ及びデジタルカメラの撮像方法 | |
JP5091734B2 (ja) | 撮像装置および撮像方法 | |
JP2012227744A (ja) | 撮像装置 | |
JP5145876B2 (ja) | 撮像装置および撮像方法 | |
JP2009089358A (ja) | 撮像装置および撮像方法 | |
JP2011191439A (ja) | 撮像装置、補助光の発光制御方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14838730 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 20167000163 Country of ref document: KR Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14907431 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 14838730 Country of ref document: EP Kind code of ref document: A1 |