WO2012124181A1 - Imaging device and imaging program - Google Patents
Imaging device and imaging program
- Publication number: WO2012124181A1 (PCT/JP2011/067544)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel data
- line
- color
- pixel
- pixels
Classifications
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/1462—Coatings
- H01L27/14621—Colour filter arrangements
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4015—Demosaicing, e.g. colour filter array [CFA], Bayer pattern
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/12—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2209/00—Details of colour television systems
- H04N2209/04—Picture signal generators
- H04N2209/041—Picture signal generators using solid-state devices
- H04N2209/042—Picture signal generators using solid-state devices having a single pick-up sensor
- H04N2209/045—Picture signal generators using solid-state devices having a single pick-up sensor using mosaic colour filter
- H04N2209/046—Colour interpolation to calculate the missing colour values
Definitions
- the present invention relates to an imaging apparatus and an imaging program, and more particularly to an imaging apparatus and an imaging program using a single-plate color imaging element.
- the output image of a single-plate color imaging device is a RAW image (mosaic image), and a multi-channel image is obtained by a process of interpolating the missing color pixels from surrounding pixels (synchronization processing).
- the reproduction characteristics of high-frequency image signals can be a problem: single-plate color image sensors tend to cause aliasing in captured images, so it is important to widen the reproduction band and improve resolution while suppressing the occurrence of moiré (false color).
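The synchronization (interpolation) step referred to above can be sketched in a few lines. This is a minimal illustration using simple neighbor averaging for one color plane; the function name and the averaging scheme are ours, not the patent's, and real pipelines use direction-aware interpolation.

```python
import numpy as np

def interpolate_plane(mosaic, mask):
    """Fill missing samples of one color plane by averaging the
    up/down/left/right neighbors where that color was sampled.
    mask[y, x] is True where the plane has a real sample."""
    out = mosaic.astype(float)  # copy, promoted to float
    h, w = mosaic.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                continue  # real sample, keep it
            vals = [mosaic[ny, nx]
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= ny < h and 0 <= nx < w and mask[ny, nx]]
            if vals:
                out[y, x] = sum(vals) / len(vals)
    return out
```

Running this once per color plane of the mosaic yields the multi-channel image mentioned above.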
- Patent Document 1 discloses an imaging apparatus that performs thinned-out output with little generation of moiré and the like.
- Patent Document 2 discloses an imaging device that suppresses the generation of moiré and the decrease in color S/N ratio, and improves resolution, even when sensitivity is raised by pixel mixing.
- in the primary color Bayer array (see, for example, Patent Documents 3 to 5), the most widely used color array in single-plate color image sensors, green (G), the color to which the human eye is most sensitive and which contributes most to obtaining a luminance signal, is arranged in a checkered pattern, and red (R) and blue (B) are arranged line-sequentially. Consequently, the reproduction accuracy of high-frequency G signals in the oblique directions, and of high-frequency R and B signals in the horizontal and vertical directions, is a problem.
- image processing such as synchronization processing must be adapted to any new color filter arrangement, but changing the image processing for each new color filter arrangement is very complicated, and there is a problem that the design man-hours become enormous.
- the present invention has been made to solve the above problems, and an object thereof is to provide an imaging apparatus and an imaging program that can use an image processing unit designed for the Bayer array, without modification, even when an image sensor having a color filter array other than the Bayer array is used.
- an imaging apparatus according to a first aspect of the invention includes: an image sensor including a plurality of photoelectric conversion elements arranged in the horizontal and vertical directions; a color filter in which a 6 × 6 pixel basic array pattern is repeatedly arranged in the horizontal and vertical directions, the basic array pattern consisting of a first arrangement pattern, in which a first filter corresponding to the first color that contributes most to obtaining a luminance signal is disposed at the four corners and the center of a 3 × 3 pixel square array, a second filter corresponding to a second color different from the first color is disposed on the horizontal center line of the square array, and a third filter corresponding to a third color different from the first and second colors is disposed on the vertical center line, and a second arrangement pattern, which is the same as the first arrangement pattern except that the positions of the second filter and the third filter are exchanged, the two patterns being arranged point-symmetrically; driving means for driving the image sensor so as to read out only the pixel data of pixels on lines at predetermined positions; pixel conversion means for converting the pixel data of each line read out by thinning from the image sensor into Bayer-array pixel data, that is, a pattern in which the two pixels on one diagonal of a 2 × 2 pixel square array are pixels of the first color and the two pixels on the other diagonal are a pixel of the second color and a pixel of the third color; and generating means for generating pixel data of each color for every pixel by interpolating, based on the Bayer-array pixel data, the pixel data of colors other than each pixel's own color from the pixel data of surrounding pixels.
- since a color filter in which a 6 × 6 pixel basic array pattern, which is not a Bayer array pattern, is repeatedly arranged is provided, and pixel conversion means is provided for converting the pixel data of each line read out by thinning from the image sensor into Bayer-array pixel data, an image processing unit designed for the Bayer array can be used without modification.
- the pixel conversion means may convert the pixel data of each line read out by thinning from the image sensor into the Bayer-array pixel data by mixing it with the pixel data of same-color pixels adjacent in the horizontal direction.
- the image sensor may be driven so as to read out only the pixel data of pixels on the first and third lines; the pixel conversion means applies pixel data obtained by mixing adjacent first-color pixel data on these lines to the first-color pixels at the corresponding positions of the Bayer array pattern, applies the third-color pixel data on the first line to the third-color pixels at the corresponding positions, and applies the second-color pixel data on the third line to the second-color pixels at the corresponding positions, thereby converting the pixel data of each thinned-out line into the Bayer-array pixel data.
- the image sensor may be driven so as to read out only the pixel data of pixels on the second and fifth lines; the pixel conversion means applies the first-color pixel data on these lines to the first-color pixels at the corresponding positions of the Bayer array pattern, applies pixel data obtained by mixing adjacent second-color pixel data on the second line to the second-color pixels at the corresponding positions, and applies pixel data obtained by mixing adjacent third-color pixel data on the fifth line to the third-color pixels at the corresponding positions, thereby converting the pixel data of each thinned-out line into the Bayer-array pixel data.
- the image sensor may be driven so as to read out only the pixel data of pixels on the first and fourth lines; the pixel conversion means applies pixel data obtained by mixing adjacent first-color pixel data on these lines to the first-color pixels at the corresponding positions of the Bayer array pattern, applies the second-color pixel data on the first line to the second-color pixels at the corresponding positions, and applies pixel data obtained by mixing adjacent third-color pixel data on the fourth line to the third-color pixels at the corresponding positions, thereby converting the pixel data of each thinned-out line into the Bayer-array pixel data.
- the image sensor may be driven so as to read out only the pixel data of pixels on the first and seventh lines; the pixel conversion means applies pixel data obtained by mixing adjacent first-color pixel data on these lines to the first-color pixels at the corresponding positions of the Bayer array pattern, applies the second-color pixel data on the first line in the vertical direction to the second-color pixels at the corresponding positions, and applies the third-color pixel data on the seventh line in the vertical direction to the third-color pixels at the corresponding positions, thereby converting the pixel data of each thinned-out line into the Bayer-array pixel data.
- the image sensor may be driven so as to read out only the pixel data of pixels on the second and eighth lines; the pixel conversion means applies the first-color pixel data on the second line to the first-color pixels at the corresponding positions of the Bayer array pattern, applies pixel data obtained by mixing vertically adjacent second-color pixel data on the second line to the second-color pixels at the corresponding positions, and applies pixel data obtained by mixing vertically adjacent third-color pixel data on the eighth line to the third-color pixels at the corresponding positions, thereby converting the pixel data of each thinned-out line into the Bayer-array pixel data.
- the image sensor may be driven so as to read out only the pixel data of pixels on the second and eleventh lines; the pixel conversion means applies the first-color pixel data on these lines to the first-color pixels at the corresponding positions of the Bayer array pattern, applies pixel data obtained by mixing vertically adjacent third-color pixel data on the second line to the third-color pixels at the corresponding positions, and applies the second-color pixel data on the eleventh line in the vertical direction to the second-color pixels at the corresponding positions, thereby converting the pixel data of each thinned-out line into the Bayer-array pixel data.
- the pixel conversion means may generate the first-, second-, and third-color pixel data at each pixel position by weighted addition, according to the pixel position, of the pixel data of same-color pixels adjacent in the horizontal direction in the pixel data of each thinned-out line, and generate the Bayer-array pixel data from the generated pixel data of each color.
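As an illustration of this weighted-addition variant, the sketch below estimates a color value at a target pixel position from same-color samples on a read-out line, weighting each sample by the inverse of its distance. The weighting function is an assumption; the patent only specifies weighting according to the pixel position.

```python
def weighted_color_at(samples, sample_xs, x):
    """Weighted addition of same-color samples read out on a line.
    samples[i] is the pixel value at horizontal position sample_xs[i];
    the weight 1 / (distance + 1) is illustrative, not the patent's."""
    weights = [1.0 / (abs(sx - x) + 1.0) for sx in sample_xs]
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, samples)) / total
```

A sample midway between two equal-distance samples gets their plain average; a sample position closer to one neighbor is pulled toward it.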
- the image sensor may be driven so as to read out only the pixel data of pixels on the first and third lines; the pixel conversion means generates the first-, second-, and third-color pixel data at each pixel position by weighted addition, according to the pixel position, of the pixel data of adjacent same-color pixels on those lines, and generates the Bayer-array pixel data from the generated pixel data of each color.
- the image sensor may be driven so as to read out only the pixel data of pixels on the second and fifth lines; the pixel conversion means generates the first-, second-, and third-color pixel data at each pixel position by weighted addition, according to the pixel position, of the pixel data of adjacent same-color pixels on those lines, and generates the Bayer-array pixel data from the generated pixel data of each color.
- the image sensor may be driven so as to read out only the pixel data of pixels on the first and fourth lines; the pixel conversion means generates the first-, second-, and third-color pixel data at each pixel position by weighted addition, according to the pixel position, of the pixel data of adjacent same-color pixels on those lines, and generates the Bayer-array pixel data from the generated pixel data of each color.
- the image sensor may be driven so as to perform thinned-out readout; the pixel conversion means generates the first-, second-, and third-color pixel data by weighted addition, according to the pixel position, of the pixel data of adjacent same-color pixels on the first line, and generates the Bayer-array pixel data from the generated pixel data of each color.
- the image sensor may be driven so as to perform thinned-out readout; the pixel conversion means generates the first-, second-, and third-color pixel data by weighted addition, according to the pixel position, of the pixel data of adjacent same-color pixels on the second line, and generates the Bayer-array pixel data from the generated pixel data of each color.
- the pixel conversion means may generate two lines of pixel data corresponding to the Bayer array pattern from the pixel data of each line read out by thinning from the image sensor, and convert the generated two lines of pixel data into the Bayer-array pixel data.
- the image sensor may be driven so as to read out only the pixel data of pixels on the first and third lines; the pixel conversion means generates two sets of Bayer array lines from the first line, applying pixel data to the first-color pixels at the corresponding positions of the first Bayer array line and of the (2n+2)-th (n = 0, 1, 2, ...) second Bayer array line, and likewise uses the pixel data on the third line, thereby converting the pixel data of each thinned-out line into the Bayer-array pixel data.
- the image sensor may be driven so as to read out only the pixel data of pixels on the second and eighth lines; the pixel conversion means generates two sets of Bayer array lines from these lines, applying pixel data obtained by pixel mixing of adjacent second-color pixel data on the second line to the second-color pixels at the corresponding positions of the Bayer array lines, and pixel data obtained by pixel mixing of adjacent third-color pixel data on the eighth line to the third-color pixels at the corresponding positions of the Bayer array pattern.
- the first color may be green (G), the second color one of red (R) and blue (B), and the third color the other of red (R) and blue (B).
- An imaging program according to a twenty-first aspect of the invention is an imaging program for causing a computer to function as the pixel conversion means of the imaging apparatus according to any one of the first to twentieth aspects.
- the image processing unit corresponding to the Bayer array can be used without being changed.
- FIG. 1 shows a schematic block diagram of the imaging apparatus 10 according to the present embodiment.
- the imaging device 10 includes an optical system 12, an imaging element 14, an imaging processing unit 16, a pixel conversion processing unit 18, an image processing unit 20, a driving unit 22, and a control unit 24.
- the optical system 12 includes, for example, a lens group including a plurality of optical lenses, an aperture adjustment mechanism, a zoom mechanism, an automatic focus adjustment mechanism, and the like.
- the image sensor 14 is a so-called single-plate imaging device: a color filter is disposed on an image sensor, such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor, that includes a plurality of photoelectric conversion elements arranged in the horizontal and vertical directions.
- FIG. 2 shows a part of the color filter 30 according to this embodiment.
- the number of pixels is 4896 × 3264 as an example and the aspect ratio is 3:2, but the number of pixels and the aspect ratio are not limited thereto.
- the color filter 30 is a color filter in which a 6 × 6 pixel basic array pattern C is repeatedly arranged. The basic array pattern C consists of a first arrangement pattern A, in which first filters G (hereinafter G filters) corresponding to G (green), the color that contributes most to obtaining a luminance signal, are placed at the four corners and the center of a 3 × 3 pixel square array, filters corresponding to B (blue) (hereinafter B filters) are placed on the horizontal center line, and filters corresponding to R (red) (hereinafter R filters) are placed on the vertical center line, and a second arrangement pattern B, in which the G filter arrangement is the same and the positions of the R filters and B filters are exchanged, the two patterns being arranged point-symmetrically.
- the color filter 30 has the following features (1), (2), (3), (4), and (5).
- the color filter 30 shown in FIG. 2 includes a basic array pattern C composed of a square array pattern corresponding to 6 × 6 pixels, and the basic array pattern C is repeatedly arranged in the horizontal and vertical directions. That is, in this color filter array, the R, G, and B color filters (R filter, G filter, and B filter) are arrayed with a predetermined periodicity.
- since the R filter, G filter, and B filter are arranged with this predetermined periodicity, the synchronization (interpolation) processing of the R, G, and B signals read from the color image sensor can be performed according to the repeating pattern. In addition, when an image is reduced by thinning processing in units of the basic array pattern, the color filter array of the reduced image can be the same as the color filter array before thinning, so a common processing circuit can be used.
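The periodicity can be made concrete by constructing the basic array pattern from the two 3 × 3 patterns and tiling it. The NumPy sketch below assumes the orientation implied by the line listing given later in this document (first line GRGGBG); the exact orientation of FIG. 2 is otherwise an assumption.

```python
import numpy as np

# 3x3 first arrangement pattern A: G filters at the four corners and
# the center, B filters on the horizontal center line, R filters on
# the vertical center line (orientation assumed from the line listing)
A = np.array([list("GRG"), list("BGB"), list("GRG")])
# second arrangement pattern B: same as A with R and B exchanged
B = np.where(A == "R", "B", np.where(A == "B", "R", A))
# 6x6 basic array pattern C: A and B arranged point-symmetrically
C = np.block([[A, B], [B, A]])

def color_filter(h, w):
    """Repeat the basic array pattern C to cover an h x w sensor."""
    reps = (h // 6 + 1, w // 6 + 1)
    return np.tile(C, reps)[:h, :w]
```

Because the full array is just `C` tiled, any processing written for one 6 × 6 period applies everywhere, which is the point of the periodicity feature.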
- the G filter, corresponding to the color that contributes most to obtaining the luminance signal (the G color in this embodiment), is arranged in every horizontal, vertical, and diagonal line of the color filter array. Because G filters corresponding to luminance pixels lie in every horizontal, vertical, and diagonal line, the reproduction accuracy of the synchronization processing in the high-frequency range can be improved regardless of the direction of the high-frequency component.
- in the color filter 30 shown in FIG. 2, the R filter and B filter, corresponding to the two or more colors other than G (the R and B colors in this embodiment), are arranged in every horizontal and vertical line of the color filter array. Since the R and B filters are arranged in every horizontal and vertical line, the occurrence of color moiré (false color) can be suppressed. This makes it possible to omit the optical low-pass filter for suppressing false colors from the optical path between the incident surface of the optical system and the imaging surface; even when an optical low-pass filter is used, one with a weak high-frequency cut for preventing false colors can be applied, so resolution is not impaired.
- the basic array pattern C can be regarded as a 3 × 3 pixel first array pattern A, surrounded by a broken-line frame, and a 3 × 3 pixel second array pattern B, surrounded by a dash-dotted-line frame, arranged alternately in the horizontal and vertical directions.
- in both array patterns, the G filters, which are luminance pixels, are placed at the four corners and the center, that is, on both diagonals. In the first array pattern A, the B filters are arranged in the horizontal direction and the R filters in the vertical direction across the central G filter, while in the second array pattern B, the R filters are arranged in the horizontal direction and the B filters in the vertical direction. That is, the positional relationship between the R filter and the B filter is reversed between the first array pattern A and the second array pattern B, but the other arrangements are the same.
- because the first array pattern A and the second array pattern B alternate in the horizontal and vertical directions as shown in FIG. 2, their corner G filters form square arrays corresponding to 2 × 2 pixels. In other words, the color filter 30 shown in FIG. 2 includes square arrays corresponding to 2 × 2 pixels consisting of G filters.
- by taking out such 2 × 2 pixels consisting of G filters and computing the absolute differences of the G pixel values in the horizontal direction, the vertical direction, and the diagonal directions (upper-right and upper-left), it can be determined that there is correlation in whichever of these directions has the smallest absolute difference. That is, with this color filter array, the direction of highest correlation among the horizontal, vertical, and diagonal directions can be determined using the information of G pixels at the minimum pixel interval, and this direction discrimination result can be used for the process of interpolating from surrounding pixels (the synchronization process).
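A sketch of this direction discrimination, assuming the four G values of a 2 × 2 block are given as g00 (upper left), g01 (upper right), g10 (lower left), g11 (lower right). Averaging the two differences in the horizontal and vertical cases is our choice of normalization, not specified by the text.

```python
def correlation_direction(g00, g01, g10, g11):
    """Direction discrimination from a 2x2 block of G pixels: the
    direction with the smallest absolute difference between G pixel
    values is taken to have the highest correlation."""
    diffs = {
        "horizontal": (abs(g00 - g01) + abs(g10 - g11)) / 2.0,
        "vertical": (abs(g00 - g10) + abs(g01 - g11)) / 2.0,
        "upper-right diagonal": abs(g10 - g01),
        "upper-left diagonal": abs(g00 - g11),
    }
    return min(diffs, key=diffs.get)
```

The returned direction can then steer the interpolation toward pixels along the correlated direction.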
- the basic arrangement pattern C of the color filter 30 shown in FIG. 2 is point-symmetric with respect to the center of the basic arrangement pattern C (the centers of the four G filters). Further, as shown in FIG. 2, the first array pattern A and the second array pattern B in the basic array pattern C are also point-symmetric with respect to the central G filter.
- in the basic array pattern C, the color filter array of the first and third of the six horizontal lines is GRGGBG, that of the second line is BGBRGR, that of the fourth and sixth lines is GBGGRG, and that of the fifth line is RGRBGB.
- in the following, this point-symmetric 6 × 6 pattern is referred to as the basic array pattern C for convenience.
- FIG. 24 is a view showing a modification of the color filter according to the present embodiment.
- the color filter 30A shown in the figure includes a basic array pattern C composed of a square array pattern corresponding to 4 × 4 pixels, and the basic array pattern C is repeatedly arranged in the horizontal and vertical directions.
- the G filter is arranged in each horizontal, vertical, and diagonal line of the color filter array, and the R filter and the B filter are color filters. It is arranged in each horizontal and vertical line of the filter array.
- the basic array pattern C is point-symmetric with respect to the center of the basic array pattern C.
- the color filter 30A does not include a square array corresponding to 2 × 2 pixels consisting of G filters, but it has G filters adjacent in the horizontal direction and G filters adjacent in the diagonal directions (upper-right and upper-left); the pixel values of the G pixels corresponding to these G filters can be used when determining the correlation in those directions.
- the color filter 30A has the same characteristics as the characteristics (1), (2), (3), and (5) of the color filter 30 shown in FIG.
- FIG. 3 shows a part of the color filter 40 in the Bayer array.
- the color filter shown in the same figure also has 4896 × 3264 pixels as an example, with an aspect ratio of 3:2.
- in the Bayer-array color filter 40, G filters are arranged on the two pixels on one diagonal of each 2 × 2 pixel square array, and an R filter and a B filter on the two pixels on the other diagonal.
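A Bayer array like that of FIG. 3 can be described by a simple function of the pixel coordinates. The phase chosen below (which diagonal carries G, and where R sits relative to B) is an assumption; FIG. 3 may start at a different offset.

```python
def bayer_color(y, x):
    """Color of the Bayer-array filter at pixel (y, x): G on one
    diagonal of every 2x2 square, R and B on the other diagonal
    (phase assumed: G at the origin, R on even rows)."""
    if (y + x) % 2 == 0:
        return "G"
    return "R" if y % 2 == 0 else "B"
```

This is the target layout into which the pixel conversion processing unit maps the thinned-out data from the color filter 30.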
- the imaging processing unit 16 performs predetermined processing such as amplification, correlated double sampling, and A/D conversion on the imaging signal output from the image sensor 14, and outputs the result to the pixel conversion processing unit 18 as pixel data.
- This pixel data is pixel data corresponding to the arrangement of the color filters 30 shown in FIG.
- the pixel conversion processing unit 18 converts the pixel data corresponding to the array of the color filter 30 shown in FIG. 2, output from the imaging processing unit 16, into pixel data corresponding to the Bayer array shown in FIG. 3.
- the image processing unit 20 performs so-called synchronization processing on the pixel data corresponding to the Bayer array output from the pixel conversion processing unit 18. That is, for all pixels, pixel data of colors other than the corresponding color is interpolated from the pixel data of surrounding pixels to generate R, G, and B pixel data of all pixels. Then, so-called YC conversion processing is performed on the generated R, G, and B pixel data to generate luminance data Y and color difference data Cr and Cb. Then, a resizing process for resizing these signals to a size corresponding to the shooting mode is performed.
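The YC conversion mentioned here can be sketched with the ITU-R BT.601 coefficients commonly used in camera pipelines; the patent text does not give the coefficients, so these are an assumption.

```python
def yc_convert(r, g, b):
    """RGB -> luminance Y and color differences Cb, Cr using the
    ITU-R BT.601 analog-form coefficients (assumed; the text only
    names the conversion, not the coefficients)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```

Gray inputs (r = g = b) yield zero color difference, which is a quick sanity check on the coefficients.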
- the driving unit 22 performs reading driving of the imaging signal from the imaging device 14 in accordance with an instruction from the control unit 24.
- the control unit 24 controls the drive unit 22, the pixel conversion processing unit 18, the image processing unit 20, and the like according to the shooting mode. Although details will be described later, the control unit 24 instructs the drive unit 22 to read out the imaging signal by a readout method according to the shooting mode, instructs the pixel conversion processing unit 18 to perform pixel conversion processing according to the shooting mode, and instructs the image processing unit 20 to perform image processing according to the shooting mode.
- the control unit 24 instructs the drive unit 22 to read out the imaging signal using a thinning method corresponding to the instructed shooting mode.
- The shooting modes include an HD moving image mode, in which the captured image is thinned out to generate relatively high-resolution HD (high definition) moving image data that is recorded on a recording medium such as a memory card (not shown), and a through moving image mode, in which the captured image is thinned out and a relatively low-resolution through image is output to a display unit (not shown).
- FIG. 4 schematically shows a flow of processing when shooting is performed in the HD moving image mode in the case where the imaging device 14 having the conventional configuration using the color filter 40 with the Bayer arrangement is used.
- For the R pixels, the three R pixels in the frame Ra (spanning 5 pixels) on the first line shown in the upper left of FIG. 4 are pixel-mixed (for example, by averaging the pixel values of the three R pixels) to form the R pixel R1 on the first line of the reduced image shown in the upper right of the figure.
- For the G pixels, the three G pixels in the frame Ga (spanning 5 pixels) on the first line shown in the upper left of FIG. 4 are pixel-mixed to form the G pixel G2 on the first line of the reduced image shown in the upper right of the figure.
- For the B pixels, the three B pixels in the frame Ba (spanning 5 pixels) on the second line shown in the upper left of FIG. 4 are pixel-mixed to form the B pixel B1 on the second line of the reduced image shown in the upper right of the figure.
- pixel data of a reduced Bayer array as shown in the upper right of the figure is generated.
- the above-described synchronization processing and YC conversion processing (Bayer processing) are performed on the pixel data on the upper right side of the figure to generate luminance data Y and color difference data Cr and Cb as shown on the lower left side of the figure.
- a resizing process is performed to resize the luminance data Y and the color difference data Cr and Cb to a size corresponding to the shooting mode.
- For example, the image is resized to 1920 × 1080 pixels with an aspect ratio of 16:9.
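The pixel mixing used throughout these flows (several same-color pixels in a window collapsed into one output pixel) can be sketched as a simple average; averaging is only the example the text itself gives, and other mixing ratios are possible:

```python
def mix_pixels(values):
    """Pixel-mix same-color pixel values into one output pixel.
    The text gives averaging as one example of pixel mixing."""
    return sum(values) / len(values)

# e.g. the three R pixels inside frame Ra collapse to one value R1
r1 = mix_pixels([100, 110, 120])
```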
- FIG. 5 schematically shows a flow of processing when shooting is performed in the through moving image mode (live view mode) in the case where the imaging element 14 having the conventional configuration using the color filter 40 with the Bayer arrangement is used.
- The Bayer processing shown in the lower left of FIG. 4 and the lower left of FIG. 5 is synchronization processing and YC conversion processing corresponding to the Bayer arrangement; therefore, if the color filter 30, whose arrangement differs from the Bayer arrangement, were used as-is, the Bayer processing would need to be changed.
- In this embodiment, however, the pixel conversion processing unit 18 is provided, and it converts the pixel data corresponding to the array of the color filters 30 output from the imaging processing unit 16 into pixel data corresponding to the Bayer array before outputting it to the image processing unit 20, so the image processing unit 20 does not need to be changed.
- the control unit 24 instructs the drive unit 22 to read out the imaging signal by thinning out by a thinning method corresponding to the shooting mode.
- the control unit 24 instructs the pixel conversion processing unit 18 to execute the pixel conversion processing corresponding to the shooting mode, and instructs the image processing unit 20 to execute the Bayer processing (synchronization processing and YC conversion processing) and the resizing processing corresponding to the shooting mode.
- In step 100, pixel data is input from the imaging processing unit 16.
- In step 102, pixel conversion processing is executed to convert the input pixel data into pixel data corresponding to the Bayer array.
- In step 104, the converted pixel data is output to the image processing unit 20.
- the pixel conversion processing unit 18 can be configured by a computer including a CPU, ROM, RAM, nonvolatile ROM, and the like.
- the processing program for the above processing can be stored in advance in a nonvolatile ROM, for example, and can be read and executed by the CPU.
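The three steps (100, 102, 104) executed by the pixel conversion processing unit 18 can be sketched as one pass of a loop; the callables here are hypothetical stand-ins for the hardware interfaces, not names from the patent:

```python
def pixel_conversion_step(read_input, convert_to_bayer, write_output):
    """One pass of the pixel conversion processing unit 18.

    read_input / convert_to_bayer / write_output are hypothetical
    callables standing in for the imaging processing unit 16, the
    conversion routine, and the image processing unit 20.
    """
    pixel_data = read_input()                   # step 100: input pixel data
    bayer_data = convert_to_bayer(pixel_data)   # step 102: convert to Bayer array
    write_output(bayer_data)                    # step 104: output to image processing unit 20
    return bayer_data
```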
- Next, a case will be described in which the pixel data of each line read out by thinning from the image sensor 14 is converted into pixel data of a Bayer array by mixing the pixel data of pixels of the same color adjacent in the horizontal direction.
- FIG. 7 schematically shows a processing flow when shooting in the HD video mode.
- the (6n + 1) th and (6n + 3) th lines are output from the image sensor 14. That is, among the six lines in the vertical direction of the basic array pattern C, the second, fourth to sixth lines are thinned out, and the pixel data for two lines of the first and third lines is read for each basic array pattern.
- the pixel conversion processing unit 18 mixes pixel data of pixels of the same color adjacent in the horizontal direction in the read-out pixel data, as shown in the upper right of FIG. 7, thereby generating pixel data whose number of pixels (1632 × 1088) is 1/3 that of the image sensor 14 in both the horizontal and vertical directions.
- For the G pixels, the two G pixels in the frame Ga on the first line shown in the upper left of FIG. 7 are pixel-mixed (for example, by averaging the pixel values of the two G pixels; the same applies hereinafter) to form the G pixel G1 on the first line of the reduced image shown in the upper right of the figure, and the two G pixels in the frame Gb on the third line are mixed to form the G pixel G2 on the first line of the reduced image shown in the upper right of the figure.
- For the B pixels, one B pixel in the frame Ba on the first line shown in the upper left of FIG. 7 is used as the B pixel at the corresponding position of the reduced image.
- That is, pixel data obtained by mixing the pixel data of adjacent G pixels on the (6n+1)-th and (6n+3)-th lines in the vertical direction is applied to the G pixels at the corresponding positions in the Bayer array pattern, the pixel data of the B pixels on the (6n+1)-th line is applied to the B pixels at the corresponding positions in the Bayer array pattern, and the pixel data of the R pixels on the (6n+3)-th line is applied to the R pixels at the corresponding positions in the Bayer array pattern, whereby the pixel data of each line read out by thinning from the image sensor 14 is converted into Bayer-array pixel data.
- the Bayer process (synchronization process and YC conversion process) and the resizing process are executed in the image processing unit 20 in the same manner as in FIG. 4, and the resized image is output.
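The line-pair conversion of this embodiment can be sketched as follows. The (color, value) encoding and the G/B-then-R/G row ordering are illustrative assumptions, since the actual in-line ordering is fixed by the filter layout of FIG. 2:

```python
def pair_to_bayer(line1, line3):
    """Map the (6n+1)-th and (6n+3)-th read-out lines to one pair of
    Bayer lines.  line1/line3 are lists of (color, value) pairs after
    horizontal same-color mixing; the encoding is illustrative."""
    g1 = [v for c, v in line1 if c == "G"]   # mixed G from line 6n+1
    g3 = [v for c, v in line3 if c == "G"]   # mixed G from line 6n+3
    b1 = [v for c, v in line1 if c == "B"]   # B stays on line 6n+1
    r3 = [v for c, v in line3 if c == "R"]   # R stays on line 6n+3
    row_gb = [x for pair in zip(g1, b1) for x in pair]  # G B G B ...
    row_rg = [x for pair in zip(r3, g3) for x in pair]  # R G R G ...
    return row_gb, row_rg
```

The point of the sketch is the routing rule: mixed G values feed the G positions of both Bayer rows, while B comes only from the (6n+1)-th line and R only from the (6n+3)-th line.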
- This embodiment is different from the first embodiment in how to thin out imaging signals, and the rest is the same as in the first embodiment.
- FIG. 8 schematically shows the flow of processing when shooting in the HD video mode.
- the pixel conversion processing unit 18 mixes pixel data of pixels of the same color adjacent in the horizontal direction in the read-out pixel data, as shown in the upper right of FIG. 8, thereby generating pixel data whose number of pixels (1632 × 1088) is 1/3 that of the image sensor 14 in both the horizontal and vertical directions.
- For the G pixels, one pixel in the frame Ga on the second line shown in the upper left of FIG. 8 is set as the G pixel G1 on the first line of the reduced image shown in the upper right of the figure, and one pixel in the frame Gb on the fifth line is set as the G pixel G2 on the first line of the reduced image shown in the upper right of the figure.
- For the R pixels, pixel data obtained by mixing the two R pixels in the frame Ra on the fifth line shown in the upper left of FIG. 8 is set as the R pixel on the first line of the reduced image shown in the upper right of the figure, and the B pixels are processed in the same manner.
- Thereafter, similar processing is performed to generate pixel data of a reduced Bayer array as shown in the upper right of the figure. That is, the pixel data of the G pixels on the (6n+2)-th and (6n+5)-th lines in the vertical direction are applied to the G pixels at the corresponding positions in the Bayer array pattern, pixel data obtained by mixing the pixel data of adjacent R pixels on the (6n+2)-th line is applied to the R pixels at the corresponding positions in the Bayer array pattern, and pixel data obtained by mixing the pixel data of adjacent B pixels on the (6n+5)-th line is applied to the B pixels at the corresponding positions in the Bayer array pattern.
- In this way, the pixel data of each line read out by thinning from the image sensor 14 is converted into Bayer-array pixel data.
- In this embodiment, the pixel data read out by thinning includes many R pixels and B pixels, so an image in which false colors are unlikely to occur can be obtained. Note that the processing after the pixel conversion processing is the same as in the above embodiment, and a description thereof is omitted.
- This embodiment is different from the first embodiment in how to thin out imaging signals, and the rest is the same as in the first embodiment.
- FIG. 9 schematically shows the flow of processing when shooting in the HD video mode.
- the (6n+1)-th and (6n+4)-th lines are output from the image sensor 14. That is, among the six lines in the vertical direction of the basic array pattern C, the second, third, fifth, and sixth lines are thinned out, and the pixel data for the two lines of the first and fourth lines is read out repeatedly for each basic array pattern.
- the pixel conversion processing unit 18 mixes pixel data of pixels of the same color adjacent in the horizontal direction in the read-out pixel data, as shown in the upper right of FIG. 9, thereby generating pixel data whose number of pixels (1632 × 1088) is 1/3 that of the image sensor 14 in both the horizontal and vertical directions.
- Pixel data obtained by mixing the two pixels in the frame Ga on the first line shown in the upper left of FIG. 9 is set as the G pixel G1 on the first line of the reduced image shown in the upper right of the figure, and pixel data obtained by mixing the two pixels in the frame Gb on the fourth line is set as the G pixel G2 on the second line of the reduced image shown in the upper right of FIG. 9.
- For the R pixels, one R pixel in the frame Ra on the first line shown in the upper left of FIG. 9 is set as the R pixel R1 on the first line of the reduced image shown in the upper right of the figure, and for the B pixels, pixel data obtained by mixing the two B pixels in the frame Ba on the fourth line shown in the upper left of FIG. 9 is set as the B pixel B1 on the second line shown in the upper right of the figure.
- pixel data of a reduced Bayer array as shown in the upper right of the figure is generated. That is, pixel data obtained by mixing pixel data of adjacent G pixels on the (6n + 1) -th line and the (6n + 4) -th line in the vertical direction is applied to G pixels at corresponding positions in the Bayer array pattern, respectively.
- the pixel data of the R pixel on the (6n + 1) th line is applied to the R pixel at the corresponding position in the Bayer array pattern, and the pixel data obtained by mixing the pixel data of the adjacent B pixel on the (6n + 4) th line is By applying the B pixel at the corresponding position in the Bayer array pattern, the pixel data of each line read out by being thinned out from the image sensor 14 is converted into the Bayer array pixel data.
- In this embodiment, the pixel data read out by thinning includes many G pixels, and the R and B pixels form a checkered pattern, so an image in which false colors are suppressed to some extent can be obtained.
- This embodiment is different from the first embodiment in that the shooting mode is the through video mode (live view mode) and the method of thinning out the image pickup signal is different, and the rest is the same as in the first embodiment.
- FIG. 10 schematically shows the flow of processing when shooting in the through video mode.
- illustration of the Bayer process and the resizing process is omitted (the same applies to the following embodiments).
- the pixel conversion processing unit 18 mixes pixel data of pixels of the same color adjacent in the horizontal direction in the thinned, read-out pixel data, thereby generating pixel data whose number of pixels (1632 × 544) is 1/3 that of the image sensor 14 in the horizontal direction and 1/6 in the vertical direction.
- Pixel data obtained by mixing the two G pixels in the frame Ga on the first line shown on the left of FIG. 10 is set as the G pixel G1 on the first line of the reduced image shown on the right of the figure.
- pixel data obtained by mixing two G pixels in the frame Gb of the seventh line is set as a G pixel G2 of the second line of the reduced image shown on the right side of the figure.
- the pixel data of one R pixel in the frame Ra of the first line shown on the left side of FIG. 10 is set as the R pixel R1 of the first line of the reduced image shown on the right side of FIG.
- For the B pixels, the pixel data of one B pixel in the frame Ba on the seventh line shown on the left of FIG. 10 is set as the B pixel B1 on the second line shown on the right of the figure. Thereafter, similar processing is performed to generate pixel data of a reduced Bayer array as shown on the right of the figure.
- That is, pixel data obtained by mixing the pixel data of adjacent G pixels on the (12n+1)-th and (12n+7)-th lines in the vertical direction is applied to the G pixels at the corresponding positions in the Bayer array pattern, the pixel data of the R pixels on the (12n+1)-th line is applied to the R pixels at the corresponding positions in the Bayer array pattern, and the pixel data of the B pixels on the (12n+7)-th line is applied to the B pixels at the corresponding positions in the Bayer array pattern, whereby the pixel data of each line read out by thinning from the image sensor 14 is converted into Bayer-array pixel data.
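The decimation factors of this through-movie mode can be checked numerically. The 4896 × 3264 full sensor size used below is an assumption, chosen to be consistent with the 1632 × 544 (and the HD mode's 1632 × 1088) figures quoted in the text:

```python
def thinned_size(width, height, h_factor=3, v_factor=6):
    """Output size in this through-movie mode: horizontal 1/3 by
    same-color pixel mixing, vertical 1/6 by reading only the
    (12n+1)-th and (12n+7)-th lines (2 of every 12 lines)."""
    return width // h_factor, height // v_factor

thinned_size(4896, 3264)  # -> (1632, 544)
```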
- This embodiment is different from the fourth embodiment in how to thin out imaging signals, and the rest is the same as in the fourth embodiment.
- FIG. 11 schematically shows the flow of processing when shooting in the through video mode.
- the pixel conversion processing unit 18 mixes pixel data of pixels of the same color adjacent in the horizontal direction in the thinned, read-out pixel data, thereby generating pixel data whose number of pixels (1632 × 544) is 1/3 that of the image sensor 14 in the horizontal direction and 1/6 in the vertical direction.
- the pixel data of one G pixel in the frame Ga of the second line shown on the left of FIG. 11 is set as the G pixel G1 of the first line of the reduced image shown on the right of the same figure.
- the pixel data of one G pixel in the frame Gb of the eighth line is set as the G pixel G2 of the first line of the reduced image shown on the right side of the figure.
- For the R pixels, pixel data obtained by mixing the two R pixels in the frame Ra on the second line shown on the left of FIG. 11 is set as the R pixel R1 on the first line of the reduced image shown on the right of the figure, and for the B pixels, pixel data obtained by mixing the two B pixels in the frame Ba on the eighth line shown on the left of FIG. 11 is set as the B pixel B1 on the second line shown on the right of the figure. Thereafter, similar processing is performed to generate pixel data of a reduced Bayer array as shown on the right of the figure. That is, the pixel data of the G pixels on the (12n+2)-th and (12n+8)-th lines in the vertical direction are applied to the G pixels at the corresponding positions in the Bayer array pattern, pixel data obtained by mixing adjacent R pixels on the (12n+2)-th line in the vertical direction is applied to the R pixels at the corresponding positions in the Bayer array pattern, and pixel data obtained by mixing adjacent B pixels on the (12n+8)-th line in the vertical direction is applied to the B pixels at the corresponding positions in the Bayer array pattern, whereby the pixel data of each line read out by thinning from the image sensor 14 is converted into Bayer-array pixel data.
- This embodiment is different from the fourth embodiment in how to thin out imaging signals, and the rest is the same as in the fourth embodiment.
- FIG. 12 schematically shows the flow of processing when shooting in the through video mode.
- the pixel conversion processing unit 18 mixes pixel data of pixels of the same color adjacent in the horizontal direction in the thinned, read-out pixel data, thereby generating pixel data whose number of pixels (1632 × 362) is 1/3 that of the image sensor 14 in the horizontal direction and 1/9 in the vertical direction.
- the pixel data of one G pixel in the frame Ga of the second line shown on the left of FIG. 12 is set as the G pixel G1 of the first line of the reduced image shown on the right of the same figure.
- the pixel data of one G pixel in the frame Gb on the eleventh line is taken as the G pixel G2 on the first line of the reduced image shown on the right side of the figure.
- For the B pixels, pixel data obtained by mixing the two B pixels in the frame Ba on the second line shown on the left of FIG. 12 is set as the B pixel B1 on the first line of the reduced image shown on the right of the figure, and for the R pixels, pixel data obtained by mixing the two R pixels in the frame Ra on the eleventh line shown on the left of FIG. 12 is set as the R pixel R1 on the second line shown on the right of the figure. Thereafter, similar processing is performed to generate pixel data of a reduced Bayer array as shown on the right of the figure. That is, the pixel data of the G pixels on the (18n+2)-th and (18n+11)-th lines in the vertical direction are applied to the G pixels at the corresponding positions in the Bayer array pattern, pixel data obtained by mixing adjacent B pixels on the (18n+2)-th line in the vertical direction is applied to the B pixels at the corresponding positions in the Bayer array pattern, and pixel data obtained by mixing adjacent R pixels on the (18n+11)-th line in the vertical direction is applied to the R pixels at the corresponding positions in the Bayer array pattern, whereby the pixel data of each line read out by thinning from the image sensor 14 is converted into Bayer-array pixel data.
- In this embodiment, the pixel conversion processing unit 18 weights and adds the pixel data of pixels of the same color adjacent in the horizontal direction in the pixel data of each line read out by thinning from the image sensor 14, thereby generating R, G, and B pixel data at each pixel position, and generates Bayer-array pixel data based on the generated pixel data of each color.
- FIG. 13 schematically shows the flow of processing when shooting in the HD video mode.
- the pixel conversion processing unit 18 first weights and adds, according to the pixel position, the pixel data of pixels of the same color adjacent in the horizontal direction in the pixel data of each line read out by thinning from the image sensor 14, thereby generating R, G, and B pixel data at each pixel position.
- For the G pixels, G pixel data at each pixel position in the frame is generated by weighted addition based on the pixel data of the two G pixels in the frame Ga (spanning 3 pixels) on the first line shown in the upper left of FIG. 13. The center pixel in the frame Ga originally has only B pixel data, but G pixel data is interpolated at that position by the above weighted addition.
- This weighted addition is performed while shifting the frame Ga in the horizontal direction, so that G pixel data is generated for all pixel positions on the (6n+1)-th and (6n+3)-th lines.
- Similarly, for the R and B pixels, the above weighted addition is performed while shifting the frames Ra and Ba (each spanning 7 pixels) shown in the upper left of the figure in the horizontal direction, so that R and B pixel data are generated for all pixel positions on the (6n+1)-th and (6n+3)-th lines.
- In this way, R, G, and B pixel data are generated for the (6n+1)-th and (6n+3)-th lines.
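The position-dependent weighted addition can be sketched as follows; the inverse-distance weights are illustrative, since the text only states that the weights depend on pixel position:

```python
def weighted_value(samples, pos):
    """Estimate one color at position `pos` on a line from same-color
    samples given as (position, value) pairs on that line, weighting
    nearer samples more heavily (illustrative inverse-distance weights)."""
    num = den = 0.0
    for p, v in samples:
        w = 1.0 / (abs(p - pos) + 1)   # closer samples weigh more
        num += w * v
        den += w
    return num / den

# e.g. interpolating G at the center of frame Ga from the two G
# pixels beside it (the center position originally holds only B)
g_center = weighted_value([(0, 10.0), (2, 14.0)], 1)  # -> 12.0
```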
- Next, as shown in the lower left of FIG. 13, the pixel conversion processing unit 18 extracts, from the R, G, and B pixel data generated as described above, the pixel data at the positions corresponding to the Bayer array pattern, and generates pixel data of a Bayer array pattern as shown in the lower left of the figure. Then, pixels of the same color adjacent in the horizontal direction in this Bayer-array-pattern pixel data are mixed so that the number of pixels is halved in the horizontal direction, generating reduced Bayer-array-pattern pixel data.
- For the G pixels, pixel data obtained by mixing the two G pixels in the frame Ga on the first line shown in the lower left of FIG. 13 is set as a G pixel on the first line of the reduced image shown in the lower right of the figure, and pixel data obtained by mixing the two G pixels in the frame Gb on the second line shown in the lower left of FIG. 13 is set as the G pixel G1 on the second line of the reduced image shown in the lower right of the figure. For the R pixels, pixel data obtained by mixing the two R pixels in the frame Ra on the first line shown in the lower left of FIG. 13 is set as the R pixel R1 on the first line of the reduced image shown in the lower right of the figure, and for the B pixels, pixel data obtained by mixing the two B pixels in the frame Ba on the second line shown in the lower left of FIG. 13 is set as the B pixel B1 on the second line shown in the lower right of the figure. Thereafter, similar processing is performed to generate pixel data of a reduced Bayer array as shown in the lower right of FIG. 13.
- This embodiment is different from the seventh embodiment in the method of thinning image signals, and the others are the same as in the seventh embodiment.
- FIG. 14 schematically shows a processing flow when shooting in the HD moving image mode.
- the subsequent processing is the same as in the seventh embodiment.
- This embodiment is different from the seventh embodiment in the method of thinning image signals, and the others are the same as in the seventh embodiment.
- FIG. 15 schematically shows the flow of processing when shooting in the HD video mode.
- the subsequent processing is the same as in the seventh embodiment.
- This embodiment is different from the seventh embodiment in the shooting mode and the method of thinning out the imaging signals, and the other is the same as in the seventh embodiment.
- FIG. 16 schematically shows the flow of processing when shooting in the through video mode.
- the subsequent processing is the same as in the seventh embodiment.
- This embodiment is different from the seventh embodiment in the shooting mode and the method of thinning out the imaging signals, and the other is the same as in the seventh embodiment.
- FIG. 17 schematically shows the flow of processing when shooting in the through video mode.
- the subsequent processing is the same as in the seventh embodiment.
- This embodiment is different from the seventh embodiment in the shooting mode and the method of thinning out the imaging signals, and the other is the same as in the seventh embodiment.
- FIG. 18 schematically shows the flow of processing when shooting in the through video mode.
- In this embodiment, the pixel conversion processing unit 18 generates pixel data for two lines corresponding to the Bayer array pattern from the pixel data of each line read out by thinning from the image sensor 14, and converts the generated two lines of pixel data into Bayer-array pixel data.
- FIG. 19 schematically shows the flow of processing when shooting in the HD video mode.
- Pixel data obtained by mixing the two G pixels in the frame Ga is set as the G pixel G1 on the first line of the reduced image shown on the right of FIG. 19, and pixel data obtained by mixing the two G pixels in the frame Gb (spanning 3 pixels), adjacent to the left of the frame Ga, is set as the G pixel G2 on the second line of the reduced image shown on the right of the figure.
- one R pixel in the frame Ra of the first line shown on the left side of FIG. 19 is set as the R pixel R1 of the first line of the reduced image shown on the right side of FIG.
- One B pixel in the frame Ba on the first line shown on the left is defined as a B pixel B1 on the second line shown on the right in the figure.
- Similar processing is performed on the pixel data of the (6n+1)-th and (6n+3)-th lines, thereby generating pixel data of a reduced Bayer array as shown on the right of FIG. 19. That is, pixel data obtained by mixing two sets of adjacent G pixels on the (6n+1)-th line in the vertical direction is applied to the G pixels at the corresponding positions of the (2n+1)-th and (2n+2)-th lines in the vertical direction of the Bayer array pattern, and pixel data obtained by mixing two sets of adjacent G pixels on the (6n+3)-th line in the vertical direction is applied to the G pixels at the corresponding positions of the (2n+3)-th and (2n+4)-th lines in the vertical direction of the Bayer array pattern.
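The mapping of one read-out line onto two Bayer lines can be sketched as follows. The argument layout and helper name are illustrative assumptions; the point is that the two mixed G pairs feed the G positions of two consecutive Bayer lines, while the R and B values each go to one of those lines:

```python
def expand_to_two_bayer_lines(g_pairs, r, b):
    """From one thinned read-out line, build two Bayer lines.

    g_pairs: ((g1a, g1b), (g2a, g2b)) -- two sets of adjacent G pixels,
    r, b: one R and one B pixel from the same line.  The layout is a
    simplified illustration of this embodiment's conversion.
    """
    g_top = (g_pairs[0][0] + g_pairs[0][1]) / 2  # mixed pair -> Bayer line 2n+1
    g_bot = (g_pairs[1][0] + g_pairs[1][1]) / 2  # mixed pair -> Bayer line 2n+2
    return [g_top, r], [b, g_bot]
```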
- This embodiment is different from the thirteenth embodiment in the shooting mode and the thinning process, and the others are the same as in the thirteenth embodiment.
- FIG. 20 schematically shows the flow of processing when shooting in the through video mode.
- Pixel data obtained by mixing the two G pixels in the frame Ga is set as the G pixel G1 on the first line of the reduced image shown on the right of the figure.
- pixel data obtained by mixing two G pixels in the frame Gb is set as a G pixel G2 in the second line of the reduced image shown on the right side of the figure.
- R pixel one R pixel in the frame Ra on the first line shown on the left in FIG. 20 is set as the R pixel on the first line of the reduced image shown on the right in FIG.
- The mixed G pixel data are thus each applied to the G pixels, the pixel data of the R pixels on the (6n+1)-th line is applied to the R pixels at the corresponding positions of the (2n+1)-th line of the Bayer array pattern, and the pixel data of the B pixels on the (6n+1)-th line is applied to the pixels of the third color at the corresponding positions of the (2n+2)-th line of the Bayer array pattern, whereby the pixel data of each line read out by thinning from the image sensor 14 is converted into Bayer-array pixel data.
- This embodiment is different from the thirteenth embodiment in the shooting mode and the thinning process, and the others are the same as in the thirteenth embodiment.
- FIG. 21 schematically shows the flow of processing when shooting in the through video mode.
- the pixel data of one pixel in the frame Ga is set as the G pixel G1 on the first line of the reduced image shown on the right of the figure.
- the pixel data of one pixel in the frame Gb is set as the G pixel G2 of the second line of the reduced image shown on the right side of the figure.
- one R pixel in the frame Ra of the second line shown on the left side of FIG. 21 is set as the R pixel R1 of the first line of the reduced image shown on the right side of FIG.
- One B pixel in the frame Ba on the second line shown on the left is defined as a B pixel B1 on the second line shown on the right in the figure.
- Thereafter, the same processing is performed on the pixel data of the (12n+2)-th and (12n+8)-th lines; note that the positions of the R pixels and the B pixels are switched, as shown on the right of the figure, and pixel data of a reduced Bayer array as shown on the right of the figure is generated.
- That is, the two sets of G pixels on the (12n+2)-th line are applied to the G pixels at the corresponding positions of the (2n+1)-th and (2n+2)-th lines in the vertical direction of the Bayer array pattern, pixel data obtained by mixing adjacent R pixels on the (12n+2)-th line is applied to the R pixels at the corresponding positions of the (2n+1)-th line of the Bayer array pattern, pixel data obtained by mixing adjacent R pixels on the (12n+8)-th line is applied to the R pixels at the corresponding positions of the (2n+4)-th line of the Bayer array pattern, pixel data obtained by mixing adjacent B pixels on the (12n+2)-th line is applied to the pixels of the third color at the corresponding positions of the (2n+2)-th line of the Bayer array pattern, and pixel data obtained by mixing adjacent B pixels on the (12n+8)-th line is applied to the pixels of the third color at the corresponding positions of the (2n+3)-th line of the Bayer array pattern, whereby the pixel data of each line read out by thinning from the image sensor 14 is converted into Bayer-array pixel data.
- In the above embodiments, a color filter array of the three primary colors R, G, and B has been described, but the type of color filter is not limited to this.
- Imaging device
- 12 Optical system
- 14 Image sensor
- 16 Imaging processing unit
- 18 Pixel conversion processing unit
- 20 Image processing unit
- 22 Drive unit
- 24 Control unit
- 30 Color filter
Abstract
Description
Claims (21)
- An imaging apparatus comprising: an image sensor including a plurality of photoelectric conversion elements arranged in a horizontal direction and a vertical direction;
a color filter provided over a plurality of pixels formed by the plurality of photoelectric conversion elements, in which a 6×6-pixel basic array pattern is repeatedly arranged, the basic array pattern being formed by arranging, point-symmetrically, a first array pattern, in which first filters corresponding to a first color that contributes most to obtaining a luminance signal are arranged on the four corner pixels and the center pixel of a 3×3-pixel square array, second filters corresponding to a second color different from the first color are arranged on the center line of the square array in the horizontal direction, and third filters corresponding to a third color different from the first color and the second color are arranged on the center line of the square array in the vertical direction, and a second array pattern, in which the arrangement of the first filters is the same as in the first array pattern and the arrangement of the second filters and the arrangement of the third filters are interchanged;
driving means for driving the image sensor so as to read out only the pixel data of pixels on lines at predetermined positions in the vertical direction;
pixel conversion means for converting the pixel data of each line read out by thinning from the image sensor into Bayer-array pixel data having a Bayer array pattern in which the two pixels on one diagonal of a 2×2-pixel square array are pixels of the first color and the two pixels on the other diagonal are a pixel of the second color and a pixel of the third color; and
generating means for generating pixel data of each color for each pixel by interpolating, based on the Bayer-array pixel data, pixel data of colors other than the color corresponding to the pixel from the pixel data of surrounding pixels.
- The pixel conversion means converts the pixel data of each line read out by thinning from the image sensor into the Bayer-array pixel data by pixel-mixing the pixel data of pixels of the same color adjacent in the horizontal direction,
The imaging apparatus according to claim 1. - The driving means drives the image sensor so as to read out, by thinning, only the pixel data of pixels on the (6n+1)-th (n = 0, 1, 2, ...) first lines and the (6n+3)-th (n = 0, 1, 2, ...) third lines in the vertical direction, and
the pixel conversion means converts the pixel data of each line read out by thinning from the image sensor into the Bayer-array pixel data by applying pixel data obtained by pixel-mixing adjacent pixel data of the first color on the first lines and the third lines to the pixels of the first color at the corresponding positions of the Bayer array pattern, applying the pixel data of the third color on the first lines to the pixels of the third color at the corresponding positions of the Bayer array pattern, and applying the pixel data of the second color on the third lines to the pixels of the second color at the corresponding positions of the Bayer array pattern,
The imaging apparatus according to claim 2. - The driving means drives the image sensor so as to read out, by thinning, only the pixel data of pixels on the (6n+2)-th (n = 0, 1, 2, ...) second lines and the (6n+5)-th (n = 0, 1, 2, ...) fifth lines in the vertical direction, and
the pixel conversion means converts the pixel data of each line read out by thinning from the image sensor into the Bayer-array pixel data by applying the pixel data of the first color on the second lines and the fifth lines to the pixels of the first color at the corresponding positions of the Bayer array pattern, applying pixel data obtained by pixel-mixing adjacent pixel data of the second color on the second lines to the pixels of the second color at the corresponding positions of the Bayer array pattern, and applying pixel data obtained by pixel-mixing adjacent pixel data of the third color on the fifth lines to the pixels of the third color at the corresponding positions of the Bayer array pattern,
The imaging apparatus according to claim 2. - The driving means drives the image sensor so as to read out, by thinning, only the pixel data of pixels on the (6n+1)-th (n = 0, 1, 2, ...) first lines and the (6n+4)-th (n = 0, 1, 2, ...) fourth lines in the vertical direction, and
the pixel conversion means converts the pixel data of each line read out by thinning from the image sensor into the Bayer-array pixel data by applying pixel data obtained by pixel-mixing adjacent pixel data of the first color on the first lines and the fourth lines to the pixels of the first color at the corresponding positions of the Bayer array pattern, applying the pixel data of the second color on the first lines to the pixels of the second color at the corresponding positions of the Bayer array pattern, and applying pixel data obtained by pixel-mixing adjacent pixel data of the third color on the fourth lines to the pixels of the third color at the corresponding positions of the Bayer array pattern,
The imaging apparatus according to claim 2. - The driving means drives the image sensor so as to read out, by thinning, only the pixel data of pixels on the (12n+1)-th (n = 0, 1, 2, ...) first lines and the (12n+7)-th (n = 0, 1, 2, ...) seventh lines in the vertical direction, and
the pixel conversion means converts the pixel data of each line read out by thinning from the image sensor into the Bayer-array pixel data by applying pixel data obtained by pixel-mixing adjacent pixel data of the first color on the first lines and the seventh lines to the pixels of the first color at the corresponding positions of the Bayer array pattern, applying the pixel data of the second color on the first lines in the vertical direction to the pixels of the second color at the corresponding positions of the Bayer array pattern, and applying the pixel data of the third color on the seventh lines in the vertical direction to the pixels of the third color at the corresponding positions of the Bayer array pattern,
The imaging apparatus according to claim 2. - The driving means drives the image sensor so as to read out, by thinning, only the pixel data of pixels on the (12n+2)-th (n = 0, 1, 2, ...) second lines and the (12n+8)-th (n = 0, 1, 2, ...) eighth lines in the vertical direction, and
the pixel conversion means converts the pixel data of each line read out by thinning from the image sensor into the Bayer-array pixel data by applying the pixel data of the first color on the second lines and the eighth lines to the pixels of the first color at the corresponding positions of the Bayer array pattern, applying pixel data obtained by pixel-mixing adjacent pixel data of the second color on the second lines in the vertical direction to the pixels of the second color at the corresponding positions of the Bayer array pattern, and applying pixel data obtained by pixel-mixing adjacent pixel data of the third color on the eighth lines in the vertical direction to the pixels of the third color at the corresponding positions of the Bayer array pattern,
The imaging apparatus according to claim 2. - The driving means drives the image sensor so as to read out, by thinning, only the pixel data of pixels on the (18n+2)-th (n = 0, 1, 2, ...) second lines and the (18n+11)-th (n = 0, 1, 2, ...) eleventh lines in the vertical direction, and
the pixel conversion means converts the pixel data of each line read out by thinning from the image sensor into the Bayer-array pixel data by applying the pixel data of the first color on the second lines and the eleventh lines to the pixels of the first color at the corresponding positions of the Bayer array pattern, applying pixel data obtained by pixel-mixing adjacent pixel data of the third color on the second lines in the vertical direction to the pixels of the third color at the corresponding positions of the Bayer array pattern, and applying the pixel data of the second color on the eleventh lines in the vertical direction to the pixels of the second color at the corresponding positions of the Bayer array pattern,
The imaging device according to claim 2. - The pixel conversion means generates first-color, second-color, and third-color pixel data at each pixel position by weighted addition, according to pixel position, of the pixel data of horizontally adjacent same-color pixels in the pixel data of each line thinned out and read from the imaging element, and generates the Bayer array pixel data on the basis of the generated pixel data of each color
The imaging device according to claim 1. - The drive means drives the imaging element so as to thin out and read out only the pixel data of the pixels on the (6n+1)-th (n = 0, 1, 2, ...) first lines and the (6n+3)-th (n = 0, 1, 2, ...) third lines in the vertical direction, and
the pixel conversion means generates first-color, second-color, and third-color pixel data at each pixel position by weighted addition, according to pixel position, of the pixel data of adjacent same-color pixels on each of the first and third lines, and generates the Bayer array pixel data on the basis of the generated pixel data of each color
The imaging device according to claim 9. - The drive means drives the imaging element so as to thin out and read out only the pixel data of the pixels on the (6n+2)-th (n = 0, 1, 2, ...) second lines and the (6n+5)-th (n = 0, 1, 2, ...) fifth lines in the vertical direction, and
the pixel conversion means generates first-color, second-color, and third-color pixel data at each pixel position by weighted addition, according to pixel position, of the pixel data of adjacent same-color pixels on each of the second and fifth lines, and generates the Bayer array pixel data on the basis of the generated pixel data of each color
The imaging device according to claim 9. - The drive means drives the imaging element so as to thin out and read out only the pixel data of the pixels on the (6n+1)-th (n = 0, 1, 2, ...) first lines and the (6n+4)-th (n = 0, 1, 2, ...) fourth lines in the vertical direction, and
the pixel conversion means generates first-color, second-color, and third-color pixel data at each pixel position by weighted addition, according to pixel position, of the pixel data of adjacent same-color pixels on each of the first and fourth lines, and generates the Bayer array pixel data on the basis of the generated pixel data of each color
The imaging device according to claim 9. - The drive means drives the imaging element so as to thin out and read out only the pixel data of the pixels on the (6n+1)-th (n = 0, 1, 2, ...) first lines in the vertical direction, and
the pixel conversion means generates first-color, second-color, and third-color pixel data at each pixel position by weighted addition, according to pixel position, of the pixel data of adjacent same-color pixels on the first line, and generates the Bayer array pixel data on the basis of the generated pixel data of each color
The imaging device according to claim 9. - The drive means drives the imaging element so as to thin out and read out only the pixel data of the pixels on the (6n+2)-th (n = 0, 1, 2, ...) second lines in the vertical direction, and
the pixel conversion means generates first-color, second-color, and third-color pixel data at each pixel position by weighted addition, according to pixel position, of the pixel data of adjacent same-color pixels on the second line, and generates the Bayer array pixel data on the basis of the generated pixel data of each color
The imaging device according to claim 9. - The drive means drives the imaging element so as to thin out and read out only the pixel data of the pixels on the (9n+2)-th (n = 0, 1, 2, ...) second lines in the vertical direction, and
the pixel conversion means generates first-color, second-color, and third-color pixel data at each pixel position by weighted addition, according to pixel position, of the pixel data of adjacent same-color pixels on the second line, and generates the Bayer array pixel data on the basis of the generated pixel data of each color
The imaging device according to claim 9. - The pixel conversion means generates, from the pixel data of each line thinned out and read from the imaging element, two lines of pixel data corresponding to the Bayer array pattern, and converts the generated two lines of pixel data into the Bayer array pixel data
The imaging device according to claim 1. - The drive means drives the imaging element so as to thin out and read out only the pixel data of the pixels on the (6n+1)-th (n = 0, 1, 2, ...) first lines and the (6n+3)-th (n = 0, 1, 2, ...) third lines in the vertical direction, and
the pixel conversion means applies pixel data obtained by pixel-mixing two pairs of adjacent first-color pixel data on the first line to the respective first-color pixels at the corresponding positions of the (2n+1)-th (n = 0, 1, 2, ...) first Bayer array line and the (2n+2)-th (n = 0, 1, 2, ...) second Bayer array line of the Bayer array pattern in the vertical direction, applies pixel data obtained by pixel-mixing two pairs of adjacent first-color pixel data on the third line to the respective first-color pixels at the corresponding positions of the (2n+3)-th (n = 0, 1, 2, ...) third Bayer array line and the (2n+4)-th (n = 0, 1, 2, ...) fourth Bayer array line of the Bayer array pattern in the vertical direction, applies the second-color pixel data on the first and third lines to the second-color pixels at the corresponding positions of the first and third Bayer array lines of the Bayer array pattern, respectively, and applies the third-color pixel data on the first and third lines to the third-color pixels at the corresponding positions of the second and fourth Bayer array lines of the Bayer array pattern, thereby converting the pixel data of each line thinned out and read from the imaging element into the Bayer array pixel data
The imaging device according to claim 16. - The drive means drives the imaging element so as to thin out and read out only the pixel data of the pixels on the (6n+1)-th (n = 0, 1, 2, ...) first lines in the vertical direction, and
the pixel conversion means applies pixel data obtained by pixel-mixing two pairs of adjacent first-color pixel data on the first line to the respective first-color pixels at the corresponding positions of the (2n+1)-th (n = 0, 1, 2, ...) first Bayer array line and the (2n+2)-th (n = 0, 1, 2, ...) second Bayer array line of the Bayer array pattern in the vertical direction, applies the second-color pixel data on the first line to the second-color pixels at the corresponding positions of the first Bayer array line of the Bayer array pattern, and applies the third-color pixel data on the first line to the third-color pixels at the corresponding positions of the second Bayer array line of the Bayer array pattern, thereby converting the pixel data of each line thinned out and read from the imaging element into the Bayer array pixel data
The imaging device according to claim 16. - The drive means drives the imaging element so as to thin out and read out only the pixel data of the pixels on the (12n+2)-th (n = 0, 1, 2, ...) second lines and the (12n+8)-th (n = 0, 1, 2, ...) eighth lines in the vertical direction, and
the pixel conversion means applies two sets of first-color pixel data on the second line to the respective first-color pixels at the corresponding positions of the (2n+1)-th (n = 0, 1, 2, ...) first Bayer array line and the (2n+2)-th (n = 0, 1, 2, ...) second Bayer array line of the Bayer array pattern in the vertical direction, applies two sets of first-color pixel data on the eighth line to the respective first-color pixels at the corresponding positions of the (2n+3)-th (n = 0, 1, 2, ...) third Bayer array line and the (2n+4)-th (n = 0, 1, 2, ...) fourth Bayer array line of the Bayer array pattern in the vertical direction, applies pixel data obtained by pixel-mixing adjacent second-color pixel data on the second line to the second-color pixels at the corresponding positions of the first Bayer array line of the Bayer array pattern, applies pixel data obtained by pixel-mixing adjacent second-color pixel data on the eighth line to the second-color pixels at the corresponding positions of the fourth Bayer array line of the Bayer array pattern, applies pixel data obtained by pixel-mixing adjacent third-color pixel data on the second line to the third-color pixels at the corresponding positions of the second Bayer array line of the Bayer array pattern, and applies pixel data obtained by pixel-mixing adjacent third-color pixel data on the eighth line to the third-color pixels at the corresponding positions of the third Bayer array line of the Bayer array pattern, thereby converting the pixel data of each line thinned out and read from the imaging element into the Bayer array pixel data
The imaging device according to claim 16. - The first color is green (G), the second color is one of red (R) and blue (B), and the third color is the other of red (R) and blue (B)
The imaging device according to any one of claims 1 to 19. - An imaging program for causing a computer to function as the pixel conversion means constituting the imaging device according to any one of claims 1 to 20.
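The claims above describe two families of conversion: applying same-color pixel data from a thinned-out line directly to Bayer-pattern positions, and pixel-mixing (or weighted addition of) adjacent same-color pixel data first. The following is a minimal illustrative sketch of the pixel-mixing step only; the column layout and the equal mixing weights below are hypothetical assumptions for demonstration, not the color filter array or weights defined in the patent's embodiments.

```python
# Illustrative sketch only: the line layout and equal weights are
# hypothetical assumptions, not the patent's actual color filter array.

def mix(a, b, wa=1, wb=1):
    """Pixel-mix two same-color samples with (optionally weighted) averaging."""
    return (wa * a + wb * b) / (wa + wb)

def line_to_bayer_samples(colors, values):
    """For one read-out line, collect same-color samples in column order
    and pixel-mix adjacent pairs, emulating the step that maps a thinned
    line's pixel data onto Bayer-pattern pixel positions."""
    out = {}
    for c in sorted(set(colors)):
        samples = [v for col, v in zip(colors, values) if col == c]
        out[c] = [mix(samples[i], samples[i + 1])
                  for i in range(0, len(samples) - 1, 2)]
    return out

# Hypothetical 6-column-period line: G-heavy with interleaved R and B.
colors = ["G", "R", "G", "G", "B", "G", "G", "R", "G", "G", "B", "G"]
values = [10, 20, 12, 14, 30, 16, 18, 22, 20, 22, 32, 24]
print(line_to_bayer_samples(colors, values))
# → {'B': [31.0], 'G': [11.0, 15.0, 19.0, 23.0], 'R': [21.0]}
```

The weighted-addition variant of the claims differs only in passing position-dependent weights (`wa`, `wb`) to `mix` instead of equal weights.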
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013504501A JP5474258B2 (ja) | 2011-03-11 | 2011-07-29 | Imaging device and imaging program |
EP11860926.2A EP2685725B1 (en) | 2011-03-11 | 2011-07-29 | Imaging device and imaging program |
CN201180069092.3A CN103416067B (zh) | 2011-03-11 | 2011-07-29 | 摄像装置 |
US14/021,536 US8928785B2 (en) | 2011-03-11 | 2013-09-09 | Imaging device and storage medium storing an imaging program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011054631 | 2011-03-11 | ||
JP2011-054631 | 2011-03-11 | ||
JP2011-163306 | 2011-07-26 | ||
JP2011163306 | 2011-07-26 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/021,536 Continuation US8928785B2 (en) | 2011-03-11 | 2013-09-09 | Imaging device and storage medium storing an imaging program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012124181A1 true WO2012124181A1 (ja) | 2012-09-20 |
Family
ID=46830283
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/067544 WO2012124181A1 (ja) | 2011-03-11 | 2011-07-29 | Imaging device and imaging program |
Country Status (5)
Country | Link |
---|---|
US (1) | US8928785B2 (ja) |
EP (1) | EP2685725B1 (ja) |
JP (1) | JP5474258B2 (ja) |
CN (1) | CN103416067B (ja) |
WO (1) | WO2012124181A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012124182A1 (ja) * | 2011-03-11 | 2012-09-20 | Fujifilm Corporation | Imaging device and imaging program |
WO2013100036A1 (ja) * | 2011-12-27 | 2013-07-04 | Fujifilm Corporation | Color imaging element |
JP6016423B2 (ja) * | 2012-04-10 | 2016-10-26 | Canon Inc. | Signal processing device, imaging device, and signal processing method |
IN2015DE03212A (ja) * | 2015-10-06 | 2015-10-23 | Hcl Technologies Ltd | |
CN106488203B (zh) * | 2016-11-29 | 2018-03-30 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing method, image processing device, imaging device, and electronic device |
CN106507068B (zh) | 2016-11-29 | 2018-05-04 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing method and device, control method and device, imaging device, and electronic device |
CN109246373B (zh) * | 2018-10-31 | 2021-03-02 | Shanghai IC R&D Center Co., Ltd. | Method and device for adjusting the pixel arrangement of an image output by an image sensor |
CN112218062B (zh) * | 2020-10-12 | 2022-04-22 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image scaling device, electronic device, image scaling method, and image processing chip |
KR20220144087A (ko) * | 2021-04-19 | 2022-10-26 | Samsung Electronics Co., Ltd. | Image sensor and image sensing system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10243407A (ja) * | 1997-02-27 | 1998-09-11 | Olympus Optical Co Ltd | Image signal processing device and image input processing device |
JP2010512048A (ja) * | 2006-11-30 | 2010-04-15 | Eastman Kodak Company | Producing low-resolution images |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3268891B2 (ja) * | 1992-08-14 | 2002-03-25 | Olympus Optical Co., Ltd. | Endoscope imaging device |
JPH0823543A (ja) | 1994-07-07 | 1996-01-23 | Canon Inc | Imaging device |
JPH0823542A (ja) | 1994-07-11 | 1996-01-23 | Canon Inc | Imaging device |
EP0930789B1 (en) * | 1998-01-20 | 2005-03-23 | Hewlett-Packard Company, A Delaware Corporation | Colour image pickup device |
JP4098438B2 (ja) | 1999-04-15 | 2008-06-11 | Olympus Corporation | Color imaging element and color imaging device |
JP4487351B2 (ja) * | 1999-07-15 | 2010-06-23 | Sony Corporation | Solid-state imaging element, driving method thereof, and camera system |
JP2002135793A (ja) | 2000-10-20 | 2002-05-10 | Victor Co Of Japan Ltd | Color imaging device |
US7847829B2 (en) * | 2001-01-09 | 2010-12-07 | Sony Corporation | Image processing apparatus restoring color image signals |
JP4019417B2 (ja) | 2003-01-14 | 2007-12-12 | Sony Corporation | Image processing device and method, recording medium, and program |
JP2004266369A (ja) | 2003-02-21 | 2004-09-24 | Sony Corp | Solid-state imaging device and driving method thereof |
JP4385282B2 (ja) | 2003-10-31 | 2009-12-16 | Sony Corporation | Image processing device and image processing method |
JP3960965B2 (ja) | 2003-12-08 | 2007-08-15 | Olympus Corporation | Image interpolation device and image interpolation method |
EP1558023A2 (en) * | 2004-01-21 | 2005-07-27 | Matsushita Electric Industrial Co., Ltd. | Solid state imaging device and driving method thereof |
JP4524609B2 (ja) * | 2004-10-29 | 2010-08-18 | Sony Corporation | Solid-state imaging element, driving method of solid-state imaging element, and imaging device |
JP4701975B2 (ja) * | 2005-10-05 | 2011-06-15 | Panasonic Corporation | Solid-state imaging device and imaging device |
US7821553B2 (en) | 2005-12-30 | 2010-10-26 | International Business Machines Corporation | Pixel array, imaging sensor including the pixel array and digital camera including the imaging sensor |
JP4662883B2 (ja) | 2006-05-15 | 2011-03-30 | Fujifilm Corporation | Two-dimensional color solid-state imaging element |
JP2008078794A (ja) * | 2006-09-19 | 2008-04-03 | Pentax Corp | Imaging element driving device |
US20080125507A1 (en) | 2006-11-28 | 2008-05-29 | Bayer Materialscience Llc | Reduction of VOC emissions from low density cavity filling NVH polyurethane foams |
US7701496B2 (en) * | 2006-12-22 | 2010-04-20 | Xerox Corporation | Color filter pattern for color filter arrays including a demosaicking algorithm |
JP5082528B2 (ja) | 2007-03-23 | 2012-11-28 | Sony Corporation | Solid-state imaging device and imaging device |
US7745779B2 (en) * | 2008-02-08 | 2010-06-29 | Aptina Imaging Corporation | Color pixel arrays having common color filters for multiple adjacent pixels for use in CMOS imagers |
JP2009246465A (ja) | 2008-03-28 | 2009-10-22 | Panasonic Corp | Imaging device, imaging module, and imaging system |
US8902321B2 (en) * | 2008-05-20 | 2014-12-02 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
JP5472584B2 (ja) * | 2008-11-21 | 2014-04-16 | Sony Corporation | Imaging device |
JP5149143B2 (ja) * | 2008-12-24 | 2013-02-20 | Sharp Corporation | Solid-state imaging element, manufacturing method thereof, and electronic information device |
JP5471117B2 (ja) * | 2009-07-24 | 2014-04-16 | Sony Corporation | Solid-state imaging device, manufacturing method thereof, and camera |
WO2012124182A1 (ja) * | 2011-03-11 | 2012-09-20 | Fujifilm Corporation | Imaging device and imaging program |
WO2013001868A1 (ja) * | 2011-06-30 | 2013-01-03 | Fujifilm Corporation | Imaging device, imaging method, and imaging program |
BE1020345A5 (nl) | 2012-05-25 | 2013-08-06 | Atla Con Bvba | Filling and welding device and method for filling and seal-welding a bag. |
- 2011
  - 2011-07-29 CN CN201180069092.3A patent/CN103416067B/zh not_active Expired - Fee Related
  - 2011-07-29 JP JP2013504501A patent/JP5474258B2/ja not_active Expired - Fee Related
  - 2011-07-29 WO PCT/JP2011/067544 patent/WO2012124181A1/ja active Application Filing
  - 2011-07-29 EP EP11860926.2A patent/EP2685725B1/en not_active Not-in-force
- 2013
  - 2013-09-09 US US14/021,536 patent/US8928785B2/en not_active Expired - Fee Related
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10243407A (ja) * | 1997-02-27 | 1998-09-11 | Olympus Optical Co Ltd | Image signal processing device and image input processing device |
JP2010512048A (ja) * | 2006-11-30 | 2010-04-15 | Eastman Kodak Company | Producing low-resolution images |
Non-Patent Citations (1)
Title |
---|
See also references of EP2685725A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP2685725B1 (en) | 2016-12-07 |
US20140009655A1 (en) | 2014-01-09 |
JP5474258B2 (ja) | 2014-04-16 |
JPWO2012124181A1 (ja) | 2014-07-17 |
EP2685725A4 (en) | 2014-09-03 |
US8928785B2 (en) | 2015-01-06 |
CN103416067A (zh) | 2013-11-27 |
CN103416067B (zh) | 2014-10-29 |
EP2685725A1 (en) | 2014-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5474258B2 (ja) | Imaging device and imaging program | |
EP2690874A1 (en) | Color image sensor, imaging device, and control program for imaging device | |
JP5702896B2 (ja) | Color imaging element and imaging device | |
US20060050956A1 (en) | Signal processing apparatus, signal processing method, and signal processing program | |
EP2690873A1 (en) | Color imaging element, imaging device, and imaging program | |
JP4796871B2 (ja) | Imaging device | |
EP2690875A1 (en) | Color image sensor, imaging device, and control program for imaging device | |
EP2690872A1 (en) | Color image capturing element, image capturing device, and image capturing program | |
EP2690871A1 (en) | Color image sensor, imaging device, and imaging program | |
US8723993B2 (en) | Imaging device and storage medium storing an imaging program | |
US8976275B2 (en) | Color imaging element | |
EP2800383A1 (en) | Color imaging element | |
JP5033711B2 (ja) | Imaging device and driving method of imaging device | |
WO2013099917A1 (ja) | Imaging device | |
JP5607266B2 (ja) | Imaging device, control method of imaging device, and control program | |
US8711257B2 (en) | Color imaging device | |
JP5624228B2 (ja) | Imaging device, control method of imaging device, and control program | |
JP5607267B2 (ja) | Imaging device, control method of imaging device, and control program | |
JP5624227B2 (ja) | Imaging device, control method of imaging device, and control program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11860926 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2013504501 Country of ref document: JP Kind code of ref document: A |
REEP | Request for entry into the european phase |
Ref document number: 2011860926 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2011860926 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |