US20060098240A1 - Shading compensation device, shading compensation value calculation device and imaging device
- Publication number
- US20060098240A1 (application Ser. No. 11/265,291)
- Authority
- US
- United States
- Prior art keywords
- shading compensation
- shading
- component
- compensation value
- relative position
- Prior art date
- Legal status
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/646—Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/40—Picture signal circuits
- H04N1/401—Compensating positionally unequal response of the pick-up or reproducing head
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
Definitions
- the present invention relates to a shading compensation device, a shading compensation value calculation device and an imaging device, and relates to, for example, an excellent shading compensation device, shading compensation value calculation device and imaging device which are employed at a time of compensating for shading of an image signal which is outputted from a high-pixel count image capture device.
- CCD camera shading compensation circuits have been disclosed (see, for example, Japanese Patent Application Laid-Open (JP-A) No. 5-83622) which are formed such that shading does not vary when the aperture of a lens is altered.
- the shading compensation circuit recited in JP-A No. 5-83622, in response to the size of the lens aperture, designates an address of shading compensation data which has been written to a ROM beforehand, and alters shading compensation signals in accordance with the shading compensation data read from the ROM.
- distance detection devices for vehicles have been disclosed (see, for example, JP-A No. 6-273171) which prevent effects of shading of an imaging system and improve accuracy of detection of distances.
- the distance detection device for a vehicle recited in JP-A No. 6-273171 stores shading compensation data in a memory in advance and, by multiplying the shading compensation data stored in the memory with data of respective pixels of a digital image, provides a shading-compensated digital image.
- image processing devices have also been disclosed (see, for example, JP-A No. 6-319042).
- the image processing device recited in JP-A No. 6-319042 calculates average values of brightness data for each of a plurality of regions, infers brightnesses of the pixels in the regions, calculates shading compensation values on the basis of the inferred pixel brightnesses, and utilizes these shading compensation values to correct the brightness data.
- the present invention is proposed in order to solve the problems described above, and an object of the present invention is to provide a shading compensation device, shading compensation value calculation device and imaging device which are capable of compensating for shading which differs for respective colors.
- a first aspect of the present invention provides a shading compensation device comprising: a compensation table storage component which stores, for each of colors, a compensation table in which shading compensation values are associated with each of a plurality of representative positions of a single screen; a relative position calculation component which, for each color, causes an image signal of the single screen and the compensation table stored in the compensation table storage component to match scale, and calculates a relative position, with respect to the representative positions of the compensation table, of a pixel, of the image signal of the single screen, that is a processing object; a shading compensation value interpolation component which, for each color, interpolates a shading compensation value for the relative position calculated by the relative position calculation component on the basis of the relative position and the shading compensation values of the representative positions; and a shading compensation component which, for each color, generates a shading-compensated pixel signal based on a pixel signal of the pixel that is a processing object and the shading compensation value interpolated by the shading compensation value interpolation component.
- Compensation value tables for the respective colors are stored at the compensation table storage component.
- the colors are not limited to three primary colors but may include other colors.
- the compensation value tables associate the shading compensation values with each of the plurality of representative positions of a single screen.
- a single screen may be a single frame, and may be a single field. With these compensation value tables, data volumes can be greatly reduced in comparison with cases featuring one shading compensation value for each pixel.
- the relative position calculation component causes the image signal of the single screen to correspond with the compensation tables stored in the compensation table storage component and, for pixels which are processing objects in the image signal of the single screen, calculates relative positions with respect to the representative positions of the compensation tables. That is, the relative position calculation component calculates one relative position for each pixel of the image signal.
- the shading compensation value interpolation component interpolates a shading compensation value for the relative position, on the basis of the relative position calculated by the relative position calculation component and the shading compensation values of the representative positions.
- the shading compensation component generates shading-compensated image signals, for the respective colors, on the basis of pixel signals of the pixels which are processing objects and the shading compensation values interpolated by the shading compensation value interpolation component. That is, the shading compensation component implements shading compensation for each individual pixel.
- by utilizing the compensation tables, which associate the shading compensation values with each of the plurality of representative positions of the single screen, to perform shading compensation, the shading compensation device relating to the present invention is capable of compensating for shading which differs for the respective colors, while greatly reducing the amount of data that must be prepared beforehand.
- the shading compensation value interpolation component may utilize the shading compensation values of a predetermined number of the representative positions, which are peripheral to a relative position calculated by the relative position calculation component, for interpolating the shading compensation value for the relative position.
- the shading compensation device described above may further include a lens shading compensation value calculation component, which calculates lens shading compensation values on the basis of distances from a central position of the screen to the pixels that are processing objects.
- the shading compensation component also utilizes the lens shading compensation values calculated by the lens shading compensation value calculation component for generating the shading-compensated pixel signals.
- shading caused at a lens can also be compensated for.
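As a concrete illustration, the lens shading compensation value calculation based on distance from the screen centre can be sketched as below; the quadratic falloff model and the coefficient k are assumptions for illustration, since the actual function is not specified here.

```python
import math

def lens_shading_gain(x, y, width=2048, height=1536, k=0.3):
    """Radial gain that grows with distance from the screen centre.

    k is a hypothetical falloff coefficient; the actual function used by
    the lens shading compensation value calculation component is not
    specified here.
    """
    cx, cy = width / 2, height / 2
    # Normalise the distance so that a corner pixel has r = 1.0.
    r = math.hypot(x - cx, y - cy) / math.hypot(cx, cy)
    return 1.0 + k * r * r  # gain of 1.0 at the centre, 1 + k at the corners

# A centre pixel is left unchanged; a corner pixel is boosted.
print(lens_shading_gain(1024, 768))  # 1.0
print(lens_shading_gain(0, 0))       # 1.3
```

The shading compensation component would multiply this gain into the pixel signal together with the table-based compensation value.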
- a second aspect of the present invention provides an imaging device comprising: an image capture component, which takes an image of a subject and generates an image signal; a storage component, which stores the image signal; a black level correction component, which corrects a black level of the image signal; a shading compensation device, which implements shading compensation of the image signal of which the black level has been corrected by the black level correction component; and a control component which performs control for, in a case of a usual exposure mode, supplying the image signal generated by the image capture component to the black level correction component, and in a case of a long-duration exposure mode, writing the image signal generated by the image capture component to the storage component and, after image capture, reading the image signal stored at the storage component and supplying the same to the black level correction component, wherein the shading compensation device includes: a compensation table storage component which stores, for each of colors, a compensation table in which shading compensation values are associated with each of a plurality of representative positions of a single screen; a relative position calculation component which, for each color, causes the image signal of the single screen and the compensation table stored in the compensation table storage component to match scale, and calculates a relative position, with respect to the representative positions of the compensation table, of a pixel, of the image signal of the single screen, that is a processing object; a shading compensation value interpolation component which, for each color, interpolates a shading compensation value for the relative position calculated by the relative position calculation component on the basis of the relative position and the shading compensation values of the representative positions; and a shading compensation component which, for each color, generates a shading-compensated pixel signal based on a pixel signal of the pixel that is a processing object and the shading compensation value interpolated by the shading compensation value interpolation component.
- in the case of the usual exposure mode, the control component feeds the image signal generated by the image capture component to the black level correction component.
- the image signals generated by the image capture component are black level-corrected, and thereafter shading-compensated.
- in the case of the long-duration exposure mode, the control component writes the image signals generated by the image capture component to the storage component and, after image capture, reads out the image signals stored at the storage component and feeds them to the black level correction component.
- the image signals generated by the image capture component are temporarily written to the storage component.
- the black level correction component reliably detects the black level without time-variations, and performs black level correction of the image signals with high accuracy.
- the shading compensation device can perform shading compensation without black level floating.
- the imaging device described above can perform shading compensation without black level floating.
- a third aspect of the present invention provides a shading compensation value calculation device comprising: an image capture element, which takes an image of a subject and generates an image signal; an aggregation component which, for each of colors, aggregates color signals, which are included in the image signal generated by the image capture element, at each of a plurality of regions of a single screen, for calculating aggregate values; a white balance adjustment component which, on the basis of the aggregate values of each color calculated by the aggregation component, performs white balance adjustment for an overall region of the single screen with reference to a level of a reference color signal and a level of another color signal at a central portion of the single screen; and a shading compensation value calculation component, which calculates shading compensation values on the basis of the aggregate values of the respective regions of the single screen, which have been white balance-adjusted by the white balance adjustment component.
- the aggregation component aggregates color signals, which constitute the image signal generated by the image capture element, for each color at each of the plurality of regions of the single screen, to calculate aggregate values.
- the plural regions referred to here correspond, respectively, to the plurality of representative positions of the compensation value tables mentioned in the above.
- the white balance adjustment component implements white balance adjustment for the overall region of the single screen by reference to the level of the reference color signal and the levels of other color signals at the central portion of the single screen.
- although the levels of the respective color signals substantially coincide at the central portion of the screen, these levels may not coincide at end portions of the screen if shading occurs.
- the shading compensation value calculation component calculates shading compensation values on the basis of the aggregated values of the respective regions of the single screen which have been white balance-adjusted by the white balance adjustment component.
- by calculating shading compensation values in order to compensate for shading at respective regions of a single screen, the shading compensation value calculation device relating to the present invention is capable of providing shading compensation values with a reduced amount of data.
- the shading compensation value calculation component may calculate, for each of the regions of the single screen, a shading compensation value of another color signal on the basis of comparison of the aggregate value of the reference color signal with the aggregate value of the other color signal.
- the shading compensation value calculation component may, for each of the regions of the single screen, calculate a shading compensation value of another color signal on the basis of comparison of the aggregate value of the reference color signal at the screen central portion with the aggregate value of the other color signal, and calculate a shading compensation value of the reference color signal on the basis of comparison of the aggregate value of the reference color signal at the screen central portion with the aggregate value of the reference color signal at the each region.
- a fourth aspect of the present invention provides an imaging device comprising: a shading compensation device; and a shading compensation value calculation device, wherein the shading compensation device includes: a compensation table storage component which stores, for each of colors, a compensation table in which shading compensation values are associated with each of a plurality of representative positions of a single screen; a relative position calculation component which, for each color, causes an image signal of the single screen and the compensation table stored in the compensation table storage component to match scale, and calculates a relative position, with respect to the representative positions of the compensation table, of a pixel, of the image signal of the single screen, that is a processing object; a shading compensation value interpolation component which, for each color, interpolates a shading compensation value for the relative position calculated by the relative position calculation component on the basis of the relative position and the shading compensation values of the representative positions; and a shading compensation component which, for each color, generates a shading-compensated pixel signal based on a pixel signal of the pixel that is a processing object and the shading compensation value interpolated by the shading compensation value interpolation component.
- the shading compensation device, shading compensation value calculation device and imaging device relating to the present invention are capable of compensating for shading which is different for respective colors.
- FIG. 1 is a block diagram showing structure of an imaging device relating to a first embodiment of the present invention.
- FIG. 2 is a block diagram showing structure of a shading compensation section 22 .
- FIG. 3 is a diagram showing a co-ordinate position (500, 1000) in a single screen constituted by 2048 by 1536 pixels.
- FIG. 4 is a diagram showing image data of the single screen which has been compressed so as to correspond with a compensation value table.
- FIG. 5 is a diagram for explaining an interpolation calculation of a shading compensation value corresponding to a relative co-ordinate position.
- FIG. 6 is a diagram showing a state in which 8 ⁇ 8-pixel compensation value tables are magnified so as to correspond with 2048 ⁇ 1536-pixel image data.
- FIG. 7 is a diagram showing CCDij and OUTij.
- FIG. 8 is a block diagram showing structure of an imaging device relating to a second embodiment of the present invention.
- FIG. 9 is a flowchart showing a shading correction value generation routine.
- FIG. 10A is a diagram describing image data which is composed of 2048 by 1536 pixels of three primary colors.
- FIG. 10B is a diagram describing aggregate values (summed values Rji) of R data for respective regions, in a case in which the image data is divided into 8 by 8 regions.
- FIG. 11 is a diagram showing R′ji for respective regions in the case in which the image data is divided into 8 by 8 regions.
- FIG. 12 is a diagram showing levels of R, G and B data after white balance adjustment.
- FIG. 13 is a diagram showing RHji for respective regions in the case in which the image data is divided into 8 by 8 regions.
- FIG. 14 is a diagram showing levels of R, G and B data after white balance adjustment.
- FIG. 15 is a block diagram showing structure of a shading compensation section 22 A.
- FIG. 16 is a diagram for explaining a lens shading compensation function.
- FIG. 17 is a block diagram showing structure of an imaging device relating to a fifth embodiment of the present invention.
- FIG. 1 is a block diagram showing structure of an imaging device relating to a first embodiment of the present invention.
- the imaging device is provided with a CCD image sensor 11 , an analog front end/timing generator (AFE/TG) 12 , a processing system 20 and an SDRAM 30 .
- the CCD image sensor 11 takes images of a subject.
- the AFE/TG 12 performs predetermined analog signal processing on image signals which have been generated by the CCD image sensor 11 and generates synchronous signals, the processing system 20 performs predetermined digital signal processing, and the SDRAM 30 stores image data.
- the CCD image sensor 11 generates image signals constituted of three primary colors in accordance with imaged light from the subject, and feeds these image signals to the AFE/TG 12 .
- the AFE/TG 12 applies correlated double sampling processing and analog/digital conversion to the image signals supplied from the CCD image sensor 11 .
- the AFE/TG 12 also generates vertical synchronizing signals and horizontal synchronizing signals, uses these synchronizing signals during the analog signal processing, and feeds the same to the CCD image sensor 11 . Then, the image data which has been converted to digital signals at the AFE/TG 12 is fed to the processing system 20 .
- the processing system 20 is provided with a black level correction section 21 , a shading compensation section 22 , a memory control section 23 and a CPU 24 .
- the black level correction section 21 corrects a black level of the image data.
- the shading compensation section 22 implements shading compensation of the image data.
- the memory control section 23 controls writing and reading of image data to and from the SDRAM 30 .
- the CPU 24 controls overall operations of the processing system 20 .
- the black level correction section 21 corrects the black level of the image data fed from the AFE/TG 12 so as to eliminate an offset due to dark current noise which is included in the image data, and feeds the black level-corrected image data to the shading compensation section 22 .
- the shading compensation section 22 performs shading compensation of the image data, and feeds the shading-compensated image data to the memory control section 23 .
- the memory control section 23 writes the image data which has been fed from the shading compensation section 22 to the SDRAM 30 .
- FIG. 2 is a block diagram showing structure of the shading compensation section 22 .
- the shading compensation section 22 implements shading correction of each pixel for each of the colors red (R), green (G) and blue (B).
- the shading compensation section 22 is provided with a pixel counter 41 , a relative position calculation section 42 , a compensation value table memory 43 , a compensation value interpolation section 44 and a multiplier 45 .
- the pixel counter 41 counts pixels.
- the relative position calculation section 42 calculates relative co-ordinate positions.
- the compensation value table memory 43 stores compensation value tables.
- the compensation value interpolation section 44 interpolates to calculate shading compensation values for the relative co-ordinate positions.
- the multiplier 45 multiplies pixel data of pixels of the image data which are processing objects by shading compensation data.
- the pixel counter 41 counts co-ordinate positions of the image data which is fed from the black level correction section 21 .
- the relative position calculation section 42 compares the image data with the compensation value tables, and calculates relative co-ordinate positions of the pixels which are processing objects.
- Compensation value tables for the respective colors R, G and B are stored at the compensation value table memory 43 .
- the compensation value tables are tables in which, for a case in which a single screen is divided into a plurality of regions, respective shading compensation values are associated with representative positions of the respective regions.
- the compensation value tables associate shading compensation values with co-ordinate positions of respective pixels for a case in which the single screen is divided as 8 by 8 pixels.
- the compensation value tables are constituted with 8 by 8 pixels.
- by associating the shading compensation values with the respective regions in the case in which the single screen is divided into only 64 regions, the compensation value tables can greatly reduce the amount of data in comparison with a case which features shading compensation values for individual pixels.
- the compensation value interpolation section 44 refers to the compensation value tables stored in the compensation value table memory 43 for each color, and interpolates shading compensation values for the relative co-ordinate positions calculated by the relative position calculation section 42 .
- the multiplier 45 generates the shading-compensated image data by multiplying pixel data which is the object of processing with the new shading compensation values obtained by the interpolation processing.
- the imaging device structured as described above performs shading compensation of a red pixel which is a processing object (the co-ordinate position (500, 1000)) as follows.
- FIG. 3 is a diagram showing the co-ordinate position (500, 1000) of the single screen constituted by 2048 by 1536 pixels.
- the pixel counter 41 shown in FIG. 2 counts co-ordinate positions of the image data and obtains the co-ordinate position (500, 1000) of the pixel that is the processing object.
- the relative position calculation section 42 calculates the relative co-ordinate position of the pixel which is the processing object with reference to numbers of vertical and horizontal pixels of the compensation value tables. That is, the relative position calculation section 42 calculates a relative co-ordinate position of the pixel which is the processing object, corresponding to a case in which the vertical and horizontal numbers of pixels of the image data of the single screen are shrunk so as to match with the vertical and horizontal numbers of pixels of a compensation value table.
- FIG. 4 is a diagram showing image data of the single screen which has been shrunk so as to correspond with the compensation value table.
- the relative co-ordinate position of the co-ordinate position (500, 1000) in the 2048 ⁇ 1536-pixel single screen becomes (1.953125, 5.208). That is, the relative position calculation section 42 calculates the relative co-ordinate position (1.953125, 5.208).
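The relative co-ordinate calculation above can be sketched as follows; the linear scaling is inferred from the worked example, in which the pixel (500, 1000) of a 2048 by 1536 screen maps to (1.953125, 5.208) on the 8 by 8 table.

```python
def relative_position(x, y, width=2048, height=1536, table_w=8, table_h=8):
    """Map a pixel co-ordinate onto the 8-by-8 compensation value table grid,
    as if the single screen were shrunk to the table's dimensions."""
    return (x / width * table_w, y / height * table_h)

rx, ry = relative_position(500, 1000)
print(rx, round(ry, 3))  # 1.953125 5.208
```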
- the compensation value interpolation section 44 refers to the compensation value table stored in the compensation value table memory 43 , and calculates a shading compensation value corresponding to the relative co-ordinate position by interpolation.
- FIG. 5 is a diagram for explaining the interpolation calculation of the shading compensation value corresponding to the relative co-ordinate position.
- the compensation value interpolation section 44 utilizes, for example, a spline function, which is a function without discontinuities, and sixteen shading compensation values peripheral to the relative co-ordinate position to interpolate the shading compensation value corresponding to the relative co-ordinate position.
- the compensation value interpolation section 44 may extrapolate by using the shading compensation values at the edge of the compensation value table unaltered, and thus acquire sixteen shading compensation values peripheral to the relative co-ordinate position. Further, the interpolation calculation of the shading compensation value corresponding to the relative co-ordinate position is not limited to the calculation described above.
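A sketch of such a sixteen-point interpolation is given below; a Catmull-Rom spline is assumed as the discontinuity-free function (the text does not name a specific spline), and table indices are clamped so that edge values are extended unaltered.

```python
def _catmull_rom(p0, p1, p2, p3, t):
    # One Catmull-Rom spline segment: smooth (no discontinuities) between
    # the two middle samples p1 and p2, for t in [0, 1].
    return 0.5 * (
        2 * p1
        + (p2 - p0) * t
        + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
        + (3 * p1 - 3 * p2 + p3 - p0) * t ** 3
    )

def interpolate(table, rx, ry):
    """Bicubic interpolation of a shading compensation value at the relative
    co-ordinate position (rx, ry), using the 16 table entries surrounding it.
    Indices are clamped so edge values of the table are repeated outside it,
    mirroring the extrapolation described above."""
    h, w = len(table), len(table[0])
    ix, iy = int(rx), int(ry)
    tx, ty = rx - ix, ry - iy
    def at(j, i):  # clamp indices to the table bounds
        return table[min(max(j, 0), h - 1)][min(max(i, 0), w - 1)]
    # Interpolate along x in four neighbouring rows, then along y.
    cols = [
        _catmull_rom(at(iy + dj, ix - 1), at(iy + dj, ix),
                     at(iy + dj, ix + 1), at(iy + dj, ix + 2), tx)
        for dj in (-1, 0, 1, 2)
    ]
    return _catmull_rom(cols[0], cols[1], cols[2], cols[3], ty)
```

For a flat table the result equals the common value, and a linearly varying table is reproduced exactly in the interior, which is the behaviour expected of a spline without discontinuities.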
- the multiplier 45 generates shading-compensated red pixel data by multiplying the image data of the aforementioned co-ordinate position (500, 1000) which is the processing object with the shading compensation value calculated by interpolation at the compensation value interpolation section 44 .
- the shading compensation is applied to all red pixels, and then the shading compensation is applied in a similar manner to all the blue and green pixels.
- the shading-compensated image data is stored to the SDRAM 30 via the memory control section 23 .
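Taken together, the per-channel pass (relative position calculation, interpolation at the compensation value interpolation section 44, then the multiplier 45) can be sketched as below; bilinear interpolation over four table entries is used here for brevity, whereas the embodiment above interpolates over sixteen entries with a spline.

```python
def shade_compensate_channel(pixels, table):
    """Apply table-based shading compensation to one colour channel.

    pixels: 2-D list of pixel values for one colour; table: e.g. an 8x8
    compensation value table for that colour.  Bilinear interpolation is
    used here for brevity; the embodiment interpolates with a spline over
    sixteen surrounding table entries.
    """
    h, w = len(pixels), len(pixels[0])
    th, tw = len(table), len(table[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            rx, ry = x / w * tw, y / h * th           # relative position
            ix, iy = min(int(rx), tw - 2), min(int(ry), th - 2)
            tx, ty = rx - ix, ry - iy
            top = table[iy][ix] * (1 - tx) + table[iy][ix + 1] * tx
            bot = table[iy + 1][ix] * (1 - tx) + table[iy + 1][ix + 1] * tx
            value = top * (1 - ty) + bot * ty         # interpolated value
            out[y][x] = pixels[y][x] * value          # multiplier stage
    return out
```

Running this once per colour (R, then B and G) reproduces the overall flow of the shading compensation section 22.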
- the imaging device relating to the first embodiment of the present invention includes compensation value tables for each color in which respective shading compensation values are associated with representative positions of plural regions of a single screen.
- this imaging device can implement a shading compensation that would be difficult to express as a function simply, by referring to the compensation value tables and interpolating the shading compensation values for the co-ordinate positions that are processing objects.
- although this imaging device compresses the image data so as to fit the compensation value table, it is also possible to expand the compensation value table to fit the image data.
- FIG. 6 is a diagram showing a state in which 8 ⁇ 8-pixel compensation value tables are magnified so as to correspond with 2048 ⁇ 1536-pixel image data.
- the co-ordinate positions of the respective pixels of the compensation value tables are also magnified.
- magnified compensation data CFij of a co-ordinate position (i,j) in the 2048 by 1536 pixels is calculated by interpolation of the shading compensation values of the magnified compensation value tables.
- this imaging device may magnify the compensation value tables to correspond with the image data.
- interpolation calculations of the shading compensation values for the pixels that are processing objects are implemented, and shading compensation of the pixel data of the pixels that are processing objects can be implemented in accordance with these shading compensation values.
- FIG. 8 is a block diagram showing structure of an imaging device relating to the second embodiment.
- the imaging device relating to the second embodiment calculates red and blue shading compensation values, and is provided with a processing system 20 A with a structure which differs from that of the first embodiment.
- the processing system 20 A is further provided with an auto white balance (AWB) adjustment section 25 and a memory control section 26 .
- the AWB adjustment section 25 both implements automatic white balance adjustment and calculates shading compensation values.
- the memory control section 26 writes image data to the SDRAM 30 after the white balance adjustment.
- the imaging device which is structured as described above calculates shading compensation values in accordance with the following procedure when imaging a gray proofing plate.
- FIG. 9 is a flowchart showing a shading correction value generation routine.
- the AWB adjustment section 25 of the imaging device executes the following shading correction value generation routine in accordance with image data supplied via the black level correction section 21 and the shading compensation section 22 .
- the shading compensation section 22 simply passes through the image data supplied thereto from the black level correction section 21 without performing shading compensation.
- in step S1, the AWB adjustment section 25 divides image data constituted of, for example, 2048 by 1536 pixels of three primary colors into 8 by 8 regions, calculates aggregate values Rji, Gji and Bji of the colors for each region, and then proceeds to step S2.
- FIG. 10A is a diagram describing 2048 by 1536 pixels of R data
- FIG. 10B is a diagram describing aggregate values (summed values Rji) of R data for respective regions, when the image data has been divided into 8 by 8 regions.
- the summed values Rji are obtained by summing values of 256 by 192 pixels of R data for each region.
- R00 represents the top-left region, or the aggregate value thereof.
- the regions are represented as R01, R02, . . . , R07 moving toward the right, and as R10, R20, . . . , R70 moving toward the bottom.
- the AWB adjustment section 25 also calculates summed values Gji and Bji for the respective regions in a similar manner.
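Step S1 can be sketched as follows for one colour channel; with 2048 by 1536 pixels and 8 by 8 regions, each region covers 256 by 192 pixels, as stated above.

```python
def aggregate_regions(channel, regions_x=8, regions_y=8):
    """Sum one colour channel over a grid of regions (step S1).

    channel: 2-D list of values for one colour, e.g. 1536 rows of 2048
    R values; the entry [j][i] of the result is the summed value Rji."""
    h, w = len(channel), len(channel[0])
    rh, rw = h // regions_y, w // regions_x  # region size, e.g. 192 x 256
    return [
        [
            sum(channel[y][x]
                for y in range(j * rh, (j + 1) * rh)
                for x in range(i * rw, (i + 1) * rw))
            for i in range(regions_x)
        ]
        for j in range(regions_y)
    ]
```

The same routine applied to the G and B channels yields the summed values Gji and Bji.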
- in step S2, the AWB adjustment section 25 calculates averages of the aggregate values of predetermined regions which are at a central portion of the screen, for each of R, G and B.
- the AWB adjustment section 25 calculates an average value Rcenter of, for example, the summed values R 33 , R 34 , R 43 and R 44 of four regions at the central portion of the screen.
- the AWB adjustment section 25 calculates average values of the aggregate values of the four regions at the screen central portion. That is, the AWB adjustment section 25 performs the following calculations, and then proceeds to step S 3 .
- Rcenter = (R33 + R34 + R43 + R44)/4
- Gcenter = (G33 + G34 + G43 + G44)/4
- Bcenter = (B33 + B34 + B43 + B44)/4
- in step S3, the AWB adjustment section 25 calculates the respective gains Rgain and Bgain of R and B, and then proceeds to step S4.
- in step S4, the AWB adjustment section 25 multiplies the respective gains Rgain and Bgain with the summed values Rji and Bji, to calculate R′ji and B′ji, and then proceeds to step S5.
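Steps S2 through S4 can be sketched as below; the gain formulas Rgain = Gcenter/Rcenter and Bgain = Gcenter/Bcenter are assumptions (they are not reproduced in the text above), chosen because they make the R, G and B levels coincide at the central portion of the screen.

```python
def white_balance_sums(R, G, B):
    """Steps S2-S4: centre averages, R/B gains, and white-balanced sums.

    R, G, B are 8x8 lists of region sums (Rji, Gji, Bji).  The gain
    formulas Rgain = Gcenter / Rcenter and Bgain = Gcenter / Bcenter are
    assumed, since they make the centre levels coincide.
    """
    def center(T):  # average of the four central regions (step S2)
        return (T[3][3] + T[3][4] + T[4][3] + T[4][4]) / 4
    Rc, Gc, Bc = center(R), center(G), center(B)
    Rgain, Bgain = Gc / Rc, Gc / Bc                    # step S3 (assumed)
    Rp = [[Rgain * v for v in row] for row in R]       # R'ji  (step S4)
    Bp = [[Bgain * v for v in row] for row in B]       # B'ji
    return Rp, Bp
```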
- FIG. 11 is a diagram showing R′ji for respective regions in the case in which the image data is divided into 8 by 8 regions.
- the AWB adjustment section 25 performs white balance adjustment to adjust the R data and the B data of the whole screen by reference to the average values of the R, G and B data at the central portion of the image.
- the summed values Rji and Bji are provided for each of the eight by eight regions subsequent to white balance adjustment with the central portion of the image being the reference point.
- FIG. 12 is a diagram showing R, G and B data after the white balance adjustment. Because the white balance adjustment is performed with reference to the central portion of the image, levels of the R, G and B data at the central portion of the image substantially coincide. However, compared with the G data, the levels of the R and B data fall from the central portion toward the edges of the image. When there are such mismatches of the respective levels of the data, shading will occur in accordance with these mismatches.
- In step S5, the AWB adjustment section 25 performs the following calculations to find shading compensation values RHji and BHji.
- RHji = Gji/R′ji
- BHji = Gji/B′ji
- the AWB adjustment section 25 finds the mismatches of the summed values R′ji and B′ji with respect to the summed values Gji, in the form of the shading compensation values RHji and BHji.
- the values RHji and BHji mentioned above may be multiplied by, for example, a predetermined coefficient α (α≦1).
- Preferable values of α are, for example, 0.9, 0.8 and the like.
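Step S5 reduces to an element-wise ratio over the 8 by 8 tables, optionally damped by the coefficient mentioned above. The function name and table shapes in this sketch are illustrative.

```python
import numpy as np

# RHji = Gji / R'ji and BHji = Gji / B'ji, optionally scaled by a
# coefficient alpha (e.g. 0.9 or 0.8) to soften the compensation.
def compensation_tables(g_sums, r_adj, b_adj, alpha=1.0):
    return alpha * g_sums / r_adj, alpha * g_sums / b_adj
```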
- FIG. 13 is a diagram showing RHji for respective regions in the case in which the image data is divided into 8 by 8 regions. This corresponds to a compensation value table for R.
- a compensation value table for B is found by the AWB adjustment section 25 in a similar manner.
- the imaging device relating to the second embodiment can obtain compensation value tables for R and B by finding offsets of the summed values R′ji and B′ji relative to the summed values Gji after the white balance adjustment, to serve as shading compensation values.
- the imaging device relating to the third embodiment calculates shading compensation values not only for R and B, but also for G.
- the imaging device relating to the present embodiment has a similar structure to the second embodiment, but a portion of the shading compensation value generation routine of the AWB adjustment section 25 is different.
- the AWB adjustment section 25 may perform the following calculations, so as to find shading compensation values RHji, GHji and BHji.
- RHji = Gcenter/R′ji
- GHji = Gcenter/Gji
- BHji = Gcenter/B′ji
- FIG. 14 is a diagram showing levels of R, G and B data after white balance adjustment.
- the level of the G data falls slightly from the central portion toward the edges of the image.
- the AWB adjustment section 25 finds the mismatches of the summed values R′ji, Gji and B′ji with respect to Gcenter, the average of the summed values Gji at the central portion, in the form of the shading compensation values RHji, GHji and BHji.
- compensation value tables are found for R, G and B.
- the values RHji, GHji and BHji described above may be multiplied by, for example, a predetermined coefficient α (α≦1).
- Preferable values of α are, for example, 0.9, 0.8 and the like.
- by finding, as shading compensation values, the offsets of the summed values R′ji, Gji and B′ji relative to Gcenter after the white balance adjustment, the imaging device relating to the third embodiment can obtain compensation value tables which enable more accurate compensation than in the second embodiment.
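The third-embodiment variant can be sketched in the same way, with every table referenced to the central G average; the exact formulas here are an assumption consistent with the description above.

```python
import numpy as np

# Assumed formulas: RHji = Gcenter / R'ji, GHji = Gcenter / Gji,
# BHji = Gcenter / B'ji, so G shading that falls off toward the image
# edges is also compensated, unlike in the second embodiment.
def compensation_tables_g(g_center, r_adj, g_sums, b_adj, alpha=1.0):
    return (alpha * g_center / r_adj,
            alpha * g_center / g_sums,
            alpha * g_center / b_adj)
```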
- the imaging device relating to the fourth embodiment is also capable of compensating for lens shading.
- the shading compensation values utilized in the first to third embodiments are referred to as “CCD shading compensation values”, and the compensation values for lens shading are referred to as “lens shading compensation values”.
- the imaging device relating to the present embodiment is provided with the shading compensation section 22 A as follows.
- FIG. 15 is a block diagram showing structure of the shading compensation section 22A.
- the shading compensation section 22 A is provided with a compensation value function calculation section 46 and a multiplier 47 .
- the compensation value function calculation section 46 utilizes a lens shading compensation function to calculate lens shading compensation values.
- the multiplier 47 multiplies the CCD shading compensation values with the lens shading compensation values.
- FIG. 16 is a diagram for explaining the lens shading compensation function.
- a distance d from a center serves as a variable, and the function is represented by, for example, the following equation.
- Lens shading compensation value = a×d^4 + b×d^2 + c (a, b and c being predetermined coefficients)
- the compensation value function calculation section 46 calculates the distance d from a center position to a co-ordinate position which has been counted by the pixel counter 41 , and calculates a lens shading compensation value by substitution of the distance d into the above equation.
- the multiplier 47 multiplies a CCD shading compensation value found by the compensation value interpolation section 44 with a lens shading compensation value found by the compensation value function calculation section 46 , and feeds the multiplied value to the multiplier 45 . Then, the multiplier 45 multiplies image data which is a processing object by the new shading compensation value found by the multiplier 47 . Thus, shading-compensated image data is generated.
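The fourth embodiment's per-pixel combination can be sketched as follows. The polynomial coefficients a, b and c are placeholders for whatever values a lens calibration would supply; the function names are illustrative.

```python
import math

# Lens shading compensation as a polynomial in the distance d from the
# image center (compensation value function calculation section 46),
# multiplied with the CCD shading compensation value (multiplier 47).
def lens_shading_value(x, y, cx, cy, a, b, c):
    d = math.hypot(x - cx, y - cy)          # distance from the center
    return a * d**4 + b * d**2 + c

def combined_compensation(ccd_value, x, y, cx, cy, a, b, c):
    # The combined factor is then applied to the pixel by multiplier 45.
    return ccd_value * lens_shading_value(x, y, cx, cy, a, b, c)
```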
- the imaging device relating to the fourth embodiment can simultaneously compensate not just for shading caused at the CCD image sensor 11 , but also for shading caused at a lens.
- FIG. 17 is a block diagram showing structure of an imaging device relating to the fifth embodiment.
- the imaging device relating to the fifth embodiment is capable of executing shading compensation properly even with a long-duration exposure at a time of image capture, and is provided with a processing system 20 B instead of the processing system 20 or processing system 20 A of the embodiments described above.
- the processing system 20 B is provided with a signal processing section 27 , a memory control section 28 and a memory control section 29 .
- the signal processing section 27 carries out predetermined digital signal processing on the shading-compensated image data.
- the memory control section 28 writes the white balance-adjusted or digital signal-processed image data to the SDRAM 30 .
- the memory control section 29 reads the image data from the SDRAM 30 and feeds the same to the black level correction section 21 .
- the black level correction section 21 and the shading compensation section 22 perform predetermined compensation processing at a time of reading of image data or at a time of signal processing.
- the imaging device structured as described above operates as follows for cases of image capture by usual exposure.
- the memory control section 29 feeds the image data supplied from the AFE/TG 12 to the black level correction section 21 .
- This image data is black level-corrected by the black level correction section 21 , and shading-compensated by the shading compensation section 22 .
- the shading compensation section 22 writes the shading-compensated image data to the SDRAM 30 , via the memory control section 23 , and stores the same at the AWB adjustment section 25 to serve as aggregate data for white balance adjustment.
- the memory control section 29 reads the shading-compensated image data from the SDRAM 30 and the aggregate data for white balance adjustment.
- the data which has been read by the memory control section 29 is provided through the black level correction section 21 and the shading compensation section 22 , and fed to the signal processing section 27 .
- the signal processing section 27 utilizes the aggregate data for white balance adjustment to perform white balance adjustment, and other digital signal processing, on the image data. Thereafter, the processed image data is written to the SDRAM 30 via the memory control section 28 .
- this imaging device saves the black level-corrected and shading-compensated image data to serve as the aggregate data for white balance adjustment at the time of image capture, and utilizes the aggregate data for white balance adjustment to perform the white balance adjustment at the time of signal processing. Thus, accuracy of the signal processing is raised.
- However, with a long-duration exposure, the black level gradually floats during the exposure, the black level cannot be corrected consistently, and black-floating may occur in the image data. Furthermore, if the shading compensation is performed on image data with black-floating, there is a problem in that the black level also floats in accordance with the compensation amounts.
- this imaging device performs processing as follows in a case of image capture with a long-duration exposure.
- the memory control section 29 writes the unprocessed image data supplied from the AFE/TG 12 to the SDRAM 30 . Meanwhile, the unprocessed image data is provided through the black level correction section 21 and the shading compensation section 22 , and fed to the AWB adjustment section 25 . Hence, at the AWB adjustment section 25 , the unprocessed image data is stored to serve as the aggregate data for white balance adjustment.
- the memory control section 29 reads the image data from the SDRAM 30 and the aggregate data for white balance adjustment.
- This image data is black level-corrected by the black level correction section 21 and shading-compensated by the shading compensation section 22 , and then fed to the signal processing section 27 .
- the black level correction section 21 can detect an accurate black level on the basis of the unprocessed image data which has been temporarily held at the SDRAM 30 , and can correct the black level with high accuracy.
- the signal processing section 27 utilizes the aggregate data for white balance adjustment, which has been stored at the AWB adjustment section 25 , to execute the white balance adjustment, and other predetermined processing, on the shading-compensated image data.
- the memory control section 28 writes the image data that has been processed by the signal processing section 27 to the SDRAM 30 .
- the imaging device relating to the fifth embodiment performs black level correction and shading compensation on image data from the image capture system at a time of image capture with usual exposure, but at a time of image capture with a long-duration exposure, temporarily stores the image data from the image capture system and performs the black level correction and shading compensation on the image data subsequent to the image capture.
- this imaging device can, at a time of image capture with a long-duration exposure, perform the black level correction and the shading compensation on image data in which a floating amount of black is settled. Therefore, this imaging device can perform excellent black level correction and shading compensation even on image data of a long-duration exposure.
Abstract
Description
- This application claims priority under 35 USC 119 from Japanese Patent Application No. 2004-323820, the disclosure of which is incorporated by reference herein.
- 1. Field of the Invention
- The present invention relates to a shading compensation device, a shading compensation value calculation device and an imaging device, and relates to, for example, an excellent shading compensation device, shading compensation value calculation device and imaging device which are employed at a time of compensating for shading of an image signal which is outputted from a high-pixel count image capture device.
- 2. Description of the Related Art
- In an image generated by a CCD image sensor, because of effects such as variations in transparency of a lens and the like, a so-called shading effect occurs, in which a central portion of the image is brighter than a peripheral portion. Accordingly, various techniques relating to shading compensation, in order to prevent the effects of such shading, have been disclosed.
- For example, CCD camera shading compensation circuits have been disclosed (see, for example, Japanese Patent Application Laid-Open (JP-A) No. 5-83622) which are formed such that shading does not vary when the aperture of a lens is altered. The shading compensation circuit recited in JP-A No. 5-83622, in response to the size of a lens aperture, designates an address of shading compensation data which has been written to a ROM beforehand, and alters shading compensation signals in accordance with shading compensation data read from the ROM.
- Further, distance detection devices for vehicles have been disclosed (see, for example, JP-A No. 6-273171) which prevent effects of shading of an imaging system and improve accuracy of detection of distances. The distance detection device for a vehicle recited in JP-A No. 6-273171 in advance has stored shading compensation data in a memory and, by multiplying the shading compensation data stored in the memory with data of respective pixels of a digital image, provides a shading-compensated digital image.
- Further yet, image processing devices have been disclosed (see, for example, JP-A No. 6-319042) which are capable of shading compensation which is not influenced by noise, surface texture of a proofing plate or the like. The image processing device recited in JP-A No. 6-319042 calculates average values of brightness data for each of a plurality of regions, infers brightnesses of the pixels in the regions, calculates shading compensation values on the basis of the inferred pixel brightnesses, and utilizes these shading compensation values to correct the brightness data.
- Because of increases in pixel numbers of solid-state image capture devices and reductions in cell sizes, there is a trend for shading amounts of respective colors to differ in accordance with exit pupil, aperture and the like. Hence, when shading amounts for respective colors are different, a problem arises in that colors differ between locations of a screen, and white balance is also affected. However, consideration has not been extended to differences in shading amounts for respective colors in any of the technologies recited in JP-A No. 5-83622, JP-A No. 6-273171 and JP-A No. 6-319042, and it has not been possible to solve the problems described above.
- The present invention is proposed in order to solve the problems described above, and an object of the present invention is to provide a shading compensation device, shading compensation value calculation device and imaging device which are capable of compensating for shading which differs for respective colors.
- A first aspect of the present invention provides a shading compensation device comprising: a compensation table storage component which stores, for each of colors, a compensation table in which shading compensation values are associated with each of a plurality of representative positions of a single screen; a relative position calculation component which, for each color, causes an image signal of the single screen and the compensation table stored in the compensation table storage component to match scale, and calculates a relative position, with respect to the representative positions of the compensation table, of a pixel, of the image signal of the single screen, that is a processing object; a shading compensation value interpolation component which, for each color, interpolates a shading compensation value for the relative position calculated by the relative position calculation component on the basis of the relative position and the shading compensation values of the representative positions; and a shading compensation component which, for each color, generates a shading-compensated pixel signal based on a pixel signal of the pixel that is a processing object and the shading compensation value interpolated by the shading compensation value interpolation component.
- Compensation value tables for the respective colors are stored at the compensation table storage component. The colors are not limited to three primary colors but may include other colors. The compensation value tables associate the shading compensation values with each of the plurality of representative positions of a single screen. A single screen may be a single frame, and may be a single field. With these compensation value tables, data volumes can be greatly reduced in comparison with cases featuring one shading compensation value for each pixel.
- The relative position calculation component causes the image signal of the single screen to correspond with the compensation tables stored in the compensation table storage component and, for pixels which are processing objects in the image signal of the single screen, calculates relative positions with respect to the representative positions of the compensation tables. That is, the relative position calculation component calculates one relative position for each pixel of the image signal.
- Now, when the compensation table is referred to, there is usually no shading compensation value that corresponds to the relative position. Accordingly, the shading compensation value interpolation component interpolates a shading compensation value for the relative position, on the basis of the relative position calculated by the relative position calculation component and the shading compensation values of the representative positions.
- Then, the shading compensation component generates shading-compensated image signals, for the respective colors, on the basis of pixel signals of the pixels which are processing objects and the shading compensation values interpolated by the shading compensation value interpolation component. That is, the shading compensation component implements shading compensation for each individual pixel.
- Thus, the shading compensation device relating to the present invention, by utilizing the compensation tables which associate the shading compensation values with each of the plurality of representative positions of the single screen to perform shading compensation, is capable of compensating for shading which differs for the respective colors, while greatly reducing an amount of data that must be prepared beforehand.
- Herein, the shading compensation value interpolation component may utilize the shading compensation values of a predetermined number of the representative positions, which are peripheral to a relative position calculated by the relative position calculation component, for interpolating the shading compensation value for the relative position.
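One common way to realize this interpolation from the representative positions surrounding a relative position is bilinear interpolation over the four nearest table entries. The lookup convention below is an assumption for illustration, not the patent's circuit.

```python
# Bilinear interpolation of a shading compensation value at a fractional
# relative position (u, v) within an n x n compensation table.
def interpolate(table, u, v):
    i, j = int(u), int(v)                   # representative position above-left
    fu, fv = u - i, v - j                   # fractional offsets
    i2 = min(i + 1, len(table) - 1)         # clamp at the table edge
    j2 = min(j + 1, len(table[0]) - 1)
    top = table[i][j] * (1 - fv) + table[i][j2] * fv
    bot = table[i2][j] * (1 - fv) + table[i2][j2] * fv
    return top * (1 - fu) + bot * fu
```

With a 2048 by 1536 image and an 8 by 8 table, `(u, v)` would be the pixel's co-ordinates scaled down into table units before lookup.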
- Further, the shading compensation device described above may further include a lens shading compensation value calculation component, which calculates lens shading compensation values on the basis of distances from a central position of the screen to the pixels that are processing objects. In such a case, the shading compensation component also utilizes the lens shading compensation values calculated by the lens shading compensation value calculation component for generating the shading-compensated pixel signals. Thus, shading caused at a lens can also be compensated for.
- A second aspect of the present invention provides an imaging device comprising: an image capture component, which takes an image of a subject and generates an image signal; a storage component, which stores the image signal; a black level correction component, which corrects a black level of the image signal; a shading compensation device, which implements shading compensation of the image signal of which the black level has been corrected by the black level correction component; and a control component which performs control for, in a case of a usual exposure mode, supplying the image signal generated by the image capture component to the black level correction component, and in a case of a long-duration exposure mode, writing the image signal generated by the image capture component to the storage component and, after image capture, reading the image signal stored at the storage component and supplying the same to the black level correction component, wherein the shading compensation device includes: a compensation table storage component which stores, for each of colors, a compensation table in which shading compensation values are associated with each of a plurality of representative positions of a single screen; a relative position calculation component which, for each color, causes the image signal of the single screen and the compensation table stored in the compensation table storage component to match scale, and calculates a relative position, with respect to the representative positions of the compensation table, of a pixel, of the image signal of the single screen, that is a processing object; a shading compensation value interpolation component which, for each color, interpolates a shading compensation value for the relative position calculated by the relative position calculation component on the basis of the relative position and the shading compensation values of the representative positions; and a shading compensation component which, for each 
color, generates a shading-compensated pixel signal based on a pixel signal of the pixel that is a processing object and the shading compensation value interpolated by the shading compensation value interpolation component.
- At times of the usual exposure mode, the control component feeds the image signal generated by the image capture component to the black level correction component. Thus, in the case of the usual exposure mode, the image signals generated by the image capture component are black level-corrected, and thereafter shading-compensated.
- At times of the long-duration exposure mode, the control component writes the image signals generated by the image capture component to the storage component and, after image capture, reads out the image signals stored at the storage component and feeds the same to the black level correction component. Thus, in the case of the long-duration exposure mode, the image signals generated by the image capture component are temporarily written to the storage component. Then, after the image capture, the image signals which have been read from the storage component are black level-corrected, and thereafter shading-compensated. Therefore, the black level correction component reliably detects the black level without time-variations, and performs black level correction of the image signals with high accuracy. As a result, the shading compensation device can perform shading compensation without black level floating.
- Thus, even when in the long-duration exposure mode, the imaging device described above can perform shading compensation without black level floating.
- A third aspect of the present invention provides a shading compensation value calculation device comprising: an image capture element, which takes an image of a subject and generates an image signal; an aggregation component which, for each of colors, aggregates color signals, which are included in the image signal generated by the image capture element, at each of a plurality of regions of a single screen, for calculating aggregate values; a white balance adjustment component which, on the basis of the aggregate values of each color calculated by the aggregation component, performs white balance adjustment for an overall region of the single screen with reference to a level of a reference color signal and a level of another color signal at a central portion of the single screen; and a shading compensation value calculation component, which calculates shading compensation values on the basis of the aggregate values of the respective regions of the single screen, which have been white balance-adjusted by the white balance adjustment component.
- The aggregation component aggregates color signals, which constitute the image signal generated by the image capture device, for each color at each of the plurality of regions of the single screen, to calculate aggregate values. The plural regions referred to here correspond, respectively, to the plurality of representative positions of the compensation value tables mentioned in the above.
- On the basis of the aggregate values for the respective colors which have been aggregated by the aggregation component, the white balance adjustment component implements white balance adjustment for the overall region of the single screen by reference to the level of the reference color signal and the levels of other color signals at the central portion of the single screen. Hence, although the levels of the respective color signals substantially coincide at the central portion of the screen, these levels may not coincide at end portions of the screen if shading occurs.
- Accordingly, the shading compensation value calculation component calculates shading compensation values on the basis of the aggregated values of the respective regions of the single screen which have been white balance-adjusted by the white balance adjustment component.
- Thus, the shading compensation value calculation component relating to the present invention, by calculating shading compensation values in order to compensate for shading at respective regions of a single screen, is capable of providing shading compensation values with a reduced amount of data.
- Herein, the shading compensation value calculation component may calculate, for each of the regions of the single screen, a shading compensation value of another color signal on the basis of comparison of the aggregate value of the reference color signal with the aggregate value of the other color signal.
- Furthermore, the shading compensation value calculation component may, for each of the regions of the single screen, calculate a shading compensation value of another color signal on the basis of comparison of the aggregate value of the reference color signal at the screen central portion with the aggregate value of the other color signal, and calculate a shading compensation value of the reference color signal on the basis of comparison of the aggregate value of the reference color signal at the screen central portion with the aggregate value of the reference color signal at the each region.
- A fourth aspect of the present invention provides an imaging device comprising: a shading compensation device; and a shading compensation value calculation device, wherein the shading compensation device includes: a compensation table storage component which stores, for each of colors, a compensation table in which shading compensation values are associated with each of a plurality of representative positions of a single screen; a relative position calculation component which, for each color, causes an image signal of the single screen and the compensation table stored in the compensation table storage component to match scale, and calculates a relative position, with respect to the representative positions of the compensation table, of a pixel, of the image signal of the single screen, that is a processing object; a shading compensation value interpolation component which, for each color, interpolates a shading compensation value for the relative position calculated by the relative position calculation component on the basis of the relative position and the shading compensation values of the representative positions; and a shading compensation component which, for each color, generates a shading-compensated pixel signal based on a pixel signal of the pixel that is a processing object and the shading compensation value interpolated by the shading compensation value interpolation component, the shading compensation value calculation device includes: an image capture element, which takes an image of a subject and generates the image signal; an aggregation component which, for each color, aggregates color signals, which are included in the image signal generated by the image capture element, at each of a plurality of regions of the single screen, for calculating aggregate values; a white balance adjustment component which, on the basis of the aggregate values of each color calculated by the aggregation component, performs white balance adjustment for an overall 
region of the single screen with reference to a level of a reference color signal and a level of another color signal at a central portion of the single screen; and a shading compensation value calculation component, which calculates the shading compensation values on the basis of the aggregate values of the respective regions of the single screen, which have been white balance-adjusted by the white balance adjustment component, and the compensation table storage component stores the shading compensation values of each color which have been calculated by the shading compensation value calculation device to serve as the compensation value tables.
- The shading compensation device, shading compensation value calculation device and imaging device relating to the present invention are capable of compensating for shading which is different for respective colors.
-
FIG. 1 is a block diagram showing structure of an imaging device relating to a first embodiment of the present invention. -
FIG. 2 is a block diagram showing structure of ashading compensation section 22. -
FIG. 3 is a diagram showing a co-ordinate position (500, 1000) in a single screen constituted by 2048 by 1536 pixels. -
FIG. 4 is a diagram showing image data of the single screen which has been compressed so as to correspond with a compensation value table. -
FIG. 5 is a diagram for explaining an interpolation calculation of a shading compensation value corresponding to a relative co-ordinate position. -
FIG. 6 is a diagram showing a state in which 8×8-pixel compensation value tables are magnified so as to correspond with 2048×1536-pixel image data. -
FIG. 7 is a diagram showing CCDij and OUTij. -
FIG. 8 is a block diagram showing structure of an imaging device relating to a second embodiment of the present invention. -
FIG. 9 is a flowchart showing a shading correction value generation routine. -
FIG. 10A is a diagram describing image data which is composed of 2048 by 1536 pixels of three primary colors. -
FIG. 10B is a diagram describing aggregate values (summed values Rji) of R data for respective regions, in a case in which the image data is divided into 8 by 8 regions. -
FIG. 11 is a diagram showing Rji for respective regions in the case in which the image data is divided into 8 by 8 regions. -
FIG. 12 is a diagram showing levels of R, G and B data after white balance adjustment. -
FIG. 13 is a diagram showing RHji for respective regions in the case in which the image data is divided into 8 by 8 regions. -
FIG. 14 is a diagram showing levels of R, G and B data after white balance adjustment. -
FIG. 15 is a block diagram showing structure of ashading compensation section 22A. -
FIG. 16 is a diagram for explaining a lens shading compensation function. -
FIG. 17 is a block diagram showing structure of an imaging device relating to a fifth embodiment of the present invention. - Herebelow, a best mode for carrying out the present invention will be described in detail with reference to the drawings.
-
FIG. 1 is a block diagram showing structure of an imaging device relating to a first embodiment of the present invention. The imaging device is provided with aCCD image sensor 11, an analog front end/timing generator (AFE/TG) 12, aprocessing system 20 and anSDRAM 30. TheCCD image sensor 11 takes images of a subject. The AFE/TG 12 performs predetermined analog signal processing on image signals which have been generated by theCCD image sensor 11 and generates synchronous signals, theprocessing system 20 performs predetermined digital signal processing, and theSDRAM 30 stores image data. - The
CCD image sensor 11 generates image signals constituted of three primary colors in accordance with imaged light from the subject, and feeds these image signals to the AFE/TG 12. The AFE/TG 12 applies correlated double sampling processing and analog/digital conversion to the image signals supplied from theCCD image sensor 11. The AFE/TG 12 also generates vertically synchronized signals and horizontally synchronized signals, uses these synchronous signals during the analog signal processing, and feeds the same to theCCD image sensor 11. Then, the image data which has been converted to digital signals at the AFE/TG 12 is fed to theprocessing system 20. - As shown in
FIG. 1, the processing system 20 is provided with a black level correction section 21, a shading compensation section 22, a memory control section 23 and a CPU 24. The black level correction section 21 corrects a black level of the image data. The shading compensation section 22 implements shading compensation of the image data. The memory control section 23 controls writing and reading of image data to and from the SDRAM 30. The CPU 24 controls overall operations of the processing system 20. - The black
level correction section 21 corrects the black level of the image data fed from the AFE/TG 12 so as to eliminate an offset due to dark current noise which is included in the image data, and feeds the black level-corrected image data to the shading compensation section 22. - The
shading compensation section 22 performs shading compensation of the image data, and feeds the shading-compensated image data to the memory control section 23. Herein, detailed structure of the shading compensation section 22 will be described later. Then, the memory control section 23 writes the image data which has been fed from the shading compensation section 22 to the SDRAM 30. -
FIG. 2 is a block diagram showing structure of the shading compensation section 22. The shading compensation section 22 implements shading correction of each pixel for each of the colors red (R), green (G) and blue (B). - The
shading compensation section 22 is provided with a pixel counter 41, a relative position calculation section 42, a compensation value table memory 43, a compensation value interpolation section 44 and a multiplier 45. The pixel counter 41 counts pixels. The relative position calculation section 42 calculates relative co-ordinate positions. The compensation value table memory 43 stores compensation value tables. The compensation value interpolation section 44 interpolates to calculate shading compensation values for the relative co-ordinate positions. The multiplier 45 multiplies pixel data of pixels of the image data which are processing objects by shading compensation data. - The pixel counter 41 counts co-ordinate positions of the image data which is fed from the black
level correction section 21. The relative position calculation section 42 compares the image data with the compensation value tables, and calculates relative co-ordinate positions of the pixels which are processing objects. - Compensation value tables for the respective colors R, G and B are stored at the compensation
value table memory 43. The compensation value tables are tables in which, for a case in which a single screen is divided into a plurality of regions, respective shading compensation values are associated with representative positions of the respective regions. Here, in the present embodiment, the compensation value tables associate shading compensation values with co-ordinate positions of respective pixels for a case in which the single screen is divided as 8 by 8 pixels. In other words, the compensation value tables are constituted with 8 by 8 pixels. - Thus, the compensation value tables, by associating the shading compensation values with the respective regions for the case in which the single screen is divided into a mere 64 regions, can greatly reduce an amount of data in comparison with a case which features shading compensation values for individual pixels.
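The storage saving can be made concrete with a little arithmetic. The figures below use the 2048 by 1536 pixel, three-color example from this description; the variable names are illustrative:

```python
# Illustrative arithmetic: entries held for 8 x 8 per-region tables (one table
# per color) versus one compensation value per pixel of a 2048 x 1536 screen.
REGIONS_PER_SIDE = 8
COLORS = 3

table_entries = REGIONS_PER_SIDE * REGIONS_PER_SIDE * COLORS  # 64 regions x 3 colors
per_pixel_entries = 2048 * 1536 * COLORS                      # every pixel, every color

print(table_entries, per_pixel_entries, per_pixel_entries // table_entries)
```

The per-region tables hold 192 values in place of over nine million, a reduction by a factor of tens of thousands.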
- The compensation
value interpolation section 44 refers to the compensation value tables stored in the compensation value table memory 43 for each color, and interpolates shading compensation values for the relative co-ordinate positions calculated by the relative position calculation section 42. The multiplier 45 generates the shading-compensated image data by multiplying pixel data which is the object of processing with the new shading compensation values obtained by the interpolation processing. - For image data in which, for example, the single screen is constituted by 2048 by 1536 pixels, the imaging device structured as described above performs shading compensation of a red pixel which is a processing object (the co-ordinate position (500, 1000)) as follows.
-
FIG. 3 is a diagram showing the co-ordinate position (500, 1000) of the single screen constituted by 2048 by 1536 pixels. The pixel counter 41 shown in FIG. 2 counts co-ordinate positions of the image data and obtains the co-ordinate position (500, 1000) of the pixel that is the processing object. - The relative
position calculation section 42 calculates the relative co-ordinate position of the pixel which is the processing object with reference to numbers of vertical and horizontal pixels of the compensation value tables. That is, the relative position calculation section 42 calculates a relative co-ordinate position of the pixel which is the processing object, corresponding to a case in which the vertical and horizontal numbers of pixels of the image data of the single screen are shrunk so as to match with the vertical and horizontal numbers of pixels of a compensation value table. -
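The shrinking described above amounts to scaling the pixel co-ordinate by the ratio of table size to screen size. A minimal sketch, assuming the 2048 by 1536 screen and 8 by 8 table of the example (the function name is hypothetical):

```python
def relative_position(x, y, width=2048, height=1536, table_w=8, table_h=8):
    # Scale the pixel co-ordinate down to table co-ordinates: for these
    # dimensions, 1/256 horizontally and 1/192 vertically.
    return x * table_w / width, y * table_h / height

rel_x, rel_y = relative_position(500, 1000)  # the example pixel in the text
```

For the pixel (500, 1000) this yields (1.953125, 5.2083…), matching the relative co-ordinate position quoted for FIG. 4.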
FIG. 4 is a diagram showing image data of the single screen which has been shrunk so as to correspond with the compensation value table. With horizontal compression of 1/256 and vertical compression of 1/192, the relative co-ordinate position of the co-ordinate position (500, 1000) in the 2048×1536-pixel single screen becomes (1.953125, 5.208). That is, the relative position calculation section 42 calculates the relative co-ordinate position (1.953125, 5.208). - The compensation
value interpolation section 44 refers to the compensation value table stored in the compensation value table memory 43, and calculates a shading compensation value corresponding to the relative co-ordinate position by interpolation. -
FIG. 5 is a diagram for explaining the interpolation calculation of the shading compensation value corresponding to the relative co-ordinate position. The compensation value interpolation section 44 utilizes, for example, a spline function, which is a function without discontinuities, and sixteen shading compensation values peripheral to the relative co-ordinate position to interpolate the shading compensation value corresponding to the relative co-ordinate position. - If the relative co-ordinate position is at an edge of the compensation value table, it may not be possible to acquire the sixteen shading compensation values. In such a case, the compensation
value interpolation section 44 may replicate the shading compensation values at the edge of the compensation value table unaltered, and thus acquire sixteen shading compensation values peripheral to the relative co-ordinate position. Further, the interpolation calculation of the shading compensation value corresponding to the relative co-ordinate position is not limited to the calculation described above. - The
multiplier 45 generates shading-compensated red pixel data by multiplying the image data of the aforementioned co-ordinate position (500, 1000) which is the processing object with the shading compensation value calculated by interpolation at the compensation value interpolation section 44. - Similarly, the shading compensation is applied to all red pixels, and then the shading compensation is applied in a similar manner to all the blue and green pixels. Hence, the shading-compensated image data is stored to the
SDRAM 30 via the memory control section 23. - As described above, the imaging device relating to the first embodiment of the present invention includes compensation value tables for each color in which respective shading compensation values are associated with representative positions of plural regions of a single screen. Thus, in comparison with a case of including shading compensation values corresponding to all pixels, a data volume that must be held can be hugely reduced.
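The interpolation described for FIG. 5 — a discontinuity-free spline over the sixteen shading compensation values around the relative position, with edge values reused unaltered — can be sketched as follows. Catmull-Rom is an assumption on my part; the patent names only "a spline function", so this is one concrete reading rather than the patented implementation:

```python
def _cubic(p0, p1, p2, p3, t):
    # Catmull-Rom cubic through four equally spaced samples; returns p1 at t=0.
    return p1 + 0.5 * t * (p2 - p0 + t * (2 * p0 - 5 * p1 + 4 * p2 - p3
                                          + t * (3 * (p1 - p2) + p3 - p0)))

def interpolate(table, rx, ry):
    # Interpolate at relative position (rx, ry) from the 4 x 4 neighbourhood,
    # clamping indices so edge compensation values are reused unaltered.
    h, w = len(table), len(table[0])
    ix, iy = int(rx), int(ry)
    tx, ty = rx - ix, ry - iy
    clamp = lambda v, hi: max(0, min(v, hi))
    rows = []
    for dy in (-1, 0, 1, 2):
        samples = [table[clamp(iy + dy, h - 1)][clamp(ix + dx, w - 1)]
                   for dx in (-1, 0, 1, 2)]
        rows.append(_cubic(*samples, tx))
    return _cubic(*rows, ty)

# Flat 8 x 8 table as a sanity check: interpolation must reproduce the value,
# and the multiplier stage is then pixel_value x compensation_value.
table = [[1.2] * 8 for _ in range(8)]
value = interpolate(table, 1.953125, 5.208)
compensated = 100.0 * value
```

A flat table is a useful check because any reasonable spline must reproduce a constant exactly; on a real table the same routine yields smoothly varying compensation values between the region representatives.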
- Further, this imaging device can implement a shading compensation that would be difficult to express simply as a function, by referring to the compensation value tables and interpolating the shading compensation values for the co-ordinate positions that are processing objects.
- Note that although this imaging device compresses the image data so as to fit to the compensation value table, it is also possible to expand the compensation value table to fit to the image data.
-
FIG. 6 is a diagram showing a state in which 8×8-pixel compensation value tables are magnified so as to correspond with 2048×1536-pixel image data. In such a case, the co-ordinate positions of the respective pixels of the compensation value tables are also magnified. Hence, magnified compensation data CFij of a co-ordinate position (i,j) in the 2048 by 1536 pixels is calculated by interpolation of the shading compensation values of the magnified compensation value tables. -
FIG. 7 is a diagram showing CCDij and OUTij. If image data (CCD data) that is a processing object is ‘CCDij’ and the CCD data after compensation is ‘OUTij’, OUTij is found by the following equation.
OUTij=CCDij×CFij - As described above, this imaging device may magnify the compensation value tables to correspond with the image data. Hence, utilizing the shading compensation values of the magnified compensation value tables, interpolation calculations of the shading compensation values for the pixels that are processing objects are implemented, and shading compensation of the pixel data of the pixels that are processing objects can be implemented in accordance with these shading compensation values.
- Next, a second embodiment of the present invention will be described. Here, portions that are the same as in the first embodiment are assigned the same reference numerals, and detailed descriptions thereof are omitted.
-
FIG. 8 is a block diagram showing structure of an imaging device relating to the second embodiment. The imaging device relating to the second embodiment calculates red and blue shading compensation values, and is provided with a processing system 20A with a structure which differs from the first embodiment. - In addition to the various structures of the
processing system 20 shown in FIG. 1, the processing system 20A is further provided with an auto white balance (AWB) adjustment section 25 and a memory control section 26. The AWB adjustment section 25 both implements automatic white balance adjustment and calculates shading compensation values. The memory control section 26 writes image data to the SDRAM 30 after the white balance adjustment. - The imaging device which is structured as described above calculates shading compensation values in accordance with the following procedure when imaging a gray proofing plate.
-
FIG. 9 is a flowchart showing a shading correction value generation routine. The AWB adjustment section 25 of the imaging device executes the following shading correction value generation routine in accordance with image data supplied via the black level correction section 21 and the shading compensation section 22. Herein, when this routine is executed, the shading compensation section 22 simply passes the image data supplied thereto from the black level correction section 21 through unaltered, without performing shading compensation. - In step S1, the
AWB adjustment section 25 divides image data constituted of, for example, 2048 by 1536 pixels of three primary colors into 8 by 8 regions, calculates aggregate values of the colors Rji, Gji and Bji for each region, and then proceeds to step S2. -
FIG. 10A is a diagram describing 2048 by 1536 pixels of R data, and FIG. 10B is a diagram describing aggregate values (summed values Rji) of R data for respective regions, when the image data has been divided into 8 by 8 regions. In such a case, the summed values Rji are obtained by summing values of 256 by 192 pixels of R data for each region. Here, in FIG. 10B, R00 represents a top-left region or an aggregate value thereof. With R00 as an origin, the regions Rji are represented as R01, R02, . . . , R07 moving toward the right, and are represented as R10, R20, . . . , R70 moving toward the bottom. The AWB adjustment section 25 also calculates summed values Gji and Bji for the respective regions in a similar manner. - In step S2, the
AWB adjustment section 25 calculates averages of the aggregate values of predetermined regions which are at a central portion of the screen, for each of R, G and B. The AWB adjustment section 25 calculates an average value Rcenter of, for example, the summed values R33, R34, R43 and R44 of four regions at the central portion of the screen. Similarly for G and B, the AWB adjustment section 25 calculates average values of the aggregate values of the four regions at the screen central portion. That is, the AWB adjustment section 25 performs the following calculations, and then proceeds to step S3.
Rcenter = (R33 + R34 + R43 + R44)/4
Gcenter = (G33 + G34 + G43 + G44)/4
Bcenter = (B33 + B34 + B43 + B44)/4 - In step S3, the
AWB adjustment section 25 calculates respective gains of R and B, Rgain and Bgain. That is, the AWB adjustment section 25 performs the following calculations, and then proceeds to step S4.
Rgain = Gcenter/Rcenter
Bgain = Gcenter/Bcenter - In step S4, the
AWB adjustment section 25 multiplies the respective gains Rgain and Bgain with the summed values Rji and Bji, to calculate R′ji and B′ji. That is, the AWB adjustment section 25 performs the following calculations, and then proceeds to step S5.
R′ji = Rgain × Rji
B′ji = Bgain × Bji -
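Steps S2 through S4 can be sketched as follows on 8 by 8 grids of region sums. Nested lists stand in for whatever layout the hardware actually uses, and the function name is illustrative:

```python
def white_balance_regions(R, G, B):
    # Step S2: average the four central regions (3,3), (3,4), (4,3), (4,4).
    center = [(3, 3), (3, 4), (4, 3), (4, 4)]
    Rcenter = sum(R[j][i] for j, i in center) / 4
    Gcenter = sum(G[j][i] for j, i in center) / 4
    Bcenter = sum(B[j][i] for j, i in center) / 4
    # Step S3: gains referenced to green.
    Rgain, Bgain = Gcenter / Rcenter, Gcenter / Bcenter
    # Step S4: white balance-adjusted region sums R'ji and B'ji.
    Rp = [[Rgain * v for v in row] for row in R]
    Bp = [[Bgain * v for v in row] for row in B]
    return Rp, Bp, Gcenter

R = [[2.0] * 8 for _ in range(8)]
G = [[4.0] * 8 for _ in range(8)]
B = [[8.0] * 8 for _ in range(8)]
Rp, Bp, Gcenter = white_balance_regions(R, G, B)
```

With the uniform test grids above, the gains bring the red and blue sums up or down to the green level, which is exactly the behaviour the text describes for the image centre.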
FIG. 11 is a diagram showing R′ji for respective regions in the case in which the image data is divided into 8 by 8 regions. Thus, by performing the processing from step S1 to step S4, the AWB adjustment section 25 performs white balance adjustment to adjust the R data and the B data of the whole screen by reference to the average values of the R, G and B data at the central portion of the image. As a result, the summed values R′ji and B′ji are provided for each of the eight by eight regions subsequent to white balance adjustment with the central portion of the image being the reference point. -
FIG. 12 is a diagram showing R, G and B data after the white balance adjustment. Because the white balance adjustment is performed with reference to the central portion of the image, levels of the R, G and B data at the central portion of the image substantially coincide. However, compared with the G data, the levels of the R and B data fall from the central portion toward the edges of the image. When there are such mismatches of the respective levels of the data, shading will occur in accordance with these mismatches. - Accordingly, in step S5, the
AWB adjustment section 25 performs the following calculations to find shading compensation values RHji and BHji.
RHji = Gji/R′ji
BHji = Gji/B′ji - That is, in consideration of occurrences of shading in accordance with mismatches of the levels of the white balance-adjusted R, G and B data, after the white balance adjustment, the
AWB adjustment section 25 finds the mismatches of the summed values R′ji and B′ji with respect to the summed values Gji, in the form of the shading compensation values RHji and BHji. Herein, in order to prevent over-compensation, the values RHji and BHji mentioned above may be multiplied by, for example, a predetermined coefficient α (<1). Preferable values of α are, for example, 0.9, 0.8 and the like. -
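Step S5, including the optional damping coefficient the text mentions, can be sketched as follows (again on nested lists; α is applied literally as a multiplier, which is what the text states):

```python
def shading_tables(Gs, Rp, Bp, alpha=1.0):
    # RHji = Gji / R'ji and BHji = Gji / B'ji; the text suggests optionally
    # multiplying by a coefficient alpha < 1 (e.g. 0.9) against over-compensation.
    RH = [[alpha * g / r for g, r in zip(g_row, r_row)]
          for g_row, r_row in zip(Gs, Rp)]
    BH = [[alpha * g / b for g, b in zip(g_row, b_row)]
          for g_row, b_row in zip(Gs, Bp)]
    return RH, BH

# With perfectly matched data, every compensation value comes out as 1.0.
flat = [[4.0] * 8 for _ in range(8)]
RH, BH = shading_tables(flat, flat, flat)
```

On real data the ratios exceed 1 toward the image edges, where the R and B levels fall away from the G level, so multiplying by RHji/BHji lifts those regions back.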
FIG. 13 is a diagram showing RHji for respective regions in the case in which the image data is divided into 8 by 8 regions. This corresponds to a compensation value table for R. A compensation value table for B is found by the AWB adjustment section 25 in a similar manner. - As described above, the imaging device relating to the second embodiment can obtain compensation value tables for R and B by finding offsets of the summed values R′ji and B′ji relative to the summed values Gji after the white balance adjustment, to serve as shading compensation values.
- Next, a third embodiment of the present invention will be described. Here, portions that are the same as in the second embodiment are assigned the same reference numerals, and detailed descriptions thereof are omitted.
- The imaging device relating to the third embodiment calculates, as well as shading compensation values for R and B, shading compensation values for G. Here, the imaging device relating to the present embodiment has a similar structure to the second embodiment, but a portion of the shading compensation value generation routine of the
AWB adjustment section 25 is different. - Specifically, after performing the processing from step S1 to step S4 shown in
FIG. 9, the AWB adjustment section 25 may perform the following calculations, so as to find shading compensation values RHji, GHji and BHji.
RHji = Gcenter/R′ji
GHji = Gcenter/Gji
BHji = Gcenter/B′ji -
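The third-embodiment variant differs from step S5 of the second embodiment only in taking every offset against the screen-centre average Gcenter, which also yields a table for G. A sketch under the same nested-list assumption:

```python
def shading_tables_from_center(Gs, Rp, Bp, Gcenter):
    # RHji = Gcenter / R'ji, GHji = Gcenter / Gji, BHji = Gcenter / B'ji.
    RH = [[Gcenter / r for r in row] for row in Rp]
    GH = [[Gcenter / g for g in row] for row in Gs]
    BH = [[Gcenter / b for b in row] for row in Bp]
    return RH, GH, BH

flat = [[4.0] * 8 for _ in range(8)]
RH, GH, BH = shading_tables_from_center(flat, flat, flat, 4.0)
```

Because the G sums themselves fall slightly toward the edges, GHji rises above 1 there, which is what lets this variant correct the residual green shading the second embodiment leaves untouched.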
FIG. 14 is a diagram showing levels of R, G and B data after white balance adjustment. Although not to the same extent as the R and B data, the level of the G data falls slightly from the central portion toward the edges of the image. Accordingly, after the white balance adjustment, theAWB adjustment section 25 finds the mismatches of the summed values R′ji, Gji and Bji with respect to Gcenter of the central portion of the summed values Gji, in the form of the shading compensation values RHji, GHji and BHji. Thus, compensation value tables are found for R, G and B. - Here, in order to prevent over-compensation, the values RHji, GHji and BHji described above may be multiplied by, for example, a predetermined coefficient α (<1). Preferable values of α are, for example, 0.9, 0.8 and the like.
- As described above, the imaging device relating to the third embodiment can, by finding offsets of the summed values of R′ji, Gji and Bji relative to Gcenter of the central portion of the summed values Gji after the white balance adjustment to serve as shading compensation values, obtain compensation value tables which enable more accurate compensation than in the second embodiment.
- Next, a fourth embodiment of the present invention will be described. Here, portions that are the same as in the embodiments described above are assigned the same reference numerals, and detailed descriptions thereof are omitted.
- The imaging device relating to the fourth embodiment is also capable of compensating for lens shading. Hereafter, the shading compensation values utilized in the first to third embodiments are referred to as “CCD shading compensation values”, and the compensation values for lens shading are referred to as “lens shading compensation values”.
- Instead of the
shading compensation section 22 illustrated for the embodiments described above, the imaging device relating to the present embodiment is provided with the shading compensation section 22A as follows. -
FIG. 15 is a block diagram showing structure of the shading compensation section 22A. In addition to the structure shown in FIG. 2, the shading compensation section 22A is provided with a compensation value function calculation section 46 and a multiplier 47. The compensation value function calculation section 46 utilizes a lens shading compensation function to calculate lens shading compensation values. The multiplier 47 multiplies the CCD shading compensation values with the lens shading compensation values. -
FIG. 16 is a diagram for explaining the lens shading compensation function. In the lens shading compensation function, a distance d from a center serves as a variable, and the function is represented by, for example, the following equation.
Lens shading compensation value = α·d^4 + β·d^2 + γ - The compensation value
function calculation section 46 calculates the distance d from a center position to a co-ordinate position which has been counted by the pixel counter 41, and calculates a lens shading compensation value by substitution of the distance d into the above equation. - The
multiplier 47 multiplies a CCD shading compensation value found by the compensation value interpolation section 44 with a lens shading compensation value found by the compensation value function calculation section 46, and feeds the multiplied value to the multiplier 45. Then, the multiplier 45 multiplies image data which is a processing object by the new shading compensation value found by the multiplier 47. Thus, shading-compensated image data is generated. - As described above, the imaging device relating to the fourth embodiment can simultaneously compensate not just for shading caused at the
CCD image sensor 11, but also for shading caused at a lens. - Next, a fifth embodiment of the present invention will be described. Here, portions that are the same as in the embodiments described above are assigned the same reference numerals, and detailed descriptions thereof are omitted.
-
FIG. 17 is a block diagram showing structure of an imaging device relating to the fifth embodiment. The imaging device relating to the fifth embodiment is capable of executing shading compensation properly even with a long-duration exposure at a time of image capture, and is provided with a processing system 20B instead of the processing system 20 or processing system 20A of the embodiments described above. - In addition to the black
level correction section 21, the shading compensation section 22, the memory control section 23, the CPU 24 and the AWB adjustment section 25, the processing system 20B is provided with a signal processing section 27, a memory control section 28 and a memory control section 29. The signal processing section 27 carries out predetermined digital signal processing on the shading-compensated image data. The memory control section 28 writes the white balance-adjusted or digital signal-processed image data to the SDRAM 30. The memory control section 29 reads the image data from the SDRAM 30 and feeds the same to the black level correction section 21. Herein, the black level correction section 21 and the shading compensation section 22 perform predetermined compensation processing at a time of reading of image data or at a time of signal processing. - Times of Usual Exposure
- The imaging device structured as described above operates as follows for cases of image capture by usual exposure.
- At the time of image capture, the
memory control section 29 feeds the image data supplied from the AFE/TG 12 to the black level correction section 21. This image data is black level-corrected by the black level correction section 21, and shading-compensated by the shading compensation section 22. Then, the shading compensation section 22 writes the shading-compensated image data to the SDRAM 30, via the memory control section 23, and stores the same at the AWB adjustment section 25 to serve as aggregate data for white balance adjustment. - At the time of signal processing, the
memory control section 29 reads the shading-compensated image data from the SDRAM 30 and the aggregate data for white balance adjustment. The data which has been read by the memory control section 29 is provided through the black level correction section 21 and the shading compensation section 22, and fed to the signal processing section 27. The signal processing section 27 utilizes the aggregate data for white balance adjustment to perform white balance adjustment, and other digital signal processing, on the image data. Thereafter, the processed image data is written to the SDRAM 30 via the memory control section 28. - Thus, this imaging device saves the black level-corrected and shading-compensated image data to serve as the aggregate data for white balance adjustment at the time of image capture, and utilizes the aggregate data for white balance adjustment to perform the white balance adjustment at the time of signal processing. Thus, accuracy of the signal processing is raised.
- Now, if processing as described above is performed in a case of image capture with a long-duration exposure, the black level will gradually float during the exposure duration, so it will not be possible to correct the black level consistently, and there may be black-floating in the image data. Furthermore, if the shading compensation is performed on image data with black-floating, there may be a problem in that the black level also floats in accordance with the compensation amounts.
- Times of Long-Duration Exposure
- Accordingly, this imaging device performs processing as follows in a case of image capture with a long-duration exposure.
- The
memory control section 29 writes the unprocessed image data supplied from the AFE/TG 12 to the SDRAM 30. Meanwhile, the unprocessed image data is provided through the black level correction section 21 and the shading compensation section 22, and fed to the AWB adjustment section 25. Hence, at the AWB adjustment section 25, the unprocessed image data is stored to serve as the aggregate data for white balance adjustment. - At the time of signal processing, the
memory control section 29 reads the image data from the SDRAM 30 and the aggregate data for white balance adjustment. This image data is black level-corrected by the black level correction section 21 and shading-compensated by the shading compensation section 22, and then fed to the signal processing section 27. As a result, the black level correction section 21 can detect an accurate black level on the basis of the unprocessed image data which has been temporarily held at the SDRAM 30, and can correct the black level with high accuracy. Then, the signal processing section 27 utilizes the aggregate data for white balance adjustment, which has been stored at the AWB adjustment section 25, to execute the white balance adjustment, and other predetermined processing, on the shading-compensated image data. The memory control section 28 writes the image data that has been processed by the signal processing section 27 to the SDRAM 30. - As described above, the imaging device relating to the fifth embodiment performs black level correction and shading compensation on image data from the image capture system at a time of image capture with usual exposure, but at a time of image capture with a long-duration exposure, temporarily stores the image data from the image capture system and performs the black level correction and shading compensation on the image data subsequent to the image capture.
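The difference between the two capture modes is purely one of ordering, which can be sketched abstractly. The callables below are placeholders for the sections described above, not real APIs:

```python
def process_usual(raw, black_correct, shade_compensate, store):
    # Usual exposure: correct and compensate at capture time, store the result.
    corrected = shade_compensate(black_correct(raw))
    store(corrected)
    return corrected

def process_long_exposure(raw, black_correct, shade_compensate, store):
    # Long exposure: hold the unprocessed frame first, so a settled black level
    # can be measured later; correct and compensate at signal-processing time.
    store(raw)
    return shade_compensate(black_correct(raw))

calls = []
result = process_long_exposure(
    "frame",
    lambda d: (calls.append("black"), d)[1],
    lambda d: (calls.append("shade"), d)[1],
    lambda d: calls.append("store"),
)
```

Tracing the call order makes the point of the fifth embodiment visible: in the long-exposure path, storage happens before any correction touches the data.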
- Thus, this imaging device can, at a time of image capture with a long-duration exposure, perform the black level correction and the shading compensation on image data in which a floating amount of black is settled. Therefore, this imaging device can perform excellent black level correction and shading compensation even on image data of a long-duration exposure.
- Note that the present invention is not limited to the embodiments described above, and variations of design are possible within the scope recited in the patented claims. For example, details of the respective embodiments may be arbitrarily combined.
Claims (14)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/453,197 US7760252B2 (en) | 2004-11-08 | 2009-05-01 | Shading compensation device, shading compensation value calculation device and imaging device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004323820A JP4322781B2 (en) | 2004-11-08 | 2004-11-08 | Imaging device |
JP2004-323820 | 2004-11-08 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/453,197 Division US7760252B2 (en) | 2004-11-08 | 2009-05-01 | Shading compensation device, shading compensation value calculation device and imaging device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060098240A1 true US20060098240A1 (en) | 2006-05-11 |
Family
ID=36315984
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/265,291 Abandoned US20060098240A1 (en) | 2004-11-08 | 2005-11-03 | Shading compensation device, shading compensation value calculation device and imaging device |
US12/453,197 Expired - Fee Related US7760252B2 (en) | 2004-11-08 | 2009-05-01 | Shading compensation device, shading compensation value calculation device and imaging device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/453,197 Expired - Fee Related US7760252B2 (en) | 2004-11-08 | 2009-05-01 | Shading compensation device, shading compensation value calculation device and imaging device |
Country Status (2)
Country | Link |
---|---|
US (2) | US20060098240A1 (en) |
JP (1) | JP4322781B2 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080062292A1 (en) * | 2006-09-07 | 2008-03-13 | Canon Kabushiki Kaisha | Imaging apparatus configured to correct noise |
US20080181596A1 (en) * | 2007-01-25 | 2008-07-31 | Marc Drader | Handheld Electronic Device and Camera Providing Flash Compensation of Images, and Associated Method |
EP2143068A1 (en) * | 2007-04-23 | 2010-01-13 | Hewlett-Packard Development Company, L.P. | Correcting a captured image in digital imaging devices |
CN103369202A (en) * | 2012-04-01 | 2013-10-23 | 联咏科技股份有限公司 | Method for compensating local lens shadow |
CN104240207A (en) * | 2014-10-10 | 2014-12-24 | 深圳市开立科技有限公司 | Image shadow removing method and device |
US8988547B2 (en) | 2008-06-30 | 2015-03-24 | Sony Corporation | Image signal correcting device, imaging device, image signal correcting method, and program |
US10005682B1 (en) | 2009-10-02 | 2018-06-26 | Tersano Inc. | Holding tank-less water ozonating system |
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8471852B1 (en) | 2003-05-30 | 2013-06-25 | Nvidia Corporation | Method and system for tessellation of subdivision surfaces |
US8571346B2 (en) | 2005-10-26 | 2013-10-29 | Nvidia Corporation | Methods and devices for defective pixel detection |
US7750956B2 (en) | 2005-11-09 | 2010-07-06 | Nvidia Corporation | Using a graphics processing unit to correct video and audio data |
US8588542B1 (en) | 2005-12-13 | 2013-11-19 | Nvidia Corporation | Configurable and compact pixel processing apparatus |
US8737832B1 (en) | 2006-02-10 | 2014-05-27 | Nvidia Corporation | Flicker band automated detection system and method |
KR100793938B1 (en) | 2006-06-14 | 2008-01-16 | 주식회사 아이닉스 | Device and method for shading compensation |
US8594441B1 (en) | 2006-09-12 | 2013-11-26 | Nvidia Corporation | Compressing image-based data using luminance |
US8723969B2 (en) | 2007-03-20 | 2014-05-13 | Nvidia Corporation | Compensating for undesirable camera shakes during video capture |
US8724895B2 (en) | 2007-07-23 | 2014-05-13 | Nvidia Corporation | Techniques for reducing color artifacts in digital images |
US8570634B2 (en) | 2007-10-11 | 2013-10-29 | Nvidia Corporation | Image processing of an incoming light field using a spatial light modulator |
US9177368B2 (en) | 2007-12-17 | 2015-11-03 | Nvidia Corporation | Image distortion correction |
US8698908B2 (en) | 2008-02-11 | 2014-04-15 | Nvidia Corporation | Efficient method for reducing noise and blur in a composite still image from a rolling shutter camera |
US9379156B2 (en) | 2008-04-10 | 2016-06-28 | Nvidia Corporation | Per-channel image intensity correction |
US8373718B2 (en) | 2008-12-10 | 2013-02-12 | Nvidia Corporation | Method and system for color enhancement with color volume adjustment and variable shift along luminance axis |
TWI386042B (en) * | 2008-12-17 | 2013-02-11 | Altek Corp | Digital camera device and its brightness correction method |
US8749662B2 (en) | 2009-04-16 | 2014-06-10 | Nvidia Corporation | System and method for lens shading image correction |
US8698918B2 (en) | 2009-10-27 | 2014-04-15 | Nvidia Corporation | Automatic white balancing for photography |
TWI458347B (en) | 2010-12-20 | 2014-10-21 | Ind Tech Res Inst | Image pickup apparatus and method thereof |
JP5757836B2 (en) * | 2011-10-11 | 2015-08-05 | 京セラドキュメントソリューションズ株式会社 | Image reading apparatus and image forming apparatus having the same |
US9798698B2 (en) | 2012-08-13 | 2017-10-24 | Nvidia Corporation | System and method for multi-color dilu preconditioner |
US9508318B2 (en) | 2012-09-13 | 2016-11-29 | Nvidia Corporation | Dynamic color profile management for electronic devices |
US9307213B2 (en) | 2012-11-05 | 2016-04-05 | Nvidia Corporation | Robust selection and weighting for gray patch automatic white balancing |
US9418400B2 (en) | 2013-06-18 | 2016-08-16 | Nvidia Corporation | Method and system for rendering simulated depth-of-field visual effect |
US9826208B2 (en) | 2013-06-26 | 2017-11-21 | Nvidia Corporation | Method and system for generating weights for use in white balancing an image |
US9756222B2 (en) | 2013-06-26 | 2017-09-05 | Nvidia Corporation | Method and system for performing white balancing operations on captured images |
JP6051246B2 (en) * | 2015-02-25 | 2016-12-27 | 京セラドキュメントソリューションズ株式会社 | Image reading apparatus and image forming apparatus |
WO2017138372A1 (en) * | 2016-02-10 | 2017-08-17 | ソニー株式会社 | Solid-state imaging device and electronic device |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4863715A (en) * | 1984-03-29 | 1989-09-05 | Nycomed As | Method of NMR imaging using a contrast agent comprising particles of a ferromagnetic material |
US5458869A (en) * | 1990-03-28 | 1995-10-17 | Nycomed Salutar Inc. | Multinuclear complexes for X-ray imaging |
US5618514A (en) * | 1983-12-21 | 1997-04-08 | Nycomed Imaging As | Diagnostic and contrast agent |
US20020033975A1 (en) * | 1999-12-02 | 2002-03-21 | Yoshiro Yamazaki | Image reading apparatus and method |
US20030234879A1 (en) * | 2002-06-20 | 2003-12-25 | Whitman Christopher A. | Method and apparatus for color non-uniformity correction in a digital camera |
US20050030412A1 (en) * | 2003-08-07 | 2005-02-10 | Canon Kabushiki Kaisha | Image correction processing method and image capture system using the same |
US20060061593A1 (en) * | 2004-09-22 | 2006-03-23 | Satoshi Miura | Image display unit and method of correcting brightness in image display unit |
US7233352B2 (en) * | 2002-06-20 | 2007-06-19 | Hewlett-Packard Development Company, L.P. | Method and apparatus for color non-uniformity correction in a digital camera |
US7463294B2 (en) * | 2003-05-23 | 2008-12-09 | Nikon Corporation | Signal processing unit for correcting shading of image signal, and electronic camera |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0583622A (en) | 1991-09-24 | 1993-04-02 | Nec Corp | Shading correction circuit for ccd camera |
JPH06273171A (en) | 1993-03-18 | 1994-09-30 | Fuji Heavy Ind Ltd | Distance detection device for vehicle |
JPH06319042A (en) | 1993-05-10 | 1994-11-15 | Toshiba Eng Co Ltd | Image processor |
JPH11355511A (en) | 1998-06-05 | 1999-12-24 | Fuji Photo Film Co Ltd | Picture processor |
JP3539394B2 (en) * | 2001-03-26 | 2004-07-07 | ミノルタ株式会社 | Image processing apparatus, program, and recording medium |
US7386185B2 (en) * | 2002-02-12 | 2008-06-10 | Matsushita Electric Industrial Co., Ltd. | Image processing device and image processing method |
JP2003348604A (en) | 2002-05-28 | 2003-12-05 | Sharp Corp | Imaging apparatus and imaging method |
JP4537665B2 (en) | 2003-03-27 | 2010-09-01 | 富士フイルム株式会社 | Image data signal processing circuit and method |
JP4329409B2 (en) * | 2003-05-23 | 2009-09-09 | 株式会社ニコン | Electronic camera shading correction circuit |
JP4377622B2 (en) * | 2003-07-16 | 2009-12-02 | オリンパス株式会社 | Shading correction device |
JP2005269373A (en) * | 2004-03-19 | 2005-09-29 | Fuji Photo Film Co Ltd | Video signal processing system, and electronic video apparatus |
JP2006121612A (en) * | 2004-10-25 | 2006-05-11 | Konica Minolta Photo Imaging Inc | Image pickup device |
- 2004
  - 2004-11-08 JP JP2004323820A patent/JP4322781B2/en not_active Expired - Fee Related
- 2005
  - 2005-11-03 US US11/265,291 patent/US20060098240A1/en not_active Abandoned
- 2009
  - 2009-05-01 US US12/453,197 patent/US7760252B2/en not_active Expired - Fee Related
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5618514A (en) * | 1983-12-21 | 1997-04-08 | Nycomed Imaging As | Diagnostic and contrast agent |
US4863715A (en) * | 1984-03-29 | 1989-09-05 | Nycomed As | Method of NMR imaging using a contrast agent comprising particles of a ferromagnetic material |
US5458869A (en) * | 1990-03-28 | 1995-10-17 | Nycomed Salutar Inc. | Multinuclear complexes for X-ray imaging |
US5614168A (en) * | 1990-03-28 | 1997-03-25 | Nycomed Salutar Inc. | Multinuclear complexes for X-ray imaging |
US20020033975A1 (en) * | 1999-12-02 | 2002-03-21 | Yoshiro Yamazaki | Image reading apparatus and method |
US20030234879A1 (en) * | 2002-06-20 | 2003-12-25 | Whitman Christopher A. | Method and apparatus for color non-uniformity correction in a digital camera |
US7233352B2 (en) * | 2002-06-20 | 2007-06-19 | Hewlett-Packard Development Company, L.P. | Method and apparatus for color non-uniformity correction in a digital camera |
US7463294B2 (en) * | 2003-05-23 | 2008-12-09 | Nikon Corporation | Signal processing unit for correcting shading of image signal, and electronic camera |
US20050030412A1 (en) * | 2003-08-07 | 2005-02-10 | Canon Kabushiki Kaisha | Image correction processing method and image capture system using the same |
US20060061593A1 (en) * | 2004-09-22 | 2006-03-23 | Satoshi Miura | Image display unit and method of correcting brightness in image display unit |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7787036B2 (en) * | 2006-09-07 | 2010-08-31 | Canon Kabushiki Kaisha | Imaging apparatus configured to correct noise |
US20080062292A1 (en) * | 2006-09-07 | 2008-03-13 | Canon Kabushiki Kaisha | Imaging apparatus configured to correct noise |
US8009978B2 (en) | 2007-01-25 | 2011-08-30 | Research In Motion Limited | Handheld electronic device and camera providing flash compensation of images, and associated method |
US7702235B2 (en) * | 2007-01-25 | 2010-04-20 | Research In Motion Limited | Handheld electronic device and camera providing flash compensation of images, and associated method |
US20100172640A1 (en) * | 2007-01-25 | 2010-07-08 | Research In Motion Limited | Handheld electronic device and camera providing flash compensation of images, and associated method |
US20080181596A1 (en) * | 2007-01-25 | 2008-07-31 | Marc Drader | Handheld Electronic Device and Camera Providing Flash Compensation of Images, and Associated Method |
US8306412B2 (en) | 2007-01-25 | 2012-11-06 | Research In Motion Limited | Handheld electronic device and camera providing flash compensation of images, and associated method |
US8693862B2 (en) | 2007-01-25 | 2014-04-08 | Blackberry Limited | Handheld electronic device and camera providing flash compensation of images, and associated method |
EP2143068A1 (en) * | 2007-04-23 | 2010-01-13 | Hewlett-Packard Development Company, L.P. | Correcting a captured image in digital imaging devices |
EP2143068A4 (en) * | 2007-04-23 | 2012-11-07 | Hewlett Packard Development Co | Correcting a captured image in digital imaging devices |
US8988547B2 (en) | 2008-06-30 | 2015-03-24 | Sony Corporation | Image signal correcting device, imaging device, image signal correcting method, and program |
US10005682B1 (en) | 2009-10-02 | 2018-06-26 | Tersano Inc. | Holding tank-less water ozonating system |
CN103369202A (en) * | 2012-04-01 | 2013-10-23 | 联咏科技股份有限公司 | Method for compensating local lens shadow |
CN104240207A (en) * | 2014-10-10 | 2014-12-24 | 深圳市开立科技有限公司 | Image shadow removing method and device |
Also Published As
Publication number | Publication date |
---|---|
JP2006134157A (en) | 2006-05-25 |
US7760252B2 (en) | 2010-07-20 |
US20090213243A1 (en) | 2009-08-27 |
JP4322781B2 (en) | 2009-09-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7760252B2 (en) | Shading compensation device, shading compensation value calculation device and imaging device | |
JP4768448B2 (en) | Imaging device | |
US8106976B2 (en) | Peripheral light amount correction apparatus, peripheral light amount correction method, electronic information device, control program and readable recording medium | |
US7751619B2 (en) | Image processing apparatus and method, recording medium, and program | |
US7893974B2 (en) | Apparatus, method, and computer program for processing information | |
US8942475B2 (en) | Image signal processing device to emphasize contrast | |
JP5761946B2 (en) | Image processing apparatus, image processing method, and storage medium | |
JP4683994B2 (en) | Image processing apparatus, image processing method, electronic camera, scanner | |
US20080143881A1 (en) | Image processor and image processing program | |
US7689059B2 (en) | Image processing method and image processing circuit | |
KR20060000715A (en) | Apparatus and method for improving image quality in a image sensor | |
US8593546B2 (en) | Image processing apparatus, image processing method, and camera module for detecting and correcting defective pixels based on contrast and illuminance | |
KR101639664B1 (en) | Photographing apparatus and photographing method | |
KR100956228B1 (en) | Image processing apparatus having function of correctting distortion of image | |
JP2023106486A (en) | Imaging apparatus and a control method for the same, and program | |
US20080297632A1 (en) | Imaging apparatus and image processing program | |
JP4044826B2 (en) | Semiconductor integrated circuit | |
KR100747729B1 (en) | Image processor, device for compensating of lens shading and the same method | |
JP4993275B2 (en) | Image processing device | |
KR100566571B1 (en) | Method and apparatus for auto compensating image sensor lens shading | |
JP2010193112A (en) | Image processing apparatus and digital still camera | |
JP2004357238A (en) | Image pickup device and its image processing method | |
JP2018147417A (en) | Image processor, image processing method, and program | |
JPS6211373A (en) | Solid-state image pickup device | |
JP2012134671A (en) | Image processor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FUJI PHOTO FILM CO., LTD., JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FUNAMOTO, KENJI;REEL/FRAME:017189/0135
Effective date: 20051028
|
AS | Assignment |
Owner name: FUJIFILM CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FUJIFILM HOLDINGS CORPORATION (FORMERLY FUJI PHOTO FILM CO. LTD.);REEL/FRAME:019331/0493
Effective date: 20070130
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |