CN103974043A - Image processing device and image processing method - Google Patents


Info

Publication number: CN103974043A (application CN201310027214.3A; granted as CN103974043B)
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 陈世泽, 黄文聪
Original and current assignee: Realtek Semiconductor Corp
Legal status: Granted; Active
Prior art keywords: target pixel, edge, color information, color difference
Classification: Image Processing

Abstract

The invention discloses an image processing device and an image processing method. The method comprises: capturing image data from a frame buffer; estimating, for a target pixel in the image data, a piece of second color information in each of the up, down, left, and right directions according to first color information of the target pixel and the color information of its neighboring pixels; calculating color-difference gradient values of the target pixel in the up, down, left, and right directions according to the four pieces of second color information; determining the edge texture characteristic of the target pixel according to the four color-difference gradient values; determining an edge indicator of the target pixel according to its edge texture characteristic; and determining, by table lookup according to the edge indicator, the weights used in the subsequent interpolation operation.

Description

Image processing device and image processing method
Technical field
The present invention relates to an image processing method, and more particularly to a method of performing color interpolation on image data captured by an image sensor, and to a related image processing device.
Background
In consumer electronic products that capture images with a single image sensor (single sensor), such as digital cameras, video cameras, multimedia handsets, surveillance systems, and video phones, the sensor records red, green, and blue color information by covering the single image sensor with a Bayer color filter array (Bayer CFA); to reduce manufacturing cost, the single sensor records only one color intensity value at each pixel position. To restore the raw Bayer mosaic image (raw Bayer CFA image) captured by the single image sensor into a full-color image, color interpolation must be performed to recover the missing color data. Because a color interpolation algorithm involves analyzing the structure and color of the image content, it has a critical impact on the quality of the final output image.
In IC design, the color interpolation operation usually occupies considerable line-buffer space and has high computational complexity. How to reduce these manufacturing costs while still effectively avoiding distortions such as zipper effects, color aliasing, Moiré patterns, and false color at object boundaries and in textured regions of the image has therefore always been a research focus in the field. Taiwan patent publication No. I274908 discloses pre-computing a minimum square error (MSE) to estimate the weights applied to the neighboring pixel values of the desired color; another technique, Taiwan patent publication No. I296482, detects image edges at the target pixel along multiple interpolation directions, produces multiple image edge gradient values, and then estimates the missing hue component of the target pixel with a sum normalized by an adjustable factor. However, these techniques require large buffer memory and considerable division arithmetic, which raises manufacturing cost and makes hardware implementation relatively difficult, thus reducing their applicability. In addition, because these color interpolation methods consider too much directional information, they tend to produce blurring and reconstruction distortion in image regions with sharp edges and fine structure, degrading image quality.
Summary of the invention
Therefore, one object of the present invention is to provide an image processing method that produces high-quality images and whose hardware implementation requires neither costly division arithmetic nor extra buffer memory, so as to solve the problems of the prior art.
According to one embodiment of the invention, an image processing device includes an initial interpolation unit, a color-difference gradient estimation unit, an edge texture characteristic determination unit, and an edge indicator recording unit. The initial interpolation unit captures image data from a frame buffer, where each pixel in the image data has only one kind of color information; for a target pixel in the image data, the initial interpolation unit estimates, according to first color information of the target pixel itself and the color information of neighboring pixels, four pieces of second color information of the target pixel in the up, down, left, and right directions, where the color corresponding to the first color information differs from the color corresponding to the four pieces of second color information. The color-difference gradient estimation unit is coupled to the initial interpolation unit and calculates four color-difference gradient values of the target pixel in the up, down, left, and right directions according to the four pieces of second color information. The edge texture characteristic determination unit is coupled to the color-difference gradient estimation unit and determines the edge texture characteristic of the target pixel according to the four color-difference gradient values. The edge indicator recording unit is coupled to the edge texture characteristic determination unit and determines, according to the edge texture characteristic of the target pixel, whether to modify a bit value of the first color information of the target pixel stored in the frame buffer.
According to another embodiment of the invention, an image processing method includes: capturing image data from a frame buffer, where each pixel in the image data has only one kind of color information; for a target pixel in the image data, estimating, according to first color information of the target pixel itself and the color information of neighboring pixels, four pieces of second color information of the target pixel in the up, down, left, and right directions, where the color corresponding to the first color information differs from the color corresponding to the four pieces of second color information; calculating four color-difference gradient values of the target pixel in the up, down, left, and right directions according to the four pieces of second color information; determining the edge texture characteristic of the target pixel according to the four color-difference gradient values; and determining, according to the edge texture characteristic of the target pixel, whether to modify a bit value of the first color information of the target pixel stored in the frame buffer.
According to another embodiment of the invention, an image processing device includes an initial interpolation unit, a color-difference gradient estimation unit, an edge texture characteristic determination unit, a dynamic weight quantization/distribution unit, and a weighted interpolation unit. The initial interpolation unit captures image data from a frame buffer, where each pixel in the image data has only one kind of color information; for a target pixel in the image data, the initial interpolation unit estimates, according to first color information of the target pixel itself and the color information of neighboring pixels, four pieces of second color information of the target pixel in the up, down, left, and right directions, where the color corresponding to the first color information differs from the color corresponding to the four pieces of second color information. The color-difference gradient estimation unit is coupled to the initial interpolation unit and calculates four color-difference gradient values of the target pixel in the up, down, left, and right directions according to the four pieces of second color information. The edge texture characteristic determination unit is coupled to the color-difference gradient estimation unit and determines the edge texture characteristic of the target pixel according to the four color-difference gradient values. The dynamic weight quantization/distribution unit is coupled to the color-difference gradient estimation unit and the edge texture characteristic determination unit, and determines a plurality of weights by means of a lookup table according to the four color-difference gradient values and the edge texture characteristic of the target pixel. The weighted interpolation unit is coupled to the initial interpolation unit and the dynamic weight quantization/distribution unit, and performs a weighted addition, using the plurality of weights, on at least two of the four pieces of second color information of the target pixel to obtain target second color information of the target pixel.
According to another embodiment of the invention, an image processing method includes: capturing image data from a frame buffer, where each pixel in the image data has only one kind of color information; for a target pixel in the image data, estimating, according to first color information of the target pixel itself and the color information of neighboring pixels, four pieces of second color information of the target pixel in the up, down, left, and right directions, where the color corresponding to the first color information differs from the color corresponding to the four pieces of second color information; calculating four color-difference gradient values of the target pixel in the up, down, left, and right directions according to the four pieces of second color information; determining the edge texture characteristic of the target pixel according to the four color-difference gradient values; determining a plurality of weights by means of a lookup table according to the four color-difference gradient values and the edge texture characteristic of the target pixel; and performing a weighted addition, using the plurality of weights, on at least two of the four pieces of second color information of the target pixel to obtain target second color information of the target pixel.
Brief description of the drawings
Fig. 1 is a schematic diagram of an image processing device according to one embodiment of the invention.
Fig. 2 is a flowchart of an image processing method according to one embodiment of the invention.
Fig. 3 is a schematic diagram of a Bayer mosaic image.
Fig. 4A is a schematic diagram of calculating the upward color-difference gradient value with the upward mask.
Fig. 4B is a schematic diagram of calculating the downward color-difference gradient value with the downward mask.
Fig. 4C is a schematic diagram of calculating the leftward color-difference gradient value with the leftward mask.
Fig. 4D is a schematic diagram of calculating the rightward color-difference gradient value with the rightward mask.
Fig. 5 is a schematic diagram of obtaining the edge indicators of neighboring pixels.
Fig. 6 is a schematic diagram of a computer-readable medium according to one embodiment of the invention.
The reference numerals are described as follows:
100: image processing device
102: image sensor
104: initial interpolation unit
106: color-difference gradient estimation unit
108: edge texture characteristic determination unit
110: edge indicator recording unit
112: dynamic weight quantization/distribution unit
114: weighted interpolation unit
120: frame buffer
200~216: steps
402: upward mask
404: downward mask
406: leftward mask
408: rightward mask
600: host computer
610: processor
620: computer-readable medium
622: computer program
Embodiment
Certain terms are used throughout the specification and the claims that follow to refer to particular elements. Those with ordinary knowledge in the art should understand that hardware manufacturers may refer to the same element by different names. This specification and the claims do not distinguish elements by differences in name, but by differences in function. Throughout the specification and claims, the term "comprise" is an open term and should therefore be interpreted as "including but not limited to". In addition, the term "couple" here covers any direct or indirect electrical connection; thus, if the text describes a first device as coupled to a second device, the first device may be electrically connected to the second device directly, or indirectly through other devices or connection means.
Please refer to Fig. 1, which is a schematic diagram of an image processing device 100 according to one embodiment of the invention. As shown in Fig. 1, the image processing device 100 is coupled to an image sensor 102 and a frame buffer 120, and includes an initial interpolation unit 104, a color-difference gradient estimation unit 106, an edge texture characteristic determination unit 108, an edge indicator recording unit 110, a dynamic weight quantization/distribution unit 112, and a weighted interpolation unit 114. The image sensor 102 is a single image sensor covered with a Bayer color filter array; that is, each pixel captured by the image sensor 102 has only one kind of color information (red, green, or blue). In addition, the image processing device 100 can be applied to any electronic apparatus that uses a single image sensor, such as a digital camera, video camera, multimedia handset, surveillance system, or video phone.
Please refer to Figs. 1 and 2 together, where Fig. 2 is a flowchart of an image processing method according to one embodiment of the invention. The flow is described below with reference to Fig. 2.
In step 200, the flow starts: the image sensor 102 temporarily stores the image data it produces in the frame buffer 120. The image data is a Bayer mosaic image, part of which may be as shown in Fig. 3; each pixel has only one color datum. That is, a pixel marked "R" in Fig. 3 has only red color information and is missing the green and blue color information; a pixel marked "G" has only green color information and is missing the red and blue color information; and a pixel marked "B" has only blue color information and is missing the green and red color information.
In step 202, the initial interpolation unit 104 performs initial interpolation on the received image data; that is, it estimates green color information in four directions at each red and blue pixel, and estimates red and blue color information in four directions at each green pixel. Specifically, referring to the partial image data shown in Fig. 3, suppose i and j denote the current row and column positions, and the target pixel currently to be processed is (i, j). For convenience, the following uses c(i, j) for an original pixel color and an overbarred value for an estimated pixel color, where c may be R, G, or B. The green information estimated at a red pixel can be obtained via the following formulas (1.1)~(1.4):
where the subscripts T, B, L, and R denote the up, down, left, and right directions, respectively. After the green information has been estimated at the red pixels, the green information estimated there by the formulas above can further be used to help estimate the red information at the green pixels; the red information estimated at a green pixel can be obtained via the following formulas (2.1)~(2.4) (which assume the target pixel is a green pixel G(i, j)):
As for initially estimating the green color values in four directions at blue pixels, and initially estimating the blue color values in four directions at green pixels, these can be obtained in a similar manner using the above two groups of formulas (for example, by replacing R with B in formulas (2.1)~(2.4)).
It should be noted that the above two groups of formulas (1.1)~(1.4) and (2.1)~(2.4) for estimating the color information of the target pixel in four directions are given only as an example and are not a limitation of the present invention; as long as green color information in four directions can be estimated at red and blue pixels, and red and blue color information in four directions can be estimated at green pixels, the formulas above may be replaced by any other suitable computation.
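The formula images (1.1)~(1.4) are not reproduced in this text, so the sketch below uses a common gradient-corrected directional estimate (nearest green neighbor plus half the second-order same-color difference along that direction) as an illustrative assumption, not the patented formula:

```python
def directional_green_estimates(img, i, j):
    """Estimate green in the up/down/left/right directions at red pixel (i, j)
    of a Bayer mosaic stored as a 2-D list of single-channel samples.

    Each estimate combines the nearest green neighbor with half the
    second-order difference of the same-color (red) samples along that
    direction; this particular form is an assumption.
    """
    g_t = img[i - 1][j] + (img[i][j] - img[i - 2][j]) / 2.0  # up
    g_b = img[i + 1][j] + (img[i][j] - img[i + 2][j]) / 2.0  # down
    g_l = img[i][j - 1] + (img[i][j] - img[i][j - 2]) / 2.0  # left
    g_r = img[i][j + 1] + (img[i][j] - img[i][j + 2]) / 2.0  # right
    return g_t, g_b, g_l, g_r
```

On a flat region all four estimates reproduce the local intensity, and on a linear ramp the gradient-correction term keeps the estimates centered on the target pixel.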
Next, in step 204, the color-difference gradient estimation unit 106 calculates the color-difference gradient values of the target pixel in four directions according to the original pixel colors and the color estimates computed in step 202. In a natural image, the color differences of pixels within an object (for example, the difference between the green and red values, or between the green and blue values) vary smoothly; in other words, the color-difference gradient along an edge direction is smaller than that across the edge, so this property can be used to judge the edge and texture characteristics of the target pixel. Specifically, please refer to Figs. 4A~4D, which are schematic diagrams of calculating the color-difference gradient values with the upward, downward, leftward, and rightward masks, respectively. Taking the upward mask shown in Fig. 4A as an example, the upward color-difference gradient value can be calculated as follows:
where (p, q) ∈ {(i+m, j+n) | m = 0, −1; n = 0, ±1}, CD_gr denotes the green-red color-difference value at the corresponding position, and CD_gb denotes the green-blue color-difference value at the corresponding position. These are defined case by case: the green-red color difference is defined when (p, q) is a red or green pixel position, and the green-blue color difference when (p, q) is a blue or green pixel position, in each case combining the original color value at (p, q) with the color value estimated there in step 202.
The color-difference gradient values in the other three directions, namely the downward, leftward, and rightward color-difference gradient values, can all be obtained with formulas similar to the one above. Since one of ordinary skill in the art, after reading the formula for the upward color-difference gradient value, should understand how to calculate the color-difference gradient values in the other three directions, the details are not repeated here.
The above formula for calculating the upward color-difference gradient value is only an example and is not a limitation of the present invention. For instance, if the accuracy needs to be further improved, a simple 1×3 low-pass filter can be applied to the color-difference values to reduce the influence of noise, or each color-difference value in formula (3) can be weighted before being added or subtracted to obtain the upward color-difference gradient value.
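Formula (3) itself is not reproduced in this text; the sketch below assumes an equally weighted sum of absolute vertical color-difference differences over the 2×3 upward mask named above, which is an assumption about the per-term weighting:

```python
def upward_cd_gradient(cd, i, j):
    """Upward color-difference gradient at (i, j): sum of absolute vertical
    differences of the color-difference map cd over the 2x3 upward mask
    {(i+m, j+n) | m = 0, -1; n = 0, +-1}. Equal per-term weights are an
    assumption; cd is a precomputed 2-D color-difference plane."""
    total = 0.0
    for m in (0, -1):
        for n in (-1, 0, 1):
            p, q = i + m, j + n
            total += abs(cd[p - 1][q] - cd[p][q])
    return total
```

A flat color-difference plane yields a zero gradient, while a vertical ramp (a horizontal edge crossing the mask) yields a large one, matching the smooth-color-difference property described above.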
In step 206, the edge texture characteristic determination unit 108 first obtains the edge indicators of neighboring pixels from the frame buffer 120, where an edge indicator indicates whether the corresponding neighboring pixel lies on a sharp vertical edge, on a sharp horizontal edge, or in a texture region (no particularly significant edge feature). For instance, please refer to Fig. 5, which is a schematic diagram of obtaining the edge indicators of neighboring pixels. As shown in Fig. 5, supposing the target pixel currently to be processed is R(i, j), the neighboring pixels whose edge indicators need to be obtained may comprise the pixels marked in Fig. 5 with a triangle or a diamond, where a triangle marks an important representative pixel and a diamond marks an ordinary pixel of lesser importance. In addition, in this embodiment, the edge indicator of each pixel is stored in the last bit of that pixel's value in the frame buffer; the way the edge indicator is computed and recorded is described in the subsequent steps.
Then, in step 208, the edge texture characteristic determination unit 108 judges the edge texture characteristic of the target pixel according to the upward, downward, leftward, and rightward color-difference gradient values determined in step 204 and the edge indicators of the neighboring pixels obtained in step 206; that is, it judges whether the target pixel lies on a sharp vertical edge, on a sharp horizontal edge, or in a texture region (no particularly significant edge feature). For instance, the edge texture characteristic of the target pixel can be judged with the following formulas:
If formula (4.1) below holds, the target pixel is judged to lie on a sharp vertical edge:

α·(ΔCD_L(i,j) + ΔCD_R(i,j) + ΔCD_CH(i,j)·(16−ρ)/16) > ΔCD_T(i,j) + ΔCD_B(i,j) + ΔCD_CV(i,j)·ρ/16 ...... (4.1)

If formula (4.2) below holds, the target pixel is judged to lie on a sharp horizontal edge:

ΔCD_L(i,j) + ΔCD_R(i,j) + ΔCD_CH(i,j)·(16−ρ)/16 < α·(ΔCD_T(i,j) + ΔCD_B(i,j) + ΔCD_CV(i,j)·ρ/16) ...... (4.2)

And if neither formula (4.1) nor formula (4.2) holds, the target pixel is judged to lie in a texture region. In formulas (4.1) and (4.2), α is a scaling factor used to control the decision threshold between vertical edges, horizontal edges, and texture in the image; ΔCD_CV(i,j) = |(Ḡ_T(i,j) − R(i,j)) − (Ḡ_B(i,j) − R(i,j))| and ΔCD_CH(i,j) = |(Ḡ_L(i,j) − R(i,j)) − (Ḡ_R(i,j) − R(i,j))|, which can be simplified to ΔCD_CV(i,j) = |Ḡ_T(i,j) − Ḡ_B(i,j)| and ΔCD_CH(i,j) = |Ḡ_L(i,j) − Ḡ_R(i,j)|; and ρ is calculated from the edge indicators of the neighboring pixels obtained in step 206.
In the calculation of ρ: since the edge indicator of each pixel is stored in the last bit of that pixel's value in the frame buffer, suppose edge indicator "0" means the pixel lies on a sharp vertical edge and edge indicator "1" means it lies on a sharp horizontal edge. Referring to Fig. 5, if the edge indicator of a representative neighboring pixel (marked with a triangle) is "1", ρ is increased by 3 (ρ = ρ + 3); if the edge indicator of an ordinary neighboring pixel (marked with a diamond) is "1", ρ is increased by 1 (ρ = ρ + 1); and if a neighboring pixel's edge indicator is "0", ρ is unchanged. That is, ρ is obtained by converting the edge indicators of the neighboring pixels shown in Fig. 5 into the corresponding values and accumulating them.
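The ρ accumulation and the (4.1)/(4.2) decision can be sketched as follows; the value α = 0.5 is an assumed setting, since the text leaves the scaling factor open:

```python
def accumulate_rho(triangle_bits, diamond_bits):
    """rho from neighbors' edge indicators: each triangle-marked (important)
    neighbor with indicator 1 adds 3; each diamond-marked (ordinary)
    neighbor with indicator 1 adds 1; indicator 0 adds nothing."""
    return 3 * sum(triangle_bits) + sum(diamond_bits)


def classify_edge(cd_t, cd_b, cd_l, cd_r, cd_cv, cd_ch, rho, alpha=0.5):
    """Edge/texture decision per inequalities (4.1)/(4.2).
    alpha = 0.5 is an assumed value for the scaling factor."""
    horiz = cd_l + cd_r + cd_ch * (16 - rho) / 16.0
    vert = cd_t + cd_b + cd_cv * rho / 16.0
    if alpha * horiz > vert:
        return "vertical"    # (4.1): gradients across a vertical edge dominate
    if horiz < alpha * vert:
        return "horizontal"  # (4.2)
    return "texture"         # neither inequality holds
```

With any α ≤ 1 the two inequalities cannot hold simultaneously, so the three outcomes are mutually exclusive.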
After judging whether the target pixel lies on a sharp vertical edge, a sharp horizontal edge, or in a texture region, the flow enters step 210 to determine the edge indicator of the target pixel, and the edge indicator recording unit 110 embeds the edge indicator of the target pixel into that pixel's value stored in the frame buffer. Specifically, when the target pixel is judged to have a vertical edge characteristic, the least significant bit (LSB) of the binary value of the target pixel's original color value stored in the frame buffer is checked: if it is not 0, 1 is added or subtracted; otherwise, if it is already 0, nothing is done. Similarly, when the target pixel is judged to have a horizontal edge characteristic, the LSB of the target pixel's original color value is checked: if it is not 1, 1 is added or subtracted; otherwise, if it is already 1, nothing is done. For a raw image with 10-bit color pixel values, a change in the LSB is not easily perceived by the human eye. In this way, without adding extra buffer memory, information helpful for image feature judgment can be recorded for later reference (for example, for use in step 206).
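The LSB-embedding step above can be sketched as follows; the choice of +1 for indicator 1 and −1 for indicator 0 is one of the allowed "add or subtract 1" options:

```python
def embed_edge_indicator(pixel_value, indicator):
    """Record the edge indicator (0 = vertical edge, 1 = horizontal edge) in
    the LSB of the stored 10-bit pixel value: if the LSB already matches,
    leave the value alone; otherwise nudge it by +-1."""
    if (pixel_value & 1) == indicator:
        return pixel_value
    return pixel_value + 1 if indicator == 1 else pixel_value - 1


def read_edge_indicator(pixel_value):
    """Recover the recorded indicator from the LSB (used in step 206)."""
    return pixel_value & 1
```

A ±1 change in a 10-bit sample is below the visibility threshold, which is why this trick costs no extra buffer memory.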
Then, after determining whether the target pixel lies on a sharp vertical edge, on a sharp horizontal edge, or in a texture region, the missing color information can be reconstructed using one of the interpolation formulas (5.1)~(5.3) below (shown for interpolating green color information at a red pixel):
If the target pixel lies on a sharp vertical edge, the missing color information (target color information) is reconstructed with formula (5.1):
If the target pixel lies on a sharp horizontal edge, the missing color information is reconstructed with formula (5.2):
If the target pixel lies in a texture region, the missing color information is reconstructed with formula (5.3):
To determine the weights used in formulas (5.1)~(5.3), in step 212 the dynamic weight quantization/distribution unit 112 distributes quantized weights for two or four directions by table lookup, according to the upward, downward, leftward, and rightward color-difference gradient values calculated in step 204. For instance, if the target pixel lies on a sharp vertical edge, Table 1 below can be used to decide the weights W_T(i, j) and W_B(i, j):
And if the target pixel lies on a sharp horizontal edge, Table 2 below can be used to decide the weights W_L(i, j) and W_R(i, j):
And if the target pixel lies in a texture region, Table 3 below can be used to decide the weights W_T(i, j), W_B(i, j), W_L(i, j), and W_R(i, j):
Table 3 uses the maximum and the second-largest of the upward, downward, leftward, and rightward color-difference gradient values. That is, supposing the magnitudes of the four directional color-difference gradient values of the target pixel satisfy ΔCD_T(i,j) > ΔCD_L(i,j) > ΔCD_B(i,j) > ΔCD_R(i,j), and ΔCD_L(i,j) ≤ 0.25·ΔCD_T(i,j), then W_T(i, j) can be set to 21/32, W_L(i, j) can be set to 3/32, and the other two weights W_B(i, j) and W_R(i, j) together share the remaining 8/32.
It should be noted that Tables 1~3 are only an example of deciding the above weights and are not a limitation of the present invention; as long as the weight for each direction can be judged from the color-difference gradient values, the comparison rules in Tables 1~3 and the quantized weight values can be adjusted according to the designer's needs.
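Tables 1~3 themselves are not reproduced in this text; only the single Table 3 example above is. The sketch below reproduces just that example and fills the gaps with labeled assumptions (the 4/32–4/32 split of the remainder, and an equal-weight fallback when the example's condition does not hold):

```python
def texture_weights(grads):
    """Quantized weight distribution for the texture case, matching the one
    Table 3 example given in the text: when the second-largest gradient is
    <= 0.25x the largest, those two directions get 21/32 and 3/32 and the
    remaining two share 8/32 (even 4/32-4/32 split assumed). The equal-weight
    fallback for other cases is also an assumption. grads maps a direction
    name to its color-difference gradient value."""
    order = sorted(grads, key=grads.get, reverse=True)  # largest first
    weights = dict.fromkeys(grads, 8 / 32.0)            # assumed fallback
    if grads[order[1]] <= 0.25 * grads[order[0]]:
        weights[order[0]] = 21 / 32.0
        weights[order[1]] = 3 / 32.0
        weights[order[2]] = weights[order[3]] = 4 / 32.0
    return weights
```

All weights are multiples of 1/32, which is what makes the later interpolation division-free in hardware.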
After the required weights are determined, in step 214 the weighted interpolation unit 114 performs weighted interpolation with one of formulas (5.1)~(5.3) to reconstruct the missing color information. After all the above steps have been processed, the reconstructed green color information not only retains more image detail, but also requires neither any division arithmetic nor extra buffer memory.
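Because the weights are quantized to n/32, the weighted addition reduces to a multiply-accumulate followed by a 5-bit right shift, with no divider; a minimal sketch of this fixed-point form:

```python
def weighted_interpolation(estimates, weights32):
    """Weighted addition of directional second-color estimates with weights
    quantized to n/32: multiply-accumulate the integer numerators, then
    shift right by 5 (i.e. divide by 32 without a divider).
    estimates and weights32 are parallel lists; weights32 sums to 32."""
    acc = sum(w * e for w, e in zip(weights32, estimates))
    return acc >> 5  # == acc // 32
```

This is the hardware-friendly form the summary refers to: table lookup plus shift-add, no division.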
In addition, in step 214, after the green color information at the red pixels (or blue pixels) has been reconstructed, the weighted interpolation unit 114 then uses the rebuilt green color information to reconstruct the red and blue information at the other neighboring pixels; that is, the rebuilt green color information can be used to reconstruct the red or blue information at the green pixels, the blue information at the red pixels, or the red information at the blue pixels. Specifically, to reconstruct the red color information at a green pixel using the rebuilt green color information, formula (6.1) or (6.2) below can be used:
If (i, j) lies in a row of green and red pixels:
If (i, j) lies in a row of green and blue pixels:
In addition, to reconstruct the red information at a blue pixel using the rebuilt green color information, formula (6.3) below can be used:
The above are the formulas for reconstructing red color information at green and blue pixels. Since the formulas for reconstructing blue color information at green and red pixels are similar to those above, one of ordinary skill in the art should understand how to reconstruct the blue color information after slightly modifying the above formulas, so the details are omitted here.
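Formula (6.3) is not reproduced in this text; a common stand-in for this kind of green-guided reconstruction is the constant-color-difference form sketched below, which is an assumption rather than the patented formula:

```python
def red_at_blue(red_minus_green, green_hat, i, j):
    """Reconstruct red at a blue pixel from the rebuilt green plane: average
    the (R - G) color differences of the four diagonal red neighbors and add
    the pixel's own rebuilt green value (constant-color-difference model).
    red_minus_green and green_hat are 2-D planes of the same size."""
    avg_diff = (red_minus_green[i - 1][j - 1] + red_minus_green[i - 1][j + 1]
                + red_minus_green[i + 1][j - 1]
                + red_minus_green[i + 1][j + 1]) / 4.0
    return green_hat[i][j] + avg_diff
```

Working in the color-difference domain is what lets the more accurate green plane carry the red/blue reconstruction, as the next paragraph notes.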
In step 214, the reconstruction of the red and the blue color information can be processed in parallel to reduce the execution time; and because step 214 uses the more accurate green color information to help rebuild the red and blue color information, the image quality is higher and the overall efficiency is also improved.
Finally, in step 216, the flow ends, and the weighted interpolation unit 114 outputs the reconstructed color information to a back-end processing unit, which processes it for display on a display screen.
In addition, although the image processing device 100 shown in Fig. 1 is implemented with hardware circuits, the flow of the image processing method shown in Fig. 2 can also be implemented in software and is not limited to a hardware-circuit implementation. Specifically, referring to Fig. 6, a host computer 600 includes at least a processor 610 and a computer-readable medium 620, where the computer-readable medium 620 can be a hard disk or another storage device and stores a computer program 622. When the processor 610 executes the computer program 622, the host computer 600 performs the steps shown in Fig. 2.
Regarding the image processing device and image processing method of the present invention, note that the prior art usually performs pixel color interpolation by dividing by the weight of each edge direction; such techniques not only require expensive division operations, but the way the weights are computed is also quite complex, which raises the hardware implementation cost accordingly. In contrast, the image processing device and image processing method of the present invention achieve the advantages of edge-weighted interpolation without any division operation, using only a very simple table lookup. In addition, to strengthen the processing of the fine edge structures that the human eye notices most easily, an embodiment of the present invention proposes a way to record the edge indicator without any extra buffer memory: the least significant bit (LSB) of the original color value of the target pixel is replaced with the value of the edge indicator. In this way, the step that determines the edge feature direction can refer to the edge decisions of previously processed neighboring pixels, improving the accuracy of judging the edge condition of the target pixel.
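The LSB trick described above can be sketched as follows. The specific bit assignment (1 for a sharp vertical edge, 0 for a sharp horizontal edge) is an assumption for illustration; the scheme only requires that the two sharp-edge cases use opposite bit values and that texture regions leave the stored value untouched.

```python
def store_edge_indicator(color_value, edge_feature):
    """Embed the edge indicator in the LSB of the stored color value.

    edge_feature: 'vertical'   -> force the LSB to 1 (assumed encoding)
                  'horizontal' -> force the LSB to 0
                  'texture'    -> leave the original LSB unchanged
    """
    if edge_feature == 'vertical':
        return color_value | 1
    if edge_feature == 'horizontal':
        return color_value & ~1
    return color_value  # texture region: value kept as-is

def read_edge_indicator(color_value):
    """Recover a neighbor's edge decision with a single AND."""
    return color_value & 1

print(store_edge_indicator(200, 'vertical'))  # 201
print(read_edge_indicator(201))               # 1
```

Overwriting only the least significant bit perturbs the stored color by at most one code value, so no extra buffer is needed, and the edge texture feature decision for the current pixel can consult the indicators of already-processed neighbors straight from the frame buffer.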

Claims (20)

1. An image processing device, comprising:
an initial interpolation unit, for capturing image data from a frame buffer, wherein each pixel in the image data has only one kind of color information, and, for a target pixel in the image data, the initial interpolation unit estimates four pieces of second color information of the target pixel in the upward, downward, leftward and rightward directions according to first color information of the target pixel itself and the color information of neighboring pixels, wherein the color corresponding to the first color information is different from the colors corresponding to the four pieces of second color information;
a color difference gradient estimation unit, coupled to the initial interpolation unit, for calculating four color difference gradient values of the target pixel in the upward, downward, leftward and rightward directions according to the four pieces of second color information of the target pixel;
an edge texture feature determining unit, coupled to the color difference gradient estimation unit, for determining an edge texture feature of the target pixel according to the four color difference gradient values of the target pixel; and
an edge indicator recording unit, coupled to the edge texture feature determining unit, for determining, according to the edge texture feature of the target pixel, whether to modify a bit value of the first color information of the target pixel stored in the frame buffer.
2. The image processing device of claim 1, wherein the image data is a Bayer mosaic image.
3. The image processing device of claim 1, wherein the edge texture feature determining unit determines the edge texture feature of the target pixel according to the four color difference gradient values of the target pixel and an edge indicator of at least one neighboring pixel.
4. The image processing device of claim 3, wherein the edge indicator of the at least one neighboring pixel is the last bit value of the color information of the at least one neighboring pixel stored in the frame buffer.
5. The image processing device of claim 1, wherein the edge indicator recording unit determines, according to the edge texture feature of the target pixel, whether to modify the last bit value of the first color information of the target pixel stored in the frame buffer.
6. The image processing device of claim 1, wherein the edge texture feature determining unit determines, according to the four color difference gradient values of the target pixel, whether the target pixel is located at a sharp vertical edge, at a sharp horizontal edge, or in a texture region.
7. The image processing device of claim 6, wherein when the target pixel is located at a sharp vertical edge, the edge indicator recording unit sets the last bit value of the first color information of the target pixel in the frame buffer to one of "1" and "0"; when the target pixel is located at a sharp horizontal edge, the edge indicator recording unit sets the last bit value of the first color information of the target pixel in the frame buffer to the other of "1" and "0"; and when the target pixel is located in a texture region, the edge indicator recording unit does not modify the last bit value of the first color information of the target pixel in the frame buffer.
8. The image processing device of claim 1, further comprising:
a dynamic weight quantization and assignment unit, coupled to the color difference gradient estimation unit and the edge texture feature determining unit, for determining a plurality of weights by means of a lookup table according to the four color difference gradient values of the target pixel and the edge texture feature of the target pixel; and
a weighted interpolation unit, coupled to the initial interpolation unit and the dynamic weight quantization and assignment unit, wherein the weighted interpolation unit performs a weighted addition, using the plurality of weights, on at least two of the four pieces of second color information of the target pixel to obtain target second color information of the target pixel.
9. An image processing method, comprising:
capturing image data from a frame buffer, wherein each pixel in the image data has only one kind of color information;
for a target pixel in the image data, estimating four pieces of second color information of the target pixel in the upward, downward, leftward and rightward directions according to first color information of the target pixel itself and the color information of neighboring pixels, wherein the color corresponding to the first color information is different from the colors corresponding to the four pieces of second color information;
calculating four color difference gradient values of the target pixel in the upward, downward, leftward and rightward directions according to the four pieces of second color information of the target pixel;
determining an edge texture feature of the target pixel according to the four color difference gradient values of the target pixel; and
determining, according to the edge texture feature of the target pixel, whether to modify a bit value of the first color information of the target pixel stored in the frame buffer.
10. The image processing method of claim 9, wherein the image data is a Bayer mosaic image.
11. The image processing method of claim 9, wherein the step of determining the edge texture feature of the target pixel comprises:
determining the edge texture feature of the target pixel according to the four color difference gradient values of the target pixel and an edge indicator of at least one neighboring pixel.
12. The image processing method of claim 11, wherein the edge indicator of the at least one neighboring pixel is the last bit value of the color information of the at least one neighboring pixel stored in the frame buffer.
13. The image processing method of claim 9, wherein the step of determining, according to the edge texture feature of the target pixel, whether to modify the bit value of the first color information of the target pixel stored in the frame buffer comprises:
determining, according to the edge texture feature of the target pixel, whether to modify the last bit value of the first color information of the target pixel stored in the frame buffer.
14. The image processing method of claim 9, wherein the step of determining the edge texture feature of the target pixel comprises:
determining, according to the four color difference gradient values of the target pixel, whether the target pixel is located at a sharp vertical edge, at a sharp horizontal edge, or in a texture region.
15. The image processing method of claim 14, wherein the step of determining, according to the edge texture feature of the target pixel, whether to modify the bit value of the first color information of the target pixel stored in the frame buffer comprises:
when the target pixel is located at a sharp vertical edge, setting the last bit value of the first color information of the target pixel in the frame buffer to one of "1" and "0";
when the target pixel is located at a sharp horizontal edge, setting the last bit value of the first color information of the target pixel in the frame buffer to the other of "1" and "0"; and
when the target pixel is located in a texture region, not modifying the last bit value of the first color information of the target pixel in the frame buffer.
16. The image processing method of claim 9, further comprising:
determining a plurality of weights by means of a lookup table according to the four color difference gradient values of the target pixel and the edge texture feature of the target pixel; and
performing a weighted addition, using the plurality of weights, on at least two of the four pieces of second color information of the target pixel to obtain target second color information of the target pixel.
17. An image processing device, comprising:
an initial interpolation unit, for capturing image data from a frame buffer, wherein each pixel in the image data has only one kind of color information, and, for a target pixel in the image data, the initial interpolation unit estimates four pieces of second color information of the target pixel in the upward, downward, leftward and rightward directions according to first color information of the target pixel itself and the color information of neighboring pixels, wherein the color corresponding to the first color information is different from the colors corresponding to the four pieces of second color information;
a color difference gradient estimation unit, coupled to the initial interpolation unit, for calculating four color difference gradient values of the target pixel in the upward, downward, leftward and rightward directions according to the four pieces of second color information of the target pixel;
an edge texture feature determining unit, coupled to the color difference gradient estimation unit, for determining an edge texture feature of the target pixel according to the four color difference gradient values of the target pixel;
a dynamic weight quantization and assignment unit, coupled to the color difference gradient estimation unit and the edge texture feature determining unit, for determining a plurality of weights by means of a lookup table according to the four color difference gradient values of the target pixel and the edge texture feature of the target pixel; and
a weighted interpolation unit, coupled to the initial interpolation unit and the dynamic weight quantization and assignment unit, wherein the weighted interpolation unit performs a weighted addition, using the plurality of weights, on at least two of the four pieces of second color information of the target pixel to obtain target second color information of the target pixel.
18. The image processing device of claim 17, wherein the image data is a Bayer mosaic image.
19. An image processing method, comprising:
capturing image data from a frame buffer, wherein each pixel in the image data has only one kind of color information;
for a target pixel in the image data, estimating four pieces of second color information of the target pixel in the upward, downward, leftward and rightward directions according to first color information of the target pixel itself and the color information of neighboring pixels, wherein the color corresponding to the first color information is different from the colors corresponding to the four pieces of second color information;
calculating four color difference gradient values of the target pixel in the upward, downward, leftward and rightward directions according to the four pieces of second color information of the target pixel;
determining an edge texture feature of the target pixel according to the four color difference gradient values of the target pixel;
determining a plurality of weights by means of a lookup table according to the four color difference gradient values of the target pixel and the edge texture feature of the target pixel; and
performing a weighted addition, using the plurality of weights, on at least two of the four pieces of second color information of the target pixel to obtain target second color information of the target pixel.
20. The image processing method of claim 19, wherein the image data is a Bayer mosaic image.
CN201310027214.3A 2013-01-24 2013-01-24 Image processing device and image processing method Active CN103974043B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310027214.3A CN103974043B (en) 2013-01-24 2013-01-24 Image processing device and image processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310027214.3A CN103974043B (en) 2013-01-24 2013-01-24 Image processing device and image processing method

Publications (2)

Publication Number Publication Date
CN103974043A true CN103974043A (en) 2014-08-06
CN103974043B CN103974043B (en) 2016-02-10

Family

ID=51243022

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310027214.3A Active CN103974043B (en) 2013-01-24 2013-01-24 Image processing device and image processing method

Country Status (1)

Country Link
CN (1) CN103974043B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200620149A (en) * 2004-12-03 2006-06-16 Altek Corp System and method applied to adaptive image transformation
CN1870048A (en) * 2005-05-25 2006-11-29 凌阳科技股份有限公司 Edge strengthening method and device of Bel image and color image pick-up system
TW200643820A (en) * 2005-06-03 2006-12-16 Ultramedia Inc Color interpolation method with directed weights
US20070110300A1 (en) * 2005-11-17 2007-05-17 Hung-An Chang Color interpolation apparatus and color interpolation method utilizing edge indicators adjusted by stochastic adjustment factors to reconstruct missing colors for image pixels
CN101815220A (en) * 2009-02-20 2010-08-25 华晶科技股份有限公司 Method for correcting image color distortion


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110490029A (en) * 2018-05-15 2019-11-22 瑞昱半导体股份有限公司 The image treatment method of differentiation processing can be done to face data
CN110490029B (en) * 2018-05-15 2022-04-15 瑞昱半导体股份有限公司 Image processing method capable of performing differentiation processing on face data
CN110858894A (en) * 2018-08-23 2020-03-03 瑞昱半导体股份有限公司 Color reconstruction device and method
CN110858894B (en) * 2018-08-23 2021-11-26 瑞昱半导体股份有限公司 Color reconstruction device and method

Also Published As

Publication number Publication date
CN103974043B (en) 2016-02-10

Similar Documents

Publication Publication Date Title
Park et al. Single image dehazing with image entropy and information fidelity
US7844127B2 (en) Edge mapping using panchromatic pixels
Ding et al. Efficient dark channel based image dehazing using quadtrees
US9870600B2 (en) Raw sensor image and video de-hazing and atmospheric light analysis methods and systems
KR20190004271A (en) Fusion of parallax masks of color and mono images for macrophotography
US20080240602A1 (en) Edge mapping incorporating panchromatic pixels
US20140078347A1 (en) Systems and Methods for Reducing Noise in Video Streams
US10083497B2 (en) Demosaicing methods and apparatuses using the same
US9712720B2 (en) Image refocusing for camera arrays
WO2017052976A1 (en) A method and system of low-complexity histogram of gradients generation for image processing
CN110738609A (en) method and device for removing image moire
US11645734B2 (en) Circuitry for image demosaicing and contrast enhancement and image-processing method
CN104867111A (en) Block-blur-kernel-set-based heterogeneous video blind deblurring method
US8587705B2 (en) Hardware and software partitioned image processing pipeline
US9008421B2 (en) Image processing apparatus for performing color interpolation upon captured images and related method thereof
CN111429371A (en) Image processing method and device and terminal equipment
CN113052923B (en) Tone mapping method, tone mapping apparatus, electronic device, and storage medium
CN103974043B (en) Image processing device and image processing method
Jin et al. Color correction and local contrast enhancement for underwater image enhancement
Prakash et al. Color image demosaicing using sparse based radial basis function network
CN109584275B (en) Target tracking method, device, equipment and storage medium
CN111369435A (en) Color image depth up-sampling method and system based on self-adaptive stable model
Wang et al. A bilateral filtering based ringing elimination approach for motion-blurred restoration image
Hsu et al. A hybrid algorithm with artifact detection mechanism for region filling after object removal from a digital photograph
CN103077396B (en) The vector space Feature Points Extraction of a kind of coloured image and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant