CN114155166A - Interpolation method for image color restoration based on FPGA - Google Patents
- Publication number
- CN114155166A (application CN202111474556.0A)
- Authority
- CN
- China
- Prior art keywords
- color
- component
- restored
- green
- reconstructed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T5/77
- G06T3/4023: Decimation- or insertion-based scaling, e.g. pixel or line decimation
- G06T7/13: Edge detection
- G06T7/90: Determination of colour characteristics
- G06T2207/10068: Endoscopic image
Abstract
The invention relates to an interpolation method for restoring image colors based on an FPGA. After the edge information of the original BAYER image is obtained with an adaptive interpolation method, a new color restoration formula is established according to the actual working environment and the color correlation of each channel. During the reconstruction of the missing colors, each missing color component in the BAYER image is restored according to the gradient magnitude and direction of the non-reconstructed color components of the pixels surrounding the reconstructed pixel, and to the gradient relation of the non-reconstructed color components at the reconstructed pixel. The restored color component stays close to the side with the smaller gradient change along the edge direction and differs more strongly from the side with the larger gradient change, so details contained in the other colors are retained and enhanced. Where the color changes sharply, the colors restored by the new formula eliminate the false-color artifacts that traditional methods produce at image edges, and the image edges become clearer and sharper.
Description
Technical Field
The invention relates to an interpolation method for image color restoration based on an FPGA (field programmable gate array), belonging to the technical field of image processing.
Background
The data format output by current color CMOS image sensors is generally the BAYER format. Compared with the RGB data format, a BAYER image is missing two colors at each pixel position. To obtain a full-color image, interpolation is typically used to restore the two colors missing at each pixel. The restoration methods mainly used in industry today are linear interpolation and adaptive interpolation guided by image edge information. Linear interpolation restores a missing color by averaging the surrounding pixels of the same color, selected according to the color arrangement of the BAYER pattern. In 1997, Adams and Hamilton observed that the green pixels, twice as numerous as the red and blue pixels, carry more of the original image's edge information, and on that basis invented the adaptive interpolation method.
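The linear interpolation baseline described above can be sketched as follows. This is an illustrative Python version, not part of the patent; a nested loop stands in for the hardware line buffers, borders are skipped for brevity, and an RGGB layout is assumed:

```python
import numpy as np

def bilinear_green(bayer):
    """Restore the green channel of an RGGB Bayer mosaic by averaging
    the four green neighbours at each red/blue site (the linear
    interpolation baseline; border pixels are left untouched here)."""
    h, w = bayer.shape
    green = bayer.astype(np.float32).copy()
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            # In RGGB, green sits where (row + col) is odd.
            if (i + j) % 2 == 0:  # red or blue site: green is missing
                green[i, j] = (bayer[i - 1, j] + bayer[i + 1, j] +
                               bayer[i, j - 1] + bayer[i, j + 1]) / 4.0
    return green
```

Averaging ignores edge direction, which is exactly the source of the blurred and jagged edges the adaptive method was invented to fix.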
The basic principle of adaptive interpolation is as follows: in RGB color space, an edge detection operator best suited to the data gradients of the product's application scene is designed; the green component is then interpolated in the direction indicated by the operator's result; the red and blue components are reconstructed afterwards according to the edge information of the reconstructed green component, and the interpolation is completed with a corresponding algorithm. The adaptive interpolation algorithm effectively removes the blurred or jagged edges that the linear interpolation algorithm produces after normal imaging or zooming.
However, edge-oriented interpolation algorithms run into problems in certain specific scenes, among them the environment inside the human abdominal cavity. For example, interpolating colors at edges, especially the missing red and blue, easily causes serious color distortion. Moreover, inside the abdominal cavity the green component is very small compared with the red component, and much of the intra-abdominal detail resides in the red component, which makes the adaptive interpolation method unsuitable for endoscope systems.
Disclosure of Invention
Aiming at the shortcomings of the above technology, the invention provides an interpolation method for image color restoration based on an FPGA. After obtaining the edge information of the original BAYER image with an adaptive interpolation method, the method abandons the traditional edge-oriented interpolation of colors at edges. Instead, a new color restoration formula is established according to the actual working environment and the color correlation of each channel, and during reconstruction each missing color component in the BAYER image is restored according to the gradient magnitude and direction of the non-reconstructed color components of the pixels surrounding the reconstructed pixel and the gradient relation of the non-reconstructed color components at the reconstructed pixel.
To achieve the above object, the invention provides an interpolation method for BAYER data color restoration based on an FPGA. For convenience of description, the size of the BAYER image to be processed is defined as M×N and the data bit width as Z. The method comprises the following steps:
Step S1: construct the sliding-matrix sequence used when restoring the green component G of the BAYER image, specifically a matrix of (2k+1)×(2k+1), where k is a positive integer with 1 < k < M and 1 < k < N. The parity of the image row is used to distinguish the color at the restoration position. When the original BAYER color component at matrix position (i, j) is a non-green component, the missing green component G(i,j) must be restored, and the matrix data readout format divides into the following two cases:
① When the original BAYER color component at matrix position (i, j) is R(i,j) (the red component), the matrix readout format for restoring the missing green component G(i,j) is:
| | data(2k+1) | … | data(k+1) | … | data(1) |
| L(1) | R(i-k, j-k) | … | R(i-k, j) | … | R(i-k, j+k) |
| … | … | … | … | … | … |
| L(k+1) | R(i, j-k) | … | R(i, j) | … | R(i, j+k) |
| … | … | … | … | … | … |
| L(2k+1) | R(i+k, j-k) | … | R(i+k, j) | … | R(i+k, j+k) |
② When the original BAYER color component at matrix position (i, j) is B(i,j) (the blue component), the matrix readout format for restoring the missing green component G(i,j) is:

| | data(2k+1) | … | data(k+1) | … | data(1) |
| L(1) | B(i-k, j-k) | … | B(i-k, j) | … | B(i-k, j+k) |
| … | … | … | … | … | … |
| L(k+1) | B(i, j-k) | … | B(i, j) | … | B(i, j+k) |
| … | … | … | … | … | … |
| L(2k+1) | B(i+k, j-k) | … | B(i+k, j) | … | B(i+k, j+k) |
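The sliding matrix readout of step S1 can be sketched as follows. This is an illustrative Python version; the border-clamping behaviour is an assumption (a common FPGA line-buffer convention), since the patent does not specify border handling:

```python
import numpy as np

def sliding_window(image, i, j, k):
    """Read out the (2k+1) x (2k+1) neighbourhood centred on (i, j),
    mirroring the matrix readout of step S1. Indices are clamped at
    the image border (an assumed convention, not from the patent)."""
    h, w = image.shape
    rows = np.clip(np.arange(i - k, i + k + 1), 0, h - 1)
    cols = np.clip(np.arange(j - k, j + k + 1), 0, w - 1)
    return image[np.ix_(rows, cols)]
```

In hardware the same window is realised with 2k line buffers and a shift-register array, so one full window is available per pixel clock.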
Step S2: calculate the gradient magnitude in the horizontal or vertical direction at the position of the green component to be restored. The gradient edge formula at that position appears only as an image in the source. In it, m ∈ {1, 2, 3, …, k}, m is a positive integer with 2m-1 ≤ k, and an influence factor (likewise shown only as an image) weights the contribution of the surrounding other-color components to the reconstructed green component. f(x) is the pixel data at (i, j); when calculating the horizontal edge gradient, f(x ± 2m) and f(x ± 2m ± 1) denote the pixel data at coordinates shifted left/right from (i, j) by 2m or (2m ± 1), and when calculating the vertical edge gradient, the pixel data shifted up/down by the same amounts.
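Since the patent's operator survives only as an image, a Hamilton-Adams-style horizontal/vertical gradient, which matches the step S2 description in spirit, can be sketched as follows (the exact operator and its influence factors may differ):

```python
import numpy as np

def edge_gradients(f, i, j):
    """Horizontal and vertical edge gradients at (i, j): first
    difference of the neighbours one pixel away plus the second
    difference of the same-colour pixels two away. Illustrative
    Hamilton-Adams-style operator, not the patent's exact formula.
    int() casts guard against unsigned-integer wraparound."""
    dh = abs(int(f[i, j - 1]) - int(f[i, j + 1])) + \
         abs(2 * int(f[i, j]) - int(f[i, j - 2]) - int(f[i, j + 2]))
    dv = abs(int(f[i - 1, j]) - int(f[i + 1, j])) + \
         abs(2 * int(f[i, j]) - int(f[i - 2, j]) - int(f[i + 2, j]))
    return dh, dv
```

In the FPGA these sums reduce to a handful of adders and absolute-value units fed from the sliding window, one result per clock.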
Step S3: select the interpolation direction guided by the result of the gradient edge formula, and determine the green-component interpolation rule from the gradient magnitude and direction of the non-green components of the pixels surrounding the reconstructed pixel and the gradient relation of the non-green components at the reconstructed pixel. Specifically, green-component interpolation proceeds along the edge direction found at coordinate (i, j) by the gradient edge formula, with G(i,j) denoting the restored green component. When the green component is reconstructed along the horizontal direction, f(x) is the pixel data at (i, j), and f(x ± 2m) and f(x ± 2m ± 1) denote the pixel data at coordinates shifted left/right from (i, j) by 2m or (2m ± 1); when it is reconstructed along the vertical direction, they denote the pixel data shifted up/down by the same amounts. The interpolation rule follows a calculation formula that appears only as an image in the source.
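A directional green interpolation in the spirit of step S3 can be sketched as follows. This is illustrative only: the patent's rule (shown as an image) additionally weights the result toward the side with the smaller gradient change, which this plain average does not do:

```python
import numpy as np

def restore_green(f, i, j):
    """Restore green at a red/blue site (i, j): pick the direction
    with the smaller gradient and average the two adjacent green
    samples along it, corrected by half the second difference of the
    centre colour. Illustrative sketch, not the patent's formula."""
    c = float(f[i, j])
    dh = abs(float(f[i, j - 1]) - f[i, j + 1]) + abs(2 * c - f[i, j - 2] - f[i, j + 2])
    dv = abs(float(f[i - 1, j]) - f[i + 1, j]) + abs(2 * c - f[i - 2, j] - f[i + 2, j])
    if dh <= dv:  # edge runs horizontally: interpolate along the row
        return (f[i, j - 1] + f[i, j + 1]) / 2.0 + (2 * c - f[i, j - 2] - f[i, j + 2]) / 4.0
    return (f[i - 1, j] + f[i + 1, j]) / 2.0 + (2 * c - f[i - 2, j] - f[i + 2, j]) / 4.0
```

On a flat region both branches reduce to the neighbour average; on a directional edge the correction term pulls the estimate toward the centre colour's profile along that edge.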
Step S4: fuse the green component obtained in step S3 with the original BAYER image, changing the image data bit width to 3Z, and construct from the fused image data the sliding matrix required for restoring the red and blue components, specifically a matrix of (2h+1)×(2h+1), where h is a positive integer with 1 < h < M and 1 < h < N. The matrix data readout format divides into the following four cases:
① When matrix position (i, j) holds only a green component, the other color components in the vertical direction are red and those in the horizontal direction are blue; the matrix readout format (reconstructed here by symmetry with case ②, since the source shows it only as an image) is:

| | data(2h+1) | … | data(h+1) | … | data(1) |
| L(1) | G(i-h, j-h) | … | RG(i-h, j) | … | G(i-h, j+h) |
| … | … | … | … | … | … |
| L(h+1) | GB(i, j-h) | … | G(i, j) | … | GB(i, j+h) |
| … | … | … | … | … | … |
| L(2h+1) | G(i+h, j-h) | … | RG(i+h, j) | … | G(i+h, j+h) |
when the matrix (i, j) has only a green component, the other color components in the vertical direction are blue components, and the other color components in the horizontal direction are red components, the matrix data reading format is as follows:
| | data(2h+1) | … | data(h+1) | … | data(1) |
| L(1) | G(i-h, j-h) | … | GB(i-h, j) | … | G(i-h, j+h) |
| … | … | … | … | … | … |
| L(h+1) | RG(i, j-h) | … | G(i, j) | … | RG(i, j+h) |
| … | … | … | … | … | … |
| L(2h+1) | G(i+h, j-h) | … | GB(i+h, j) | … | G(i+h, j+h) |
③ When matrix position (i, j) holds green and blue components, the matrix readout format is:

| | data(2h+1) | … | data(h+1) | … | data(1) |
| L(1) | RG(i-h, j-h) | … | G(i-h, j) | … | RG(i-h, j+h) |
| … | … | … | … | … | … |
| L(h+1) | G(i, j-h) | … | GB(i, j) | … | G(i, j+h) |
| … | … | … | … | … | … |
| L(2h+1) | RG(i+h, j-h) | … | G(i+h, j) | … | RG(i+h, j+h) |
④ When matrix position (i, j) holds green and red components, the matrix readout format is:

| | data(2h+1) | … | data(h+1) | … | data(1) |
| L(1) | GB(i-h, j-h) | … | G(i-h, j) | … | GB(i-h, j+h) |
| … | … | … | … | … | … |
| L(h+1) | G(i, j-h) | … | RG(i, j) | … | G(i, j+h) |
| … | … | … | … | … | … |
| L(2h+1) | GB(i+h, j-h) | … | G(i+h, j) | … | GB(i+h, j+h) |
Step S5: restore the red component R(i,j) and the blue component B(i,j) at the positions where the original BAYER image has only the green component G(i,j). The matrix readout formats are the first and second formats of step S4, and the interpolation rule follows the calculation formulas below:
(1) When the red component is restored at matrix position (i, j), with R(i,j) denoting the reconstructed red component, f(x) the red component at (i, j) and g(x) the green component at (i, j), the reconstruction formula is as follows:
① and ② The two case conditions, which compare the green gradients on either side of (i, j), and their corresponding reconstruction formulas appear only as images in the source; in each case the red component is interpolated toward the side with the smaller gradient change.
(2) When the blue component is restored at matrix position (i, j), with B(i,j) denoting the reconstructed blue component, f(x) the blue component at (i, j) and g(x) the green component at (i, j), the reconstruction formula is as follows:
③ and ④ The corresponding case conditions and reconstruction formulas for the blue component likewise appear only as images in the source.
Here p ∈ {1, 2, 3, …, h}, p is a positive integer with 2p-1 ≤ h, and an influence factor (shown only as an image) weights the contribution of the surrounding green component to the reconstructed color component. When the color is restored along the horizontal direction, f(x) is the component of the same color as the restored color at (i, j), and f(x ± 2p) and f(x ± 2p ± 1) denote the same-color components at coordinates shifted left/right from (i, j) by 2p or (2p ± 1); when the color is restored along the vertical direction, they denote the same-color components shifted up/down by the same amounts. Likewise, g(x) is the green component at (i, j); when restoring along the horizontal direction, g(x ± 2p) and g(x ± 2p ± 1) denote the green components shifted left/right from (i, j) by 2p or (2p ± 1), and when restoring along the vertical direction, the green components shifted up/down by the same amounts.
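The idea of step S5, interpolating a missing red or blue value at a green site from the same-color neighbours while correcting with the already-restored green plane, can be sketched with colour-difference interpolation. This is an illustration of the principle, not the patent's exact image-only formulas, and it omits the gradient-dependent case split:

```python
import numpy as np

def restore_at_green(C, G, i, j, horizontal):
    """Restore a red or blue value at a green site (i, j) from the two
    same-colour neighbours along the direction where that colour
    exists, interpolating the colour difference against the restored
    green plane G. Illustrative sketch of the step S5 principle."""
    if horizontal:
        d = (C[i, j - 1] - G[i, j - 1]) + (C[i, j + 1] - G[i, j + 1])
    else:
        d = (C[i - 1, j] - G[i - 1, j]) + (C[i + 1, j] - G[i + 1, j])
    return G[i, j] + d / 2.0
```

Interpolating the difference rather than the raw samples exploits the inter-channel correlation the description relies on: where red and green track each other, the reconstruction inherits green's edge detail.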
Step S6: restore the blue component B(i,j) at the positions where the original BAYER image has only the red component R(i,j), and the red component R(i,j) at the positions where it has only the blue component B(i,j). The matrix readout formats are the third and fourth formats of step S4, and the following calculation formula (shown only as an image in the source) is used:
Here h > m > 0 and m is a positive integer. Δf45°(i, j) and Δf135°(i, j) are the edge-gradient information along the 45° and 135° directions, respectively, and two influence factors (shown only as images) weight the surrounding other-color components along 45° and 135° on the reconstructed color component. f(i, j) is the color component to be restored at (i, j); f(i+2m-1, j-2m+1) and f(i-2m+1, j+2m-1) denote the components of the same color as the restored one at coordinates offset from (i, j) along the 45° direction by (2m ± 1), and the corresponding terms along the 135° direction denote the same-color components offset by (2m ± 1) along that diagonal. g(i, j) is the green component at (i, j); g(i+2m-1, j-2m+1) and g(i-2m+1, j+2m-1) denote the green components offset along the 45° direction by 2m or (2m ± 1), and the corresponding terms along the 135° direction the green components offset by (2m ± 1) along that diagonal.
Step S7: judge the direction of the image gradient at the restoration position from the 45° and 135° gradient results and reconstruct the color along the edge direction, with data(i,j) denoting the last missing color at each position in the image. The color component at that position is substituted into the formula according to the color category to be reconstructed, and the corresponding color component is calculated. The reconstruction formulas are as follows:
① through ④ The four cases (two for reconstruction along the 45° direction and two along the 135° direction), their conditions comparing the diagonal gradients, and their reconstruction formulas appear only as images in the source; in each case the color is reconstructed along the diagonal with the smaller gradient, biased toward the side with the smaller gradient change.
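The diagonal reconstruction of steps S6 and S7 can be sketched as follows. Illustrative only: the patent's conditions and formulas are shown only as images, and its per-side weighting is replaced here by a plain average along the flatter diagonal:

```python
import numpy as np

def restore_diagonal(C, G, i, j):
    """Restore the remaining colour at a red/blue site (i, j) from its
    four diagonal neighbours: compare the 45-degree and 135-degree
    gradients and average the colour differences (against the restored
    green plane G) along the flatter diagonal. Sketch of steps S6-S7."""
    d45 = abs(C[i - 1, j + 1] - C[i + 1, j - 1])    # gradient along 45 degrees
    d135 = abs(C[i - 1, j - 1] - C[i + 1, j + 1])   # gradient along 135 degrees
    if d45 <= d135:
        d = (C[i - 1, j + 1] - G[i - 1, j + 1]) + (C[i + 1, j - 1] - G[i + 1, j - 1])
    else:
        d = (C[i - 1, j - 1] - G[i - 1, j - 1]) + (C[i + 1, j + 1] - G[i + 1, j + 1])
    return G[i, j] + d / 2.0
```

At a red site C holds the surrounding blue samples (and vice versa), which sit on the diagonals of the Bayer pattern, hence the 45°/135° comparison instead of horizontal/vertical.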
Step S8: apply a data-threshold constraint to the interpolated color components. Specifically, when the value of one or more of the RGB components (green, red, blue) exceeds the threshold 2^Z - 1, it is set to 2^Z - 1; values below the threshold keep their original value. Data fusion is then performed and the RGB image data are output.
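The step S8 clamp and fusion can be sketched as follows. The R-in-top-bits packing order is an assumption; the patent only states the fused width 3Z:

```python
import numpy as np

def clamp_and_fuse(r, g, b, z=12):
    """Step S8 threshold constraint: clamp each restored component to
    the sensor range [0, 2**z - 1], then pack the three planes into
    one 3*z-bit RGB word per pixel (packing order assumed)."""
    top = (1 << z) - 1
    r = np.clip(r, 0, top).astype(np.uint64)
    g = np.clip(g, 0, top).astype(np.uint64)
    b = np.clip(b, 0, top).astype(np.uint64)
    return (r << (2 * z)) | (g << z) | b
```

The clamp matters because the edge-correction terms in the restoration formulas can push an interpolated value outside the sensor's numeric range.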
The invention ensures that the restored color component stays close to the side with the smaller gradient change along the edge direction and differs more strongly from the side with the larger gradient change, so that details contained in the other colors are retained and enhanced. Where the color changes sharply, the colors restored by interpolation with the new color restoration formula eliminate the false-color problem that traditional methods exhibit at image edges, while making the edges clearer and sharper. Applied to an endoscope camera system, the method can restore richer detail and can be implemented as a real-time pipeline in an FPGA.
Drawings
Fig. 1 is a flowchart of an interpolation method for color reduction of a BAYER image based on an FPGA according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments.
The present embodiment uses BAYER data collected by a medical electronic endoscope with a high-definition CMOS image sensor chip. The data are 12-bit-wide 1920×1080 high-definition images transmitted in HDMI external-synchronization mode, with the original BAYER data arranged in RGGB order. The data output by the front-end sensor are sent over a SERDES transmission protocol into the FPGA chip for image processing, where interpolation restoration begins. This embodiment follows exactly the same steps as the summary of the invention; to avoid repetition, only the key data are listed:
the image size MXN has M1920, N1080, and Z12. Specifically, when green component reconstruction is performed, the original BAYER data needs to pass through step S1, step S2, step S3, where k is 2 and m is 1, the size of the matrix of (2k +1) X (2k +1) is selected as a 5X5 matrix, the values of k and m are substituted into an edge detection operator formula, and according to an edge calculation result, pixel coordinates in the matrix along an edge direction are selected for interpolation, so that all green components missing in an image are finally restored. In the method, in the process of reconstructing and restoring the green component, the height directions of other colors in the pixel coordinates along the edge direction and the gradient relation of the corresponding colors at the reconstructed pixels are judged, and then a special color restoration formula is used, so that the reconstructed green component is close to the side with small gradient change and has a larger difference with the side with large gradient change, and the details contained in the other colors are reserved and enhanced. The green component at the reconstruction and reduction position has less sawtooth effect at the obvious edge than bilinear interpolation, and the reconstructed green component has richer and clearer details in the edge direction while avoiding the large edge compared with the common self-adaptive interpolation method.
In step S4 the green component obtained is image-fused with the original BAYER image; the bit width of the image data to be restored becomes 36 bits, and the not-yet-reconstructed color components are filled with 0. In steps S4 and S5, h = 1, p = 1 and m = 1, and a 3×3 matrix processing window is built from the fused data. At each color restoration point, using the red or blue components present at the surrounding coordinates, the respective color restoration formulas are applied according to the gradient directions and magnitudes of the green component at the surrounding coordinates and at the reconstructed pixel, restoring the red and blue components at the green sampling points of the original BAYER image. The missing red and blue components at the green positions are thus obtained. The reconstructed red or blue component stays close to the side where the surrounding color gradient changes little and differs from the side where it changes strongly, which makes the detail presented by the reconstructed red and blue components richer and clearer.
In steps S6 and S7, after step S3 is completed each red- or blue-component pixel is missing only one color. Here h = 1 and m = 1. The values of h and m are substituted into the 45° and 135° edge detection operator formulas, and according to the edge calculation result the colors at the pixel coordinates along the edge direction within the 3×3 matrix are selected for interpolation, finally restoring the missing red and blue components. During the reconstruction of the missing component, the method additionally uses the gradient direction of the green component at the pixel coordinates along the edge direction and its gradient relation to the green component at the reconstructed pixel, and applies the dedicated color restoration formula, so that the red and blue components reconstructed in this step stay close to the side with the smaller gradient change and differ more strongly from the side with the larger gradient change.
In step S8 the data-threshold constraint is applied to the interpolated color components: when the value of one or more of the RGB components (green, red, blue) exceeds the threshold 4095 (2^12 - 1), it is clamped to 4095; otherwise the original value is kept. Data fusion is then performed, and finally high-definition 1080p RGB image data with a 36-bit data width are output.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.
Claims (4)
1. An interpolation method for image color restoration based on an FPGA, characterized by comprising the following steps:
Step S1: construct the sliding-matrix sequence for restoring the green component G of the BAYER image, specifically a matrix of (2k+1)×(2k+1), where k is a positive integer with 1 < k < M and 1 < k < N. The parity of the image row is used to distinguish the color at the restoration position. When the original BAYER color component at matrix position (i, j) is a non-green component, the missing green component G(i,j) must be restored, and the matrix data readout format divides into the following two cases:
① When the original BAYER color component at matrix position (i, j) is R(i,j) (the red component), the matrix readout format for restoring the missing green component G(i,j) is:
② When the original BAYER color component at matrix position (i, j) is B(i,j) (the blue component), the matrix readout format for restoring the missing green component G(i,j) is:
;
Step S2: calculate the gradient magnitude in the horizontal or vertical direction at the position of the green component to be restored; the gradient edge formula at that position appears only as an image in the source. In it, m ∈ {1, 2, 3, …, k}, m is a positive integer with 2m-1 ≤ k, and an influence factor (shown only as an image) weights the surrounding other-color components on the reconstructed green component. f(x) is the pixel data at (i, j); when calculating the horizontal edge gradient, f(x ± 2m) and f(x ± 2m ± 1) denote the pixel data at coordinates shifted left/right from (i, j) by 2m or (2m ± 1), and when calculating the vertical edge gradient, the pixel data shifted up/down by the same amounts;
Step S3: select the interpolation direction guided by the result of the gradient edge formula, and determine the green-component interpolation rule from the gradient magnitude and direction of the non-green components of the pixels surrounding the reconstructed pixel and the gradient relation of the non-green components at the reconstructed pixel. Specifically, green-component interpolation proceeds along the edge direction found at coordinate (i, j) by the gradient edge formula, with G(i,j) denoting the restored green component. When the green component is reconstructed along the horizontal direction, f(x) is the pixel data at (i, j), and f(x ± 2m) and f(x ± 2m ± 1) denote the pixel data at coordinates shifted left/right from (i, j) by 2m or (2m ± 1); when it is reconstructed along the vertical direction, they denote the pixel data shifted up/down by the same amounts. The interpolation rule follows a calculation formula that appears only as an image in the source:
Step S4: fuse the green component obtained in step S3 with the original BAYER image, changing the image data bit width to 3Z, and construct from the fused image data the sliding matrix required for restoring the red and blue components, specifically a matrix of (2h+1)×(2h+1), where h is a positive integer with 1 < h < M and 1 < h < N. The matrix data readout format divides into the following four cases:
when the matrix (i, j) has only a green component, the other color components in the vertical direction are red components, and the other color components in the horizontal direction are blue components, the matrix data reading format is:
when the matrix (i, j) has only a green component, the other color components in the vertical direction are blue components, and the other color components in the horizontal direction are red components, the matrix data reading format is as follows:
when there are green and blue components at the matrix (i, j) position, the matrix data reading format is:
when the green component and the red component exist in the position of the matrix (i, j), the matrix data reading format is as follows:
;
Step S5: restore the red component R(i,j) and the blue component B(i,j) at the positions where the original BAYER image has only the green component G(i,j). The matrix readout formats are the first and second formats of step S4, and the interpolation rule follows the following calculation formula:
(1) When the red component is restored at matrix position (i, j), with R(i,j) denoting the reconstructed red component, f(x) the red component at (i, j) and g(x) the green component at (i, j), the reconstruction formula is as follows:
(2) When the blue component is restored at matrix position (i, j), with B(i,j) denoting the reconstructed blue component, f(x) the blue component at (i, j) and g(x) the green component at (i, j), the reconstruction formula is as follows:
Here p ∈ {1, 2, 3, …, h}, p is a positive integer with 2p-1 ≤ h, and an influence factor (shown only as an image) weights the surrounding green component on the reconstructed color component. When the color is restored along the horizontal direction, f(x) is the component of the same color as the restored color at (i, j), and f(x ± 2p) and f(x ± 2p ± 1) denote the same-color components at coordinates shifted left/right from (i, j) by 2p or (2p ± 1); when the color is restored along the vertical direction, they denote the same-color components shifted up/down by the same amounts. g(x) is the green component at (i, j); when restoring along the horizontal direction, g(x ± 2p) and g(x ± 2p ± 1) denote the green components shifted left/right from (i, j) by 2p or (2p ± 1), and when restoring along the vertical direction, the green components shifted up/down by the same amounts;
Step S6: restore the blue component B(i,j) at the positions where the original BAYER image has only the red component R(i,j), and the red component R(i,j) at the positions where it has only the blue component B(i,j). The matrix readout formats are the third and fourth formats of step S4, and the following calculation formula (shown only as an image in the source) is used:
wherein h > m > 0, m is a positive integer, Δf_45°(i, j) and Δf_135°(i, j) are the edge gradient information in the 45° and 135° directions respectively, and the two corresponding weights are the influence factors of the other surrounding color components in the 45° and 135° directions on the reconstructed color component; f(i, j) is the color component to be restored at (i, j), and f(i + 2m − 1, j − 2m + 1) and f(i − 2m + 1, j + 2m − 1) respectively represent color components of the same color as the restored component at coordinates shifted from (i, j) in the 45° direction by (2m ± 1);
f(i + 2m − 1, j + 2m − 1) and f(i − 2m + 1, j − 2m + 1) represent color components of the same color as the restored component at coordinates shifted from (i, j) in the 135° direction by (2m ± 1); g(i, j) is the green component at (i, j); g(i + 2m − 1, j − 2m + 1) and g(i − 2m + 1, j + 2m − 1) represent green components at coordinates shifted from (i, j) in the 45° direction by (2m ± 1); and g(i + 2m − 1, j + 2m − 1) and g(i − 2m + 1, j − 2m + 1) represent green components at coordinates shifted from (i, j) in the 135° direction by (2m ± 1);
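The diagonal neighbour sums of step S6 can be illustrated as follows. This sketch averages the same-colour samples at offsets (2m − 1) along each diagonal of a (2k + 1) × (2k + 1) window with uniform weights; the patented influence factors for the 45° and 135° terms appear only in the formula images and are therefore replaced by this uniform-weight assumption.

```python
def diagonal_means(win):
    """Average the diagonal neighbours of the window centre at offsets
    d = 2m - 1 for m = 1, 2, ... while d stays inside the window.
    Returns (mean along 45°, mean along 135°). Uniform weights assumed."""
    k = len(win) // 2          # window centre index
    s45 = s135 = 0.0
    n = 0
    m = 1
    while 2 * m - 1 <= k:
        d = 2 * m - 1
        s45 += win[k + d][k - d] + win[k - d][k + d]    # 45° diagonal pair
        s135 += win[k + d][k + d] + win[k - d][k - d]   # 135° diagonal pair
        n += 2
        m += 1
    return s45 / n, s135 / n
```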
step S7: according to the gradient results in the 45° and 135° directions, the direction of the image gradient at the restoration position is judged, and color reconstruction is performed along the edge direction; data_(i,j) represents the last two missing colors at that position in the image, and the corresponding color component is calculated by substituting the color components at that position into the formula according to the color category to be reconstructed; the reconstruction formulas are as follows:
① when the color is reconstructed along the 45° direction and the first pair of gradient conditions holds;
② when the color is reconstructed along the 45° direction and the second pair of gradient conditions holds;
③ when the color is reconstructed along the 135° direction and the first pair of gradient conditions holds;
④ when the color is reconstructed along the 135° direction and the second pair of gradient conditions holds;
(the condition inequalities and reconstruction formulas of cases ①–④ appear as images in the source and are not recoverable from the text)
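The direction decision of step S7 can be sketched as: compare the two diagonal gradients and interpolate along the direction with the smaller gradient. This is a standard edge-directed interpolation sketch (m = 1, no green correction term), not the patented case formulas, which are lost with the formula images.

```python
def reconstruct_missing(win):
    """Reconstruct the missing colour at the centre of a square window by
    averaging along whichever diagonal shows the smaller gradient."""
    k = len(win) // 2
    g45 = abs(win[k - 1][k + 1] - win[k + 1][k - 1])    # 45° gradient
    g135 = abs(win[k - 1][k - 1] - win[k + 1][k + 1])   # 135° gradient
    if g45 < g135:        # edge runs along 45°: average along it
        return (win[k - 1][k + 1] + win[k + 1][k - 1]) / 2.0
    if g135 < g45:        # edge runs along 135°
        return (win[k - 1][k - 1] + win[k + 1][k + 1]) / 2.0
    # flat region: average all four diagonal neighbours
    return (win[k - 1][k + 1] + win[k + 1][k - 1]
            + win[k - 1][k - 1] + win[k + 1][k + 1]) / 4.0
```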
step S8: data threshold constraint is performed on the interpolated color components; specifically, when one or more of the green, red and blue component values of RGB is greater than the threshold 2^z − 1, the value 2^z − 1 is taken; when a value is at or below the threshold 2^z − 1, the original value is kept; data fusion is then performed and the RGB image data are output;
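The threshold constraint of step S8 amounts to clamping each component to the representable range before fusion. A minimal sketch, assuming bit depth z = 8 (the source does not fix z):

```python
def clamp_and_fuse(r, g, b, z=8):
    """Constrain each reconstructed component to at most 2**z - 1
    (z = bit depth, assumed 8 here), then fuse into one RGB pixel."""
    limit = (1 << z) - 1
    clamp = lambda v: limit if v > limit else v
    return (clamp(r), clamp(g), clamp(b))
```

In an FPGA pipeline this is a single comparator and multiplexer per channel, applied in the same clock cycle as the fusion.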
The invention ensures that a restored color component stays close to the side with small gradient change along the edge direction and differs more from the side with large gradient change, so that the details contained in the other colors are preserved and enhanced. At positions where the color changes sharply, the color restored by interpolation with the new color restoration formula eliminates the false-color problem at image edges found in conventional methods, while making image edges clearer and sharper. Applied to an endoscope camera system, the method can restore richer detail and achieve real-time pipelined processing in an FPGA.
2. The interpolation method for FPGA-based image color restoration according to claim 1, wherein in step S3 the interpolation for restoring the green component follows the calculation formula:
wherein the matrix size used is (2k + 1) × (2k + 1), k is a positive integer, 1 < k < M and 1 < k < N, m ∈ {1, 2, 3, ..., k}, m is a positive integer and 2m − 1 ≤ k, δ is the influence factor of the other peripheral color components on the reconstructed green component, and G_(i,j) represents the restored green component. When the green component is reconstructed in the horizontal direction, f(x) is the pixel data at (i, j), and f(x ± 2m) and f(x ± 2m ± 1) represent pixel data at coordinates shifted left and right of (i, j) by 2m or (2m ± 1); when the green component is reconstructed in the vertical direction, f(x) is the pixel data at (i, j), and f(x ± 2m) and f(x ± 2m ± 1) represent pixel data at coordinates shifted up and down from (i, j) by 2m or (2m ± 1).
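The green restoration of claim 2 is a gradient-corrected average along the interpolation direction. A one-dimensional Python sketch with m = 1 and an assumed influence factor `delta` (the actual δ and formula are given only as images in the source): `row` holds the Bayer samples of one line, with green at odd offsets from the red/blue position `j` and same-colour samples at even offsets.

```python
def restore_green(row, j, delta=0.25):
    """Restore the green component at red/blue position j along one row.

    Sketch only: mean of the two nearest green neighbours (j-1, j+1)
    plus a second-difference correction from the same-colour samples
    (j, j-2, j+2), weighted by the assumed influence factor delta.
    """
    base = (row[j - 1] + row[j + 1]) / 2.0
    correction = delta * (2 * row[j] - row[j - 2] - row[j + 2])
    return base + correction
```

Running the same function over a column gives the vertical variant; claim 1's gradient test then decides which of the two estimates is kept.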
3. The interpolation method for FPGA-based image color restoration according to claim 1, wherein in step S3 the restoration formula for restoring the red component R_(i,j) and the blue component B_(i,j) only at the G_(i,j) green component positions of the original BAYER image is:
(1) when the red component is restored at the matrix position (i, j), where R_(i,j) represents the reconstructed red component, f(x) is the red component at (i, j), and g(x) is the green component at (i, j), the reconstruction formula is as follows:
(2) when the blue component is restored at the matrix position (i, j), where B_(i,j) represents the reconstructed blue component, f(x) is the blue component at (i, j), and g(x) is the green component at (i, j), the reconstruction formula is as follows:
where p ∈ {1, 2, 3, ..., h}, p is a positive integer and 2p − 1 ≤ h, and the weight in the formula is the influence factor of the peripheral green components on the reconstructed color component. When the color is restored in the horizontal direction, f(x) is the color component of the same color as the restored color at (i, j), and f(x ± 2p) and f(x ± 2p ± 1) represent color components of the same color as the restored color at coordinates shifted left and right of (i, j) by 2p or (2p ± 1); when the color is restored in the vertical direction, f(x) is the pixel data at (i, j), and f(x ± 2p) and f(x ± 2p ± 1) represent color components of the same color as the restored color at coordinates shifted up and down from (i, j) by 2p or (2p ± 1). Likewise, when the color is restored in the horizontal direction, g(x) is the green component at (i, j), and g(x ± 2p) and g(x ± 2p ± 1) represent green components at coordinates shifted left and right of (i, j) by 2p or (2p ± 1); when the color is restored in the vertical direction, g(x) is the green component at (i, j), and g(x ± 2p) and g(x ± 2p ± 1) represent green components at coordinates shifted up and down from (i, j) by 2p or (2p ± 1).
4. The interpolation method for FPGA-based image color restoration according to claim 1, wherein in step S7, for the original BAYER image, the blue component B_(i,j) is restored only at the R_(i,j) red component positions, or the red component R_(i,j) is restored only at the B_(i,j) blue component positions, according to the calculation formula:
the direction of the image gradient at the restoration position is judged according to the gradient results in the 45° and 135° directions, and color reconstruction is performed along the edge direction; data_(i,j) represents the last two missing colors at that position in the image, and the corresponding color component is calculated by substituting the color components at that position into the formula according to the color category to be reconstructed; the reconstruction formulas are as follows:
① when the color is reconstructed along the 45° direction and the first pair of gradient conditions holds;
② when the color is reconstructed along the 45° direction and the second pair of gradient conditions holds;
③ when the color is reconstructed along the 135° direction and the first pair of gradient conditions holds;
④ when the color is reconstructed along the 135° direction and the second pair of gradient conditions holds;
(the condition inequalities and reconstruction formulas of cases ①–④ appear as images in the source and are not recoverable from the text)
where p ∈ {1, 2, 3, ..., h}, p is a positive integer and 2p − 1 ≤ h, and the weight in the formula is the influence factor of the peripheral green components on the reconstructed color component. When the color is restored in the horizontal direction, f(x) is the color component of the same color as the restored color at (i, j), and f(x ± 2p) and f(x ± 2p ± 1) represent color components of the same color as the restored color at coordinates shifted left and right of (i, j) by 2p or (2p ± 1); when the color is restored in the vertical direction, f(x) is the pixel data at (i, j), and f(x ± 2p) and f(x ± 2p ± 1) represent color components of the same color as the restored color at coordinates shifted up and down from (i, j) by 2p or (2p ± 1). Likewise, when the color is restored in the horizontal direction, g(x) is the green component at (i, j), and g(x ± 2p) and g(x ± 2p ± 1) represent green components at coordinates shifted left and right of (i, j) by 2p or (2p ± 1); when the color is restored in the vertical direction, g(x) is the green component at (i, j), and g(x ± 2p) and g(x ± 2p ± 1) represent green components at coordinates shifted up and down from (i, j) by 2p or (2p ± 1).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111474556.0A CN114155166A (en) | 2021-12-06 | 2021-12-06 | Interpolation method for image color restoration based on FPGA |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111474556.0A CN114155166A (en) | 2021-12-06 | 2021-12-06 | Interpolation method for image color restoration based on FPGA |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114155166A true CN114155166A (en) | 2022-03-08 |
Family
ID=80452497
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111474556.0A Pending CN114155166A (en) | 2021-12-06 | 2021-12-06 | Interpolation method for image color restoration based on FPGA |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114155166A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116977826A (en) * | 2023-08-14 | 2023-10-31 | 北京航空航天大学 | Reconfigurable neural network target detection system and method under edge computing architecture |
CN116977826B (en) * | 2023-08-14 | 2024-03-22 | 北京航空航天大学 | Reconfigurable neural network target detection method under edge computing architecture |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105517671B (en) | Video frame interpolation method and system based on optical flow method | |
JP6094863B2 (en) | Image processing apparatus, image processing method, program, integrated circuit | |
TWI387935B (en) | Image generation method, program therefor, and storage medium for storing the program | |
CN107895345B (en) | Method and device for improving resolution of face image | |
TW201044863A (en) | Image processing method, image processing apparatus, and recording medium | |
CN110557584B (en) | Image processing method and device, and computer readable storage medium | |
CN104376568B (en) | Method for processing DICOM (digital imaging and communications in medicine) medical images on basis of formats | |
US8520099B2 (en) | Imaging apparatus, integrated circuit, and image processing method | |
CN104580933A (en) | Multi-scale real-time monitoring video stitching device based on feature points and multi-scale real-time monitoring video stitching method | |
CN108734668B (en) | Image color recovery method and device, computer readable storage medium and terminal | |
US20110080463A1 (en) | Image processing apparatus, method, and recording medium | |
CN107680043A (en) | Single image super-resolution output intent based on graph model | |
CN110852953A (en) | Image interpolation method and device, storage medium, image signal processor and terminal | |
CN114155166A (en) | Interpolation method for image color restoration based on FPGA | |
CN107292865B (en) | Three-dimensional display method based on two-dimensional image processing | |
CN104680484B (en) | A kind of method and device of image enhaucament | |
CN116542889A (en) | Panoramic video enhancement method with stable view point | |
WO2011074121A1 (en) | Device and method for detecting motion vector | |
CN112308773A (en) | Unmanned aerial vehicle aerial image nondestructive amplification and splicing fusion method | |
CN101976558B (en) | Method and device for scaling video images | |
Chung et al. | New joint demosaicing and arbitrary-ratio resizing algorithm for color filter array based on DCT approach | |
CN116309033A (en) | Super-resolution image generation method, device and storage medium | |
CN107610070A (en) | Free stereo matching process based on three shooting collections | |
JP2009207001A (en) | Image resolution improving apparatus, and learning device and method | |
CN104506784A (en) | Bell format image broken line eliminating method based on directional interpolation correction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||