CN116503259B - Mosaic interpolation method and system - Google Patents

Mosaic interpolation method and system

Info

Publication number: CN116503259B
Application number: CN202310764929.0A
Authority: CN (China)
Other versions: CN116503259A (Chinese)
Prior art keywords: gradient, green component, interpolation result, component interpolation, calculating
Legal status: Active (granted)
Inventors: 王军, 张智涵, 范益波, 殷海兵, 朱旭东
Assignee: Zhejiang Xinmai Microelectronics Co ltd
Application filed by Zhejiang Xinmai Microelectronics Co ltd


Classifications

    • G06T3/4015 — Image demosaicing, e.g. colour filter arrays [CFA] or Bayer patterns
    • G06F17/18 — Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G06T3/4023 — Scaling of whole images or parts thereof based on decimating or inserting pixels or lines of pixels
    • G06T7/90 — Determination of colour characteristics
    • Y02T10/40 — Engine management systems


Abstract

The application discloses a mosaic interpolation method and system relating to image data processing. The method comprises the following steps: traversing a Bayer-format image with an n-by-n window; calculating gradient values in the up, down, left and right directions of the window's center point, and from these the gradient values in the horizontal and vertical directions; calculating the green component interpolation results A1 and A2 of the center point in the horizontal and vertical directions from the gradient values in the up, down, left and right directions of the center point; weighting A1 and A2 according to the horizontal and vertical gradient values to obtain the missing green component interpolation result B of the center point; and calculating the red component interpolation result and the blue component interpolation result from the green component interpolation result B of the current pixel point and the color difference components of the current pixel point. The application provides a demosaicing algorithm based on gradient-value fusion, which determines the gradient values present in the image through an accurate and efficient gradient detection process.

Description

Mosaic interpolation method and system
Technical Field
The present application relates to image data processing technology, and in particular, to a mosaic interpolation method and system.
Background
An image is an important carrier of visual information with extremely rich content. The most common way to acquire an image is with a camera, in which an image sensor converts incident light into an electrical signal and an image processor generates a suitable digital image. To reduce imaging cost, the sensor imaging process uses a color filter array (CFA) to filter the incident light. The most common CFA is the Bayer pattern; an image filtered through it is called a Raw image, in which each position contains only one of the R, G, B color components, corresponding to a grayscale imaging effect. Because the human eye is sensitive to color, a demosaicing algorithm is required to restore the Bayer image to an RGB true-color image in order to improve its visual effect. The demosaicing algorithm is a core algorithm that determines the imaging quality of a camera, and efficient, low-complexity demosaicing has long been a research hotspot in the industry. Traditional demosaicing algorithms are often too simple and mostly ignore edge information, so the interpolation quality is poor and the result suffers from false-color and zipper artifacts. The principles of representative interpolation algorithms are outlined below:
Bilinear interpolation is the simplest demosaicing method: it uses only the color information in the neighborhood of the interpolation center and recovers the missing color component of the center by averaging. Bilinear interpolation has low computational cost and a simple principle, but the uniform averaging weights and the neglect of edge information make the interpolated image prone to false-color and zipper artifacts in high-frequency edge regions.
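As a concrete illustration of the bilinear approach described above, the following Python sketch recovers the missing green component at a red/blue site of an RGGB Bayer pattern by plain averaging; the function name and the toy data are ours, not from the patent:

```python
import numpy as np

def bilinear_green_at_rb(raw, y, x):
    """Recover the missing green value at a red/blue site of a Bayer
    (RGGB) raw image by averaging the four green neighbours.

    A minimal sketch of classic bilinear demosaicing; border handling
    and the red/blue planes are omitted for brevity.
    """
    return (raw[y - 1, x] + raw[y + 1, x]
            + raw[y, x - 1] + raw[y, x + 1]) / 4.0

# Tiny example: the four green neighbours of the centre red pixel.
raw = np.array([[0.0, 10.0, 0.0],
                [20.0, 0.0, 30.0],
                [0.0, 40.0, 0.0]])
g = bilinear_green_at_rb(raw, 1, 1)  # (10 + 20 + 30 + 40) / 4 = 25.0
```

This uniform averaging is exactly what blurs edges: a sharp transition among the four neighbours is smeared into their mean, producing the zipper artifacts noted above.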
High-quality linear interpolation adds consideration of the correlation between color planes on top of bilinear interpolation. For the different interpolation cases of a Bayer image, the optimal linear-interpolation weights are fitted over a large image data set, yielding color recovery quality superior to bilinear interpolation without greatly increasing the computational cost. However, it does not use the gradient values in the image and therefore cannot distinguish edges from flat areas, resulting in large interpolation errors near edges.
The HA (Hamilton-Adams) interpolation algorithm is a classical content-adaptive interpolation algorithm that adds detection and use of image edge information on top of linear interpolation. It first calculates the gradient values in each direction around the center to be interpolated, selects the direction with the smallest gradient to interpolate the missing green component, and then calculates the missing red and blue components. By using the gradient values in the image, the HA algorithm achieves a good edge recovery effect, but its edge detection is too simple and the interpolation errors are not refined afterwards, so its visual effect has defects.
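The HA-style direction selection described above can be sketched as follows; this is a simplified textbook rendering of gradient-guided green interpolation under our own variable names, not the patent's formula:

```python
import numpy as np

def ha_green(raw, y, x):
    """Hamilton-Adams style green interpolation at a red/blue site:
    pick the direction (horizontal or vertical) with the smaller
    gradient, combining first-order green differences with
    second-order differences of the centre colour.
    A simplified sketch, not the patented method."""
    dh = abs(raw[y, x-1] - raw[y, x+1]) + abs(2*raw[y, x] - raw[y, x-2] - raw[y, x+2])
    dv = abs(raw[y-1, x] - raw[y+1, x]) + abs(2*raw[y, x] - raw[y-2, x] - raw[y+2, x])
    if dh < dv:   # horizontal is smoother: interpolate along the row
        return (raw[y, x-1] + raw[y, x+1]) / 2.0 + (2*raw[y, x] - raw[y, x-2] - raw[y, x+2]) / 4.0
    if dv < dh:   # vertical is smoother: interpolate along the column
        return (raw[y-1, x] + raw[y+1, x]) / 2.0 + (2*raw[y, x] - raw[y-2, x] - raw[y+2, x]) / 4.0
    return (raw[y, x-1] + raw[y, x+1] + raw[y-1, x] + raw[y+1, x]) / 4.0

# 5x5 window: smooth along the row, a strong edge down the column.
raw = np.zeros((5, 5))
raw[2, :] = [10.0, 12.0, 14.0, 16.0, 18.0]
raw[:, 2] = [50.0, 30.0, 14.0, 5.0, 0.0]
g = ha_green(raw, 2, 2)  # horizontal wins: (12+16)/2 + (28-10-18)/4 = 14.0
```

The sketch also shows the weakness the text points out: each gradient uses a single row or column, so a one-pixel-wide edge that runs just beside the center line can go undetected.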
Disclosure of Invention
Aiming at the defects in the prior art, the application provides a mosaic interpolation method.
In order to solve the technical problems, the application is solved by the following technical scheme:
a mosaic interpolation method comprising the steps of:
receiving a Bayer format image;
traversing the Bayer format image by using an n-by-n window, wherein n is an odd number greater than or equal to 5;
calculating gradient values in the up, down, left and right directions of the n x n window center points, and calculating gradient values in the horizontal direction and the vertical direction;
calculating a green component interpolation result A1 in the horizontal direction and a green component interpolation result A2 in the vertical direction of the central point according to gradient values in the up, down, left and right directions of the central point;
according to gradient values in the horizontal direction and the vertical direction, weighting a green component interpolation result A1 in the horizontal direction and a green component interpolation result A2 in the vertical direction of the central point to obtain a green component interpolation result B with a missing central point;
after the calculation of the green component interpolation results B of all the center points is completed according to the steps, the red component interpolation results and the blue component interpolation results are calculated according to the green component interpolation results B of the current pixel point and the color difference components of the current pixel point.
As a preferred solution, the method for calculating the gradient values in the up, down, left and right directions of the n×n window center point includes:
the gradient value in the left direction: within the window, the gradient is calculated from the pixels of the center row to the left of the center point, together with the pixels of at least one row extended above and below it; the terms calculated from the center row are given extra weight;
the gradient value in the right direction: within the window, the gradient is calculated from the pixels of the center row to the right of the center point, together with the pixels of at least one row extended above and below it; the terms calculated from the center row are given extra weight;
the gradient value in the upward direction: within the window, the gradient is calculated from the pixels of the center column above the center point, together with the pixels of at least one column extended to the left and right; the terms calculated from the center column are given extra weight;
the gradient value in the downward direction: within the window, the gradient is calculated from the pixels of the center column below the center point, together with the pixels of at least one column extended to the left and right; the terms calculated from the center column are given extra weight.
As a preferred embodiment, the method for calculating the gradient values in the horizontal direction and the vertical direction includes:
the gradient numerical calculation in the horizontal direction includes: calculating the sum of gradient values in the left direction and the right direction;
the gradient numerical calculation in the vertical direction includes: calculating the sum of the gradient values in the up and down directions.
As a preferred embodiment, the method for calculating the green component interpolation result A1 in the horizontal direction and the green component interpolation result A2 in the vertical direction of the center point from the gradient values in the up, down, left and right directions of the center point includes:
comparing the relative magnitudes of the gradient values in the left and right directions: if the gradient value in one direction exceeds a first preset parameter times the gradient value in the other direction, the interpolation from the smaller-gradient direction is taken as the green component interpolation result A1 in the horizontal direction; if the two gradient values differ by less than the first preset parameter times, the HA interpolation algorithm is used;
comparing the relative magnitudes of the gradient values in the up and down directions: if the gradient value in one direction exceeds a first preset parameter times the gradient value in the other direction, the interpolation from the smaller-gradient direction is taken as the green component interpolation result A2 in the vertical direction; if the two gradient values differ by less than the first preset parameter times, the HA interpolation algorithm is used.
As a preferred solution, the method for weighting the green component interpolation result A1 in the horizontal direction and the result A2 in the vertical direction of the center point according to the gradient values in the horizontal and vertical directions, to obtain the missing green component interpolation result B of the center point, includes:
comparing the relative magnitudes of the gradient values in the horizontal and vertical directions of the center point: if the gradient value in one direction is larger than a second preset parameter times the gradient value in the other direction, the green component interpolation result of the smaller-gradient direction is taken as the missing green component interpolation result B of the center point;
comparing the relative magnitudes of the gradient values in the horizontal and vertical directions of the center point: if the gradient value in one direction is larger than a third preset parameter times the gradient value in the other direction, the green component interpolation results A1 and A2 of the two directions are weighted to give the missing green component interpolation result B; the weighting reduces the weight of the interpolation result from the larger-gradient direction, i.e. emphasizes the interpolation result from the smaller-gradient direction;
wherein the second preset parameter is greater than the third preset parameter.
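A minimal sketch of the two-threshold fusion rule described above; the threshold values t2 and t3 and the 0.25/0.75 blend weights are illustrative assumptions, since the excerpt does not disclose the preset parameters:

```python
def fuse_green(a1, a2, grad_h, grad_v, t2=4.0, t3=2.0):
    """Fuse the horizontal (a1) and vertical (a2) green estimates.

    t2 > t3 stand in for the patent's "second" and "third preset
    parameters"; their actual values are not disclosed in this
    excerpt, so the numbers here are illustrative only.
    """
    if grad_h > t2 * grad_v:
        return a2                      # horizontal gradient dominates: trust vertical
    if grad_v > t2 * grad_h:
        return a1
    if grad_h > t3 * grad_v:
        return 0.25 * a1 + 0.75 * a2   # down-weight the larger-gradient direction
    if grad_v > t3 * grad_h:
        return 0.75 * a1 + 0.25 * a2
    return 0.5 * (a1 + a2)             # gradients comparable: plain average

b = fuse_green(a1=100.0, a2=80.0, grad_h=10.0, grad_v=1.0)  # vertical wins outright
```

Note the ordering: the hard cut (t2) is tested before the soft blend (t3), matching the requirement that the second preset parameter is greater than the third.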
As a preferred solution, the method for calculating the red component interpolation result and the blue component interpolation result from the green component interpolation result B of the current pixel point and the color difference component of the current pixel point includes:
red and blue component interpolation at a green sampling position (in a row containing red components): the color difference component of the green sampling position, computed after the green components have been recovered, is subtracted from the original green value at that position.
As a preferred solution, the method for calculating the red component interpolation result and the blue component interpolation result from the green component interpolation result B of the current pixel point and the color difference component of the current pixel point includes:
blue component calculation at a red sampling position: the color difference component of the red sampling position, computed after the green component has been recovered, is subtracted from the green component interpolation result B at that position;
red component calculation at a blue sampling position: the color difference component of the blue sampling position, computed after the green component has been recovered, is subtracted from the green component interpolation result B at that position.
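The color-difference recovery of red/blue can be sketched as follows, assuming the usual constant-color-difference model with the four diagonal neighbours; the function name and data layout are our assumptions, not the patent's:

```python
import numpy as np

def recover_blue_at_red(green, raw, y, x):
    """Recover blue at a red sampling site from the colour-difference
    plane: average the (B - G) differences at the four diagonal blue
    neighbours, then add the interpolated green at the centre.

    `green` is the fully interpolated green plane (result B in the
    text); a sketch of constant-colour-difference interpolation.
    """
    diffs = [raw[y + dy, x + dx] - green[y + dy, x + dx]
             for dy in (-1, 1) for dx in (-1, 1)]
    return green[y, x] + sum(diffs) / 4.0

# Flat green plane; blue samples sit on the diagonals of the red site.
green = np.full((3, 3), 10.0)
raw = np.array([[14.0, 0.0, 16.0],
                [0.0, 0.0, 0.0],
                [12.0, 0.0, 18.0]])
b = recover_blue_at_red(green, raw, 1, 1)  # 10 + (4+6+2+8)/4 = 15.0
```

Red at a blue sampling site follows symmetrically, which is why the text treats the two cases as mirror images.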
As a preferable scheme, the mosaic interpolation method also comprises an interpolation optimization method, which first optimizes the green component interpolation result and then calculates the red component and blue component interpolation results from the optimized green result.
Optimizing the green component interpolation results includes: optimizing the green component interpolation at the original red and blue sampling positions;
calculating the color difference values of the up, down, left and right points around the interpolated window center point, and the color difference value of the center point itself;
calculating the color difference gradients between the center point and the up, down, left and right points from those color difference values;
weighting the four color difference gradients, linearly combining the color difference values of the up, down, left and right points using the weighted gradients, and superposing the original color component to obtain the optimized green component interpolation result.
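A sketch of the refinement step described above; the inverse-gradient weighting 1/(1+g) is an illustrative choice, since the excerpt specifies only that the four color-difference gradients are weighted and linearly combined:

```python
def refine_green(diff_c, diff_n, orig):
    """Refine the green estimate at an original red/blue site.

    diff_c: colour difference at the centre after interpolation;
    diff_n: colour differences at the up/down/left/right neighbours;
    orig:   the original colour component at the site.

    Neighbours whose colour difference is close to the centre's get
    larger weights; the 1/(1+grad) weighting is illustrative, not the
    patent's exact formula.
    """
    grads = [abs(diff_c - d) for d in diff_n]          # four gradients
    weights = [1.0 / (1.0 + g) for g in grads]
    total = sum(weights)
    refined_diff = sum(w * d for w, d in zip(weights, diff_n)) / total
    return orig + refined_diff   # superpose the original colour component

# In a perfectly flat region the refinement reproduces orig + diff.
g_opt = refine_green(diff_c=5.0, diff_n=[5.0, 5.0, 5.0, 5.0], orig=100.0)
```

The design intent is that neighbours across an edge (large color-difference gradient) contribute little, so the refinement suppresses false color without blurring the edge.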
Based on the above manner, a mosaic interpolation system is provided, which comprises the following structures:
an image receiving module: for receiving Bayer format images;
the window module is used for traversing the Bayer format image by using an n-by-n window, wherein n is an odd number greater than or equal to 5;
the gradient calculation module is used for calculating gradient values in the upper, lower, left and right directions of the central point of the n-n window and calculating gradient values in the horizontal direction and the vertical direction;
the first green component recovery module is used for calculating a green component interpolation result A1 in the horizontal direction and a green component interpolation result A2 in the vertical direction of the central point according to gradient values in the up, down, left and right directions of the central point;
the green component recovery module II is used for weighting the green component interpolation result A1 in the horizontal direction and the green component interpolation result A2 in the vertical direction of the central point according to the gradient values in the horizontal direction and the vertical direction to obtain a green component interpolation result B with the missing central point;
and the red and blue component recovery module is used for calculating a red component interpolation result and a blue component interpolation result according to the green component interpolation result B of the current pixel point and the color difference component of the current pixel point.
As a preferred embodiment, the present application further comprises the following structure:
the effect improving module is used for optimizing the interpolation of the red component of the original sampling position and the green component of the blue sampling position, is connected behind the red component recovery module and returns an output result to the red component recovery module;
the functions of the effect promotion module include: calculating the color difference values of the up, down, left and right points around the center point, and the color difference value of the center point itself;
calculating the color difference gradients between the center point and the up, down, left and right points from those color difference values;
weighting the four color difference gradients, linearly combining the color difference values of the up, down, left and right points using the weighted gradients, and superposing the original color component to obtain the optimized green component interpolation result.
The application has the beneficial effects that:
the principle of the classical demosaicing algorithm is analyzed, the gradient values in the image are reasonably utilized to greatly improve the edge retaining effect after interpolation, and the post-processing effect of the image after interpolation is improved to reduce the visual phenomenon distortion. The application provides a demosaicing algorithm based on gradient value fusion, which is characterized in that gradient values existing in an image are determined through an accurate and efficient edge gradient detection process, missing green information in a Bayer format image is calculated, and missing blue components are recovered through calculation of the green information and color difference information.
All calculations in the application are limited in n x n windows, such as 5*5 window, and the number of hardware cache lines is small; secondly, the calculation process does not involve complex calculations such as division, evolution, reciprocal and the like, so that the delay on a key path is reduced; meanwhile, the difference method does not need iteration and prior parameter calculation, and reduces the consumption of operation resources. The method has lower calculation cost, is suitable for hardware realization, and can meet the high image precision and real-time processing requirements of modern application.
Interpolation calculation in other directions, such as interpolation in the diagonal direction, is deleted, and the scheme is simpler only for interpolation in the horizontal direction and the vertical direction of the calculation center point. Left and right/up and down are calculated separately, and pixels located at edge endpoints are accurately detected.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the application, and that other drawings can be obtained according to these drawings without inventive faculty for a person skilled in the art.
FIG. 1 is a flow chart of a mosaic interpolation method;
FIG. 2 is a schematic view of a window in a mosaic interpolation method;
fig. 3 is a window schematic diagram of gradient numerical calculation in the mosaic interpolation method.
Detailed Description
The present application will be described in further detail with reference to the following examples, which are illustrative of the present application and are not intended to limit the present application thereto.
Example 1
A mosaic interpolation method, as in fig. 1, comprising the steps of:
1) Receiving a Bayer format image;
2) Traversing the Bayer format image by using an n-by-n window, wherein n is an odd number greater than or equal to 5;
3) Calculating gradient values of n x n window center points in the up, down, left and right directions, and calculating gradient values in the horizontal direction and the vertical direction;
4) Calculating a green component interpolation result A1 in the horizontal direction and a green component interpolation result A2 in the vertical direction of the central point according to gradient values in the up, down, left and right directions of the central point;
5) According to gradient values in the horizontal direction and the vertical direction, weighting a green component interpolation result A1 in the horizontal direction and a green component interpolation result A2 in the vertical direction of the central point to obtain a green component interpolation result B with a missing central point;
6) After the green component interpolation results B of all the center points have been calculated according to steps 2)-5), the red component interpolation result and the blue component interpolation result are calculated from the green component interpolation result B of the current pixel point and the color difference components of the current pixel point.
As a preferred embodiment, the method for calculating the gradient values in the up, down, left and right directions of the n-by-n window center point includes:
The calculation of the gradient value in the left direction, Grad_HL, includes: within the window, the gradient is calculated from the pixels of the center row to the left of the center point, together with the pixels of at least one row extended above and below it; the terms calculated from the center row are given extra weight.
The calculation of the gradient value in the right direction, Grad_HR, includes: within the window, the gradient is calculated from the pixels of the center row to the right of the center point, together with the pixels of at least one row extended above and below it; the terms calculated from the center row are given extra weight.
The calculation of the gradient value in the upward direction, Grad_VT, includes: within the window, the gradient is calculated from the pixels of the center column above the center point, together with the pixels of at least one column extended to the left and right; the terms calculated from the center column are given extra weight.
The calculation of the gradient value in the downward direction, Grad_VD, includes: within the window, the gradient is calculated from the pixels of the center column below the center point, together with the pixels of at least one column extended to the left and right; the terms calculated from the center column are given extra weight.
In this embodiment, a 5x5 window is taken as an example. Fig. 2 shows a set of window pixel data, and the calculation formulas below use this data to illustrate the method disclosed above.
First, the 5x5 window slides over and traverses the Bayer-format image, and the gradient data of the center point of each window is calculated in the four directions up, down, left and right. The gradient values in the up, down, left and right directions of the center point are denoted Grad_VT, Grad_VD, Grad_HL and Grad_HR respectively. The gradient calculation in each direction is based on adjacent same-color components, i.e. on selected same-color pixel pairs: for example, R20 and R22, or G21 and G23, are same-color pixel pairs, while R20 and G21 are a different-color pixel pair.
The detailed calculation formulas are as follows, where R, G, B denote the color components originally known at the sampling positions of the Bayer-format image (R red, G green, B blue); the calculation areas used are shown in Figs. 2 and 3.
Here 2 is a weight coefficient: row b, to the left of the center point, serves as the main calculation data and is given increased weight, while rows a and c, extended one row above and below it, also contribute to the gradient value in the left direction; the right direction is handled in the same way.
Likewise, 2 is a weight coefficient: column e, above the center point, serves as the main calculation data and is given increased weight, while columns d and f, extended one column to the left and right of it, also contribute to the gradient value in the upward direction; the downward direction is handled in the same way.
If the calculation used only the pixels of the row (or column) containing the center point, an edge of width 1 could not be detected in the extreme case, producing an erroneous result. By extending the calculation to the neighbouring rows and columns, the scheme detects such edges effectively; the gradient calculation is more careful and accurate, the detection range is larger, and the weights are reasonable.
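The extended-row gradient calculation described above might be sketched as follows; because the patent's exact formulas appear only in its figures, the choice of same-color pixel pairs and the weight of 2 on the center row are an illustrative reconstruction, not the disclosed formula:

```python
import numpy as np

def grad_left(win):
    """Left-direction gradient for a 5x5 window (centre at [2, 2]).

    Sums absolute differences of horizontally adjacent same-colour
    pixel pairs (stride 2) on the left half of the window; the centre
    row carries weight 2, the rows one above and below carry weight 1.
    An illustrative reconstruction of the patent's figure-only formula.
    """
    total = 0.0
    for row, w in ((1, 1), (2, 2), (3, 1)):  # row index, weight
        total += w * (abs(win[row, 0] - win[row, 2])
                      + abs(win[row, 1] - win[row, 3]))
    return total

# On a linear ramp each same-colour pair differs by 2, so the
# weighted sum is (1 + 2 + 1) * (2 + 2) = 16.
win = np.arange(25, dtype=float).reshape(5, 5)
gl = grad_left(win)
```

Grad_HR, Grad_VT and Grad_VD follow the same pattern mirrored right, up and down; because each uses three rows (or columns), a one-pixel-wide edge beside the center line still registers in the sum.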
The method for calculating the gradient values in the horizontal direction and the vertical direction comprises the following steps:
the gradient value in the horizontal direction is the sum of the gradient values in the left and right directions;
the gradient value in the vertical direction is the sum of the gradient values in the up and down directions. Denoting the horizontal gradient value Grad_H and the vertical gradient value Grad_V, the calculation formulas are:
Grad_H = Grad_HL + Grad_HR
Grad_V = Grad_VT + Grad_VD
as a preferred embodiment, the method for calculating the green component interpolation result A1 in the horizontal direction and the green component interpolation result A2 in the vertical direction of the center point according to the gradient values in the respective directions of the center point includes:
comparing the relative magnitudes of the gradient values in the left and right directions: if the gradient value in one direction exceeds a first preset parameter times the gradient value in the other direction, the interpolation from the smaller-gradient direction is taken as the green component interpolation result A1 in the horizontal direction; if the two gradient values differ by less than the first preset parameter times, the HA interpolation algorithm is used;
comparing the relative magnitudes of the gradient values in the up and down directions: if the gradient value in one direction exceeds a first preset parameter times the gradient value in the other direction, the interpolation from the smaller-gradient direction is taken as the green component interpolation result A2 in the vertical direction; if the two gradient values differ by less than the first preset parameter times, the HA interpolation algorithm is used.
The green component interpolation result A1 of the center point in the horizontal direction and the green component interpolation result A2 in the vertical direction are calculated and denoted Intp_H and Intp_V, respectively. In the calculation of Intp_H and Intp_V, the result is determined according to the gradient data of each direction: interpolation information from the gradient direction with the smaller difference is fused, and interpolation information from the gradient direction with the larger difference is discarded. The specific calculation formula is as follows, where the intermediate variable P_ij in the formula (P may take the value R, G or B) corresponds to the positions shown in figure 2,
referring to the above formula and taking the horizontal direction as an example: if the gradient data in the left direction exceeds 3 times the gradient data in the right direction, the left side of the center point is judged to be an edge with intense texture change, so the left-side value is discarded and only the value in the right direction is used as the horizontal interpolation result; the remaining cases follow by symmetry;
and if the gradient difference between the left side and the right side of the center point is small, the HA interpolation algorithm is adopted (i.e., the final else branch).
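A sketch of the per-axis green estimate follows, assuming the first preset parameter is 3 (the value quoted in the text) and taking the classic Hamilton-Adams (HA) formula for the fallback branch, which the patent names but does not spell out here:

```python
def ha_green_1d(g_m, g_p, c_m2, c, c_p2):
    """Classic Hamilton-Adams 1-D green estimate at a red/blue site:
    average of the two green neighbours plus a second-order chroma correction."""
    return (g_m + g_p) / 2.0 + (2 * c - c_m2 - c_p2) / 4.0

def green_dir(g_m, g_p, grad_m, grad_p, c_m2, c, c_p2, t=3.0):
    """Green estimate along one axis (yields Intp_H or Intp_V).

    g_m/g_p: green neighbours on the minus/plus side of the center;
    grad_m/grad_p: gradients toward those sides; c_m2, c, c_p2: same-chroma
    samples two steps apart for the HA correction; t: first preset parameter.
    """
    if grad_m > t * grad_p:
        return g_p      # minus side crosses a sharp edge: discard its value
    if grad_p > t * grad_m:
        return g_m      # plus side crosses a sharp edge: discard its value
    return ha_green_1d(g_m, g_p, c_m2, c, c_p2)  # gradients comparable: HA
```

Calling it once with horizontal neighbours and once with vertical neighbours gives the two candidates that the next step fuses.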
As a preferred solution, the method for obtaining the green component interpolation result B with missing center point after weighting the green component interpolation result A1 in the horizontal direction and the green component interpolation result A2 in the vertical direction of the center point according to the gradient values in the horizontal direction and the vertical direction includes:
comparing the relative magnitudes of the gradient values in the horizontal direction and the vertical direction of the center point: if the gradient value in one direction is more than a second preset parameter times the gradient value in the other direction, the green component interpolation result of the smaller-gradient direction is taken as the green component interpolation result B of the missing center point;
comparing the relative magnitudes of the gradient values in the horizontal direction and the vertical direction of the center point: if the gradient value in one direction exceeds the gradient value in the other direction by more than a third preset parameter times, the green component interpolation results A1 and A2 of the two directions are weighted to obtain the green component interpolation result B of the missing center point, wherein the weighted calculation reduces the weight of the green component interpolation result in the larger-gradient direction, or equivalently increases the weight of the green component interpolation result in the smaller-gradient direction;
wherein the second preset parameter is greater than the third preset parameter.
The method fuses the green interpolation result A1 in the horizontal direction with the green interpolation result A2 in the vertical direction; with reference to FIG. 2, the specific formula for the calculation process over the group of window data shown in FIG. 2 is as follows:
where g22 is the final calculated green component interpolation result at the R22 sampling position.
It can be understood that the final interpolation result is a weighted combination of the green component interpolation result A1 in the horizontal direction and the green component interpolation result A2 in the vertical direction. By analysing the gradient values in the horizontal and vertical directions, data that may lie on image edges are identified; the scheme assumes that the larger the gradient, the more likely an image edge is present, and therefore the smaller the weight given to that direction. The coefficients 3 and 1.5 are experimentally verified optimal values; other optima may exist as the analysed data set is enlarged. The purpose of the weighting is to adjust the fusion weights according to how much the horizontal and vertical gradients differ.
Gradient fusion in the horizontal and vertical directions adopts a stepwise fitting mode: the final interpolation result is fused according to the gradient difference between the two directions, which improves interpolation accuracy.
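The two-threshold fusion described above can be sketched as follows, under stated assumptions: 3 and 1.5 are the second and third preset parameters quoted in the text, while the 1:3 blend weights in the middle branches are hypothetical, since the patent's exact weighting formula is not reproduced here.

```python
def fuse_green(intp_h, intp_v, grad_h, grad_v, t_hard=3.0, t_soft=1.5):
    """Fuse horizontal/vertical green estimates by gradient magnitude.

    Larger gradient => more likely an edge => smaller weight for that
    direction's estimate. The 0.25/0.75 weights are assumptions.
    """
    if grad_h > t_hard * grad_v:
        return intp_v                           # horizontal gradient dominates: drop H
    if grad_v > t_hard * grad_h:
        return intp_h                           # vertical gradient dominates: drop V
    if grad_h > t_soft * grad_v:
        return 0.25 * intp_h + 0.75 * intp_v    # de-weight the larger-gradient direction
    if grad_v > t_soft * grad_h:
        return 0.75 * intp_h + 0.25 * intp_v
    return 0.5 * (intp_h + intp_v)              # gradients comparable: plain average
```

The hard branch implements the "discard entirely" rule and the soft branch the stepwise weighting; with comparable gradients the two candidates are simply averaged.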
After all green component interpolation results are completed as described above, calculating the missing red components and blue components in the Bayer format image based on the green component interpolation results includes:
blue component calculation at a red sampling position: subtracting the color difference component of the red sampling position from the green component interpolation result B of the red sampling position;
red component calculation at a blue sampling position: subtracting the color difference component of the blue sampling position from the green component interpolation result B of the blue sampling position.
By way of example, the blue component calculation process for a red sampling position is described with reference to position (2, 2) in fig. 2; the calculation formula is as follows,
In the above formula, b22 and g22 denote the blue component interpolation result and the green component interpolation result at position (2, 2), respectively; g11, g13, g31 and g33 denote the green component interpolation results at the corresponding positions; and B11, B13, B31 and B33 denote the original blue components at the corresponding positions.
The calculation formula of the interpolation result of the red component of the blue sampling position is as follows:
in the above formula, r13 and g13 represent the red component interpolation result and the green component interpolation result at position (1, 3), respectively; g02, g04, g22 and g24 represent the green component interpolation results at the corresponding positions; and R02, R04, R22 and R24 represent the original red components at the corresponding positions.
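Both reconstructions above apply the same color-difference rule, so one hedged sketch covers them; the helper name is ours, not the patent's.

```python
def interp_by_color_difference(g_c, nbr_g, nbr_c):
    """Missing chroma at a red or blue site: center green minus the mean
    (G - C) color difference over the four diagonal same-chroma neighbours.

    Covers blue-at-red (e.g. b22 from g11..g33 and B11..B33) and, with red
    neighbours, red-at-blue (e.g. r13) as described in the text.
    """
    cd = sum(g - c for g, c in zip(nbr_g, nbr_c)) / len(nbr_c)
    return g_c - cd
```

For the b22 example, `g_c` is g22, `nbr_g` is (g11, g13, g31, g33) and `nbr_c` is (B11, B13, B31, B33).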
The red component and the blue component of the green sampling position are divided into two cases of red component same line and blue component same line.
First, the red component interpolation and blue component interpolation calculation at green sampling positions whose row contains red components: the color difference component of the green sampling position is subtracted from the original chromaticity of the green sampling position. With reference to the data of fig. 2, the calculation formula is as follows:
in the formula, R21 represents the red component interpolation calculation result at the position (2, 1), G20 and G22 represent the green component interpolation result at the corresponding position, R20 and R22 represent the original red component at the corresponding position, B11 and B31 represent the original blue component at the corresponding position, and G21 represents the original green component at the corresponding position.
Second, the red component interpolation and blue component interpolation calculation at green sampling positions whose row contains blue components is obtained in the same way: the color difference component of the green sampling position is subtracted from the original chromaticity of the green sampling position. With reference to the data of fig. 2, the calculation formula is as follows:
in the formula, R12 represents the red component interpolation calculation result at the position (1, 2), G02, G22 represent the green component interpolation result at the corresponding position, R02, R22 represent the original red component at the corresponding position, B11, B13 represent the original blue component at the corresponding position, and G12 represents the original green component at the corresponding position.
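A minimal sketch of the green-site rule above: the mean color difference along the axis that carries the wanted chroma is subtracted from the original green value. The helper below is an assumed simplification; the patent's full formula (which, per the variable lists, also draws on the cross-axis samples such as B11 and B31) is elided from this text.

```python
def chroma_at_green(g_c, g_m, g_p, c_m, c_p):
    """Missing chroma at a green site from the two same-chroma neighbours
    along one axis (e.g. R21 from G20/G22 and R20/R22, or the vertical
    blue case from G11/G31 and B11/B31)."""
    cd = ((g_m - c_m) + (g_p - c_p)) / 2.0  # mean G-C color difference
    return g_c - cd                          # original green minus color difference
```

For R21 in fig. 2, `g_c` is the original G21, `g_m`/`g_p` are the interpolated G20/G22, and `c_m`/`c_p` are the original R20/R22.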
Example 2:
on the basis of embodiment 1, the mosaic interpolation method further comprises an image restoration effect optimization method, applied after the interpolation results of all the chrominance components have been calculated, which comprises the following steps:
optimizing the green component interpolation results includes: optimizing the green component interpolation at the red sampling positions and blue sampling positions of the original sampling positions; and calculating the red component interpolation result and the blue component interpolation result according to the green component interpolation result after green optimization.
After the interpolation results of all chromaticities are completed according to embodiment 1, calculating the color difference values of the four points above, below, left and right of the window center point, and the color difference value of the center point;
calculating the color difference gradient of the center point and the upper, lower, left and right points according to the color difference values of the upper, lower, left and right points and the color difference value of the center point;
weighting the four color difference gradients, linearly combining the color difference values of the points above, below, left and right by using the weighted color difference gradients, and superposing the original color component to obtain an optimized green component interpolation result.
The interpolated green component is optimized first, and only the non-green original sampling positions (i.e. the red sampling positions and the blue sampling positions) are optimized. The calculation process is as follows, referring to the window data disclosed in fig. 2, where cdf_1 to cdf_4 respectively represent the color difference value calculation results of the four points above, below, left and right of the center, cdf_c is the color difference calculation result of the interpolated center point, diff_1 to diff_4 respectively represent the absolute values of the differences between cdf_1 to cdf_4 and cdf_c, weight_1 to weight_4 respectively represent the weight magnitudes in the four directions, and g22' is the refined green interpolation result of the center point.
The basic premise of this embodiment is that the Bayer format image has already been preliminarily interpolated, so that each position contains one original chromaticity and two interpolated chromaticities. According to the chromaticity type of the 5*5 window center point, the color differences of the four-neighborhood positions above, below, left and right of the center point and the color difference of the current interpolated center are calculated; the color difference gradients between the center point and the four directions are then computed, and the color differences of the four directions are linearly combined following the same logic as in the interpolation process, using reciprocal weights so that directions with larger color difference gradients receive smaller weights; finally the combined color difference is superposed on the known chromaticity of the center point to obtain the refined result. The optimization principle for the green component is this reciprocal-weight fusion of post-interpolation color differences, which efficiently removes interpolation false colors and improves image quality.
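The reciprocal-weight refinement of embodiment 2 can be sketched as below; the epsilon guard and the exact blend normalization are assumptions, since the patent's formulas for cdf, diff and weight are elided from this text.

```python
def refine_green(c_orig, cdf_nbrs, cdf_c, eps=1e-6):
    """Post-interpolation green refinement at a red/blue site (embodiment 2).

    c_orig: original (known) chroma of the center point; cdf_nbrs: color
    differences cdf_1..cdf_4 of the up/down/left/right neighbours; cdf_c:
    color difference of the interpolated center. Directions whose gradient
    diff_i = |cdf_i - cdf_c| is large get small (reciprocal) weights; the
    blended color difference is added back to c_orig to give g22'.
    """
    diffs = [abs(cdf - cdf_c) for cdf in cdf_nbrs]
    weights = [1.0 / (d + eps) for d in diffs]   # reciprocal weighting
    total = sum(weights)
    cd = sum(w * cdf for w, cdf in zip(weights, cdf_nbrs)) / total
    return c_orig + cd
```

A direction whose color difference deviates strongly from the center's thus contributes almost nothing, which is what suppresses false colors across edges.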
For the n×n window, calculations with n = 3, 5 and 7 as variables show that n = 5 yields the best edge recognition and recovery; better results from applying the core idea disclosed above with other window sizes of 5 or more are not excluded, so this scheme protects n as an odd value of at least 5.
Example 3:
based on the above disclosed method or any combination of methods, the present embodiment discloses a mosaic interpolation system, as shown in fig. 1, including the following structures:
an image receiving module: for receiving Bayer format images;
the window module is used for traversing the Bayer format image by using an n-by-n window, wherein n is an odd number of at least 5;
the gradient calculation module is used for calculating gradient values in the upper, lower, left and right directions of the central point of the n-n window and calculating gradient values in the horizontal direction and the vertical direction;
the first green component recovery module is used for calculating a green component interpolation result A1 in the horizontal direction and a green component interpolation result A2 in the vertical direction of the central point according to gradient values in the up, down, left and right directions of the central point;
the green component recovery module II is used for weighting the green component interpolation result A1 in the horizontal direction and the green component interpolation result A2 in the vertical direction of the central point according to the gradient values in the horizontal direction and the vertical direction to obtain a green component interpolation result B with the missing central point;
and the red and blue component recovery module is used for calculating a red component interpolation result and a blue component interpolation result according to the green component interpolation result B of the current pixel point and the color difference component of the current pixel point.
Based on the optimization method set forth in embodiment 2, the system further comprises the following structure:
the effect improving module is used for optimizing the green component interpolation at the original red sampling positions and blue sampling positions; it is connected after the red and blue component recovery module and returns its output result to the red and blue component recovery module;
the functions of the effect promotion module include: calculating the color difference values of the upper, lower, left and right points of the center point and the color difference value of the center point;
calculating the color difference gradient of the center point and the upper, lower, left and right points according to the color difference values of the upper, lower, left and right points and the color difference value of the center point;
weighting the four color difference gradients, linearly combining the color difference values of the points above, below, left and right by using the weighted color difference gradients, and superposing the original color component to obtain an optimized green component interpolation result.
On this basis there are other sub-modules, whose functions can be found in the disclosures of embodiment 1 and embodiment 2 and are not repeated here.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and the division of the modules or units, for example, is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed.
The modules may or may not be physically separate; they may be located in one place or distributed over a plurality of different places. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in the embodiments of the present application may be integrated in one processing unit, or each module may exist alone physically, or two or more modules may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules may be stored in a readable storage medium if implemented in the form of software functional units and sold or used as a stand-alone product. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the method described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto, but any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A mosaic interpolation method, comprising the steps of:
receiving a Bayer format image; traversing the Bayer format image by using an n-by-n window, wherein n is an odd number of at least 5;
calculating gradient values of n x n window center points in the up, down, left and right directions, and calculating gradient values in the horizontal direction and the vertical direction;
calculating a green component interpolation result A1 in the horizontal direction and a green component interpolation result A2 in the vertical direction of the center point according to gradient values in the up, down, left and right directions of the center point, wherein the method specifically comprises the following steps:
comparing the relative magnitudes of the gradient values in the left direction and the right direction, and, if the gradient value in one direction exceeds a first preset parameter times the gradient value in the other direction, taking the interpolation value of the direction with the smaller gradient value as the green component interpolation result A1 in the horizontal direction;
comparing the relative magnitudes of the gradient values in the upper direction and the lower direction, and, if the gradient value in one direction exceeds a first preset parameter times the gradient value in the other direction, taking the interpolation value of the direction with the smaller gradient value as the green component interpolation result A2 in the vertical direction;
according to gradient values in the horizontal direction and the vertical direction, weighting a green component interpolation result A1 in the horizontal direction and a green component interpolation result A2 in the vertical direction of the central point to obtain a green component interpolation result B with a missing central point;
after the calculation of the green component interpolation results B of all the center points is completed according to the steps, the red component interpolation results and the blue component interpolation results are calculated according to the green component interpolation results B of the current pixel point and the color difference components of the current pixel point.
2. The mosaic interpolation method according to claim 1, wherein the calculating method for calculating gradient values in up, down, left and right directions of the n x n window center point comprises:
the gradient numerical calculation in the left direction includes: in the window range, calculating a gradient value according to the pixel points on the left line of the central point and the data of the pixel points on the left line of the central point, which extend more than one line up and down, and weighting the parameters calculated by the pixel points on the left line of the central point;
the gradient numerical calculation in the right direction includes: in the window range, calculating a gradient value according to the pixel points on the right line of the central point and the data of the pixel points on the right line of the central point, which extend more than one line up and down respectively, and adding the weight of the parameter calculated by the pixel points on the right line of the central point;
the calculation of the gradient value in the upward direction includes: in the window range, calculating a gradient value according to the pixel points listed on the central point and the data of more than one column of pixel points extending left and right of the pixel points listed on the central point, and weighting the parameters calculated by the pixel points listed on the central point;
the gradient numerical calculation in the lower direction includes: and in the window range, calculating a gradient value according to the pixel points below the center point and the data of more than one row of pixel points extending to the left and right of the pixel points below the center point, and weighting the parameters calculated by the pixel points below the center point.
3. A mosaic interpolation method according to claim 1 or 2, wherein said method for calculating gradient values in the horizontal and vertical directions comprises:
the gradient numerical calculation in the horizontal direction includes: calculating the sum of gradient values in the left direction and the right direction;
the gradient numerical calculation in the vertical direction includes: calculating the sum of the gradient values in the upward direction and the downward direction.
4. The mosaic interpolation method according to claim 1 or 2, wherein the method of calculating the green component interpolation result A1 in the horizontal direction and the green component interpolation result A2 in the vertical direction of the center point from the gradient values in the up, down, left and right directions of the center point further comprises:
comparing the relative magnitudes of the gradient values in the left direction and the right direction, and if the difference multiple of the gradient values in the two directions is within a first preset parameter multiple, adopting an HA interpolation algorithm to calculate;
comparing the relative magnitudes of the gradient values in the upper direction and the lower direction, and if the difference multiple of the gradient values in the two directions is within a first preset parameter multiple, adopting an HA interpolation algorithm to calculate.
5. The mosaic interpolation method according to claim 4, wherein the method for obtaining the green component interpolation result B with missing center point by weighting the green component interpolation result A1 in the horizontal direction and the green component interpolation result A2 in the vertical direction of the center point according to the gradient values in the horizontal direction and the vertical direction comprises:
comparing the relative magnitudes of the gradient values in the horizontal direction and the vertical direction of the center point, and if the gradient value in one direction is larger than the second preset parameter times of the gradient value in the other direction, taking the green component interpolation result in the smaller direction as a green component interpolation result B with the missing center point;
comparing the relative magnitudes of the gradient values in the horizontal direction and the vertical direction of the center point, and, if the gradient value in one direction exceeds the gradient value in the other direction by more than a third preset parameter times, weighting the green component interpolation results A1 and A2 of the two directions to obtain the green component interpolation result B of the missing center point, wherein the weighted calculation method comprises: reducing the weight of the green component interpolation result of the larger-gradient direction, or increasing the weight of the green component interpolation result of the smaller-gradient direction;
wherein the second preset parameter is greater than the third preset parameter.
6. The mosaic interpolation method according to claim 1, wherein the method for calculating the red component interpolation result and the blue component interpolation result from the green component interpolation result B of the current pixel point and the color difference component of the current pixel point comprises:
red component interpolation and blue component interpolation calculations for green sampling positions whose row contains red components: subtracting the color difference component of the green sampling position, obtained after the green component is recovered, from the original chromaticity of the green sampling position.
7. The mosaic interpolation method according to claim 1 or 6, wherein the method for calculating the red component interpolation result and the blue component interpolation result from the green component interpolation result B of the current pixel point and the color difference component of the current pixel point comprises:
blue component calculation of red sample position: subtracting a color difference component of the red sampling position after restoring the green component from a green component interpolation result B of the red sampling position;
red component calculation for blue sample position: and subtracting the color difference component of the blue sampling position after restoring the green component from the green component interpolation result B of the blue sampling position.
8. The mosaic interpolation method according to claim 1, wherein the mosaic interpolation method further comprises an interpolation optimization method of optimizing the green component interpolation result first, and then calculating the red component interpolation result and the blue component interpolation result based on the green component interpolation result after green optimization,
optimizing the green component interpolation results includes: optimizing the green component interpolation of the red sampling position and the blue sampling position of the original sampling position;
calculating the color difference values of the upper, lower, left and right points of the center point of the window after interpolation and the color difference value of the center point;
calculating the color difference gradient of the center point and the upper, lower, left and right points according to the color difference values of the upper, lower, left and right points and the color difference value of the center point;
weighting the four color difference gradients, linearly combining the color difference values of the points above, below, left and right by using the weighted color difference gradients, and superposing the original color component to obtain an optimized green component interpolation result.
9. A mosaic interpolation system comprising the structure of:
an image receiving module: for receiving Bayer format images;
the window module is used for traversing the Bayer format image by using an n-by-n window, wherein n is an odd number of at least 5;
the gradient calculation module is used for calculating gradient values in the upper, lower, left and right directions of the central point of the n-n window and calculating gradient values in the horizontal direction and the vertical direction;
the first green component recovery module is configured to calculate a green component interpolation result A1 in a horizontal direction and a green component interpolation result A2 in a vertical direction of the center point according to gradient values in up, down, left and right directions of the center point, and specifically includes:
comparing the relative magnitudes of the gradient values in the left direction and the right direction, and, if the gradient value in one direction exceeds a first preset parameter times the gradient value in the other direction, taking the interpolation value of the direction with the smaller gradient value as the green component interpolation result A1 in the horizontal direction;
comparing the relative magnitudes of the gradient values in the upper direction and the lower direction, and, if the gradient value in one direction exceeds a first preset parameter times the gradient value in the other direction, taking the interpolation value of the direction with the smaller gradient value as the green component interpolation result A2 in the vertical direction;
the green component recovery module II is used for weighting the green component interpolation result A1 in the horizontal direction and the green component interpolation result A2 in the vertical direction of the central point according to the gradient values in the horizontal direction and the vertical direction to obtain a green component interpolation result B with the missing central point;
and the red and blue component recovery module is used for calculating a red component interpolation result and a blue component interpolation result according to the green component interpolation result B of the current pixel point and the color difference component of the current pixel point.
10. The mosaic interpolation system of claim 9, further comprising the structure of:
the effect improving module is used for optimizing the green component interpolation at the original red sampling positions and blue sampling positions; it is connected after the red and blue component recovery module and returns its output result to the red and blue component recovery module;
the functions of the effect promotion module include: calculating the color difference values of the upper, lower, left and right points of the center point and the color difference value of the center point;
calculating the color difference gradient of the center point and the upper, lower, left and right points according to the color difference values of the upper, lower, left and right points and the color difference value of the center point;
weighting the four color difference gradients, linearly combining the color difference values of the points above, below, left and right by using the weighted color difference gradients, and superposing the original color component to obtain an optimized green component interpolation result.
CN202310764929.0A 2023-06-27 2023-06-27 Mosaic interpolation method and system Active CN116503259B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310764929.0A CN116503259B (en) 2023-06-27 2023-06-27 Mosaic interpolation method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310764929.0A CN116503259B (en) 2023-06-27 2023-06-27 Mosaic interpolation method and system

Publications (2)

Publication Number Publication Date
CN116503259A CN116503259A (en) 2023-07-28
CN116503259B true CN116503259B (en) 2023-11-21

Family

ID=87330521

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310764929.0A Active CN116503259B (en) 2023-06-27 2023-06-27 Mosaic interpolation method and system

Country Status (1)

Country Link
CN (1) CN116503259B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117095134B (en) * 2023-10-18 2023-12-22 中科星图深海科技有限公司 Three-dimensional marine environment data interpolation processing method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008086037A2 (en) * 2007-01-10 2008-07-17 Flextronics International Usa Inc. Color filter array interpolation
EP2124187A1 (en) * 2008-05-22 2009-11-25 Telefonaktiebolaget LM Ericsson (PUBL) Apparatus and method for demosaicing
CN103347190A (en) * 2013-07-25 2013-10-09 华北电力大学 Edge-related and color-combined demosaicing and amplifying method
CN104159091A (en) * 2014-07-30 2014-11-19 广东顺德中山大学卡内基梅隆大学国际联合研究院 Color interpolation method based on edge detection
KR20150094350A (en) * 2014-02-11 2015-08-19 인천대학교 산학협력단 Method and apparatus for executing demosaicking using edge information
CN114445290A (en) * 2021-12-28 2022-05-06 中国科学技术大学 Hardware-oriented combined denoising and demosaicing method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8165389B2 (en) * 2004-03-15 2012-04-24 Microsoft Corp. Adaptive interpolation with artifact reduction of images

Non-Patent Citations (2)

Title
Convolutional neural network demosaicing algorithm based on residual interpolation; Jia Huimiao; Li Chunping; Zhou Dengwen; Journal of Nanjing University of Information Science &amp; Technology (Natural Science Edition), Issue 6; full text *
Zhu Bo; Wen Desheng; Wang Fei. Improved Bayer interpolation algorithm and its hardware implementation. Journal of Optoelectronics·Laser. 2013, Issue 6, full text. *

Similar Documents

Publication Publication Date Title
CN101917629B (en) Green component and color difference space-based Bayer format color interpolation method
JP5454075B2 (en) Image processing apparatus, image processing method, and program
JP6012375B2 (en) Pixel interpolation processing device, imaging device, program, and integrated circuit
JP5724185B2 (en) Image processing apparatus, image processing method, and program
CN102665030B (en) Improved bilinear Bayer format color interpolation method
JP4054184B2 (en) Defective pixel correction device
US8229212B2 (en) Interpolation system and method
CN103327220B Green-channel-guided denoising method for low-illumination Bayer images
CN110730336B (en) Demosaicing method and device
KR101389562B1 (en) Image signal processing apparatus and Method for the same
KR101225056B1 (en) Apparatus and method for reducing noise from image sensor
US8253829B2 (en) Image processing apparatus, imaging apparatus, and image processing method
US20100214446A1 (en) Image processing apparatus and image processing method
CN103595981A (en) Method for demosaicing color filtering array image based on non-local low rank
CN111539892A (en) Bayer image processing method, system, electronic device and storage medium
US11481873B2 (en) Method and apparatus for image processing
CN111539893A (en) Bayer image joint demosaicing denoising method based on guided filtering
CN116503259B (en) Mosaic interpolation method and system
KR101257946B1 (en) Device for removing chromatic aberration in image and method thereof
CN114445290A (en) Hardware-oriented combined denoising and demosaicing method
CN112422940A (en) Self-adaptive color correction method
CN110139087B (en) Image processing method based on Bayer arrangement
KR101327790B1 (en) Image interpolation method and apparatus
CN115471420A (en) Image processing device, imaging apparatus, method, electronic apparatus, and storage medium
JP5981824B2 (en) Pixel interpolation processing device, imaging device, program, and integrated circuit

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 310000 4th floor, building 9, Yinhu innovation center, No.9 Fuxian Road, Yinhu street, Fuyang District, Hangzhou City, Zhejiang Province

Applicant after: Zhejiang Xinmai Microelectronics Co.,Ltd.

Address before: 310000 4th floor, building 9, Yinhu innovation center, No.9 Fuxian Road, Yinhu street, Fuyang District, Hangzhou City, Zhejiang Province

Applicant before: Hangzhou xiongmai integrated circuit technology Co.,Ltd.

GR01 Patent grant